Author Topic: Wi-fi backup methods not using a router or wifi drive - How do you backup?  (Read 6187 times)


Offline ez24

  • Super Contributor
  • ***
  • Posts: 3091
  • Country: us
  • L.D.A.
While reading this article (towards the end of the article)

http://www.ganssle.com/tem/tem311.html

I started thinking about my backups. I use dozens of USB drives and keep one set about 20 miles away, but I only update them once a year.

I keep other drives in the garage, in different rooms, and in cans in the back yard. I use SyncToy to do the backups, backing up to a USB drive every few days as needed, then backing up the USB drive to an external drive in another room. And I use a new USB drive as needed.

But all these backups are occurring inside my house.  I like the idea of being reminded to backup weekly and get the drive as far away as possible.

Another method I would like to look into is using wi-fi to send the signals outside of my house. I would arrange some sort of shed or box, run power to it and set something up.

I know that there are routers you can connect a backup drive to, but my router is inside so this will not work. I think there are wifi hard drives that might work, but I want to use USB drives so I can swap them out as they get full. So I am looking for something that can receive a wifi signal and connect to a USB drive that can be used as a backup. Not a router or modem: an independent wifi device that can connect to a USB drive.

Does anyone have any ideas?  Thanks


YouTube and Website Electronic Resources ------>  https://www.eevblog.com/forum/other-blog-specific/a/msg1341166/#msg1341166
 

Offline System Error Message

  • Frequent Contributor
  • **
  • Posts: 469
  • Country: gb
You need three backup media: online, local and offline.

Online is something like the cloud (don't put sensitive stuff there). Local is where you back up first, immediately, and which then syncs online. Offline is where you put the drive in a fireproof safe that will still be in one piece even if the building collapses, or your house gets hit by a missile, flattened by a tank or hit by a nuke.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 10053
  • Country: au
My first thought was something like a Compute stick or Barebone.

If their WiFi wasn't up to it, then use a WiFi extender and/or an Access Point.
 

Offline Flenser

  • Regular Contributor
  • *
  • Posts: 60
Any computer hardware that will run the backup/sync software you plan to use should work. For example, I have used rsync, which makes Linux the most convenient OS, and then the hardware could be almost anything: a Compute Stick, a Raspberry Pi with a USB WiFi dongle, or an old laptop or desktop with a spare WiFi card lying around. If your preferred sync software runs on Windows, Windows will run on most of these as well.

If you want to keep using SyncToy, then all you need is something you can set up to expose the mounted USB drive as a network share. You should be able to do that on any of the above hardware using either Windows or Linux.
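If you do go the Linux route, the copy-new-or-changed-files logic is simple enough to script yourself. A minimal sketch in Python (the paths and the "copy new or changed, never delete" policy are my assumptions, loosely modelled on SyncToy's Echo mode — not SyncToy's actual algorithm):

```python
import shutil
from pathlib import Path

def echo_sync(src, dst):
    """Copy new or changed files from src to dst, never deleting
    anything already on dst (similar in spirit to an 'echo' sync).
    Returns the list of relative paths that were copied."""
    copied = []
    for f in Path(src).rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(src)
        target = Path(dst) / rel
        # Copy when the target is missing, or its size differs,
        # or the source has been modified more recently.
        if (not target.exists()
                or target.stat().st_size != f.stat().st_size
                or target.stat().st_mtime < f.stat().st_mtime):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(rel))
    return copied
```

Run it from cron or Task Scheduler against the network share and you have a rough weekly backup with no extra software on the remote box.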
 

Online borjam

  • Supporter
  • ****
  • Posts: 803
  • Country: es
  • EA2EKH
FreeNAS is a great NAS implementation. Among its nice features, you can set up two servers and run automated, incremental replications from one to another. Those replicas are asynchronous, which means that a slowish replica over your typical Internet access line won't make your backups slower.

A good piece of hardware to run it (and cheap!) is the HP Proliant Microserver Gen8, about $200 on Amazon. You need to upgrade memory (10 GB minimum, 16 GB is better) and of course the price doesn't include disks. WD Reds are a popular choice.

Regarding the Microserver, get the basic model, you don't need a Xeon processor to run FreeNAS.

 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 3547
  • Country: gb
FWIW, I respect Jack, and his approach to backup is pretty good, but there's one major flaw that a lot of professionals, let alone individuals, also suffer from.

The log files don't tell you whether what you're backing up is good or garbage, and they don't tell you whether it's restorable.

There is a huge difference between having a log file that says something is backed up and actually being able to restore it.

Even a verify isn't a guarantee that your backup is good.

I've seen clients with boxes of backups which contain nothing but log files, or where the master is corrupted leaving only the incrementals, or nothing at all, or any other worst-case combination you can imagine, including DAT tapes that previously restored correctly but were written on a misaligned DAT drive, so they could not be read on a correctly aligned drive.

The IRA bombing of Manchester brought that one to light, because the client's misaligned drive was buried under a few tons of rubble along with their Netware server (yes, I am that old).

So I would highly recommend that anyone creating a backup/disaster recovery plan include a periodic restore of the backed-up data and a sanity check of the restored data, which means actually examining the files you've restored.
 

Offline setq

  • Frequent Contributor
  • **
  • Posts: 444
  • Country: gb
Let's also not forget the tale of the idiots with an office in the World Trade Center who stored their backups in the basement. Mmmm, melty tape.

Chop your data in half explicitly: stuff you can afford to lose and can easily get back (films, music etc.) and stuff you can't (family photos, designs, documents etc.). If you're a business you need to keep your filestore clean and tight.

Ignore the stuff you can easily replace. It's not worth looking after. Carrying around 2 TiB of films and crap you can't possibly maintain is a killer for all backup solutions.

The rest of it, a mere 8 GB for me, I keep as follows:

1. A copy on my laptop (Samsung 850 Pro disk, encrypted).
2. A copy in Google Drive (paid-up Google Apps account, which has different T&Cs from the free ones), synced automatically.
3. A copy on a high-quality Corsair Survivor encrypted USB stick that comes with me everywhere. I use Beyond Compare to move files to this manually. No automation is allowed, as this lets me see any disparity between 2 and this drive (for example, if Google Drive were compromised). This stick is replaced once a year as well.
4. A copy on Amazon S3, which is automatically encrypted with 7zip once a month and uploaded with http://www.s3express.com/ - these are kept for two years.

Once every 3 months I do a bitwise compare with Beyond Compare between a downloaded archive from S3, my Google Drive and the USB stick to make sure there is no bitrot or corruption.
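That quarterly bitwise compare can also be scripted if you ever want something unattended alongside Beyond Compare. A rough sketch in Python (the directory layout is hypothetical); `filecmp` with `shallow=False` really does read every byte of both files:

```python
import filecmp
from pathlib import Path

def bitwise_compare(a, b):
    """Byte-for-byte compare every file under tree a against tree b.
    Returns the sorted relative paths that are missing from b or differ."""
    bad = []
    for f in Path(a).rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(a)
        other = Path(b) / rel
        # shallow=False forces a full content comparison,
        # not just a quick stat() metadata check
        if not other.is_file() or not filecmp.cmp(f, other, shallow=False):
            bad.append(str(rel))
    return sorted(bad)
```

An empty list means the two copies agree bit for bit; anything returned is a candidate for bitrot or a missed sync.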
 

Offline bitslice

  • Frequent Contributor
  • **
  • !
  • Posts: 493
  • Country: gb
But I want to use USB drives so I can swap them out as they get full. So I am looking for something that can receive a wifi signal and connect to a USB drive that can be used as a backup. Not a router or modem: an independent wifi device that can connect to a USB drive.

Your exact requirements appear to match this:

Wireless Flash Drive
https://www.amazon.com/SanDisk-Wireless-Smartphones-Computers-SDWS4-064G-G46/dp/B00ZCFYF2W/

There are a few versions of this idea.


A slight deviation is an SSD in a USB drive case, attached to a USB Homeplug.
That gives you a data link over the mains wiring.


I wouldn't do any of that though, because it sounds awful; I'd build a sealed-unit NAS in the shed and link it up with a fibre optic transceiver.
No ground loops, the option to RAID, no crappy wi-fi.

Add an online UPS with a CVT and it's darn near bomb proof.

« Last Edit: August 16, 2016, 07:19:55 pm by bitslice »
 

Offline ez24

  • Super Contributor
  • ***
  • Posts: 3091
  • Country: us
  • L.D.A.
After I posted this, I found this

https://www.amazon.com/gp/product/B00J3HMY1E/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A2LYX7FKX6AP16

I have not ordered it yet, but it looks like exactly what I want. I will be able to run an extension cord outside (the farthest my wifi will go) and put this and a 2 TB portable drive in some sort of weatherproof box. This way I will have a partial backup (the weekly one) outside the house.

I think I can use SyncToy to do routine backups to two different USB drives.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 10053
  • Country: au
Looks good.  Inexpensive, too.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 10053
  • Country: au
FWIW, I respect Jack, and his approach to backup is pretty good, but there's one major flaw that a lot of professionals, let alone individuals, also suffer from.

The log files don't tell you whether what you're backing up is good or garbage, and they don't tell you whether it's restorable.

There is a huge difference between having a log file that says something is backed up and actually being able to restore it.

Even a verify isn't a guarantee that your backup is good.

I've seen clients with boxes of backups which contain nothing but log files, or where the master is corrupted leaving only the incrementals, or nothing at all, or any other worst-case combination you can imagine, including DAT tapes that previously restored correctly but were written on a misaligned DAT drive, so they could not be read on a correctly aligned drive.

The IRA bombing of Manchester brought that one to light, because the client's misaligned drive was buried under a few tons of rubble along with their Netware server (yes, I am that old).

So I would highly recommend that anyone creating a backup/disaster recovery plan include a periodic restore of the backed-up data and a sanity check of the restored data, which means actually examining the files you've restored.

Having worked at more than one financial institution in my time, I have been exposed to backup issues and, more particularly, Disaster Recovery Planning.

What CJay has posted is a basic outline of the greater subject of backups - with the ominous but vital question that everyone who does backups really should address.... and it will go something like this:

Will your backups allow you to recover the way you expect them to?

If you don't have an idea of how you 'expect' them to work, then you have a problem from the get-go.


There are all sorts of situations where recovery from backups will be needed, from accidental deletion of a file to the incineration of a building.  You need to assess what possibilities are relevant to your situation and plan accordingly.

However, once you have these things in place - how do you know they are going to work?

The only answer is to do a trial recovery.  Yes, it's a pain ... it will take time and be inconvenient, but until you've done it, you cannot KNOW it will be successful.


The other thing is .... you mustn't cheat, especially if you are looking at a full recovery from a disaster event.

First step is to quarantine the primary site.  You are not allowed to go in to grab any software install disks or serial keys and you can't call someone to look anything up for you.  (I once worked for a company that did a recovery exercise and even told the guy who knew all the nitty gritty to stay at home that day and not answer his phone.)

Find another location to set up.  Then, go out and buy a new box together with the necessary hardware, grab your off-site backups and see if you can get the system running as required.

This is where you will find out if you have really backed up properly and that the backups are readable....

Did you miss a software patch?  ... a little text file or spreadsheet?  ... that post-it on the filing cabinet with your access details to a supplier?


If you don't do something to verify that you can actually complete a recovery then you are driving down the information highway without brakes.
« Last Edit: August 17, 2016, 02:55:56 am by Brumby »
 

Offline Mr.B

  • Supporter
  • ****
  • Posts: 1085
  • Country: nz
If you don't do something to verify that you can actually complete a recovery then you are driving down the information highway without brakes.
^This.

I am the IT Manager at a reasonably large NZ private company.
Because doing what Brumby has said is such a massive job for a company our size, we only do it every two years.
We do test restores of backups regularly on-site. We only do the full Disaster Recovery every two years.
We hire 8 servers, 8 desktops, 8 laptops, storage and switching. (A fraction of our actual infrastructure.)
Then we proceed to "rebuild the entire company" from our off site backups.
It can be a real challenge, but it proves it can be done.
Having done this now for 14 years we have a very comprehensive and accurate "Restore Manual".
This means that any reasonably accomplished network engineer could be brought in to rebuild the company if me and my staff were not available to do so.
Time is the overseer of all things.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 10053
  • Country: au
Yes, it is a significant effort - and how frequently it is done is one of the factors in the whole exercise.  Every two years is not unreasonable, especially if your platform is reasonably stable.  It is infinitely better than 'never'.

Having done this now for 14 years we have a very comprehensive and accurate "Restore Manual".
Perhaps THE most important asset that is produced from the exercise - and this is why....
Quote
This means that any reasonably accomplished network engineer could be brought in to rebuild the company if me and my staff were not available to do so.
 

Offline C

  • Super Contributor
  • ***
  • Posts: 1345
  • Country: us

Back up a step: you are creating all these backups, good, but what has changed on an hourly, daily, weekly or monthly basis?

A good backup is great for getting you running again after a problem, but not great for archive storage.
Most people back up everything so that they do not miss something.
So for one file that has not changed you have many copies, while a file that changes a lot could have zero copies of a particular version.
Then ask yourself which of the many copies are good.

I think you should check out file systems that support "copy on write".
At the low level, each time you go to write a sector you use a new blank sector to store the data. A huge file that has just one bit changed between versions uses only one more sector of storage space.
Some use a snapshot of the file system to give you the look back in time.

https://en.wikipedia.org/wiki/Copy-on-write

https://en.wikipedia.org/wiki/Silent_data_corruption

https://en.wikipedia.org/wiki/ZFS

If you are a Windows user, FreeNAS (which uses ZFS) looks like a network drive.
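To see why snapshots are nearly free, here is a toy model of the block-sharing idea (not ZFS itself, just the principle): a "file" is a list of block references, and a snapshot only copies the reference list, so every unchanged block is stored once no matter how many snapshots point at it.

```python
class CowFile:
    """Toy copy-on-write 'file': a list of block references plus
    snapshots that share unchanged blocks with the live version."""

    def __init__(self, blocks):
        self.store = list(blocks)              # every block ever written
        self.live = list(range(len(blocks)))   # indices into store
        self.snapshots = {}

    def write_block(self, pos, data):
        # A changed block goes to fresh storage; existing snapshots
        # still reference the original block, so history stays intact.
        self.store.append(data)
        self.live[pos] = len(self.store) - 1

    def snapshot(self, name):
        # Snapshot = copy the (small) reference list, not the data.
        self.snapshots[name] = list(self.live)

    def read(self, name=None):
        refs = self.snapshots[name] if name else self.live
        return [self.store[i] for i in refs]

    def blocks_used(self):
        return len(self.store)
```

Changing one block of a three-block file after a snapshot costs one extra block of storage, while a full backup of both versions would cost six.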
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 10053
  • Country: au
The scope of scenarios is huge and the options for backup are many.  Deciding which is best for any given situation cannot be answered in a forum like this.

Even those of us who have been through the wringer can only offer suggestions...


Most people back up everything so that they do not miss something.
This assumes that everything you need for a successful recovery can be backed up.

I am being a little pedantic here, since there is pretty much always a way to have a backup of things - but people all too often limit their thinking to file copying from a HDD.  There is more - which I mentioned above.  Backup copies of software installation discs and serial keys are just a couple.
 

Offline Mr.B

  • Supporter
  • ****
  • Posts: 1085
  • Country: nz
At work we use a pretty expensive suite of three key applications to achieve the backup and archive requirements we have.
Under legislation we have to keep everything for 10 years - even the bloody spam email... go figure...

Of course our work solution is impractical and too expensive for the average home user.
At home I use Veeam B&R free edition and FreeNAS,  along with encrypted cloud storage and off-site SSD and spinning storage.
All very affordable, but some may consider it overkill...
However, you have to ask yourself: can I ever replace that photo of the wedding, baby, happy time, <insert valuable experience here>?

Because I am so anal about disaster recovery I do a similar, down-scaled exercise at home once a year.
 

Offline C

  • Super Contributor
  • ***
  • Posts: 1345
  • Country: us
EZ24
Think of a database file.
It starts at 1000 sectors in size.
Each day 10 sectors get deleted.
Each day 110 sectors get added.
Your file is growing by 100 sectors a day.
After 10 days the file is 2000 sectors.
Your 10 daily backups would have used 16500 sectors for the data part of this one file.
By telling ZFS to take a snapshot each day (which is very fast), 2210 sectors are used for the data part of this one file. If the file did not change, then only 1000 sectors are used.
The directory part of this, which is small, could be a little larger on ZFS.
You still have access to all 10 file versions.
At the end of a year:
file size = 37,500 sectors
Backup  = 7,045,500 sectors
ZFS     = 41,260 sectors
This is just the data, so the difference is not quite this extreme. ZFS also uses checksums to verify that the data and directory sectors are correct.

Now think of the delete, change or encrypt problem. Unless the program is running on the computer doing ZFS, this is just a change to what the current file looks like. The snapshots save you again, and in case this fails, the off-line ZFS backup is there.

Note that FreeNAS uses ZFS.

rsync
https://en.wikipedia.org/wiki/Rsync
rsync is a utility that keeps copies of a file on two computer systems. After 10 days of running rsync each day, 2110 sectors plus some overhead would go across the network between the two computers for the file above.

Small changes can make a huge difference.
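The figures above check out if you count the initial copy alongside the daily ones; a quick reproduction (the day-counting convention is my reading of the post, and "sectors" here is just an abstract unit):

```python
def storage_totals(days, start=1000, added=110, removed=10):
    """Reproduce the sector arithmetic above: a file starts at `start`
    sectors and grows by (added - removed) sectors per day."""
    growth = added - removed                       # net +100 sectors/day
    final_size = start + growth * days
    # Full daily backups: one complete copy per day, plus the initial copy.
    full_backup = sum(start + growth * d for d in range(days + 1))
    # ZFS snapshots only store the newly written sectors each day
    # (counting the first day's write alongside the initial data).
    zfs = start + added * (days + 1)
    return final_size, full_backup, zfs
```

After 10 days this gives a 2000-sector file, 16500 sectors of full backups, and 2210 sectors under snapshots; after a year, 37,500 vs 7,045,500 vs 41,260 — matching the post.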
 

Offline Kilrah

  • Supporter
  • ****
  • Posts: 1857
  • Country: ch
I have a big 6TB hard drive in my PC, and 2 more for backups. One of these is stored offsite (work) and swapped with the onsite one every week or so. When I want to run a backup I just put the drive in a USB3 cradle, and run SyncBack SE to analyze both the internal drive and the backup one, and update the backup one that's treated as a mirror copy.

Quick, easy, safe enough, no online crap, and the backups are usable as-is, i.e. no imaging software or anything: I can just pop any of them in the cradle and access files. I use this when I'm travelling and know I'll need all my data (I take my laptop, the cradle and one of the backup drives, work on it as I would at home, then when I come back I reverse the mirror direction to carry my changes back onto the "original").
 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 3547
  • Country: gb
If you don't do something to verify that you can actually complete a recovery then you are driving down the information highway without brakes.
^This.

I am the IT Manager at a reasonably large NZ private company.
Because doing what Brumby has said is such a massive job for a company our size, we only do it every two years.
We do test restores of backups regularly on-site. We only do the full Disaster Recovery every two years.
We hire 8 servers, 8 desktops, 8 laptops, storage and switching. (A fraction of our actual infrastructure.)
Then we proceed to "rebuild the entire company" from our off site backups.
It can be a real challenge, but it proves it can be done.
Having done this now for 14 years we have a very comprehensive and accurate "Restore Manual".
This means that any reasonably accomplished network engineer could be brought in to rebuild the company if me and my staff were not available to do so.

As long as your infrastructure and ecosystem isn't changing more often than your DR test frequency then you're on a winner, we tend to advise people to test once a year or more often and we offer contracts with 1 yearly DR test as standard, companies where the infrastructure changes that radically and that frequently or where legislation mandates that they'd need to test more often are generally rich enough to be able to afford their own DR Hot/Warm/Cold sites. 
 

Offline Galenbo

  • Super Contributor
  • ***
  • Posts: 1474
  • Country: be
I see two major classes of "Backup"

1) the connected, automatic backup, with hardware at the same address.
2) the manually plugged-in and started, with multiple unconnected hardware stored in a safe somewhere else.

I see
1) as a way to quickly restore a system and be live again, and I see
2) as the solution for files.

I would only take
1) seriously in environments where workers and customers would be blocked, which would cause a huge loss, and
2) seriously in all circumstances, protected against theft, fire, flood, rust and aging.
If you try and take a cat apart to see how it works, the first thing you have on your hands is a nonworking cat.
 

Offline suicidaleggroll

  • Super Contributor
  • ***
  • Posts: 1455
  • Country: us
I see two major classes of "Backup"

1) the connected, automatic backup, with hardware at the same address.
2) the manually plugged-in and started, with multiple unconnected hardware stored in a safe somewhere else.

Why not a hybrid?  Stick a raspberry pi or similar computer with an external drive at a friend or relative's house plugged into their network, and sync your local backup to that system automatically.  It's online, automatic, and remote.  This is stupid easy to do with rsync and an SSH tunnel if you run Linux or Mac, or even with Cygwin on Windows.
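For the curious, the rsync-over-SSH part might look something like the sketch below (the host, user and paths are placeholders, and the flags are one reasonable choice, not the only one):

```python
import subprocess

def remote_backup_cmd(src_dir, host, dest_dir, user="pi"):
    """Build an rsync-over-SSH command for pushing a local backup tree
    to a small Linux box (e.g. a Raspberry Pi) at a friend's house."""
    return [
        "rsync",
        "-a",            # archive mode: recurse, keep perms/times/links
        "--delete",      # mirror: remove files that were deleted locally
        "-e", "ssh",     # transport over SSH
        f"{src_dir}/",   # trailing slash: sync contents, not the dir itself
        f"{user}@{host}:{dest_dir}/",
    ]

def run_remote_backup(src_dir, host, dest_dir, user="pi"):
    # check=True raises if rsync reports an error, so a cron wrapper
    # can notice a failed backup instead of silently ignoring it.
    subprocess.run(remote_backup_cmd(src_dir, host, dest_dir, user),
                   check=True)
```

With SSH keys set up on the remote Pi, `run_remote_backup` can be dropped straight into a nightly cron job.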
« Last Edit: August 17, 2016, 06:22:30 pm by suicidaleggroll »
 

Offline Mr.B

  • Supporter
  • ****
  • Posts: 1085
  • Country: nz
As long as your infrastructure and ecosystem isn't changing more often than your DR test frequency then you're on a winner, we tend to advise people to test once a year or more often and we offer contracts with 1 yearly DR test as standard, companies where the infrastructure changes that radically and that frequently or where legislation mandates that they'd need to test more often are generally rich enough to be able to afford their own DR Hot/Warm/Cold sites.

Our infrastructure changes are relatively slow moving.
Company expansion is generally the only change.
We also have the luxury of a hot DR site 30km away from our production server room.
The testing every two years is to ensure we can stand up the entire company in someone else server room should there ever be a region wide disaster in Nelson. 
Time is the overseer of all things.
 

Offline eugenenine

  • Frequent Contributor
  • **
  • Posts: 825
  • Country: us
How much data could you really not live without?  I fought SSDs for a long time because I knew I could never afford a 1 TB SSD in my laptop, but that started limiting my choice of hardware. So I bought a 240 GB SSD as a test and hung the 1 TB off a Raspberry Pi to replace my old server. Then I started by bringing back any files I needed. At first it was a lot, but over time fewer and fewer, and now I have just a few GB of actually used data; the rest is all archive stuff. I then grabbed a second Raspberry Pi and set up ownCloud. Now those few GB of important data are synced through my own 'cloud' onto my phone and tablet. So it's synced when I'm within range of my wifi and is an offsite backup when I'm not.
 
 

Offline ez24

  • Super Contributor
  • ***
  • Posts: 3091
  • Country: us
  • L.D.A.
How much data could you really not live without?   

14 TB x 5 (backup sets) = 70 TB, so it gets expensive, but the growth has slowed and therefore so have the costs. Currently 4 TB external drives are my favorite. I have a set in front of the house, one inside, one in the garage, one in the backyard, and one about 20 miles away (no good for nukes). I think it takes me about 6 months to refresh this set every two years by low-level formatting the drives and then restoring with testing.

Now I create just about 2 GB every six months per set.

I just ordered this

https://www.amazon.com/gp/product/B00J3HMY1E/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

and will hook it up to a 2 TB portable USB drive outside on the front porch. I hope I can use SyncToy on my computer, reach it over wifi (I have a wifi camera there) and back up once a day. That way, if there is a fire, someone from the street could see it. I also have one set of 14 TB in ammo cans on the front porch. So I will have a current set outside of the house.

I have had bad backup copies in the past (and I have no idea when or how it happened); the files looked OK in Explorer. So I now use TeraCopy (with the test option) to copy from one drive to another. I use SyncToy to back up my computer with the "no deletions" option. SyncToy just copies files; I think it is just an automated xcopy, so no weird files or software.

whew  :phew:
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 10053
  • Country: au
I have had bad backup copies in the past (and I have no idea when or how it happened)

This is the danger with untested backups.  (Not just verify)
 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 3547
  • Country: gb
I have had bad backup copies in the past (and I have no idea when or how it happened)

This is the danger with untested backups.  (Not just verify)

As I said earlier :)

I see it a few times a year: people who think their backup is good because it verifies, but when it comes to recovering it, not so much.
 

Offline ez24

  • Super Contributor
  • ***
  • Posts: 3091
  • Country: us
  • L.D.A.

As I said earlier :)

I see it a few times a year, people who think their backup is good because it verifies but when it comes to recovering it, not so much.

Why would this happen? Is the verify no good? I believe TeraCopy does a bitwise test; it takes just as long to test as it does to copy the files. The files are not encrypted, they are the same as the original.

 

Offline Gary.M

  • Regular Contributor
  • *
  • Posts: 137
  • Country: nz

As I said earlier :)

I see it a few times a year, people who think their backup is good because it verifies but when it comes to recovering it, not so much.

Why would this happen?  Is the verify no good?  I believe Teracopy does a bit wise test, it takes just as long to test as it does to copy the files.  The files are not encrypted, they are the same as the original.

May I suggest an application called Arq Backup. It runs in the background, backs up to Amazon's cloud storage, costs very little per month in storage, includes versioning, and can simultaneously store the same backups locally on a network drive or other local storage.


 

Offline Halcyon

  • Global Moderator
  • *****
  • Posts: 3969
  • Country: au
LTO tape :-D
 
The following users thanked this post: CJay

Offline Galenbo

  • Super Contributor
  • ***
  • Posts: 1474
  • Country: be
I see two major classes of "Backup"

1) the connected, automatic backup, with hardware at the same address.
2) the manually plugged-in and started, with multiple unconnected hardware stored in a safe somewhere else.

Why not a hybrid?  Stick a raspberry pi or similar computer with an external drive at a friend or relative's house plugged into their network, and sync your local backup to that system automatically.  It's online, automatic, and remote.  This is stupid easy to do with rsync and an SSH tunnel if you run Linux or Mac, or even with Cygwin on Windows.

What you describe is a solution that falls under category 1.
It is not protected against overvoltage, future viruses, software bugs, house fires, EMP pulses, floods, the NSA, the DEA or Navy SEALs.
 

Offline suicidaleggroll

  • Super Contributor
  • ***
  • Posts: 1455
  • Country: us
I see two major classes of "Backup"

1) the connected, automatic backup, with hardware at the same address.
2) the manually plugged-in and started, with multiple unconnected hardware stored in a safe somewhere else.

Why not a hybrid?  Stick a raspberry pi or similar computer with an external drive at a friend or relative's house plugged into their network, and sync your local backup to that system automatically.  It's online, automatic, and remote.  This is stupid easy to do with rsync and an SSH tunnel if you run Linux or Mac, or even with Cygwin on Windows.

What you describe is a solution that falls under category 1.
Not protected against overvoltage, y2018 viruses, soft bugs, house fire, EMP pulses, floods, NSA, DEA and Navy Seals.

Except that it is... at least against most of those. It's not at your local address; it's remote, so it's immune to any event that's isolated to your location (voltage, fire, weather, etc.). It's no more susceptible to viruses, bugs, or the government than your "category 2" is either.

Your remote backup doesn't have to be immune to everything, it only has to be immune to the specific event that took out your system and your primary backup.  You don't have to get very far away or isolated for that to be the case.
« Last Edit: August 18, 2016, 01:48:33 pm by suicidaleggroll »
 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 3547
  • Country: gb

As I said earlier :)

I see it a few times a year, people who think their backup is good because it verifies but when it comes to recovering it, not so much.

Why would this happen?  Is the verify no good?  I believe Teracopy does a bit wise test, it takes just as long to test as it does to copy the files.  The files are not encrypted, they are the same as the original.

It depends on what you're backing up, how it's backed up, where and how the verify happens, and a multitude of other things, but the only way to truly verify a backup is to restore it and then sanity-check the data.

Teracopy sounds like it does a reasonable job, but even then you only get back what you back up; if the original is garbage, it will have no way of telling.
 

Offline C

  • Super Contributor
  • ***
  • Posts: 1345
  • Country: us

https://en.wikipedia.org/wiki/S.M.A.R.T.

https://en.wikipedia.org/wiki/Data_degradation



As I said earlier :)

I see it a few times a year, people who think their backup is good because it verifies but when it comes to recovering it, not so much.

Why would this happen?  Is the verify no good?  I believe Teracopy does a bit wise test, it takes just as long to test as it does to copy the files.  The files are not encrypted, they are the same as the original.

A simple example:
1. From drive 1, read the file into memory1.
2. From memory1, write the file to drive 2.
3. From drive 2, read the file into memory2.
4. Compare memory1 to memory2 for a match.

This is fine until
1a. memory1 changes (and the computer does not use ECC memory), or memory1 is bad.
Then step 2 writes a bad file, step 3 reads the bad file, and step 4 verifies that the bad file matches the bad file.
A change to
1. From drive 1, read the file into memory1.
2. From memory1, write the file to drive 2.
3. From drive 2, read the file into memory2.
3a. From drive 1, read the file into memory3.
4. Compare memory3 to memory2 for a match.
would be able to catch this.
Some here will say small chance (true), but memory errors do happen.

You are still assuming that the path between the disk surface and memory is perfect in both directions.

The above links describe other errors that happen.

What I am reading is that you are making many copies of a file with unknown status, with no easy way for the computer to know whether a file is good or bad.
https://en.wikipedia.org/wiki/Checksum
A file checksum is one way that can help.

It also sounds like you want an archive more than a backup, and to be able to create many copies of the archive.
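The re-read-both-sides idea (step 3a above) plus a checksum is easy to sketch in Python. One caveat I would flag: the OS may serve the re-read from its cache rather than the platter, so this mainly defends against bad RAM in the copy buffer, not every media fault. The file names are hypothetical:

```python
import hashlib
import shutil

def copy_with_readback(src, dst, chunk=1 << 20):
    """Copy src to dst, then independently re-read BOTH files from disk
    and compare SHA-256 digests, so a flipped bit in the original
    in-memory buffer cannot verify itself. Returns the digest."""
    shutil.copyfile(src, dst)

    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as fh:
            while data := fh.read(chunk):  # hash in 1 MiB chunks
                h.update(data)
        return h.hexdigest()

    d_src, d_dst = digest(src), digest(dst)
    if d_src != d_dst:
        raise IOError(f"verify failed for {dst}")
    return d_src  # store this in a manifest for future re-checks
```

Keeping the returned digest in a manifest file lets any later copy of the archive be checked against the original without the original being present.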
 

