Author Topic: Wi-fi backup methods not using a router or wifi drive - How do you backup?  (Read 8066 times)


Offline ez24Topic starter

  • Super Contributor
  • ***
  • Posts: 3082
  • Country: us
  • L.D.A.
While reading this article (the relevant part is towards the end):

http://www.ganssle.com/tem/tem311.html

I started thinking about my backups.  I use dozens of USB drives and keep one set about 20 miles away but I only update them once a year.

I keep other drives in the garage and different rooms and in cans in the back yard.  I use SyncToy to do the backups, backing up to a USB drive every few days as needed, then backing up that USB drive to an external drive in another room.  And I use a new USB drive as needed.

But all these backups are happening inside my house.  I like the idea of being reminded to back up weekly and getting the drive as far away as possible.

Another method I would like to look into is to use Wi-Fi and send the signal outside of my house.  I would arrange some sort of shed or box, run power to it, and set something up.

I know that there are routers that you can connect a backup drive to, but my router is inside so that will not work.  I think there are wifi hard drives that might work, but I want to use USB drives so I can swap them out as they get full.  So I am looking for something that can receive a wifi signal and connect to a USB drive that can be used as a backup.  Not a router or modem; an independent wifi device that can connect to a USB drive.

Does anyone have any ideas?  Thanks


YouTube and Website Electronic Resources ------>  https://www.eevblog.com/forum/other-blog-specific/a/msg1341166/#msg1341166
 

Offline System Error Message

  • Frequent Contributor
  • **
  • Posts: 473
  • Country: gb
You need three media: online, local and offline.

Online is something like the cloud (don't put sensitive stuff there). Local is where you back up to first, immediately, and it then syncs online. Offline is where you put the drive in a fireproof safe that will still be in one piece even if the building collapses or your house gets hit by a missile, flattened by a tank, or hit by a nuke.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12298
  • Country: au
My first thought was something like a Compute stick or Barebone.

If their WiFi wasn't up to it, then use a WiFi extender and/or an Access Point.
 

Offline Flenser

  • Regular Contributor
  • *
  • Posts: 60
Any computer hardware that will run the backup/sync software you plan to use should work. For example, I have used rsync, which makes Linux the most convenient OS, and then the hardware could be almost anything: a Compute Stick, a Raspberry Pi with a USB WiFi dongle, or an old laptop or desktop with a WiFi card you have lying around. If your preferred sync software is Windows-only, it will run on most of these as well.

If you want to keep using SyncToy then all you need is something you can set up to expose the mounted USB drive as a network share, as in the sketch below. You should be able to do this on any of the above hardware using either Windows or Linux.
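
To make that concrete, here's a minimal sketch for a Raspberry Pi, assuming the USB drive shows up as /dev/sda1 and an open guest share is acceptable (all names here are illustrative, and you'd want to tighten permissions for real use):

Code:
# Debian/Raspbian: install Samba and mount the USB drive
sudo apt-get install -y samba
sudo mkdir -p /mnt/usbdrive
sudo mount /dev/sda1 /mnt/usbdrive

# Append a writable share pointing at the drive
sudo tee -a /etc/samba/smb.conf >/dev/null <<'EOF'
[usbbackup]
   path = /mnt/usbdrive
   read only = no
   guest ok = yes
EOF

sudo systemctl restart smbd

SyncToy on the Windows side can then use \\raspberrypi\usbbackup as the right-hand side of a folder pair.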
 

Offline borjam

  • Supporter
  • ****
  • Posts: 908
  • Country: es
  • EA2EKH
FreeNAS is a great NAS implementation. Among its nice features, you can set up two servers and run automated, incremental replications from one to another. Those replicas are asynchronous, which means that a slowish replica over your typical Internet access line won't make your backups slower.
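
Under the hood that replication is ZFS snapshots plus send/receive. A rough hand-rolled sketch of the same idea, assuming a dataset tank/data and a second box reachable as backupnas (both names hypothetical - FreeNAS drives all of this from its GUI):

Code:
# First run: snapshot the dataset and send the full stream once
zfs snapshot tank/data@2016-08-17
zfs send tank/data@2016-08-17 | ssh backupnas zfs receive tank/backup

# Later runs: incremental sends carry only the blocks changed
# since the previous snapshot
zfs snapshot tank/data@2016-08-18
zfs send -i tank/data@2016-08-17 tank/data@2016-08-18 \
  | ssh backupnas zfs receive -F tank/backup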

A good piece of hardware to run it (and cheap!) is the HP Proliant Microserver Gen8, about $200 on Amazon. You need to upgrade memory (10 GB minimum, 16 GB is better) and of course the price doesn't include disks. WD Reds are a popular choice.

Regarding the Microserver, get the basic model, you don't need a Xeon processor to run FreeNAS.

 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 4136
  • Country: gb
FWIW, I respect Jack, and his approach to backup is pretty good, but there's one major flaw that a lot of professionals, let alone individuals, also suffer from:

The log files don't tell you if what you're backing up is good or garbage, and they don't tell you if it's restorable.

There is a huge difference between having a log file that says something is backed up and actually being able to restore it.

Even a verify isn't a guarantee that your backup is good.

I've seen clients with boxes of backups which contain nothing but log files, or where the master is corrupted leaving only the incrementals, or nothing at all, or any other combination of worst-case scenarios you can imagine, including DAT tapes that previously restored correctly but were written on a misaligned DAT drive so they could not be read on a correctly aligned drive.

The IRA bombing of Manchester brought that one to light, because the client's misaligned drive was buried under a few tons of rubble along with their NetWare server (yes, I am that old).

So, I would highly recommend to anyone who is creating a backup/disaster recovery plan that they include a periodic restore of the backed-up data and a sanity check of the restored data, which means actually examining the files you've restored.
 

Offline setq

  • Frequent Contributor
  • **
  • Posts: 443
  • Country: gb
Let's also not forget the tale of the idiots with an office in the World Trade Center who stored their backups in the basement. Mmmm, melty tape.

Chop your data in half explicitly: stuff you can afford to lose and can easily get back (films, music etc) and stuff you can't (family photos, designs, documents etc). If you're a business you need to keep your filestore clean and tight.

Ignore the stuff you can easily replace; it's not worth looking after. Carrying around 2 TiB of films and crap you can't possibly maintain is a killer for all backup solutions.

The rest of it, a mere 8 GB for me, I keep as follows:

1. A copy on my laptop (Samsung 850 Pro disk, encrypted)
2. A copy in Google Drive (paid-up Google Apps account, which has different T&Cs to the free ones), synced automatically.
3. A copy on a high-quality Corsair Survivor encrypted USB stick that comes with me everywhere. I use Beyond Compare to manually move files to this. No automation is allowed, as this lets me see any disparity between (2) and this drive (for example, if Google Drive were compromised). This stick is replaced once a year as well.
4. A copy on Amazon S3, which once a month is automatically encrypted with 7-Zip and uploaded with http://www.s3express.com/ - these are kept for two years.

Once every 3 months I do a bitwise compare with Beyond Compare between a downloaded archive from S3, my Google Drive and the USB stick, to make sure there is no bitrot or corruption.
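
For reference, the monthly job in (4) boils down to something like the following (shown here with the stock AWS CLI rather than s3express, and with the bucket name and passphrase handling purely illustrative):

Code:
# Build a passphrase-encrypted archive; -mhe=on also encrypts the file names
STAMP=$(date +%Y-%m)
7z a -p"$BACKUP_PASSPHRASE" -mhe=on "backup-$STAMP.7z" ~/important

# Push it to S3; a lifecycle rule on the bucket can expire copies after two years
aws s3 cp "backup-$STAMP.7z" "s3://my-backup-bucket/$STAMP/"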
 

Offline bitslice

  • Frequent Contributor
  • **
  • Posts: 493
  • Country: gb
Quote from: ez24
But I want to use USB drives so I can swap them out as they get full. So I am looking for something that can receive a wifi signal and connect to a USB drive that can be used as a backup. Not a router or modem; an independent wifi device that can connect to a USB drive.

Your exact requirements appear to match this:

Wireless Flash Drive
https://www.amazon.com/SanDisk-Wireless-Smartphones-Computers-SDWS4-064G-G46/dp/B00ZCFYF2W/

There are a few versions of this idea.


A slight deviation is an SSD in a USB drive case, attached to a USB Homeplug; that gives you a data link over the mains wiring.


I wouldn't do any of that, though, because it sounds awful; I'd build a sealed-unit NAS in the shed and link it up with a fibre optic transceiver.
No ground loops, the option to RAID, no crappy wi-fi.

Add an online UPS with a CVT and it's darn near bomb proof.

« Last Edit: August 16, 2016, 07:19:55 pm by bitslice »
 

Offline ez24Topic starter

  • Super Contributor
  • ***
  • Posts: 3082
  • Country: us
  • L.D.A.
After I posted this, I found this

https://www.amazon.com/gp/product/B00J3HMY1E/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A2LYX7FKX6AP16

I have not ordered it yet, but it looks like it is exactly what I want.  I will be able to run an extension cord outside (the farthest my wifi will go) and put this and a 2 TB portable drive in some sort of weatherproof box.  This way I will have a partial backup (the weekly one) outside the house.

I think I can use SyncToy to do routine backups to two different USB drives.
YouTube and Website Electronic Resources ------>  https://www.eevblog.com/forum/other-blog-specific/a/msg1341166/#msg1341166
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12298
  • Country: au
Looks good.  Inexpensive, too.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12298
  • Country: au
Quote from: CJay
So, I would highly recommend to anyone who is creating a backup/disaster recovery plan that they include a periodic restore of the backed-up data and a sanity check of the restored data, which means actually examining the files you've restored.

Having worked at more than one financial institution in my time, I have been exposed to backup issues and, more particularly, Disaster Recovery Planning.

What CJay has posted is a basic outline of the greater subject of backups - with the ominous but vital question that everyone who does backups really should address, and it goes something like this:

Will your backups allow you to recover the way you expect them to?

If you don't have an idea of how you 'expect' them to work, then you have a problem from the get-go.


There are all sorts of situations where recovery from backups will be needed, from accidental deletion of a file to the incineration of a building.  You need to assess what possibilities are relevant to your situation and plan accordingly.

However, once you have these things in place - how do you know they are going to work?

The only answer is to do a trial recovery.  Yes, it's a pain ... it will take time and be inconvenient, but until you've done it, you cannot KNOW it will be successful.


The other thing is .... you mustn't cheat, especially if you are looking at a full recovery from a disaster event.

First step is to quarantine the primary site.  You are not allowed to go in to grab any software install disks or serial keys and you can't call someone to look anything up for you.  (I once worked for a company that did a recovery exercise and even told the guy who knew all the nitty gritty to stay at home that day and not answer his phone.)

Find another location to set up.  Then, go out and buy a new box together with the necessary hardware, grab your off-site backups and see if you can get the system running as required.

This is where you will find out if you have really backed up properly and that the backups are readable....

Did you miss a software patch?  ... a little text file or spreadsheet?  ... that post-it on the filing cabinet with your access details to a supplier?


If you don't do something to verify that you can actually complete a recovery then you are driving down the information highway without brakes.
« Last Edit: August 17, 2016, 02:55:56 am by Brumby »
 

Offline Mr.B

  • Supporter
  • ****
  • Posts: 1237
  • Country: nz
If you don't do something to verify that you can actually complete a recovery then you are driving down the information highway without brakes.
^This.

I am the IT Manager at a reasonably large NZ private company.
Because doing what Brumby has said is such a massive job for a company our size, we only do the full Disaster Recovery exercise every two years.
We do test restores of backups regularly on-site.
We hire 8 servers, 8 desktops, 8 laptops, storage and switching. (A fraction of our actual infrastructure.)
Then we proceed to "rebuild the entire company" from our off site backups.
It can be a real challenge, but it proves it can be done.
Having done this now for 14 years we have a very comprehensive and accurate "Restore Manual".
This means that any reasonably accomplished network engineer could be brought in to rebuild the company if my staff and I were not available to do so.
I approach the thinking of all of my posts using AI in the first instance. (Awkward Irregularity)
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12298
  • Country: au
Yes, it is a significant effort - and how frequently it is done is one of the factors in the whole exercise.  Every two years is not unreasonable, especially if your platform is reasonably stable.  It is infinitely better than 'never'.

Having done this now for 14 years we have a very comprehensive and accurate "Restore Manual".
Perhaps THE most important asset that is produced from the exercise - and this is why....
Quote from: Mr.B
This means that any reasonably accomplished network engineer could be brought in to rebuild the company if my staff and I were not available to do so.
 

Offline C

  • Super Contributor
  • ***
  • Posts: 1346
  • Country: us

Back up a step: you are creating all these backups, good, but what has changed on an hourly, daily, weekly or monthly basis?

A good backup is great for getting you running again after a problem, but not great for archive storage.
Most people back up everything so that they do not miss something.
So for one file that has not changed you have many copies, while the file that changes a lot could have zero copies of a particular version.
Then ask yourself: which of the many copies are good?

I think you should check out file systems that support "copy on write".
At the low level, each time you go to write a sector you use a new blank sector to store the data, so the huge file that has just one bit changed between versions uses only one more sector of storage space.
Some use snapshots of the file system to give you a look back in time.

https://en.wikipedia.org/wiki/Copy-on-write

https://en.wikipedia.org/wiki/Silent_data_corruption

https://en.wikipedia.org/wiki/ZFS

If you are a Windows user, FreeNAS (which uses ZFS) just looks like a network drive.
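
As a concrete sketch of that look back in time, on any ZFS system (the dataset name here is made up):

Code:
# Snapshots are near-instant and only consume space as blocks are rewritten
zfs snapshot tank/data@monday

# Every snapshot stays browsable as a read-only directory tree
ls /tank/data/.zfs/snapshot/monday/

# Recovering one file is just a copy out of the snapshot
cp /tank/data/.zfs/snapshot/monday/report.doc /tank/data/report.doc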
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12298
  • Country: au
The scope of scenarios is huge and the options for backup are many.  Deciding which is best for any given situation cannot be answered in a forum like this.

Even those of us who have been through the wringer can only offer suggestions...


Most people back up everything so that they do not miss something.
This assumes that everything you need for a successful recovery can be backed up.

I am being a little pedantic here, since there is pretty much always a way to have a backup of things - but people all too often limit their thinking to file copying from a HDD.  There is more - which I mentioned above.  Backup copies of software installation discs and serial keys are just a couple of examples.
 

Offline Mr.B

  • Supporter
  • ****
  • Posts: 1237
  • Country: nz
At work we use a pretty expensive suite of three key applications to achieve the backup and archive requirements we have.
Under legislation we have to keep everything for 10 years - even the bloody spam email... go figure...

Of course our work solution is impractical and too expensive for the average home user.
At home I use Veeam B&R free edition and FreeNAS,  along with encrypted cloud storage and off-site SSD and spinning storage.
All very affordable, but some may consider it overkill...
However, you have to ask yourself: can I ever replace that photo of the wedding, baby, happy time, <insert valuable experience here>?

Because I am so anal about disaster recovery, I do a similar, scaled-down exercise at home once a year.
I approach the thinking of all of my posts using AI in the first instance. (Awkward Irregularity)
 

Offline C

  • Super Contributor
  • ***
  • Posts: 1346
  • Country: us
EZ24
Think of a database file.
It starts at 1000 sectors in size.
Each day 10 sectors get deleted.
Each day 110 sectors get added.
Your file is growing by 100 sectors a day.
After 10 days the file is 2000 sectors.
Your 10 daily backups, plus the initial one, would have used 16,500 sectors for the data part of this one file.
By telling ZFS to take a snapshot each day, which is very fast, only 2,210 sectors are used for the data part of this one file. If the file did not change, then 1,000 sectors would be used.
The directory part of this, which is small, could be a little more on ZFS.
You still have access to all 10 file versions.
At the end of a year:
file size = 37,500 sectors
backup    = 7,045,500 sectors
ZFS       = 41,260 sectors
This is just the data, so the real difference is not quite this extreme. ZFS also uses checksums to verify that the data and directory sectors are correct.

Now think of the delete, change or encrypt (ransomware) problem. Unless the offending program is running on the computer doing ZFS, this is just a change in what the current file looks like. The snapshots save you again, and in case this fails, the off-line ZFS backup is there.

Note that FreeNAS uses ZFS.

rsync
https://en.wikipedia.org/wiki/Rsync
rsync is a utility that keeps copies of a file on two computer systems in sync. After 10 days of running rsync each day, only 2,110 sectors plus some overhead would go across the network between the two computers for the above file.
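
rsync can also keep dated, browsable copies cheaply by hard-linking unchanged files against the previous run. A minimal sketch, with the paths made up for illustration:

Code:
# Each day's tree shares unchanged files with yesterday's via hard links,
# so a day costs roughly only the changed data plus directory entries
TODAY=$(date +%F)
rsync -a --link-dest=/backups/latest /data/ "/backups/$TODAY/"

# Point "latest" at the new tree for tomorrow's run
ln -sfn "/backups/$TODAY" /backups/latest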

Small changes can make a huge difference.
 

Offline Kilrah

  • Supporter
  • ****
  • Posts: 1852
  • Country: ch
I have a big 6TB hard drive in my PC, and 2 more for backups. One of these is stored offsite (work) and swapped with the onsite one every week or so. When I want to run a backup I just put the drive in a USB3 cradle, and run SyncBack SE to analyze both the internal drive and the backup one, and update the backup one that's treated as a mirror copy.

Quick, easy, safe enough, no online crap, and the backups are usable as-is, i.e. no imaging software or anything: I can just pop any of them in the cradle and access files. That is handy when I'm travelling and know I'll need all my data: I take my laptop, the cradle and one of the backup drives, work on it as I would at home, then when I come back I reverse the mirror direction to carry my changes back onto the "original".
 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 4136
  • Country: gb
Quote from: Mr.B
We do test restores of backups regularly on-site. We only do the full Disaster Recovery every two years. [...] This means that any reasonably accomplished network engineer could be brought in to rebuild the company if my staff and I were not available to do so.

As long as your infrastructure and ecosystem aren't changing more often than your DR test frequency, you're on a winner. We tend to advise people to test once a year or more often, and we offer contracts with one yearly DR test as standard. Companies where the infrastructure changes that radically and that frequently, or where legislation mandates they test more often, are generally rich enough to afford their own DR hot/warm/cold sites.
 

Offline Galenbo

  • Super Contributor
  • ***
  • Posts: 1469
  • Country: be
I see 2 major classes of "backup":

1) the connected, automatic backup, with hardware at the same address.
2) the manually plugged-in and started, with multiple unconnected hardware stored in a safe somewhere else.

I see
1) as a way to quickly restore a system and be live again, and I see
2) as the solution for files.

I would only take
1) seriously in environments where workers and customers would be blocked, which would cause a huge loss, and
2) seriously in all circumstances, protected against theft, fire, flood, rust and aging.
If you try and take a cat apart to see how it works, the first thing you have on your hands is a nonworking cat.
 

Offline suicidaleggroll

  • Super Contributor
  • ***
  • Posts: 1453
  • Country: us
I see 2 major classes of "backup":

1) the connected, automatic backup, with hardware at the same address.
2) the manually plugged-in and started, with multiple unconnected hardware stored in a safe somewhere else.

Why not a hybrid?  Stick a Raspberry Pi or similar computer with an external drive at a friend or relative's house, plugged into their network, and sync your local backup to that system automatically.  It's online, automatic, and remote.  This is stupid easy to do with rsync and an SSH tunnel if you run Linux or Mac, or even with Cygwin on Windows.
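
A minimal sketch of that setup, assuming the remote Pi is reachable as pi@friends-pi on a forwarded port (host, port and paths are all illustrative):

Code:
# One-time: key-based login so the job can run unattended
ssh-keygen -t rsa -b 4096
ssh-copy-id -p 2222 pi@friends-pi

# Nightly at 03:00 (add via crontab -e): push only changed files, mirror deletions
0 3 * * * rsync -az --delete -e "ssh -p 2222" /data/backup/ pi@friends-pi:/mnt/usbdrive/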
« Last Edit: August 17, 2016, 06:22:30 pm by suicidaleggroll »
 

Offline Mr.B

  • Supporter
  • ****
  • Posts: 1237
  • Country: nz
Quote from: CJay
As long as your infrastructure and ecosystem aren't changing more often than your DR test frequency, you're on a winner. [...]

Our infrastructure changes are relatively slow moving.
Company expansion is generally the only change.
We also have the luxury of a hot DR site 30km away from our production server room.
The testing every two years is to ensure we can stand up the entire company in someone else's server room should there ever be a region-wide disaster in Nelson.
I approach the thinking of all of my posts using AI in the first instance. (Awkward Irregularity)
 

Offline eugenenine

  • Frequent Contributor
  • **
  • Posts: 865
  • Country: us
How much data could you really not live without?  I fought SSDs for a long time because I knew I could never afford a 1 TB SSD in my laptop, but that started limiting my choice of hardware.  So I bought a 240 GB SSD as a test and hung the 1 TB off a Raspberry Pi to replace my old server.  Then I started bringing back any files I needed.  At first it was a lot, but over time fewer and fewer, and now I have just a few GB of actual used data; the rest is all archive stuff.  I then grabbed a second Raspberry Pi and set up ownCloud.  Now those few GB of important data are synced through my own "cloud" and on to my phone and tablet, so they're synced when I'm within range of my wifi and are an offsite backup when I'm not.
 
 

Offline ez24Topic starter

  • Super Contributor
  • ***
  • Posts: 3082
  • Country: us
  • L.D.A.
How much data could you really not live without?   

14 TB x 5 (backup sets) = 70 TB, so it gets expensive, but the growth has slowed and therefore so have the costs.  Currently 4 TB external drives are my favorite.  I have a set in front of the house, one inside, one in the garage, one in the backyard, and one about 20 miles away (no good for nukes).  I think it takes me about 6 months to refresh this set every two years, by low-level formatting and then restoring with testing.

Now I only create about 2 GB every six months per set.

I just ordered this

https://www.amazon.com/gp/product/B00J3HMY1E/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

and will hook it up to a 2 TB portable USB drive outside on the front porch.  I hope I can use SyncToy on my computer and reach it over wifi (I have a wifi camera there) and back up once a day.  That way if there is a fire, someone from the street could see the fire.  I also have one 14 TB set in ammo cans on the front porch.  So I will have a current set outside of the house.

I have had bad backup copies in the past (and I have no idea when or how it happened); the files looked OK using Explorer.  So I now use TeraCopy (with the test option) to copy from one drive to another.  I use SyncToy to back up my computer with the "no deletions" option.  SyncToy just copies files; I think it is just an automated xcopy, so no weird files or software.
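
For anyone wanting an extra cross-check against that kind of silent corruption, a generic sketch (under Linux or Cygwin; not specific to SyncToy or TeraCopy) is to keep a checksum manifest on each backup drive and re-verify it later:

Code:
# When writing the backup: record a checksum for every file on the drive
cd /mnt/backupdrive
find . -type f ! -name manifest.sha256 -exec sha256sum {} + > manifest.sha256

# Months later: re-verify; only files that fail the check are listed
sha256sum -c --quiet manifest.sha256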

whew  :phew:
YouTube and Website Electronic Resources ------>  https://www.eevblog.com/forum/other-blog-specific/a/msg1341166/#msg1341166
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12298
  • Country: au
I have had bad backup copies in the past (and I have no idea when or how it happened)

This is the danger with untested backups.  (Tested - not just verified.)
 

