Author Topic: Backup strategies  (Read 13163 times)


Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3958
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Re: Backup strategies
« Reply #25 on: August 05, 2015, 03:31:46 am »
Quote
Also, why use FAT32? It's insecure and prone to corruption. With FAT32 your system lacks one of the most basic and important security systems - per user file permissions. That's one reason why operating systems newer than XP won't install on it.

Ha ha.. that reasoning always makes me laugh. "Per user file permissions" - on a machine that has only ONE user - me. Imo the entire structure of multi-user security on machines that are for personal use, is a massive pile of pointless overcomplexity and annoyance, which also forces combinatorial complexity contamination and obfuscation into all the rest of the OS. As far as I'm concerned, my machines are an extension of my own mind. Everything in them is mine, and should be as readily available to me as my own memories and thoughts. Things like 'Windows (system) file protection' and all other per-user protections and customizations are something I have to go to extra trouble to disable as much as possible, on every Windows install I do.
With every successive Windows version such unacceptable bullshit gets harder (to impossible) to remove, which is one (of many) reasons I won't follow MS down that path.

For instance, the NTFS 'protected system folders' are insufferable to me. But that's just one small detail... I'll spare you a lengthy anti-NTFS rant.

Darn it.. I wanted to link a really good article about file system abstraction and hiding in MS operating systems, but I can't find it just now.

Also ALL file systems are susceptible to corruption (spare me any claims to the contrary), and although this can be made less so, the overheads required are imo not worth the trouble. Especially since the #1 cause of file system corruption (in personal computers) is unexpected power down during writes, and this risk could be completely eliminated by simple hardware-software measures such as an adequate PowerFailWarn interrupt, slightly longer and guaranteed power supply hold-up times, and software that took care never to begin critical tasks it can't guarantee to complete within the known power-good window. (This is not opinion, this is experience of having designed embedded processor systems that handled LOTS of money, and were utterly impervious to data loss due to people playing drum tunes on the mains power switch. They would do it deliberately, hoping glitches might benefit them financially.)
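The software side of that is nothing exotic. A rough sketch of the idea (all names and numbers below are invented, not from any real product):
Code: [Select]
/* Sketch of the idea: a PowerFailWarn interrupt sets a flag, and a critical
 * write is only started if it is guaranteed to finish inside the known
 * supply hold-up window. All names and numbers are invented. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define HOLDUP_MS            20u   /* guaranteed hold-up time after PowerFailWarn */
#define WORST_CASE_WRITE_MS   8u   /* measured worst case for one critical write */

static volatile bool power_fail_pending = false;

/* Wired to the PowerFailWarn line on real hardware. */
void powerfail_isr(void)
{
    power_fail_pending = true;
}

/* Stand-in for the low-level storage driver. */
static void write_sector(const uint8_t *buf, uint32_t len)
{
    (void)buf; (void)len;
}

/* Start a critical write only if it cannot be cut short by a power loss. */
static bool try_critical_write(const uint8_t *buf, uint32_t len)
{
    if (WORST_CASE_WRITE_MS > HOLDUP_MS)
        return false;              /* this write could never be made safe */
    if (power_fail_pending)
        return false;              /* supply is already going away: refuse */

    /* Started while power is good, and short enough to complete inside the
     * hold-up window even if the warning fires right now. */
    write_sector(buf, len);
    return true;
}

int main(void)
{
    uint8_t sector[512] = {0};
    printf("write %s\n", try_critical_write(sector, sizeof sector) ? "done" : "refused");
    return 0;
}
The point is that the decision to start the write is made against a known, guaranteed hold-up budget, not against hope.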

But do we see _that_ cheap & easy improvement in the PC architecture? Noooooo.

At least with FAT32 the structure is well known, relatively simple and can be hand patched in an emergency. Not so NTFS.
Yes, I have to live with NTFS for large drives. But I consider this a temporary compromise.
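To show what I mean by 'well known and relatively simple': the boot-sector fields you need before hand-patching can be pulled out with a few lines of C. A sketch only - the offsets come from the published FAT32 spec, and the file name is just an example:
Code: [Select]
/* Dump the FAT32 boot-sector fields you need before hand-patching.
 * Offsets are from the published FAT32 spec; "fat32.img" is just an
 * example name for an image or raw device. */
#include <stdio.h>
#include <stdint.h>

static unsigned le(const uint8_t *p, int n)    /* read a little-endian field */
{
    unsigned v = 0;
    for (int i = n - 1; i >= 0; i--)
        v = (v << 8) | p[i];
    return v;
}

int main(void)
{
    uint8_t bs[512];
    FILE *f = fopen("fat32.img", "rb");
    if (!f || fread(bs, 1, sizeof bs, f) != sizeof bs) {
        perror("boot sector");
        return 1;
    }
    fclose(f);

    printf("bytes/sector     : %u\n", le(bs + 0x0B, 2));
    printf("sectors/cluster  : %u\n", le(bs + 0x0D, 1));
    printf("reserved sectors : %u\n", le(bs + 0x0E, 2));
    printf("number of FATs   : %u\n", le(bs + 0x10, 1));
    printf("sectors per FAT  : %u\n", le(bs + 0x24, 4));
    printf("root dir cluster : %u\n", le(bs + 0x2C, 4));
    printf("signature        : 0x%04X (expect 0xAA55)\n", le(bs + 0x1FE, 2));
    return 0;
}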

« Last Edit: August 05, 2015, 04:54:45 am by TerraHertz »
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
Re: Backup strategies
« Reply #26 on: August 05, 2015, 10:34:50 am »
Quote
Quote
Also, why use FAT32? It's insecure and prone to corruption. With FAT32 your system lacks one of the most basic and important security systems - per user file permissions. That's one reason why operating systems newer than XP won't install on it.

Ha ha.. that reasoning always makes me laugh. "Per user file permissions" - on a machine that has only ONE user - me. Imo the entire structure of multi-user security on machines that are for personal use, is a massive pile of pointless overcomplexity and annoyance, which also forces combinatorial complexity contamination and obfuscation into all the rest of the OS. As far as I'm concerned, my machines are an extension of my own mind. Everything in them is mine, and should be as readily available to me as my own memories and thoughts.

This way of thinking is a malware writer's wettest & wildest dream. If you use a PC as a root/administrator user to browse the internet then it is very likely infected. Using a computer as a user with limited access is very effective in keeping malware and viruses out.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline dfmischler

  • Frequent Contributor
  • **
  • Posts: 548
  • Country: us
Re: Backup strategies
« Reply #27 on: August 05, 2015, 11:04:20 am »
On the disk/tape issue: at this time, due to the cost of worthwhile tape drives, tape is only cost effective when you are rotating between a large number of tapes.  It may still be the only cost-effective backup option for dealing with dozens of terabytes or more.

I guess what I don't like hearing is the idea that it is OK to keep a USB 3.0 drive plugged in for a month and keep running incremental backups to it during that period. IMO, a drive that is plugged in and running is too vulnerable to depend on for that long. Yes, the likelihood of a building fire or lightning strike is low, but the cost to your organization of recovering from one on the 29th day of using that cinder-that-used-to-be-a-USB-drive could be astronomical. It is also good to realize that a drive on a shelf someplace can't be so easily damaged by malware or user errors.

How much work (i.e. new data) are you willing to risk losing? If the answer is one day (or 2 days, or a week) then that backup disk/tape should get pulled and taken off site every day (or 2 days, or a week). Of course, you need to make another disk available for the next backup at that time. How many disks/tapes do you need? That depends. Here are some well thought out ideas on that: Common Backup Strategies.
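As a rough sketch of that trade-off (the numbers are placeholders, not a recommendation):
Code: [Select]
/* Back-of-envelope for a simple rotation: one disk/tape is pulled off site
 * every R days and the oldest one comes back to be reused. Numbers are
 * placeholders. */
#include <stdio.h>

int main(void)
{
    int rotation_days = 1;   /* R: how often a disk/tape is pulled off site */
    int pool_size     = 3;   /* total disks/tapes in the rotation */

    /* If the building burns just before the next pull, everything written
     * since the last pull is gone. */
    printf("new work at risk     : up to %d day(s)\n", rotation_days);

    /* At steady state one disk is on site being written, so the off-site
     * copies are aged R, 2R, ... (pool_size - 1) * R days. */
    printf("oldest off-site copy : %d day(s) old\n",
           rotation_days * (pool_size - 1));
    return 0;
}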
« Last Edit: August 05, 2015, 12:42:40 pm by dfmischler »
 

Offline george graves

  • Super Contributor
  • ***
  • Posts: 1257
  • Country: us
Re: Backup strategies
« Reply #28 on: August 05, 2015, 12:19:51 pm »
I only ever trust cloud providers with data that I would be happy for anyone to view....I assume someone is looking at it.

I'm not pointing fingers at you....just in general.  So take it with a grain of salt. But...............

I love how people think they are interesting enough to be "spied" on. It's the same thing as guys who are obsessed with destroying hard drives.



A.) Yes, your mom might find your porn stash.
B.) No, the NSA couldn't care less about you.
C.) And... again no, you're not really interesting - yes, I know you googled for "how much plutonium for an atomic bomb" - so did a million other people.
D.) You don't have secrets. Get over yourself.

« Last Edit: August 05, 2015, 12:21:54 pm by george graves »
 

Offline EEVblog

  • Administrator
  • *****
  • Posts: 37740
  • Country: au
    • EEVblog
Re: Backup strategies
« Reply #29 on: August 05, 2015, 12:30:30 pm »
Quote
How fast is your internet connection? At 5Mb/sec you would need 37 days to upload 2TB, so at the rate you produce videos it would be no problem at all to back up to the cloud overnight if you could reach that kind of speed.

You obviously don't live in Australia. Most of us don't have unlimited bandwidth, and uploads count.
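For what it's worth, the 37-day figure in the quote does check out; the quick back-of-envelope below assumes a sustained 5 Mbit/s with no overhead:
Code: [Select]
/* 2 TB at a sustained 5 Mbit/s, ignoring protocol overhead. */
#include <stdio.h>

int main(void)
{
    double bytes        = 2e12;   /* 2 TB */
    double bits_per_sec = 5e6;    /* 5 Mbit/s */
    double seconds      = bytes * 8.0 / bits_per_sec;
    printf("%.1f days\n", seconds / 86400.0);   /* ~37 days */
    return 0;
}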
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3958
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Re: Backup strategies
« Reply #30 on: August 05, 2015, 01:16:07 pm »
This demonstrates a fundamental lack of understanding of how computer security works. [snip a lot of superficial lecturing]
It's great the way you assume I don't know any of that basic stuff, just because I come to a different conclusion from the one you are sure must be correct. Anyway, I can't be bothered explaining it to you.

Quote
Quote
As far as I'm concerned, my machines are an extension of my own mind.

And you think it's okay to let any random bit of software read or write anything in your mind... Okay. Good luck with that.

Bzzzt. Basic logic failure, symptomatic of your approach to many issues. Did I say it does or should work both ways? No, I didn't. Nor did I suggest allowing random software to write anywhere in the filesystem. You just assume there's only one possible protection model, and it's based on user permissions. You're wrong.
However, the question of what external information you allow to influence your own understanding can actually be a useful analogy for computing systems security. You should think about it.

Quote
Quote
Things like 'Windows (system) file protection' and all other per-user protections and customizations are something I have to go to extra trouble to disable as much as possible, on every Windows install I do.

 :palm:
Only because you can't imagine any other security method.

Quote
Can you name one other modern operating system that doesn't go down this route of making things more secure? Android, OS X, iOS, Firefox OS, Linux and the various BSDs all do it. If it was such a terrible idea you would think that at least one major OS would offer an alternative.

Argument from majority view, aka appeal to popularity, is invalid reasoning. 'They all do this, so it must be the right and only way'.
And it's working soooo well, isn't it? All these heavily secured systems are totally immune to all kinds of hostile code, right?

Quote
Quote
For instance, the NTFS 'protected system folders' are insufferable to me.
Indeed, why would anyone want to suffer from having their system folders protected.

Indeed, why would anyone want to even be able to see or touch _any_ of the system at all? Why not just make it all totally invisible and untouchable? Then we can all just assume it runs by magic, and all that magic is entirely benevolent and totally acting in our interests, doesn't spy or anything like that, and couldn't have been done better by any human.
(To answer that absurdity, I point in the direction of Windows 8, and the soon to be equally or even more loathed Windows 10.)

Quote
Quote
Darn it.. I wanted to link a really good article about file system abstraction and hiding in MS operating systems, but I can't find it just now.

I've probably read it. FS and registry abstraction was one of the biggest improvements in Vista. Microsoft finally stopped applications being able to shit all over the filesystem any time they wanted to. That alone killed a lot of malware and crapware in an instant.

Wow. So you know about that, and actually think it's a good idea. I'm stunned. Are you sure you are human?

Quote
Quote
Also ALL file systems are susceptible to corruption (spare me any claims to the contrary), and although this can be made less so, the overheads required are imo not worth the trouble. Especially since the #1 cause of file system corruption (in personal computers) is unexpected power down during writes, and this risk could be completely eliminated by simple hardware-software measures such as an adequate PowerFailWarn interrupt, slightly longer and guaranteed power supply hold-up times, and software that took care never to begin  critical tasks it can't guarantee to complete within the known power-good window.

Again, every single modern filesystem uses journaling and other checks to prevent corruption. The overhead is minimal, and it protects against the scenario you describe.
Except when it doesn't. Please don't try and pretend it's impossible for journalled filesystems to get in a snit.
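And for the record, I know perfectly well what journaling buys you. The usual write-ahead scheme boils down to something like the sketch below (file name and record format invented; real filesystems do this at block level with ordering barriers and checksums):
Code: [Select]
/* Minimal write-ahead journal sketch. The change only becomes "real" once
 * COMMIT reaches stable storage, so a half-finished update is rolled back
 * (or replayed) on the next mount. */
#include <stdio.h>

static int flush_to_disk(FILE *f)
{
    /* Stand-in: real code would also fsync(fileno(f)) or equivalent. */
    return fflush(f);
}

static int journalled_update(const char *journal_path, const char *record)
{
    FILE *j = fopen(journal_path, "ab");
    if (!j)
        return -1;

    /* 1. Log the intent and force it to stable storage first. */
    fprintf(j, "BEGIN %s\n", record);
    if (flush_to_disk(j) != 0) { fclose(j); return -1; }

    /* 2. Apply the actual change to the data structures here. If power
     *    dies now, the next mount sees BEGIN without COMMIT and undoes
     *    (or replays) the half-finished change. */

    /* 3. Only once the data is safely down, write the commit record. */
    fprintf(j, "COMMIT %s\n", record);
    if (flush_to_disk(j) != 0) { fclose(j); return -1; }

    fclose(j);
    return 0;
}

int main(void)
{
    return journalled_update("journal.log", "rename A -> B") == 0 ? 0 : 1;
}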

Quote
Quote
But do we see _that_ cheap & easy improvement in the PC architecture? Noooooo.

Perhaps because it is neither easy nor cheap.
Except it is easy and cheap. The sole reason it isn't done is that MS-Intel have no actual interest in making PC systems more robust, or capable of controlling real-world machinery that requires guaranteed reliability. (Btw, if they had such an interest, there wouldn't be any such thing as the Registry.)

Quote
Quote
At least with FAT32 the structure is well known, relatively simple and can be hand patched in an emergency. Not so NTFS.
NTFS was released in 1993. It is 22 years old.

Remind me again how long it took the Linux guys to reverse engineer NTFS enough to produce reliable NTFS drivers that could actually write to an NTFS partition without randomly trashing it. Oh wait, never mind, I think it was about 2007. Wasn't it nice of MS to not publish the specs? (Did they ever?)
« Last Edit: August 05, 2015, 01:17:51 pm by TerraHertz »
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16863
  • Country: lv
Re: Backup strategies
« Reply #31 on: August 05, 2015, 01:48:07 pm »
Ha ha.. that reasoning always makes me laugh. "Per user file permissions" - on a machine that has only ONE user - me.

This demonstrates a fundamental lack of understanding of how computer security works.
...

Quote
At least with FAT32 the structure is well known, relatively simple and can be hand patched in an emergency. Not so NTFS.

NTFS was released in 1993. It is 22 years old.
+1 :-DD
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16863
  • Country: lv
Re: Backup strategies
« Reply #32 on: August 05, 2015, 02:59:30 pm »
Quote
Indeed, why would anyone want to even be able to see or touch _any_ of the system at all? Why not just make it all totally invisible and untouchable?
Because 99% of people have no business in there, and those who really do have something to do there can do it anyway.
Quote
Then we can all just assume it runs by magic, and all that magic is entirely benevolent and totally acting in our interests, doesn't spy or anything like that, and couldn't have been done better by any human.
As if being able to tinker with system files without any limitation would give you any clue about whether it is spying on you; and if you have the right knowledge to actually check that, those limitations would hardly be an obstacle for you anyway.
 

Offline lewis

  • Frequent Contributor
  • **
  • Posts: 704
  • Country: gb
  • Nullius in verba
Re: Backup strategies
« Reply #33 on: August 05, 2015, 03:19:19 pm »
Quote
You don't have secrets. Get over yourself.

Do you have frosted glass on your bathroom window?
I will not be pushed, filed, stamped, indexed, briefed, debriefed or numbered.
 

Offline rx8pilot

  • Super Contributor
  • ***
  • Posts: 3634
  • Country: us
  • If you want more money, be more valuable.
Re: Backup strategies
« Reply #34 on: August 05, 2015, 06:22:31 pm »
Quote
Depends if you need to back up video or not.
For my needs as a video content producer, the only cost-effective and suitable backup and archive medium is hard drives.
Cloud doesn't work for massive amounts of video files.
I keep two hard drives (different brands) with the same data.
Just the raw video for episodes 500-715 fills a 2TB drive, and that requirement always increases as I shoot at ever higher bitrates.
For all my photos, documents, design files and other stuff, my 1TB Dropbox is perfect, so it's backed up in the cloud and on several local machines that are synced.

I have been challenged with massive data backup/transportation for many years in the entertainment world. I am not doing it anymore, but some of the cameras I have used can be north of 56MB per frame - 8K full RGB. 2-3 cameras rolling about an hour a day at 24-60fps adds up to several terabytes per day of shooting. The only practical option was LTO tape (at least as of a few years ago). The tape is VERY stable and can be stored for decades. Anything that is recorded to disk or flash is copied as fast as possible to a RAID and then off to tapes.
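Roughly, taking the low end of those figures (back-of-envelope only, using the numbers quoted above):
Code: [Select]
/* Rough daily volume from the figures above, taking the low end of the
 * quoted ranges. */
#include <stdio.h>

int main(void)
{
    double mb_per_frame = 56.0;   /* ~56 MB per frame, 8K full RGB */
    double fps          = 24.0;   /* low end of 24-60 fps */
    double hours        = 1.0;    /* about an hour of rolling per day */
    double cameras      = 2.0;    /* low end of 2-3 cameras */

    double tb_per_day = mb_per_frame * fps * 3600.0 * hours * cameras / 1e6;
    printf("~%.1f TB per shooting day\n", tb_per_day);   /* ~9.7 TB */
    return 0;
}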

I would be careful about considering Dropbox-synced PCs as a backup, since the Dropbox system has delete capability on all of them. A glitch in the database could trigger all your 'backups' to be deleted on the local machines as well as the server. I have not had this problem with Dropbox, but I did with a competing service. I mainly use Box now, but it is constantly mirrored to a RAID that I control where nothing can be deleted. So far I have not needed it.


I have all my active local data on a mirrored RAID NAS.

The RAID is mirrored to another RAID that is write-only, using Allway Sync in one-way mode.

The active data is sent to a cloud backup service.

Large data sets that need high-speed access are on a local SSD RAID in the PC that is automatically mirrored to the NAS RAID for local backup and to the cloud for offsite backup.

I set it up to be fully automatic, using multiple systems, methods, and brands. If my cloud provider goes belly up or I suffer multiple HDD failures, I will still be fine.

Also, I have high-end enterprise UPS systems as an extra measure.
Factory400 - the worlds smallest factory. https://www.youtube.com/c/Factory400
 

Offline Mechanical Menace

  • Super Contributor
  • ***
  • Posts: 1288
  • Country: gb
Re: Backup strategies
« Reply #35 on: August 06, 2015, 02:07:20 am »
Quote
Please also explain why every major modern OS does things the Windows way, with file permissions

Just to be pedantic, I'll have to point out that Windows doesn't use any file permissions, it uses ACLs only. And then only since NT/XP. So having permissions for accessing certain files is more the way Windows adopted than the Windows way...
Second sexiest ugly bloke on the forum.
"Don't believe every quote you read on the internet, because I totally didn't say that."
~Albert Einstein
 

