Author Topic: Proof that software as service/cloud based, will never work for long term ...  (Read 100670 times)


Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 6126
  • Country: 00
[...] I don't think that sticking to an obsolete desktop OS is a very viable solution. [...]

Counterexample 1:
Michael Bloomberg became a billionaire by sticking with Fortran for the Bloomberg Terminal, when everyone was saying it was obsolete. It turned out that there were tons of programmers available with Fortran skills, and Bloomberg made use of it...

Counterexample 2:
Owning an older car, paid off, with simpler tech, is many times cheaper than owning a new car, if getting from A to B is what matters to you.

 

Offline madires

  • Super Contributor
  • ***
  • Posts: 7770
  • Country: de
  • A qualified hobbyist ;)
Post mortem of Azure DevOps Outage in South Brazil: https://status.dev.azure.com/_event/392143683/post-mortem

Quote
Hidden within this pull request was a typo bug in the snapshot deletion job which swapped out a call to delete the Azure SQL Database to one that deletes the Azure SQL Server that hosts the database.
 

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 6126
  • Country: 00
oops!
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14489
  • Country: fr
Post mortem of Azure DevOps Outage in South Brazil: https://status.dev.azure.com/_event/392143683/post-mortem

Quote
Hidden within this pull request was a typo bug in the snapshot deletion job which swapped out a call to delete the Azure SQL Database to one that deletes the Azure SQL Server that hosts the database.

Fun stuff!
 

Offline MrMobodies

  • Super Contributor
  • ***
  • Posts: 1912
  • Country: gb
So a driver had headphones on whilst delivering a package, reported a customer (who wasn't even in at the time) for racism, and Amazon suspended his "SMART" home devices until the investigation finished.

https://odysee.com/@rossmanngroup:a/amazon-accuses-customer-of-racism-shuts:2

Louis Rossmann:
7:55 "You don't need a home that connects to other people's servers."

How right he is.

If something like that connects to a server, I want that server to be mine, in my house under my control with no interference from what happens outside or from the manufacturer.
« Last Edit: June 12, 2023, 04:19:37 pm by MrMobodies »
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6723
  • Country: nl
That does mean if you want home automation, you're stuck with hobbyism (Home Assistant, Node-RED, etc.) or expensive systems (Loxone & co).

Apple keeps the data local, but the automation is tied to your Apple ID.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
That does mean if you want home automation, you're stuck with hobbyism (Home Assistant, Node-RED, etc.) or expensive systems (Loxone & co).

Apple keeps the data local, but the automation is tied to your Apple ID.

I've been using Home Assistant for a few years now and at this point it is pretty fantastic. I have a bunch of different brands of hardware, including Hue and no-name smart bulbs, Xiaomi sensors, Sonoff sensors, a mix of Zigbee, WiFi and 433MHz RF stuff, even a Hughes Zigbee doorbell sensor I got really cheaply because their line of home automation stuff was discontinued. It all works seamlessly together, just the way home automation stuff should.

It was a proper pain getting started though, about 50% of the reason being at the time it was still somewhat primitive and way too much stuff had to be set up by editing yaml files rather than the UI, and the other 50% being that the documentation was/is garbage and the community is absolutely worthless and borderline hostile. It seemed like every time I had a question there were either dozens of other people asking the same thing with no answer, or someone on the team wanted to argue with me over why I wanted to do what I was trying to do rather than trying to help me. Their attitude seems to be "read the documentation you idiot" (which often is very vague, confusing or incomplete), or "if you don't like the way it works then buy Apple/Hue/whatever you idiot". Ultimately I completely gave up on the community and found I could learn far more from random youtube home automation geeks who would just show how to do something instead of arguing or acting like someone is a moron for not understanding the vague documentation.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
[...] I don't think that sticking to an obsolete desktop OS is a very viable solution. [...]

Counterexample 1:
Michael Bloomberg became a billionaire by sticking with Fortran for the Bloomberg Terminal, when everyone was saying it was obsolete. It turned out that there were tons of programmers available with Fortran skills, and Bloomberg made use of it...

Counterexample 2:
Owning an older car, paid off, with simpler tech, is many times cheaper than owning a new car, if getting from A to B is what matters to you.

My daily driver is a 33 year old car, it's fantastic, I love it, it's not that I couldn't afford something newer, it's that nobody is making anything anymore that I would want and this old Volvo turbo wagon fits my needs like a glove.

I'm in the process of restoring a 47 year old Amana Radarange microwave oven to replace my circa 2005 GE that is getting a bit tired. 90 pounds of stainless steel, cast aluminum and acres of gleaming chrome, it's a functional work of art. With an original price of $495, equivalent to around $3,000 today, they sure don't build them like that anymore. Sure, it doesn't have all the fancy features that a lot of modern ovens offer, but I pretty much only use it to reheat leftovers and mugs of hot water for making cocoa or tea anyway.

I also have a 1941 Philco console radio in my living room that I restored about 10 years ago and still use frequently. Nothing wrong with old stuff; sometimes newer stuff is better, other times it's just newer. I hate the attitude that newer is automatically better than older.
 

Online PlainName

  • Super Contributor
  • ***
  • Posts: 6848
  • Country: va
Quote
I've been using Home Assistant for a few years now and at this point it is pretty fantastic

I've had it installed for a while but it's remained just a curiosity, the reason being it stops working with some stuff for no apparent reason, then works again. Or I just can't get it to talk to some things. A while back I added a Dahua DVR to the network and the next time I looked at HA there was a screenshot from my CCTV system! Cool! And then I did something (obviously it had to be me, but I don't know what) and now that screenshot just shows some meaningless error instead.

Quote
Ultimately I completely gave up on the community and found I could learn far more from random youtube home automation geeks who would just show how to do something instead of arguing or acting like someone is a moron for not understanding the vague documentation.

And you think relying on that is somehow better than relying on a commercial cloud service? I think you're just accepting it because it's free, but you're just as buggered if it goes wrong and there's no random youtube fix.
 

Offline Infraviolet

  • Super Contributor
  • ***
  • Posts: 1023
  • Country: gb
" Ultimately I completely gave up on the community and found I could learn far more from random youtube home automation geeks"
If those geeks are showing, albeit in a specific "just click the red button, then type in '3'" sort of task-oriented way rather than through full explanation, how to make use of open-source and other locally controlled options... It might not give you the skills to debug it if it goes wrong, but if it works for you in the first place, it is a lot less likely to go wrong at a future time than a commercial cloud-dependent system. Anything you locally control won't be subject to arbitrary updates and other silly changes; if you get it working once, it is the sort of thing you could probably always fix by wiping everything back to its fresh state and doing your initial installation procedure again.
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6723
  • Country: nl
you're just as buggered if it goes wrong and there's no random youtube fix.
If you've got a couple days to waste, there's always source code.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
And you think relying on that is somehow better than relying on a commercial cloud service? I think you're just accepting it because it's free, but you're just as buggered if it goes wrong and there's no random youtube fix.

Yes, it's absolutely better, vastly superior, hands down, no question at all, it's not even close. I have essentially frozen my configuration at this point and I have backups; everything is all dialed in and "just works", and has for a couple of years now. It's been very reliable. If something goes wrong I have a fix; worst case I could go nuclear, wipe the entire system, reinstall and restore my most recent fully working backup.

All this random stuff works together seamlessly rather than having to have a bunch of different ecosystems or getting locked entirely into one company's products. My system is entirely self-contained other than a few integrations that obviously have to rely on external data, such as weather, and with those I can choose from multiple different integrations. I could leave my system running as-is and 10 years later it will mostly still be working, because I'm not forced to update. Every one of the companies that made my stuff could go under and the only thing that would stop working is the Alexa integration, because that relies on Amazon's cloud, and of all the parts of my system that's the only one I've ever had any trouble with.

It's incredibly customizable and I even have some unusual custom hardware, like a couple of vintage elevator hall lanterns with ESP8266-based controllers running Tasmota that I use as general purpose notification devices.
It also controls the irrigation in my garden, taking current and forecast weather conditions into account. It sends notifications to my mobile phone if one of the moisture sensors detects a water leak under a sink (a drain pipe has been knocked loose a couple of times), if a garage door is left open, or if I leave the house with one of the doors or ground-accessible windows open. It reminds me to bring my potted pepper plants inside if the forecast temperature is too low, monitors moisture and tells me when it's time to water my houseplants, turns on lights automatically when I arrive home, turns them off when I leave, wakes me up with a sunrise effect on the bedroom lamp on work days, etc. I have >100 automations and rely on it pretty heavily at this point. I'm not aware of a commercial off-the-shelf system that would do everything I have it doing, and certainly nothing cloud-based that I could rely on long term not to break something; even a temporary outage would be unacceptable, since I need my stuff to just work.
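As a sketch of what one of those automations looks like under the hood, a leak alert in Home Assistant is just a trigger/action block in YAML. The entity and notifier names below are invented for illustration, not taken from the post:

```yaml
# Hypothetical Home Assistant automation: phone notification on a water leak.
# binary_sensor.kitchen_sink_moisture and notify.mobile_app_my_phone are
# placeholder names; yours will depend on your actual devices.
automation:
  - alias: "Water leak under kitchen sink"
    trigger:
      - platform: state
        entity_id: binary_sensor.kitchen_sink_moisture
        to: "on"
    action:
      - service: notify.mobile_app_my_phone
        data:
          title: "Water leak!"
          message: "Moisture detected under the kitchen sink."
```

Nowadays the same automation can be built entirely in the UI; the YAML is just what gets stored behind it.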
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
" Ultimately I completely gave up on the community and found I could learn far more from random youtube home automation geeks"
If those geeks are showing, albeit in a specific "just click the red button, then type in '3'" sort of task-oriented way rather than through full explanation, how to make use of open-source and other locally controlled options... It might not give you the skills to debug it if it goes wrong, but if it works for you in the first place, it is a lot less likely to go wrong at a future time than a commercial cloud-dependent system. Anything you locally control won't be subject to arbitrary updates and other silly changes; if you get it working once, it is the sort of thing you could probably always fix by wiping everything back to its fresh state and doing your initial installation procedure again.

The quality of the various tutorials is highly variable, but even the worst of them are usually better than the pompous and unhelpful assholes in the official community. I learn best by example: show me how to put together a working setup and I'll explore from there and gradually learn how it works. If you're going to just tell me to read the documentation, then give me documentation that is actually something close to complete and that has some working examples I can build off of. At the start, just getting the most basic stuff to work was ridiculously difficult, but by now it has matured to the point where the basic happy path is pretty easy to get going without having to dig into YAML files.

Yes, that is exactly the point: I don't have to know how to debug. It helps, but as long as I've got a backup, the absolute worst case is that I start over fairly easily. Unlike most commercial stuff these days, I can easily roll back to a previous build, any arbitrary build I want. More than once I've done that: installed an update it was bugging me about, found that it broke something I didn't have time to mess with right then, so I rolled it back and everything was working again. They typically release updates several times a month, but I've settled into updating once or twice a year when I have time to tinker. Now that it does pretty much everything I need, I'm not sure I'll bother updating again.
 

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 6126
  • Country: 00
" Ultimately I completely gave up on the community and found I could learn far more from random youtube home automation geeks"
If those geeks are showing, albeit in a specific "just click the red button, then type in '3'" sort of task-oriented way rather than through full explanation, how to make use of open-source and other locally controlled options... It might not give you the skills to debug it if it goes wrong, but if it works for you in the first place, it is a lot less likely to go wrong at a future time than a commercial cloud-dependent system. Anything you locally control won't be subject to arbitrary updates and other silly changes; if you get it working once, it is the sort of thing you could probably always fix by wiping everything back to its fresh state and doing your initial installation procedure again.

The quality of the various tutorials is highly variable, but even the worst of them are usually better than the pompous and unhelpful assholes in the official community. I learn best by example: show me how to put together a working setup and I'll explore from there and gradually learn how it works. If you're going to just tell me to read the documentation, then give me documentation that is actually something close to complete and that has some working examples I can build off of. At the start, just getting the most basic stuff to work was ridiculously difficult, but by now it has matured to the point where the basic happy path is pretty easy to get going without having to dig into YAML files.

Yes, that is exactly the point: I don't have to know how to debug. It helps, but as long as I've got a backup, the absolute worst case is that I start over fairly easily. Unlike most commercial stuff these days, I can easily roll back to a previous build, any arbitrary build I want. More than once I've done that: installed an update it was bugging me about, found that it broke something I didn't have time to mess with right then, so I rolled it back and everything was working again. They typically release updates several times a month, but I've settled into updating once or twice a year when I have time to tinker. Now that it does pretty much everything I need, I'm not sure I'll bother updating again.

That is getting to be a lost art in this era of subscription software (i.e. the build-and-forget closed system). The subscription model forces the supplier to issue regular updates to justify the subscription, which is really just a huge pain in the neck: you incur the risk of problems every time that happens, with very little gain to show for it...
 

Online RoGeorge

  • Super Contributor
  • ***
  • Posts: 6207
  • Country: ro
The only reliable way I know is to keep a virtual machine (VM) for each project. 

Not installer kits, not sources, not Docker; only full, standalone VMs. Those VMs are built from a typical Linux image with all the needed software and build tools installed, never to be connected online, never to be updated, and I never use software that cannot work offline. This is still possible for hobby projects.

It takes about 4-10GB for each VM, but HDD storage is cheap enough, and I don't have that many projects anyway. The total disk space can be reduced a lot by using a file system with deduplication.

For work, use whatever method the company specifies.
 
The following users thanked this post: SilverSolder

Online PlainName

  • Super Contributor
  • ***
  • Posts: 6848
  • Country: va
Quote
It takes about 4-10GB for each VM, but the HDD storage is cheap enough

It ain't once you multiply by the number of backups.

However, the way I partly get around this, which I'm sure is (or should be) common practice, is to have a common base VM and then create clones of that for the real (virtual!) working stuff. A clone can be a few hundred MB if there are few differences from the base.
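The base-plus-clones approach can be sketched with qcow2 backing files; the paths and names below are hypothetical, and other hypervisors (VirtualBox, VMware) have equivalent linked-clone features:

```shell
# Build one base image, install the OS and common tools into it once.
# File names here are illustrative.
qemu-img create -f qcow2 base-debian.qcow2 20G
# ... install OS and build tools into base-debian.qcow2 ...

# Each project gets a small copy-on-write overlay backed by the base:
qemu-img create -f qcow2 -F qcow2 -b base-debian.qcow2 project-a.qcow2
qemu-img create -f qcow2 -F qcow2 -b base-debian.qcow2 project-b.qcow2

# The overlays start tiny and only grow with each project's own changes.
# Caveat: never modify the base image after clones have been created,
# or every clone backed by it is corrupted.
```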
 
The following users thanked this post: SilverSolder

Online RoGeorge

  • Super Contributor
  • ***
  • Posts: 6207
  • Country: ro
For backup I am using Borg, https://www.borgbackup.org/, which does incremental backups and has native deduplication and compression.

Would gladly use something lighter than a full VM install, just that I don't know any other way to have something that will still run a few years later.
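For reference, the basic Borg workflow is only a few commands; the repository path and source directory below are placeholders:

```shell
# One-time: create an encrypted, deduplicating repository.
borg init --encryption=repokey /mnt/nas/borg-repo

# Each backup run; archives after the first are effectively incremental
# because unchanged chunks are deduplicated against the repository.
borg create --stats --compression zstd \
    /mnt/nas/borg-repo::vms-{now:%Y-%m-%d} ~/vms

# Thin out old archives so the repository doesn't grow without bound.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/nas/borg-repo

# Browse or pull single files without extracting a whole archive.
borg mount /mnt/nas/borg-repo::vms-2023-06-24 /mnt/restore
```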
 
The following users thanked this post: MarkL

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14489
  • Country: fr
For backup I am using Borg, https://www.borgbackup.org/, which does incremental backups and has native deduplication and compression.

I might give that a try, there's a package on my distro. Is it fully stable? Reasonably fast?
 

Online PlainName

  • Super Contributor
  • ***
  • Posts: 6848
  • Country: va
For backup I am using Borg, https://www.borgbackup.org/, which does incremental backups and has native deduplication and compression.

Can you have multiple independent destinations?
 

Online RoGeorge

  • Super Contributor
  • ***
  • Posts: 6207
  • Country: ro
I might give that a try, there's a package on my distro. Is it fully stable? Reasonably fast?

I've been using it for about a year now and haven't had any problems so far. The only downside for me is that Borg is a command-line tool, and I do backups only rarely, so I always forget how to back up. I've made some scripts.

There is also a 3rd-party GUI for Borg; I don't recall the name. When I tried it, I noticed the GUI (not Borg itself) was saving a password in clear text, in a text file, so I decided to use Borg only, from the command line, with some personal scripts for my preferred compression.

Borg uses a cache of checksums for deduplication and incremental backups. If you keep that cache locally, future backups will be very fast: the first backup took many hours, but from there on it only takes minutes. The cache is discardable, but if you delete it, it will take a few hours to rebuild at the next backup.

TL;DR: so far it has worked without hassle for me, keeping backups of a few TB from Linux to an external NAS. I don't have much experience with other backup solutions, so I cannot say how good it is by comparison.

 
The following users thanked this post: SiliconWizard

Online PlainName

  • Super Contributor
  • ***
  • Posts: 6848
  • Country: va
Quote
so far worked without hassle for me, to keep backups for a few TB, from Linux to an external NAS

Have you done a full restore?
 

Online RoGeorge

  • Super Contributor
  • ***
  • Posts: 6207
  • Country: ro
Tried only once, at the very beginning, and it worked. Never since then.  :-[  I hope restore would still work. An advantage worth mentioning is that files in Borg backups can be mounted/read without restoring the whole backup. If a full restore ever turned out to be impossible, I could just install fresh and paste the configuration files over.

However, for OS rollbacks (for example after a too-early/buggy OS update, or when I brick something through my own mistake), I use a feature of Kubuntu with ZFS on root. ZFS can do snapshots, but Kubuntu (Zsys was introduced as experimental in Kubuntu 20.04, IIRC) added another layer of tools on top of ZFS, integrated with GRUB.

It is called "Zsys", https://didrocks.fr/2020/05/26/zfs-focus-on-ubuntu-20.04-lts-zsys-general-presentation/, and it does automated and/or manual OS snapshots, then can roll the OS back from the GRUB boot menu, similar to Windows System Restore. Zsys can go back and forth between rollbacks, too, as many times as you want, so if you change your mind you can switch to another savepoint (needs a reboot). Snapshot files alone can still be accessed manually without a reboot (with some tricks). Zsys has no GUI, only the command line and the GRUB boot menu.
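Under the hood Zsys is driving ordinary ZFS snapshots; the raw equivalents look roughly like this (the dataset name is an example, and the zsysctl line is my understanding of its CLI, so check `zsysctl --help` on your system):

```shell
# Take a cheap copy-on-write snapshot before a risky update
# (rpool/ROOT/ubuntu is an example dataset name):
zfs snapshot rpool/ROOT/ubuntu@pre-update

# List existing snapshots:
zfs list -t snapshot

# Roll the dataset back if the update goes wrong
# (discards all changes made after the snapshot):
zfs rollback rpool/ROOT/ubuntu@pre-update

# Zsys wraps this with GRUB integration; a manual saved state is roughly:
zsysctl save pre-update
```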
« Last Edit: June 24, 2023, 10:22:34 am by RoGeorge »
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6723
  • Country: nl
In a copy-on-write (CoW) age, all these old file-based incremental backup solutions look really archaic.

Stuff like this should be the future:
https://github.com/tasket/wyng-backup (for thin provisioned LVM)
https://digint.ch/btrbk/ (for btrfs)

Unfortunately the distros have completely dropped the ball. Sure, most large companies use distributed filesystems, but there are probably still workstations using local filesystems which need better incremental backup. AFAICS Red Hat didn't even update xfsdump to use reflinks. Veeam supports incremental updates with reflink, though...
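The CoW-based incremental idea these tools build on can be sketched with raw btrfs commands (paths and subvolume names below are illustrative; btrbk automates exactly this loop):

```shell
# Take read-only snapshots of a subvolume at two points in time:
btrfs subvolume snapshot -r /home /home/.snapshots/home.1
# ... time passes, files change ...
btrfs subvolume snapshot -r /home /home/.snapshots/home.2

# First transfer: send the full snapshot to the backup filesystem.
btrfs send /home/.snapshots/home.1 | btrfs receive /backup

# Later transfers: with -p (parent), send only the blocks that
# changed between the two snapshots; no file scanning needed.
btrfs send -p /home/.snapshots/home.1 /home/.snapshots/home.2 \
    | btrfs receive /backup
```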

PS. the ZFS stuff works too of course, but well ... ZFS ...
« Last Edit: June 24, 2023, 11:22:01 am by Marco »
 

Online PlainName

  • Super Contributor
  • ***
  • Posts: 6848
  • Country: va
Quote
https://github.com/tasket/wyng-backup (for thin provisioned LVM)

Seems to be single destination.

In fact, that seems to be the expected modus operandi for these things: a single backup server with multiple clients. What would really be useful for the domestic or small business user is a single client with multiple destinations, with the ability to keep doing incrementals/differentials appropriate to each destination. That is, one destination shouldn't interfere with the other's record of which files have been stored and which have changed.
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6723
  • Country: nl
They are incremental snapshots of block devices; files are invisible to it, period. You can put some snapshots on one server and some on the other, but I don't really see the point. Just rsync the backup server if you want a multihomed backup (though most people will back up to the cloud, despite it never working long term).
 

