anyway - back to the real discussion:
1) everything we do is written in sand. eventually it'll all wash away. It's not a matter of *if* your work will become unrecoverable, it's a matter of when. The best we can hope for is that it'll be irrelevant to everyone before then.
2) long term source code management is becoming a "solved problem" now that we have a giant worldwide computer network and configuration management repositories (at least until that orange baboon's net neutrality change cuts the internet into a bunch of walled gardens...). if you can keep your basic source code in a git repo, you can maintain it indefinitely on the internet, as long as you make sure it never gets forgotten about or left on a host that ends up deleted one day (see the mirroring sketch after this list).
3) the other side, actually being able to *use* those source files, is the biggest issue, because of tools that have limits... some limits were unforeseen (code was only written to work on a certain type of computer; then the code's maintainer goes broke and no more builds can be made; then the computer manufacturer changes something and the code won't run on their new machines...), others are deliberate (time-based licensing, encrypted file formats, etc. etc.). It seems to me like computer hardware has "normalised" a bit recently - it's becoming more consistent and changes aren't happening as fast anymore... OS changes, on the other hand, seem a lot more abrupt lately... maybe they will settle down eventually too?
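here's roughly what I mean by "never forgotten" in point 2 - a minimal sketch (Python, run it from a cron job or CI) that mirror-pushes a repo to several independent hosts, so no single deletion loses everything. the remote names and URLs are made-up placeholders, not a recommendation of any particular host:

```python
#!/usr/bin/env python3
"""Mirror-push a repo to several independent hosts so no single
deletion loses it. A minimal sketch -- the remote URLs below are
hypothetical placeholders, substitute your own."""
import subprocess

MIRRORS = {
    "github": "git@github.com:example/myproject.git",
    "gitlab": "git@gitlab.com:example/myproject.git",
    "selfhosted": "ssh://git@git.example.org/myproject.git",
}

def push_mirrors(repo_dir: str) -> None:
    for name, url in MIRRORS.items():
        # Register the remote; 'git remote add' fails harmlessly if
        # it's already configured, so don't check the return code.
        subprocess.run(["git", "-C", repo_dir, "remote", "add", name, url],
                       stderr=subprocess.DEVNULL)
        # --mirror pushes all refs (branches and tags) verbatim.
        subprocess.run(["git", "-C", repo_dir, "push", "--mirror", name],
                       check=True)

if __name__ == "__main__":
    push_mirrors(".")
```

the point isn't this particular script, it's that the copies are on unrelated providers, so one company folding (or one sysadmin tidying up) doesn't take the history with it.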
If you are doing simple embedded development with a GCC-based toolset for straightforward cross compilation (even something like Atollic!), you always have a basic compiler that can be rebuilt from source, which takes us back to the "solved" source availability problem (as long as you always have a compiler that can compile *that*!). so if you really need to, you can recreate an environment to build that code again... and hopefully find a way to get the output from that environment onto whatever it needs to run on.
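to make that rebuild realistic years later, it helps to archive the toolchain source tarballs next to the project and write down exactly how they were built. a minimal sketch of that record-keeping - the tarball names and configure line here are examples, not a tested recipe:

```python
#!/usr/bin/env python3
"""Record what's needed to rebuild the cross toolchain later: the
archived source tarballs, their SHA-256 hashes, and the exact
configure line used. A sketch only -- the filenames and configure
flags below are examples, not a tested recipe."""
import hashlib
import pathlib

# Tarballs squirrelled away alongside the project (example names).
TARBALLS = ["gcc-9.3.0.tar.xz", "binutils-2.34.tar.xz", "newlib-3.3.0.tar.gz"]
# The configure line actually used for the build (example target).
CONFIGURE = "../gcc-9.3.0/configure --target=arm-none-eabi --disable-nls"

def sha256(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

with open("TOOLCHAIN-REBUILD-NOTES.txt", "w") as notes:
    for name in TARBALLS:
        notes.write(f"{sha256(pathlib.Path(name))}  {name}\n")
    notes.write(f"\nconfigure: {CONFIGURE}\n")
```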
But when you are building a *system* that has custom code but is also based on a full blown OS like Linux, with a bunch of packages from all over the place, then as explained above there's a whole world of hell with package compatibility (mostly because Linux people just change things for the sake of changing things and break compatibility all the time - the reason why Linux is *still* not a reasonable choice for desktop use, for someone who just wants a computer to get work done).
Personally, for any project I do, the minimum is to document and archive the toolchain: note the versions of everything, save copies of the install files, and write complete install instructions. Then for a project that undergoes development over time, I update that toolchain (AND THE DOCS) as I go, but only when the project hits a milestone and I have time to baseline/compare/verify the update. this also gives me a chance to migrate other parts of the setup (OS, debug probes if necessary, etc.) and seems the best approach: you keep a standard build environment day to day, but you're always in the best position to re-create it at some point in the future, once the project goes into long-term hibernation... at hibernation, if the project has a VM for development, like I have had in the past and may again in the future, then I might also save that VM (but no guarantee it'll work in the future...)
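the baseline/compare/verify step can be as simple as hashing the build artifacts before and after the toolchain swap. a minimal sketch, assuming "make" as the build entry point and build/*.bin as the artifacts (both placeholders for whatever your project actually produces) - note that a new compiler will usually change the binaries, so a diff just flags what needs re-testing, it isn't a failure by itself:

```python
#!/usr/bin/env python3
"""Baseline/compare step for a toolchain update: hash the build
outputs with the old toolchain, rebuild with the new one, and diff
the hashes. A sketch under assumptions -- 'make' as the build entry
point and build/*.bin as the artifacts are placeholders."""
import hashlib
import pathlib
import subprocess

ARTIFACTS = "build/*.bin"  # example artifact pattern

def build_and_hash() -> dict:
    subprocess.run(["make", "clean"], check=True)
    subprocess.run(["make"], check=True)
    return {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in pathlib.Path(".").glob(ARTIFACTS)}

baseline = build_and_hash()       # with the old toolchain on PATH
input("Switch PATH to the new toolchain, then press Enter...")
updated = build_and_hash()        # with the new toolchain

for name in sorted(set(baseline) | set(updated)):
    if baseline.get(name) != updated.get(name):
        print(f"DIFFERS: {name}")  # inspect/re-test before accepting
```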