Intel traced the source of the issue to slightly radioactive ceramic packaging from 3M, and instead of keeping this to themselves and profiting from it, they let everyone know, including their competitors!!
Anyway, 3M cleaned up their ceramic and this issue was resolved.
Best,
In the old days, Intel was a pretty decent company, in many ways (NOT all: a good example is the 4004, where one of its original creators was ignored and left out of the acknowledgement of his historic achievement, because he had badly annoyed Intel by leaving on bad terms. Probably
https://en.wikipedia.org/wiki/Federico_Faggin going on to found Zilog really upset Intel. Although I fully accept Intel being genuinely annoyed at the Zilog issue, and at the Z80 "unfairly" stealing the limelight and profits from the 8080, Intel should still have been truthful about who created/designed the 4004 and its successors.).
In later (recent) years, I hear less and less about how nice/honorable Intel is as a business. But don't get me wrong, they still seem like a decent company, over-all. Except perhaps for chasing profits too keenly, rather than doing what is best for the final consumers of Intel's products (such as NOT allowing/enabling ECC memory capabilities on their cheaper CPUs, even when it is actually present on the chip. Arguably, ECC should be enabled for EVERYONE, as it is of potential benefit to almost all users, even if your average Joe probably doesn't know what it is, or realize that they would benefit from it through less silent/secret/hidden data corruption).
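For anyone curious what ECC actually buys you, here is a tiny sketch (Python, purely illustrative, my own function names and toy word size) of a single-error-correcting Hamming(7,4) code, which is the basic idea behind ECC memory. Real ECC DIMMs use a wider SECDED code (typically 8 check bits over 64 data bits), but the principle is the same: a flipped bit gets detected and silently corrected, instead of silently corrupting your data.

# Minimal sketch of single-error-correcting Hamming(7,4), the idea
# behind ECC memory. Illustrative only; real DIMMs use a wider code.

def hamming74_encode(d):
    # d: list of 4 data bits [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def hamming74_correct(c):
    # c: 7-bit codeword, possibly with one flipped bit; returns the data bits
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 0 = no error, else 1-based error position
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1              # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[5] ^= 1                               # simulate one "silent" bit flip
assert hamming74_correct(word) == data     # the flip is corrected transparently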
Old DRAM chips, especially from the 1980s/1990s (in my experience), but other years as well, seem to sometimes just break (whereas e.g. CPUs very rarely break, unless badly abused), even if just left sitting unused. I sometimes wonder why that might be. Perhaps radiation from the package and/or background and/or cosmic rays damages the storage cell (the tiny capacitor and its access MOSFET, or whatever it should properly be called), because it is NOT able to 'dissipate' the excess charge/voltage in a safe (non-damaging) way, like most other semiconductors typically *might*.
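As a very rough back-of-envelope (my own assumed, order-of-magnitude numbers, and strictly about upsetting the stored charge, i.e. the classic soft-error problem from the radioactive-ceramic story above, not necessarily permanent damage):

# Rough order-of-magnitude sketch of why a single alpha particle
# matters to a DRAM cell. Capacitance and voltage are assumed values;
# they vary a lot between DRAM generations.

E_CHARGE = 1.602e-19        # coulombs per electron
CELL_C   = 30e-15           # assumed storage capacitance, ~30 fF
CELL_V   = 1.0              # assumed stored voltage, ~1 V

stored_electrons = CELL_C * CELL_V / E_CHARGE
print(f"electrons stored on one cell: ~{stored_electrons:,.0f}")   # ~190,000

ALPHA_MEV   = 5.0           # typical alpha energy from package decay chains
EV_PER_PAIR = 3.6           # energy per electron-hole pair in silicon
pairs = ALPHA_MEV * 1e6 / EV_PER_PAIR
print(f"e-h pairs from one alpha hit: ~{pairs:,.0f}")              # ~1,400,000

So a single alpha particle can liberate several times more charge than one cell even stores, which is why DRAM is so much more exposed to radiation than ordinary logic with its bigger noise margins.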
TL;DR
DRAMs apparently being excessively sensitive to damage/aging/destruction from radiation from various sources, making them age much less well than most other semiconductor ICs.
I suppose it could also be because DRAMs are really an (almost Frankenstein) mixture of digital and analogue on the same chip, whereas most desktop CPUs are mostly fully digital. The processes used to make those ICs, especially in huge bulk numbers and at a very competitively low cost to the consumer (i.e. maybe not enough process steps are carried out, to ensure the best chips and/or to save cost), might have reduced their reliability, especially over the longer term.
Perhaps tiny changes inside the chip as it ages are enough to 'damage/break' it, unlike CPUs, whose wide tolerances, big safety margins, and strictly digital operation (not on all of them, as some are not truly static devices but need a minimum clock rate, because some parts use dynamic logic to save transistors/chip area and hence reduce costs) mean a CPU can carry on working even if there are minor changes in the IC as it ages.
I've sometimes searched for answers (I have suspicions, but I'm NOT sure), but usually draw a blank. Either because people don't know, or because it is an industry/trade secret (the DRAM industry seems extremely secretive in modern times, i.e. since the DRAM industry left America), kept within the highly competitive DRAM memory suppliers.