I'm actually not that worried about the future of the home computer -- extrapolating the trends in CPUs, and especially GPUs, over the past decade or so seems encouraging.
- In CPUs, AMD is really blurring the lines between enterprise/server CPUs and personal/consumer CPUs; there's going to be a need for corporate workstations for a long while yet, and those are great for doing "real computing" (really not that bad for gaming either, especially if you slap a good GPU in there)
- GPUs used to be fairly gamer-focused devices, but they've grown in generality to the point that they're basically enormously powerful parallel processors that happen to be well tuned for graphics and can be put to all sorts of other work
I would be going too deep into speculation-land to try and figure out what the economic drivers for these trends are, but I think anyone who wants to naysay the future of home computers should at least propose one. Compared to the trend of Apple serializing iPhone batteries to prevent user repair, these are incredibly encouraging signs to me. The only real threat I see is that sort of anti-consumer sentiment seeping over to the PC world (Intel has toyed with it a fair bit).
...I doubt that many of us will really understand the multi-core, super doomiflex desktop computers of the future. Nor can an individual write the software to employ such machines....
I can't disagree with the sentiment of these statements more.
For any given problem, it is significantly easier to write software today than it was decades ago, despite the underlying hardware becoming much more complex. Individuals can, and routinely do, write low-level software that runs on modern CPUs, some of it efficiently exploiting multithreading, some of it using the massively parallel computing power of GPUs. It's not that hard.
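To put my money where my mouth is, here's a minimal sketch of what I mean (plain standard C++, made-up array size, nothing clever): split a big sum across however many hardware threads the machine reports. Everything in it ships with the standard library, and it's the kind of thing one person writes in an afternoon.

```cpp
// Minimal sketch: sum a large array by splitting it across hardware threads.
// The array size and fill value are arbitrary; only the standard library is used.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 10'000'000;
    std::vector<std::uint64_t> data(n, 1);  // pretend this is real data

    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::uint64_t> partial(workers, 0);
    std::vector<std::thread> threads;

    const std::size_t chunk = n / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = (w + 1 == workers) ? n : begin + chunk;
        // Each thread writes only to its own slot, so no locks are needed.
        threads.emplace_back([&, w, begin, end] {
            partial[w] = std::accumulate(data.begin() + begin,
                                         data.begin() + end,
                                         std::uint64_t{0});
        });
    }
    for (auto& t : threads) t.join();

    const auto total = std::accumulate(partial.begin(), partial.end(),
                                       std::uint64_t{0});
    std::cout << "sum = " << total << " over " << workers << " threads\n";
}
```

The standard library hands you std::thread and hardware_concurrency(); the only real discipline here is making sure each thread writes to its own slot.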
You might think that programming modern computers is more complex because modern pieces of software are more complex and written by huge teams. But those are solutions to much more ambitious problems, problems that were intractable 40 years ago. It's the problems that are harder, not the CPU-wrangling. If you want to solve Sudoku, automate your accounting, or do some scientific simulation (I dunno, the sort of things you might do on a computer 40 years ago), it has never been easier to do so. The one exception to all this might be an individual who absolutely demands an in-depth understanding of how a CPU works in order to program it -- I will admit that a capacity for abstract thought is possibly more necessary than it was 40 years ago.
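And just to make the Sudoku example concrete, here's a rough sketch of a complete backtracking solver in plain C++ (the grid in main is just an arbitrary example puzzle, and there's no cleverness in the search). The whole thing is the sort of program one person knocks out in an evening on whatever machine happens to be on their desk.

```cpp
// Rough sketch of a Sudoku solver: classic recursive backtracking,
// plain standard C++, no libraries beyond <array> and <iostream>.
#include <array>
#include <iostream>

using Grid = std::array<std::array<int, 9>, 9>;

// Check whether placing `v` at (row, col) violates any Sudoku rule.
bool can_place(const Grid& g, int row, int col, int v) {
    for (int i = 0; i < 9; ++i)
        if (g[row][i] == v || g[i][col] == v) return false;
    const int br = (row / 3) * 3, bc = (col / 3) * 3;
    for (int r = br; r < br + 3; ++r)
        for (int c = bc; c < bc + 3; ++c)
            if (g[r][c] == v) return false;
    return true;
}

// Find the next empty cell, try each digit, backtrack on dead ends.
bool solve(Grid& g) {
    for (int row = 0; row < 9; ++row) {
        for (int col = 0; col < 9; ++col) {
            if (g[row][col] != 0) continue;
            for (int v = 1; v <= 9; ++v) {
                if (!can_place(g, row, col, v)) continue;
                g[row][col] = v;
                if (solve(g)) return true;
                g[row][col] = 0;  // undo and try the next digit
            }
            return false;  // no digit fits here: backtrack
        }
    }
    return true;  // no empty cells left: solved
}

int main() {
    Grid g = {{{5,3,0, 0,7,0, 0,0,0},
               {6,0,0, 1,9,5, 0,0,0},
               {0,9,8, 0,0,0, 0,6,0},
               {8,0,0, 0,6,0, 0,0,3},
               {4,0,0, 8,0,3, 0,0,1},
               {7,0,0, 0,2,0, 0,0,6},
               {0,6,0, 0,0,0, 2,8,0},
               {0,0,0, 4,1,9, 0,0,5},
               {0,0,0, 0,8,0, 0,7,9}}};
    if (solve(g)) {
        for (const auto& row : g) {
            for (int v : row) std::cout << v << ' ';
            std::cout << '\n';
        }
    }
}
```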
PS/ What I write above might be BS, mainly because I'm 35 years old.