As for comparing a desktop PC to a Mac laptop: very few PC laptops have upgradeable graphics cards. Laptops, regardless of platform, tend to have highly customized motherboards, and while the Mac has embraced fully integrated, completely non-customizable/non-upgradeable motherboards, the PC world is heading in the same direction, just a few years later (as always...)
Just to be clear, my earlier comments apply to laptops in general, not to the Mac brand in particular; a MacBook is just a high-end laptop. No laptop has an upgradeable graphics card, and that was one of the things I was talking about.
I don't consider EDA tools such as Keil or Microchip Studio "demanding", and even Photoshop or Altium Designer fall in the "lightweight" to "middleweight" range. AutoCAD and other 3D CAD packages or renderers start to demand GPU power, plus CPU power for custom render methods the GPU can't do efficiently. Things get heavy when you play with engineering solvers and FEA/FEM analysis tools. I've also heard people talk about large-scale software development where the compiler will use every last bit of your CPU power to compile many code files; I don't think Keil is one of those cases. Probably sci-fi film production too. Maybe you can ask the people at NASA or CERN which PC/laptop is suitable for their particle physics modeling/prediction. And there is no amount of processing power on earth that is "enough" if you want to run a custom O(n^x) operation with x >= 2 and n extremely large, even for a single small function of code (see the quick arithmetic below). YMMV.
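To put a rough number on that last point, here is a quick back-of-the-envelope sketch in Python. The throughput figure (~10^9 simple operations per second on one core) is my own illustrative assumption, not anything stated in this thread:

```python
# Back-of-the-envelope: why O(n^2) work outruns any hardware once n is large.
# Assumed figure (illustrative, not from the thread): ~1e9 simple ops/sec per core.
OPS_PER_SECOND = 1e9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for n in (10**6, 10**9, 10**12):
    ops = n ** 2                      # total work for a single O(n^2) pass
    seconds = ops / OPS_PER_SECOND
    print(f"n = {n:.0e}: {ops:.0e} ops -> {seconds:.0e} s "
          f"(~{seconds / SECONDS_PER_YEAR:.1e} years) on one core")
```

At n = 10^9, an O(n^2) pass already works out to roughly 30 years of single-core time; the laptop-vs-desktop choice is a rounding error next to the choice of algorithm.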
Dude.
Nobody is disputing the existence of users with truly high-end needs. But your original statement, made to exemplify the need for upgradability, was that anyone who chooses to "upgrade" their "skill set" will need upgradability. And that's complete and utter nonsense. The hardware demands of a piece of software have no relation to how far someone goes in their career! A programmer, no matter how skilled, will never need cutting-edge hardware to do their job. (OK, if you're going to compile the entire Windows source code, it's handy, but then it's not just your own work.) Meanwhile, anyone doing 8K video editing, no matter how low-skilled they may be (even someone straight out of school at a film studio), is going to need a beast of a machine with massive storage infrastructure; see the rough arithmetic after this paragraph. It has nothing to do with skill and everything to do with the requirements of the job.

And realistically, very, VERY few things these days require high-end hardware. The vast majority of workers, regardless of skill level, have minimal-to-modest system requirements, and won't EVER upgrade their computer; they'll just replace it.
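For a sense of scale on the 8K example, here is a rough data-rate sketch in Python. The parameters (8K UHD frame, 10-bit samples, 4:2:2 chroma subsampling, 30 fps) are illustrative assumptions of mine, not numbers from the post:

```python
# Rough arithmetic for the uncompressed data rate of 8K video.
# Assumed parameters (illustrative): 8K UHD, 10-bit 4:2:2, 30 frames/sec.
WIDTH, HEIGHT = 7680, 4320
BITS_PER_SAMPLE = 10
SAMPLES_PER_PIXEL = 2   # 4:2:2 averages 2 samples/pixel (Y plus shared Cb/Cr)
FPS = 30

bits_per_frame = WIDTH * HEIGHT * SAMPLES_PER_PIXEL * BITS_PER_SAMPLE
bytes_per_second = bits_per_frame * FPS / 8
print(f"~{bytes_per_second / 1e9:.1f} GB/s sustained, "
      f"~{bytes_per_second * 3600 / 1e12:.1f} TB per hour of footage")
```

That comes out to roughly 2.5 GB/s and about 9 TB per hour of raw footage. Even production codecs that compress this by an order of magnitude or so still demand fast, huge storage, which is exactly the point: the requirement comes from the job, not the operator's skill.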
I remember when computers were so slow (very early 1990s) that a standard part of benchmarking in computer magazines was recalculating large Lotus 1-2-3 or Excel spreadsheets, because the difference between an entry-level machine and a top-of-the-line one could be 45 seconds vs. 15 seconds. That's a real, palpable difference. Only a few years later, the "recalculate" command was removed altogether because recalculation could be done in real time. It used to be that photography was an extremely high-end application; now it's something that (at least when implemented with some brains) is happy on almost anything, and graphic artists no longer need to be as high up on the performance scale as they once did. When I started my career in IT, professional standard-definition video editing required machines with tons of specialized hardware and massive disk arrays (the video never even passed through the CPU: the computer was simply the host and user interface for dedicated hardware). SD (and HD) video editing is something modern systems can do in their sleep.
I also think it's important to point out that games have basically redefined what high-end means. GPU development isn't driven by high-end professional graphics (engineering workstations, etc.) any more; it's driven by games. A GPU that's modest for modern games is, frankly, quite capable for a whole lot of engineering software. (The main difference between "pro" cards like the Quadro line and their gaming counterparts is the drivers: the pro drivers are optimized for accuracy over speed, whereas the gaming drivers are optimized for speed at the expense of accuracy. As an aside, on the Mac there's never been this distinction; the drivers all more or less correspond to the pro drivers on Windows. That's why software that demands a pro GPU on Windows is perfectly happy with the equivalent gaming GPU on the Mac.) It used to be SGI pushing GPU capabilities forward for money-is-no-object, high-end applications. Less than a decade after SGI Onyx systems costing $250,000 apiece were needed to create the liquid metal in Terminator 2, SGI was moribund, and Maya, the descendant of the software used to make that film, was running happily on Mac OS X and Windows NT on commodity hardware, thanks to games pushing GPU development forward at a tremendous pace.
Similarly, the growth of mainstream computer performance is why the high-end UNIX workstations (SGI, Sun, etc.) all died out. Commodity hardware caught up with them, so there was no longer any justification for the enormous R&D cost of designing such machines.