Strictly a serious problem in GAMING and HIGH END Workstation software
Paul
In gaming, not so much, given that 90% of games are developed to work for roughly the six months during which they are actively being sold. If they don't work properly afterwards, the company no longer cares, because by then it has already collected 90% of the sales profit it will ever get from the title. Not every game is World of Warcraft or something similar that the company can milk (and thus needs to keep running) for 10+ years.
Actually, things like OpenGL are the least of your problems there:
a) OpenGL backwards compatibility is pretty good, unlike the various D3D revisions (e.g. your OpenGL 1.x code from the 1990s will very likely still work today without any changes - see the sketch after this list).
b) most games don't use it anyway (D3D is the prevalent API for gaming thanks to Microsoft, not OpenGL).
c) most games don't use these APIs directly anyway but go through engines such as Unreal or Unity these days, which pretty much completely insulate you from the vagaries of changing APIs. (You trade that for changing engine versions, though - Unity, for example, is notorious for breaking something with every release.)
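To illustrate point (a), here is a minimal sketch of the kind of 1990s-style immediate-mode code that still compiles and runs today, assuming a window and a compatibility-profile context have already been created (e.g. via GLUT or the platform's native API):

    /* 1990s-style OpenGL 1.x immediate-mode drawing; deprecated since
     * GL 3.0 but still accepted by a compatibility-profile context. */
    #include <GL/gl.h>

    void draw_triangle(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);

        glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
        glEnd();
    }

Any driver that still exposes the compatibility profile will happily run this unchanged on current GPUs.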
A much bigger issue is, e.g., server availability - how many games, even major ones, are essentially unplayable today because the servers for the game and/or its DRM have already been shut down?
Workstation software is a very different kettle of fish with different needs (e.g. needing to make a 20+ year old codebase and its OpenGL immediate-mode drawing code work with modern drivers and GPUs ...). But then someone like Autodesk has teams of people and decades of time to update the codebase as required.
It is thanks to companies like this that we are still stuck with the compatibility profile in OpenGL - i.e. continued support for OpenGL 1.x-era functionality - so that they don't have to fix their ancient code (ever wondered why CAD software is so notoriously slow at redrawing even small scenes, despite requiring huge amounts of horsepower?).
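A big part of the answer is how the two paths talk to the driver. The sketch below is illustrative only - the names (scene_vao, vertex_count, etc.) are hypothetical, and the modern path assumes a GL 3.x context with the buffer and vertex-array setup done elsewhere (entry points like glBindVertexArray are typically loaded via GLEW or glad):

    #include <stddef.h>
    #include <GL/gl.h>

    /* Old fixed-pipeline path: the application makes one or more driver
     * calls per vertex, every single frame the scene is redrawn. */
    void redraw_immediate(const float (*verts)[3], size_t vertex_count)
    {
        glBegin(GL_TRIANGLES);
        for (size_t i = 0; i < vertex_count; ++i)
            glVertex3fv(verts[i]);
        glEnd();
    }

    /* Modern path: the geometry was uploaded to a GPU buffer once;
     * redrawing the scene is a single call, regardless of vertex count. */
    void redraw_buffered(GLuint scene_vao, GLsizei vertex_count)
    {
        glBindVertexArray(scene_vao);
        glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    }

With a CAD scene of millions of vertices, the per-vertex call overhead in the first path quickly dominates, which is roughly the kind of gap the performance complaint below is about.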
The same goes for UI toolkits - yes, changing a UI toolkit is a very expensive and laborious thing to do, but you get plenty of notice when something is going to be deprecated. Toolkits like Qt have literally been around for decades now, plus you can always maintain your own in-house version if required. The same goes for updating code for newer compilers (fixing deprecated/broken stuff, not necessarily rewriting to use the latest features), etc.
I am not talking about gratuitous changes for change's sake, but the OpenGL 3.x spec was released in 2008 - more than 10 years ago. And how many big-name applications are still relying on the old, deprecated fixed-pipeline stuff from OpenGL 1 and 2, even though it can easily mean 50% or more worse performance on the same workload?
If the vendors had to keep compatibility at all costs, no progress would ever be possible. Maintaining software doesn't mean you build it once and then hope that nothing ever changes so that you don't need to update it (and when something does change, you blame the vendor instead!).
The real problem is that this maintenance rarely gets the resources it requires; often things are fixed only to put out the worst fires with the least amount of effort possible, because it is an old product that isn't making much money anymore. Then the technical debt grows until the mountain of it becomes insurmountable.
I wonder how you would cope with web development, where it is considered normal for your new framework to be deprecated within weeks of its release and completely broken within a few months, where the paradigm that was the recommended way to implement things is hopelessly obsolete and outdated a few months down the road, and where the package and dependency churn makes reliable and safe (with known security holes patched) production builds an almost impossible challenge ...
(No, I don't consider the web ecosystem normal or conducive to good-quality software, but whining about having to update software because of OpenGL changes, of all things, is completely ridiculous ...)