It works. It works fast, in any browser, without bugs.
Not really. For example, I can't copy/paste images while writing a post.
Regardless of the technology stack, the user interface is tedious and dated.
This. It is weak in comparison to even a lightweight blog platform these days.
However, look at the audience.
Hands up who is on Windows 7?
Hands up who is on an earlier version of Windows?
Someone mentioned speed.
Let's pick a use case: searching. When you search on this forum, you fill in the HTML form element at the top right. When you hit Search, a CGI GET request is sent to the server. The browser 'freezes'. The page reloads. You download the WHOLE page again, even though most of it is the same templated, repetitive HTML garbage.
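To make that concrete, here's a minimal sketch of the server-rendered model in a toy Node/TypeScript server (this forum actually runs PHP/CGI; the /search endpoint, renderFullPage helper and post list here are all made up for illustration):

```typescript
// Sketch of the classic server-rendered search. Every request rebuilds
// and retransmits the ENTIRE page, even though only the results differ.
import { createServer } from "node:http";

const posts = ["Web 2.0 rant", "CGI nostalgia", "JSON vs HTML"]; // stand-in data

function renderFullPage(results: string[]): string {
  // Header, nav and footer are regenerated and re-sent on every search.
  return `<!DOCTYPE html>
<html>
  <head><title>Forum search</title></head>
  <body>
    <header>...same banner, nav and login box every time...</header>
    <ul>${results.map((r) => `<li>${r}</li>`).join("")}</ul>
    <footer>...same footer every time...</footer>
  </body>
</html>`;
}

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const q = (url.searchParams.get("q") ?? "").toLowerCase();
  const results = posts.filter((p) => p.toLowerCase().includes(q));
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(renderFullPage(results)); // the client gets a full page reload
}).listen(8080);
```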
In a Web 2.0 version the page doesn't move. The initial user interaction is handled locally. The search-results view is rendered locally while, in the background, a REST (or REST-like) request is made to the search service. It responds with a few kilobytes of JSON, NOT massive, verbose, duplicative HTML. The local JavaScript app then uses that JSON to generate the HTML locally. That's "Level 0".

Beyond that you can start to get clever. Watch the user scroll the page, prefetch the next search-result page in the background and cache it, expecting the user to request it soon. Level 1. What? Why does the user need to select the next page at all? If they have scrolled near the end of the results, just fetch the next page and render it without them doing anything. Level 2. Amazon, Facebook and YouTube are at Level 104.

That opens up the thorny issues of privacy, monitoring, data collection and user profiling, but as you guys know, the tech and how it's used are not the same kettle of fish.
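Here's a browser-side sketch of Levels 0 through 2, assuming a hypothetical /api/search endpoint that returns JSON, and a page containing #results and #sentinel elements (all of these names are illustrative, not any real site's API):

```typescript
// Assumed JSON shape returned by the hypothetical /api/search endpoint.
interface SearchResult { title: string; url: string; }

const resultsList = document.querySelector<HTMLUListElement>("#results")!;
const sentinel = document.querySelector<HTMLDivElement>("#sentinel")!;
let currentQuery = "";
let page = 0;

// Level 0: fetch a few KB of JSON and build the HTML locally;
// the page itself never reloads.
async function fetchPage(query: string, pageNo: number): Promise<SearchResult[]> {
  const res = await fetch(`/api/search?q=${encodeURIComponent(query)}&page=${pageNo}`);
  return res.json();
}

function render(results: SearchResult[]): void {
  for (const r of results) {
    const li = document.createElement("li");
    const a = document.createElement("a");
    a.href = r.url;
    a.textContent = r.title;
    li.append(a);
    resultsList.append(li);
  }
}

// Levels 1-2: watch the user scroll; when the sentinel at the bottom of
// the results comes into view, fetch and render the next page without
// the user clicking anything.
const observer = new IntersectionObserver(async (entries) => {
  if (!currentQuery) return; // nothing searched yet
  if (entries.some((e) => e.isIntersecting)) {
    page += 1;
    render(await fetchPage(currentQuery, page));
  }
});
observer.observe(sentinel);

// Wire this to the search box, e.g. onSearch(input.value) on submit.
async function onSearch(query: string): Promise<void> {
  currentQuery = query;
  page = 0;
  resultsList.replaceChildren(); // clear old results locally, no reload
  render(await fetchPage(query, 0));
}
```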
For those of us with modern hardware made in the last 5 years, this happens near-instantaneously; certainly the local side of it does. The REST request should take a few dozen milliseconds, though that depends on server load and your internet connection. It will still be faster than a PHP script rendering the full page, by orders of magnitude in many cases. It will also use an order of magnitude fewer resources and cost an order of magnitude less to host.
You have an 8/16-core CPU and it's ONLY YOU using it to generate HTML. On this forum, it's you and ALL the currently active users, bots, search engines etc. competing for the same resources, while Dave pays for it.
Both models have pros and cons, and there are situations where one or the other would be inappropriate. But most situations, I think, benefit from the more modern approach, and not just from the user's point of view: from the business and development point of view too.
Computers get faster. Humans don't.
Computers get cheaper. Humans don't.
This is why enterprise software is not like MCU development. Our landscape evolves orders of magnitude faster because it has been challenged with complexities, scales and scopes orders of magnitude greater. When you divide and conquer, you will always create waste and slack. But if those are all you can see, you are blind to the bigger picture and to the scale involved.