My point being, you cannot force crappy people to do good work by having the compiler/language/OS do it for them. They'll just invent new, possibly worse ways, to be crappy. As long as users are happy with crappy tools, crappy tools get produced. I don't think we can fix *that*. What we can do is learn, and create something better, now, for *ourselves*.
A valid point.
And the modern ways of being crappy are more crappy than ever, beyond all imagination.
It seems to me that people who write C are fairly knowledgeable about what a computer can do and how to program it to solve a practical problem. The result tends to be fairly efficient and capable software, with a few bugs related to the classical C shortcomings everyone knows so well that I don't need to repeat them here - but even then, most bugs are elsewhere (language-independent).
Of course, this isn't always the case; there is a lot of uselessly crappy C code out there - but I do see this in a "mental statistics" way (I accept there is a risk of confirmation bias in this, of course, but no one else is citing any objective scientific research either, and this is the "chat" forum).
The alternative? I think the last two decades have shown us some massive piles of utter crap originating from the urge to invent:
A) a trend language of the year,
B) a trend framework of the year,
C) the trend words of the year
and yes, sometimes these tools and mindsets are capable of handling array index bounds checking correctly and can therefore save you from a certain type of bug! But somehow, it seems, the end result is still overall too complex utter crap, and combined with the fact that clueless people use these tools, the result is a total disaster.
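(To be concrete about the one bug class these tools do catch, here's a minimal hypothetical sketch in C; the table, its size and the set_checked helper are made up for illustration. The commented-out raw write past the end of the table is undefined behavior and may silently corrupt memory, while a checked accessor of the kind a bounds-checking language gives you for free simply rejects the bad index.)

```c
#include <stdio.h>

#define TABLE_SIZE 8

static int table[TABLE_SIZE];

/* A checked write: report a bad index instead of scribbling over
 * whatever happens to live next to the array. Bounds-checking
 * languages do this for every indexing operation automatically. */
static int set_checked(size_t idx, int value)
{
    if (idx >= TABLE_SIZE)
        return -1;            /* caught: the classic overindexing bug */
    table[idx] = value;
    return 0;
}

int main(void)
{
    /* table[TABLE_SIZE] = 42;   <- plain C accepts this; undefined behavior */
    if (set_checked(TABLE_SIZE, 42) != 0)
        fprintf(stderr, "index out of range, nothing corrupted\n");
    return 0;
}
```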
By this, I don't mean a "C coder overindexed a table, the software crashed, the bug was fixed the next day" kind of disaster, but epidemic disasters such as public IT systems in healthcare, transportation, public safety, banking, etc., being broken for months or years straight with no way to fix them. The solution I'm actually seeing is silence. I think our modern world has just become too difficult. All the cool modern IT services we saw prototypes of in the late 90s and expected to become commonplace just don't work anymore. They lag, they crash, they give server error messages, and then I need to pick up a phone and call the helpdesk to resolve the issue.
For example, the university I was attending bought a new IT system in 2007 or so. The old one just worked; it was some "hacky" job, but the website was always responsive, and you could sign up for courses and exams, print out your schedules, the usual sort of jazz. The "waterfall" model was still trendy back then: the new system was in planning for 2 years, then another 2 years in implementation. When it was finished, the university stopped working. A new server farm was bought because the new system was so bloated, but while this fixed some server timeouts, it didn't fix all the other brokenness. Because the new system still crashed all the time, people needed to actually go to the relevant offices to sign up for courses and exams. As everyone needed to do that, huge lines ensued. I was working as an assistant at the time; our department moved completely to pen&paper and it actually worked quite well. While doing all this, we learned two things:
1) Bad IT is worse than no IT (and no IT is acceptable; that's how it worked 100 years ago!)
2) Working IT is the greatest thing since sliced bread, but no one comments on it when it works.

It took months to fix the system to a somewhat "stable" state; by "stable" I mean not usable, but something you can work with given the unlimited free-of-charge time of the students. Why not poll the service for a few days, a few hours a day, trying to sign up for a course? It might finally succeed!
Now, point 2) above is a classical saying, but sadly, I'm seeing a change. People have gotten so used to completely broken systems that they have stopped complaining about them as well. Even weirder, people don't even notice that these systems don't work, even when they aren't getting the job done with said software.
Another example: the bank I'm using "modernized" their web services more than a year ago. Almost all of the basic functionality was basically broken for half a year. Slowly, things started to get less sucky, and the basic features started to reappear, one by one, over a period of months. Yet no one (except me) seemed to complain. We just don't complain anymore; we have totally given up any hope of doing anything useful, and are happy as long as we see nice-looking graphical elements and can play Candy Crush. So when I needed a printout of my savings account, required for a few official occasions, I had to call them on the phone and ask them to print it out and mail it to me. It took months to fix the issue.
The patient information system of the public healthcare in Finland was a highly controversial subject a few years back. It was a large piece of the whole country's budget, comparable to building a nuclear plant - IIRC 2000 million €, which is a huge cost for such a small country (it was widely compared to the similar Estonian system, which was something like 100 times less expensive, and worked). The end result was borderline useless, but it has to be lived with, because such a massive investment can't officially fail: too big to fail. It has been speculated (and analyzed, also officially) that the system, which doesn't work, has possibly caused significant health risks and danger to life, possibly even deaths. For example, the system crashes all the time when critical patient data is needed before an operation, and it wastes the time and increases the stress of the doctors, nurses and staff.
But such complex and expensive systems cannot be fixed, and cannot be replaced. It's impossible to even admit they need a replacement. It's the classical illusion of "not throwing away hard work".
The issues on this scale originate from extremely bad high-level design (or possibly from having too much of it), overengineering by orders of magnitude, using completely unusable tools for non-technical reasons, and finally, hiring people who have absolutely no clue what they are doing to do the final implementation. These people are victims as well, and most likely burn out within a year. OTOH, the job market will keep needing these people to maintain all these broken systems - the job is never done.
I would hire a nerdy C/Linux hacker producing ugly, buggy spaghetti code with myriads of buffer overflows over the more modern status quo of the "professional" software market, any day, no doubt.