I think any failure of the standards is mostly due to the many cases that result in undefined behaviour. A lot of that could be deliberate, for performance reasons, but I don't really know enough about the compilers to be sure.
There are various reasons, but deliberately leaving something undefined for performance reasons would be perverse. It would lead to getting the answer faster but not being sure that it was the right answer, or a repeatable one. And if you don't have to get the right answer, you can make the program infinitely fast.
One of the prime reasons for leaving something undefined is that there are irreconcilable differences between what it does on one processor and another. Another is that it would be excessively slow on some architectures. Another is that getting it right would break some existing programs. And of course, sometimes it is just too difficult to get all parts of the spec to agree with each other.
And be careful about the difference between "undefined" and "implementation dependent". The former is the cause of nasal demons, the latter is the cause of subtle portability problems.
As for volatile, sequence points, const etc., I don't find them too cryptic, as they were introduced when I originally learnt the language; if anything I have the opposite view. Most additions seem necessary: I can't see embedded programming working too well with optimizing compilers without a volatile model, for example. cdecl (http://cdecl.org/) and static-analysis tools get me over the rest of it.
Judging by the statements of those that were on various national committees, you think you understand them, but you don't. The problems come with the corner cases, with optimisation, and with feature "interaction".
First example that comes to mind... C (until possibly the latest spec) defined anything to do with multithreading as outside the language spec and the libraries' responsibility. Therefore POSIX has to rely on C behaviour that is deliberately not defined.
Remind me: can you "cast away constness"? There were endless (i.e. years-long) debates about whether it must be allowed (for libraries) or must be prohibited (to allow known-correct optimisations). I believe it is allowed, but the existence of the debate demonstrated that the language was being pulled in two incompatible directions and was out of control.
C++ is a different kettle of fish.
Yes. See the FQA, which is most entertaining - unless you have to deal with it. (Bits are relevant to C, of course).
For myself, I decided in 1988 that if C++ was the answer, I thought the question was wrong ;} Cue story about how the designers didn't understand that the template language was itself Turing-complete, until someone wrote a short program that caused the compiler to output the sequence of prime numbers during compilation. If the designers don't understand their creation, what hope do the rest of us have?