Ada?
Were we transported back to the 1980s?
Well, if you consider that C was created in the early 70s, one could say that Ada is a leap forward.
Ada is now more streamlined and modernized than it used to be, much as C/C++ have been improved over the years. The compilers are much better (using the GCC compiler techniques and back-end for code generation, for example), the Ada Ravenscar profile is tailored for embedded systems, and more architectures are supported.
Maybe people are realizing that software robustness and correctness need to be addressed. It is quite obvious that C/C++ doesn't provide enough information for human readers, compilers, and analyzers to verify software correctness or to insert run-time checks, for example. MISRA C/C++ and similar guidelines merely try to patch the broken C/C++ languages with duct tape instead of fixing the languages themselves.
Ada's type checking is quite similar to the way you would design analog/digital hardware. You specify maximum and minimum voltages and currents in the design, select components so that they meet the requirements, and make sure the signal interfaces between the components match each other. Then you draw up a list of measurements that determine whether the system meets the design requirements and is ready for production. From time to time you may need to calibrate the system and verify that everything is still within the margins. If you later replace a component, you need to check and test whether the new component still meets the system requirements. You would write these requirements into the schematics or the design documentation.

In Ada, the requirements are specified directly in the source code, and the compiler automatically checks during compilation that they are all satisfied. In addition, Ada has [optional] run-time checks which constantly make those "measurements", verifying that the design meets its specifications and that the "voltages" and "currents" [variable values] stay within the specified limits. If the system drifts outside the limits, it can catch the exception and perform the required action, instead of letting things go unnoticed and possibly producing wrong results or making the device behave erratically.
The C/C++ type system could be improved by adopting Ada's type system (or something else that makes it possible to specify data types, numeric ranges, array index bounds checking, and so on more thoroughly) and letting the compiler verify correctness both at compile time and at run time. This would also empower static analyzers to perform better and find more of the potential problems that currently slip through unnoticed with the existing sloppy, almost nonexistent C/C++ type system.
I guess it might be possible to add optional syntax elements to C/C++ and keep the code backwards compatible, so that old "unsafe" C/C++ code could still be used without problems while the existing code base is gradually migrated to better-specified code. The improved language features could be switched on using #pragmas, as is already done to enable some extra language features. This would let designers choose whether or not to use the strict checking and the improved code quality it brings. It would be a win-win situation and would make C/C++ a somewhat better language. Quite a few of the MISRA C/C++ recommendations would still need to be addressed to make C/C++ robust enough for applications requiring reliability, but fixing the type checking system would be a good start.
I am not claiming that Ada is the ultimate solution for more robust software, but it has a good set of features that make it possible to design more robust software than C/C++ allows. Of course, there will be plenty of programmers saying that uint8_t, uint16_t, uint32_t, struct, and pointers are perfectly sufficient for robust software.