And with STL and Boost you won't get far in the embedded world.
Tell me why I cannot get far in the embedded world, even on an 8-bit MCU, with such "horrible" STL/Boost things: std::array, std::pair, std::tuple, std::fill, std::is_same, boost::circular_buffer, boost::crc and so on.
These are pretty much pure templates: no new/delete, no exceptions (or they can be disabled), and the compiler optimizes everything out, leaving a very compact output binary. Some of them, like std::is_same, exist only at compile time and contribute nothing to the binary at all. I think people don't realize how good C++ compilers have become in the last two decades. And in the rare cases where it matters, one can always study the resulting assembly and hint the compiler to do the right thing, without resorting to inline assembly.
Also, there is embedded and there is embedded; sometimes people think everything is an ATtiny. I once worked on a mission-critical project (a core network switch with 6 Tbit/s total switching capacity) where our system ran on 14 cards, each with a dual-core 1 GHz PowerPC and 2 GB of RAM. Some of the cards were control cards: one active, one sleeping and taking over if the primary failed, for extra redundancy.
The system was complex, written in C++, with an RT allocator and Xenomai where needed. Management decided we could not use the STL because "STL is not for embedded". Instead, they assigned some people to develop an "in-house" alternative to the STL containers. After a lot of time and corporate money, they came up with C++ classes operating on void pointers; even their simple vector was 20% slower than the STL one, not to mention the bugs (the STL has been tested for decades, everywhere).
C is still around because it serves its purpose better than anything that has come along. Major projects, like the Linux kernel, are still being written in C.
Historical reasons, and Linus' personal animosity towards C++, probably related to the former. When Linux was started C++ wasn't that mature, compilers were buggy, and the UNIX tradition has always been K&R C.
AFAIR Linus's argument against C++ was dynamic memory management (new/delete) and exceptions. For my embedded projects I disable exceptions (the binary size is too big for small MCUs) and either avoid the heap entirely (small stuff) or put a TLSF allocator (real-time, bounded response) underneath the new/delete operators. Problem solved.
I did some kernel development, and when you start learning the Linux codebase the lack of C++ syntax screams at you everywhere, for example:
- containers done with void pointers and macros (e.g. INIT_LIST_HEAD) - yay, type safety,
- polymorphism, virtual functions and object hierarchies done with structs of function pointers that have to be set manually (e.g. ethtool_ops), pointer arithmetic to reach private data (e.g. netdev_priv) and other horrid constructs,
- instead of exceptions we get a ton of error-prone gotos and return codes.
Of course no one in their right mind would start rewriting Linux in C++ now; it would be a tremendous effort. I think there were plans to migrate GCC to C++, but I don't know what the current status is. Richard Stallman (the original author of GCC) is another anti-C++ dinosaur. Of course it is perfectly possible to write a good compiler in C++; LLVM/Clang is the proof.
I don't get this C vs. C++ argument at all. C++ supports C-style code as well, and one can do stupid things in both languages. If someone who is not fluent in C++ writes embedded code with heavy use of the heap (and a non-RT allocator), uses exceptions incorrectly (for program flow), or god knows why enables RTTI, then the language is not to blame. One can do stupid stuff in C too: pulling in size-heavy glibc functions, doing dynamic memory through void pointers, ignoring type safety.
I personally use C++ everywhere, from small MCUs to large multithreaded image-processing apps, on the CPU or on nVidia CUDA, and I absolutely love the strong type safety and template metaprogramming on all of these platforms. Just, like any other skilled worker, know your tools and libraries.