In skimming this thread, it strikes me that the real language problem here isn't with C, or C++, or which variant thereof, or Python, or ...; it's with human communication.
As is often the case, everyone has differing understandings of the terminology being used, starting with "programming," "software development," "embedded development," and what people mean when they talk about C, C++ and variants thereof in that context. People are more likely to have a high degree of overlap in their understanding of well-established terminology.
As has been noted, a decade or two ago, the idea of relying on C for embedded development was relatively new and not well accepted; now, it is. Why was there so much skepticism? Why have things changed? In part, the early skepticism was due to some real shortcomings in libraries, compilers, and linkers, and in time those problems were ironed out. I think, though, that even if everything had worked perfectly, skepticism would still have prevailed, in part because it was new and unfamiliar. People were justifiably skeptical that it might be a flash in the pan. Also, being new and unfamiliar, people heard about or had direct experience with C projects going badly, and it was inevitable that they would, because C gives one plenty of rope with which to hang oneself.
Over time, though, the craft of embedded programming in C became well understood enough that people could start to take for granted that they were talking about substantially the same thing. Which isn't to say that there isn't considerable room for misinterpretation. For some, the assumption is making use of a particular vendor's ecosystem of libraries and compilers. For others, it's writing or adapting one's own libraries. But even these differences have become better and more widely understood, and people are better at clarifying.
In time, I would expect greater shared understanding about the use of C++ in embedded development.
I think, though, that there is going to be increased confusion about what constitutes embedded development. For this, we can thank/blame the semiconductor process engineers and chip designers. Their steady progress over the past half-century has enabled plenty of confusion, chaos, and dysfunction at every level of software development. Embedded development is no different.
The way I see it, the envelope of fixed costs, unit costs, and power consumption has shifted. Some situations that previously required embedded development specialists who fretted about every cent of unit cost, every coulomb, every byte of code, every clock cycle, every I/O pin, and every bug are now open to a wider range of practitioners. People who, ~15 years ago, might have been Visual Basic 6 developers, integrating various Microsoft and third-party components to make in-house apps, might now be buying a few modules and using the vendor-provided dev environment to make an embedded device for in-house use. When bugs are discovered, they'll use a vendor-provided solution to push updated software to the field.
I'm sure this sounds like a perfect nightmare to most anyone who considers themselves an embedded engineer, but for someone who can't find anything that's quite right for their needs, and who can't afford to do "proper" embedded development, it's going to be a godsend. It remains to be seen what this does to traditional embedded development. Will opportunities for such work shrink? Hold steady? Expand? It's hard for me to guess. I think there will always be a need for the skills and mindset. But how many people will be employed developing deployed products, versus tools used by higher-level developers?
In this new world, I think small, low-powered devices programmed using Lua, Python, JavaScript, and similar languages are going to become more numerous. As for the fear of them being flashes in the pan, I don't know if it will reassure Howardlong to hear that his perspective may be a little skewed. Perl's time in the sun as king of the backend languages was quite short. ColdFusion and ASP were strong challengers by 1998, and by the dotcom crash, PHP had a strong and rapidly expanding foothold on the UNIX servers Perl CGI apps called home. It wasn't too much longer before Ruby + Rails was stealing a lot of PHP's thunder. Python was slower to catch on on web servers than Ruby + Rails, but before Ruby found its niche in web development, Python was already being used as an integration language in a variety of areas (Google used it a lot internally, YouTube was built with it, and it has a strong foothold in bioinformatics). Lua has had a strong foothold in computer game scripting for over a decade. JavaScript has been the lingua franca of client-side web development since before anyone even really developed client-side web apps. Its presence as a server-side language is just about as old, though it didn't get any traction on the server until someone took Chrome's fast JavaScript engine and made Node.js.
To wrap all of this up:
Computer languages are really human languages. Even with the economic and cultural force of the combined British and American empires over multiple centuries, there are plenty of people who rarely use English, and plenty more who never use English. Even among English speakers, there are significant differences in accent, pronunciation, spelling, dialect, vocabulary, and idiom. Even among English speakers talking about the same technical subject, there is ample room for misunderstanding.
Therefore, there are apt to be a variety of computer languages, and variants of individual computer languages and their applications to given problem domains. Moreover, now that computing is almost ubiquitous, the forces of proliferation and variation are likely to grow.
There may be large areas of common adoption and practice, but there will always be fractal edges. Today's tiny fractal edge of computing/programming could conceivably include more practitioners and users than all of computing did in the days when some of you started your careers. Or, looking at it another way, there have been ~10x as many Raspberry Pis sold as there were DEC PDP-11s.