-
:o Just found this new subforum with nothing in it. Took the opportunity to start a thread about my favourite programming language: https://rust-lang.org. It's a systems language, similar to C, but with more safety guarantees and modern features like iterators. It's slowly being adopted in the embedded landscape as well. Ask Me Anything.
-
Which microcontrollers have compilers?
-
Please, give some examples of the safety guarantees.
-
Ask Me Anything.
OK
How long have we got until the next ELE (https://en.wikipedia.org/wiki/E.L.E._(Extinction_Level_Event):_The_Final_World_Front)?
-
Ask Me Anything.
OK
How long have we got until the next ELE (https://en.wikipedia.org/wiki/E.L.E._(Extinction_Level_Event):_The_Final_World_Front)?
https://youtu.be/_Cje2LJJKS8?list=PLHSoxioQtwZfY2ISsNBzJ-aOZ3APVS8br
-
Which microcontrollers have compilers?
There are compiler targets for thumbv6-8 and MIPS. I thought there was MSP430 but I don't see it now. There is also the matter of board support packages, which are currently all community-provided; STM32 has the most traction there.
-
Please, give some examples of the safety guarantees.
Rust prevents common errors such as buffer overflows and data races statically at compile time.
-
.
.
It's slowly being adopted in the embedded landscape as well.
Really? Then I must have been sleeping too much as I have not exactly stumbled upon that (unless "slowly" is to be taken VERY literally)...
-
.
.
It's slowly being adopted in the embedded landscape as well.
Really? Then I must have been sleeping too much as I have not exactly stumbled upon that (unless "slowly" is to be taken VERY literally)...
Wow Rust has moved up to the #3 spot this year (was #9 last year). It seems to remain #1 in the "most loved language" category...
https://codinginfinite.com/best-programming-languages-to-learn-2019/
-
Please, give some examples of the safety guarantees.
Rust prevents common errors such as buffer overflows and data races statically at compile time.
Some types of races, at least. But you can't make things foolproof, since fools are so damn ingenious. The Dunning-Kruger effect is very evident in programming.
-
Please, give some examples of the safety guarantees.
Rust prevents common errors such as buffer overflows and data races statically at compile time.
Some types of races, at least. But you can't make things foolproof, since fools are so damn ingenious. The Dunning-Kruger effect is very evident in programming.
There's no substitute for understanding programming paradigms and actually designing a program instead of just slapping code together. The last 20 years of my career I was responsible for writing, and later for analyzing a lot of third-party web code and the process seemed to be "let newly-fledged Java coders write huge apps and hope for the best". They apparently assumed Java's garbage collection and libraries were going to prevent them from doing anything bad, with the result that garbage collection and mismanaged libraries caused problems by the score. We actually were recommending suites of monitoring software to watch the behavior of all this bloated, misdesigned code - yes, it was so bad that we had software monitoring software. :--
In contrast, during my prior job we had to write a very large embedded program without the "help" of object-oriented languages and their concomitant structures. We actually prepared a series of templates and processes, analyzed the pseudocode before writing the functional version, and stepped through the result before committing it to the compiler. Lots more work than just coding and debugging, but I'm convinced it saved us a lot of effort and the code was highly reliable (it was a realtime system for aircraft avionics; not vital to the flight system, but important to the end user).
I'm just beginning to explore Rust and Go, but they do seem to be designed so that intelligently-written code is much better behaved than with monstrosities like C++ and Java. Of course you'll be able to write crap with any of them, but these languages, at least, seem to be built to provide more capability and make better software possible instead of effortful. I'm hopeful that with the attention (e.g.) Rust is receiving, it may displace a lot of the junk and encourage coders to think and design instead of just following the received wisdom of a failed generation of software implementation.
-
yes, it was so bad that we had software monitoring software. :--
Which in turn probably needs software monitoring software monitoring software, no doubt produced by a team managed by ex-Major Major Major Major.
There's a lot to be said for having been brought up in the era when you wrote on coding forms, had it punched, and then had to wait for the fan-fold printout. Aside from the obvious advantages of an endless supply of used 80-column cards for bookmarks and shopping lists, it made you careful and it made you plan. Recent research suggests that there's a factor of between 10 and 100 in the total effort expended between taking the time to get it right at the start of the software development process versus fixing bugs in deployment or production. The further down the pipeline towards production you go before fixing things, the higher the ratio.
-
Please, give some examples of the safety guarantees.
Rust prevents common errors such as buffer overflows and data races statically at compile time.
Some types of races, at least. But you can't make things foolproof, since fools are so damn ingenious. The Dunning-Kruger effect is very evident in programming.
There's no substitute for understanding programming paradigms and actually designing a program instead of just slapping code together. The last 20 years of my career I was responsible for writing, and later for analyzing a lot of third-party web code and the process seemed to be "let newly-fledged Java coders write huge apps and hope for the best". They apparently assumed Java's garbage collection and libraries were going to prevent them from doing anything bad, with the result that garbage collection and mismanaged libraries caused problems by the score. We actually were recommending suites of monitoring software to watch the behavior of all this bloated, misdesigned code - yes, it was so bad that we had software monitoring software. :--
I've seen all that, but don't place the root cause of the problems on a language.
The root cause is that OOP+GC has enabled large libraries to be easily created, bolted together and used. Much larger and more easily than without OOP+GC. While that has removed swathes of "old" problems, it has allowed others to emerge.
A classic example is blaming problems on "memory leaks", when actually it is "unintended data retention".
In contrast, during my prior job we had to write a very large embedded program without the "help" of object-oriented languages and their concomitant structures. We actually prepared a series of templates and processes, analyzed the pseudocode before writing the functional version, and stepped through the result before committing it to the compiler. Lots more work than just coding and debugging, but I'm convinced it saved us a lot of effort and the code was highly reliable (it was a realtime system for aircraft avionics; not vital to the flight system, but important to the end user).
I don't doubt that, but it is nothing to do with OOP+GC.
I'm just beginning to explore Rust and Go, but they do seem to be designed so that intelligently-written code is much better behaved than with monstrosities like C++ and Java. Of course you'll be able to write crap with any of them, but these languages, at least, seem to be built to provide more capability and make better software possible instead of effortful. I'm hopeful that with the attention (e.g.) Rust is receiving, it may displace a lot of the junk and encourage coders to think and design instead of just following the received wisdom of a failed generation of software implementation.
Rust and Go are beguiling, but I've seen too many languages with claimed advantages that turned out to be minor and unimportant. Ignoring them and concentrating on the real advances has been key to my career.
Neither Rust nor Go directly tackles a key emerging problem: many core processors with NUMA memory hierarchy. The only languages I've seen that do tackle it head on are CSP, Occam and the modern incarnation, xC. Note that the concept of "channels" is taken directly from CSP and Occam.
-
No, the specific problems are not linked directly to OOP/GC, but the fact that universities and employers push languages like Java without teaching the coders how to actually design programs first does cause that type of problem. It's so easy to make a gargantuan dogpile in Java or C++ that unless the coders actually blocked it out and analyzed it, they're likely to just cause problem after problem when they add more code to address the symptoms.
The "memory leak" issue comes from people writing spaghetti code or using libraries with unintended side effects, and not caring or knowing that they have to be aware of memory usage in spite of GC. Many of the problems we saw in production were the GC kicking in at a very bad time and grinding the programs to a halt because the programmers didn't bother to understand or stress test the code before submitting it to production. Everything to do with bad design and very little to do with the language - except that Java and its ilk are not taught along with proper design techniques (at least among the programmers we worked with at AT&T). I tended to write procedural Java (much despised by "real" OO programmers) but because I knew where memory was being used and for what, my code didn't have those issues; it may have run a bit slower at a guess, but it never crashed.
So my point was that languages like Java and C++ typically exhibit issues because the programmers aren't taught good techniques, and those languages are likely to be used a lot by inexperienced programmers IME who are too lazy to code wisely. They're also hard to use efficiently by fledgling programmers. Heck, I used Java for 16 years and never got familiar enough with it to be aware of all the pitfalls. C, on the other hand, is so easy to master that I think I understood it fairly well.
I haven't tossed my hat in the Rust or Go rings yet, but I'm watching to see if they do indeed exhibit advances which make them worth adopting. If so, then great.
-
No, the specific problems are not linked directly to OOP/GC, but the fact that universities and employers push languages like Java without teaching the coders how to actually design programs first does cause that type of problem. It's so easy to make a gargantuan dogpile in Java or C++ that unless the coders actually blocked it out and analyzed it, they're likely to just cause problem after problem when they add more code to address the symptoms.
I'm certainly not going to defend C++ in any way; my attitude is that if C++ is the answer, then exactly what was the question?
The gargantuan dogpiles that can accrete in Java are because of its success in avoiding many of the problems in C/C++. Essentially, avoiding those problems has allowed run-of-the-mill programmers to make much larger programs than is practical in C/C++.
As for not teaching how to design programs, t'was ever thus. Some people should never be allowed near a keyboard or a soldering iron!
The "memory leak" issue comes from people writing spaghetti code or using libraries with unintended side effects, and not caring or knowing that they have to be aware of memory usage in spite of GC. Many of the problems we saw in production were the GC kicking in at a very bad time and grinding the programs to a halt because the programmers didn't bother to understand or stress test the code before submitting it to production. Everything to do with bad design and very little to do with the language - except that Java and its ilk are not taught along with proper design techniques (at least among the programmers we worked with at AT&T). I tended to write procedural Java (much despised by "real" OO programmers) but because I knew where memory was being used and for what, my code didn't have those issues; it may have run a bit slower at a guess, but it never crashed.
I wasn't clear enough. My objection is calling "data cancer" a "memory leak".
And it is well known that you can write Fortran in any language :)
So my point was that languages like Java and C++ typically exhibit issues because the programmers aren't taught good techniques, and those languages are likely to be used a lot by inexperienced programmers IME who are too lazy to code wisely. They're also hard to use efficiently by fledgling programmers. Heck, I used Java for 16 years and never got familiar enough with it to be aware of all the pitfalls. C, on the other hand, is so easy to master that I think I understood it fairly well.
As someone that first used C when there were precisely two books on the language (i.e. in 1982), I am only too well aware of how people think they know the language - but don't. C++ is even worse, and is beyond being salvaged.
Java is a much simpler language in that it has fewer gotchas - there's no need for "language lawyers", for example. The libraries can have gotchas, but that's a different thing altogether.
I haven't tossed my hat in the Rust or Go rings yet, but I'm watching to see if they do indeed exhibit advances which make them worth adopting. If so, then great.
That's my attitude. They show promise, but I've seen that too often in the past to hold my breath!
-
Indeed; I also tend to ignore languages until there is a standard document (with a version number that pins to a particular release of the standard; HTML '5', looking at YOU!) and at least two competing implementations of that standard from different vendors...
Trying to keep up with a language under development is just an exercise in frustration (and means that a project you finished last year likely does not build with this year's toolchain) |O
C, for all its warts (and numerous implementation-defined or undefined-behaviour traps), is at least reasonably good at maintaining backwards compatibility, and you can, if you really know the language, do a surprising amount that is very likely to still compile reasonably cleanly 10 years from now; I am not so sure this applies to Rust.
Regards, Dan.
-
Trying to keep up with a language under development is just an exercise in frustration (and means that a project you finished last year likely does not build with this year's toolchain) |O
C, for all its warts (and numerous implementation-defined or undefined-behaviour traps), is at least reasonably good at maintaining backwards compatibility, and you can, if you really know the language, do a surprising amount that is very likely to still compile reasonably cleanly 10 years from now; I am not so sure this applies to Rust.
I've heard that there are applications which are Rust version-specific, which is another reason I'm not going to dive in right now. It's already unacceptable that finished applications require specific (not minimum) Java versions to run; if a 25-year-old language is that unstable (or your app is so poorly written), maybe consider a different language? I would anyway, but then Java doesn't do anything I need in a language that (say) Python doesn't have.
People moan about the "modern" language features that C is missing, but it is a low-level language and wasn't designed to be the answer to all things. I'm glad to see that it has retained its economical architecture even as bloat infests the popular languages; a lot of embedded microcontrollers still use C development tools because it's a time-tested and efficient language.
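For what it's worth, Rust's answer to the version-pinning worry is the edition system: a crate declares the language edition it was written against, and newer compilers keep building it under those rules. A sketch of the relevant Cargo.toml fields (the crate name and version numbers here are hypothetical, just to show the shape):

```toml
[package]
name = "example-app"     # hypothetical crate name
version = "0.1.0"
edition = "2018"         # pins language semantics; later compilers still honour it

[dependencies]
serde = "=1.0.99"        # '=' pins an exact dependency version for reproducible builds
```

Whether that discipline is actually followed by application authors is, of course, a separate matter.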
-
Trying to keep up with a language under development is just an exercise in frustration (and means that a project you finished last year likely does not build with this year's toolchain) |O
C, for all its warts (and numerous implementation-defined or undefined-behaviour traps), is at least reasonably good at maintaining backwards compatibility, and you can, if you really know the language, do a surprising amount that is very likely to still compile reasonably cleanly 10 years from now; I am not so sure this applies to Rust.
I've heard that there are applications which are Rust version-specific, which is another reason I'm not going to dive in right now. It's already unacceptable that finished applications require specific (not minimum) Java versions to run; if a 25-year-old language is that unstable (or your app is so poorly written), maybe consider a different language? I would anyway, but then Java doesn't do anything I need in a language that (say) Python doesn't have.
Oh, but for programs with an expected long lifetime, with C and C++ it is prudent to insist on using the same version of the compiler. There are quite a few cases where newer versions of a compiler optimise code differently - and break the application. Hence SOP is to develop inside a virtual machine, and to archive that virtual machine as part of the deliverable.
Now the traditional C compiler writer's response is that the developer misused the language. While that may (or may not) be true, it is certainly an indictment of the fragility of the language+compiler+developer combination.
People moan about the "modern" language features that C is missing, but it is a low-level language and wasn't designed to be the answer to all things. I'm glad to see that it has retained its economical architecture even as bloat infests the popular languages; a lot of embedded microcontrollers still use C development tools because it's a time-tested and efficient language.
Er no:
- C/C++ has changed considerably. The latest versions now have a defined memory model (a quarter of a century after Java demonstrated its advisability). Whether that memory model is defined well enough remains to be seen.
- You could not, until the latest version, even implement multitasking. To me it is remarkable that Hans Boehm even had to write a paper pointing that out: http://hboehm.info/misc_slides/pldi05_threads.pdf "Threads Cannot be Implemented as a Library"
In the 80s C/C++ was a low level language, but it became confused about its objectives in the early 90s. It would have been equally valid and useful to be either a low level language or a general purpose high level applications development language, but you can't be both. In trying to be both, it failed to be good at either.
Java then rapidly took over as a general high level applications language, and Python is also up and coming.
-
C, on the other hand, is so easy to master that I think I understood it fairly well.
Mastering it of course does not allow any non-trivial project with more than one programmer to avoid buffer overflows and use-after-free; they are de facto language features... not bugs.
-
- To me it is remarkable that Hans Boehm even had to write a paper pointing that out: http://hboehm.info/misc_slides/pldi05_threads.pdf "Threads Cannot be Implemented as a Library"
You actually cannot implement malloc in a conforming way as a library either (The issue has to do with alignment, and not actually knowing which type has the tightest requirements), not news to anyone who has actually read the standard.
The get-out is that the C library implementation is NOT itself required to be written in standards-conformant C, and writing it using either knowledge of implementation-defined details of the compiler or (in some cases) by dropping down to platform-specific assembler is perfectly legit. So it is with threads (C has no native test-and-set atomic, for example, which makes locking primitives kind of hard).
VMs with all the project build tools are a good and happymaking thing, but C11 is mostly a superset of the earlier language and backwards compatibility is fairly good in my experience (C++ MUCH less so, that language has been bent out of all recognition).
The real win with the VMs is that they mean you can reproduce bugs with exactly the tools used to originally build the release; usually a difference between tools means you cocked it up, but still being able to reproduce exactly has value.
-
C, on the other hand, is so easy to master that I think I understood it fairly well.
Mastering it of course does not allow any non-trivial project with more than one programmer to avoid buffer overflows and use-after-free; they are de facto language features... not bugs.
True, but it is more than just those "features".
See the C++ Frequently Questioned Answers: http://yosefk.com/c++fqa/ The standard response is "yes, but the latest version of the standard is better", neglecting whether there are any compilers that fully and correctly implement the standard, and whether the new compilers can be used in the current project.
If C++ is the answer, you should re-evaluate the question.
-
:o Just found this new subforum with nothing in it. Took the opportunity to start a thread about my favourite programming language: https://rust-lang.org. It's a systems language, similar to C, but with more safety guarantees and modern features like iterators. It's slowly being adopted in the embedded landscape as well. Ask Me Anything.
Hi Sajattack,
I'm a Forth user and would like to say Rust has been of great benefit to me indirectly :)
Some Rust folks wrote a program to convert the Texas Instruments version of CMSIS-SVD (everyone just has to have their own special stuff!) into the ARM version of CMSIS-SVD, and it works perfectly! This allows me to use my SVD parsers on TI MCUs as well as ARM MCUs to automatically produce memory-mapped and register-bitfield Forth words. Thanks guys!
-
Totally off-topic, but I seriously read the subject of the thread as "Rust in Peace Megadeth"
(https://i.ytimg.com/vi/9q6JFpQS384/maxresdefault.jpg)
-
I prefer Neil Young's Rust Never Sleeps.
-
Ad-Blocker Performance increased by 69x with New Engine Implementation in Rust
https://brave.com/improved-ad-blocker-performance/
-
Ad-Blocker Performance increased by 69x with New Engine Implementation in Rust
https://brave.com/improved-ad-blocker-performance/
We therefore rebuilt our ad-blocker taking inspiration from uBlock Origin and Ghostery’s ad-blocker approach. This focuses on a tokenization approach specific to ad-block rule matching against URLs and rule evaluation optimised to the different kinds of rules. We implemented the new engine in Rust as a memory-safe, performant language compilable down to native code and suitable to run within the native browser core as well as being packaged in a standalone Node.js module. Overall, we found that:
- The new algorithm with optimised set of rules is 69x faster on average than the current engine.
So, what does this have to do with Rust?
If they had written it in C it would probably be 300 times faster...
-
Ad-Blocker Performance increased by 69x with New Engine Implementation in Rust
https://brave.com/improved-ad-blocker-performance/
We therefore rebuilt our ad-blocker taking inspiration from uBlock Origin and Ghostery’s ad-blocker approach. This focuses on a tokenization approach specific to ad-block rule matching against URLs and rule evaluation optimised to the different kinds of rules. We implemented the new engine in Rust as a memory-safe, performant language compilable down to native code and suitable to run within the native browser core as well as being packaged in a standalone Node.js module. Overall, we found that:
- The new algorithm with optimised set of rules is 69x faster on average than the current engine.
So, what does this have to do with Rust?
If they had written it in C it would probably be 300 times faster...
It's just a demonstration of how Rust is being accepted by industry, and able to compile into very performant executables while having many advantages over C.
-
While I personally am a rank amateur programmer, we had a job at work about 2 years ago to write a kind of server. It had some data structures, a bunch of algorithms etc. not a huge project, but not trivial either. The young programmer who usually writes c++ wanted to do it in Rust. I said go for it. He wrote it in a few weeks. The server program has run now for two years under heavy load every day and has crashed zero times. That was a fairly impressive demo for me.
-
So, what does this have to do with Rust?
If they had written it in C it would probably be 300 times faster...
Some tools originally written in C (grep) are actually faster in Rust (ripgrep). Never mind the fact that Rust's worst case is nowhere near 300x slower; what do you think this is, an interpreted language?
-
So, what does this have to do with Rust?
If they had written it in C it would probably be 300 times faster...
Some tools originally written in C (grep) are actually faster in Rust (ripgrep).
Thanks to better algorithms and despite the language.
Never mind the fact that Rust's worst case is nowhere near 300x slower; what do you think this is, an interpreted language?
300x compared to 69x, relatively speaking 4.347826087 times faster ;D
-
Microsoft starting to look at Rust as a more secure alternative to C, C++
https://www.theregister.co.uk/2019/07/18/microsoft_rust_security/
https://msrc-blog.microsoft.com/2019/07/16/a-proactive-approach-to-more-secure-code/
-
Microsoft starting to look at Rust as a more secure alternative to C, C++
https://www.theregister.co.uk/2019/07/18/microsoft_rust_security/
https://msrc-blog.microsoft.com/2019/07/16/a-proactive-approach-to-more-secure-code/
More accurately, "one small part of Microsoft starting to look at Rust".
It is a start, but the accumulated code (etc etc) is a big impediment.
-
To me, using a language that is not standardized is a big impediment in itself. Some others may not care. I do.
Of course you can say that it's a chicken-and-egg question and that to get a chance to be standardized, a language first has to come into existence and get used widely enough. But when a language is at this "early" stage, you often have to stick to just ONE compiler, giving it your full trust even when there's nothing that can really tell you how robust it is, except a bunch of fanatic users. ;D
And that said, there have been SO MANY attempts releasing alternatives to C and C++ in the past 30 years or so, all meant to be that much safer, cure all diseases and solve world hunger... which one has really taken off? (And don't get me started with Java! :-DD )
Not saying that it's not interesting though - I've taken a look at Rust on several occasions and there are interesting concepts, although I find it a little too "bloated" already (I tend to lean towards Wirth's philosophy of "lean" languages, although I think he may have taken this a bit too far.)
I'm just really wondering what would need to be done in order to come up with a new language that really has a chance to take off and replace older languages that are still widely used. Yeah accumulated code is of course a big problem, but I think this is far from being the only one.
-
To me, using a language that is not standardized is a big impediment in itself. Some others may not care. I do.
Of course you can say that it's a chicken-and-egg question and that to get a chance to be standardized, a language first has to come into existence and get used widely enough. But when a language is at this "early" stage, you often have to stick to just ONE compiler, giving it your full trust even when there's nothing that can really tell you how robust it is, except a bunch of fanatic users. ;D
And that said, there have been SO MANY attempts releasing alternatives to C and C++ in the past 30 years or so, all meant to be that much safer, cure all diseases and solve world hunger... which one has really taken off? (And don't get me started with Java! :-DD )
Not saying that it's not interesting though - I've taken a look at Rust on several occasions and there are interesting concepts, although I find it a little too "bloated" already (I tend to lean towards Wirth's philosophy of "lean" languages, although I think he may have taken this a bit too far.)
I'm just really wondering what would need to be done in order to come up with a new language that really has a chance to take off and replace older languages that are still widely used. Yeah accumulated code is of course a big problem, but I think this is far from being the only one.
You are being a bit blinkered by your prejudices.
Smalltalk was highly influential, and is embedded in several complex HP and Tektronix instruments.
Objective-C has also been rather successful, and is only recently being overtaken by Swift.
Java has taken off and succeeded extremely well in areas where C++ tried and failed dismally. OTOH, Java failed in the embedded arena.
As for languages with a standard, C/C++ is extremely good: there are so many different C/C++ standards to choose from.
As for whether Rust will succeed, only time will tell. It does have some beguiling core principles that avoid the frequent problems with C/C++, but it doesn't (yet) have the critical mass.
-
You are being a bit blinkered by your prejudices.
If you say so. We are all expressing our own opinions here. I'm not claiming that I hold the Truth.
My only prejudice though here, I think, is my initial reaction of considering Rust as "yet another me too wannabe". Sure it's a prejudice, not denying, but based on experience. We'll talk about it in 10 years from now if we are still around, I won't have a problem with being proven wrong.
Smalltalk was highly influential, and is embedded in several complex HP and Tektronix instruments.
Smalltalk's initial design actually predated C, and had pretty different objectives IMO, so it doesn't fall into the categories I was mentioning whatsoever (I was absolutely not saying that NO other language than C and C++ was worth considering!) But yes it has been highly influential, as many other languages of that era.
As to being used, that's a very small niche you're talking about.
Objective-C has also been rather successful, and is only recently being overtaken by Swift.
AFAIK, only Apple made it successful, and it's already declining now. Still the epitome of a niche. I actually found Objective-C better designed than C++, so this could have been one I had been happy to see taking over at least C++.
As to being used, that's still a niche we're talking about, and it's probably going to disappear completely once everything has migrated to Swift. Or something else.
Java has taken off and succeeded extremely well in areas where C++ tried and failed dismally. OTOH, Java failed in the embedded arena.
To me, Java is a disgrace for many reasons that would be too long to expose here. It has succeeded in some areas, I agree. That's the only serious example of a replacement here. I just don't think it has been a sane one, but that's another story. It worked. So whatever we think of the language, there are probably key ideas in it and in the way it was promoted that need to be considered to partially answer my last question.
As for languages with a standard, C/C++ is extremely good: there are so many different C/C++ standards to choose from.
It's the only guarantee to me that the tools will be correctly designed and do what they are supposed to. Yes, standards are evolving; so? Does that make them irrelevant? I don't think so. Incidentally, the major languages that are still in common use today in safety-critical applications, such as C and Ada, are standardized. Call me when Java, Smalltalk or Objective-C (or now Rust) is used in such applications. (And I hope Boeing is not taking this as a hint :P )
Again, as I said above, that's my own view and criterion. If you don't care or even think standards are counter-productive, good for you. To each his own.
As for whether Rust will succeed, only time will tell. It does have some beguiling core principles that avoid the frequent problems with C/C++, but it doesn't (yet) have the critical mass.
Of course only time will tell. Again, I'm just pointing out the absurd number of previous attempts all wishing the same thing, all having core features supposed to avoid the frequent problems with C and C++.
And I'm not bashing the idea of attempting to do so per se, I'm just convinced they all have been missing some critical points that I'm not completely sure about at this point (if I were, I may make such an attempt myself ;D ), and, again, I do think this goes beyond the mere force of habit or huge code bases, even though those obviously count.
Not willing to rain on Rust's parade. I just haven't detected anything in Rust that would really separate it from the previous attempts, but I'd be interested in hearing which features exactly could make it different, and in which areas specifically.
-
As for whether Rust will succeed, only time will tell. It does have some beguiling core principles that avoid the frequent problems with C/C++, but it doesn't (yet) have the critical mass.
Not willing to rain on Rust's parade. I just haven't detected anything in Rust that would really separate it from the previous attempts, but I'd be interested in hearing which features exactly could make it different, and in which areas specifically.
All you have to do is look at the tutorials and documentation at, for example, https://doc.rust-lang.org/book/. Having said that, the concepts are scattered and introduced gradually; the Java whitepaper was much better in that respect.
Essentially there appear to be a number of relatively small concepts which cumulatively remove many of the gotchas, particularly in the areas relating to concurrency and memory. In addition there are design styles based on those concepts that encourage people to "think concurrent" in ways that have been shown to be workable.
-
I prefer Neil Young's Rust Never Sleeps.
Hey hey, my my
Bugs in code will never die
-
All you have to do is look at the tutorials and documentation at, for example, https://doc.rust-lang.org/book/. Having said that, the concepts are scattered and introduced gradually; the Java whitepaper was much better in that respect.
Well, I have! Granted not extensively... and yes it's not really summing things up.
Essentially there appear to be a number of relatively small concepts which cumulatively remove many of the gotchas, particularly in the areas relating to concurrency and memory. In addition there are design styles based on those concepts that encourage people to "think concurrent" in ways that have been shown to be workable.
Well, yes. But again, I haven't really seen anything that was not already tried in other languages for the same reasons. Maybe it's all in the small details and the exact set of all the language features that they chose for Rust that makes it different, but it's not very obvious to me.
-
All you have to do is look at the tutorials and documentation at, for example, https://doc.rust-lang.org/book/. Having said that, the concepts are scattered and introduced gradually; the Java whitepaper was much better in that respect.
Well, I have! Granted not extensively... and yes it's not really summing things up.
Essentially there appear to be a number of relatively small concepts which cumulatively remove many of the gotchas, particularly in the areas relating to concurrency and memory. In addition there are design styles based on those concepts that encourage people to "think concurrent" in ways that have been shown to be workable.
Well, yes. But again, I haven't really seen anything that was not already tried in other languages for the same reasons. Maybe it's all in the small details and the exact set of all the language features that they chose for Rust that makes it different, but it's not very obvious to me.
There was a revealing major difference between the academic papers from the C/C++ community and the Java community:
- Java community: referenced the concepts and experience found in a wide range of languages and environments
- C/C++ community: referenced other C/C++ literature, and appeared not to know much about other languages and environments. That doomed them to reinventing wheels, poorly
A notable feature of the Java whitepaper was that it clearly stated the key principles, where they came from, why they had been proven good, and how they all worked well together. Summary: wide knowledge plus good taste.
The Rust document lacks that clarity, but I'm beguiled by their sense of taste in a way I've only found with C (in 1981!), Smalltalk, Java and - for hard realtime applications - xC.
-
For the most part I've dismissed Rust as something like Go or Node.js; lots of hype from fans, but nothing you couldn't do in a library in any major language. Plus the language fans try to build it up by tearing down other languages. There are lots of good languages and technologies. It is really hard to say that everyone should program in x or everyone shouldn't program in y.
Reading through this thread, I saw in Rust documentation that it doesn't support NULL but uses algebraic data types like Haskell. That, in and of itself, is a major plus in my book. When Java tried to fix C++ or MS did C# and .Net, they all made the mistake of keeping NULL ... and making it the default! If anything, C++ handles NULL better by pushing value semantics and references that don't have NULL.
So yeah, I guess I am intrigued by Rust now.
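To make the no-null point concrete, here is a minimal sketch (mine, not from any Rust documentation): absence is expressed with the standard `Option<T>` type, and the compiler forces the `None` case to be handled before the value can be used.

```rust
// Absence is a type, not a magic value: Option<T> is either Some(T) or None.
fn find_even(values: &[i32]) -> Option<i32> {
    values.iter().copied().find(|v| v % 2 == 0)
}

fn main() {
    let nums = [1, 3, 4, 7];
    // The compiler will not let us use the result without handling None.
    match find_even(&nums) {
        Some(v) => println!("first even: {}", v),
        None => println!("no even number"),
    }
    // Convenience forms exist too; unwrap_or supplies a default instead of crashing.
    let fallback = find_even(&[1, 3]).unwrap_or(-1);
    println!("{}", fallback); // prints -1
}
```

There is no way to "forget the null check" here: a missing `None` arm is a compile error, not a runtime NullPointerException.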
-
For the most part I've dismissed Rust as something like Go or Node.js; lots of hype from fans, but nothing you couldn't do in a library in any major language. Plus the language fans try to build it up by tearing down other languages. There are lots of good languages and technologies. It is really hard to say that everyone should program in x or everyone shouldn't program in y.
Reading through this thread, I saw in Rust documentation that it doesn't support NULL but uses algebraic data types like Haskell. That, in and of itself, is a major plus in my book. When Java tried to fix C++ or MS did C# and .Net, they all made the mistake of keeping NULL ... and making it the default! If anything, C++ handles NULL better by pushing value semantics and references that don't have NULL.
So yeah, I guess I am intrigued by Rust now.
That's pretty much my view.
One of my heroes created that "billion dollar mistake"; https://en.m.wikipedia.org/wiki/Tony_Hoare#Apologies_and_retractions
-
One of my heroes created that "billion dollar mistake"; https://en.m.wikipedia.org/wiki/Tony_Hoare#Apologies_and_retractions
Certainly one of mine as well.
Now of course there can be a fine line between what's reasonable and what's not.
We could also say that we should get rid of the zero value in all numerical representations so that no computation would ever result in a "divide by zero" error (which can potentially ruin a program's execution just as badly as a bad reference). Would that be going a little too far?
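For what it's worth, the same Option idea extends to arithmetic without having to ban zero from the number system. A minimal sketch (my illustration, not from the thread) using the standard library's checked division:

```rust
fn main() {
    let a: i32 = 10;
    for b in [2, 0] {
        // checked_div returns None on division by zero (or overflow)
        // instead of trapping and ruining the program's execution.
        match a.checked_div(b) {
            Some(q) => println!("{} / {} = {}", a, b, q),
            None => println!("{} / {} is undefined", a, b),
        }
    }
}
```

So the "divide by zero" hazard becomes an ordinary value to handle, just like a missing reference, with no change to the numerical representation itself.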
-
:o Just found this new subforum with nothing in it. Took the opportunity to start a thread about my favourite programming language, https://rust-lang.org It's a systems language, similar to C, but with more safety guarantees and modern features like iterators. It's slowly being adopted in the embedded landscape as well. Ask Me Anything.
I don't see why someone should throw away everything they have invested in the continuously improving and dominantly used C++.
As I see it, the things Rust claims to improve can also be achieved with C++ by appropriate methodology and (possibly future) static checking/diagnostic tools, but at an overall lower and also incremental effort.
As I see it, the stagnation before C++11 might have triggered the Rust people to try their own way, but this is becoming more and more obsolete.
-
:o Just found this new subforum with nothing in it. Took the opportunity to start a thread about my favourite programming language, https://rust-lang.org It's a systems language, similar to C, but with more safety guarantees and modern features like iterators. It's slowly being adopted in the embedded landscape as well. Ask Me Anything.
I don't see why someone should throw away everything they have invested in the continuously improving and dominantly used C++.
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
As I see it, the things Rust claims to improve can also be achieved with C++ by appropriate methodology and (possibly future) static checking/diagnostic tools, but at an overall lower and also incremental effort.
You miss a critical point: presuming it is even possible that can only be achieved in a subset of C++. And every person/group/company chooses a slightly different subset.
Now try to guarantee your program's properties when using libraries from 10 different sources - you simply cannot.
-
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
That might not be the best example to choose. Semiconductor memory started displacing core memory at the beginning of the 70s, but it took until the mid 80s to displace it from high-reliability applications.
-
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
That might not be the best example to choose. Semiconductor memory started displacing core memory at the beginning of the 70s, but it took until the mid 80s to displace it from high-reliability applications.
It was clear, in 1978, that core memory was the past, and semiconductor memory the future. It didn't take a genius to see that!
-
If anyone wants to get their feet wet with embedded Rust, The Discovery Book (https://docs.rust-embedded.org/discovery/index.html) is an introductory tutorial using the STM32F3DISCOVERY board. Fair warning, it is not a tutorial on the Rust language itself. There's also The embedded Rust book (https://rust-embedded.github.io/book/), and The embedonomicon (https://docs.rust-embedded.org/embedonomicon/index.html) for more in-depth coverage.
-
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
That might not be the best example to choose. Semiconductor memory started displacing core memory at the beginning of the 70s, but it took until the mid 80s to displace it from high-reliability applications.
It was clear, in 1978, that core memory was the past, and semiconductor memory the future. It didn't take a genius to see that!
It was clear in 1972 that core memory was the past. However, core was still being designed into fresh high-reliability applications in 1980.
-
OTOH, it's not clear in 2019 that Rust is the future.
;D
-
OTOH, it's not clear in 2019 that Rust is the future.
;D
Just so!
As I wrote earlier
Rust and Go are beguiling, but I've seen too many languages with claimed advantages that turned out to be minor and unimportant. Ignoring them and concentrating on the real advances has been key to my career.
Neither Rust nor Go directly tackles a key emerging problem: many core processors with NUMA memory hierarchy. The only languages I've seen that do tackle it head on are CSP, Occam and the modern incarnation, xC. Note that the concept of "channels" is taken directly from CSP and Occam.
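For readers who haven't met CSP-style channels: Rust's standard library does ship one (std::sync::mpsc), even if the language doesn't tackle NUMA head on. A minimal sketch (the workload is my invention) of threads communicating by message passing rather than shared mutable state:

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    // CSP/Occam lineage: threads talk over a channel instead of
    // sharing mutable state guarded by locks.
    let (tx, rx) = mpsc::channel();
    let workers: Vec<_> = (0..3)
        .map(|id| {
            let tx = tx.clone();
            thread::spawn(move || tx.send(id * id).unwrap())
        })
        .collect();
    drop(tx); // drop our own sender so the receive loop terminates

    let mut total = 0;
    for v in rx {
        total += v;
    }
    for w in workers {
        w.join().unwrap();
    }
    println!("sum of squares: {}", total); // 0 + 1 + 4 = 5
}
```

The ownership rules make it a compile error to send a value and keep using it afterwards, which is where the data-race guarantee comes from.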
-
My pet peeve is the claim that C is a bad choice for new projects and that languages like Go or Rust are better options. This is only true if you take a subset of the lessons history has taught us and disregard the rest. I see this as irresponsible.
Fundamentally: having programs that "work at all" is more important than any form of bug or memory safety issue. In a set of specific cases (medical, auto & other life-critical) you can argue that a program should have the opposite priorities; but in the vast majority of cases a device that works 99% of the time is better for society than not having that device at all.
I think we are living in the 'information dark ages', where a lot of our useful and amazing creations of code are simply not going to be around ten or twenty years from now. They will be broken (or passively deleted, another problematic topic).
We seem to have come to accept that a program written today won't be useful or usable ten years from now, and I think that's a very bad philosophy. Compare it with many of the machines (cars, factories), artwork (music, books, film), architecture and systems (society, education, etc) that show how good creations can last decades or more. Most don't, but they have at least a chance of doing it.
The idea that a program or code can't survive 10 or 20 years without severe rewriting, because of a choice in its design (eg programming language, framework), is really bad. A similar argument can be made about a lot of modern car designs and DRM in digital works: we dislike the idea that they have an almost fixed use-by date.
I see choosing "new" languages as a very risky proposition to the life of a project. These languages are less likely to be around in ten or twenty years time (newer and better Rusts and Gos are likely to form in the years ahead) and in the meantime require you to constantly expend effort to prevent your projects from being turned into a brick by language/environment/library changes. In contrast my old C projects from 5-8 years ago still compile and run, with one particular example only needing a small modification. It's not perfect, but it's much safer.
TL;DR: "time safety" is more important than "memory safety" et al. We live in a world where time destroys more things than software bugs will ever be able to do. You should choose a language based off the biggest threats to the (integrated) usefulness of your projects, not based off the threats in your local (sampling) region. C is much more likely to last many times longer and much more stably than these new languages.
Should you care about code and software longevity if you are only making personal projects?
I still say yes. I want to be able to show and demonstrate projects I made "when I was young" to my kids, just like I will be able to show them old pictures of family. I want to be able to show them the lessons of my life, not show them that the only lesson is about "oh yeah we lost all of that".
It's always useful to be able to use old code from old projects too. Ideally perhaps even forever, but in our current world that's a longshot. But even code snippets that survive a few years are very useful. Contrast this to how many environment or framework specific code examples from the web that you have tried but found do not work any more -- you do not want your code becoming that.
Should we pursue these new languages?
Yes, experimentation and development drive our society. But ordinary people using them, IMHO, provides minimal benefit compared to the problems it causes.
This is the same reason that we don't all drive prototype cars and wear prototype clothes. Overall it would cause more harm than good.
As such: I don't think it's responsible to ask ordinary programmers to start their projects in new languages.
-
If anyone wants to get their feet wet with embedded Rust, The Discovery Book (https://docs.rust-embedded.org/discovery/index.html) is an introductory tutorial using the STM32F3DISCOVERY board. Fair warning, it is not a tutorial on the Rust language itself. There's also The embedded Rust book (https://rust-embedded.github.io/book/), and The embedonomicon (https://docs.rust-embedded.org/embedonomicon/index.html) for more in-depth coverage.
The (Rust) Discovery Book is nicely written and I think will be a great assistance to anyone interested in Rust on an STM32F3 Discovery (Cortex-M4). I'm a Cortex-M0 Forth user and not interested in Rust, but I have spoken with some keen and dedicated Rust users who seem to have a habit of writing programs very useful to me, i.e.:
svd2rust: https://github.com/whitequark/svd2rust. Register definitions are extracted from the DSLite files supplied with the TM4C module of the Energia IDE, converted to SVD using ti2svd, and generated as Rust code using svd2rust.
The MSP430xx SVD is created by Vadzim Dambrouski, https://github.com/pftbest/msp430_svd. msp430_svd takes DSLite files released by TI and processes them with the Rust programming language into SVD files suitable for Svd2mecrisp.
I consider understanding CMSIS-SVD a top priority for any Cortex-M embedded designer, and note that this was one of the areas that Rust programmers looked at and mastered, and in so doing even solved my problems, namely converting the Texas Instruments debug-related files in Energia that are basically a not-invented-here variant of SVD. This indicates to me that Rust *is* a serious embedded language.
This meant that I could add the MSP430 and TM4C MCUs to my list of auto-generated support files for Forth, PLUS I could now auto-generate assembly support files and modify the Mecrisp source using CMSIS-SVD-consistent naming.
-
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
You imply the difference between Rust and C++ is like the difference between core and semiconductor memory?
So with Rust your memory needs a tiny fraction of the space?
So with Rust your memory becomes thousands of times faster?
So Rust makes your memory super cheap?
As software engineering is not a physical device, the somewhat physical thing here is the engineer. And you say if this engineer is using Rust he will get these dramatic improvements?
Yours are the typical cheers of someone wanting to suggest benefits to people - if they believe him - that can never materialize.
You miss a critical point: presuming it is even possible that can only be achieved in a subset of C++. And every person/group/company chooses a slightly different subset.
Now try to guarantee your program's properties when using libraries from 10 different sources - you simply cannot.
They can choose, what is wrong with that?
It seems Rust only limits these choices; you say it is only for the "good", but maybe it is too intrusive or limiting, I don't know.
So you imply engineers are too stupid or lazy to deal with their freedom and responsibility to install proper coding guidelines and proper tools themselves, which is not as expensive as throwing everything old away, so they better submit to Rust?
This is quite pretentious, you know?
And why do you think that over time (or even already) the available libraries in Rust will not just become quite similar and have similar problems? Because after all it is more about the people writing them, not the slight variation of the language they are coded in.
-
Out of curiosity, what is your background and experience?
I'm a professional engineer that has been using C and other languages since 1981, and been designing and implementing systems ranging from low-noise analogue electronics and RF, through soft and hard realtime digital+software systems, to high availability telecoms systems. Hence I do have considerable experience in such systems work, and how they can fail.
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
You imply the difference between Rust and C++ is like the difference between core and semiconductor memory?
So with Rust your memory needs a tiny fraction of the space?
So with Rust your memory becomes thousands of times faster?
So Rust makes your memory super cheap?
I don't imply that, so those statements are irrelevant.
As software engineering is not a physical device, the somewhat physical thing here is the engineer. And you say if this engineer is using Rust he will get these dramatic improvements?
Yours are the typical cheers of someone wanting to suggest benefits to people - if they believe him - that can never materialize.
I don't understand what you are trying to say, but it looks like there are strawman arguments there.
You miss a critical point: presuming it is even possible that can only be achieved in a subset of C++. And every person/group/company chooses a slightly different subset.
Now try to guarantee your program's properties when using libraries from 10 different sources - you simply cannot.
They can choose, what is wrong with that?
The problem is that people make poor choices, some of which can be avoided by using a better tool.
It seems Rust only limits these choices, you say it is only for the "good", but maybe too intrusive or limiting, i don't know.
So you imply engineers are too stupid or lazy to deal with their freedom and responsibility to install proper coding guidelines and proper tools themselves,
That is indeed the evidence from the real world.
Read comp.risks. Every software and hardware engineer should do that, since it is low volume, carefully curated, and full of high quality examples from very experienced professionals.
Pay attention to technical news reports.
Look at the CERT advisories.
Very few people understand all the implications of all the various sections in all the various C and C++ standards. Those that do understand them have to spend far too much time working around boring low-level problems caused by the tools. It would be far better if they used their intellect to address problems inherent in the application they are creating.
This is quite pretentious, you know?
I expect English is not your first language.
And why do you think that over time (or even already) the available libraries in Rust will not just become quite similar and have similar problems? Because after all it is more about the people writing them, not the slight variation of the language they are coded in.
I anticipate that, over time, there will be problems with such libraries. But they will be new problems associated with being able to address applications that are not very practical in existing languages - especially those related to networking and NUMA multicore processors.
-
Out of curiosity, what is your background and experience?
I'm a professional engineer that has been using C and other languages since 1981, and been designing and implementing systems ranging from low-noise analogue electronics and RF, through soft and hard realtime digital+software systems, to high availability telecoms systems. Hence I do have considerable experience in such systems work, and how they can fail.
"How they can fail" ... the most important experience a designer can get imho.
-
Out of curiosity, what is your background and experience?
I'm a professional engineer that has been using C and other languages since 1981, and been designing and implementing systems ranging from low-noise analogue electronics and RF, through soft and hard realtime digital+software systems, to high availability telecoms systems. Hence I do have considerable experience in such systems work, and how they can fail.
"How they can fail" ... the most important experience a designer can get imho.
Beginners and amateurs/hobbyists concentrate on how things work. Professionals concentrate on how things fail, and design to avoid the failures as far as possible.
Frequently such design appears to be "a tail that wags the dog" and "overly complex". Classic examples are the detailed requirements of various mains electrical installation codes.
Beginners then think the details aren't important, decide they can do without them, and make subtle old mistakes that should be avoided.
-
So you imply engineers are too stupid or lazy to deal with their freedom and responsibility to install proper coding guidelines and proper tools themselves
Only the ones not considering alternative languages to avoid the common memory errors in C.
Which until recently was far far too many of them. They'd rather kludge on with tools and coding practices which can't even guarantee absence of those memory errors. Like good old MISRA C ... fat lot of good that did THREADX (https://nvd.nist.gov/vuln/detail/CVE-2019-6496).
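The THREADX CVE above is a memory error of exactly the class Rust rules out by construction. A minimal sketch (my own, nothing to do with the THREADX code) of how an out-of-bounds access becomes either a recoverable Option or a defined panic, never silent corruption:

```rust
fn main() {
    let buf = [10u8, 20, 30];
    // Plain indexing is bounds-checked: buf[5] would panic with a
    // defined error, never read or write adjacent memory.
    // The non-panicking form returns an Option instead:
    match buf.get(5) {
        Some(b) => println!("byte: {}", b),
        None => println!("index 5 is out of bounds for len {}", buf.len()),
    }
    assert_eq!(buf.get(1), Some(&20));
}
```

A coding guideline can only recommend such checks; here they are the language default, and opting out requires an explicit `unsafe` block that reviewers can grep for.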
-
So you imply engineers are too stupid or lazy to deal with their freedom and responsibility to install proper coding guidelines and proper tools themselves
Only the ones not considering alternative languages to avoid the common memory errors in C.
Which until recently was far far too many of them. They'd rather kludge on with tools and coding practices which can't even guarantee absence of those memory errors. Like good old MISRA C ... fat lot of good that did THREADX (https://nvd.nist.gov/vuln/detail/CVE-2019-6496).
No need for a new language: just use smart pointers (which you have had in C++ for more than 20 years), std::string, std::vector, etc., and proper coding techniques.
-
So you imply engineers are too stupid or lazy to deal with their freedom and responsibility to install proper coding guidelines and proper tools themselves
Only the ones not considering alternative languages to avoid the common memory errors in C.
Which until recently was far far too many of them. They'd rather kludge on with tools and coding practices which can't even guarantee absence of those memory errors. Like good old MISRA C ... fat lot of good that did THREADX (https://nvd.nist.gov/vuln/detail/CVE-2019-6496).
No need for a new language: just use smart pointers (which you have had in C++ for more than 20 years), std::string, std::vector, etc., and proper coding techniques.
The "escape clause" in any such guarantee is, of course, "and proper coding techniques".
Even without smart pointers, there are "proper coding techniques". Demonstrably programmers don't use them, so smart pointer will be convenient but insufficient.
Alternatively, if programmers did use "proper coding techniques", then smart pointers would be convenient but unnecessary.
That's the problem with the real world: you have to assume imperfect people, and choose your strategy to take that into account. Hence, if C++ is the answer, just what was the question?
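The contrast can be sketched in a few lines (my illustration, not part of the original exchange): in Rust the "proper coding technique" is not a convention that imperfect people may skip; ownership is checked by the compiler, so the mistake simply does not build.

```rust
// Taking a String by value takes ownership of it.
fn consume(s: String) -> usize {
    s.len()
} // s is dropped (freed) here: no delete, no leak, no double free

fn main() {
    let msg = String::from("hello");
    let n = consume(msg); // ownership of msg moves into consume
    println!("length: {}", n);
    // println!("{}", msg); // rejected at compile time: "value borrowed
    //                      // here after move" - i.e. use-after-free is
    //                      // a compile error, not a runtime bug
}
```

With a C++ unique_ptr the analogous use-after-move compiles and fails at runtime; here the escape clause is gone because the check is in the language, not in the guidelines.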
-
Out of curiosity, what is your background and experience?
I'm a professional engineer that has been using C and other languages since 1981, and been designing and implementing systems ranging from low-noise analogue electronics and RF, through soft and hard realtime digital+software systems, to high availability telecoms systems. Hence I do have considerable experience in such systems work, and how they can fail.
Someone might note that you could have used C++ then instead of C as it was a way to avoid common errors, improve abstraction and ease coding.
A very successful and quite groundbreaking way, back then and still today.
Agreed, but neither should they stay with something when something better is available. For example, I was once offered a job in a company that made core memory when semiconductor memory was available!
You imply the difference between Rust and C++ is like the difference between core and semiconductor memory?
So with Rust your memory needs a tiny fraction of the space?
So with Rust your memory becomes thousands of times faster?
So Rust makes your memory super cheap?
I don't imply that, so those statements are irrelevant.
So we agree? Rust is not significantly "better", and definitely not worth discarding everything proven?
You miss a critical point: presuming it is even possible that can only be achieved in a subset of C++. And every person/group/company chooses a slightly different subset.
Now try to guarantee your program's properties when using libraries from 10 different sources - you simply cannot.
They can choose, what is wrong with that?
The problem is that people make poor choices, some of which can be avoided by using a better tool.
And people always will. Look at history. The most persistent thing is human nature, making poor choices but also good choices that brought us into our world today.
It also looks like the effort spent stopping people from making presumably bad choices is less effective than just giving people choices and making them responsible for those choices.
Of course you have to be careful about whom you give how much choice and responsibility.
-
The "escape clause" in any such guarantee is, of course, "and proper coding techniques".
Even without smart pointers, there are "proper coding techniques". Demonstrably programmers don't use them, so smart pointer will be convenient but insufficient.
Alternatively, if programmers did use "proper coding techniques", then smart pointers would be convenient but unnecessary.
That's the problem with the real world: you have to assume imperfect people, and choose your strategy to take that into account. Hence, if C++ is the answer, just what was the question?
You call smart pointers in C++ unnecessary but convenient, yet advertise Rust as having a kind of elaborate smart pointer implicitly built in?
I think you made my point clear, thank you.
-
Out of curiosity, what is your background and experience?
I'm a professional engineer that has been using C and other languages since 1981, and been designing and implementing systems ranging from low-noise analogue electronics and RF, through soft and hard realtime digital+software systems, to high availability telecoms systems. Hence I do have considerable experience in such systems work, and how they can fail.
Someone might note that you could have used C++ then instead of C as it was a way to avoid common errors, improve abstraction and ease coding.
A very successful and quite groundbreaking way, back then and still today.
Oh, I did look at C++ and Objective-C in 1988. C++ didn't add any significant practical benefits, and introduced a whole new lot of traps. Even Stroustrup said (and says) as much. If you doubt that, have a look at the C++ FQA, and alternately weep and giggle: https://yosefk.com/c++fqa/
Objective-C did add benefits, since it took inspiration from Smalltalk.
In the early 90s the committee(s) spent at least a year discussing whether it should be impossible to "cast away const", or whether that was a necessary ability - there are good rationales for both. I, and others, took the attitude that if that couldn't be easily resolved, something was rotten in the state of Denmark.
When Java appeared in 96, it was very notable how many significant libraries were instantly available, and how they interplayed nicely with other libraries. C++ had totally failed to achieve that in a decade of development!
Again, I ask "what is your background and experience?". The reason I ask is because your statements are typical of those that have some small-scale experience of a language, but have little experience of how things work (and fail) in larger, longer-lived projects.
-
The "escape clause" in any such guarantee is, of course, "and proper coding techniques".
Even without smart pointers, there are "proper coding techniques". Demonstrably programmers don't use them, so smart pointer will be convenient but insufficient.
Alternatively, if programmers did use "proper coding techniques", then smart pointers would be convenient but unnecessary.
That's the problem with the real world: you have to assume imperfect people, and choose your strategy to take that into account. Hence, if C++ is the answer, just what was the question?
You call smart pointers in C++ unnecessary but convenient, yet advertise Rust as having a kind of elaborate smart pointer implicitly built in?
The detailed differences are important, both practically and theoretically. A key consideration is not any one feature in isolation, but how the sum of the features work together.
I think you made my point clear, thank you.
If you think that, then you understand less than you think.
-
Oh, I did look at C++ and Objective-C in 1988. C++ didn't add any significant practical benefits, and introduced a whole new lot of traps. Even Stroustrup said (and says) as much. If you doubt
that, have a look at the C++ FQA, and alternately weep and giggle. https://yosefk.com/c++fqa/
Objective-C did add benefits, since it took inspiration from Smalltalk.
It's a pity Stroustrup didn't look more closely at Objective-C. It has better functionality than (at least the early) C++, but I find the syntax of C++ more readable. Readable is good for reducing stupid errors.
More than this I wish there had been a true C++. A genuine incremental step from C that includes the basic structuring aspects of C++, like classes, but doesn't have all the stuff that can have serious performance issues, but only benefits a few applications. Sure, you can use C++ and try to avoid that stuff, but it only takes one slip.....
-
Oh, I did look at C++ and Objective-C in 1988. C++ didn't add any significant practical benefits, and introduced a whole new lot of traps. Even Stroustrup said (and says) as much. If you doubt
that, have a look at the C++ FQA, and alternately weep and giggle. https://yosefk.com/c++fqa/
Objective-C did add benefits, since it took inspiration from Smalltalk.
It's a pity Stroustrup didn't look more closely at Objective-C. It has better functionality than (at least the early) C++, but I find the syntax of C++ more readable. Readable is good for reducing stupid errors.
That would never have happened:
- the philosophy was too different. Objective-C took one known good idea and implemented it simply and effectively. Stroustrup made the deliberate choice to avoid making a choice, thus forcing every developer thereafter to make their own choices
- they were developed at the same time
- the C/C++ community traditionally ignores what is happening in other communities, whereas the Java (and other) communities have a far wider knowledge and select harmonious concepts from wherever they can find them. That is visible if you look at the references of academic papers. C/C++ papers reference other C/C++ papers, whereas Java (etc) papers reference a wider range
Summary: other languages/communities learn from other language/community mistakes. The C/C++ community is immersed in the complexity of their creation, so they repeat mistakes that others have avoided.
Rust is avoiding that inward looking dysfunction, so it stands a chance.
More than this I wish there had been a true C++. A genuine incremental step from C that includes the basic structuring aspects of C++, like classes, but doesn't have all the stuff that can have serious performance issues, but only benefits a few applications. Sure, you can use C++ and try to avoid that stuff, but it only takes one slip.....
Yes, yes, and yes: that's the problem that even Stroustrup acknowledges.
-
Again, I ask "what is your background and experience?". The reason I ask is because your statements are typical of those that have some small-scale experience of a language, but have little experience of how things work (and fail) in larger, longer-lived projects.
This is your argument? What about the subject?
So you say that only what you call a "large scale" and "long lived" project is a valid project?
You again expose yourself as an authoritarian supremacist, but I hope you are retired and don't bother engineers anymore.
Your statements are typical of a bureaucrat who likes to take his place of power in the hierarchy of a big company, not being productive but being more a politician or even an activist.
I can assure you, even without your "advice", people will succeed in solving their problems adequately in their own way, just as they did before you.
-
The detailed differences are important, both practically and theoretically. A key consideration is not any one feature in isolation, but how the sum of the features work together.
These inconsistencies are called progress, of which C++ has made a lot since C++11.
And surely the "inconsistency" of abandoning C++ totally is bigger.
If you think that, then you understand less than you think.
I at least gave an explanation of why I came to my conclusion; you just imply I don't understand.
"You don't understand anything!" - a well-known ultimate "argument".
-
I promised myself I'd stay out of this one, but I can't watch someone vigorously defend the idea that you can write something safe in C++ with the STL and proper coding standards. To err is human, and to err when the only tool you have is C++ is inevitable; a lot of the time no one in the company even knows there's a big err. Programmers have always been arrogant enough to suggest that self-imposed rules and constraints are enough to prevent whole classes of defects, but they are wrong. That assumes humans are perfect, which they are not.
std::string a = "a";
std::string_view b = a + "b\n";
std::cout << b;
Spot the problem. A "thanks" to anyone who finds it in such a simple example. It requires that you truly understand how the compiler works.
Also let's remember that STL implementations vary in quality and some things are statically compiled with "some old shit from the 90s" or a variant of it...
To note, I don't write C++ any more. It's basically my job to persuade people not to write C++. I prefer to recommend C# these days. It has a decent type system, standard library and compilers, it's fast, the tools are good, humans exist who are moderately productive and it solves problems without turning everything into CVE factories. You can concentrate on the implementation bugs and constraints rather than the shaky foundation you build your crap pile on. The crap piles I deal with are around the 2-30 MLOC sized "platforms" not trivial stuff.
Final comment: even C++ at the latest and greatest version is a chunk of Emmental where someone has attempted to putty up some of the holes and put a cloth over the mouldy bits.
I make no comment on rust. I don't use it. It's not mature. When it has been around for about 10 years, or ThoughtWorks come up with a condensed motivational paper for using it, I will evaluate it, but not before!
-
Google uses smart pointers and has a ton of fuzzers etc., and still they get exploitable overflows and use-after-frees.
-
Again, I ask "what is your background and experience?". The reason I ask is because your statements are typical of those that have some small-scale experience of a language, but have little experience of how things work (and fail) in larger, longer-lived projects.
This is your argument? What about the subject?
So you say that only what you call a "large scale" and "long lived" project is a valid project?
You again expose yourself as an authoritarian supremacist, but I hope you are retired and don't bother engineers anymore.
Your statements are typical of a bureaucrat who likes to take his place of power in the hierarchy of a big company, not being productive but being more a politician or even an activist.
I can assure you, even without your "advice", people will succeed in solving their problems adequately in their own way, just as they did before you.
Your ad hominem attack is very wide of the mark.
Given that you continue to avoid telling us your experience and background, I think we are justified in presuming you have little real world experience of creating medium or large commercial programs.
Experience does matter. It makes you humbler and more aware of how difficult it is to avoid things failing.
-
std::string a = "a";
std::string_view b = a + "b\n";
std::cout << b;
So... What happens? string_view gives you read-only access to another string, but a+"b\n" is just an rvalue without permanence (or is it?). Do you get a pointer off into space that no longer exists, or some sort of memory leak?
To some extent, this demonstrates why I prefer simpler languages (like C, or very minimal C++) - It's one thing to write code that is bad (and you can look at it and see that it is wrong), and another to have some obscure language feature buried deep in some library that doesn't behave quite the way you thought it did (and most "modern" languages LOVE "obscure features for library writers.")
On the third hand, it's difficult to write complex programs without complex libraries from "someone else."
-
Correct. It's a use after free :-+
So easy to fire a footgun it's unreal.
Same thing in C#
var a = "a";
var b = a + "b\n";
Console.WriteLine(b);
Literally no concern. Compiler assumes programmer is a dumbass :)
-
Your ad hominem attack is very wide of the mark.
Given that you continue to avoid telling us your experience and background, I think we are justified in presuming you have little real world experience of creating medium or large commercial programs.
Experience does matter. It makes you humbler and more aware of how difficult it is to avoid things failing.
Again, no arguments, but you solicit material for your ad hominem attacks, which you even project onto me.
It is because you have no valid arguments for your claim that we all should use Rust to make it into the brave new world.
And surely experience does matter.
Because of this I question abandoning all experience with an old tool to get a new toy because you are told it will solve all your problems.
Experience tells me the first problem is myself when using the tools available to me, and only then the tools.
Scott Adams said you can't schedule stupidity.
The old stupidity will just find new ways to manifest.
And my experience has also made me aware of your kind of activist, who appeals to narcissistic people by telling them it is not they who are inferior but the tools they are using.
-
Can we turn off the "usenet circa 1995" part of this thread and concentrate on the technical arguments.
-
I prefer to recommend C# these days.
This is an electronics forum, mostly about embedded systems.
Who in his right mind would use a garbage language there, meaning an automatic-garbage-collected-only, nondeterministic, factually vendor-proprietary language and ecosystem?
Possibly only someone who can't handle a gun, always shoots himself in the foot and thus wants to ban all guns.
The "bad boys" will always keep their guns.
And about your string_view example: the language allows you to do optimizations crucial in embedded or systems programming that simply aren't possible in less deterministic managed languages, and its misuse can be detected by tools, whether it happens by accident, incompetence or even intentionally, as in your example.
And possibly there are people who want to do bloated toy embedded systems with managed languages, because the alternative is too difficult for them, but I suspect they don't earn much, because the customer tends to buy what gives the most bang for his buck.
You see it with something like the Raspberry Pi, or Arduino, for example.
The money there is made by selling people something that makes them feel they are a "maker", but what they "design" hardly makes any money when sold; of course they may tell others how great it is, which increases sales of "maker" supplies.
-
std::string a = "a";
std::string_view b = a + "b\n";
std::cout << b;
Looks like "clang experimental -Wlifetime" in compiler explorer gives a warning about this.
https://godbolt.org/z/zK4HVx
But it's not very glorious that GCC 9.1 doesn't. Maybe the newest MSVC also catches this with its static analyzer or warnings.
If the argument for not using C++ is a deficiency of tooling that seems likely to be fixed rather soon even in the freely available tools, or is fixed already for people using commercial static analyzers, it sounds rather compulsive.
Some more information here:
https://herbsutter.com/2018/09/20/lifetime-profile-v1-0-posted/
-
I prefer to recommend C# these days.
This is an electronics forum, mostly about embedded systems.
Who in his right mind would use a garbage language there, meaning an automatic-garbage-collected-only, nondeterministic, factually vendor-proprietary language and ecosystem?
Possibly only someone who can't handle a gun, always shoots himself in the foot and thus wants to ban all guns.
Lots of embedded applications don't have a real-time requirement. As far back as 2004 I've been deploying C# on Windows CE in embedded systems. This spans everything from mobile vehicle tracking systems to aircraft. In a lot of cases the application's "embedded" portion is a tiny piece of the picture and most of the application is actually on traditional compute. Hell, if you look at a lot of embedded use cases, some of them are even desktop Windows builds...
(http://i.imgur.com/lhAs5.jpg)
Some points:
1. As for the underlying C# platform (.net), you realise it's open source and supports AOT compilation?
2. As for the platform, you also realise it doesn't matter if it's open source or not? Same as everyone just buys Keil etc. What matters is applicability to the problem domain within the budgetary constraints.
For hard real-time applications you're right. I used Ada on PowerPC when I did that. Note, also not C++ and not C because why the hell would you use that?!?! We used formally verified compilers and formally verified code for those applications, most of which ended with a large boom.
My personal preference for hard real time applications is FORTH and native assembly for ref. I write C for the AVR now for personal interest because it's the path of least resistance and it's cheap as chips and there is precisely zero risk of anything I do causing harm to anyone. I won't touch the Arduino toolchain because of the tangled bastardised C++ nightmare that sits over it. Just managing interrupts and library opinions is a chore.
The "bad boys" will always keep their guns.
Put your gun away. We're not shooting anyone :-DD
And about your string_view example, the language allows you to do optimizations crucial in embedded or systems programming that simply aren't possible in less deterministic managed languages, and it's misuse can be detected by tools, either if it happens by accident, incompetence or even intentional like in your example.
I think you miss the point. The point is that this was intentional on my part, because I understand the problem domain. The majority of people do NOT understand the problem domain, even people with 20+ years of "embedded experience", so determining intent is difficult from what is presented to you. If there were that much determinism there wouldn't be stuff like http://www.underhanded-c.org/ . With power comes responsibility, and most people are too arrogant to handle that responsibility, something I observe here regularly.
The comparison with C# above was strictly to illustrate how many of the problems evaporate if you pick the right tool for the job
Edit: Incidentally the static analysers I have here (Coverity and Sonar) didn't actually pick this use after free up. Valgrind didn't either.
Optimisation, fair enough. I don't argue that. We sometimes have hardware constraints which need to be managed but at no point would I consider even throwing a C++ compiler in that direction. Just the abstractions it forces on you are too large for the hardware. C is about as close to the metal as you can go. It's basically a high level assembler there. The C++ abstractions are just way too high level to even throw anywhere near embedded stuff at that level.
And possibly there are people that want to do bloated toy embedded systems with managed languages, because the alternative is too difficult for them, but i suspect they don't earn much because the customer tends to buy what bangs most for his buck.
The customer spends what the customer wants and that is barely related to what comes out the other end. I actually know someone whose entire business is based on the Parallax Basic Stamp and he does very well and his products work fine. He knows his limits and what the risks are if he migrates everything to C, so he made a sound engineering decision to buy something in to make that risk go away.
As for money, I make it solving problems, not picking political sides in a battle of engineering requirements. The customer doesn't always care how they are solved. I have to compromise somewhere which means picking technology that won't fail both myself (thus incurring maintenance costs and decline in profit) or the customer (thus incurring wrath).
You see it with something like the Raspberry PI, or Adruino, for example.
The money there is made by selling people something that makes them feel they are a "maker", but what they "design" hardly makes any money by selling it, but of course he may tell others how great it is and increases sales of the "maker" supplies.
I'll avoid getting into the "maker" discussion here but that's just a marketing fad, a hashtag to shove on any old shit to try and sell it.
If we stick to the commercial side of this there is a market for high level languages in embedded systems and yes it is because they are easier but this is because it reduces time to market and risk profile for what you are doing.
C++ is a joke for everything. It doesn't fit anywhere. It only exists because of momentum. On the desktop it's dying on its arse; people are rewriting shit in JavaScript and Electron and other high level languages. No one wants to deal with ATL/MFC/Qt any more. Apple are even walking away from Objective-C now and implementing their own higher level language. Mozilla are heading to Rust. On the embedded side of things, C++ has as much non-determinism as on the desktop. C is marginally better but at the same time, it's a mess of non-determinism as well. Even the basic ANSI C specification has more unknowns than knowns in it.
Now my day job these days, past that history of embedded work at various levels, mostly consists of attempting to prevent large financial companies from doing stupid stupid things and giving them sound architectural advice. I've learned a lot over the years from C and C++ and that is basically summarised as:
If you want to spend money maintaining shit with a high failure and security risk, pick C or C++.
Now back to the Ad Hominem side of things, now we're done with the technical evaluation. Honestly I've worked with people with your general attitude of superiority and arrogance and fired them sharpish!
std::string a = "a";
std::string_view b = a + "b\n";
std::cout << b;
Looks like "clang experimental -Wlifetime" in compiler explorer gives a warning about this.
https://godbolt.org/z/zK4HVx
But it's not very glorious that GCC 9.1 doesn't. Maybe the newest MSVC also catches this with its static analyzer or warnings.
If the argument for not using C++ is a deficiency of tooling that seems likely to be fixed rather soon even in the freely available tools, or is fixed already for people using commercial static analyzers, it sounds rather compulsive.
Some more information here:
https://herbsutter.com/2018/09/20/lifetime-profile-v1-0-posted/
This covers my point nicely:
1. GCC doesn't. MSVC maybe. clang does. Non-determinism.
2. clang experimental warn feature flags this. Experimental.
3. I am not harry potter and don't wish to have to shout "Expelliarmus!" every time I accidentally load up the foot gun.
The deficiency is not the toolchain but the language design and specification, and the crack in the pipe that was smoked by Stroustrup.
-
Correct. It's a use after free :-+
So easy to fire a footgun it's unreal.
Same thing in C#
var a = "a";
var b = a + "b\n";
Console.WriteLine(b);
Literally no concern.
Now let's try same thing in C++
string a = "a";
string b = a + "b\n";
cout << b;
You are just trying hard to create buggy code by using some obscure nonsense nobody normally uses. I had to look up what a string_view is.
Compiler assumes programmer is a dumbass :)
No comment here :-DD
-
string_view is in the standard <string_view> header. Someone will use it even if you tell them not to. And it will escape through code review. And it will escape through specifications, because humans are responsible for enforcing all of them.
Let's throw Unicode in. std::string doesn't cut it. std::wstring has portability issues. std::u16string and std::u32string have BOM interpretation issues in different STL implementations. std::u8string isn't coming until C++20...
I know that because I wasted a whole fucking day of my life on that shit.
It's also the sort of issue that causes my enjoyment of Последний Герой on my Citroen's embedded radio to decline as the track title is empty. So this is necessary, in 2019, in embedded systems.
I'm not wrong and a massive chunk of the industry happens to be on the same side of the fence. The whole thing is a poorly specified and engineered mess.
-
I worked on a moderately complex, multithreaded C++ project for a few years. I just don't remember segfaults and concurrency bugs being a serious issue. And code review was effective at keeping the dumbasses from committing obvious crap :P
C++ is a crude mess, I don't think anyone denies it. But there are things you can do in a language with pointers and a Turing-complete metaprogramming system that you simply couldn't do equally easily and/or with the same performance in a safe, statically verified language, precisely because it's safe and boring.
I acknowledge most people would be better off programming in Java; in fact, I always say that people should only touch C++ if they plan to use it for hours a day and for years. At the same time, I think the world would be better if fewer people programmed computers, not more.
You can rave about the superiority of JavaScript and Rust over C all you want, but at the end of the day, all that Web 2.0 garbage is sluggish and buggy, like Windows 3.1 era desktop software written in C running on an i386. Or worse. Why? Because it allows idiots with no clue about computers to make software :P
I understand that this battle is mostly lost. It's always cheaper to hire code monkeys writing in some BDSM language and tell the customer to buy a faster CPU. I will have a laugh if Moore's Law hits the wall within my lifetime, though ;D
-
If you think I'm a fan of all that web 2.0 garbage you'd be wrong.
Back in the late 1980s and early 1990s I used Acorn RISC OS machines. They were pretty powerful ARM beasts, in fact where the ARM CPU originated. They had a well defined low-high level abstraction which meant you could write the majority of what you did in an HLL (BASIC in fact) and drop to assembler as needed. I think we need something in that space again. I'm certainly not suggesting BASIC today, but something in that space.
Really I'd like to see a stepped platform abstraction based on the same, well defined language. This is possible but means a lot of starting again and no traction which is why other attempts have failed. The abstraction should be as follows:
1. Low level (core/embedded/kernel). native memory access. Inline assembly. No libraries other than static data structures.
2. Mid level (systems/network programming). No memory access outside allocator (reference management or GC). Higher level data structures and libraries. Network access.
3. High level (end user). No memory access outside allocator (with GC). No inline assembly. Higher level data structures and libraries still and user interface
Something along the lines of Algol / Oberon would be the sweet spot between functional and usable. Go is actually closest to the mark believe it or not.
FORTH actually covers all of them as well. You can redefine some of the words on lower level abstractions at startup to stop hardware access.
-
Lots of embedded applications don't have a real time requirement. As far back as 2004 I've been deploying C# on windows CE in embedded systems. This spans everything from mobile vehicle tracking systems to aircraft. In a lot of cases the application's "embedded" portion is a tiny piece of the picture and most of the application is actually on traditional compute. Hell if you look at a lot of embedded use cases, some of them are even desktop windows builds...
These Windows-powered Tektronix devices were dreadful; luckily they are history.
All embedded systems have real-time requirements; you ignore that by talking only about the UI part. And even there, users only accept nondeterminism up to a point.
And it looks like this UI part is transitioning into the browser, done in JavaScript, or maybe in the future even C++ compiled to WASM running in the browser.
It makes sense: you can get so many "web hipsters" competing for a job to give you a cool UI with a cornflower blue button. They really shouldn't be punished with C++, though.
Who cares about the language here, it is a war of UI platforms/toolkits and the end of the vendor specific platforms is dawning.
Some points:
1. As for the underlying C# platform (.net), you realise it's open source and supports AOT compilation?
Didn't I say "factually"?
It is a more symbolic act to prolong the life of C# and its ecosystem, to extract the most business out of the C# community.
Fair, but hell, I would never want to be forced to annoy myself doing real embedded work with C# or any other managed, garbage-collected language.
For new things I would rather go the browser approach, meaning put the UI in the browser and interface it to the C++ real embedded part.
Other people see this as a threat, so we got Blazor.
The "bad boys" will always keep their guns.
Put your gun away. We're not shooting anyone :-DD
You need help understanding my analogies?
...
C++ is a joke for everything. It doesn't fit anywhere.
...
If you want to spend money maintaining shit with a high failure and security risk, pick C or C++.
Do you realize that without C++, there would be no internet, no browsers, no infrastructure, no mobile phone networks, no factory automation?
Ok, some defense work done with Ada would be left, so we could fight a war then, because the infrastructure is gone.
This covers my point nicely:
1. GCC doesn't. MSVC maybe. clang does. Non-determinism.
You really slander the meaning of the word "determinism" here.
It is easily known from the beginning whether you get the check or not, so you can act appropriately: simply tell the not-so-capable programmers to avoid std::string_view and other things that require high responsibility and proficiency, or restrict them to their managed-language sandbox, because it works better for them to do the cornflower blue button UI.
2. clang experimental warn feature flags this. Experimental.
It already works; soon it will not be experimental anymore, it will also be in GCC, and it is free and open source.
But I guess you want to turn it into an original sin.
3. I am not harry potter and don't wish to have to shout "Expelliarmus!" every time I accidentally load up the foot gun.
Sounds like you have a very superstitious approach to engineering.
The deficiency is not the tool chain but the language design and specification and the crack in the pipe that was smoked by Stroustrop.
A nice ad hominem.
He possibly gets a lot of that and counters: there are two kinds of programming languages, the ones everyone complains about and the ones nobody uses.
-
Do you realize that without C++, there would be no internet, no browsers, no infrastructure, no mobile phone networks, no factory automation?
First web browser: Objective-C
First web server: C
Operating systems of the era: C (not C++). Even the "C++" that NT was written in was C. It only got C++ from ATL/MFC and above. Win32 was C!
What infrastructure? It's all C.
Mobile phone networks? Plenty of Erlang there and kernels from the above. AT&T's telecoms products were built on C and UNIX.
Factory automation? I spent several years doing automation and touched C precisely zero times. It was surprisingly perhaps all VB4 on NT 3.51 talking to things on RS422/GPIB stuff that ran ladder logic or some other vile proprietary shit. We even had some SBCL in there. That was nice.
Most of the Internet to this day runs on C and ASICs and SFA else.
Now C sucks for different reasons but the scope is tiny and we'd have been fine without C++. We'd be a LOT further along without C++. It's one of those mistakes the human race hasn't quite got over yet like smoking, putting lead in gas and bagpipes.
A nice ad hominem.
He possibly gets a lot of that and counters: There are two kind of programming languages, ones that everyone complains about and the ones nobody is using.
C++ job trends ...
(https://imgur.com/LMgaR0I.jpg)
I look forward to watching him eat those words for supper.
-
If the argument for not using C++ is the deficiency of tooling that seems to be fixed rather soon even for the freely available tools or is fixed already for people using commercial static analyzers, it sounds rather compulsive.
The problem with that is that the C++ mob have been claiming that for the past quarter century. Many people have drawn the comparison with fusion power :)
The tools are still nowhere near as good as the tools for Java in 1996 or Smalltalk in 1985. BTW both HP and Tektronix have had very good portable instrument products based on Smalltalk, so don't repeat your "this is an embedded forum" mistake.
There are fundamental reasons why the tools can never be as good. Start by considering the effects of #define macros, and the ability to cast a fish into a bicycle. After you have not solved those, you can move on to more subtle problems.
BTW, you really should listen to bd139; his points are valid. But somehow I doubt you will.
I should note that he has more patience than me in clearing up other people's messes. I have done it, but prefer to be less well financially rewarded by merely cleaning up my own mistakes :)
-
Also worth a read before you consider C++ appropriate for low level stuff...
https://web.archive.org/web/20140111204548/http://msdn.microsoft.com/en-us/windows/hardware/gg487420.aspx
An oldie but a goodie and it smells of architectural regret.
-
Something along the lines of Algol / Oberon would be the sweet spot between functional and usable. Go is actually closest to the mark believe it or not.
FORTH actually covers all of them as well. You can redefine some of the words on lower level abstractions at startup to stop hardware access.
The key use cases for the future will be around many-core processors with many threads and non-uniform memory architectures, with networking thrown in for good measure.
C et al spent 40 years explicitly avoiding those issues. Maybe the new standards will help, but that won't be proven for a decade. And I'll bet the promise will be negated by legacy code and normal programmers.
I don't see how Algol (of fond memory) helps there, and I don't know Oberon. Erlang has a strong commercial track record. For the embedded commercial arena, I like the modern incarnation of CSP and Occam, i.e. xC
-
C++ is a crude mess, I don't think anyone denies it.
Frankly, I also thought C++98 did more harm than good, as many overeager "programmers" rode it close to death.
Maybe the Java shock also helped C++98 become a zombie.
But the Java shock, which was countered with C#, was followed 10 years later by the disillusionment about Moore's law: there is no free lunch anymore.
http://www.gotw.ca/publications/concurrency-ddj.htm
And it got worse: even the wonder battery that is supposed to save BEVs drains too quickly in our mobile devices, with all the Java, JavaScript and managed, non-AOT languages. And you can never have too much RAM, which even needs power to store all this not-yet-collected garbage data.
This revived interest and work in C++, and I would say with respectable results, possibly extending the life of C++ significantly.
So good news for the haters of C++: they will probably not lose their purpose until the end of their lives.
This brings me back to the Rust topic.
It looks like the Rust activists here are motivated by hatred of C++ to an unhealthy degree.
Like I said, C++ is a mess, but not such a mess that abandoning it is worthwhile or generally necessary, especially as it is continuously improving, tooling included.
And even Firefox, which I use as a web browser (while preferring Chromium for dedicated apps), is not 100% Rust; there are definitely C++ libraries in use, and maybe extensions done in C++.
One should view the dogma of Rust superiority preached by its activists with this in mind.
Oh, I found something:
https://github.com/4e6/firefox-lang-stats/blob/63912019312ac0b25701092b06fde64bec341f64/git-file-stats.log (https://github.com/4e6/firefox-lang-stats/blob/63912019312ac0b25701092b06fde64bec341f64/git-file-stats.log)
...
217 rst
...
3299 c
...
7170 cpp
...
Interesting would be also to know the trend here.
I suspect Rust seriously lags behind the ambitions of its activists, and C++11/17/20 is generating so much news that is bad for them that they feel forced to step up their hate.
-
Something along the lines of Algol / Oberon would be the sweet spot between functional and usable. Go is actually closest to the mark believe it or not.
FORTH actually covers all of them as well. You can redefine some of the words on lower level abstractions at startup to stop hardware access.
The key use cases for the future will be around many-core processors with many threads and non-uniform memory architectures, with networking thrown in for good measure.
C et al spent 40 years explicitly avoiding those issues. Maybe the new standards will help, but that won't be proven for a decade. And I'll bet the promise will be negated by legacy code and normal programmers.
I don't see how Algol (of fond memory) helps there, and I don't know Oberon. Erlang has a strong commercial track record. For the embedded commercial arena, I like the modern incarnation of CSP and Occam, i.e. xC
Totally agree.
C-like languages are dead in that space. Everything is too mutable to handle concurrency effectively and scale well. Most of the synchronisation primitives we use are to work around that.
Erlang is a good one for sure. I've got an Erlang stack up here although it mostly runs RabbitMQ spewing stuff between C# and python programs as that's what my usual problem domain desires. Algol 68 - look up "par". The "par" from XC and Haskell came from there ;)
I should note that he has more patience than me in clearing up other people's messes. I have done it, but prefer to be less well financially rewarded by merely cleaning up my own mistakes :)
Patience comes from knowing you don't have to renew :-DD
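Since "par", CSP and Occam came up: Rust's take on the same idea is channels plus move semantics, where each worker owns its data outright and results come back over a channel, so no locks are needed at all. A minimal sketch (the `sum_in_parallel` helper is hypothetical, just for illustration):

```rust
use std::sync::mpsc;
use std::thread;

// CSP-style fan-out/fan-in: each worker takes ownership of its chunk
// (the `move` closure transfers it), so no other thread can touch it,
// and partial sums come back over the channel.
fn sum_in_parallel(chunks: Vec<Vec<u64>>) -> u64 {
    let (tx, rx) = mpsc::channel();
    for chunk in chunks {
        let tx = tx.clone();
        thread::spawn(move || {
            tx.send(chunk.iter().sum::<u64>()).unwrap();
        });
    }
    drop(tx); // drop the original sender so the receiver terminates
    rx.iter().sum()
}

fn main() {
    let total = sum_in_parallel(vec![vec![1, 2, 3], vec![4, 5], vec![6]]);
    assert_eq!(total, 21);
}
```

No Mutex in sight, which is roughly what "most of the synchronisation primitives we use are to work around mutability" implies: remove shared mutation and most of the primitives go away with it.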
-
It looks like the Rust activists here are motivated by hatred of C++ to an unhealthy degree.
Not a Rust activist here for ref.
Incidentally I just spent an hour with Rust and I hate it as much as C++ almost immediately. There's even more intent hiding in punctuation! Starting to look like Perl where every time the cat walks on the keyboard, apart from the dose of toxoplasma gondii, it churns out a valid perl program.
-
Starting to look like Perl where every time the cat walks on the keyboard, apart from the dose of toxoplasma gondii, it churns out a valid perl program.
This paper (https://famicol.in/sigbovik/) was making the rounds early this summer.
Edit: Another link: Programmer migration patterns (https://apenwarr.ca/log/20190318)
-
Do you realize that without C++, there would be no internet, no browsers, no infrastructure, no mobile phone networks, no factory automation?
Your "firsts" and list of anecdotes all are close to irrelvant, far from being a realistic picture.
A nice ad hominem.
He possibly gets a lot of that and counters: there are two kinds of programming languages, the ones everyone complains about and the ones nobody uses.
C++ job trends ...
(https://imgur.com/LMgaR0I.jpg)
I look forward to watching him eat those words for supper.
You are so loving.
BTW, these percentages are inflated by the web/JS/Java bubble. The absolute numbers for C++ job trends are rising.
One can even see that the brave new Java world, or whatever you call it, has ended, as the percentages seem to have stagnated for a few years now despite the total number of "programmers" still rising.
And I think C++ jobs may be less volatile than some others, at least when you can handle C++ and its evolution and don't hate it.
But who needs it to rise? It is not for everyone, only for those who can use it efficiently.
And it is making big progress in making programmers more productive, the ones who can handle it, of course.
-
Starting to look like Perl where every time the cat walks on the keyboard, apart from the dose of toxoplasma gondii, it churns out a valid perl program.
This paper (https://famicol.in/sigbovik/) was making the rounds early this summer.
Hahaha hadn't seen that. That's hilarious
But who needs it to rise? It is not for everyone, only for those who can use it efficiently.
And it is making big progress in making programmers more productive, the ones who can handle it, of course.
Some of us can handle it but choose only to do so when we're paid danger money. Some people never get to a position where they can make that choice :)
-
C-like languages are dead in that space. Everything is too mutable to handle concurrency effectively and scale well. Most of the synchronisation primitives we use are to work around that.
The concurrency problem is not a problem of the language; it is a problem of the programmer.
I rather expect these programmer deficiencies to be eased by compilers and virtual machines such as WASM, for example.
Simply because the programmer not only has the burden of implementing an effective parallel solution, but also of taking most of the available, and increasingly diverse, architectures into account.
He is thrown back to coding essentially machine-specific "parallel assembly", or overly generic and repetitive code that again burdens run-time and coding performance, regardless of the language used, one could conclude.
I rather think the machine code generated for the virtual machines of the future, maybe running in a browser, will carry additional information that makes mapping functions and their data flow onto heterogeneous and diverse user machines viable.
So a productive combination of AOT and JIT, not JIT only because the bubble paid for it.
-
Starting to look like Perl where every time the cat walks on the keyboard, apart from the dose of toxoplasma gondii, it churns out a valid perl program.
This paper (https://famicol.in/sigbovik/) was making the rounds early this summer.
Hahaha hadn't seen that. That's hilarious
But who needs it to rise? It is not for everyone, only for those who can use it efficiently.
And it is making big progress in making programmers more productive, the ones who can handle it, of course.
Some of us can handle it but choose only to do so when we're paid danger money. Some people never get to a position where they can make that choice :)
And then there's the Dunning-Kruger effect.
-
C-like languages are dead in that space. Everything is too mutable to handle concurrency effectively and scale well. Most of the synchronisation primitives we use are to work around that.
The concurrency problem is not a problem of the language, it is a problem of the programmer.
If the language doesn't have primitives with the necessary semantics, then even the best programmer cannot succeed. Those with Dunning-Kruger syndrome might think they have succeeded, but...
If you build a castle on sand, it doesn't matter how good the builders are.
Until very recently, C didn't have the semantics. It remains to be seen whether the semantics are sufficient, and whether they are implemented correctly.
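For contrast, here is what "primitives with the necessary semantics" looks like when the compiler enforces them: in Rust, shared mutable state has to go through a synchronisation type such as `Mutex`, and forgetting it is a compile error rather than a latent race. A minimal sketch (the `parallel_count` helper is hypothetical, just for illustration):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Shared mutable state must go through a synchronisation type;
// handing a bare `&mut` to several threads simply does not compile.
fn parallel_count(n_threads: usize, increments: usize) -> usize {
    let counter = Arc::new(Mutex::new(0usize));
    let mut handles = Vec::new();
    for _ in 0..n_threads {
        let counter = Arc::clone(&counter);
        handles.push(thread::spawn(move || {
            for _ in 0..increments {
                // The lock guard is the only way to reach the data.
                *counter.lock().unwrap() += 1;
            }
        }));
    }
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    // Deterministic despite 4 threads racing to increment.
    assert_eq!(parallel_count(4, 1000), 4000);
}
```

Whether that counts as the language solving the problem or merely refusing to let the programmer create it is, I suppose, the disagreement in this thread.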
-
And then there's the Dunning-Kruger effect.
My entire business runs on pretending that doesn't exist :)
-
Interesting thread, I subscribed to it. Some comments acknowledge what I experienced.
After reading all this, it convinced me even more to stick with plain C for MCUs.
-
C-like languages are dead in that space. Everything is too mutable to handle concurrency effectively and scale well. Most of the synchronisation primitives we use are to work around that.
The concurrency problem is not a problem of the language, it is a problem of the programmer.
If the language doesn't have primitives with the necessary semantics, then even the best programmer cannot succeed. Those with Dunning-Kruger syndrome might think they have succeeded, but...
If you build a castle on sand, it doesn't matter how good the builders are.
Until very recently, C didn't have the semantics. It remains to be seen whether the semantics are sufficient, and whether they are implemented correctly.
Looks like you just read the first sentence and it was sufficient to trigger your denial. Isn't that what Dunning-Kruger described?
My point was that burdening the programmer with more primitives and semantics probably will not work, because if it would, it would have worked at some point in the last 20-odd years, and we wouldn't be waiting for everything to compute while 7 of 8 cores and 2 GPUs are idling.
-
Interesting thread, I subscribed to it. Some comments acknowledge what I experienced.
After reading all this, it convinced me even more to stick with plain C for MCU's.
Shouldn't you use Rust according to how this thread started?
You also can have a look there:
https://electronics.stackexchange.com/questions/3027/is-c-suitable-for-embedded-systems
Looks like it didn't kill all the people who used some C++ while being very careful about certain of its features and constructs.
No, it rather benefited them.
-
string_view is in <string>. Someone will use it even if you tell them not to. And it will escape through code review. And it will escape through specifications because humans are responsible for enforcing all of them.
Yep, string_view appeared in C++17 if I'm not mistaken. That's still relatively new (and probably explains why GCC doesn't catch the problem).
I agree with you, the spirit of C++ itself makes it a heavily fragmented language (meaning almost every developer will use their own subset).
In your example, using too-recent features can easily be avoided in your team by a rule of NOT writing C++17 code. As I recall, many teams still don't even use C++11... this is relatively easy to enforce if you use automated build systems: just enable the C++ standard you wish to support, and any unsupported feature will not compile (for GCC 8.x, I had to explicitly set -std=c++17 for your piece of code to compile; otherwise GCC gave a message about not supporting string_view).
But yeah C++ is really a gigantic mess. ;D
Still, to play devil's advocate here, your example could use a user-defined class instead of the standard string_view, and lead to a similar bug. So this is not completely a problem related to the language itself.
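Since the thread started with Rust: the dangling-view bug under discussion is exactly the class of error the borrow checker rules out, because a `&str` view cannot outlive the `String` it borrows from. A minimal sketch (the `first_word` helper is hypothetical, just for illustration):

```rust
// In C++, a string_view can silently outlive the string it points into.
// The Rust equivalent of a view is a &str borrow, and the compiler
// rejects any program where the view outlives its owner.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let owner = String::from("hello world");
    let view = first_word(&owner); // `view` borrows from `owner`
    // drop(owner); // uncommenting this is a compile error:
    //              // `owner` cannot be moved while `view` borrows it
    assert_eq!(view, "hello");
}
```

The same holds for your user-defined-class variant: any type holding a reference carries a lifetime, so the bug is caught regardless of whether the view comes from the standard library.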
-
Had an interesting read about the "safety" of Rust.
https://medium.com/@shnatsel/how-rusts-standard-library-was-vulnerable-for-years-and-nobody-noticed-aebf0503c3d6
-
And then there's the Dunning-Kruger effect.
My entire business runs on pretending that doesn't exist :)
Does that make you a disaster capitalist?
-
C-like languages are dead in that space. Everything is too mutable to handle concurrency effectively and scale well. Most of the synchronisation primitives we use are to work around that.
The concurrency problem is not a problem of the language, it is a problem of the programmer.
If the language doesn't have primitives with the necessary semantics, then even the best programmer cannot succeed. Those with Dunning-Kruger syndrome might think they have succeeded, but...
If you build a castle on sand, it doesn't matter how good the builders are.
Until very recently, C didn't have the semantics. It remains to be seen whether the semantics are sufficient, and whether they are implemented correctly.
Looks like you just read the first sentence and it was sufficient to trigger your denial. Isn't that what Dunning-Kruger described?
My point was that burdening the programmer with more primitives and semantics probably will not work, because if it would, it would have worked at some point in the last 20-odd years, and we wouldn't be waiting for everything to compute while 7 of 8 cores and 2 GPUs are idling.
You need the right concepts and semantics, and the fewer the better.
C++ explicitly avoided choosing a few simple semantics, in favour of letting each programmer choose a different (and often incompatible) subset of the language and concepts.
Apart from that, you are discussing application-specific issues, not language issues.
-
Had a interesting read about the "safety" of Rust.
https://medium.com/@shnatsel/how-rusts-standard-library-was-vulnerable-for-years-and-nobody-noticed-aebf0503c3d6
You seem to see the world in black and white; you misinterpret and misrepresent what other people are saying as X is all good or all bad.
In reality the world is shades of grey. In this case c is blacker and rust might be whiter.
I distrust anyone that doesn't recognise greyness, since it shows imbalance.
-
And then there's the Dunning-Kruger effect.
My entire business runs on pretending that doesn't exist :)
Does that make you a disaster capitalist?
Opportunist perhaps. In the house of the blind, the man with one eye is king. Unfortunately half the house wants your advice and the other side want to poke your good eye out.
Had an interesting read about the "safety" of Rust.
https://medium.com/@shnatsel/how-rusts-standard-library-was-vulnerable-for-years-and-nobody-noticed-aebf0503c3d6
You seem to see the world in black and white; you misinterpret and misrepresent what other people are saying as X is all good or all bad.
In reality the world is shades of grey. In this case c is blacker and rust might be whiter.
I distrust anyone that doesn't recognise greyness, since it shows imbalance.
Grey is an understatement. Everything is somewhere to the side of the scale. It’s brown and stinks. Product selection is about extracting the juicy bits of sweetcorn from the turd. It’s still a turd though and you have to eat it every day. Yum.
-
And then there's the Dunning-Kruger effect.
My entire business runs on pretending that doesn't exist :)
So you pretend it doesn't apply to you?
-
And then there's the Dunning-Kruger effect.
My entire business runs on pretending that doesn't exist :)
So you pretend it doesn't apply to you?
No, just the business. The business is a facade. If someone comes with a problem, the business must sell an answer or turn a customer away, the latter of which is poor business acumen. Thus the business knows everything and is not subject to that theory, with the disclaimer of a lead time, exploration, analysis and a price. It's my responsibility to make sure that when it applies to me personally, I bridge the gap between ignorance and competence properly before reaching any conclusions. This is how think tanks, research companies and consultancies operate. Well the honest ones anyway.
The dishonest ones usually have a book of answers available ranked in order of cash backhanders for product recommendations.
-
Had an interesting read about the "safety" of Rust.
https://medium.com/@shnatsel/how-rusts-standard-library-was-vulnerable-for-years-and-nobody-noticed-aebf0503c3d6
You seem to see the world in black and white; you misinterpret and misrepresent what other people are saying as X is all good or all bad.
In reality the world is shades of grey. In this case c is blacker and rust might be whiter.
I distrust anyone that doesn't recognise greyness, since it shows imbalance.
Grey is an understatement. Everything is somewhere to the side of the scale. It’s brown and stinks. Product selection is about extracting the juicy bits of sweetcorn from the turd. It’s still a turd though and you have to eat it every day. Yum.
Fortunately, while I've occasionally got it wrong, I've a pretty good track record of spotting the sweetcorn.
One of the more unpleasant examples was of selecting a good product from a great company and team. They were so good they were borged by Oracle, and we all know where that leads :(
-
Ah yes. Been there. I got hired to port something away from Sleepycat to SQLite when Oracle rolled up to buy them out.
Edit: I was actually Oracle 9i DBA for a bit on HP/UX (back in the days of N-class hardware with the promise of Itanium on the horizon) because the usual suspect decided to go and drive his stupid Lotus car into a tree. I was offered cert but declined it as it was like wearing a nazi uniform :)
-
So anyway... has anyone here developed something interesting with Rust?
-
No just marketing :-DD
-
So anyway... has anyone here developed something interesting with Rust?
You mean like recording tape, or floppy disks?
-
So anyway... has anyone here developed something interesting with Rust?
You mean like recording tape, or floppy disks?
Rust not rust.
And that was a genuine question. ;D
And I meant someone from the forum, not someone from Mozilla.