Look at Ada, did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?
https://www.adacore.com/nvidia
That seems reasonable. Much more so than switching to uh, Rust, for instance.
Maybe they are just trying to sidestep ageism, or create opportunities for ex-defence people, and hire some people old enough to have actually used Ada.
Did no one get my troll ... Pict reference? :-( :-(
Have you been grooving with one?
Look at Ada, did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?
https://www.adacore.com/nvidia
That seems reasonable. Much more so than switching to uh, Rust, for instance.
Hot off the press: "The U.S. NSA finally came out this week to strongly endorse `memory-safe' languages for most software programming, specifically mentioning C#, Go, Java, Ruby, Rust, and Swift as examples. Apparently orphaned DoD language *Ada* was conspicuously left out of NSA's list, even though versions of Ada that target JVM can utilize Java JVM's GC."
FFI see http://catless.ncl.ac.uk/Risks/33/53#subj3
If a language doesn't reduce the chances of something appearing in that comp.risks, then I'm unlikely to want to swap to it.
You are focusing way too much on "nice looking" syntax. This is not the most important part. If your code is full of those computed gotos, then you are likely doing something wrong anyway.
And I'm lost again as to what is being discussed. You throw out random examples of code that you don't want to work.
Yes, it is possible to make a language exactly as you like, but that happens by opening a text editor and typing the code, not forum posts.
You could be contributing to the conversation (instead of trolling) as several other people are; I'm sure you have much to offer, but it seems you simply resent me discussing the subject. Several posts have contained genuinely useful responses, things that give me pause for thought.
I too have stated that I think sugared syntax is relatively unimportant; clean, orthogonal, and understandable semantics are much more important.
I disagree that ataradov is trolling. Blunt != trolling in any sense of the word.
Look at Ada, did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?
https://www.adacore.com/nvidia
That seems reasonable. Much more so than switching to uh, Rust, for instance.
Hot off the press: "The U.S. NSA finally came out this week to strongly endorse `memory-safe' languages for most software programming, specifically mentioning C#, Go, Java, Ruby, Rust, and Swift as examples. Apparently orphaned DoD language *Ada* was conspicuously left out of NSA's list, even though versions of Ada that target JVM can utilize Java JVM's GC."
FFI see http://catless.ncl.ac.uk/Risks/33/53#subj3
If a language doesn't reduce the chances of something appearing in that comp.risks, then I'm unlikely to want to swap to it.
The list above is hilarious. Uh and advice from the NSA? Not interested.
Did no one get my troll ... Pict reference? :-( :-(
Have you been grooving with one?
In a cave.
Have you seen the Simpsons lost in the forest at night mashup, with Willie as the Pict?
Look at Ada, did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?
To be fair, take these "we decided to start doing X" things with a grain of salt. It's similar to Toyota deciding on a transition to hydrogen cars every year for the last 30 years.
Chances are pretty high it's just a PR program, and maybe some trainee replaces some (unused or unimportant) module while the actual work carries on just like before.
Professionals who enable all this crazy hightech are just fine working with C, after all, and don't like managers messing up their good productivity for social scores.
Professionals who enable all this crazy hightech are just fine working with C, after all, and don't like managers messing up their good productivity for social scores.
Productively producing bugs that could've been avoided. No business likes that. I rarely meet developers who like maintaining and bugfixing C/C++ once the fun challenge of getting it stable is completed.
With Rust you will typically find libraries that have no activity for a year or two. That's a huge warning sign with most other languages. With Rust, it simply means it's done.
Think about that! DONE!
We have an in house component written in Rust and the only times we've had to change the container is when there's CVEs in the underlying container runtime. Nothing has been reported in my dependencies, and no bugs have been reported by users. Two years in production. Let's just say that every other component has a long list of issues against them.
It's a brave, new and delightful world!
Well, do178b/misra95 C code has the same "it's done" characteristic.
Both that and Rust are C with the naughty bits forbidden
Ditto SPARK and Ada
No, I'm not going to attempt to fully justify those statements, but there is some truth in them.
Removing naughty bits is far more important than fiddling with syntax. Unless, of course, it is the syntax that is the source of "inappropriate" behaviour.
Misra is about syntax and grammar: necessary (for ICE inspections and analysis) but not sufficient.
Do178b is about passing (by ICE validation) the appropriate behavior(1): sufficient.
Combined && -> QA passed
(1) normal, abnormal, defensive code
Productively producing bugs that could've been avoided.
Language purists are interesting folks, because they promise you the Moon from the sky and completely bug-free systems. Compare that to non-programming fields! How about an aircraft engineer who claims they never make any mistakes? Has anyone heard such a ridiculous claim?
For some strange reason, the track record for these "perfect" languages is near-zero. It's always "C causes bugs, <X> is perfectly bug-free, oh why are all the professionals so stupid they don't use <X> but stick to C, boo hoo hoo".
Talk is cheap. C is indeed quite crappy but most of the dangerous aspects are very well known to the professionals and avoided. They know their tools, and despite all the shortcomings of C, its strongest points are standardization, the fact it's not a moving target, being very well known, something to build expertise on.
Surgeons also use knives despite the fact you need to use them carefully. There are no wannabe-surgeons who complain all the time and have little to show, though. That's the difference when compared to computer programming.
You say "talk is cheap" yet that entire post is nothing but anecdotes, how about facts, metrics, objective data?
This is one of two forums where this subject is being actively discussed for MCU languages; I think I have enough at this stage to begin to define the formal grammar for such a language.
The grammar doesn't cover semantics and other specifics, but it will be enough to perhaps get a working recursive descent parser up and running. If I do venture that far, I'll be writing this in C# for .NET 7, since that is highly portable now; a parsing tool will thus run fine on either Windows or Linux.
Perhaps with a working initial grammar some of this discussion will become less abstract.
I already have a very reasonable lexical analysis library I put together last year; this is a small FSM that's driven by a CSV token definition file. For those curious, you can see it here: Scanner library, CSV Token Definitions Example.
Look at Ada, did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?
https://www.adacore.com/nvidia
That seems reasonable. Much more so than switching to uh, Rust, for instance.
Hot off the press: "The U.S. NSA finally came out this week to strongly endorse `memory-safe' languages for most software programming, specifically mentioning C#, Go, Java, Ruby, Rust, and Swift as examples. Apparently orphaned DoD language *Ada* was conspicuously left out of NSA's list, even though versions of Ada that target JVM can utilize Java JVM's GC."
FFI see http://catless.ncl.ac.uk/Risks/33/53#subj3
If a language doesn't reduce the chances of something appearing in that comp.risks, then I'm unlikely to want to swap to it.
Well, they are all memory safe. All of them expose raw pointers only as an option, needed only when interoperating with systems that rely on pointers, such as C or C++ libraries. When raw pointers are not used, these language ecosystems can be 100% sure that invalid pointers/addresses can never arise at runtime.
Interesting mention of array bounds checking: there are two forms of checking, one pretty fast, the other slow. The fast one ensures memory safety, whereas the slow one ensures both memory safety and calling-code validity.
Well, do178b/misra95 C code has the same "it's done" characteristic.
Both that and Rust are C with the naughty bits forbidden
Ditto SPARK and Ada
No, I'm not going to attempt to fully justify those statements, but there is some truth in them.
Removing naughty bits is far more important than fiddling with syntax. Unless, of course, it is the syntax that is the source of "inappropriate" behaviour.
Bear in mind my own views here are
primarily about grammar not syntax.
The core of the grammar should - IMHO - be along the lines of:
<source> ::= <assignment_stmt> ";" | <keyword_stmt>
<assignment_stmt> ::= <reference> "=" <expression>
<reference> ::= <qualified_name> | <qualified_name> [ "(" <commalist> ")" ] etc..
<keyword_stmt> ::= <declaration> ";" | <definition> <def_body>
<declaration> ::= "declare" <identifier> [<array_def>] <reqd_dcl_attribs> [<optional_dcl_attribs>] etc
<definition> ::= <type_def> | <entry_def>
<entry_def> ::= <proc_def> | <func_def>
<proc_def> ::= "procedure" <identifier> [ "(" <commalist> ")" ] <reqd_proc_attribs> [<optional_proc_attribs>] "{" <proc_body> "}"
etc etc etc...
You won't fix anything this way.
Necessary but not sufficient.
Language purists are interesting folks, because they promise you the Moon from the sky, and completely bug-free systems. Compare that to non-programming fields! How about an aircraft engineer who claims they never make any mistakes? Anyone heard about such ridiculous claim?
Just so.
and despite all the shortcomings of C, its strongest points are standardization, the fact it's not a moving target, being very well known, something to build expertise on.
Except for all the parts that fall
outside that category.
Isn't it obvious that all the implementation dependent behaviour isn't standardised?!
And then there's all the undefined behaviour, which trips up many many people.
Surgeons also use knives despite the fact you need to use them carefully. There are no wannabe-surgeons who complain all the time and have little to show, though. That's the difference when compared to computer programming.
Surgeons have to be rigorously trained and pass exams, and are regularly monitored by formal regulatory bodies. Anybody can call themselves a programmer and practice [sic] it without being monitored.
You won't fix anything this way.
Necessary but not sufficient.
Precisely!
It seems to be axiomatic that digital hardware engineers yearn to create their own processor, and software engineers yearn to create their own language. Without a lot of industry knowledge plus clarity and insight, both are doomed to being (at best) an interesting academic exercise.
Well whether some exercise is "doomed", depends entirely upon what one's objectives are. If the objectives are met then by definition one has success.
If I create a language, and it meets my overall objectives then I don't see how that can be regarded as an example of "doomed".
It is impossible to predict influence as well; one has no idea how an idea or a concept might propagate over time.
Perhaps you think I am attempting to create a commercial product, or establish a new language that achieves huge popularity, but that isn't the case at all.
You say "talk is cheap" yet that entire post is nothing but anecdotes, how about facts, metrics, objective data?
FWIW, to me the one single objective datapoint is that I professionally develop firmware for embedded devices in C and have absolutely no problem with it whatsoever, and do not complain about it lacking anything critical or generating critical bugs for me. I haven't stepped in any C footgun in years; I just make classical bugs (wrong assumptions, logical mistakes) I would make in any language, and not too many. I also don't mind supporting my code; it's not fire-and-forget.
I know a few others in my field who also do not struggle. But I'm not going to give names, just like you are not giving the names of your "very experts" who struggle with PICs.
Isn't it obvious that all the implementation dependent behaviour isn't standardised?!
What weird mental acrobatics. Being in the standard means it is standardized! The standard basically says: we did not define the effect of this operation, so don't do that unless you really know what you are doing.
It would be very nice if implementation-defined behavior was not needed in the standard, but it's still much better to have standardized implementation-defined behavior than the standard being completely silent about some corner cases. This makes the C standard good, compared to poorly standardized languages where you can't know what happens by reading the standard.
And then there's all the undefined behaviour
Exactly the same for UB! This is the strong point of C: UB being standardized, so just read and understand the standard, don't do UB, and you are good to go.
Having such standard is not obvious at all.
Surgeons have to be rigorously trained and pass exams, and are regularly monitored by formal regulatory bodies. Anybody can call themselves a programmer and practice [sic] it without being monitored.
But anyone can also buy surgical knives freely and do whatever with them. And doing something with C where human lives are at stake, say designing an autopilot for a passenger aircraft, also requires at least some form of monitoring by formal regulatory bodies. Anyone is free to use a surgical knife to peel an apple, even if that poses a risk of cutting oneself. My surgeon analogy is thus valid.
You say "talk is cheap" yet that entire post is nothing but anecdotes, how about facts, metrics, objective data?
FWIW, to me the one single objective datapoint is that I professionally develop firmware for embedded devices in C and have absolutely no problem with it whatsoever, and do not complain about it lacking anything critical or generating critical bugs for me. I haven't stepped in any C footgun in years; I just make classical bugs (wrong assumptions, logical mistakes) I would make in any language, and not too many. I also don't mind supporting my code; it's not fire-and-forget.
I know a few others in my field who also do not struggle. But I'm not going to give names, just like you are not giving the names of your "very experts" who struggle with PICs.
Right, so no facts to back up the original anecdotal remarks, just the rather obvious claim that there are many engineers who are quite content with C. None of that has any bearing on what's being discussed; obviously it is not those who are content who inspire or generate progress.
There are many engineers who are content with C.
There were many engineers who were content with COBOL. And steam engines. And bronze swords.
Nonetheless, incremental changes are legion (and ignorable), whereas significant advances are rare and to be enthusiastically accepted. It takes wide experience and good taste to distinguish one from the other.