opinion "undefined behaviour" should - at least in this context - be that the result is undefined.
What say you lot in the peanut gallery? :)
int a = 12345;
int b = 0;
int c = a % b;  /* divisor is zero: undefined behaviour */
I think there is some merit in things failing in a predictable way when performing the same operation.
Yes, there is, but this is not what the C specification says. And compilers implement the specification. So your complaint is with the standards people, not the compiler writers. A lot of people have complaints for them, but they have their reasons.
In a more modern language, I would design in explicit checks and try to have as few undefined things as possible, ideally none. But this is not what C is, and it is too late to change.
Undefined behavior covers anything including the destruction of the machine (look up halt and catch fire).
If you don't want this to happen, then it's your job as a C programmer to make sure you never trigger this behaviour, for example by checking that a divisor is non-zero before dividing. "Undefined" in the C spec is the equivalent of your doctor saying "If you get a headache after drinking five cups of coffee, then don't drink five cups of coffee".
If you want the language to take care of this instead of you, use a higher level language that defines a zero division exception or something like that.
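To make that concrete, here is a minimal sketch of such a caller-side check (the helper name is mine, not from the thread):

#include <limits.h>
#include <stdbool.h>

/* Hypothetical helper: only divide when the result is defined.
   Returns false instead of triggering undefined behaviour. */
static bool checked_div(int a, int b, int *result)
{
    if (b == 0)
        return false;             /* divide by zero is UB */
    if (a == INT_MIN && b == -1)
        return false;             /* INT_MIN / -1 overflows: also UB */
    *result = a / b;
    return true;
}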
needs to have some kind of pragmatic consideration applied.
This is called "defining the behaviour". Once you limit the set of outcomes, you have defined the behaviour.
I think that is when the pretence of "we can't help it" needs to go away.
"We don't check" is the defined behaviour though. And this is good because it lets you write more optimal code. If you are passing some variable around, you as the programmer can check it once and then let the code assume the operation is defined.
"We don't check" is the defined behavior though. And this is good because it lets you write more optimal code.
If I'm allowed to produce code that doesn't give the correct output, then I can produce extremely small and performant code very quickly.
Exactly. That's what the compiler did in my case, where it replaced a huge function with a UDF instruction. The result of executing my huge code and this one instruction is the same, so why do more?
This only happened because the undefined behaviour was detected at compile time, of course (there is probably some combination of warning flags that would have warned me, but they were not part of -W -Wall). At run time you just end up with really unpredictable behaviour that depends on the underlying architecture.
Compilers are free to do this, so don't write code that contains UB.
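As an illustration of the kind of collapse being described (a sketch, not the poster's actual code): a function whose every execution provably hits UB may legally be compiled down to a single trapping instruction.

/* Every call divides by a value the compiler can prove is zero,
   so every call is undefined behaviour. An optimizer may replace
   the whole body with one trap (UDF on ARM, UD2 on x86). */
int always_ub(int a)
{
    int zero = 0;
    return a / zero;
}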
I fully agree. And I don't like that UB exists. But I can hardly blame the people in 1972 for not thinking about it. And I can't blame the standard maintainers for not wanting to break things. It is better that way. Look at the Python developers, who are happily breaking things all the time to make them "better".
C is what it is. Although I personally don't like most of the new languages, there are good arguments for not using C anymore, especially for really critical stuff.
I would like to see an improved version of C, but unfortunately all attempts at this instantly grow a ton of features and become bloated and unusable.
Multicore CPUs are forcing the adoption of new languages, and about time too :)
When there is no good reason, don't cause unrecoverable program failure for unexpected input. It crashes rockets.
the serial RS422 cable of the MTC bridge
Yes, there is, but this is not what the C specification says. And compilers implement the specification. So your complaint is with the standards people, not the compiler writers.
I suppose you're right. I guess that's what happens when you try to mould a standard around a lot of existing and varied implementation.
Don't get me wrong, I'm not criticising the C language as a whole. If I were upset about this stuff, I wouldn't be using it. I know it's the programmer's job to avoid triggering undefined behaviour (e.g. check your divisor values first, your array bounds, that pointers are valid, etc.). I just wish in some areas (like this) there was more predictability and consistency. Speaking of which, further to my example above, I just remembered one more thing about this scenario: the routine for 32-bit division (as opposed to modulus) doesn't even have the same issue! It uses a different algorithm and is perfectly fine with a divisor of zero! Even more inconsistency. ::)
I think the whole thing about "undefined behaviour" meaning any random event could occur - up to and including your computer exploding, your hair turning purple, or being haunted by evil spirits - needs to have some kind of pragmatic consideration applied. Not for compiler or library writers to just shrug their shoulders and say "we're okay with letting random shit happen even though some of it is under our control".
Edit: That last sentence also gave rise to this thought: as I understand it, a lot of C undefined behaviour is due to having to allow for the idiosyncrasies of varied hardware. The circumstances that lead to undefined behaviour may be uncontrollable due to the nature of the platform the code is running on. But when the behaviour is entirely within an entity's control (e.g. add a line of code to a library routine to perform a zero check), I think that is when the pretence of "we can't help it" needs to go away.
I fully agree. And I don't like that UB exists. But I can hardly blame people in 1972 for not thinking about it.
I can blame them. The relevant knowledge was available at that time.
I am sure they did think about it. Hardware back then was far less 'generic' in capability and implementation. Even how many bits were in a character or byte was still up in the air. This is over a decade before IEEE floating point was a thing.
Yes, yes and yes.
C was designed around what is the minimum reasonable behavior we can expect from a computer. C also says "don't play outside that safe space if you want consistent, portable, predictable results from your code".
I will even venture that had C not had any UB cases, it would have been long dead by now.
Yes - C is practical, like many of us engineers. And I truly enjoy seeing how the practicality of C, and the massive success thereof, triggers so many "computer scientist" types who can't stand that something so practical just works against their expectations, and refuses to die or get replaced against their predictions, decade after decade. I'm not so much a C fanboy as a "C hatred discussion enjoyer".
Undefined behavior covers anything including the destruction of the machine (look up halt and catch fire).
I wish I could type more, but this new SMF forum "quoting system" is f*king annoying. Proof that a newer program is not always better, given a "not so thoughtful" person behind programming it.
C was a hack job, a quick and dirty "it'll do for now" because designing was too much effort, it wasn't intended to be a serious, practical language, just something crude but useable to play with operating system ideas.
Says who? Someone who has been collecting questionnaires about machine and ARM language in this forum these past few days? Or someone better than the Bell Labs computer scientists who got a Turing Award, worked for years on it, and on Unix, from which Linux originated? https://en.wikipedia.org/wiki/C_(programming_language)
Did you click and read the link, and the links inside? Where are your credentials or proofs? Tell yourself that when you make a wild statement, you should also provide proof to show you are right.
Don't rant about me the person, prove me wrong, if you have evidence I'm wrong then present it and we can discuss the matter without animosity or insults or insinuations of incompetence.
Trying to argue against me on the basis of presumed credentials or lack thereof won't strengthen your case. Besides, I've designed and developed compilers far more sophisticated than C - have you? Still want to talk credentials?
Thompson desired a programming language to make utilities for the new platform. At first, he tried to make a Fortran compiler, but soon gave up the idea.
If you can't live with C's UB concept, don't use C. C is a sharp blade as well.
Yes, use C sharp (C#).
If you call evolution, improvement and expansion a hack, then everything is a hack, including Python and whatever you are going to (re)invent... C evolved; you don't expect what works today to still work 10-20 years from now. And it's not the first language Ritchie experimented with. Please know your place... unless you already have something better (credentials) to prove, i.e. once your language has been on the top list for decades...
If you can't live with occasional snags, don't use a scalpel. Scalpels are for professionals only.
If you can't live with C's UB concept, don't use C. C is a sharp blade as well.
The problem is that very few programmers understand all the subtle and surprising causes of "undefined behaviour". Most think they do, of course :( That was famously illustrated when Hans Boehm thought it worthwhile to write a paper pointing out that you couldn't write a threading library in C - too many people presumed or "thought" you could.
C++ is worse. The committee defining the language didn't understand what they had created until their noses were rubbed in it by a short program that spat out the sequence of prime numbers while it was being compiled. That's right; the design committee hadn't realised C++ templates are a Turing-complete language in themselves - and that some valid C++ programs could never finish being compiled.
The least unsatisfactory alternative with C is to adhere to one of the relatively benign subsets of C, e.g. MISRA.
dcl a_ptr pointer based (b_ptr);
dcl b_ptr pointer based (a_ptr);
If you cover the scalpel with cotton wool, safety tape and other precautions, making it so that even a toddler could use it safely:
What do you think would happen to a patient who urgently needs an operation to save their life, if the only tool available was the aforementioned scalpel?
If a tricky embedded system needs to perform a very rapid software operation in order to work safely and correctly - otherwise the voltage/system/etc. will change too much (in real time), ruining the motor controller or whatever the system/software is trying to do -
then your suggested 'new' language, i.e. NOT very high-speed/efficient C, may make the task practically unsolvable with the existing/affordable hardware.
In other words: use the RIGHT TOOL for the RIGHT JOB. Not some flim-flammy, cotton-wool-wrapped, safety-tape-covered thing that is not suitable for the job in hand.
To be fair, C has some footguns, uncertainties and unsafety that are not required to get the power. It is just that people who find C unusably bad tend to come up with much worse alternatives that fail to provide the "power", as you point out.
C is like a scalpel which fits in the hand relatively well, but not perfectly, and sometimes it slips. Most "C hateboys" are those who ask for a language where the scalpel is wrapped in cotton wool, or a scalpel that looks like a beautiful cathedral. But the right thing to ask for is a scalpel which is still a scalpel, but with a few safety improvements. Think about a knife with that "bump" in the handle which prevents your hand from slipping onto the blade if you push it down. C lacks this "bump". But C has a rigid screw hole in the right place, so you can easily add the bump yourself.
I do admit it is interesting to discuss the crappiness of C, but the end result seems to be: it's still not colossally too bad, and the alternatives do not prove very usable, so - we stick with C, and we are careful, and we make far fewer mistakes than the "C is dangerous" alarmists claim. At least for myself, 95% of my bugs are in logical thinking and would happen in any language. Maybe some 5% would be avoidable with a better-designed language. I don't even remember when I last overindexed a table or made a mistake with a zero-terminated string; it has been years!
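As a sketch of what such a "bump" could look like, bolted onto plain C (my illustration, not from the post):

#include <stddef.h>
#include <stdlib.h>

/* An opt-in checked accessor: the array stays a plain C array,
   but indexing through this helper fails loudly and predictably
   instead of silently invoking UB. */
static int at(const int *arr, size_t len, size_t i)
{
    if (i >= len)
        abort();       /* the "bump": a guaranteed, visible stop */
    return arr[i];
}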
What do you think would happen to a patient, that urgently needs an operation to save their life, if the only tool available, was the aforementioned, scalpel?
I think it's more like table saws with saw stops and without; lots of professional woodworkers will keep becoming 9-digit professionals for the foreseeable future, but times are a-changing.
C programmers got their chance with Cyclone C, now bean counters will make the choice and it will not be C-like. Learn to love Rust and Ada-SPARK, though I admit they are hard to love.
Well I can argue that the committed C devotees, can only ever defend their position by making up analogies! Defending the language on technical, engineering, computational prowess or expressive merits is just too hard to do.
I'd prefer it if a particular language had programmer-adjustable options which could vary the amount of safety checking a compiler does, especially at run time.
Well I can argue that the committed C devotees, can only ever defend their position by making up analogies! Defending the language on technical, engineering, computational prowess or expressive merits is just too hard to do.
You miss the fact that we do engineering, based on technical facts, day in day out. These discussions are free-time for us, and making up analogies is fun.
Also the technical sides have been pretty much covered already, over and over again.
Besides, whenever we go deep into technical, factual reasoning, nothing seems to be enough for you because you want "proof" instead of "opinions". So enjoy actual opinions, and some stupid analogies. It's more fun, anyway.
The danger with analogies is that even a slightly misleading one can lead to conclusions that aren't true for the original scenario. We can compare C to a broken chainsaw or a broken semi-automatic, but nobody would advocate a surgeon use a broken scalpel - or a scalpel with a chipped, loose, blunt blade - would they?
(https://i.pinimg.com/736x/55/89/2b/55892bc762fbce02facbdd6107cddeea.jpg)
No, I think you accidentally posted C++ (just the standard without compiler extensions).
C with compiler extensions is attached.
Alright, that's fine with me, analogies it is! So if the C language was implemented in hardware, it would look a lot like this:
(https://i.pinimg.com/736x/55/89/2b/55892bc762fbce02facbdd6107cddeea.jpg)
You can see all of the many vendor extensions there quite well, the support of different standards here and there too; pretty much captures all of the bells and whistles, I think.
I don't get your analogy... What different vendor extensions? What different standards? And what are you suggesting as a better alternative? I will look forward to looking up the vendor extensions and different standards of the better language you are suggesting.
Many of the things people want from C are actually available as compiler extensions already, and as a practicalist, that is completely fine with me, and I don't feel bad still calling it "C". GCC can add runtime checks for out-of-bounds array access, and also check for UB, as ataradov already showed in post #13: https://gcc.gnu.org/onlinedocs/gcc-7.2.0/gcc/Instrumentation-Options.html
But these threads always go in circles because people enjoy the discussion itself and do not spend a lot of time in reading and digesting what was said and what the consequences are thereof.
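For instance, with the instrumentation options linked above, something like this (the demo program is mine):

/* demo.c - build with:  gcc -fsanitize=undefined -g demo.c
   The instrumented binary reports the bad index at run time
   instead of silently doing something unpredictable. */
int main(void)
{
    int a[4] = {1, 2, 3, 4};
    volatile int i = 5;   /* volatile so the index isn't folded away */
    return a[i];          /* out of bounds: flagged by the sanitizer */
}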
Ahh, you're right, my apologies, here we go, if the C language was clothing:
(http://lh6.ggpht.com/__hNkdGikmfU/S1b4rmDB9PI/AAAAAAAAASs/wd2dTgBd2GE/s400/caduta-009392-2_-3__tonemapped.JPG)
Those 'going round in circles', I think, are because one group wants everything readily served on a silver platter if possible, i.e. the built-in library, the compiler and the (managed) language doing the job for them: the boundary checks, the UB behaviour, etc. I guess that's what the profession demands for production's sake. Extrapolating, what they actually wish for is a language/compiler that can read their mind, or at least lets them program a single line DoMyProgram() and everything is done. And the other, old-school group (probably for hobby's sake, like me) still wants to stick with an all-manual-control, bare-down-to-the-metal language, happily writing if ((i >= size) || (divisor == 0)) CallBackUndefineBehaviour(); or DIY-building a bulletproof memory management class/library for a particular resource-limited HW. I think this is what the OP tried to do and half solved, and then encountered a problem in the built-in/standard library. Which library vendor/maker? I'm not sure, but I don't recall an infinite loop in my M$ C/C++ compiler during my entire life of programming; it just threw an exception and the program exited. Let's hope we go extinct, and let's see what the new generation/language can offer in the future, better or worse, without C... Google AI possibly, that is...
Ahh, you're right, my apologies, here we go, if the C language was clothing:
(http://lh6.ggpht.com/__hNkdGikmfU/S1b4rmDB9PI/AAAAAAAAASs/wd2dTgBd2GE/s400/caduta-009392-2_-3__tonemapped.JPG)
Isn't that the opposite? I mean, C lets you do pretty much what you want, including unsafe things. The clothing pictured would be when a C greybeard becomes mentally unstable and is given Ada to work with to do less harm, no?
You're doing a sort of informal fallacy on C. It's just that C is so extensible by nature - which is IMHO a big plus - that every vendor does extensions to it in their own way. I also made my own extended library, complete with bulletproof boundary-check classes; does that make me the 15th standard, which is bad? (Luckily I don't publish it; it's for my own use.) What do you expect? An M$-style driver certification where each vendor must comply with certain standards set by the committee, and possibly pay for the certification? M$ got heavy bashing for this, no? How could you satisfy everyone? Are you going to make a language that only you can extend? Good luck! Python faced the same compatibility issue with its ever-expanding libraries/extensions.
Arm C Language Extensions (http://file:///C:/Users/Hugh/Downloads/acle-2021Q2.pdf)
Microsoft extensions to C (https://learn.microsoft.com/en-us/cpp/build/reference/microsoft-extensions-to-c-and-cpp?view=msvc-170)
Nvidia C Language extensions (https://developer.download.nvidia.com/compute/DevZone/docs/html/C/doc/CUDA_C_Programming_Guide.pdf)
Microchip C Language extensions (https://www.puntoflotante.net/51288f.pdf)
Ahh, the old "what are you suggesting as better alternative" defense, how predictable.
If it is insisted that the language perform extensive safety checks at run time, even at the lowest (system) level: it might need to check that an address is within the valid bounds of the particular array (if applicable). It might check that the value is NOT 0 (NULL), as that may indicate a possible error somewhere.
It could check that the value is not less than 32, and is not too big, etc.
Possibly other safety checks could be implemented.
But then, with all the extra memory accesses, comparisons, branches and other stuff, it might take 5 or even 20 or more times longer to execute.
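A sketch of where that cost comes from (illustrative names, nothing from the post): every guarded store adds compares and branches in front of what would otherwise be a single instruction.

#include <stdint.h>
#include <stdlib.h>

/* Unchecked: typically one store instruction. */
static void store_raw(uint32_t *p, uint32_t v)
{
    *p = v;
}

/* Checked: a null test, a bounds test and two branches,
   on the hot path, every single time. */
static void store_checked(uint32_t *base, size_t len, size_t i, uint32_t v)
{
    if (base == NULL || i >= len)
        abort();
    base[i] = v;
}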
You're doing a sort of informal fallacy on C. It's just that C is so extensible by nature
That's exactly how I managed to "solve" my problems.
20x the code size you would get with C89 and 20x slower (-O1 is the only one implemented at the moment).
As a benefit it simplifies both ICE-testing(1) and ICE-coverage.
That's a good counter-analogy. As CPUs become ever faster, more powerful and relatively low-cost (for the performance you get), the importance of a compiled language's speed efficiency gradually diminishes over the years.
From the compiler's point of view, unsafe Rust, and Ada pragma'd to turn off runtime checks, can express pretty much the same things as equivalent C; it's the programmers who will be slower converting to Ada-SPARK and especially Rust.
The Ada-SPARK / Ada type languages seem to need (I've not really looked into it especially hard) the purchase of presumably expensive compiler licenses. Which, compared to the wide range of apparently free interpreters and compilers available, such as GCC, is difficult to stomach.
I.e. if we go back to the old days, when most compilers had to be paid for, it wasn't so much of a problem, as you could choose the language you wanted, then buy it (if affordable).
But nowadays, with so much free stuff to choose between, it is difficult to justify paying just to use a particular language, perhaps for a hobby project.
I think there is an old (not up to modern standards) open-source, free version of Ada floating about. But if I remember correctly, it seemed too old, using outdated language specifications, to be worth trying out.
Whether compilers should be free or not is, I suppose, another topic of conversation.
It has no string or bit data types, it has a contrived, clumsy way to let you declare bit fields, it doesn't let you access bits in bit fields using array subscript notation, it has no support for catching and handling divide-by-zero or null pointer references, it treats assignments as expressions, it restricts you to always base array elements starting at zero, it restricts you to mapping arrays in only one direction, it forces you to "forward declare" functions, it uses the term "static" to mean "private" but in other contexts it means something else.
Man, I think I'm pretty clear on what you are after. I don't want to go further with this type of argument; you have every right to be unhappy with it. I will be looking forward to seeing your new invention.
It has the totally useless keyword "void" which litters the source code and header files (and that has contaminated many languages since including C++, Java, C#...), it has reserved words (as do all derivatives of C). It's one of the oldest and feeblest programming languages still in regular use, it's dated, stuck in the past, hampers productivity, confuses new users with silliness like ++I and J-- and so on.
The Ada-SPARK / ADA type languages
Part of the popularity is of course non-technical, as always
"printf" was/is a good example here, in this forum ;D
(we tried to get rid of it, but ...)
Make your own! Just don't make another confusing/non-compliant standard such as printf2 or ditbho_printf, unless it's for your own use.. ;D
The Ada-SPARK / Ada type languages seem to need (I've not really looked into it especially hard) the purchase of presumably expensive compiler licenses. Which, compared to the wide range of apparently free interpreters and compilers available, such as GCC, is difficult to stomach.
GNAT (GNU Ada) comes at the exact same cost as any other GNU compiler: zero. It passes 100% of ACATS (kind of an Ada standards compliance test).
I'm confused as to whether you are considering amateur or professional devices.
"printf" was/is a good example here, in this forum ;D
(we tried to get rid of it, but ...)
I could go on, but there's no point, you could accept at least some of these as valid criticisms rather than stubbornly defending the language no matter what I say.
You misinterpret others because you are not paying attention, and because we are making it worse by making fun of your blindness.
Everyone agrees with (nearly) all the points in that list of criticisms. There are good counter-points to many of them (more like explanations of why it is so, not that that makes it any better), but not all; some are just objectively very stupid, such as the messed-up double use of the keyword "static".
It's just we have heard all of this thousands of times, we all know about it, and we agree with it.
So what's next? We want to hear constructive ideas about how it could be done better. You got a pretty bad start with your thread because of your mixed bag of random ideas: some just poorly thought-out syntactic sugar which does not taste any better than the "static" mess in C, just different; some having consequences for memory footprint, making your language unsuitable for small microcontrollers, contrary to the thread title; and so on and so on. Combined with an extremely overconfident "I know this stuff and you don't" attitude. A bad mix; assholes like me are tolerated when we are at least right or have good ideas 95% of the time.
So instead of repeating the "C is crap" ad nauseam, which we already know, try something new. Or don't; in the end it doesn't matter.
There are several things that a new language would bring, here's a summary of the more salient:
- No reserved words, thus enabling new keywords to be added over time.
- Support 'bit' as a native data type.
- Support 'strings' as a native type, BCD/decimal as well.
- Support for namespaces.
- Computed gotos (see the sketch after this list).
- Flexible alignment, packing and padding directives.
- Nested functions.
- Precision timing features like emit multiple NOP operations or ensure identical execution time for (say) case clauses in a switch.
- Support for an async/await model.
These are the kinds of things that I've seen other people raise or complain about sometimes, things that a freshly designed language could readily accommodate.
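Of those, computed gotos at least already exist as a GCC extension (labels as values). A minimal sketch of the usual interpreter-dispatch use, with a made-up opcode set:

/* GCC extension: unary && takes the address of a label, and
   "goto *expr" jumps through it. ops must end with opcode 0. */
static int run(const unsigned char *ops)
{
    static const void *dispatch[] = { &&op_halt, &&op_inc, &&op_dec };
    int acc = 0;
    const unsigned char *pc = ops;

    goto *dispatch[*pc++];
op_inc:  acc++; goto *dispatch[*pc++];
op_dec:  acc--; goto *dispatch[*pc++];
op_halt: return acc;
}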
At a minimum the language would include:
1. More storage classes than C (defined, based etc)
2. More data types including decimal/BCD and strings of fixed or variable length.
3. Arbitrary rank arrays with optional bound specifiers.
4. Distinguish between function and procedures.
5. Fine control over padding and alignment and field ordering etc.
6. No object oriented features, no GC, no virtual functions and so on.
7. No need for "forward declarations"
8. Nested functions/procedures.
9. Some kind of "computed" goto support.
10. No reserved words.
11. No explicit pointer arithmetic, like ptr++ and ptr-- and so on.
12. Invocable variables (a simpler form of C's function pointer)
That's the basic initial list anyway, entirely reasonable and feasible and something I've implemented successfully in the past.
I don't think you should be posting big chunks of stuff from a DIFFERENT thread, and discussing it in this one.
Best to keep the discussion to that other 17+? page thread.
Because it will just cause too much confusion, and damage/destroy/off-topic this thread, which somewhat belongs to someone else and/or this forum.
"printf" was/is a good example here, in this forum ;D
(we tried to get rid of it, but ...)
Printf has also been a big cause of exploits BTW.
But I can't really blame that purely on C, it's as much because of bad design ideas put in stone during the Unix era regardless of C. Combining escape sequences with un-trusted text input is an atrociously bad idea. By now it's so entrenched, people can't even conceive of alternatives.
It's a relatively bad idea given how it can go bad in so many ways, but I can understand the appeal. Formatted printing is convenient to use. Re-read the thread about printf() alternatives, many people don't want to have to deal with alternatives that require more programming effort. A format string is easy to use and (relatively) easy to read
C is not about safety or protection to your heart's content; its standard library just 'minimally works'. If anyone is not happy with it, make another, safer library of your own - don't use the standard printf, as the OP has figured out. But then there are people at enterprise grade who want everything ready on a silver platter, safe and sound, regardless of what holy excuses you give them.. so it's futile.. How do the NSA and CIA deal with SW things? What alternative? Visual Basic has been around for ages and pretty much covers everything mentioned here about boundary checks and safety exceptions.. It has the most charming, human-friendly syntax, with options to do safety boundary checks vs. optimising the code for speed during compilation, etc. etc., but it never became as popular and long-lived as C; everybody went to bloated Java or Python. Siwastaja possibly explained it: nothing technical, just the support ecosystem, marketing, the right timing, or even money laundering - whatever makes a language go, even if it's crappier than the earlier one.
Printf has also been a big cause of exploits BTW.
Yep. Now, as I just said in another thread, exploits are often due more to a lack of proper input validation than to the function itself.
Think prevention vs. protection. But it's always easier to blame one's tools.
But I can't really blame that purely on C, it's as much because of bad design ideas put in stone during the Unix era regardless of C. Combining escape sequences with un-trusted text input is an atrociously bad idea. By now it's so entrenched, people can't even conceive of alternatives.
It's a relatively bad idea given how it can go bad in so many ways, but I can understand the appeal. Formatted printing is convenient to use. Re-read the thread about printf() alternatives, many people don't want to have to deal with alternatives that require more programming effort. A format string is easy to use and (relatively) easy to read, and while much more elaborate ways of implementing them in a safer way are possible, that was certainly out of the question back in the early 70's.
But anyway - you just pinpointed the issue, as I said above.
VALIDATE YOUR INPUTS.
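The classic printf pitfall being referred to, as a textbook two-liner (not code from the thread):

#include <stdio.h>

void greet(const char *name_from_user)
{
    /* BAD: untrusted text becomes the format string. Input like
       "%x %x %n" can leak stack contents or even write memory. */
    printf(name_from_user);

    /* GOOD: the format string is a trusted constant; the
       untrusted text is passed as plain data. */
    printf("%s\n", name_from_user);
}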
I would love to hear Sherlock Holmes's thoughts on PASCAL. It was around at about the same time as C in the early adoption of PCs, and a far more 'designed' language. "Turbo Pascal" and "Turbo C" were approximately equal as languages and tools, and I've used both.
I got the chance to try out Borland C++ Builder, Borland Delphi (the GUI version using Pascal) and MSVStudio... those
Why did PASCAL not win, given its technical superiority?
Here's an extremely thought-provoking blog post (http://trevorjim.com/if-johnny-cant-patch-maybe-he-shouldnt-use-c-c++/) about languages, particularly about the cause of memory corruption: is it the language or the programmer...
I loved this:
The P0 spin on this myth is: if you’ve made a bug, make sure you don’t make another bug when you fix the first bug.
That would be nice, but, really? Programmers are supposed to take their usual process, which produced the bug in the first place, and turn on INFALLIBLE MODE so that there is no bug in the patch? How likely is that?
I would love to hear Sherlock Holmes's thoughts on PASCAL. It was around at about the same time as C in the early adoption of PCs, and a far more 'designed' language. "Turbo Pascal" and "Turbo C" were approximately equal languages at tools, and I've used both.
Why did PASCAL not win, given its technical superiority?
My understanding is pretty much the same as this, I think this explanation fits the facts.
https://qr.ae/pvcWaV
First almost always beats best. Look at the crippled Intel CPU used in the IBM PC, the 68000 was head and shoulders above Intel, yet by comparison became like Pascal. Intel's crippled CPU was simply ready earlier, the rest is history.
Here's an extremely thought provoking blog post (http://trevorjim.com/if-johnny-cant-patch-maybe-he-shouldnt-use-c-c++/) about languages, particularly about the cause of memory corruption, is it the language or the programmer...
I loved this:
The P0 spin on this myth is: if you’ve made a bug, make sure you don’t make another bug when you fix the first bug.
That would be nice, but, really? Programmers are supposed to take their usual process, which produced the bug in the first place, and turn on INFALLIBLE MODE so that there is no bug in the patch? How likely is that?
I don't really see the thought provoking factor here. He pretty much says what almost everyone else does. How is that provoking anything?
I would love to hear Sherlock Holmes's thoughts on PASCAL. It was around at about the same time as C in the early adoption of PCs, and a far more 'designed' language. "Turbo Pascal" and "Turbo C" were approximately equal languages at tools, and I've used both.
Why did PASCAL not win, given its technical superiority?
My understanding is pretty much the same as this, I think this explanation fits the facts.
https://qr.ae/pvcWaV
First almost always beats best. Look at the crippled Intel CPU used in the IBM PC, the 68000 was head and shoulders above Intel, yet by comparison became like Pascal. Intel's crippled CPU was simply ready earlier, the rest is history.
Gosh, that very much differs from my recollection. Here's mine:
PASCAL predated C by a few years and was also commonly available. I remember using it on an Apple II, and it was often seen in the press (e.g. Byte Magazine, Dr Dobbs, books at the bookstore...). It took over as the preferred language for introductory CompSci at university (until it got pushed out by Java in the late 90s). Its syntax was also used as the basis for most academic pseudocode you find in books of that era.
What killed Pascal was:
- Early versions of Pascal didn't compile to native code. They compiled to 'p-code' ( https://en.wikipedia.org/wiki/P-code_machine ). This was great for portability, but poor for performance. You could not extract the full power of the machine - if you needed to, you used something else. It valued portability above performance.
- it wasn't flexible enough. Pascal had a runtime model that was very hard to get rid of, making writing system level code very challenging. With C you pretty much abandon the runtime library and do your own thing. This made Pascal unsuited to writing low-level code like building OSes.
- Code performance. When the first advanced optimizing C compilers came out PASCAL got left behind in performance. C's crudeness made for intermediate code that was easier to analyze and optimize in the resource constrained environments of the time. You ended up with smaller, faster code when you used C.
- Pascal users would still need to use assembly for performance-critical code. C would let you replace most of your assembly code with something that could be more easily developed and maintained. It was not at all uncommon for Pascal users to link to modules written in C for performance reasons - much like how Python now interfaces to C when needed. Most developers were of the view that if you are going to invest time implementing your critical code in C, you may as well use it for the rest of your codebase.
This was all at a time where business applications were largely written in something else - either in COBOL or one of the many 4GL development tools like DBase, FoxPro, DataFlex, Magix, Pick, etc. Almost nobody was writing business applications in C.
The reasons that Pascal is less prevalent than C are nothing to do with any linguistic qualities, but circumstances.
Define circumstances. If that includes the fact that Pascal is technically less capable than C, then you are right. Nobody cares about linguistics if the language is capable enough.
nobody cares about linguistics if the language is capable enough.
One of the most amusing oxymorons I've seen this year!
You don't want to get an insult when I ask whether it's your place to comment on C (which clearly it is not!), yet you are happy to give direct insults to others (in this case... me) when we give you our opinion (when you have no better point or recent news to offer)... how ironic! The new X generation on the street! That we are worth ignoring! ;D
I mean events, decisions, trends like Unix (which included C compilers) being given away free to universities, universities using that as the basis for some courses, students seeing examples of OS schedulers, utilities and network code all written in C, and so on. Circumstances like Sun adopting Unix as the basis for their software strategy. As that person stated, Unix was developed by Bell Labs, which was part of a government-protected monopoly. The Unix OS and the multitude of tools it has were all written in C; therefore, wherever Unix goes, C must follow.
That's the history lesson, and you got left so far behind.. I asked you about today's events/circumstances... Even though there are ecosystems such as Borland Delphi or Embarcadero, why does nobody (or only a minority) care to make another up-to-date extension/development, like free tools based on the Pascal language, or its structure/ideas, etc. - at least none as popular as C/C++? There is no Pascal++ to support operator overloading, polymorphism/templates, and I can find more... so bye bye Pascal... Your history lesson is a disappointment/boring, especially from someone who claims to be a proficient programmer who built a big compiler... ;D
Anything you can write in C you can write in Pascal or C# or PL/I or COBOL or APL, yes, even assembler - they are all examples of Turing machines.
Actually, you can't. At least not with the same level of control as you have in C. With the exception of C# (which has the same), none of those languages has any equivalent of the volatile qualifier to control the exact points of register loads and stores.
That either leaves you with the choice of not having certain optimisations or the inability to reliably write interrupt/exception handlers.
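A small C sketch of that point (the register address is invented for illustration):

#include <stdint.h>

/* Hypothetical memory-mapped status register. volatile tells the
   compiler every access is observable: loads may not be cached,
   reordered or deleted as "redundant". */
#define UART_STATUS (*(volatile uint32_t *)0x40001000u)

volatile int data_ready;    /* written by an interrupt handler */

void wait_for_data(void)
{
    /* Without volatile, the optimizer could hoist one read out of
       the loop and spin forever on a stale value. */
    while ((UART_STATUS & 1u) == 0 && !data_ready)
        ;
}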
First, do you agree that we can't compare languages without talking about programming-language syntax and semantics? We can't say C has a greater "level of control" unless we define what that means in linguistic terms, unless we can quantitatively measure it.
Second, my question was: is there an input that can be consumed to produce an output that cannot be achieved with another language consuming the same input, and if there is, an example?
That question is nonsense, IMHO. What you request is that we prove there are languages that are not Turing complete. By definition, programming languages have to be Turing complete or they don't deserve that designation. I won't discuss whether a rose is a rose.
I think the answer to that is "no", if you disagree then we can explore that if you want.
As for "volatile" it typically has different semantics in different languages, so right away its hard to compare languages on that basis. More importantly "volatile" is nothing to with the language, it is simply a way of influencing the code optimizers and optimizers are an implementation matter not a language issue.
Volatile is a way of saying to the optimizer "please don't rewrite these parts of my code, please ensure the codes does exactly what I wrote".
You don't want to be insulted when I ask whether it's your place to comment on C (clearly it is not!), yet you are happy to insult others directly (in this case, me) when we give you our opinion, when you have nothing better or more recent to say... how ironic! The new generation X on the street, best ignored! ;D Nobody cares about linguistics if the language is capable enough.
Nobody cares about linguistics if the language is capable enough.
One of the most amusing oxymorons I've seen this year!
First, do you agree that we can't compare languages without talking about programming-language syntax and semantics? That we can't say C has a greater "level of control" unless we define what that means in linguistic terms, unless we can quantitatively measure it?
Umm, no. First and foremost, I need a language that suits my needs. A programming language is a tool, and tools aren't rated by beauty contest; the only measure that counts is whether they get the job done.
Second, my question was: is there an input that can be consumed to produce an output that cannot be achieved with another language consuming the same input, and if so, is there an example?
That question is nonsense, IMHO. What you're asking us to do is prove that there are languages that are not Turing complete. By definition, programming languages have to be Turing complete or they don't deserve the designation. I won't discuss whether a rose is a rose.
As for "volatile" it typically has different semantics in different languages, so right away its hard to compare languages on that basis. More importantly "volatile" is nothing to with the language, it is simply a way of influencing the code optimizers and optimizers are an implementation matter not a language issue.
Volatile is a way of saying to the optimizer "please don't rewrite these parts of my code, please ensure the codes does exactly what I wrote".
Volatile has exactly one semantic in C and C-derived languages. And it provides the functionality required to write proper exception handlers and to do multithreaded interprocess communication. C is *the* systems programming language, whether you like it or not.
I like Pascal (and Modula-2 and Oberon), I even like Ada, but (except maybe for Ada, which indeed provides volatile) I'd never entertain the idea of writing code that has to interact directly with hardware (or deal with multiple processors) in one of those languages.
In C, and consequently C++, the volatile keyword was intended to
o - allow access to memory-mapped I/O devices (see the sketch after this list)
o - allow uses of variables between setjmp and longjmp
o - allow uses of sig_atomic_t variables in signal handlers.
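For the memory-mapped I/O case, here is a minimal sketch; the register address and bit layout below are invented purely for illustration:

#include <stdint.h>

/* Hypothetical memory-mapped UART status register; the address and bit
   position are made up for this example. */
#define UART_STATUS ((volatile uint32_t *)0x40001000u)
#define TX_READY    (1u << 5)

void uart_wait_tx_ready(void)
{
    /* Because the pointed-to type is volatile-qualified, the compiler
       must perform a real load on every iteration rather than hoisting
       the read out of the loop and spinning on a stale value. */
    while ((*UART_STATUS & TX_READY) == 0) {
        /* busy-wait */
    }
}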
While intended by both C and C++, the C standard fails to express that the volatile semantics attach to the lvalue, not to the referenced object. The corresponding defect report, DR 476 (against C11), is still under review as of C17.[2]
The following sections discuss the C semantics of the volatile keyword and show that they neither support existing practice nor, we believe, reflect the intent of the committee when they were crafted.
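A two-line C illustration of the lvalue-versus-object question raised by DR 476:

int x;  /* the object x itself is not volatile-qualified */

void poke(void)
{
    /* This store goes through a volatile-qualified lvalue even though
       the underlying object is not volatile. The question is whether
       the access must be treated as volatile (lvalue view) or may be
       optimised away (object view); existing practice treats it as
       volatile. */
    *(volatile int *)&x = 1;
}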
"printf" was/is a good example here, in this forum ;D
(we tried to get rid of it, but ...)
People don't realize how much of a technological achievement Rust is.
Rust reminds me a lot of Haskell: the code hurts my head, but a lot of people I readily recognize as smarter than me are adopting it. It has more industry adoption than Haskell ever had, though, even if much of it is just taking Google's CrosVM and repurposing it.
I'm not sure the industry has enough high IQ coders to make good use of it.
Zig manages to provide many of the same features with a single mechanism - compile-time execution of regular Zig code. This comes with all kinds of pros and cons, but one large and important pro is that I already know how to write regular code, so it's easy for me to just write down the thing that I want to happen.
Simple set theory says C is more than some other languages.
C gives you the ability to write a whole class of programs that sit outside of what can be written in a 'safe' language or a highly managed runtime environment - those with memory leakage, smashed stacks and other "bad stuff". The set of possible programs achievable in C therefore has to be larger than that of a 'safe' programming language. Most would argue that these programs sit outside the C standard, but it is what it is.
Most of the time this overreach is not a good thing - but sometimes the ability to shoot yourself in the foot in a very controlled manner is important, especially when directly controlling hardware or doing OS/HAL-type work.
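As one small, concrete instance of that controlled foot-shooting, consider type punning through a union, which C permits but C++ and most 'safe' languages reject:

#include <inttypes.h>
#include <stdio.h>

/* Reinterpret the bytes of a float as an integer. In C, reading a
   union member other than the one last written simply reinterprets
   the stored bytes; C++ and most "safe" languages forbid this. */
union pun {
    float    f;
    uint32_t u;
};

int main(void)
{
    union pun p;
    p.f = 1.0f;
    printf("0x%08" PRIx32 "\n", p.u);  /* 0x3f800000 on IEEE-754 platforms */
    return 0;
}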
This in particular has now caught my attention: "Zig manages to provide many of the same features with a single mechanism - compile-time execution of regular Zig code. This comes with all kinds of pros and cons, but one large and important pro is that I already know how to write regular code, so it's easy for me to just write down the thing that I want to happen."
From here (https://www.scattered-thoughts.net/writing/assorted-thoughts-on-zig-and-rust/).
A language that is both "safe", good at high-level stuff, and also very nice for low-level stuff: Ada. Anyone serious about designing programming languages should at least have a look at it IMHO.
One important distinction between Ada and a language like C is that statements and expressions are very clearly distinguished. In Ada, if you try to use an expression where a statement is required then your program will fail to compile. This rule supports a useful stylistic principle: expressions are intended to deliver values, not to have side effects.
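For contrast, C accepts an expression anywhere a statement may appear; a small sketch of exactly what Ada's rule forbids:

#include <stdio.h>

int main(void)
{
    int x = 1, y = 2;

    /* All legal C: an expression may stand alone as a statement, its
       value silently discarded. Ada rejects the equivalents at compile
       time. */
    x + y;        /* computed and thrown away (compilers may warn) */
    x == y;       /* comparison result discarded                   */

    if (x = 0)    /* assignment where a comparison was likely meant */
        puts("never reached: x was just assigned 0");

    printf("x = %d\n", x);  /* prints "x = 0" */
    return 0;
}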
A language that is both "safe", good at high-level stuff, and also very nice for low-level stuff: Ada. Anyone serious about designing programming languages should at least have a look at it IMHO.
So why aren't we all using Ada for programming these microcontrollers? What's been the issue here?