Would be nice if the thread could get back on track, though.
HDLs, Python...?
Next thing you know, people are going to start talking about Javascript. Yeah, no.
if (foo != bar) {
    doSomething();
} else {
    doSomethingElse();
}
It’s not even that. It’s a tar pit to lure ex-PHP developers into, so they can’t do harm anywhere else.
The funny thing about PHP was that it was ultimately supposed to be a safer way of writing CGI scripts than C, without being Perl. Hmm.
I am playing with gcc v1-beta on a MINIX-v1 virtual machine in order to compile Linux v0.0.1.
WOW, everything was so different; the CC compiler and the C sources don't look like "C".
Weird emotions, memories of the 90s, but it really looks alien compared to modern C on modern gcc.
Btw, didn't early versions of Minix use Tanenbaum's ACK compiler (Amsterdam Compiler Kit), and not yet gcc?
Linus Rulez! ... Kernel panic: attempted to kill init!
Oh, the things people do to avoid casts. Shame it'll break with gcc's
Some people are morons. Do not use sizeof!
...
some things never change
JavaScript is to browsers as C is to MCUs -a lowest common denominator intermediate language.
... Kind of yes, but with the exception that C has a proper standard, 99% of programming is done per that standard, and compiler extensions are well known and play a minor role. And sometimes you can choose to write non-portable C "by design", having control over the whole system.
Whereas Javascript is a moving target: the standardization body has accepted that they have lost the battle and just have to document how the code behaves in different browser versions. And Javascript has to be portable; you can't afford a mistake that limits usability across browsers. And Javascript kinda forces you to "test reliability into the product", as you would say; there is no way to prove that a certain piece of code just works. In C that is not easy either, but at least it's remotely possible, or almost there!
Back on topic, I think all the C replacements have failed because they aren’t C. There is no way to replace it. It’s just right in all the right places and just wrong enough where you need it to be.
Also I hope Rust burns. That’s just a whole new world of problems.
What I love (or hate?) about C is that sometimes the code looks so horrible that nobody cares to read it carefully. I still remember what happened in January 1999, when the source distribution of TCPWrappers was found trojaned; the backdoor basically did the following:
get a socket open
dup(stdin, stdout, stderr) to the socket
execve /bin/sh
in short, it opened a remote shell, giving the intruder access to any server it was installed on. And that was the ironic part, because TCPWrappers is a host-based networking ACL system, used to filter network access to Internet Protocol servers on Unix-like operating systems such as Linux.
So it was the defensive system that literally left the door open, and the source code was under everyone's nose. But since C sucks at being "clean and tidy", nobody cared to take a look at the source code being compiled and executed with root privileges(1), until - "oops, I think some intruders have entered the network ..."
LOL
(1) Today we have finer-grained privileges; distros like Arch Linux strongly encourage using them.
Back on topic, I think all the C replacements have failed because they aren’t C. There is no way to replace it. It’s just right in all the right places and just wrong enough where you need it to be.
Whereas several decades of use have shown the last sentence looks right, I just can't fathom "there is no way to replace it". Although most attempts have failed indeed, nothing in this world, at least among things that are man-made, is a definitive absolute. So this can't be right. K&R were certainly extremely well inspired, but they were no gods.

Also I hope Rust burns. That’s just a whole new world of problems.
Ahah, I guess we like your (often) extreme tone.
If it turns out not to be fit for long-term real use, then it will die by itself. No need to curse it.
I think there are a number of good ideas in Rust, and a fair deal of bad ones, all topped with an overall atrocious syntax (but that's relatively subjective here). My first reaction to Rust was fairly negative, but not willing to have too much prejudice, I still took a close look. I liked some things, such as 'traits'. I did not really like the whole borrowing thing. While looking good on the surface, I think it turns out to be a very wrong answer to a real problem. As to syntax, while it remotely looks C-like, it's in all its glory odd enough that it makes any real Rust code almost unreadable unless you've been thoroughly "initiated". Then all this crates stuff has good and bad sides; it's not all rosy. But the worst part is that many of the programmers trying to use it to develop real things, even when they have a couple of years of experience with it, still struggle.
Anyway, yes. Replacing C is hard. It's surely not impossible though. For any chance of success, I think it'll take a lot of pragmatism, and it'll take listening to people actually developing things, rather than designing a brand new language with a small team convinced they know how things should be done.
Well, you can write C that perfectly conforms to one of the many standards, and compilers produce portable code*.
*) The real world always has bugs; every non-trivial language and compiler has bugs. C compilers are pretty good overall.
Rust was initially a one-person project, so it very much falls into the category I talked about. The guy was a Mozilla employee. How Mozilla got interested in this enough to invest and give him a team, I don't know for sure. I'd sure be curious to know the whole internal story.
I just can't fathom "there is no way to replace it". Although most attempts have failed indeed, ...
Of course C can be replaced. C isn't exceptionally good; many existing attempts could technically replace it if need be, if not straight away, then with some minor modifications that would come about as people started using them to replace C. (Similarly, the original pre-C89 C, without any compiler extensions etc., would be pretty crappy for modern-day needs.)
It's not about whether it can be replaced, it's about if it needs to be replaced.
But because C works well enough, why bother? The replacement would need to be exceptionally good and solve all the problems of C without introducing any new ones to be worth the hassle, and even then it would be difficult to make existing C programmers change because they are relatively happy with the status quo.
But remove C by force, for example by outlawing it, and you'll see it can be replaced.
C has often been described as a "high level assembly language". Though awkwardly it specifies so little about the actual machine that you couldn't well USE it as anything like a portable assembly language!
How many bits per char? Well..
How many bits per short..? int? ...long? long long?
How many bits per float? ... double?
Is the FP on this target SW or HW, IEEE-754 compatible or something totally other?
How are signed integers represented? Two's complement? One's complement? Sign magnitude?
How do you do things like arithmetic shifts / rotates involving signed quantities?
How about standard efficient portable bit-reverse, count leading / trailing zeroes?, count number of 1s in a word? Calculate parity? Set / Clear a bit at index?
Can you load a multi-byte type from an unaligned address? If so under which cases? Is it efficient?
What about structure padding / packing?
Are pointers to different types convertible to each other?
Pointer comparison and arithmetic valid / invalid cases?
What's this NULL pointer thing anyway -- 0 is a valid address sometimes!
Macros / preprocessor stuff that doesn't kind of suck?
Taking advantage of linker / assembler capabilities by a standard higher level "C" interface?
How do you even do something SUPER SIMPLE like detect flags relating to arithmetic / logical operations -- was there a carry? Is the result 0? positive? negative? NaN? Inf? Denormalized?
What about swizzling? Byte swapping? Endian conversion?
What about atomic operations on bits, bytes, fundamental data types?
Locks? Semaphores? Mutexes?
Basically one can go on all day about things that C DOES NOT DO WELL when using it for bare metal, down at the bit / byte / register / architecture-specific level. There SHOULD have been some attempt to standardize C APIs for the commonly ubiquitous "intrinsic" or "builtin" operations: just an API / library interface to at least let you express your INTENT to the compiler / assembler. Whether or not a given machine implements that thing (e.g. bit reverse) efficiently in ASM, the target-specific code could at least portably generate something functional and relatively optimal. Compile-time flags might have been defined, like __HAS_FAST_BITREV_U32 or whatnot, to allow compile-time selection of faster alternatives for CODECs / FFTs / whatnot.
The other thing that C is pretty useless for is high-level programming.
For instance, C++ container classes, or the similar facilities in Python, Java, etc.
In this day and age, reinventing linked lists, hash tables, vectors, heaps, sorting, iterating, basically any container or <algorithm> facility, is a waste of time in C compared to what one can do in six lines of C++ using containers / <algorithm> / the STL.
What the relevant "higher level languages" should implement is some kind of "C-like" "unsafe" or "low-level" mode / library / API / domain-specific language, to let one express medium-to-low-level, machine-specific algorithms in the high-level language with near-native performance. That can't be hard: the original C language had only a couple dozen keywords, a couple dozen fundamental types, and a couple dozen basic operators. Considerable effort should also have been made to allow call / runtime interoperability between "C" linking / libraries and high-level languages, and where sensible the reverse.
Then, for productivity and high-level programming, use the high-level languages; for lower-level data structures or machine interfacing, use the "C-like / better than C" low-level system library from the HLL. And if you want / need to go all the way down to the ASM level, at least create some portable APIs / intrinsics / language expressions that map to the many algorithmic / data-type / ALU "hidden corners" you can't portably express even in C.