Author Topic: The Imperium programming language - IPL

Offline newbrain

  • Super Contributor
  • ***
  • Posts: 1731
  • Country: se
Re: A new, hardware "oriented" programming language
« Reply #100 on: November 24, 2022, 05:26:56 pm »
Stay relevant tomorrow and learn it today :)  8)
You know what?
I'm intrigued.
I have a small project coming: a simple SWR meter, based on the AD8310 and an ADC, a Pico board (or whatever I have at hand) and an I2C OLED display.

The HW is quite trivial, and the SW too; I think I can do without user input, so I think I'll try to go Rust.
We'll see how it pans out.
Nandemo wa shiranai wa yo, shitteru koto dake. ("I don't know everything, just what I know.")
 
The following users thanked this post: alexanderbrevig

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #101 on: November 24, 2022, 05:27:49 pm »
Another thing to look at with a language is the way we interact with heaps: how we allocate, free, and so on. I've written rather sophisticated concurrent shared-memory allocators before, so I have some insight into this area.

Something that might help in an MCU world is not only different heap strategies but also dynamically created heaps, for example heaps created for use by certain bits of code.

Heap metrics might also be useful: the ability for code to dynamically probe a heap for total memory use, or for gross and net allocation counts.

These could be the province of a library of course, but there might be some value in doing some of this within the language itself.
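To make that concrete, here is a minimal sketch in C of what a dynamically created heap with metrics might look like; the names and the trivial bump-allocation strategy are invented for illustration, not taken from any existing allocator:

Code: [Select]

#include <stdio.h>
#include <stddef.h>

/* A dynamically created heap, carved out of a caller-supplied buffer,
 * with simple metrics. No free() here; a real allocator would also
 * track net allocation counts as blocks are released. */
typedef struct {
    unsigned char *base;
    size_t size, used;
    size_t gross_allocs;
} heap_t;

static void heap_init(heap_t *h, void *backing, size_t size)
{
    h->base = backing;
    h->size = size;
    h->used = 0;
    h->gross_allocs = 0;
}

static void *heap_alloc(heap_t *h, size_t n)
{
    n = (n + 7u) & ~(size_t)7;      /* round up, keep 8-byte alignment */
    if (h->size - h->used < n)
        return NULL;                /* this heap is exhausted */
    void *p = h->base + h->used;
    h->used += n;
    h->gross_allocs++;
    return p;
}

int main(void)
{
    static unsigned char buffer[256];   /* a heap for one bit of code */
    heap_t h;
    heap_init(&h, buffer, sizeof buffer);

    (void)heap_alloc(&h, 24);
    (void)heap_alloc(&h, 100);

    /* the metrics part: code can probe its own heap */
    printf("used %zu of %zu bytes, %zu gross allocations\n",
           h.used, h.size, h.gross_allocs);
    return 0;
}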

Looking at PL/I again, we see it supports several interesting ideas with memory. One is "controlled" storage, effectively a stack concept: one can allocate a variable, allocate it again, then free it, and the variable once again refers to the first allocation. This isn't done through pointers; all the developer sees is a variable.

In pseudo code:

Code: [Select]

dcl measurement_details SomeType controlled;

allocate measurement_details;   // first generation

// do stuff with the variable measurement_details.

...

// allocate another generation; it stacks on top of the first.

allocate measurement_details;

free measurement_details;       // pops the second generation

// the initially allocated measurement_details is visible again.
...

measurement_details.totals = 10;
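For readers who think in C, a rough emulation of that "controlled" behaviour with an explicit per-variable stack might look like this; the names and the single totals field standing in for SomeType are invented for illustration:

Code: [Select]

#include <stdio.h>
#include <stdlib.h>

/* Emulating PL/I controlled storage: each allocate pushes a new
 * generation of the variable, each free pops back to the previous
 * one. Error handling is omitted for brevity. */
typedef struct generation {
    struct generation *prev;
    int totals;               /* stand-in for the fields of SomeType */
} generation;

static generation *measurement_details = NULL;  /* current generation */

static void md_allocate(void)
{
    generation *g = calloc(1, sizeof *g);
    g->prev = measurement_details;   /* stack the old generation below */
    measurement_details = g;
}

static void md_free(void)
{
    generation *g = measurement_details;
    measurement_details = g->prev;   /* earlier generation reappears */
    free(g);
}

int main(void)
{
    md_allocate();                        /* first generation  */
    md_allocate();                        /* second generation */
    md_free();                            /* pops the second   */
    measurement_details->totals = 10;     /* adjusts the first */
    printf("%d\n", measurement_details->totals);
    md_free();
    return 0;
}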


Then there are also "based" declarations, where the address of some datum is specified at declaration time and used implicitly thereafter.

Some of these memory specific features might have utility in an MCU world.

« Last Edit: November 24, 2022, 05:30:56 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #102 on: November 24, 2022, 05:28:59 pm »
The best programming language for embedded already exists!
Rust is amazing and a very good fit.

The notion that C or C++ is better is simply wrong.
It's better both in terms of what people prefer (Stack Overflow) and in the number of bugs that get past the compiler.
Stay relevant tomorrow and learn it today :)  8)

Rust is indeed popular, but I recently learned of Zig, and it seems to have much to offer the MCU developer.
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19723
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #103 on: November 24, 2022, 05:51:54 pm »
The best programming language for embedded already exists!
Rust is amazing and a very good fit.

The notion that C or C++ is better is simply wrong.
It's better both in terms of what people prefer (Stack Overflow) and in the number of bugs that get past the compiler.
Stay relevant tomorrow and learn it today :)  8)

Rust is indeed popular, but I recently learned of Zig, and it seems to have much to offer the MCU developer.

I had a quick scan of the Zig blog you referred to, and I wasn't impressed. To over-simplify, it seemed to be too hung up on a few bits of syntax, and didn't address the semantics of the languages or the appropriate use-cases for languages.

Fundamentally syntax isn't very interesting: I can learn to type different things to represent the same concepts - but I will only do so if the language semantics give me some real advantages over existing languages. Hence I avoided learning Delphi because it didn't offer much over C. I ignored C# since it offers nothing over Java (and introduces significant problems too). Etc etc.

To over-simplify again, semantics is about "what don't I have to worry about because I know the language will correctly deal with it on my behalf as my programs get bigger and more complicated". One such example is memory management. Another is parallel processing, especially with many cores/processors. Another is "will my code meet the hard realtime deadline". And, of course, the interactions between all those.
« Last Edit: November 24, 2022, 05:54:15 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #104 on: November 24, 2022, 06:30:29 pm »
The best programming language for embedded already exists!
Rust is amazing and a very good fit.

The notion that C or C++ is better is simply wrong.
It's better both in terms of what people prefer (Stack Overflow) and in the number of bugs that get past the compiler.
Stay relevant tomorrow and learn it today :)  8)

Rust is indeed popular, but I recently learned of Zig, and it seems to have much to offer the MCU developer.

I had a quick scan of the Zig blog you referred to, and I wasn't impressed. To over-simplify, it seemed to be too hung up on a few bits of syntax, and didn't address the semantics of the languages or the appropriate use-cases for languages.

Fundamentally syntax isn't very interesting: I can learn to type different things to represent the same concepts - but I will only do so if the language semantics give me some real advantages over existing languages. Hence I avoided learning Delphi because it didn't offer much over C. I ignored C# since it offers nothing over Java (and introduces significant problems too). Etc etc.

To over-simplify again, semantics is about "what don't I have to worry about because I know the language will correctly deal with it on my behalf as my programs get bigger and more complicated". One such example is memory management. Another is parallel processing, especially with many cores/processors. Another is "will my code meet the hard realtime deadline". And, of course, the interactions between all those.

Well that's fine, at least you've looked at it; it's a rich subject and a huge source of healthy disagreement!

Speaking of disagreement, you say "I ignored C# since it offers nothing over Java (and introduces significant problems too)".

I'd be interested in hearing more on the problems you found.

I use C# professionally and have used it since the earliest beta, I also often participate in the language discussion on Github where language changes and enhancements are hatched.

I'm not aware of any linguistic advantage Java has over C#. Java is not standardized either, whereas C# is defined by an international, non-proprietary standard.

Java (because it is based on the C grammar) finds it very hard to add new keywords and grammar changes; for example, Java does not have properties, so one has to code get_ and set_ methods or some other contrivance.

Java's generics are also weaker: in C# generics are instantiated at runtime, whereas Java has only a limited compile-time form of generics (type erasure) with consequent serious limitations.

C# has partial classes, greatly simplifying systems that rely on custom generated source code.

C# has indexers.

C# has ref/out argument passing and supports optional and named arguments; not so Java.

C# has superb interop capabilities; Java has very little to offer there.

C# offers pointers for scenarios where code must interact with memory not managed by the CLR.

C# supports iterators, async iterators, async and await as language concepts.

C# has come a long way over the past twenty years. My only criticism is that it might be getting a bit unwieldy, and some of that is due to (IMHO) the fact that it, too, presumed the C grammar rather than taking a fresh, clean-sheet-of-paper approach.

Also, you say "syntax isn't very interesting" and I don't know what that means; do you mean you don't care at all about the notational aspects of a language? We reason about code by looking at it, and a good syntax improves one's ability to reason compared with a poor one.

This is why mathematics succeeds: it is superb when it comes to notation. We can look at a mathematical equation and reason about it, but express the same equation in assembler or C and we lose that ability to a great extent.

The king of languages from the standpoint of notation is APL. Initially it was an experiment in a new mathematical notation for certain types of problems; later it became a programming language, and it remains to this day the pinnacle of expressivity.

Iverson, who invented APL, wrote a superb paper on this; it was the lecture he gave after receiving the Turing Award for his contribution to computer science. Syntax is everything to Iverson.

Notation as a Tool of Thought.

In a similar way lambda calculus is better at expressing certain ideas than a Turing machine is.

« Last Edit: November 24, 2022, 06:43:10 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6349
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #105 on: November 24, 2022, 06:39:19 pm »
The prevalence of buffer overflow/underflow/overrun bugs in C is at least partly explained in that the fundamental unit of reference in the libraries is an address instead of an address range.

If we look at languages that natively support slicing, they conceptually use tuples (origin, step, length) or (origin, step, min-index, max-index) to describe array slices.
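As a concrete sketch, such a slice tuple could be expressed in C like this; it is only an illustration of the (origin, step, length) idea, not any particular language's representation:

Code: [Select]

#include <stdio.h>
#include <stddef.h>

/* A slice describes an address range, not a bare address. */
typedef struct {
    double   *origin;   /* first element                          */
    ptrdiff_t step;     /* distance between elements, in elements */
    size_t    length;   /* number of elements in the slice        */
} slice;

/* Because the callee sees the extent, every access is checkable. */
static double slice_get(slice s, size_t i)
{
    if (i >= s.length)
        return 0.0;     /* a real language would raise an error */
    return s.origin[(ptrdiff_t)i * s.step];
}

int main(void)
{
    double data[6] = { 0, 1, 2, 3, 4, 5 };
    slice every_other = { data, 2, 3 };   /* elements 0, 2 and 4 */

    for (size_t i = 0; i < every_other.length; i++)
        printf("%g\n", slice_get(every_other, i));
    return 0;
}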

While current C compilers can check for array boundaries at compile time (with or without generating run-time code that checks the array boundaries at run time), the library interfaces are such that such compile time checks do not apply across API boundaries.

While interfaces like returntype functionname(size_t len, datatype data[len]) are supported since C99 –– data then has a variably modified type –– and compilers can detect simple overrun cases at compile time (and generate runtime bounds checking code) both for callers and callees using this kind of interface, they are extremely rare in practice: instead, returntype functionname(datatype *data, size_t len) is used, and this defeats the compile-time bounds checking, since the compiler cannot know that len is intended to be the length of the data array.  For the array slicing approach, only the latter (non-bounds-checkable) type of interface is possible.
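A small sketch contrasting the two interface styles just described; how much a given compiler actually diagnoses is implementation-dependent, so treat this as illustrative:

Code: [Select]

#include <stddef.h>

/* C99 variably modified parameter: the compiler is told data has
 * (at least) len elements, so simple overruns can be diagnosed. */
double sum_checked(size_t len, const double data[len])
{
    double s = 0.0;
    for (size_t i = 0; i < len; i++)
        s += data[i];
    return s;
}

/* The conventional interface: the relationship between data and len
 * is invisible to the compiler, so no bounds checking is possible. */
double sum_unchecked(const double *data, size_t len)
{
    double s = 0.0;
    for (size_t i = 0; i < len; i++)
        s += data[i];
    return s;
}

int main(void)
{
    double v[4] = { 1, 2, 3, 4 };
    /* A call such as sum_checked(8, v) gives the compiler a chance to
     * warn; the equivalent sum_unchecked(v, 8) cannot be diagnosed
     * from the prototype alone. */
    return (int)(sum_checked(4, v) + sum_unchecked(v, 4));
}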

One might argue that a language with C semantics, except replacing pointers with arrays so that pointers decay to arrays of size 1 with step zero, instead of arrays decaying to pointers to their first element, and with an optional standard library that uses the array slice approach, could be an obvious but relatively straightforward solution to the entire memory access and buffer "problem".

As to the actual generated machine code, one would still need a way to provide optimized implementations for specific step sizes for the same function, so some kind of conditional function overloading (which is not supported in C) would help.  Also, moving type, variable, and function (or scope) attributes into the syntax itself (perhaps similar to C++ [[attribute]] syntax) would be nice.

So, I don't really see that much wrong in C per se; it's mostly the C standard library that bugs me.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #106 on: November 24, 2022, 06:49:09 pm »
The prevalence of buffer overflow/underflow/overrun bugs in C is at least partly explained in that the fundamental unit of reference in the libraries is an address instead of an address range.

If we look at languages that natively support slicing, they conceptually use tuples (origin, step, length) or (origin, step, min-index, max-index) to describe array slices.

While current C compilers can check for array boundaries at compile time (with or without generating run-time code that checks the array boundaries at run time), the library interfaces are such that such compile time checks do not apply across API boundaries.

While interfaces like returntype functionname(size_t len, datatype data[len]) are supported since C99 –– data then has a variably modified type –– and compilers can detect simple overrun cases at compile time (and generate runtime bounds checking code) both for callers and callees using this kind of interface, they are extremely rare in practice: instead, returntype functionname(datatype *data, size_t len) is used, and this defeats the compile-time bounds checking, since the compiler cannot know that len is intended to be the length of the data array.  For the array slicing approach, only the latter (non-bounds-checkable) type of interface is possible.

One might argue that a language with C semantics, except replacing pointers with arrays so that pointers decay to arrays of size 1 with step zero, instead of arrays decaying to pointers to their first element, and with an optional standard library that uses the array slice approach, could be an obvious but relatively straightforward solution to the entire memory access and buffer "problem".

As to the actual generated machine code, one would still need a way to provide optimized implementations for specific step sizes for the same function, so some kind of conditional function overloading (which is not supported in C) would help.  Also, moving type, variable, and function (or scope) attributes into the syntax itself (perhaps similar to C++ [[attribute]] syntax) would be nice.

So, I don't really see that much wrong in C per se; it's mostly the C standard library that bugs me.

Well, C is basically inextensible; new keywords pretty much cannot be added, and so right there its grammar becomes almost frozen. Of course most if not all modern languages suffer from this, and to a large degree that is because they regard C as a reasonable starting point for their own grammar.

C places very little restriction on what one can do with or to a pointer; we can even programmatically change it to an invalid value. That's likely the Achilles heel of the language.

Instead, thought should be given to why someone wants to change a pointer: what do they want to achieve by doing so? There are other ways to achieve those goals without needing the freedom to arbitrarily adjust, or create, pointers.



« Last Edit: November 24, 2022, 07:00:42 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14607
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #107 on: November 24, 2022, 07:24:18 pm »
I don't really get the point of this keyword thing. Of course any language will limit the number of predefined keywords to avoid making it increasingly hard not to clash with users' identifiers.
OTOH, if you define very few to (at the extreme) no keywords in the language, you'll have to resort to cryptic base constructs using 'symbols' for your language syntax. This has been done before. It usually isn't pretty, but hey, beauty is in the eye of the beholder.

Not sure what you mean by "extensible". There have been hundreds of different ways of making a language "extensible", with various levels of success. Could you give examples of "extending", for instance, C, that would be actually useful? Extending the language on the syntax level is a slippery slope. It means basically that your language would be a constant moving target. Rarely a good thing. Now if by that you rather mean some macro system that is more elaborate than just text substitution, then sure. Why not. You'll find that in other languages already. Certainly the C preprocessor is very crude. But it's very flexible at the same time. So beware of trying to design a better macro system that would replace it: to get half of its flexibility, you're very likely to end up with an extremely complex system.

Regarding pointers, that's one of the key flexibilities of C, but of course with power comes responsibility, as they say. Nobody forces you to use pointers in weird ways. You can even decide not to do any pointer arithmetic at all if you feel uncomfortable with that, and still write useful C.

Reading this thread, it all looks like you (and others) are pointing out roughly the same things that have been pointed out over and over again. I don't want to get you discouraged, but I'm failing to see what could possibly be new here.

While creating a new language is something that piques the interest of many if not most people writing software at some point in their life, most end up giving up and getting back to writing software. Getting around, or living with, C quirks usually ends up being more pragmatic and more productive than trying to create your own language, which in the end may not prove any better.

Or, you do as DiTBho describes. You write your own tools to fit your own needs, in which case you don't ask for others' opinions because they will not get you anywhere anyway - that would be like asking people what their favorite color is in order for you to pick the color of your next car. Just a thought.

I did create a "design a better C" thread quite a while ago as an experiment, and while there were a few interesting points, it went nowhere as expected. People more or less all have gripes about C, but when it comes to defining what exactly would make it better, it's suddenly a lot of silence, with the expected occasional "use Rust instead". ::)
« Last Edit: November 24, 2022, 07:26:36 pm by SiliconWizard »
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #108 on: November 24, 2022, 08:26:06 pm »
I don't really get the point of this keyword thing. Of course any language will limit the number of predefined keywords to avoid making it increasingly hard not to clash with users' identifiers.
OTOH, if you define very few to (at the extreme) no keywords in the language, you'll have to resort to cryptic base constructs using 'symbols' for your language syntax. This has been done before. It usually isn't pretty, but hey, beauty is in the eye of the beholder.

Not sure what you mean by "extensible". There have been hundreds of different ways of making a language "extensible", with various levels of success. Could you give examples of "extending", for instance, C, that would be actually useful? Extending the language on the syntax level is a slippery slope. It means basically that your language would be a constant moving target. Rarely a good thing. Now if by that you rather mean some macro system that is more elaborate than just text substitution, then sure. Why not. You'll find that in other languages already. Certainly the C preprocessor is very crude. But it's very flexible at the same time. So beware of trying to design a better macro system that would replace it: to get half of its flexibility, you're very likely to end up with an extremely complex system.

Regarding pointers, that's one of the key flexibilities of C, but of course with power comes responsibility, as they say. Nobody forces you to use pointers in weird ways. You can even decide not to do any pointer arithmetic at all if you feel uncomfortable with that, and still write useful C.

Reading this thread, it all looks like you (and others) are pointing out roughly the same things that have been pointed out over and over again. I don't want to get you discouraged, but I'm failing to see what could possibly be new here.

While creating a new language is something that piques the interest of many if not most people writing software at some point in their life, most end up giving up and getting back to writing software. Getting around, or living with, C quirks usually ends up being more pragmatic and more productive than trying to create your own language, which in the end may not prove any better.

Or, you do as DiTBho describes. You write your own tools to fit your own needs, in which case you don't ask for others' opinions because they will not get you anywhere anyway - that would be like asking people what their favorite color is in order for you to pick the color of your next car. Just a thought.

I did create a "design a better C" thread quite a while ago as an experiment, and while there were a few interesting points, it went nowhere as expected. People more or less all have gripes about C, but when it comes to defining what exactly would make it better, it's suddenly a lot of silence, with the expected occasional "use Rust instead". ::)

Briefly, by extensible I mean trivial to freely add new keywords and attributes without breaking backward compatibility. The language grammar must be designed to meet that requirement, else it's either impossible or cumbersome at best. C's grammar is poor; it was the result of an exercise in minimalism and parser simplicity.

Examples? OK, what about label variables, an aspect of computed goto (this is pseudocode):

Code: [Select]

goto target[index];

target[1]:
    // code

target[2]:
    // code


That's a change to the goto syntax that likely won't be feasible for C. Or how about arrays of label variables? We could then alter the entries in that array and so alter the behavior of "goto label[X];". This has applications in state machines.

Or iterators like we see in C#

Code: [Select]

iterator int get_randoms()
{
    for (int i = 0; i < 100; i++)
    {
        yield random(i);
    }
}

main()
{
    foreach (r in get_randoms())
    {
        // do stuff with r
    }
}

In that example we have expanded functionality for a function declaration and a new keyword "foreach". These are random examples, and we can easily dream up more things that would help a developer, but in C it's impossible to add these without breaking the compilation of earlier code.

As for pointers, we don't manipulate pointers for no reason; there is always some ultimate objective, either to access data or to update data. There are ways to provide access just as flexibly without the crude, unconstrained pointer concept; pointers are fine, but there are superior, less risky ways to work with them.

I have real experience with compiler design and implementation, so my interest stems from that. I wrote a sophisticated compiler for the PL/I language in C, so I know those two languages well; that was a very interesting few years, and I learned a lot about the good, the bad and the ugly of languages!

My position is that C has had a bad influence on languages; it has contaminated the grammars of a huge number of later languages. If we were designing a high level language for MCUs from scratch, there'd be no need, no value, in even considering C. It has given people "tunnel vision": they often cannot see that there are other ways, other ideas.

Look at Ada: did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?

https://www.adacore.com/nvidia

Quote
AdaCore is partnering with NVIDIA™ to implement Ada and SPARK programming languages for select safety-critical firmware used in their next-generation embedded systems platforms. Customers using products such as NVIDIA® Jetson™ for embedded artificial intelligence (AI) computing, NVIDIA DRIVE™ AGX for autonomous vehicle (AV) development, and Isaac™ for Robotics are able to use the same technology at the application level. At the core of NVIDIA’s embedded systems is a system on a chip (SoC) that integrates an ARM architecture central processing unit (CPU), graphics processing unit (GPU), memory controller and IO paths onto one package. AdaCore’s GNAT Pro Ada and SPARK compilers can compile code targeting the ARM CPU embedded on NVIDIA SoCs, and increase verification efficiencies to achieve compliance with the functional safety standard ISO-26262. Supported compilation modes include bare metal as well as Linux or QNX OSes.

« Last Edit: November 24, 2022, 08:36:21 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11322
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #109 on: November 24, 2022, 08:42:37 pm »
but in C it's impossible to add these without breaking the compilation of earlier code.
This is not true, and C is expanded all the time. The C specification clearly defines which identifiers (stuff that starts with an underscore and a capital letter, and a number of other cases) are reserved for future use, and those are used to expand the language.

The features you want (like iterators) would imply mandatory heap allocation and other behaviour that C avoids because it is not possible on all platforms (including small PICs).

It is good to want to have those high level language features in a low level language, but it is impossible or impractical to actually implement them. So, when typing out the sample code you want to work, also type out the assembly you want it to translate into, to see potential problems.

And even when things may conflict, C standard authors may still introduce new things after assessment of potential impact. For example C23 would introduce a new keyword "constexpr", which would make some old code break.
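As a minimal illustration of that kind of breakage (assuming a C17 versus a C23 compiler):

Code: [Select]

/* Valid C17: "constexpr" is just an ordinary identifier here... */
int constexpr = 42;

int main(void)
{
    /* ...but under C23 this file no longer compiles, because
     * constexpr became a keyword. */
    return constexpr;
}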
« Last Edit: November 24, 2022, 08:46:17 pm by ataradov »
Alex
 
The following users thanked this post: newbrain, JPortici, SiliconWizard

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19723
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #110 on: November 24, 2022, 08:50:29 pm »
Well that's fine, at least you've looked at it; it's a rich subject and a huge source of healthy disagreement!

Speaking of disagreement, you say "I ignored C# since it offers nothing over Java (and introduces significant problems too)".

I'd be interested in hearing more on the problems you found.

Easy. Once you have a keyword "unsafe", all bets are off.

I'm not a great fan of ahead-of-time optimisation. I prefer HotSpot's ability to optimise what the code+data is actually doing. C#/C/C++ can only guess what will happen at runtime, which means the compiler has to be either pessimistic or trust that the programmer has given it correct hints.

However, that's not a clear unambiguous advantage to Java.

BTW my principal trajectory was Algol, C, Smalltalk, Objective-C, (rejected C++!), Java. But there are many many other languages I've used, from mathematical modelling, hardware description languages, stats languages, and more.

I first came across C# before it was released, when Hejlsberg came and gave a talk at the research labs. Despite being genetically and financially predisposed to tinker with new toys, I don't think anybody bothered with C#.

Quote
I'm not aware of any linguistic advantage Java has over C#. Java is not standardized either, whereas C# is defined by an international, non-proprietary standard.

Java (because it is based on the C grammar) finds it very hard to add new keywords and grammar changes; for example, Java does not have properties, so one has to code get_ and set_ methods or some other contrivance.

Java's generics are also weaker: in C# generics are instantiated at runtime, whereas Java has only a limited compile-time form of generics (type erasure) with consequent serious limitations.

C# has partial classes, greatly simplifying systems that rely on custom generated source code.

C# has indexers.

C# has ref/out argument passing and supports optional and named arguments; not so Java.

C# has superb interop capabilities; Java has very little to offer there.

C# offers pointers for scenarios where code must interact with memory not managed by the CLR.

C# supports iterators, async iterators, async and await as language concepts.

C# has come a long way over the past twenty years. My only criticism is that it might be getting a bit unwieldy, and some of that is due to (IMHO) the fact that it, too, presumed the C grammar rather than taking a fresh, clean-sheet-of-paper approach.

Also, you say "syntax isn't very interesting" and I don't know what that means; do you mean you don't care at all about the notational aspects of a language? We reason about code by looking at it, and a good syntax improves one's ability to reason compared with a poor one.

This is why mathematics succeeds: it is superb when it comes to notation. We can look at a mathematical equation and reason about it, but express the same equation in assembler or C and we lose that ability to a great extent.

The king of languages from the standpoint of notation is APL. Initially it was an experiment in a new mathematical notation for certain types of problems; later it became a programming language, and it remains to this day the pinnacle of expressivity.

Iverson, who invented APL, wrote a superb paper on this; it was the lecture he gave after receiving the Turing Award for his contribution to computer science. Syntax is everything to Iverson.

Notation as a Tool of Thought.

In a similar way lambda calculus is better at expressing certain ideas than a Turing machine is.

International standardization is a red herring, and not a deciding factor. For anything complex each implementation will have its own quirks.

I hate language extensions for many reasons, especially because I (and toolsets) have to comprehend all the code in order to understand what a+b means. The correct way to extend functionality is through libraries implemented in a simple language.

Getters and setters vs properties is mere syntactic sugar. Completely boring, and so trivial that tools do it for you.

Async and await have equivalents in Java libraries, based on different primitives. No practical difference there.

I certainly don't think that Java is perfect; it isn't :) E.g. its generics are useful only in limited circumstances. And it too is suffering from being extended too much by second rate software weenies ;)

But the main point is that C# doesn't offer compelling advantages. Plus it doesn't run well on Linux systems (and I'm not clever enough to be a sysadmin for Windows boxes!)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8728
  • Country: gb
Re: A new, hardware "oriented" programming language
« Reply #111 on: November 24, 2022, 09:17:23 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.

Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".
If you are going to insult the fine Scottish people of the 1970s like that, you are clearly a troll.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #112 on: November 24, 2022, 09:47:23 pm »
but in C it's impossible to add these without breaking the compilation of earlier code.
This is not true, and C is expanded all the time. The C specification clearly defines which identifiers (stuff that starts with an underscore and a capital letter, and a number of other cases) are reserved for future use, and those are used to expand the language.

Well yes, C does have reserved words; I said that. Ideally a language would not need to reserve any words at all. Rust has unused reserved words too; pretty much any language derived from C must.

Just to be clear, the term "reserved words" usually means simply "set aside, cannot be used for identifiers", and that's what I mean by the term. The Rust language defines them as words that are not yet recognized language keywords but might be in the future.

Consider a "yield" keyword. In C we are allowed to use a parenthesized expression after return:

Code: [Select]

   return (result);


If we wanted to add an alternative, a "yield", we'd want symmetry; we'd therefore want to allow:

Code: [Select]

   yield (item);


But we cannot, because if the user's code had a function named "yield" accepting a single argument, then the compiler is stuck.
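A sketch of the ambiguity in C terms (the function name is contrived, but the pattern is legal today):

Code: [Select]

/* Perfectly legal C today: */
int yield(int x) { return x; }

void produce(int item)
{
    /* If "yield (item);" ever became a statement, this line, which
     * today is simply a call to the function above, would change
     * meaning or stop compiling; the parser cannot tell them apart. */
    yield (item);
}

int main(void)
{
    produce(1);
    return 0;
}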

The same problem faced the C# language team; they resolved it by making the yield syntax:

Code: [Select]

   yield return (item);


They call "yield" a contextual keyword. So we can declare a name "yield" and C# is fine, but they did have to contrive a syntax to support yield as a keyword.

In a language with no reserved words at all, it is trivial: we can pick any new keyword we like and just use it. So in PL/I it would be a no-brainer to allow:

Code: [Select]

   yield (item);


The underscore-and-capital-letter scheme, too, is an ugly contrivance; all they did there was choose a form that makes it unlikely to be used by a developer. But think about it: how could you anticipate, in 1972, all of the potential, likely, future keywords your language will need? You cannot.

The features you want (like iterators) would imply mandatory heap allocation and other behaviour that C avoids because it is not possible on all platforms (including small PICs).

First, I do not "want" iterators; I used them as an example when discussing grammars. Second, what you say may or may not be true, but that's an implementation question; I was not discussing how that might be implemented, only how the syntax could look. The implementation might be feasible or it might not, but that has nothing to do with the grammar, the "parseability" of the language.

It is good to want to have those high level language features in a low level language, but it is impossible or impractical to actually implement them. So, when typing out the sample code you want to work, also type out the assembly you want it to translate into, to see potential problems.

And even when things may conflict, C standard authors may still introduce new things after assessment of potential impact. For example C23 would introduce a new keyword "constexpr", which would make some old code break.


Implementation is indeed important; if I had been suggesting specifically to add iterators then I'd agree with you 100%, but it was just an example of a keyword that one could add.

Making old code break is absolutely unacceptable to me; preventing it is one of the primary goals of any new language for me. 100% backward compatibility is a core goal. Being able to compile ten-year-old code with a new version of a language that has eight more keywords than the earlier version, and be guaranteed a clean compile, is - IMHO - non-negotiable if one takes language design seriously.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #113 on: November 24, 2022, 10:00:48 pm »
PL/I (yes, he's on about that again) had a neat way to support language functions; these too could be added without breaking older source code. Inherent language functions, "part of the language", were referred to as "builtin" functions. Consider:

Code: [Select]

a = substring("this can't be real",4);


The "substring" is "part of" the language. But if a user created a function named, say, "frameptr" that took some argument, like this:

Code: [Select]

f = frameptr(4); // get pointer to the 4th window frame.


Then there's a problem. If the language team ever wanted to add a language function "frameptr", we're stuck: the new compiler has no way to distinguish the user's own function from the new language function!

This is resolved elegantly, though. The approach is that all uses of language functions must first be declared, like this:

Code: [Select]

dcl frameptr builtin; // dcl the function to walk up the stack frames.

// other code...

So to use any language function, even a new one, one must declare it as "builtin"; and since the language knows the signature, there's no need to write anything other than "builtin": the return type and argument types don't need to be specified.

This is very powerful and goes hand in hand with the absence of reserved words to absolutely guarantee 100% backward compatibility with any new keyword or any new language function.

Builtin functions ordinarily include things like string functions, mathematics functions and so on. So to use some of these in your code you'd write:

Code: [Select]

dcl cos          builtin;
dcl sin          builtin;
dcl substring    builtin;
dcl max          builtin;
dcl sqrt         builtin;

« Last Edit: November 24, 2022, 10:18:05 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6349
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #114 on: November 24, 2022, 10:06:45 pm »
OK what about label variables an aspect of computed goto
Already supported in C by GCC, Clang, and Intel CC at least, via labels as values.  For example,
Code: [Select]
int is_even(int x)
{
    static const void *const target[2] = { &&even, &&odd };

    goto *target[x & 1];

odd:
    return 0;

even:
    return 1;
}

Or iterators like we see in C#
Those can be implemented via C coroutine support.  Granted, you need either POSIX or SuS support, or an assembly (machine code) implementation, to be able to obtain and swap stack and register file context, but many library implementations for C do exist.

Essentially, "yield()" and "next()" form a coroutine pair, where calling one switches execution to the other.

In a low level language, the implicit context swap would be a Big Problem.  It is like exceptions in C++, which tend to be too resource intensive to be worth implementing in microcontroller environments.  In a microcontroller, available stack is often counted in bytes, so having two or more stacks at the same time is a no-no.
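A stackless variant avoids the second stack entirely: here is a minimal sketch of a switch-based generator in plain C, in the spirit of protothreads and Simon Tatham's coroutine macros. All names are invented, and a deterministic expression stands in for random(i):

Code: [Select]

#include <stdio.h>

struct gen {
    int state;   /* resume point: 0 = fresh, 1 = inside the loop */
    int i;       /* loop variable, preserved across calls        */
};

/* Returns 1 and stores the next value while the sequence continues,
 * 0 when it is exhausted. The switch jumps back into the loop body,
 * Duff's-device style, so no separate stack is needed. */
static int next_value(struct gen *g, int *out)
{
    switch (g->state) {
    case 0:
        for (g->i = 0; g->i < 100; g->i++) {
            *out = (g->i * 7) % 100;   /* stand-in for random(g->i) */
            g->state = 1;
            return 1;                  /* "yield": suspend here...  */
    case 1:;                           /* ...and resume here        */
        }
        g->state = 2;                  /* done */
    }
    return 0;
}

int main(void)
{
    struct gen g = { 0, 0 };
    int v;
    while (next_value(&g, &v))         /* reads much like foreach */
        printf("%d\n", v);
    return 0;
}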
« Last Edit: November 24, 2022, 10:08:32 pm by Nominal Animal »
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #115 on: November 24, 2022, 10:10:56 pm »
OK what about label variables an aspect of computed goto
Already supported in C by GCC, Clang, and Intel CC at least, via labels as values.  For example,
Code: [Select]
int is_even(int x)
{
    static const void *const target[2] = { &&even, &&odd };

    goto *target[x & 1];

odd:
    return 0;

even:
    return 1;
}

Or iterators like we see in C#
Those can be implemented via C coroutine support.  Granted, you need either POSIX or SuS support, or an assembly (machine code) implementation, to be able to obtain and swap stack and register file context, but they do exist.  Essentially, "yield()" and "next()" form a coroutine pair, where calling one switches execution to the other.

In a low level language, the implicit context swap would be a Big Problem.  It is like exceptions in C++, which tend to be too resource intensive to be worth implementing in microcontroller environments.  In a microcontroller, available stack is often counted in bytes, so having two or more stacks at the same time is a no-no.

Yes, there are various vendor-specific extensions to C and this is one of them, but it's rather a bolt-on job. Here's how it could look:

Code: [Select]

   goto target(x+1);

target(0):
   return 0;

target(1):
   return 1;


Yes, yield and iterators can be viewed as a coroutine concept; I did call out coroutine support too in my earlier list of features. Also, how something might be implemented is a valid conversation, but I'm mainly talking about grammars just now.

If something cannot reasonably be accommodated then one would not implement the feature, but until that's been determined one can consider the feature as a possibility.
« Last Edit: November 24, 2022, 10:17:04 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11322
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #116 on: November 24, 2022, 10:25:35 pm »
You are focusing way too much on "nice looking" syntax. This is not the most important part. If your code is full of those computed gotos, then you are likely doing something wrong anyway.

And I'm lost again as to what is being discussed. You throw out some random examples of code that you want to work.

Yes, it is possible to make a language exactly as you like, but that happens by opening a text editor and typing the code, not forum posts.
Alex
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #117 on: November 24, 2022, 10:38:02 pm »
You are focusing way too much on "nice looking" syntax. This is not the most important part. If your code is full of those computed gotos, then you are likely doing something wrong anyway.

And I'm lost again what is being discussed. You throw some random examples of code that you don't want to work.

Yes, it is possible to make a language exactly as you like, but that happens by opening a text editor and typing the code, not forum posts.

You could be contributing to the conversation (instead of trolling) as several other people are; I'm sure you have much to offer, but it seems you simply resent me discussing the subject. Several posts have contained genuinely useful responses, things that give me pause for thought.

I will discuss what I want to discuss with whomever wants to discuss it with me, your approval or disapproval of the subject matter is unimportant.

Oh, and regarding the naïve advice to "open a text editor and type the code, not forum posts": starting to write software to solve a problem before that problem is defined is a common failing, and the cause of many of today's ills in this business.



« Last Edit: November 24, 2022, 10:44:18 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4069
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #118 on: November 24, 2022, 10:39:16 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.

Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".
If you are going to insult the fine Scottish people of the 1970s like that, you are clearly a troll.

Some variety of small furry animal for sure.

And I think you mean "Pict".
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8728
  • Country: gb
Re: A new, hardware "oriented" programming language
« Reply #119 on: November 24, 2022, 10:42:28 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.

Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".
If you are going to insult the fine Scottish people of the 1970s like that, you are clearly a troll.

Some variety of small furry animal for sure.

And I think you mean "Pict".
The PIC was developed in Glenrothes. I think the Picts were farther north than that.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #120 on: November 24, 2022, 10:44:53 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.

Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".
If you are going to insult the fine Scottish people of the 1970s like that, you are clearly a troll.

I am Scots; even we had trouble with the Picts - or is it pickets... I forget...

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11322
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #121 on: November 24, 2022, 10:46:33 pm »
(instead of trolling)
I'm not trolling, I've just seen this exact thread dozens of times. Nothing practical will come out of it.

If the goal is to get generally educated on why certain compilers do something one way or the other, I'm perfectly fine with that. But you start your claims with stuff like "C can't do this" or "PIC authors don't know", which are all incorrect.

And that discussion of Scots is sure on-topic here.

Sure, having computed gotos with your syntax would be nice. What's next?
« Last Edit: November 24, 2022, 10:48:10 pm by ataradov »
Alex
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6349
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #122 on: November 24, 2022, 10:49:26 pm »
Here's how it could look:
Me no like.  How do you do range checking on x?  And the syntax looks too much like function calls to me.  The colon at the end is so easy to confuse with a semicolon.  Obfuscated C material, methinks...

Anyway, I have used the labels-as-values for state machines.  However, I do prefer a function pointer variant, with each state or transition (depending on how you define your state machine and events) being a separate function call, instead of a single huge switch() or goto/label mess.
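A sketch of that function-pointer style, with a trivial two-state machine and invented names:

Code: [Select]

#include <stdio.h>

typedef struct machine machine_t;
typedef void (*state_fn)(machine_t *);

struct machine {
    state_fn state;    /* the current state is just a function */
    int count;
};

static void state_idle(machine_t *m);
static void state_run(machine_t *m);

static void state_idle(machine_t *m)
{
    puts("idle");
    m->state = state_run;          /* transition */
}

static void state_run(machine_t *m)
{
    printf("run %d\n", m->count++);
    if (m->count >= 3)
        m->state = state_idle;     /* transition back */
}

int main(void)
{
    machine_t m = { state_idle, 0 };
    for (int tick = 0; tick < 6; tick++)
        m.state(&m);               /* one event per iteration */
    return 0;
}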

For things like command interfaces –– basically name-to-function mappings ––, I like to use ELF section support.  (All architectures I use support ELF object files.)  Basically, a preprocessor macro is used to generate a static const structure in a dedicated section.  The linker will collect these at link time from all object files, and concatenate them into a single array, and the start and end addresses are exported by symbols visible to the C code.
Note: This is not a C feature, but a feature provided by C compilers and linkers when they use ELF format object files.

This means that if I write a G-code interpreter, I can use a dedicated section for supported G-codes, and another for M-codes.  In the source code, anywhere a new one is defined, it gets declared using something like DECLARE_G(code, function, "description") or DECLARE_M(code, function, "description"), and the interpreter will look up each one using the linker-provided section symbols.

This works amazingly well.  I can put each feature into a separate source file, and let the builder select the desired set at compile time.  The only thing that annoys me is that the linker does not yet know how to sort those entries at link time.  I often have a key, here code (a positive integer), and if the array entries were sorted according to the key, I could use a binary search to find the entry in O(log N) time.  If I want to do that now, I need to do a separate post-link object file modification step.  And if I have to do that, I can spend a little more time doing that, and make the array into a hash table with the hash function provided by the post-link step, to find a near-perfect hash function for the data, for O(L) time lookups, where L is the maximum collision chain length.
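A compressed sketch of the trick for one of the two tables. It assumes GCC or Clang with a GNU linker, which synthesizes __start_/__stop_ symbols for sections whose names are valid C identifiers; DECLARE_G is the macro named above, while the rest is an illustrative reconstruction rather than a drop-in library:

Code: [Select]

#include <stdio.h>

typedef struct {
    int code;
    void (*handler)(void);
    const char *description;
} gcode_entry;

/* Each use drops one entry into the dedicated "gcodes" section. */
#define DECLARE_G(num, fn, desc) \
    static const gcode_entry gcode_##num \
    __attribute__((section("gcodes"), used)) = { num, fn, desc }

/* The GNU linker provides these for free. */
extern const gcode_entry __start_gcodes[];
extern const gcode_entry __stop_gcodes[];

static void g0(void) { puts("rapid move"); }
static void g1(void) { puts("linear move"); }

DECLARE_G(0, g0, "rapid positioning");
DECLARE_G(1, g1, "linear interpolation");

int main(void)
{
    /* Linear scan; as noted above, the linker does not sort the
     * entries, so O(log N) lookup needs a post-link step. */
    for (const gcode_entry *e = __start_gcodes; e < __stop_gcodes; e++)
        printf("G%d: %s\n", e->code, e->description);
    return 0;
}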

You are focusing way too much on "nice looking" syntax. This is not the most important part. If your code is full of those computed gotos, then you are likely doing something wrong anyway.

And I'm lost again what is being discussed. You throw some random examples of code that you don't want to work.

Yes, it is possible to make a language exactly as you like, but that happens by opening a text editor and typing the code, not forum posts.
You could be contributing to the conversation (instead of trolling) as several other people are
I do not always agree with ataradov, but here, he has a particularly apt point.

The way the code looks is not important.  Readability is important, but "target(0):" is less readable than "target_0" because it is so easy to confuse with "target(0);", so you are demonstrably focusing on the wrong thing, at least in this example.

What is important is how easy or hard it is to describe an expression or concept in the given language with exactly the desired side effects and no others.
This means that the most important thing to achieve with a new low-level language intended to replace C is to be able to express everything that can be expressed in C already, and more, or at least better, to some definition of "better". "Visually more appealing" is not "better", though.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #123 on: November 24, 2022, 10:52:59 pm »
(instead of trolling)
I'm not trolling, I've just seen this exact thread dozens of times. Nothing practical will come out of it.

If the goal is to get generally educated on why certain compilers do something one way or the other, I'm perfectly fine with that. But you start your claims with stuff like "C can't do this" or "PIC authors don't know", which are all incorrect.

You're trying to stifle conversation, stifle dissent; you disapprove of someone who sees things differently to you, that's pretty much it, isn't it, Alex? Well, not here; that's the stuff of Orwell, 1984, the film Brazil, intolerant, repressive communist regimes, but not here.

So I don't care what you're "perfectly OK with"; that's not of interest to me.

And that discussion of Scots is sure on-topic here.

I didn't bring it up buddy.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14607
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #124 on: November 24, 2022, 10:54:32 pm »
Look at Ada, did you know that Nvidia recently decided to start sidelining C and C++ for all future firmware development in favor of Ada?

https://www.adacore.com/nvidia

That seems reasonable. Much more so than switching to uh, Rust, for instance. ;D
 

