Author Topic: The Imperium programming language - IPL  (Read 67691 times)


Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #75 on: November 23, 2022, 10:16:14 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #76 on: November 23, 2022, 10:33:28 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.

Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #77 on: November 23, 2022, 11:02:32 pm »
So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language" you need to ask yourself why Microchip are seemingly unaware of this.

I can assure you, Microchip is painfully well aware of this.

Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".

No, because they've been trying for decades to move people to their (licensed-in or outright bought) AVR, PIC32, and ARM product lines, all of which support high level languages such as C very well. Possibly dsPIC too, but I don't know anything about it.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #78 on: November 23, 2022, 11:19:44 pm »
Would you say then that this could also apply to them "it is becoming more and more apparent that they don't know the first thing about either CPU instruction sets or programming languages, and they won't listen to those who do".
There is only so much you can do to improve architecture from 1976. PICs are really bad (by modern standard) as far as architecture goes. But people keep buying them and they have the highest profit margin. There is no incentive to do anything here.

Smart people will buy much cheaper and more capable ARM-based MCUs. But there are a lot of people stuck in the past and afraid of change.

There are also legacy designs that don't need a lot of maintenance anymore.
Alex
 
The following users thanked this post: Someone, newbrain, Jacon

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #79 on: November 23, 2022, 11:53:04 pm »
I spent quite a bit of time with the PIC 16F877(A) and I have the following observations:

The stack is only 8 addresses deep and there are no push or pop instructions to allow it to be used for passing function arguments.  C calling conventions are right out the window.  A stack has to be created and managed in software.

Emitted code from C compilers will spend a lot of space on code for bank and page switching.  There's only 8k of flash and the C compilers I have used burned through a lot of it.  I haven't looked at the PIC 16F series in about 10 years, maybe the C compilers are better but I doubt it.  The hardware simply doesn't support high level languages.

There are only 36 instructions and assembly language programming is fairly simple.  There simply are no complex addressing schemes and everything is dead simple.

Code will be tight, it will be laid out to eliminate paging as much as possible and bank switching will also be optimized.

The fact that the mid-range PIC is still around is testament to the fact that many applications can run on processors with just a single core and no hardware support for higher level languages.

I prefer some of the Atmel chips like the ATmega128.  It is a great chip to program with C.  The Arduino ATmega328P chip is also fun to work with at the bare metal level.


 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #80 on: November 23, 2022, 11:58:51 pm »
I do hope you aren't planning a language that is specific to current PIC processors and current PIC peripherals.

If that isn't the case, what will your strategy be for inclusion/exclusion of a language feature? Either a small language that is all useful with all processors/peripherals, or a large language that contains all possibilities, only some of which are exploitable with a given processor/peripherals.

Without being able to articulate your strategy (to an audience including yourself), you will flail around in different directions.

These are good points. No, there is no special focus on PIC products; I just wanted to fold some of the problems experienced in that domain into the overall picture. If there were some very obscure thing specific to some very narrow device, there would be little to be gained from considering it unless the effort were low.

As for inclusion/exclusion strategies I think that's a fascinating question and I've not thought about it in any real detail yet.

It would include some kind of classification of language features, I suppose. One could start listing features (however these get defined) and then tabulating them for their utility across a range of devices.

Yes, it is an interesting question - and one that you can see being played out in all sorts of arenas, for a variety of reasons. Some reasons are academic, some commercial, some pragmatic, some concealed.

There's no single correct answer. But the wrong answer often starts with not stating clearly the objectives and non-objectives.

Quote
Anyway you do raise some important questions, clearly there is more to be done in this regard.

Frequently answering questions is trivial, but knowing the right question to ask is more difficult and more important.

Over the decades I've come to appreciate skill in asking the right simple question - one where the answer illuminates all sorts of assumptions and biases.

For hardware, "what are the various clock domains?"

For software, "what is the concept and definition of 'address'?" What is the definition of "unique"?
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #81 on: November 24, 2022, 12:03:42 am »
This is simply not true, I spoke recently with very experienced engineers who routinely use assembler for 8 bit PIC processors that are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge with code bloat and optimizations causing serious problems. Simple optimizations too like short-circuit evaluations.

That is because 8 bit PIC is utterly unsuited to running any kind of modern high level compiled language.

That can't be true Bruce; CP/M for example - an OS no less - was written in PL/M, a high level compiled language. CP/M ran on the 8080, Z80 and 8085, all 8 bit devices, and has been ported to other 8 bit chips too.

And this has what to do with PIC, exactly?

8080/z80 is about 100x more compiler-friendly than PIC. But about 10x less compiler-friendly than the also 8 bit 6809 or AVR.

Spot on!

(Except that the Z80 has far too many gaping holes in its instruction map!)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26891
  • Country: nl
    • NCT Developments
Re: A new, hardware "oriented" programming language
« Reply #82 on: November 24, 2022, 12:08:52 am »
Precision timing features like emit multiple NOP operations or ensure identical execution time for (say) case clauses in a switch.
That one is certainly not going to happen. Very few machines now permit cycle exact execution of anything. Even really small devices have at least some elementary kind of cache, that messes with the instruction timing.
That's simply not true; we can emit multiple platform-specific NOPs today by embedding assembler in C. There are assembler MCU developers out there who struggle to use C because of this kind of thing. They have carefully crafted code where they want some execution path to take exactly the same number of clock cycles as some other, so they sometimes embed multiple NOPs; their designs require that.
The days that you need exact execution times are long gone. Any language chasing features like that is outdated before it has been designed.

The future is with languages that are better at dealing with complex firmware and take away pitfalls like pointers and buffer overruns from programmers so that effort can be directed at adding functionality.

Languages like Rust, Lua and Python make much more sense on a microcontroller nowadays.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #83 on: November 24, 2022, 12:28:49 am »
There is one application where exact loop timing is required: directly generating graphic images for games.  Typically the superloop will have several paths depending on which layer of the image it is generating.  Sprites come to mind.  It is critical that every pixel is laid down at exactly the right time if the image is to make sense.  This kind of thing was common on the Apple II and the Commodore 128.

Clearly, it is an old concept but it worked really well as long as the cycle count was accurate.

Look at line 276+ for display loop timing.  There is also a specific delay() that is used throughout the program.

Grab the .zip file at the bottom of the page
https://www.dos4ever.com/upong/upong.html

This is a PAL implementation but there are NTSC versions out on the web.
« Last Edit: November 24, 2022, 01:13:41 am by rstofer »
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #84 on: November 24, 2022, 12:36:55 am »
Who is generating graphics on the fly from the microcontroller anymore?
Alex
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #85 on: November 24, 2022, 12:50:12 am »
There is one application where exact loop timing is required: Directly generating graphic images for games.  Typically the superloop will have several paths depending on which layer of the image it is generating.  Sprites come to mind.  It is critical that every pixel lay down at exactly the right time if the image is to make sense.  This kind of thing was common on the Apple II and the Commodore 128.

Clearly, it is an old concept but it worked real well as long as the cycle count was accurate..

VGA 640x480@30 dot rate is 12.58 MHz. Or 1.57 MHz for each byte of b&w pixels.

Sure, that's tough on a 1 or 2 MHz machine.

But do it on a 50+ MHz machine and you can afford to take an interrupt and save all your registers for EVERY BYTE. With 500 MHz you could take a timer interrupt for every pixel.
 
The following users thanked this post: Siwastaja

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #86 on: November 24, 2022, 12:51:13 am »
Who is generating graphics on the fly from the microcontroller anymore?

Atari 2600 chads. Gigatron.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #87 on: November 24, 2022, 01:11:27 am »
Retro is a big deal! 

I have implemented several CP/M machines over the last dozen years.  I still enjoy using it and, at 50 MHz, it scoots right along.

Then there is the FPGA implementation of the IBM1130 that I first started programming in 1970 (FORTRAN and a bit of RPG). Certainly retro because the original machine ran at around 400 kHz and my FPGA runs at 50 MHz.  I swear I'm going to move it to an Artix 7 and kick it up to 100 MHz.  One of these days...

Then there is the FPGA implementation of a Z80 making up PacMan - more retro!  We're talking PacMan with the original ROMs (converted to BlockRAM of course).  Very cool!

Yes, I have the Teensy 4.1 when I want to get things done in a hurry.  I like the LPC1768 mbed but it is now obsolete and ARM is changing the toolchain.  I need to do some testing...

Pong on an 8 pin uC?  Most people think 8 pins is relegated to 555 projects or op amps...  Very retro.

 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14445
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #88 on: November 24, 2022, 01:19:03 am »
Even if you are, *with proper peripherals* you absolutely do not need to have code execute with exact timings - you only need it to execute within some max time.
Only if you use old MCUs with basically no peripheral other than very basic stuff and very limited RAM (thus preventing any double buffering) do you have to time code like this. Or even older CPUs.

Coding like this on any modern MCU doesn't make sense, and I think that was the point of ataradov's question. Now, emulating old gear with modern hardware is an endeavour with very much its own weird "rules". So anything goes, with a rationale that only the people doing it can explain to themselves.

In other words, can you code emulators even on modest modern MCUs without coding like this, having to time code execution to the cycle? Absolutely. Are there still people willing to code to the cycle using old methods? Absolutely too. To each their own.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #89 on: November 24, 2022, 01:22:37 am »
Atari 2600 chads. Gigatron.
Ok, but that's not a typical use case. This is just doing random stuff for fun. Designing a language that would align timing for you just for this is overkill.

And people working on Gigatron managed to get things going without that.

I can't imagine any actual mass market application for that.
Alex
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #90 on: November 24, 2022, 01:24:20 am »
Pong on an 8 pin uC?  Most people think 8 pins is relegated to 555 projects or op amps...  Very retro.

8 pins? GND and Vcc, two ADCs for paddles, start/pause button, video out.

10 cent 8 pin 48 MHz 32 bit RISC-V with 16 KB flash and 2 KB RAM? Might just about manage Pong :-)

With 16-ε registers, you can probably do it with zero RAM, just the MMIO.
 

Offline lyxmoo

  • Contributor
  • Posts: 13
  • Country: cn
Re: A new, hardware "oriented" programming language
« Reply #91 on: November 24, 2022, 01:54:16 am »
Less code - or minimized code - in an MCU environment would be great: no-code programming, just like a PLC, combining functional bricks together.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #92 on: November 24, 2022, 01:59:00 am »
Different applications have different needs. Programming PLCs is like hell on Earth for me personally. I would not want to program general purpose stuff that way.

Plus there are drag-and-drop systems already; they are just not very useful for real work. And doing "code" versioning on graphical input is next to impossible. This would throw away all the modern development flows.
Alex
 

Offline JPortici

  • Super Contributor
  • ***
  • Posts: 3461
  • Country: it
Re: A new, hardware "oriented" programming language
« Reply #93 on: November 24, 2022, 05:23:36 am »
That's simply not true; we can emit multiple platform-specific NOPs today by embedding assembler in C. There are assembler MCU developers out there who struggle to use C because of this kind of thing. They have carefully crafted code where they want some execution path to take exactly the same number of clock cycles as some other, so they sometimes embed multiple NOPs; their designs require that.

Simple enough microcontrollers that can produce cycle-accurate predictable timing by simple instruction counting are becoming exceedingly rare. These 8-bitters still exist, but such cycle-accuracy combined with generally very crappy performance is something people do not actually want. Instead, people buy modern high-performance microcontrollers which can still produce very accurate timing by just scaling down the absolute time used per cycle, by utilizing higher clock speed. For example, a 12-cycle ISR latency on a 400MHz MCU looks like HALF a cycle on a 16MHz AVR/PIC; at that point you simply do not care if it sometimes takes 14 cycles due to some pipeline or branch mispredict or something. I have done pretty timing-sensitive things simply in interrupt handlers, and the advantage is ease of writing, reading, and maintaining that code. Manual cycle counting, or an automated version thereof, is simply not needed anymore, except in very rare cases, which require careful understanding anyway.

The problem with this "predictive timing language" is, it becomes tied to certain type of hardware, and then you have replicated what the XCORE folks have done (apparently pretty well).

This is simply not true, I spoke recently with very experienced engineers who routinely use assembler for 8 bit PIC processors that are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge with code bloat and optimizations causing serious problems. Simple optimizations too like short-circuit evaluations.

All I'm really arguing for here is for a new language to have features that help avoid these kinds of problems, where some aspects of the generated code are controllable in a more fine-grained way; the NOP idea is just an example of what's required rather than a formally proposed language feature.

I believe you.
What I do in these cases is write an assembly module for the really time-critical stuff (or stuff that doesn't translate well into C) and link that in, calling it like a C function if it doesn't live and die in its own context (ISR).
The bulk of the firmware stays in C, with all the benefits of moving from assembly, as only the truly special parts are in assembly.
Granted, it's more difficult on 8-bit PIC if you use XC8/HiTech C because of the compiled stack, but it's not impossible.
That doesn't make me feel the need for a special language, or a special compiler whose dialect I have to remember.

Also: bit manipulation, exact numbers of NOPs - all things that can be done in XC8 using the compiler builtins. Maybe that won't solve every issue, but it's there.
« Last Edit: November 24, 2022, 05:25:37 am by JPortici »
 
The following users thanked this post: SiliconWizard

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #94 on: November 24, 2022, 10:27:10 am »
Precision timing features like emit multiple NOP operations or ensure identical execution time for (say) case clauses in a switch.
That one is certainly not going to happen. Very few machines now permit cycle exact execution of anything. Even really small devices have at least some elementary kind of cache, that messes with the instruction timing.
That's simply not true; we can emit multiple platform-specific NOPs today by embedding assembler in C. There are assembler MCU developers out there who struggle to use C because of this kind of thing. They have carefully crafted code where they want some execution path to take exactly the same number of clock cycles as some other, so they sometimes embed multiple NOPs; their designs require that.
The days that you need exact execution times are long gone. Any language chasing features like that is outdated before it has been designed.

True, but... in a hard realtime application you do need to know the worst case loop execution time and/or latency between input and output.

That implies accurate cycle counting, but not NOP insertion.


Quote
The future is with languages that are better at dealing with complex firmware and take away pitfalls like pointers and buffer overruns from programmers so that effort can be directed at adding functionality.

Languages like Rust, Lua and Python make much more sense on a microcontroller nowadays.

Python has the global interpreter lock and can garbage collect. If those aren't a problem for an application, fine.

In the medium term I would bet on Rust becoming important for embedded systems. At the moment it is on the bleeding edge.

I don't know enough about Lua to have a valid opinion.

Overall the language is relatively unimportant, provided it can control the hardware as expected and you can get staff willing to take the time to have it on their CV. The latter is a significant hurdle for custom languages, including DSLs.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #95 on: November 24, 2022, 10:29:54 am »
Even if you are, *with proper peripherals* you absolutely do not need to have code execute with exact timings - you only need it to execute within some max time.

That's the point!

With modern MCUs becoming ever more powerful (good), complex (bad), and complicated (bad bad), knowing the worst case timing is becoming more difficult. In a hard realtime environment, the mean/typical timing is irrelevant.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6239
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #96 on: November 24, 2022, 03:27:02 pm »
With modern MCUs becoming ever more powerful (good), complex (bad), and complicated (bad bad), knowing the worst case timing is becoming more difficult. In a hard realtime environment, the mean/typical timing is irrelevant.
I believe, that –– perhaps surprisingly! –– distributed processing is a valid solution to this.

In a very real sense, it reduces complications, because tasks are done in separate, isolated units, that simply communicate with each other.
Message passing interfaces (with input buffers, mailboxes, and/or DMA and completion signals/flags) let computation proceed simultaneously with communication, but it is much easier to provide/prove worst case timings for any computation.

For example, instead of integrating quadrature encoders in an MCU, I'd prefer to stick a few-cent dedicated chip on each encoder, with a shared bus to the MCU, with the MCU just accumulating the encoder state changes in a dedicated "register" per encoder.  The MCU hardware peripheral is a bit more complex, but would need few pins to support even a large number of encoders.  No software interrupts would be required for normal operation, so software timing would not be affected.  (I'd use two addresses for each "register", though: one for normal reads, and the other for reads that clear the "register" atomically when read, for easy delta tracking without having to worry about atomicity.)

Then again, I might just be crazy.  :-//
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #97 on: November 24, 2022, 03:41:06 pm »
With modern MCUs becoming ever more powerful (good), complex (bad), and complicated (bad bad), knowing the worst case timing is becoming more difficult. In a hard realtime environment, the mean/typical timing is irrelevant.
I believe, that –– perhaps surprisingly! –– distributed processing is a valid solution to this.

In a very real sense, it reduces complications, because tasks are done in separate, isolated units, that simply communicate with each other.
Message passing interfaces (with input buffers, mailboxes, and/or DMA and completion signals/flags) let computation proceed simultaneously with communication, but it is much easier to provide/prove worst case timings for any computation.

For example, instead of integrating quadrature encoders in an MCU, I'd prefer to stick a few-cent dedicated chip on each encoder, with a shared bus to the MCU, with the MCU just accumulating the encoder state changes in a dedicated "register" per encoder.  The MCU hardware peripheral is a bit more complex, but would need few pins to support even a large number of encoders.  No software interrupts would be required for normal operation, so software timing would not be affected.  (I'd use two addresses for each "register", though: one for normal reads, and the other for reads that clear the "register" atomically when read, for easy delta tracking without having to worry about atomicity.)

Then again, I might just be crazy.  :-//

No, you are not crazy. That is an extremely sane response.

It even exists for hard realtime embedded systems, and you can buy it from DigiKey:
  • the XMOS xCORE processors, up to 4000MIPS 32 cores/chip (expandable), crossbar message passing between i/o and cores
  • xC language based on event loops, message passing (like Occam and CSP), and no conceptual difference between peripheral i/o and comms between cores. Plus constructs to tell any of the peripherals when to output a result and to determine when an input occurred
  • no interrupts; just have a core waiting for one of several input events
  • no caches, so loop timing is calculated by the toolset, not measured in the hope the worst case has been encountered

Further advances will need to be made in the heavily multicore future, but xCORE+xC shows what can already easily be achieved in this space. Shame most of the world is left sticking band aid plasters on environments that were state of the art half a century ago.

We'll get to A Better Place (just as COBOL has been superseded), but it won't be in my lifetime.
« Last Edit: November 24, 2022, 03:44:33 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #98 on: November 24, 2022, 04:26:05 pm »
Well, some very interesting posts recently, I appreciate this kind of discussion.

I'm now satisfied that the 8 bit PIC stuff is a bit of an odd man out, the exception rather than the rule.

This wasn't apparent earlier; it does seem to be pretty unusual architecturally.

As for the precise timing stuff though, this strikes me as a specific kind of optimization rather than a language feature. I mean, the language could expose some keyword or something to identify such code, but ultimately it becomes just a specific kind of optimization.

So if incorporating this isn't a huge cost, then I lean toward including this as a goal, at least until some good reason comes up to abandon the idea.

The motive for this and some other things too, is to free the mind from preconceptions, use a clean sheet of paper so to speak and not subconsciously constrain a new language.

Even features that might only be used by a small number of users are sometimes still desirable to have if the cost of including them isn't high.

As for Rust, that is a "better" language than C, but I learned recently from someone here of something even better: Zig. And here's a very detailed blog post by a clearly competent professional as to why he is impressed by Zig.

He was impressed by the "inline for" feature too, and the language exposes "vectors"; these are the kinds of things that interest me.

I'd be interested to hear what things about Zig do not appeal to experienced MCU developers. It looks like Zig is rather good, but it perhaps was not designed with MCUs in mind, so it might have gaps in that respect; note too that a motivation for Zig was to improve upon C.

One advocate of Zig points this out, a view I share wholeheartedly:

Quote
"The Zig project for me," Andrew tells us, "was peeling off these abstraction layers [of C] that we've all taken for granted and examining their foundational premises and saying, 'Is that really how it has to work? Let's question some of this stuff. Maybe it can be better'."

While investigating why features in C operate the way they do, Andrew came to realize that the reasoning behind many of them is not well justified.






« Last Edit: November 24, 2022, 04:42:10 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline alexanderbrevig

  • Frequent Contributor
  • **
  • Posts: 700
  • Country: no
  • Musician, developer and EE hobbyist
    • alexanderbrevig.com
Re: A new, hardware "oriented" programming language
« Reply #99 on: November 24, 2022, 05:12:34 pm »
The best programming language for embedded already exists!
Rust is amazing and a very good fit.

The notion that C or C++ is better is simply wrong.
It's better both in terms of what people prefer (Stack Overflow surveys) and in the number of bugs that get past the compiler.
Stay relevant tomorrow and learn it today :)  8)
 

