Author Topic: The Imperium programming language - IPL  (Read 67831 times)


Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4032
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #175 on: November 26, 2022, 12:09:33 pm »
Even for the people who just want Carry, I highly doubt they are prepared for the different implementations of the carry flag with respect to subtraction and compare.

And it's simply not hard to express what you ACTUALLY WANT in portable C anyway. The compiler will automatically use the flags where available.

Bignums aren't easy in portable C :)

Well, yes they are, as just demonstrated.
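
For instance (a sketch only, not necessarily the exact code shown earlier in the thread; the helper name is made up), the usual portable idiom is to do the arithmetic in a wider type and extract the carry yourself, leaving it to the compiler to use the flags where it can:

    #include <stdint.h>

    /* Portable add-with-carry: the 64-bit intermediate cannot overflow, and
       the high word is the carry out.  No flags register is assumed. */
    static inline uint32_t addc32(uint32_t a, uint32_t b, uint32_t carry_in,
                                  uint32_t *carry_out)
    {
        uint64_t sum = (uint64_t)a + b + carry_in;
        *carry_out = (uint32_t)(sum >> 32);   /* 0 or 1 */
        return (uint32_t)sum;
    }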
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #176 on: November 26, 2022, 02:22:53 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat and discuss the subject dispassionately, point out the elephant in the room!

Bullshit.

It's frequently my job to "rock the boat" by proposing changes and improvements to processes, libraries, programming languages, and even new machine code instructions. You can find my name for example in the credits for the RISC-V base instruction set and the B and V extensions.

Your proposed changes address ONE aspect of the problem:

- does the change make a programmer's life slightly more convenient?

You totally ignore every other aspect, such as:

- can the proposed feature be efficiently implemented on the target devices?

- is it a significant improvement vs using a library (function, macro) or maybe code generator instead?

- is the product of the improvement and the frequency of use sufficient to justify the incremental cumulative increase in complexity of the language, compiler, manuals, training?

No, take a good look, you'll see my remark has some basis.

For an engineer, you're not very thorough or objective, you'll find no suggestion from me about "convenience", you're paraphrasing.

You've also not asked me a single specific question about any feature on any target device; furthermore, at this stage I've been specifically focusing on grammar, not semantics, as I said several times.

Your position is pessimistic, discouraging, not open minded, this is nothing to do with your undoubted credentials either, it is you, your hostile tone. This is just a discussion and you're frothing and fuming.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6253
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #177 on: November 26, 2022, 03:02:10 pm »
I used a naked "8-pin Arduino" in my Vetinari clock, because that was the quick way to get an undemanding result.

But when making something that required very low power consumption, I simply wrote C to peek/poke the atmega328 registers. That avoided me having to work out how Arduino libraries might be frustrating me.
That's exactly why ATmega32u4 is one of my favourite 8-bitters.  It has native full-speed USB (12 Mbit/s, can do 1 Mbyte/sec even over USB Serial), but still is completely programmable without any HAL or similar, on bare metal, using nothing but avr-gcc (and avr-libc, if you really want), in (freestanding) C or C++.

I note that one of the desires there was "give easy access to the flags". I pointed out that many ISAs don't even *have* flags.
Yep.  In particular, you helped me realize I did not actually desire access to flags, just multiple function result values, to fulfill my needs.

An extremely common pattern for functions is to return a value or an error.  Obviously, with scalar return types, when you return a single scalar, some values must be reserved for error codes.  (That can get messy, like it once did in the Linux kernel, where certain filesystem drivers could not handle reads over 2³¹ bytes (due to widespread use of int), and returned negative values.  Now, the Linux kernel limits all single read() and write() syscalls to 2³¹ bytes less one page automagically.)  Having separate value and status returns is sometimes more robust, even if the status return is just a single "OK"/"Error" flag.

However, SysV ABI on x86-64 (LP64, so ints are 32-bit, long and pointers are 64-bit) as used on Linux, macOS, FreeBSD, and others, already supports this (by using rax and rdx registers for 128-bit non-FP return values), so code like
    typedef struct { long x, y; } lpair;
    lpair somefunc(long x, long y) { return (lpair){ .x = x, .y = y }; }
compiles to
    somefunc:
        mov  rax, rdi
        mov  rdx, rsi
        ret
because rdi and rsi are the first two (non-FP) parameters, and the pair of return values are returned in rax and rdx.
This suffices for my needs, although it is slightly odd-looking at first, wrapping return values into structs.  It would be nicer to have syntactic sugar for this and not require struct use, but this works too.

Because this is architecture ABI, any compiler for this architecture can use this.  Other ABIs, however, need to do annoying "emulation" by e.g. passing a pointer to the additional return values, similar to how larger (struct-type) result values are passed.  A compiler can obviously use its own function call ABI internally, but it does have to have some kind of foreign function interface* for bindings to binary libraries (if used on hosted environments/full OSes, and not just bare metal embedded).  It might be useful to use a rarely used register for the status/error return, so that it is not so often clobbered by ordinary code.
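
For illustration, the "emulated" form on such an ABI ends up looking something like this (a sketch only; the name and the status convention are made up):

    /* Without a second integer return register, the extra return value is
       passed back through a pointer parameter instead, the same pattern used
       for larger struct-type returns. */
    long somefunc2(long x, long y, int *status)
    {
        *status = 0;          /* 0 = OK, nonzero = error; purely illustrative */
        return x + y;
    }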

Simply put, I wasn't hankering so much for syntactic sugar, but for more sensible ABIs like SysV on x86-64.  Brucehoult put me straight on this, albeit indirectly.  :-+



* Part of the reason I like using Python for graphical user interfaces is that it has a very nice interface for calling native code, without needing any native code support; using pure Python code for the interface.  Certain libraries, like glib's GObject introspection for C, provide introspection information (.gir files), which means full interfaces can be constructed without any bespoke code (machine or Python), using the introspection information only.  It also means that when the library and the corresponding introspection file are updated, the new features are immediately available in Python also.  Very useful.

Because native machine code is dynamically loaded for access from Python, this also provides a perfect mechanism for license separation.  You can keep the user interface open source, modifiable by end users, but keep all Sekret Sauce proprietary code in your binary-only dynamic library.  In my experience, this provides the optimal balance between user access and proprietary requirements when going full open source isn't viable, e.g. for commercial reasons.
 
The following users thanked this post: DiTBho

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6253
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #178 on: November 26, 2022, 03:04:57 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat and discuss the subject dispassionately, point out the elephant in the room!

Bullshit.

It's frequently my job to "rock the boat" by proposing changes and improvements to processes, libraries, programming languages, and even new machine code instructions. You can find my name for example in the credits for the RISC-V base instruction set and the B and V extensions.

Your position is pessimistic, discouraging, not open minded, this is nothing to do with your undoubted credentials either, it is you, your hostile tone. This is just a discussion and you're frothing and fuming.
Oh come on, now you're just sounding like a petulant child that is disappointed that someone is pointing out a flaw in their Perfect Plan yourself; one that is expecting adulation and praise.  Get your ego out of the discussion, and concentrate on the subject matter instead.
 
The following users thanked this post: MK14, JPortici, pcprogrammer

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #179 on: November 26, 2022, 03:45:20 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat and discuss the subject dispassionately, point out the elephant in the room!

Bullshit.

It's frequently my job to "rock the boat" by proposing changes and improvements to processes, libraries, programming languages, and even new machine code instructions. You can find my name for example in the credits for the RISC-V base instruction set and the B and V extensions.

Your position is pessimistic, discouraging, not open minded, this is nothing to do with your undoubted credentials either, it is you, your hostile tone. This is just a discussion and you're frothing and fuming.
Oh come on, now you're just sounding like a petulant child that is disappointed that someone is pointing out a flaw in their Perfect Plan yourself; one that is expecting adulation and praise.  Get your ego out of the discussion, and concentrate on the subject matter instead.

So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers? Simply bleating (as some are) that "there's nothing wrong with C, I'm satisfied with C, discussing it always leads nowhere" is pointless; anyone who's not the slightest bit interested in possibilities can just ignore the thread, isn't that reasonable?
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6253
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #180 on: November 26, 2022, 04:11:09 pm »
Bignums aren't easy in portable C :)
If your limbs are of uint32_t type, you can use uint_fast64_t for the intermediate results.  On architectures without a 32×32=64-bit unsigned multiplication or 64/32=32+32 unsigned division with remainder, the machine code will be pretty inefficient, but it will work.

uint32_t, uint_fast64_t et cetera are declared in <stdint.h>, which is provided by the compiler (since C99, so for over two decades now), and is available even in freestanding environments where the standard C library is not available.

So, they ackshually are.  Efficient bignums, on the other hand...
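
A minimal sketch of what I mean, assuming uint32_t limbs stored least significant first (the function name is just for illustration):

    #include <stdint.h>
    #include <stddef.h>

    /* Multiply an n-limb bignum (least significant limb first) by a 32-bit
       value, in place, returning the final carry.  The intermediate product
       always fits in uint_fast64_t, so this is portable C99. */
    uint32_t bignum_mul_u32(uint32_t *limb, size_t n, uint32_t factor)
    {
        uint_fast64_t carry = 0;
        for (size_t i = 0; i < n; i++) {
            uint_fast64_t p = (uint_fast64_t)limb[i] * factor + carry;
            limb[i] = (uint32_t)p;
            carry   = p >> 32;
        }
        return (uint32_t)carry;
    }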

For binary (integer, fixed point, floating point) to decimal (string) conversions and vice versa, on 32-bit architectures (embedded ARMs), I like to use something I call m28, an array of uint32_t's where the array index zero is just next to the decimal point, and grows away from the decimal point; and each 32-bit unsigned integer contains 28 bits of the composite bignum.  (m28i is for integer parts, least significant limb first; and m28f for fractional parts, most significant limb first.)

This means that multiplying a bignum by an integer between 2 and 15, inclusive, only requires one 32×32=32-bit multiplication per limb; and dividing a bignum by an integer between 2 and 15, inclusive, only one 32-bit division with remainder, or 32-bit division without remainder and a multiplication, per limb.
Addition and subtraction of bignums does need an AND mask (by 0x0FFFFFFF) and a 28-bit right shift and two additions/subtractions per limb (due to carry), or an addition/subtraction and a conditional increment/decrement.

This is amazingly efficient.  (I just need to find out how to avoid having to have an array of all powers of ten precomputed, to implement float/double conversion support with strict upper limit on the memory use during conversion, with exact correct LSB rounding in place.  With floats, it's only 83 constants, requiring max. 2988 bytes of Flash/ROM, so maybe it would be acceptable.. Undecided.)

In essence, I'm spending 14.3% of memory extra to make the computation simpler and more effective for single digit octal/decimal/hexadecimal extraction/insertion, but it turns out it's worth it on typical 32-bit architectures (ARM cores, RISC-V, using Compiler Explorer/godbolt.org to experiment with).
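
To make that concrete, here is a rough sketch of the multiply-by-small-constant case for the integer part (m28i, least significant limb first); the real code has more bookkeeping, and the names are illustrative only:

    #include <stdint.h>
    #include <stddef.h>

    #define M28_MASK  0x0FFFFFFFu   /* each limb holds 28 value bits */

    /* Multiply an m28i integer part by a small constant 2..15, in place,
       returning the carry out of the top limb.  With a 4-bit multiplier,
       limb*m + carry always fits in 32 bits, so one plain 32x32=32-bit
       multiplication per limb is enough. */
    uint32_t m28i_mul_small(uint32_t *limb, size_t n, uint32_t m)
    {
        uint32_t carry = 0;
        for (size_t i = 0; i < n; i++) {
            uint32_t v = limb[i] * m + carry;   /* at most 28+4 = 32 bits */
            limb[i] = v & M28_MASK;
            carry   = v >> 28;
        }
        return carry;
    }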
 
The following users thanked this post: MK14, DiTBho

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #181 on: November 26, 2022, 04:19:01 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?

There is very little (if anything) about language grammar that could be useful to MCU developers.

If extensibility is important, then domain specific libraries are the way to go.

Your initial questions were about far more than grammar, and therefore potentially more interesting. That's why people contributed.

You have consistently failed to understand that the main topic of importance is the runtime behaviour (in a broad sense) of something expressed in a language. Anything that improves the runtime behaviour is potentially important.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: Nominal Animal, DiTBho, pcprogrammer

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6253
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #182 on: November 26, 2022, 04:27:32 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
I'm saying that they are not at all important to microcontroller or embedded developers at all.

I am saying that you are trying to solve a problem by daydreaming about irrelevant stuff, when you could be working on making things actually better.

Are you sure you're not a Windows C# enthusiast who dreams of creating a C#-like language that makes all embedded devices look like Windows or at least .Net runtime –– because "Windows is basically the entire world, after all", and being adulated for it?  You definitely sound like one.  I've met many, and all have either crashed or burned, or keep making unreliable unsecure shit and getting by with their social skills alone.
 
The following users thanked this post: DiTBho, pcprogrammer

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #183 on: November 26, 2022, 05:38:04 pm »
There is no surprise that there are engineers quite content with C, or even committed to C.

When talking about low level programming languages –– which is what I understand 'a hardware "oriented" programming language' to mean ––, C is just the one with the best proven track record decades long.  It isn't that great, it's just the 'benchmark' one for others, due to its widespread use and role in systems programming and embedded development.

For examples of nearly bug-free programs written in C in the systems programming domain, go check out D. J. Bernstein's djbdns, qmail, daemontools, cdb.  This is the guy behind Curve25519, having released it in 2005.

Like it, dislike it, doesn't matter, C is just a tool.  But as a tool, its features and track record are significant.  So are its deficiencies, and that means any real effort to do better is valuable.

In comparison, C# is a managed language. .NET Micro requires at least 256k of RAM. .NET nanoFramework requires at least 64k of RAM, and runs on Cortex-M and RISC-V (ESP32-C3) cores.  So, perhaps suitable for medium to large embedded devices, but decidedly unsuitable for small ARMs and anything less than 32-bit architectures.

Ada can be used to program AVR 8-bit microcontrollers (see AVR-Ada), but it is still relatively little used.  One possible reason is that while GCC GNAT is GPL3+ licensed with a runtime library exception, AdaCore sells GNAT Pro, and the FSF/GCC GNAT is seen as "inferior", with the "proper" version being the sole product of a commercial company.  (Or maybe that's just me.)

I get that some consider this pointless

No, that's not it at all.  Not pointless, more like bass-ackwards.  We want the results too; we have just seen your approach before, and it leads nowhere.  We're trying to steer you to not repeat that, but actually produce something interesting.

If you start a language design from scratch, you must understand the amount of design choices already made for existing languages.  The ones in languages that have survived use in anger, are the ones where the choices support a programming paradigm the users find intuitive and effective.

Why did DiTBHo not start from scratch, and instead pared down C to a subset with some changes and additions, to arrive at their my-C, designed for strictly controlled and enforced embedded use cases?  Because they needed a tool fit for a purpose, and it was a straightforward way to achieve it.  Results matter.

Why did SiliconWizard's Design a better "C" thread 'not go anywhere'?  It just sprawled around, with individual features and other languages discussed.  In fact, it really showed how complicated and hard it is to do better than C from scratch; with other languages like Ada discussed but nobody knowing exactly why they never got as much traction as C.  Just consider this post by brucehoult about midway in the thread, about how C with its warts and all still maps to different hardware so well.

Me, I have worked on replacing the standard C library with something better.  Because the C standard defines the freestanding environment, where the C standard library is not available, in quite some detail –– unlike say C++, which also has the same concept, but leaves it basically completely up to implementations to define what it means ––, this is doable.  I aim to fix many of the issues others have with C.  With C23 around the corner, the one change I think might actually make a difference is for arrays not to decay to pointers, and instead conceptually use arrays everywhere to describe memory ranges.  Even just allowing type variation based on a later variable in the same argument list would make it possible to replace buffer overrun prone standard library functions with almost identical replacements that would allow the C compiler to detect buffer under- and overruns at compile time.  It would only be a small addition, perhaps a builtin, to make it possible to prove via static analysis that all memory accesses are valid.
In other words, I'm looking to change the parts of C that hinder me or others, not start from scratch.
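As a sketch of the flavour I mean, using only existing C99 parameter syntax (the function name is made up; the missing piece is allowing the size to come after the buffers, as in the existing standard library interfaces):

    #include <stddef.h>

    /* With the length declared before the buffers, C99 VLA parameter syntax
       already tells the compiler the intended extent, and some compilers can
       warn when a too-small array is passed. */
    size_t copy_n(size_t len, char dst[static len], const char src[static len])
    {
        for (size_t i = 0; i < len; i++)
            dst[i] = src[i];
        return len;
    }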

Am I a C fanboi?  No.  If you look at my posting history, you'll see that I actually recommend using an interpreted language, currently Python, for user interfaces (for multiple reasons).

I currently use C for some embedded (AVRs, mainly), and a mixed C/C++ freestanding environment for embedded ARM development; I also use POSIX C for systems programming in Linux (mostly on x86-64).  (I sometimes do secure programming, dealing with privileges and capabilities; got some experience as a sysadmin at a couple of universities, and making customized access solutions for e.g. when you have a playground with many users at different privilege levels, and subsections with their own admins, including sub-websites open to the internet.  It's not simple when you're responsible for ensuring nothing leaks that shouldn't.)

Okay, so if we believe that a ground-up design from scratch is unlikely to lead to an actual project solving the underlying problems OP (Sherlock Holmes) wants to solve, what would?

Pick a language, and a compiler, you feel you can work with.  It could be C, it could be Ada, it could be whatever you want.  Obviously, it should have somewhat close syntax to what you prefer, but it doesn't have to be an exact match.  I'll use C as the language example below for simplicity only; feel free to substitute it with something else.

Pick a problem, find languages that solve it better than C, or invent your own new solution.  Trace it down to the generated machine code, and find a way to port it back to C, replacing the way C currently solves it.  Apply it in real life, writing real-world code that heavily uses that modified feature.  Get other people to comment on it, and maybe even test it.  Find out if the replacement solution actually helps with real-world code.  That often means getting an unsuspecting victim, and having them re-solve a problem using the modified feature, using only your documentation of the feature as a guide.

Keep a journal of your findings.

At some point, you find that you have enough of those new solutions to construct a completely new language.  At this point, you can tweak the syntax to be more to your liking.  Start writing your own compiler, but also document the language the compiler works with, precisely.  As usual, something like ABNF is sufficient for syntax, but for the paradigm, the approach, I suggest writing additional documentation explaining your earlier findings, and the solution approach.  Small examples are gold here.  The idea is that other people, reading this additional documentation, can see how you thought, so they can orient themselves to best use the new language.

Theory is nice, but practical reality always trumps theory.  Just because the ABNF of a language looks nice, doesn't mean it is an effective language.  As soon as you can compile running native binaries, start creating actual utilities – sort, grep, bc for example –, and look at the machine code the compiler generates.  Just because the code is nice and the abstractions just perfect, does not mean they are fit for generating machine code.  Compare the machine code to what the original language and other languages produce, when optimizations are disabled (for a more sensible comparison).

During this process, do feel free to occasionally branch into designing your language from scratch.  If you keep tabs on your designs as your understanding evolves, you'll understand viscerally what the 'amount of design choices' I wrote above really means.  It can be overwhelming, if you think of it, but going at it systematically, piece by piece, with each design choice having an explanation/justification in your journal and/or documentation, it can be done, and done better than what we have now.

Finally: I for one prefer passionate, honest, detailed posts over dispassionate politically correct smooth-talk.


That's a very good post, thank you for spending the time to articulate that.

I don't really have an "approach" here, this thread was expected to just be a good natured chin wag about languages, likes, dislikes, aspirations, with - ideally - constructive input from professional MCU developers: input about what they'd like to see done better in a language or what things they can't do that they'd like to be able to do, so that was where I'd expected this thread to go.

I have used C professionally for decades, I used to work at the Financial Times and the London Stock Exchange, designing data feed handlers for umpteen protocols and data formats (X.25, async, IBM Bisync, TCP etc) in the 1980s on Stratus hardware fault tolerant computers, I led teams, I was a lead designer.

I've also designed and developed high performance shared memory data storage technology for Windows NT, with a concurrent heap that can support multiple processes, each with multiple threads, all allocating/freeing 24/7 from a common shared heap. I designed and developed that as a large Win32 API that used a single source code base to build 32-bit and 64-bit versions of the API. All of that code (many thousands of lines) was system level programming, interacting with the Windows kernel object manager and so on, and all of it was crafted to a good standard in C; I know the language well.

I mention this to underline the fact that I do understand the problem domain, I know C well, I've designed and built compilers and code generators in C, I'm not naïve as some of the comments here seem to want to insinuate.

So having said that allow me to respond:

When talking about low level programming languages –– which is what I understand 'a hardware "oriented" programming language' to mean ––, C is just the one with the best proven track record decades long.  It isn't that great, it's just the 'benchmark' one for others, due to its widespread use and role in systems programming and embedded development.

For examples of nearly bug-free programs written in C in the systems programming domain, go check out D. J. Bernstein's djbdns, qmail, daemontools, cdb.  This is the guy behind Curve25519, having released it in 2005.


Well, as I say, I've written very sophisticated C heap APIs for concurrent threaded use, managing high speed incoming data feeds 24/7. That code had to be able to run 24/7, and it did; I had a set of principles and guidelines that helped to achieve that, some of these specifically to plug "holes" that - like it or not - were attributable to the language itself. So yes, one can write robust code in C, but one can write robust code in assembler too! That isn't an argument for assembler over C; the reality is that one can write good code in C, but not because of C - rather in spite of C.

Like it, dislike it, doesn't matter, C is just a tool.  But as a tool, its features and track record are significant.  So are its deficiencies, and that means any real effort to do better is valuable.

In comparison, C# is a managed language. .NET Micro requires at least 256k of RAM. .NET nanoFramework requires at least 64k of RAM, and runs on Cortex-M and RISC-V (ESP32-C3) cores.  So, perhaps suitable for medium to large embedded devices, but decidedly unsuitable for small ARMs and anything less than 32-bit architectures.

Ada can be used to program AVR 8-bit microcontrollers (see AVR-Ada), but it is still relatively little used.  One possible reason is that while GCC GNAT is GPL3+ licensed with a runtime library exception, AdaCore sells GNAT Pro, and the FSF/GCC GNAT is seen as "inferior", with the "proper" version being the sole product of a commercial company.  (Or maybe that's just me.)


C# is of course too large to run on resource constrained devices; the CLR is pretty sophisticated, as is the garbage collector. It was also never aimed at MCUs, that's not a consideration the C# team have. I've used nanoFramework and I've used TinyCLR too, these are fascinating and I was contributing to these once. The fact is there is no immediate prospect of these supporting generics or async/await, and that makes them very restrictive indeed, as 99% of .Net library packages use generics and/or async. If MS would define a precise subset of .Net for MCUs then that would help, but they haven't, so the goals of these various initiatives are vague to put it mildly.

No, that's not it at all.  Not pointless, more like bass-ackwards.  We want the results too; we have just seen your approach before, and it leads nowhere.  We're trying to steer you to not repeat that, but actually produce something interesting.

If you start a language design from scratch, you must understand the amount of design choices already made for existing languages.  The ones in languages that have survived use in anger, are the ones where the choices support a programming paradigm the users find intuitive and effective.

Why did DiTBHo not start from scratch, and instead pared down C to a subset with some changes and additions, to arrive at their my-C, designed for strictly controlled and enforced embedded use cases?  Because they needed a tool fit for a purpose, and it was a straightforward way to achieve it.  Results matter.


Perhaps we need more clarity over the term "from scratch". I for one, have referred to PL/I several times for some of the strengths that language offered, that seems to me a good starting point, not starting from C I agree, but it's not starting from scratch by any means.

Okay, so if we believe that a ground-up design from scratch is unlikely to lead to an actual project solving the underlying problems OP (Sherlock Holmes) wants to solve, what would?


A grammar is necessary (not sufficient of course), unavoidable in fact, so we need a grammar, and I'm of the opinion that a grammar based on PL/I subset G is extremely attractive for reasons I've already articulated. A criticism I have of many, many newer languages - Rust, Hare, Zig, Go, Swift, even C# - is that they are based on the C grammar and therefore do - unfortunately - stand to repeat some of the sins of the past.

Theory is nice, but practical reality always trumps theory.  Just because the ABNF of a language looks nice, doesn't mean it is an effective language.  As soon as you can compile running native binaries, start creating actual utilities – sort, grep, bc for example –, and look at the machine code the compiler generates.  Just because the code is nice and the abstractions just perfect, does not mean they are fit for generating machine code.  Compare the machine code to what the original language and other languages produce, when optimizations are disabled (for a more sensible comparison).


I suppose this might sound odd, but the generation of code, instruction sequences, object code that links and runs is - for me - rather academic. I've done it; it's obviously important, but it isn't the most fruitful place to innovate in my opinion: code is code, instructions are instructions. Yes, there is a lot involved, but it's primarily about implementation and optimization, not language; the language begins with the grammar, its expressivity.

One likely wouldn't see a huge glaring difference in the generated code from a good C compiler, a good Ada compiler and a good C++ compiler; at the end of the day they will look similar, loading registers, performing arithmetic, calculating addresses and so on.

Of course I accept that until one has produced linkable runnable code one doesn't really have a compiler, but I'm aware of that, I don't disagree at all.

What's frustrating in this thread is that there are a few who just repeat the mantra "but you can do that in C if..."; clearly those advocating that position are content with C and see no justification in even discussing a new language, in which case what's the point of participating in a thread about a new language!

I enjoyed your post, it is the kind of discourse that I hoped to see.











« Last Edit: November 26, 2022, 05:59:20 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #184 on: November 26, 2022, 05:52:13 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
There is very little (if anything) about language grammar that could be useful to MCU developers.

How did you establish that? What evidence supports this claim? Here's a post by an engineer that proves you wrong, I quote:

Quote
The first challenge I ran into with Rust was getting my firmware to run on hardware varying from 4-button dev-kit PCBs to the left/right halves of a wireless split to a single Atreus.
Varying the features of firmware at compile-time is known as “conditional compilation”. (It needs to be done at compile-time rather than run-time because microcontrollers have limited program space, roughly 10–100kB in my case.). Rust’s solution to this problem is “features”.

Quote
Conceptually Zig’s inline for is solving the same problem that Rust’s syntax macro solves (generating type-specific code at compile-time), but without the side quest of learning a lil’ pattern matching/expansion language. Rust has many language features and they’re all largely disjoint from each other, so knowing some doesn’t help me guess the others.

Quote
Conversely, this “consistency” principle also explains why I had such an easy time picking up Zig — it absolutely excels in this department. Not only are there many fewer features to learn in the first place, they seem to all fit together nicely: The comptime and inline for keywords, for example, allowed me to leverage at compile-time all the looping, conditions, arithmetic, and control flow I wanted using the syntax and semantics I’d already learned — Zig!

Anyway, moving on...


If extensibility is important, then domain specific libraries are the way to go.

They are certainly a way to go, I'm not advocating against libraries! A language that had no "volatile" specifier could be fine if it relied on a library to manipulate such data, but does that mean you therefore disapprove of C having a "volatile" keyword? Which do you think is preferable, a keyword for it or a library? Not to mention the obvious: libraries also have to be written, they don't grow on trees!

Your initial questions were about far more than grammar, and therefore potentially more interesting. That's why people contributed.

You have consistently failed to understand that the main topic of importance is the runtime behaviour (in a broad sense) of something expressed in a language. Anything that improves the runtime behaviour is potentially important.

Please do not venture to speculate what you think I understand or do not understand, that's a disparaging remark, an ad-hominem.



« Last Edit: November 26, 2022, 06:05:05 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #185 on: November 26, 2022, 05:56:47 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
I'm saying that they are not at all important to microcontroller or embedded developers at all.

Sweeping speculation, if you have evidence I'd like to see it, otherwise it's opinion and we all have those. Also you too might find this of interest.

I am saying that you are trying to solve a problem by daydreaming about irrelevant stuff, when you could be working on making things actually better.

Are you sure you're not a Windows C# enthusiast who dreams of creating a C#-like language that makes all embedded devices look like Windows or at least .Net runtime –– because "Windows is basically the entire world, after all", and being adulated for it?  You definitely sound like one.  I've met many, and all have either crashed or burned, or keep making unreliable unsecure shit and getting by with their social skills alone.

Very good, a strawman and an ad-hominem all in one; if you can't argue logically, rationally, factually then do not participate.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #186 on: November 26, 2022, 06:22:25 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
There is very little (if anything) about language grammar that could be useful to MCU developers.

How did you establish that? what evidence supports this claim?

Observation over the decades. Basically MCU developers can swap grammars relatively easily and quickly. What trips them up is the behaviour, i.e. how the concepts expressed in a grammar (any grammar!) map onto the MCU's runtime behaviour.

Quote
Here's a post by an engineer that proves you wrong, I quote:

Quote
The first challenge I ran into with Rust was getting my firmware to run on hardware varying from 4-button dev-kit PCBs to the left/right halves of a wireless split to a single Atreus.
Varying the features of firmware at compile-time is known as “conditional compilation”. (It needs to be done at compile-time rather than run-time because microcontrollers have limited program space, roughly 10–100kB in my case.). Rust’s solution to this problem is “features”.

Quote
Conceptually Zig’s inline for is solving the same problem that Rust’s syntax macro solves (generating type-specific code at compile-time), but without the side quest of learning a lil’ pattern matching/expansion language. Rust has many language features and they’re all largely disjoint from each other, so knowing some doesn’t help me guess the others.

Quote
Conversely, this “consistency” principle also explains why I had such an easy time picking up Zig — it absolutely excels in this department. Not only are there many fewer features to learn in the first place, they seem to all fit together nicely: The comptime and inline for keywords, for example, allowed me to leverage at compile-time all the looping, conditions, arithmetic, and control flow I wanted using the syntax and semantics I’d already learned — Zig!

One data point (and a very arguable data point at that) does not constitute a convincing argument.

Quote

If extensibility is important, then domain specific libraries are the way to go.

They are certainly a way to go, I'm not advocating against libraries! A language that had no "volatile" specifier could be fine if it relied on a library to manipulate such data, but does that mean you therefore disapprove of C having a "volatile" keyword? Which do you think is preferable, a keyword for it or a library?

The significance of a keyword is not that it is one symbol (of many symbols) in a grammar. The significance is how it is mapped onto runtime behaviour. The objective is to ensure that multiple sources that cause data mutation (threads, hardware registers, interrupts etc) have defined predictable useful behaviour. All that is necessary is that primitives (language and hardware) exist to express the necessary concepts.

There are, of course, several such low-level mechanisms described in the literature over the decades, and different languages include different low-level mechanisms. Those low-level mechanisms are usually "wrapped" into several more useful high-level conceptual mechanisms in the form of libraries expressing useful "Design Patterns", e.g. Posix threads or Doug Lea's Java Concurrency Library.

"Naked" use of the primitives (rather than well designed and debugged libraries of design patterns) is a frequent source of subtle unrepeatable errors.

Notice that the language grammar is completely irrelevant in that respect; the runtime behaviour is what's important.
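
To make that concrete (a sketch only; the register names, addresses and bit mask are made up): the volatile access is the primitive, and what callers actually rely on is the behaviour wrapped behind a small, well-defined interface, not the keyword:

    #include <stdint.h>

    /* The primitive: volatile-qualified accesses to memory-mapped registers.
       Addresses and layout here are purely illustrative. */
    #define UART0_STATUS  (*(volatile uint32_t *)0x40001000u)
    #define UART0_DATA    (*(volatile uint32_t *)0x40001004u)
    #define UART_TX_READY 0x01u

    /* The "library" wrapping it: callers depend on the defined behaviour
       (blocking byte transmit), not on how the primitive is spelled. */
    static inline void uart0_putc(uint8_t c)
    {
        while ((UART0_STATUS & UART_TX_READY) == 0)
            ;                       /* spin until the transmitter is ready */
        UART0_DATA = c;
    }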

Quote
Your initial questions were about far more than grammar, and therefore potentially more interesting. That's why people contributed.

You have consistently failed to understand that the main topic of importance is the runtime behaviour (in a broad sense) of something expressed in a language. Anything that improves the runtime behaviour is potentially important.

Please do not venture to speculate what you think I understand or do not understand, that's a disparaging remark, an ad-hominem.

I call 'em as I see 'em. Others are making similar observations.

I don't think you understand what an ad-hominem argument is and isn't.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: MK14

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #187 on: November 26, 2022, 06:30:04 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
There is very little (if anything) about language grammar that could be useful to MCU developers.

How did you establish that? what evidence supports this claim?

Observation over the decades. Basically MCU developers can swap grammars relatively easily and quickly. What trips them up is the behaviour, i.e. how the concepts expressed in a grammar (any grammar!) map onto the MCU's runtime behaviour.

Quote
Here's a post by an engineer that proves you wrong, I quote:

Quote
The first challenge I ran into with Rust was getting my firmware to run on hardware varying from 4-button dev-kit PCBs to the left/right halves of a wireless split to a single Atreus.
Varying the features of firmware at compile-time is known as “conditional compilation”. (It needs to be done at compile-time rather than run-time because microcontrollers have limited program space, roughly 10–100kB in my case.). Rust’s solution to this problem is “features”.

Quote
Conceptually Zig’s inline for is solving the same problem that Rust’s syntax macro solves (generating type-specific code at compile-time), but without the side quest of learning a lil’ pattern matching/expansion language. Rust has many language features and they’re all largely disjoint from each other, so knowing some doesn’t help me guess the others.

Quote
Conversely, this “consistency” principle also explains why I had such an easy time picking up Zig — it absolutely excels in this department. Not only are there many fewer features to learn in the first place, they seem to all fit together nicely: The comptime and inline for keywords, for example, allowed me to leverage at compile-time all the looping, conditions, arithmetic, and control flow I wanted using the syntax and semantics I’d already learned — Zig!

One data point (and a very arguable data point at that) does not constitute a convincing argument.

I agree, but it is so much better than no data points - if you get my drift. Your observations are fine, but all observations are interpretations of data; you cannot exclude your own biases and prejudices.

Quote

If extensibility is important, then domain specific libraries are the way to go.

They are certainly a way to go, I'm not advocating against libraries! A language that had no "volatile" specifier could be fine if it relied on a library to manipulate such data, but does that mean you therefore disapprove of C having a "volatile" keyword? Which do you think is preferable, a keyword for it or a library?

The significance of a keyword is not that it is one symbol (of many symbols) in a grammar. The significance is how it is mapped onto runtime behaviour. The objective is to ensure that multiple sources that cause data mutation (threads, hardware registers, interrupts etc) have defined predictable useful behaviour. All that is necessary is that primitives (language and hardware) exist to express the necessary concepts.

There are, of course, several such low-level mechanisms described in the literature over the decades, and different languages include different low-level mechanisms. Those low-level mechanisms are usually "wrapped" into several more useful high-level conceptual mechanisms in the form of libraries expressing useful "Design Patterns", e.g. Posix threads or Doug Lea's Java Concurrency Library.

"Naked" use of the primitives (rather than well designed and debugged) libraries of design patterns is a frequent source of subtle unrepeatable errors.

Notice that the language grammar is completely irrelevant in that respect; the runtime behaviour is what's important.

Yes you keep saying that but you never answered the question - if libraries are the "way to go" do you think C should not have a "volatile" keyword and instead rely on a library? if you don't understand the question then say so, if you do then is your answer "yes" or "no" or "I don't know"?

Quote
Your initial questions were about far more than grammar, and therefore potentially more interesting. That's why people contributed.

You have consistently failed to understand that the main topic of importance is the runtime behaviour (in a broad sense) of something expressed in a language. Anything that improves the runtime behaviour is potentially important.

Please do not venture to speculate what you think I understand or do not understand, that's a disparaging remark, an ad-hominem.

I call 'em as I see 'em. Others are making similar observations.

I don't think you understand what an ad-hominem argument is and isn't.

A true classic oxymoron "I don't think you understand what an ad-hominem argument is" a bit like cheating on one's ethics final!

Let me say this: if you prefer debating to discussing the subject of the thread, then feel free to start a thread or pick an existing one in an appropriate section of the forum and I'll give you some free lessons.

« Last Edit: November 26, 2022, 06:40:47 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #188 on: November 26, 2022, 07:15:47 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
There is very little (if anything) about language grammar that could be useful to MCU developers.

How did you establish that? what evidence supports this claim?

Observation over the decades. Basically MCU developers can swap grammars relatively easily and quickly. What trips them up is the behaviour, i.e. how the concepts expressed in a grammar (any grammar!) map onto the MCU's runtime behaviour.

Quote
Here's a post by an engineer that proves you wrong, I quote:

Quote
The first challenge I ran into with Rust was getting my firmware to run on hardware varying from 4-button dev-kit PCBs to the left/right halves of a wireless split to a single Atreus.
Varying the features of firmware at compile-time is known as “conditional compilation”. (It needs to be done at compile-time rather than run-time because microcontrollers have limited program space, roughly 10–100kB in my case.). Rust’s solution to this problem is “features”.

Quote
Conceptually Zig’s inline for is solving the same problem that Rust’s syntax macro solves (generating type-specific code at compile-time), but without the side quest of learning a lil’ pattern matching/expansion language. Rust has many language features and they’re all largely disjoint from each other, so knowing some doesn’t help me guess the others.

Quote
Conversely, this “consistency” principle also explains why I had such an easy time picking up Zig — it absolutely excels in this department. Not only are there many fewer features to learn in the first place, they seem to all fit together nicely: The comptime and inline for keywords, for example, allowed me to leverage at compile-time all the looping, conditions, arithmetic, and control flow I wanted using the syntax and semantics I’d already learned — Zig!

One data point (and a very arguable data point at that) does not constitute a convincing argument.

I agree, but it is so much better than no data points - if you get my drift. Your observations are fine, but all observations are interpretations of data; you cannot exclude your own biases and prejudices.

Quote

If extensibility is important, then domain specific libraries are the way to go.

They are certainly a way to go, I'm not advocating against libraries! A language that had no "volatile" specifier could be fine if it relied on a library to manipulate such data, but does that mean you therefore disapprove of C having a "volatile" keyword? Which do you think is preferable, a keyword for it or a library?

The significance of a keyword is not that it is one symbol (of many symbols) in a grammar. The significance is how it is mapped onto runtime behaviour. The objective is to ensure that multiple sources that cause data mutation (threads, hardware registers, interrupts etc) have defined predictable useful behaviour. All that is necessary is that primitives (language and hardware) exist to express the necessary concepts.

There are, of course, several such low-level mechanisms described in the literature over the decades, and different languages include different low-level mechanisms. Those low-level mechanisms are usually "wrapped" into several more useful high-level conceptual mechanisms in the form of libraries expressing useful "Design Patterns", e.g. Posix threads or Doug Lea's Java Concurrency Library.

"Naked" use of the primitives (rather than well designed and debugged) libraries of design patterns is a frequent source of subtle unrepeatable errors.

Notice that the language grammar is completely irrelevant in that respect; the runtime behaviour is what's important.

Yes you keep saying that but you never answered the question - if libraries are the "way to go" do you think C should not have a "volatile" keyword and instead rely on a library? if you don't understand the question then say so, if you do then is your answer "yes" or "no" or "I don't know"?

Sigh. That's a false dichotomy.

Not only isn't your question the right question, it isn't even the wrong question. It is, however, a reflection of the point I've been trying (and failing) to get you to understand: the difference between syntax/language and semantics/meaning/behaviour. Most people here care deeply about the latter, but don't care about the former.

The behaviour I mentioned above is needed, the keyword isn't. Whatever syntax and primitives are used, they will usually be wrapped up in a library.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14464
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #189 on: November 26, 2022, 07:22:28 pm »
Even for the people who just want Carry, I highly doubt they are prepared for the different implementations of the carry flag with respect to subtraction and compare.

And it's simply not hard to express what you ACTUALLY WANT in portable C anyway. The compiler will automatically use the flags where available.

Bignums aren't easy in portable C :)

Well, yes they are, as just demonstrated.

Yep they are indeed.

I have implemented some kind of arbitrary precision library. It's able to deal with various base integer widths without a problem. I've admittedly used some of GCC's builtins to speed things up (such as the 'compute with overflow' kind of builtins), which themselves are reasonably portable if you stick to GCC, but I could have perfectly well done without them and made the code 100% standard C.
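
Something along these lines, for instance (a sketch of the pattern, not the actual library code; limbs least significant first):

    #include <stdint.h>
    #include <stddef.h>
    #include <stdbool.h>

    /* Add two n-limb numbers using GCC/Clang's __builtin_add_overflow; the
       compiler can lower the overflow checks to the carry flag where the
       target has one.  Returns the final carry out. */
    bool bignum_add(uint32_t *dst, const uint32_t *a, const uint32_t *b, size_t n)
    {
        bool carry = false;
        for (size_t i = 0; i < n; i++) {
            uint32_t t;
            bool c1 = __builtin_add_overflow(a[i], b[i], &t);
            bool c2 = __builtin_add_overflow(t, (uint32_t)carry, &dst[i]);
            carry = c1 || c2;       /* at most one of c1, c2 can be set */
        }
        return carry;
    }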
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #190 on: November 26, 2022, 07:54:13 pm »
So, what specifically did you want to say about language grammar, extensibility, features that could be helpful to MCU developers?
There is very little (if anything) about language grammar that could be useful to MCU developers.

How did you establish that? what evidence supports this claim?

Observation over the decades. Basically MCU developers can swap grammars relatively easily and quickly. What trips them up is the behaviour, i.e. how the concepts expressed in a grammar (any grammar!) map onto the MCU's runtime behaviour.

Quote
Here's a post by an engineer that proves you wrong, I quote:

Quote
The first challenge I ran into with Rust was getting my firmware to run on hardware varying from 4-button dev-kit PCBs to the left/right halves of a wireless split to a single Atreus.
Varying the features of firmware at compile-time is known as “conditional compilation”. (It needs to be done at compile-time rather than run-time because microcontrollers have limited program space, roughly 10–100kB in my case.). Rust’s solution to this problem is “features”.

Quote
Conceptually Zig’s inline for is solving the same problem that Rust’s syntax macro solves (generating type-specific code at compile-time), but without the side quest of learning a lil’ pattern matching/expansion language. Rust has many language features and they’re all largely disjoint from each other, so knowing some doesn’t help me guess the others.

Quote
Conversely, this “consistency” principle also explains why I had such an easy time picking up Zig — it absolutely excels in this department. Not only are there many fewer features to learn in the first place, they seem to all fit together nicely: The comptime and inline for keywords, for example, allowed me to leverage at compile-time all the looping, conditions, arithmetic, and control flow I wanted using the syntax and semantics I’d already learned — Zig!

One data point (and a very arguable data point at that) does not constitute a convincing argument.

I agree, but it is so much better than no data points - if you get my drift. Your observations are fine, but all observations are interpretations of data; you cannot exclude your own biases and prejudices.

Quote

If extensibility is important, then domain specific libraries are the way to go.

They are certainly a way to go, I'm not advocating against libraries! A language that had no "volatile" specifier could be fine if it relied on a library to manipulate such data, but does that mean you therefore disapprove of C having a "volatile" keyword? Which do you think is preferable, a keyword for it or a library?

The significance of a keyword is not that it is one symbol (of many symbols) in a grammar. The significance is how it is mapped onto runtime behaviour. The objective is to ensure that multiple sources that cause data mutation (threads, hardware registers, interrupts etc) have defined predictable useful behaviour. All that is necessary is that primitives (language and hardware) exist to express the necessary concepts.

There are, of course, several such low-level mechanisms described in the literature over the decades, and different languages include different low-level mechanisms. Those low-level mechanisms are usually "wrapped" into several more useful high-level conceptual mechanisms in the form of libraries expressing useful "Design Patterns", e.g. Posix threads or Doug Lea's Java Concurrency Library.

"Naked" use of the primitives (rather than well designed and debugged) libraries of design patterns is a frequent source of subtle unrepeatable errors.

Notice that the language grammar is completely irrelevant in that respect; the runtime behaviour is what's important.

Yes you keep saying that but you never answered the question - if libraries are the "way to go" do you think C should not have a "volatile" keyword and instead rely on a library? if you don't understand the question then say so, if you do then is your answer "yes" or "no" or "I don't know"?

Sigh. That's a false dichotomy.

Not only isn't your question the right question, it isn't even the wrong question. It is, however, a reflection of the point I've been trying (and failing) to get you to understand: the difference between syntax/language and semantics/meaning/behaviour. Most people here care deeply about the latter, but don't care about the former.

The behaviour I mentioned above is needed, the keyword isn't. Whatever syntax and primitives are used, they will usually be wrapped up in a library.

Very well, I'll tell you: the answer is in fact "no", because C with a volatile keyword is preferable to a library method invocation both syntactically and semantically, and for performance reasons too.
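To make the comparison concrete, here is a minimal sketch of the two forms in C - the register address and all the names are invented for illustration only:

Code: [Select]
#include <stdint.h>

/* Hypothetical memory-mapped status register at an illustrative address. */
#define STATUS_REG (*(volatile uint32_t *)0x40021000u)

/* Keyword form: the qualifier alone guarantees every read touches the hardware. */
uint32_t poll_status(void)
{
    return STATUS_REG;
}

/* Library form: the same guarantee wrapped in a helper - note the wrapper
   itself still has to say "volatile" somewhere underneath. */
static inline uint32_t read_reg32(volatile uint32_t *reg)
{
    return *reg;
}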

You can argue otherwise if you want, I'm all ears, but dismissing the question because you'd rather not dignify it with an answer isn't logical, it's evasive.

Now you're at it again with "most people here care deeply about the latter, but don't care about the former". This is no doubt another of your personal observations, and as I said, all observations involve interpretation, and all interpretation involves biases, prejudices and existing beliefs - upshot? It's a meaningless claim.

Now, what are examples of these semantic, behavioral concepts you "care deeply about"? I'm as interested in this as I am in grammar, but was focusing on grammar initially; if you want to start discussing behaviors then fine.

One cannot have a language without a grammar, so one must - somehow - identify a grammar to use; it is an essential part of a programming language. You seem to be saying it is irrelevant - well, if that were true, why not use assembler, or perhaps COBOL or RPG?

Finally, as I just pointed out but you seem to have missed, libraries do not grow on trees: they are written, and they are written using a language. You cannot wave away the language issue by replacing it with a library; all that does is move the code, it doesn't eliminate it.






« Last Edit: November 26, 2022, 07:58:28 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #191 on: November 26, 2022, 07:59:40 pm »
Even for the people who just want Carry, I highly doubt they are prepared for the different implementations of the carry flag with respect to subtraction and compare.

And it's simply not hard to express what you ACTUALLY WANT in portable C anyway. The compiler will automatically use the flags where available.

Bignums aren't easy in portable C :)

Well, yes they are, as just demonstrated.

Yep they are indeed.

I have implemented an arbitrary precision library of sorts. It's able to work with various base integer widths without a problem. I've admittedly used some of GCC's builtins to speed things up (such as the 'compute with overflow' kind of builtins), which themselves are reasonably portable if you stick to GCC, but I could perfectly well have done without them and made the code 100% standard C.
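For what it's worth, here is a minimal sketch of both styles - the portable-C carry idiom mentioned above, and the GCC/Clang 'compute with overflow' builtin variant. This is my own illustration, not code from that library:

Code: [Select]
#include <stdint.h>
#include <stdbool.h>

/* Portable add-with-carry: no flag register in sight, yet compilers
   typically lower this to an ADC-style sequence where the ISA has one. */
static uint32_t addc32(uint32_t a, uint32_t b, bool *carry)
{
    uint32_t sum = a + b + (*carry ? 1u : 0u);
    *carry = *carry ? (sum <= a) : (sum < a);
    return sum;
}

/* Same thing with the (non-standard but widely available) overflow builtins. */
static uint32_t addc32_builtin(uint32_t a, uint32_t b, bool *carry)
{
    uint32_t sum;
    bool c1 = __builtin_add_overflow(a, b, &sum);
    bool c2 = __builtin_add_overflow(sum, (uint32_t)(*carry ? 1u : 0u), &sum);
    *carry = c1 || c2;
    return sum;
}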

When I last looked at the problem, c1986, it had to be portable across machines, the language was much simpler and any given compiler targeted, at best, a small subset of machines.

My comment was based on erroneously ignoring the changes since then. Mea culpa.

I had implemented a 32 bit floating point package on a 6800 a decade earlier, in a medium level language. Without checking, I would have used the carry flag, which doesn't exist in C.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: MK14

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6253
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #192 on: November 26, 2022, 08:18:40 pm »
A grammar is necessary (not sufficient of course), unavoidable in fact so we need a grammar and I'm of the opinion that a grammar based on PL/I subset G is extremely attractive for reasons I've already articulated. A criticism I have of many, many newer languages like Rust, like Hare like Zig like Go like Swift even C# is that they are based on the C grammar and therefore do - unfortunately - stand to repeat some of the sins of the past.
I see grammar as an arbitrary choice, related more to what the designer considers "pretty" than anything else.
It is the language paradigm (generic approach to problem solving), and concepts, that matter.

If you start with a grammar, you are basically starting with the statement "This is the way all possible concepts in this programming language shall be expressed."

If you start at the conceptual level, you sketch out the features of the language, and can then choose a grammar that best suits the needs.

In my opinion, the only reason to start from the grammar upwards is if you 1) believe you already know everything you need to know about the language being designed, and/or 2) how it looks is of paramount importance to you.  (These are the two reasons I've seen in the wild, that is.)

Let's use my own gripes with C as a starting point, and examine how the above affects the design process and the result.

Because of embedded uses, I want to allow pointers to specific addresses and to individual objects, but I do not want them to be extensible to arrays, except via explicit constructs that also specify the size/range of the array.

I do not want to force the ABI to pass arrays as (origin, length) or (origin, step, length) tuples, because my aim is to be able to prove that all accesses are valid using static code analysis tools.  If a cosmic ray flips a bit and that causes a crash because of lack of runtime bounds checking, I'm okay with that.  (This, of course, is a design choice, and by no means the only one possible!)

I also like being able to redefine basic operators for custom object types.  I don't want to go whole-hog object-oriented, making the operator overloading object-specific; I'm fine with the operator overloading being purely based on static types only.  (I am aiming at small to medium-sized embedded devices, and the cost of the indirection via object pointers is too much for this niche, in my opinion.  Another design choice.)

I now cobble together something that seems to compile a small snippet of source code to pseudo-assembly.

Uh-oh, problem.  Because I do not force ABIs to explicitly pass a parameter describing the array length, I need a way to explicitly extract the origin and the length of an array.  I could use built-in functions or address and length operators.  Except that if I have already decided on a grammar, my choice is dictated by that grammar.  Oops.

In real life, both options have their upsides and downsides.  When the length is needed in a function where the array was passed as a parameter, we really do need to pass the length also.  There are several options for this particular design choice, but they all boil down to the fact that in a function specification, we need a way to specify that a parameter is the length of a specific array also passed as a parameter.  (With this information, my compiler can tell at compile-time whether the length operator or built-in function is allowed.)  This significantly affects the grammar, of course!
If my grammar was locked in already, I'd have to cobble together a less than optimal workaround.
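(As an aside, C99's variably-modified parameters already hint at one possible surface syntax for tying a length parameter to an array parameter - a sketch of the idea, not a proposal, and C itself does not actually enforce the link:)

Code: [Select]
#include <stddef.h>

/* The declaration states that 'len' describes 'data'; a new language could
   make such a link mandatory and statically checkable. */
int sum(size_t len, const int data[len])
{
    int total = 0;
    for (size_t i = 0; i < len; i++)
        total += data[i];
    return total;
}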

Even though I have piles of experience myself, I know I am not aware of all the features and details beforehand.  But I also know that I can construct the grammar as I go along, and collect and implement all the features of the new programming language.  Having the grammar be defined by the features, instead of vice versa, gives me the most leeway.

For example, if I wanted this language to be mostly compatible with C otherwise, I would make the address-of and length-of operators, so that those porting code would need to pay attention to every case where a pointer is used as an array or vice versa.  C already has an address-of operator &, but it might be confusing, and make the static code analysis more difficult.  Instead, I might choose say @ and #, or origin and length (keyword operators, like sizeof).  But I do not know yet; I would experiment with it in practice –– on unsuspecting victims, preferably –– to see if they grasp the concept intuitively, and therefore would be likely to apply this to write memory-safer code.  Locking in the grammar up front makes such experiments irrelevant; the decision has been made already.

Thus, the obvious conclusion from going at it grammar-first is that it is grandiose, futile, waste of time, or all three.



Do not forget that others are not commenting about you personally, they are commenting on your approach and what you describe in your output.

This is also why I use a pseudonym, instead of my real name: it helps me remember that any negative feedback I get is based on my output, my communications, and not my person.  I can change my communication style somewhat –– although verbosity seems to be a fixed feature for me ––, but there is no reason to think my person is under attack, so although discussion can get very heated at times, there is no reason to consider it as between persons: it is just a heated discussion.  In a different concurrent discussion with the same members, I can be (and have been), quite calm and relaxed –– again, because it is not about persons, the heat is only about the subject matter.  I think sometimes a little heat is good, because it shows that others care, and because it (at least for me) causes one to reconsider their own output, to see if the heat is on track, or just a mistargeted flamethrower.  Useful, in other words.
« Last Edit: November 26, 2022, 08:21:54 pm by Nominal Animal »
 
The following users thanked this post: MK14, DiTBho

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #193 on: November 26, 2022, 08:20:58 pm »
I had implemented a 32 bit floating point package on a 6800 a decade earlier, in a medium level language. Without checking, I would have used the carry flag, which doesn't exist in C

This caught my eye: "carry flag" - are you of the opinion this could have been useful, had it existed in C?
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #194 on: November 26, 2022, 08:34:48 pm »
I had implemented a 32 bit floating point package on a 6800 a decade earlier, in a medium level language. Without checking, I would have used the carry flag, which doesn't exist in C

This caught my eye: "carry flag" - are you of the opinion this could have been useful, had it existed in C?

It would have been convenient in that case, but different processors have very different semantics for their carry flag. Some calculate it for every instruction, but since it is on the critical path, others don't. For all I know there may be processors without a carry flag; I no longer bother to keep up with processor ISAs.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #195 on: November 26, 2022, 09:01:54 pm »

Yes, you keep saying that, but you never answered the question - if libraries are the "way to go", do you think C should not have a "volatile" keyword and instead rely on a library? If you don't understand the question then say so; if you do, then is your answer "yes", "no" or "I don't know"?

Sigh. That's a false dichotomy.

Not only isn't your question the right question, it isn't even the wrong question. It is, however, a reflection of the point I've been trying (and failing) to get you to understand: the difference between syntax/language and semantics/meaning/behaviour. Most people here care deeply about the latter, but don't care about the former.

The behaviour I mentioned above is needed, the keyword isn't. Whatever syntax and primitives are used, they will usually be wrapped up in a library.

Very well, I'll tell you: the answer is in fact "no", because C with a volatile keyword is preferable to a library method invocation both syntactically and semantically, and for performance reasons too.

Firstly, a decent compiler will remove any inefficiency associated with a procedure call. (In Java's case it can even remove it at runtime based on how the code+data is actually being used; C# can only make compile-time guesses.)

Secondly, the reason for wrapping a primitive in a procedure is to encapsulate and reuse well-thought-out higher level concepts, rather than to re-create wheels that turn out not to be circular.

Examples of such design patterns are mailboxes, rendezvous, queues, counting semaphores, channels, timeouts, fork-join, etc etc. One example of many is that Doug Lea did a good job of pulling many of these together in his concurrency library; FFI see https://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html
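To make the "wrapping" concrete, here is a minimal single-producer/single-consumer sketch in C11 - my own illustration, not code from that library - where the raw atomic primitives are hidden behind a tiny mailbox abstraction:

Code: [Select]
#include <stdatomic.h>
#include <stdbool.h>
#include <stdint.h>

/* A single-slot mailbox built from low-level atomic primitives. */
typedef struct {
    _Atomic uint32_t value;
    atomic_bool      full;
} mailbox_t;

/* Producer side: returns false if the previous message has not been taken yet. */
static bool mailbox_post(mailbox_t *mb, uint32_t v)
{
    if (atomic_load_explicit(&mb->full, memory_order_acquire))
        return false;
    atomic_store_explicit(&mb->value, v, memory_order_relaxed);
    atomic_store_explicit(&mb->full, true, memory_order_release);
    return true;
}

/* Consumer side: returns false if nothing has been posted. */
static bool mailbox_take(mailbox_t *mb, uint32_t *out)
{
    if (!atomic_load_explicit(&mb->full, memory_order_acquire))
        return false;
    *out = atomic_load_explicit(&mb->value, memory_order_relaxed);
    atomic_store_explicit(&mb->full, false, memory_order_release);
    return true;
}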

Quote
Now, what are examples of these semantic, behavioral concepts you "care deeply about"? I'm as interested in this as I am in grammar, but was focusing on grammar initially; if you want to start discussing behaviors then fine.

I gave a quick list of them earlier in this thread; I'm not going to repeat them. It doesn't surprise me that you haven't noticed them.

Quote
One cannot have a language without a grammar, so one must - somehow - identify a grammar to use; it is an essential part of a programming language. You seem to be saying it is irrelevant - well, if that were true, why not use assembler, or perhaps COBOL or RPG?

I have done. Well not COBOL and RPG for real time applications, but multiple assemblers, macro languages, high level languages of various "persuasions", and even hardware design languages.

Quote
Finally, as I just pointed out but you seem to have missed, libraries do not grow on trees: they are written, and they are written using a language. You cannot wave away the language issue by replacing it with a library; all that does is move the code, it doesn't eliminate it.

Actually, in very deep and useful ways you can do exactly that.

An extreme and good case: a current commercial processor effectively implements "library level" concepts directly in hardware, e.g. channels, timeouts and even the RTOS.

Yes, the RTOS is in hardware. Yes, there are language constructs that make use of it. I've mentioned it earlier, but again I'm not surprised you haven't picked up on it.

Background: when I was in a research lab, the director stated that the only unforgivable sin was to not know the literature. You need a wider understanding of what is out there, what works, and what the limitations might be. Without that you will, at best, take a long time inventing an elliptical wheel.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #196 on: November 26, 2022, 09:03:04 pm »
A grammar is necessary (not sufficient of course), unavoidable in fact so we need a grammar and I'm of the opinion that a grammar based on PL/I subset G is extremely attractive for reasons I've already articulated. A criticism I have of many, many newer languages like Rust, like Hare like Zig like Go like Swift even C# is that they are based on the C grammar and therefore do - unfortunately - stand to repeat some of the sins of the past.

I see grammar as an arbitrary choice, related more to what the designer considers "pretty" than anything else.
It is the language paradigm (generic approach to problem solving), and concepts, that matter.

I don't know what you mean by "pretty". Grammars have properties; not all grammars share the same properties. There are differences, and those are things that should influence the choice of grammar. To glibly say the choice is arbitrary is tantamount to saying the properties do not matter, but that isn't true.

If you start with a grammar, you are basically starting with the statement "This is the way all possible concepts in this programming language shall be expressed."

If you start at the conceptual level, you sketch out the features of the language, and can then choose a grammar that best suits the needs.

Very well, so what at the conceptual level do you think should play into the grammar?

In my opinion, the only reason to start from the grammar upwards is if you 1) believe you already know everything you need to know about the language being designed, and/or 2) how it looks is of paramount importance to you.  (These are the two reasons I've seen in the wild, that is.)

I don't think it's very easy to devise a programming language grammar that is 100% free from the influence of previous languages. There's no such thing as a totally brand-new, fresh grammar, or it's very, very rare. APL could be considered an example of that, though, I suppose.

Consider too C#, Java, C++, Swift, Objective-C, Perl, PHP, JavaScript, Go, Rust and more, all are regarded as being derived from the C grammar. So all of these must have selected - up front - C as the basis for their grammar.

As for PL/I, there are aspects of its grammar that I think are valuable, and that's why it is my starting point. Yes, the problem domain is important, but few have wanted to discuss that here; I've asked what kind of things an MCU developer might find helpful, but sadly there have been rather a lot of disparaging remarks and not much about actual requirements.

Let's use my own gripes with C as a starting point, and examine how the above affects the design process and the result.

Because of embedded uses, I want to allow pointers to specific addresses and to individual objects, but I do not want them to be extensible to arrays, except via explicit constructs that also specify the size/range of the array.

I do not want to force the ABI to pass arrays as (origin, length) or (origin, step, length) tuples, because my aim is to be able to prove that all accesses are valid using static code analysis tools.  If a cosmic ray flips a bit and that causes a crash because of lack of runtime bounds checking, I'm okay with that.  (This, of course, is a design choice, and by no means the only one possible!)

I also like being able to redefine basic operators for custom object types.  I don't want to go whole-hog object-oriented, making the operator overloading object-specific; I'm fine with the operator overloading being purely based on static types only.  (I am aiming at small to medium-sized embedded devices, and the cost of the indirection via object pointers is too much for this niche, in my opinion.  Another design choice.)

I now cobble together something that seems to compile a small snippet of source code to pseudo-assembly.

Uh-oh, problem.  Because I do not force ABIs to explicitly pass a parameter describing the array length, I need a way to explicitly extract the origin and the length of an array.  I could use built-in functions or address and length operators.  Except that if I have already decided on a grammar, my choice is dictated by that grammar.  Oops.

In real life, both options have their upsides and downsides.  When the length is needed in a function where the array was passed as a parameter, we really do need to pass the length also.  There are several options for this particular design choice, but they all boil down to the fact that in a function specification, we need a way to specify that a parameter is the length of a specific array also passed as a parameter.  (With this information, my compiler can tell at compile-time whether the length operator or built-in function is allowed.)  This significantly affects the grammar, of course!
If my grammar was locked in already, I'd have to cobble together a less than optimal workaround.

Those are not strictly grammar questions (well, operator overloading is). How the sizes of arrays are represented and made accessible at runtime is a semantic issue. One could add that to C; one could write a compiler that looks like C but implements arrays differently to normal, in principle anyway. Arrays can be described by simple metadata - in fact in PL/I (as well as Pascal, Fortran, COBOL...) there's no need to care, because that metadata is always present under the hood. It's a tiny amount of data too.

IMHO this is something I'd absolutely include in a new language. I'd also support optional user-specified bounds, so that we could, for example, declare a 2D array of doubles whose subscripts range from (say) -9 thru +9, and support that for an arbitrary number of dimensions.
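A rough sketch of what that hidden metadata (a "dope vector") might look like if written out in C - the field names are illustrative, not any particular compiler's layout:

Code: [Select]
#include <stddef.h>

/* One-dimensional descriptor carried alongside the array data. */
typedef struct {
    void     *origin;     /* address of the first element          */
    ptrdiff_t lower;      /* user-specified lower bound, e.g. -9   */
    ptrdiff_t upper;      /* user-specified upper bound, e.g. +9   */
    size_t    elem_size;  /* size of one element in bytes          */
} dope_vector_1d;

/* Bounds-aware element address: subscripts may legitimately be negative.
   A checked build could verify lower <= index <= upper here. */
static void *array_elem(const dope_vector_1d *d, ptrdiff_t index)
{
    return (char *)d->origin + (index - d->lower) * (ptrdiff_t)d->elem_size;
}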

PL/I also supported the novel "isub" defined arrays. In that abstraction we can declare a 2D 10 x 10 array and then declare another 1D 10-element array where the elements of the latter map to the elements of the former in some defined way. So one could access the diagonal of a matrix just by a suitable declaration - no logic, no code, no bugs.

This is achieved using the "defined" storage class, basically meaning overlaid - like a "union" but much less restricted. A similar thing is true for strings: these can be fixed length or varying length, and what's more we can declare a parameter to have a "*" length, which means you can pass it any fixed-length string and the called code can easily find the length - even fixed-length strings have length metadata.
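A hedged C approximation of that isub/defined idea - a non-copying "view" whose descriptor alone selects the diagonal of a row-major 10 x 10 matrix (again my illustration, not how any PL/I compiler actually does it):

Code: [Select]
#include <stddef.h>

/* A view over existing storage: no elements are copied. */
typedef struct {
    double *origin;   /* first element of the underlying 10 x 10 matrix   */
    size_t  stride;   /* 11 steps between diagonal elements (10 cols + 1) */
    size_t  count;    /* 10 elements visible through the view             */
} diag_view;

static double diag_get(const diag_view *v, size_t i)
{
    return v->origin[i * v->stride];
}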

The engineers that created PL/I were very experienced assembler programmers too; they were expert IBM mainframe programmers who understood hardware very well indeed. Many of the things included in the language stem from that: overlaying different variables on a common address, accessing data through implicit pointers, not just explicit pointers as in C.

Even though I have piles of experience myself, I know I am not aware of all the features and details beforehand.  But I also know that I can construct the grammar as I go along, and collect and implement all the features of the new programming language.  Having the grammar be defined by the features, instead of vice versa, gives me the most leeway.

For example, if I wanted this language to be mostly compatible with C otherwise, I would make the address-of and length-of operators, so that those porting code would need to pay attention to every case where a pointer is used as an array or vice versa.  C already has an address-of operator &, but it might be confusing, and make the static code analysis more difficult.  Instead, I might choose say @ and #, or origin and length (keyword operators, like sizeof).  But I do not know yet; I would experiment with it in practice –– on unsuspecting victims, preferably –– to see if they grasp the concept intuitively, and therefore would be likely to apply this to write memory-safer code.  Locking in the grammar up front makes such experiments irrelevant; the decision has been made already.

Thus, the obvious conclusion from going at it grammar-first is that it is grandiose, futile, waste of time, or all three.

Do not forget that others are not commenting about you personally, they are commenting on your approach and what you describe in your output.

Well, we can't develop a new language without defining a grammar. I've asked what kind of things would be helpful in a new language because I want to set boundaries for a grammar. I've based it on PL/I because that has some desirable properties not seen in other grammars, like no reserved words; so this is not specifically a PL/I grammar, it is a grammar that - like PL/I - will have no reserved words, but unlike PL/I would use { and } for block delimiters. At the time PL/I was active in industry, very few people used a keyboard or a terminal; they wrote code - I wrote code - on coding sheets, and these were then "punched in" by the "punch girls". I'd get the compiler listing the next day, syntax errors and all.

The point I'm making is that some things in the original IBM PL/I grammar, like being all UPPERCASE and terms like DO, DO WHILE and IF X THEN DO... etc., are primarily there because of the nature of the industry at that time. Writing a { or a } on a coding sheet was a risk; it could easily be misread. But not today, so things like that can be discarded and we can use "{" and "}" rather than "do;" and "end;".

Someone else mentioned that "the carry flag isn't available in C" - well, this is the kind of thing I'd like to hear more about: would it, could it, be helpful to expose such a concept in the language?

This is also why I use a pseudonym, instead of my real name: it helps me remember that any negative feedback I get is based on my output, my communications, and not my person.  I can change my communication style somewhat –– although verbosity seems to be a fixed feature for me ––, but there is no reason to think my person is under attack, so although discussion can get very heated at times, there is no reason to consider it as between persons: it is just a heated discussion.  In a different concurrent discussion with the same members, I can be (and have been), quite calm and relaxed –– again, because it is not about persons, the heat is only about the subject matter.  I think sometimes a little heat is good, because it shows that others care, and because it (at least for me) causes one to reconsider their own output, to see if the heat is on track, or just a mistargeted flamethrower.  Useful, in other words.

I have a simple rule: never write anything to or about a person in a forum that I would not be totally willing and comfortable to say to them in person, in a meeting with others present - other team members, visitors to the company, vendors, even HR. I try to communicate with that rule in mind; I sometimes fail, but I really do try. Many here do not adhere to such a rule; they speak and accuse and insult in a way that they would be very unlikely to do in a group meeting without making fools of themselves or getting reprimanded by someone more senior.
« Last Edit: November 26, 2022, 09:32:32 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19484
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #197 on: November 26, 2022, 09:32:13 pm »
I don't think it's very easy to devise a programming language grammar that is 100% free from the influence of previous languages. There's no such thing as a totally brand-new, fresh grammar, or it's very, very rare. APL could be considered an example of that, though, I suppose.

I've used LISP, Prolog, Forth, Smalltalk, Algol-60, various hardware definition languages. They all have radically different syntaxes.

Quote
Consider too C#, Java, C++, Swift, Objective-C, Perl, PHP, JavaScript, Go, Rust and more, all are regarded as being derived from the C grammar. So all of these must have selected - up front - C as the basis for their grammar.

And there you unwittingly demonstrate our point. The interesting and important differences between those languages has nothing to do with grammar.

Quote
Someone else mentioned that "the carry flag isn't available in C" - well, this is the kind of thing I'd like to hear more about: would it, could it, be helpful to expose such a concept in the language?

There are far more important concepts to expose in a language. Concentrate on those.

Start by understanding the concepts in https://www.jameswhanlon.com/the-xc-programming-language.html and https://www.xmos.ai/download/XMOS-Programming-Guide-(documentation)(F).pdf
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #198 on: November 26, 2022, 09:40:42 pm »
I don't think it's very easy to devise a programming language grammar that is 100% free from the influence of previous languages. There's no such thing as a totally brand-new, fresh grammar, or it's very, very rare. APL could be considered an example of that, though, I suppose.

I've used LISP, Prolog, Forth, Smalltalk, Algol-60, various hardware definition languages. They all have radically different syntaxes.

Quote
Consider too C#, Java, C++, Swift, Objective-C, Perl, PHP, JavaScript, Go, Rust and more, all are regarded as being derived from the C grammar. So all of these must have selected - up front - C as the basis for their grammar.

And there you unwittingly demonstrate our point. The interesting and important differences between those languages has nothing to do with grammar.

Who said otherwise? The fact is that some of the problems common to all of them are attributable to the grammar - so in fact you unwittingly confirmed my point!

Quote
Someone else mentioned that "the carry flag isn't available in C" - well, this is the kind of thing I'd like to hear more about: would it, could it, be helpful to expose such a concept in the language?

There are far more important concepts to expose in a language. Concentrate on those.

Start by understanding the concepts in https://www.jameswhanlon.com/the-xc-programming-language.html and https://www.xmos.ai/download/XMOS-Programming-Guide-(documentation)(F).pdf

I will take a peek - how many people here use XC to write software on MCUs? It seems to bear some similarities to Occam. Are you trying to say that some of the things in XC would be useful in a new language? If there are specific features, why not call them out? I've asked several times and seen a lackluster response; even here you are just glibly telling me to read some manual rather than being clear about the features you value.

You're speaking about parallelism? Message passing? What, exactly? Concentrate on my questions.
 



« Last Edit: November 26, 2022, 09:54:04 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4032
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #199 on: November 26, 2022, 10:15:31 pm »
I note that one of the desires there was "give easy access to the flags". I pointed out that many ISAs don't even *have* flags.
Yep.  In particular, you helped me realize I did not actually desire access to flags, just multiple function result values, to fulfill my needs.

Yes. Multiple return values are very desirable. Returning structures can substitute, but it's notationally annoying.

Most modern ABIs allow returning two word-size or one double sized value. With register based ABIs, usually in the same registers as used for the first two arguments. x86_64 is a weird exception with the first argument passed in RDI but the result in RAX.

But why can't you pass the same number of results as arguments? Function return is semantically no different to function call -- it is "calling the continuation". It would clearly be very very easy to enable this in modern ABIs that pass up to 6 or 8 arguments in registers -- it's simply relaxing a restriction. Figuring out where to put return values that overflow into RAM is harder. If the caller knows how many results are to be returned then it can allocate stack space for max(arguments, results). Supporting an unbounded number of return values would be a lot harder. But it's also hard to see a reason to support it. VARARGS seems like mostly a hack to support printf and friends.
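For illustration, the struct-return substitute looks like this in C (a sketch only; on typical register ABIs such as RISC-V or AArch64 a two-word struct like this comes back in a register pair, so nothing touches memory):

Code: [Select]
#include <stdint.h>

/* Two results packed into a small struct - the usual substitute for
   true multiple return values. */
typedef struct { uint32_t quot; uint32_t rem; } divmod32_t;

static divmod32_t divmod32(uint32_t n, uint32_t d)
{
    divmod32_t r = { n / d, n % d };
    return r;
}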
 

