EEVblog Electronics Community Forum
Products => Computers => Programming => Topic started by: SiliconWizard on August 25, 2019, 04:04:26 pm
-
Has anyone ever heard of it?
http://bsharplanguage.org/
They have a Twitter account: https://twitter.com/BSharpTeam (last tweet is from 2016).
Apparently the project started in 2006 (or possibly earlier?): https://www.embedded.com/design/prototyping-and-development/4006620/B--A-programming-language-for-small-footprint-embedded-systems-applications-Part-1
Still nothing to be seen. Is it just another vaporware project? ;D
-
Never heard of it. But it's an interpreter... so....
I never understood these choices. On one hand, embedded devs are bitching about how inefficient and bloated everything but their hand-optimized ASM code is (overstated to make a point ;D). And on the other hand, we create levels of overhead that just aren't necessary...
Then again, hardware is getting bigger, better, faster and cheaper. So why not use a Raspberry Pi and boot into Linux to read a temperature sensor... :-//
-
So why not use a Raspberry Pi and boot into Linux to read a temperature sensor... :-//
Because it uses too much power and can be hacked through all the outdated open-source software. :)
I saw there are now uCs that draw tens of nA instead of mA.
-
https://www.youtube.com/watch?v=Yr3dBOB7nXI
-
Never heard of it. But it's an interpreter... so....
Well, as I understood it, it was supposed to run on some kind of VM, like Java. Not strictly an interpreter, although I guess you can call it that...
Some people have actually advocated the use of Java on embedded targets. Whether it makes sense is probably a topic that would make for a very long and heated debate.
It's "interesting" though to see how much programming languages still spark initiatives all over the place (while successes are very few...)
-
I once saw an embedded Java product. It required lots of memory and didn't perform well.
Perhaps a good prototyping tool, but not for something serious.
I don't think either end of the spectrum is a good place to be. You can't be productive if you have to spell out all the bits in a low-level language (ASM), but you cannot expect to get away with every abstraction (and its overhead) that works on a normal PC either...
But with hardware becoming bigger and cheaper, the balance can shift more towards the abstract (overhead) end without paying too high a price - depending on the project, of course. Lots of projects don't need/have critical timing that would require dropping down to ASM; some do.
Oh and the PI comment was meant as sarcasm... ;D
-
Oh and the PI comment was meant as sarcasm... ;D
I got that. ;D
"Software is getting slower more rapidly than hardware is becoming faster."
https://en.wikipedia.org/wiki/Wirth%27s_law
Still completely holds IMO.
-
Yes, but for most that is not the point. Economics and time to market are the point.
For a developer, that translates to: the more productive you can be, the better. Productivity lies in tools and language (to some degree).
The main problem is that the vendors do not put much effort into innovating their toolset (compilers, new language support, etc.). That, and the conservative nature of the seasoned embedded developer, means that C is still regarded as king (generalizing here, of course).
-
I once saw an embedded Java product. It required lots of memory and didn't perform well.
Perhaps a good prototyping tool, but not for something serious.
You can't judge from a single sample. There's a good chance the SIM card in your phone is a Java VM (ETSI TS 102 241: https://www.etsi.org/deliver/etsi_ts/102200_102299/102241/13.00.00_60/ts_102241v130000p.pdf). Also, many smart cards happen to be JavaCard (https://en.wikipedia.org/wiki/Java_Card) type.
-
I once saw an embedded Java product. It required lots of memory and didn't perform well.
Perhaps a good prototyping tool, but not for something serious.
You can't judge from a single sample. There's a good chance the SIM card in your phone is a Java VM (ETSI TS 102 241: https://www.etsi.org/deliver/etsi_ts/102200_102299/102241/13.00.00_60/ts_102241v130000p.pdf). Also, many smart cards happen to be JavaCard (https://en.wikipedia.org/wiki/Java_Card) type.
Just because it exists almost everywhere and is running on billions of devices doesn't mean it's the best thing ever, like a lot of people want it to be. Just look at Android: developing anything on Android isn't the best experience you're going to have. IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
But at the same time, I'm not some corporation that prioritizes development speed (i.e. money) over the actual outcome.
-
Just because it exists almost everywhere and is running on billions of devices doesn't mean it's the best thing ever, like a lot of people want it to be. Just look at Android: developing anything on Android isn't the best experience you're going to have. IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
But at the same time, I'm not some corporation that prioritizes development speed (i.e. money) over the actual outcome.
Hammer and screwdriver are both tools, just different ones. The same goes for almost everything, including programming languages and operating systems: there are, and always will be, *different* tools for *different* jobs. Speaking of the B# language: to me it looks like one of many (failed) startups that hoped to be bought by big business and make the founders rich, yet did not succeed.
-
While the use of the # is annoying, at least they aren't using it to signal a CLR-based language.
That said, I'm not sure whether it's better than a mainstream Java implementation. Though no GC is a good point (I can't remember whether the embedded JVMs provide this or not).
Post back with your findings if you do use it. I'd be interested in hearing them.
TonyG
-
That said, I'm not sure whether it's better than a mainstream Java implementation.
Java needs way more resources than B#: "Minimal Java ME Embedded configuration: 32-bit MCU, 130 KB RAM, 350 KB Flash/ROM"
https://terrencebarr.wordpress.com/2013/04/15/this-is-big-java-on-arm-cortex-m3m4/
Such a competing language can be better than Java in another sense as well: by being cheaper. This is for SE, royalties only (not including tools & support), but anyway:
https://blogs.oracle.com/jtc/java-se-embedded-pricing-explained
-
IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
Choosing the right abstractions and "API language" is a true art. When done well, programming a peripheral becomes a one-liner that you still understand after 6 months of doing something else. These 'unnecessary' layers of abstraction are needed for when projects become more complex and elaborate. At a certain point you cannot hold the details of the entire program in your head, and you need to hide complexity behind an abstraction. Also, when the API is designed well, it's very easy and quick to learn and use in your program.
As for the language, I think we can do a lot better than C. I would go for a subset of C++, just because most vendors have some level of support for it. Also, meta-programming (C++20) is something to keep an eye on (beyond constexpr). That would allow you to express a lot of static stuff at compile time and control what can and cannot be done at runtime. "Wrong code should not compile."
I have serious doubts whether any custom language could ever get the traction and maturity of a well-established language like C++ (breadth of support, tools, community, etc.). If you want to try something new, I would take a look at Rust, which looks promising.
-
IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
Choosing the right abstractions and "API language" is a true art. When done well, programming a peripheral becomes a one-liner that you still understand after 6 months of doing something else.
"API language"? What are you talking about? :-// A Hardware Abstraction Layer shall never be part of the programming language. It's just a library at best, sometimes not even that. The language shall provide hardware *access*, not abstraction.
-
IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
Choosing the right abstractions and "API language" is a true art. When done well, programming a peripheral becomes a one-liner that you still understand after 6 months of doing something else.
"API language"? What are you talking about? :-// A Hardware Abstraction Layer shall never be part of the programming language. It's just a library at best, sometimes not even that. The language shall provide hardware *access*, not abstraction.
I meant the metaphor or analogy the API is expressed in. The logic and coherence between the different parts of the API. The better this 'language' (not a programming language!), the easier it is to learn and use - once you understand the premise it is based on...
Better?
-
IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
Choosing the right abstractions and "API language" is a true art. When done well, programming a peripheral becomes a one-liner that you still understand after 6 months of doing something else. These 'unnecessary' layers of abstraction are needed for when projects become more complex and elaborate. At a certain point you cannot hold the details of the entire program in your head, and you need to hide complexity behind an abstraction. Also, when the API is designed well, it's very easy and quick to learn and use in your program.
As for the language, I think we can do a lot better than C. I would go for a subset of C++, just because most vendors have some level of support for it. Also, meta-programming (C++20) is something to keep an eye on (beyond constexpr). That would allow you to express a lot of static stuff at compile time and control what can and cannot be done at runtime. "Wrong code should not compile."
I have serious doubts whether any custom language could ever get the traction and maturity of a well-established language like C++ (breadth of support, tools, community, etc.). If you want to try something new, I would take a look at Rust, which looks promising.
I'm not saying all layers of abstraction are bad; it's the unnecessary layers that are. This is happening more and more these days, and it's probably the reason why software is getting slower. Not to mention all the extra stuff that isn't really necessary, but that's another topic for another time.
-
Java needs way more resources than B#: "Minimal Java ME Embedded configuration: 32-bit MCU, 130 KB RAM, 350 KB Flash/ROM"
https://terrencebarr.wordpress.com/2013/04/15/this-is-big-java-on-arm-cortex-m3m4/
Such a competing language can be better than Java in another sense as well: by being cheaper. This is for SE, royalties only (not including tools & support), but anyway:
https://blogs.oracle.com/jtc/java-se-embedded-pricing-explained
Really good point. I've spent way too much time working with i.MX 6 lately :)
TonyG
-
That said, I'm not sure whether it's better than a mainstream Java implementation.
Java needs way more resources than B#: "Minimal Java ME Embedded configuration: 32-bit MCU, 130 KB RAM, 350 KB Flash/ROM"
https://terrencebarr.wordpress.com/2013/04/15/this-is-big-java-on-arm-cortex-m3m4/
Such a competing language can be better than Java in another sense as well: by being cheaper. This is for SE, royalties only (not including tools & support), but anyway:
https://blogs.oracle.com/jtc/java-se-embedded-pricing-explained
Both valid points. Unfortunately, as I said, they never released anything. Not even a single document AFAIK in over 13 years.
;D
-
I'm not saying all layers of abstraction are bad; it's the unnecessary layers that are. This is happening more and more these days, and it's probably the reason why software is getting slower.
If something is unnecessary, then what's the point of using it? Typical example: embedded programmers complaining that the STM32 HAL libraries are "bloated". Well... they are - for those who want to compile their "Hello World" example fast and for free. So what? If you want something better, then buy it or create it yourself.
-
IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
Choosing the right abstractions and "API language" is a true art. When done well, programming a peripheral becomes a one-liner that you still understand after 6 months of doing something else.
"API language"? What are you talking about? :-// A Hardware Abstraction Layer shall never be part of the programming language. It's just a library at best, sometimes not even that. The language shall provide hardware *access*, not abstraction.
An API is just part of the system knowledge or process knowledge. Nothing to do with programming or the programming language itself. Same with memory management; it is just bean counting.
-
IMO I just feel limited by all these abstractions and layers to "simplify" things, but at some point you're just shooting yourself in the foot.
Choosing the right abstractions and "API language" is a true art. When done well, programming a peripheral becomes a one-liner that you still understand after 6 months of doing something else. These 'unnecessary' layers of abstraction are needed for when projects become more complex and elaborate. At a certain point you cannot hold the details of the entire program in your head, and you need to hide complexity behind an abstraction. Also, when the API is designed well, it's very easy and quick to learn and use in your program.
As for the language, I think we can do a lot better than C. I would go for a subset of C++, just because most vendors have some level of support for it. Also, meta-programming (C++20) is something to keep an eye on (beyond constexpr). That would allow you to express a lot of static stuff at compile time and control what can and cannot be done at runtime. "Wrong code should not compile."
I have serious doubts whether any custom language could ever get the traction and maturity of a well-established language like C++ (breadth of support, tools, community, etc.). If you want to try something new, I would take a look at Rust, which looks promising.
I'm not saying all layers of abstraction are bad; it's the unnecessary layers that are. This is happening more and more these days, and it's probably the reason why software is getting slower. Not to mention all the extra stuff that isn't really necessary, but that's another topic for another time.
It's all in how you make the abstraction: done correctly, it simplifies; done poorly, it makes things more complex. In my thin (I do mean thin) knowledge and experience with many systems with "abstraction", it usually makes things more complex and harder to really understand - for example, poorly documented do-it-all libraries (usually with logic or nomenclature that collides with the parent system). Besides, when talking about general programming languages, the interface is always textual functions calling layers of more or less mysterious functions, which people try to compensate for with fancy monsters (even more hidden behavior behind a textual interface) and tabulations.
-
It's "interesting" though to see how much programming languages still spark initiatives all over the place (while successes are very few...)
Other than assembly language, only 3 languages pass the test of time: FORTRAN, COBOL and C. Everything else is just a passing fad.
FORTRAN 1957 - still used for scientific computing
COBOL 1959 - still used in banking
C 1972 - A newcomer but it seems to have staying power. We’ll have to watch and see.
-
It's "interesting" though to see how much programming languages still spark initiatives all over the place (while successes are very few...)
Other than assembly language, only 3 languages pass the test of time: FORTRAN, COBOL and C. Everything else is just a passing fad.
FORTRAN 1957 - still used for scientific computing
COBOL 1959 - still used in banking
C 1972 - A newcomer but it seems to have staying power. We’ll have to watch and see.
You forgot FORTH :-DD
-
It's "interesting" though to see how much programming languages still spark initiatives all over the place (while successes are very few...)
Other than assembly language, only 3 languages pass the test of time: FORTRAN, COBOL and C. Everything else is just a passing fad.
FORTRAN 1957 - still used for scientific computing
COBOL 1959 - still used in banking
C 1972 - A newcomer but it seems to have staying power. We’ll have to watch and see.
You forgot FORTH :-DD
It's only been around since 1970 (coincidentally, the year I started using the IBM 1130 and Fortran), but I don't consider it in common use. More of a curiosity, popular in certain niche applications, but definitely not mainstream. Neither is APL, and although Algol is the grandfather of all block-structured languages, it too lies on the ash heap of history.
-
It's "interesting" though to see how much programming languages still spark initiatives all over the place (while successes are very few...)
Other than assembly language, only 3 languages pass the test of time: FORTRAN, COBOL and C. Everything else is just a passing fad.
FORTRAN 1957 - still used for scientific computing
COBOL 1959 - still used in banking
C 1972 - A newcomer but it seems to have staying power. We’ll have to watch and see.
As I said in the Rust thread, it's interesting to try and understand why this is so, and why we have basically failed to design new languages with as much popularity (I would keep COBOL outside of it, as I guess it's mainly legacy stuff still running, and maintained, but I'm not sure there are a lot of new full developments in COBOL, whereas there clearly are in C and FORTRAN.)
Sure both C++ and Java look like candidates. But not quite IMO. There are many possible reasons why they won't be.
C++ has become a huge monster. Much too complex. Whereas it has gotten some traction for embedded development (although relatively little), it looks like the majority is running away from it, in favor of Java, Go, maybe now Rust... so its future is really unknown to me.
Java is suffering from other problems. One of them is fragmentation. Java is of course heavily tied to its runtime. Each new runtime environment version may break compatibility in various ways. It's a broken approach IMO as a language.
And sure there will always be languages that last in niche areas, such as Ada, or even more niche, Forth.
-
It's "interesting" though to see how much programming languages still spark initiatives all over the place (while successes are very few...)
Other than assembly language, only 3 languages pass the test of time: FORTRAN, COBOL and C. Everything else is just a passing fad.
FORTRAN 1957 - still used for scientific computing
COBOL 1959 - still used in banking
C 1972 - A newcomer but it seems to have staying power. We’ll have to watch and see.
As I said in the Rust thread, it's interesting to try and understand why this is so, and why we have basically failed to design new languages with as much popularity (I would keep COBOL outside of it, as I guess it's mainly legacy stuff still running, and maintained, but I'm not sure there are a lot of new full developments in COBOL, whereas there clearly are in C and FORTRAN.)
Sure both C++ and Java look like candidates. But not quite IMO. There are many possible reasons why they won't be.
C++ has become a huge monster. Much too complex. Whereas it has gotten some traction for embedded development (although relatively little), it looks like the majority is running away from it, in favor of Java, Go, maybe now Rust... so its future is really unknown to me.
Java is suffering from other problems. One of them is fragmentation. Java is of course heavily tied to its runtime. Each new runtime environment version may break compatibility in various ways. It's a broken approach IMO as a language.
And sure there will always be languages that last in niche areas, such as Ada, or even more niche, Forth.
I started with essentially Fortran IV and I can still use that simple style in Fortran 77 or Fortran 90. If I don't want to use the new features, I'm not compelled to do so. Similarly, there's nothing in C++ that prevents it from taking C code (I could be way wrong on this, I don't use C++).
I have always thought of the other OO languages as kind of an elitist deal. See how smart we are? We think we understand C++! Good luck with that!
So, I like simple languages. If I want a variable to exist, I don't mind declaring it and I don't see where polymorphism plays into embedded programming. I suppose you could abstract a serial port, but why? Setting up two ports, in the same C file, using common code with different register locations isn't too much to ask. I really don't need a UART class to get the job done.
Science people (and aerospace, and math types) want a simple language that will allow them to play with equations. There is clearly a lot of this being done in MATLAB and wxMAXIMA but a lot is still being done fast in Fortran. The other day I was playing with Euler's Method for evaluating differential equations. I did it with MATLAB using script code (a lot like Fortran without the declarations) and using the built-in ODE45 method. But I also did it in Fortran just for giggles. There is something satisfying about Fortran 77 under BSD Unix or the Photran package (up to Fortran 90) for Eclipse. Oddly, all 4 methods got the same results!
I think people like, and want, simple languages. Too many features lead to too much frustration and not much productivity. We look back at K&R C and think it is rather Spartan. Well, yes, it is! But it was a nice language for creating Unix, and that was the goal for the language. I think the authors are amazed that it is still in use. Dennis Ritchie passed away several years ago, but apparently Brian Kernighan is still around. He's only 4 years older than me, so he must have been very young when he worked on C. Smart guy!
Python actually might have a future. It is a fairly simple language with a lot of packages for various classes of work. The built-in data structures are useful without being overwhelming. There are a lot of things to dislike but I'm gradually getting into using it. Slowly...
So to the language inventors of the world: Keep It Simple, Stupid! The KISS principle. Don't create the next DEBE language (Does Everything But Eat).
ETA:
Here's something cool: the same Fortran IV code that will run on my FPGA reincarnation of the IBM 1130 will also run on F77 under 2.11BSD Unix on my PDP-11 emulations (simh) and the Photran package (gfortran) under Eclipse on Windows 10. That's pretty nice portability!
-
Similarly, there's nothing in C++ that prevents it from taking C code (I could be way wrong on this, I don't use C++).
Indeed, that's mostly wrong. C and C++ are definitely two different languages with a different set of rules.
Unless you write C that *complies* with C++ rules, it won't compile as C++. A typical C project would need a lot of adjustments to compile as C++. Not worth it.
So, I like simple languages.
Simple is good. Wirth agrees. As simple as possible, but not simpler!
Multiple inheritance, for instance, is a plague that many serious computer scientists have advised avoiding at all costs.
OO-programming is pretty old stuff actually. It became popular as a hype, but has existed long before it became popular. Unfortunately, the whole hype around it pushed the release of atrociously complex languages that went way too far to solve inexistent problems by creating new ones.
So to the language inventors of the world: Keep It Simple - Stupid! The KISS principal. Don't create the next DEBE language (Does Everything But Eat).
Keeping things simple yet useful and consistent is a tough thing. That's been Wirth's battle for most of his life.
Popularity is a special beast as well. C has been popular because it's simple enough (thus simple to code for, and write compilers for...), does the job and has been well thought out. C++ became popular (even though maybe not as much) because it was the first language supporting OO paradigms that was at the same time almost as low-level and flexible as C (thus piggy-backing on C's merits). Java became popular mainly thanks to its "run anywhere" promise.
All in all, for a language to succeed, it has to solve problems that could not be solved before, or at least a lot less conveniently.
Many attempts at creating the next PL have failed mainly, IMO, because they didn't really solve anything that hadn't been solved already. Or because they were way too complex to handle. Or often both.
Some attempts should have had better luck though... but they were probably there either too early or too late.
Take a look at Modula-3 for instance. A serious look.
-
I am a huge fan of Niklaus Wirth, particularly his Pascal project. A lot of companies took his P4 compiler and built the interpreter and were off to the races. UCSD Pascal looks suspiciously like P4.
I think his book "Algorithms + Data Structures = Programs" should be required reading. Yes, it uses Pascal but that is less important than the lessons learned about the various algorithms. There's even a tiny Pascal compiler in the back of the book. Very limited but it demonstrates the clarity of the code. Truly elegant!
As to Modula 2 and Oberon, the lack of a 'free' toolchain kept me away because I simply didn't need the languages even though they are truly elegant. I like code that looks pretty and reads well.
There is an Oberon compiler for ARM and it isn't terribly overpriced and, in fact, is FREE for personal use. I don't know why I'm not using it. I just saw where it can do 'mbed' style drag and drop programming of certain boards. That is terrific, no JTAG! It also produces code that works with ST-Link.
https://www.astrobe.com/Cortex-M7/matrix.htm (https://www.astrobe.com/Cortex-M7/matrix.htm)
Cortex-M7, that's the new Teensy 4.0.
I'll look at Modula 3 but I mostly write code for embedded systems. I'll have to see if there is an ARM version.
-
As to Modula 2 and Oberon, the lack of a 'free' toolchain kept me away because I simply didn't need the languages even though they are truly elegant. I like code that looks pretty and reads well.
Yes, that's the problem with most of Wirth's languages after Pascal: the lack of available compilers.
There is an Oberon compiler for ARM and it isn't terribly overpriced and, in fact, is FREE for personal use.
Yes, Astrobe is interesting. I think the guy behind it actually worked with Wirth, if I'm not mistaken. A book was released about Oberon on ARM, before Astrobe became a product.
It supports the latest "version" of Oberon, Oberon-07.
If you're interested in Oberon, there are a couple open source Oberon to C translators. One which is still actively maintained is OBNC. I experimented a bit with it, and it's pretty nice.
Given that it produces C code, it should be doable to use it for embedded development. I've looked a bit at this actually.
I'll look at Modula 3 but I mostly write code for embedded systems. I'll have to see if there is an ARM version.
The only thing you'll find is this I think: https://modula3.elegosoft.com/cm3/
-
What I found amazing about Wirth's syntax diagrams and his P4 compiler was the correlation between the two. The compiler's block outline was derived exactly from the syntax diagrams. In my view, syntax diagrams are far superior to BNF or EBNF. If you have all the diagrams, you can start writing a recursive descent compiler directly.
The ability to do this is predicated on a particular non-ambiguous grammar, but that is not a limitation, as the language clearly proves.
-
The ability to do this is predicated on a particular non-ambiguous grammar, but that is not a limitation, as the language clearly proves.
For simple parsing, a completely context-free grammar is required. The grammars of many languages, including C, are not context-free, which makes parsers hell to write, with many context-dependent cases.
One fun example is the parsing of C types. Compare that to parsing Pascal (and all derivatives) types...
For some fun: https://cdecl.org/
Really, as much as I like C, I think Ritchie must have been high on some heavy stuff when designing the declaration of C types. That was the early seventies and all...