EEVblog Electronics Community Forum

Products => Computers => Programming => Topic started by: bjdhjy888 on September 24, 2019, 12:29:52 am

Title: Too many programming languages?
Post by: bjdhjy888 on September 24, 2019, 12:29:52 am
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert in all.

The thing is, as an electronics engineer, which one should I learn most!?

p.s.: and a single programming language's introductory textbook is often as thick as a brick used in the Empire State Building.

  :horse: :popcorn:
Title: Re: Too many programming languages?
Post by: wraper on September 24, 2019, 12:35:52 am
For MCUs you are basically limited to C (8-bit) and C/C++ (ARM) for any serious use. And then there is Verilog/VHDL if you want to use FPGAs.
Title: Re: Too many programming languages?
Post by: Mechatrommer on September 24, 2019, 12:36:22 am
It's been discussed many times, everybody with opinions... but I'll tell you only once this time and then no more... if you want to risk obsolescence and having to relearn a new language in the future, learn anything other than C/C++. Now mix up your mind with the replies that will come... :scared:
Title: Re: Too many programming languages?
Post by: RoGeorge on September 24, 2019, 12:44:39 am
The thing is, as an electronics engineer, which one should I learn most!?

C and Python
Title: Re: Too many programming languages?
Post by: beanflying on September 24, 2019, 01:09:58 am
As someone who started programming with Basic/QBasic, then Fortran and Pascal, and punching hex codes into 6800 micros, it's all easy  :scared:

I got back into programming micros after a long, long time away, and like the others I think C or C++ makes the most sense as a gateway to some of the others. Java is maybe on the way down compared to only a few years ago, but Python is on my list of skills I really must pick up.

Instead of a textbook and killing a few trees, grab the Arduino IDE and a starter kit, depending on where you already are. Run through the stock examples, then come up with a job of your own and start to learn how to write your own code. This worked for me to get the rusty brain wheels turning again. Then kick on to other micros. None of the skills learned with this step will be wasted moving forward.
Title: Re: Too many programming languages?
Post by: knapik on September 24, 2019, 03:27:11 am
Definitely learn C, as it's the must-know language for anything related to embedded design.

When it comes to general programming for PCs, languages start to be chosen as the right tool for the right job, so the diversity can be quite handy. Most of the popular programming languages boil down to C-like syntax with object-oriented programming built in, so learning C is a good stepping stone.
Title: Re: Too many programming languages?
Post by: MarkF on September 24, 2019, 03:54:31 am
Since I started programming in the late 1970s, I have used various assembly languages, Fortran, Ada, C/C++ and some Java.  In the last few years I've seen a call for C/C++ for application work and Java for web-based interfaces.

On top of all that are all the different graphic and GUI needs.  The last place I worked at started requiring programs to be written with Qt because they needed to be cross-platform (Windows and Linux).  And don't forget database work is big too.

Most of my work was done on mainframe machines.  Since PCs have grown much more powerful, mainframes are going away except for specialized cases.

I would definitely start with C/C++ and Java for general applications.  Just take a look at jobs being advertised (Indeed for example).


I would say skip the punch cards and paper tape that I had to deal with.   :-DD


Edit:  Programming is logic.  Language is syntax.  Simple...
Title: Re: Too many programming languages?
Post by: CatalinaWOW on September 24, 2019, 04:34:29 am
There are too many languages.  But it is the way it is, so you will have to deal with it.

1.  Once you have learned one procedural language the others will come relatively easy.  Don't fret too much over which one you learn first.

2.  They are all hard unless you sit down and code in them.  Few people can learn any of these languages by reading a book.  That means that a language that fits your current situation should be your first language.  Whether it is a programming course in school, a lab in school, a work project or a hobby project, the need to generate code to solve your problem will force you to learn the language at hand, and most of these situations come with a predefined language requirement.

3.  Your later languages will follow this pattern - you will pick them up as needed for the current task.  Even within a language you will be relearning different flavors as time goes on. 

If these methods still leave you unsure where to start you can do worse than to start with either Python or C.  Both have broad application in engineering and will likely be around for at least a decade or two more, giving you plenty of time to learn whatever might displace them.

Title: Re: Too many programming languages?
Post by: techman-001 on September 24, 2019, 04:49:54 am
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert in all.

The thing is, as an electronics engineer, which one should I learn most!?

p.s.: and a single programming language's introductory textbook is often as thick as a brick used in the Empire State Building.

  :horse: :popcorn:

It depends on what you want to do. You said you're an electronics engineer, so I'll assume you meant embedded control.

This is a contentious and flame-prone question, and it will no doubt soon devolve into rants and FUD and be closed by the admins, especially if one offends the Gods of C, but before it gets that far I recommend:

Low level you can't beat Forth
High level you can't beat Forth

So my advice is Forth. It's not 'popular' like C, but it's simple enough that you can learn all of Forth and even write your own Forth in a decade or so, provided you put in some effort every day. After that you can hold the entire concept in your head.

You will need to be fluent in Assembler for your chip type(s), but you can learn Assembler as you learn Forth; they complement each other.

It's quite unlikely you can learn all of C and write your own comprehensive C compiler in that same time frame.

Here is a Forth sample: I recommend asking other programming language advocates to submit their code for this same task so you can compare them. Note I didn't need any libraries; there are no hidden include files. Everything is present here for an STM32F0xx Cortex-M0 MCU.

This program calculates baud rates by reading the MCU configuration. It only works with USART1 16-bit oversampling, which is the reboot default, and will exit if 8-bit oversampling is in use.

It uses Mecrisp-Stellaris s31.32 fixed-point support to calculate the baud rate divisor to two decimal places, so you can choose the best BRR integer to use when setting up your terminal baud rate.
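(Worked example: with the default 8 MHz clock and a PLL multiplier of 6, the bus clock is 48 MHz, so for 115200 baud the result is 48000000 / 115200 / 2 = 208.33, as in the output below.)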

More information about Forth is available in my SIG below.

................... start.................
 $40021000 constant RCC ( Reset and clock control )
 RCC $4 + constant RCC_CFGR ( Clock configuration register  RCC_CFGR )
 $40013800 constant USART1 ( Universal synchronous asynchronous receiver  transmitter )
 USART1 $0 + constant USART1_CR1 ( Control register 1 )
 
 8000000 constant clock  \ default, change if not.
 : rcc_cfgr_pllmul? ( -- u ) %1111 18 lshift RCC_CFGR @ and 18 rshift ;  ( PLL Multiplication Factor )

 : clock_multiplier? ( -- u ) rcc_cfgr_pllmul? 2 + ;  \ clock multiplier value

 : usart1_cr1_over8? ( -- flag )  %1 15 lshift USART1_CR1 bit@ ;  \ oversampling mode. reboot default: 16 bit

 : oversampling? ( -- flag )  \ true, with a warning, if 8 bit oversampling is in use
   usart1_cr1_over8? dup if ." USART: 8 bit oversampling ! MUST use 16 bit oversampling" then
 ;

 : >s31.32 ( u -- s31.32 ) 0 swap ;     \ convert to a s31.32 number   

 : brr? ( desired baud rate -- brr )              \ brr =  ((pll multiplier * clock)/desired baud)/2
   >r
   clock_multiplier?  clock *  >s31.32     \ calculate bus clock, convert to s31.32
   r@ >s31.32                             \ desired baud rate
   f/                                       \ f. divide
   2  >s31.32                            \ last step is divide by 2
   f/                                      \  f. divide
   2                                     \ only display 2 comma places
   ." for a baud rate of " r> . ." the usart1_brr should be: " f.n cr
 ;

 : baud?  ( -- ) cr cr
   oversampling? if exit then  \ bail out if 8 bit oversampling is in use
   ." CLOCK: " clock . ." PLL multiplier: " clock_multiplier? . ." BUS CLOCK: " clock clock_multiplier? * . ." Hz " cr
   115200 brr?
   460800 brr?
   921600 brr?
   1843200 brr?
   cr 
 ;

......... end............

\ Output
\ baud ?
\
\ CLOCK: 8000000 PLL multiplier: 6 BUS CLOCK: 48000000 Hz
\ for a baud rate of 115200 the usart1_brr should be: 208,33
\ for a baud rate of 460800 the usart1_brr should be: 52,08
\ for a baud rate of 921600 the usart1_brr should be: 26,04
\ for a baud rate of 1843200 the usart1_brr should be: 13,02
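For comparison with other languages, here is a rough C equivalent of the same calculation (a minimal sketch only; the RCC_CFGR read is stubbed with the reboot-default PLL setting, and on real hardware you would read the register at 0x40021004):

Code: [Select]
#include <stdio.h>
#include <stdint.h>

#define CLOCK_HZ 8000000u                  /* default HSI clock, change if not */

static unsigned clock_multiplier(void)
{
    uint32_t rcc_cfgr = 4u << 18;          /* stub: PLLMUL bits 21:18 of RCC_CFGR */
    return ((rcc_cfgr >> 18) & 0xFu) + 2u; /* PLLMUL field value + 2 */
}

static void brr(unsigned baud)
{
    double bus_clock = (double)CLOCK_HZ * clock_multiplier();
    printf("for a baud rate of %u the usart1_brr should be: %.2f\n",
           baud, bus_clock / baud / 2.0);
}

int main(void)
{
    printf("CLOCK: %u PLL multiplier: %u BUS CLOCK: %.0f Hz\n",
           CLOCK_HZ, clock_multiplier(),
           (double)CLOCK_HZ * clock_multiplier());
    brr(115200);
    brr(460800);
    brr(921600);
    brr(1843200);
    return 0;
}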
Title: Re: Too many programming languages?
Post by: MarkF on September 24, 2019, 05:53:39 am
I'm sorry.  But I have never seen a military or commercial contract that requested Forth be used.
I worked for someone years ago who loved Forth solely for how much he could do in 'one' line of code. 
@techman-001 appears to be another.  Not a reason to invest time learning Forth.

And as far as Assembler goes, you would have to be doing something very, very critical NOT to use a higher-level language, even for today's microcontrollers. 
The customer will always opt for the 'readability' and 'long term support' of a high level language. 
I have seen 5000+ line Assembly programs.  It is NOT going to happen any more.


Unless it is for a home project.  Then have at whatever suits your fancy.
If you are looking for a career, stick with the main stream languages.
Title: Re: Too many programming languages?
Post by: Berni on September 24, 2019, 05:58:34 am
As others have said: C.

Not only is it a popular language that is similar in syntax to a lot of others, but it also teaches you a lot of low level things about how computers work, yet is high level enough to be useful for large projects.
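For example, here is a minimal taste of those low-level things (an illustrative sketch; the address shown is the GPIOA output register on an STM32F0-class part, so check the reference manual for your chip):

Code: [Select]
#include <stdint.h>

/* In embedded C, memory-mapped peripheral registers are just pointers. */
#define GPIOA_ODR (*(volatile uint32_t *)0x48000014u)  /* GPIOA output data register */

void led_toggle(void)
{
    GPIOA_ODR ^= (1u << 5);  /* flip the output pin driving the LED */
}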

If you are going to be programming modern microcontrollers you are going to be using C/C++ because this is what everyone else uses.

If you are going to be programming for PCs then the choice is a lot broader, as the resource hunger of bigger languages is not a problem given how powerful PCs are. There you might be looking at C#, Java, Python... And if you work with web applications, then also PHP, JavaScript, Ruby...
Title: Re: Too many programming languages?
Post by: tggzzz on September 24, 2019, 07:55:10 am
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert in all.

The thing is, as an electronics engineer, which one should I learn most!?

You should learn the concepts, advantages and disadvantages of one example of each type of programming language. That will enable you to choose the right tool for the job, just as you should know whether to use screws, or nails, or bolts. Once you know the concepts, picking up the next language of that type is simple.

So, what types are there?

Procedural: C is the best choice.

Object Oriented: Java is the best choice; C# is a me-too version of Java, and C++ is an abortionate mess where knowledgeable people spend a lot of time arguing over what ought to happen. Neither Java nor C# is for embedded applications.

FPGA: either Verilog or VHDL.

Multicore/multithreaded: xC for embedded, since that has the only decent theoretical and practical pedigree - and its concepts appear in more mainstream languages.

Statistics: I'll leave others to comment.

General purpose "get answers soonest": Python.

System simulation: that entirely depends on the application domain, and you won't have much choice.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on September 24, 2019, 08:31:20 am
 :popcorn:
Title: Re: Too many programming languages?
Post by: techman-001 on September 24, 2019, 10:08:53 am
I'm sorry.  But I have never seen a military or commercial contract that requested Forth be used.
I worked for someone years ago who loved Forth solely for how much he could do in 'one' line of code. 
@techman-001 appears to be another.  Not a reason to invest time learning Forth.

And as far as Assembler goes, you would have to be doing something very, very critical NOT to use a higher-level language, even for today's microcontrollers. 
The customer will always opt for the 'readability' and 'long term support' of a high level language. 
I have seen 5000+ line Assembly programs.  It is NOT going to happen any more.


Unless it is for a home project.  Then have at whatever suits your fancy.
If you are looking for a career, stick with the main stream languages.


There is no need to be sorry; the OP didn't mention "military or commercial contracts" or even a career, this is all you. I do note that you have left out "space". This is a list of 64 space/military projects that featured Forth:  https://www.forth.com/resources/space-applications/ (https://www.forth.com/resources/space-applications/)

The OP  asked "The thing is, as an electronics engineer, which one should I learn most!?".

I think you need to start a new topic if you want to argue against Forth.
Title: Re: Too many programming languages?
Post by: Howardlong on September 24, 2019, 10:45:36 am
I'm sorry.  But I have never seen a military or commercial contract that requested Forth be used.
I worked for someone years ago who loved Forth solely for how much he could do in 'one' line of code. 
@techman-001 appears to be another.  Not a reason to invest time learning Forth.

And as far as Assembler goes, you would have to be doing something very, very critical NOT to use a higher-level language, even for today's microcontrollers. 
The customer will always opt for the 'readability' and 'long term support' of a high level language. 
I have seen 5000+ line Assembly programs.  It is NOT going to happen any more.


Unless it is for a home project.  Then have at whatever suits your fancy.
If you are looking for a career, stick with the main stream languages.


There is no need to be sorry, the OP didn't mention "military or commercial contracts" or even career, this is all you. I do note that you have left out "space". This is a list of 64 space/military projects that featured Forth.  https://www.forth.com/resources/space-applications/ (https://www.forth.com/resources/space-applications/)

The OP  asked "The thing is, as an electronics engineer, which one should I learn most!?".

I think you need to start a new topic if you want to argue against Forth.

Forth (and derivatives) has definitely been used extensively in space applications in the last century, but I'm not sure how much nowadays. Key was that it allowed for compiled, high-performance code in space- and performance-constrained systems, at a time when C was relatively young and lacked cross-compilation tools for such limited systems.

Me, I haven't touched Forth with a vengeance since the 1980s, although around 1999/2000 I had some cursory involvement in an aerospace project that used it because it had a proven heritage in space. Re-use of apparently ancient but proven hardware and software designs and parts happens quite extensively in aerospace as a means of risk mitigation.
Title: Re: Too many programming languages?
Post by: techman-001 on September 24, 2019, 11:05:38 am

Forth (and derivatives) has definitely been used extensively in space applications in the last century, but I'm not sure how much nowadays. Key was that it allowed for compiled, high-performance code in space- and performance-constrained systems, at a time when C was relatively young and lacked cross-compilation tools for such limited systems.

Me, I haven't touched Forth with a vengeance since the 1980s, although around 1999/2000 I had some cursory involvement in an aerospace project that used it because it had a proven heritage in space. Re-use of apparently ancient but proven hardware and software designs and parts happens quite extensively in aerospace as a means of risk mitigation.

You're absolutely right of course, spacecraft do demand proven systems, and because Forth was there *first* it has the oldest proven systems. The cost of spacecraft failure is always astronomical  ;)

It's all well and good to have the latest "mega libraried, C++ syntax sensitive, self healing, quantum phase locked Arduino Meravigliarsi" but if it's not radiation hardened and spaceflight proven, who will risk a billion dollar spacecraft on it ?

There were eight Forth MCUs on the 100 kg Philae lander that touched down on Comet 67P/Churyumov-Gerasimenko in 2014, though it was launched in 2004.


Title: Re: Too many programming languages?
Post by: SiliconWizard on September 24, 2019, 03:24:20 pm
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!
(...)
The thing is, as an electronics engineer, which one should I learn most!?

I suppose you mean embedded software dev, if you strictly talk about "electronics engineer" work, and even mostly dev on low-level targets such as MCUs or DSPs. (Of course an EE could be developing software for higher-level targets, but that's kinda off-topic. How much electronics is there left in that? You could also be an EE and be baking bread.)

So, with this in mind, there are actually NOT too many prog languages around for serious work in real industrial environments, as many have already pointed out. So no need to be overwhelmed, really. If you're feeling overwhelmed here, it's probably just because you're listening too hard to the hype, and not to what gets work done in real professional life.

 :horse:
Title: Re: Too many programming languages?
Post by: legacy on September 24, 2019, 03:39:47 pm
Statistics: I'll leave others to comment.

"R" (yup, "the R language", so it's called) and Matlab :D

p.s.
Mission-critical network applications and databases: Erlang!
Title: Re: Too many programming languages?
Post by: legacy on September 24, 2019, 03:41:37 pm
Embedded games: eLua!
Title: Re: Too many programming languages?
Post by: Nusa on September 24, 2019, 06:52:27 pm
There is no obligation to learn all the languages. The important thing is that you have a solid foundation in the concepts and principles of programming. When you do need to become competent in another language, much knowledge is transferable from languages that you do know. The bulk of programming expertise is in knowing how to plan and solve the problem in front of you. The language is merely a tool to express your solution.

Similarly, if you're designing circuits, you don't need to know all the CAD packages. One is sufficient, until you have some overriding need to master another one. And that's just details, you already know the principles of making schematics.
Title: Re: Too many programming languages?
Post by: rstofer on September 24, 2019, 08:37:14 pm
In my view, the ordering is C, MATLAB and Fortran.  C++ and Java are at the bottom of the pile.

If you're going to design uC systems, C is the main language.  There may be some C++ around the edges so you might need to understand that version as well.

If you're going to do ENGINEERING, with numbers, charts and graphs you will need MATLAB.  There isn't much you can't do with MATLAB including interfacing with an Arduino (and the idea is workable on other processors).  System simulation is also a feature.  I REALLY like MATLAB.

Fortran - some days you just want to play with numbers.  There's a reason that a lot of scientific work is still done in Fortran, more than 60 years since it was introduced.  If you want to play with formulas, it is a great way to get it done.  There's an enormous code base written in Fortran.

Title: Re: Too many programming languages?
Post by: bjdhjy888 on September 25, 2019, 12:30:40 am
haha, MATLAB is indeed very sexy. I'm learning it.
wait, ain't it a product with commands? but not a programming language?
 :-DD
Title: Re: Too many programming languages?
Post by: bjdhjy888 on September 25, 2019, 12:33:55 am
My purposes of learning programming languages:

1. To make my STM32 IMU project.
          - Schematic and PCB layout done.
          - Parts soldered
          - Troubleshooting its I2C to read RX data on my terminal window  :palm:

2. To make a game like Diablo III.
          - Do I learn C, C++ to make it? I know, it's a huge project.  :P

Title: Re: Too many programming languages?
Post by: Fungus on September 25, 2019, 01:01:46 am
For MCUs you are basically limited to C (8-bit) and C/C++ (ARM) for any serious use.

Arduinos are programmed in C++, and they're 8-bit.  :popcorn:

Title: Re: Too many programming languages?
Post by: westfw on September 25, 2019, 01:22:12 am
Quote
Multicore/multithreaded: xC for embedded
Maybe still Fortran, if you want fancy math parallelized :-(
Be aware that for many languages, the subset of features actually used is quite small, but the knowledge you'll need is more about standard and commercial libraries than the language itself (and I'm talking about things like Qt and BeautifulSoup, not "STL" or "Posix"). Being able to program a cell-phone app is all about understanding the services available and the libraries to access them, rather than Swift vs Java or whatever (at least, as far as I've been able to tell). Being able to program deep embedded applications is more about understanding bare metal and critical timing, and being able to program complex physics is about understanding the math.
Title: Re: Too many programming languages?
Post by: westfw on September 25, 2019, 01:27:23 am
Quote
[what do I need to]2. To make a game like Diablo III.
          - Do I learn C, C++ to make it? I know, it's a huge project.
Somewhere I have an email of advice from John Carmack (Doom, Quake, etc.) about getting into game development. Essentially, he said: "develop your story telling and world development skills.  For every game engine developer, we need hundreds of people who can create an engaging story."  :-)
Title: Re: Too many programming languages?
Post by: westfw on September 25, 2019, 01:42:56 am
Quote
"develop your story telling and world development skills.  For every game engine developer, we need hundreds of people who can create an engaging story."

OK, I dug it up, and that's not quite what he actually said.  Here's the whole conversation... (Note that DeVry has been heavily criticized for assorted "bad behavior" since this conversation.)

Quote
Date: Wed, 02 Feb 2005 18:31:33 -0600
To: William "Chops" Westfield <billw@xxxxx>
From: John Carmack <johnc@xxxxxx>
Subject: Re: [xxxxx: Help with son's college choices]

At 03:35 PM 2/2/2005 -0800, William "Chops" Westfield wrote:
>I sorta feel like I know you well enough via the arocket mailing list
>to at least suspect that you might be willing to send your thoughts on
>the following issue raised by a co-worker.  I'm likely to be in a similar
>position myself, but not for another 8 years or so (whew!)  If not, feel
>free to ignore this message (or, if you have a canned answer, a pointer
>to that would be fine too.)
>
>Thanks
>Bill Westfield
>
>   To: parents@xxxxxx
>   From: xxxxxxx
>   Subject: Help with son's college choices
>   Date: Wed, 02 Feb 2005 17:56:11 GMT
>
>   I have a son who is a senior in high school. And he is an
>   avid videogame player. He does very well in school. He's
>   taken all of the Advanced Placement math and science classes
>   that his school offers and he gets all A's and B's. When we
>   discuss what he is going to take in college, he says that he
>   wants to be a "video game developer". I don't think that he
>   even knows what is involved in video game development. He
>   found a school in the Mid-West that offers a four-year-degree
>   in video-game development. The tuition is $40,000. He wanted
>   to know if I would I would pay for that. <yeah right!> I've
>   suggested that it would be more realistic to major in
>   Electrical Engineering or Computer Science.
>
>   Yesterday some representatives from the DeVry Institute in
>   Fremont visited his school. They offer a "certificate" in
>   "Video Game Technology".
>
>   Now he is all excited about going there. My question is how
>   do I get him to be more realistic about his future without
>   crushing his spirit and dreams? Has anyone else faced a
>   similar problem?

I'm not a huge proponent of the college game development programs.

The type of, or even existence of, a college degree doesn't play a huge
role in hiring in the game development business, but it may be a lot more
important if he finds out that game development isn't what he thought it
would be as a career, and wants to do something else.  I would agree that
pursuing an EE or CS degree is probably a better choice if he is going to
go through college.

The best way to get hired in the game industry is to develop free "game
mods" that leverage an existing commercial game to showcase the applicants
particular talents without having to develop everything from scratch.  The
developers of popular game mods can usually get a job pretty easily, and
full amateur teams are often "promoted" to real companies with expansion
pack development contracts.  Of course, the vast majority of amateur mod
projects collapse before producing anything worthy of showing to a
commercial developer because the process is a lot harder than it looks at
first glance.

As with just about anything, the way to be really successful is to make
yourself really valuable.  College can lay a foundation, but most of the
value must be self taught.  I didn't go through college, but I do tell
people that it can be a very information rich environment to do a lot of
your early learning in.

John Carmack

Title: Re: Too many programming languages?
Post by: rstofer on September 25, 2019, 03:09:28 am
haha, MATLAB is indeed very sexy. I'm learning it.
wait, ain't it a product with commands? but not a programming language?
 :-DD

Started working with scripts?
Title: Re: Too many programming languages?
Post by: RoGeorge on September 25, 2019, 04:05:29 am
Matlab is commercial and proprietary software.  A free and open-source alternative is Octave.  Most Matlab scripts can be run in Octave, too.

Recently, there has been a trend toward using Python instead of Matlab or Octave.

Another recent trend, with a slightly different concept, is Jupyter Notebook: https://jupyter.org/
Title: Re: Too many programming languages?
Post by: rstofer on September 25, 2019, 04:17:20 am
I think Octave lacks Simulink and that’s a really big deal.

MATLAB Personal Edition is just $149.
Title: Re: Too many programming languages?
Post by: brucehoult on September 25, 2019, 04:19:27 am
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was. What I've needed a degree for (30 or 35 years out of date) has been to persuade government bureaucrats to give me a visa to work in their fine country in a job I've already been offered (e.g. Russia and the USA, both in the last five years, while already over 50 years old).

A sufficiently motivated student can learn everything they need by peeking at the syllabus of some top university and learning the stuff they think you should know. And do lots and lots of programming. Your own projects. University assignments you find online. Help out with interesting projects on github (especially ones where salaried people are contributing on company time).

Not many people are sufficiently motivated to do this.

I'd argue that those not sufficiently motivated are not going to do very well once they leave university with a shiny new degree anyway.

It's different if you want to work in a field where you need to be a registered engineer, of course.
Title: Re: Too many programming languages?
Post by: Berni on September 25, 2019, 04:45:09 am
Quote
"develop your story telling and world development skills.  For every game engine developer, we need hundreds of people who can create an engaging story."

OK, I dug it up, and that's not quite what he actually said.  Here's the whole conversation... (Note that DeVry has been heavily criticized for assorted "bad behavior" since this conversation.)

Quote
Date: Wed, 02 Feb 2005 18:31:33 -0600
To: William "Chops" Westfield <billw@xxxxx>
From: John Carmack <johnc@xxxxxx>
Subject: Re: [xxxxx: Help with son's college choices]

At 03:35 PM 2/2/2005 -0800, William "Chops" Westfield wrote:
>I sorta feel like I know you well enough via the arocket mailing list
...

This is especially true nowadays, when game engine technology is so modular that a single programmer can put together quite an impressive game (Unity, Unreal Engine, etc.), but creating game content is becoming an ever bigger job because of all the art assets. Art assets are no longer 8x8 pixel CGA color sprites; they are heavily detailed 3D models now. This is actually becoming a problem for the game industry: big flagship game titles are becoming too expensive to make.

Also, working as a programmer for a game development studio is one of the worst programming jobs you can find. You will be pulling all-nighters all the time when the release date comes up, management will pile on impossible due dates, feature creep will attack you from every direction... it's very hard work indeed. And if a big ambitious game flops really badly it can sink the whole company.

I personally like programming but I would NOT want to do it as a job. It's fun when you are making little tools on your own, but it can be hell when you have to work on big complex projects that ended up a tangled spaghetti mess because of all the legacy crap in there that nobody in the whole company understands, although everyone is pretty sure things would break badly if you tried to remove it (and they do). So I'm instead an electronics engineer who still gets to program here and there, because everything has a CPU in it these days and it's really useful to be able to make your own tools for automating dull tasks.
Title: Re: Too many programming languages?
Post by: legacy on September 25, 2019, 05:35:29 am
Arduino ... just don't write code like this (http://www.ulisp.com/list?2RMK)  :palm: :palm: :palm:
Title: Re: Too many programming languages?
Post by: Berni on September 25, 2019, 05:58:02 am
Arduino ... just don't write code like this (http://www.ulisp.com/list?2RMK)  :palm: :palm: :palm:

Well, huge source files are not a bad thing, tho they are annoying to work on sometimes.

For example one of the best FAT32 libraries for MCUs:
http://elm-chan.org/fsw/ff/00index_e.html (http://elm-chan.org/fsw/ff/00index_e.html)
Almost all of it is inside ff.c, which is 6500 lines long.

Or one of the simplest multi platform OpenGL libraries:
https://github.com/raysan5/raylib/blob/master/src/core.c (https://github.com/raysan5/raylib/blob/master/src/core.c)
Tho this is only one C source file out of about 7 other smaller ones (still in the thousands of lines).

But yeah, being able to write large pieces of code without them ending up an unreadable mess is one of the important skills of a good programmer. Code is for humans to read too, not just for computers.
Title: Re: Too many programming languages?
Post by: Nusa on September 25, 2019, 06:24:35 am
Arduino ... just don't write code like this (http://www.ulisp.com/list?2RMK)  :palm: :palm: :palm:

I've seen worse code and fewer comments. But as you say, there's plenty of room for improvement besides the clear assumption that the reader already knows LISP.
Title: Re: Too many programming languages?
Post by: RoGeorge on September 25, 2019, 06:40:43 am
Arduino is not a programming language.

Degrees are pretty much worthless.

Wrong.  Very wrong.  Don't fall for it.  One should do as much school as one can, or can afford to do.
Title: Re: Too many programming languages?
Post by: bjdhjy888 on September 25, 2019, 06:43:39 am
Arduino = toy
 :-DD
 :wtf:
Title: Re: Too many programming languages?
Post by: westfw on September 25, 2019, 06:49:57 am
Code written by Rocket Scientists, in a language that's supposed to make errors harder to write and encourage good programming.

Code: [Select]
L_M_BV_32 := TBD.T_ENTIER_32S ((1.0/C_M_LSB_BV) * G_M_INFO_DERIVE(T_ALG.E_BV));

if L_M_BV_32 > 32767 then
    P_M_DERIVE(T_ALG.E_BV) := 16#7FFF#;
elsif L_M_BV_32 < -32768 then
    P_M_DERIVE(T_ALG.E_BV) := 16#8000#;
else
    P_M_DERIVE(T_ALG.E_BV) := UC_16S_EN_16NS(TDB.T_ENTIER_16S(L_M_BV_32));
end if;

P_M_DERIVE(T_ALG.E_BH) :=
  UC_16S_EN_16NS (TDB.T_ENTIER_16S ((1.0/C_M_LSB_BH) * G_M_INFO_DERIVE(T_ALG.E_BH)));
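For anyone who doesn't read Ada, here is roughly what the first part does, sketched in C (names are invented for illustration): scale a value, then saturate it into 16-bit range.

Code: [Select]
#include <stdint.h>

/* Illustrative sketch only, with hypothetical names. */
static uint16_t scale_and_clamp(double value, double lsb)
{
    int32_t v = (int32_t)(value / lsb);   /* like L_M_BV_32 */
    if (v > 32767)  return 0x7FFF;        /* saturate high */
    if (v < -32768) return 0x8000;        /* saturate low */
    return (uint16_t)(int16_t)v;          /* in range: reinterpret as unsigned */
}

Note that the final E_BH assignment in the Ada snippet above does the conversion with no such range check.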

(Does anyone know where I can find longer snippets from this (the Ariane 5) code?)

(Worse, what passes for "good programming" changes over the years.  For example, I find it a bit mind-boggling that "strongly-typed" and "dynamically-typed" languages are both considered valid things.  More profound is the importance of efficiency, or the lack thereof.  I'm old enough to really cringe at "CircuitPython on a 32-bit CPU with 256k of code memory and 4MB of solid-state storage", but... it was cheap enough and fast enough and it fit, so why be upset?)
Title: Re: Too many programming languages?
Post by: Kjelt on September 25, 2019, 06:57:28 am
As most experienced embedded programmers with an EE education will tell you: C if you want to do embedded programming.

Then you often have to work with colleagues who have an informatics background, you have the HSI to test, and as an EE you need to use the API or execute test routines.
For that it was Lua (a scripting language, not really a programming language in my book), but more and more I see Python used for this. So I would recommend Python for this.

If you have gained knowledge in C and get a career in software programming rather than EE design, you will want to learn about OO thought patterns and look at C++, but that is for later.

So C and Python.

--------------------------------- ANTI TROLL MESSAGE ------------------------------
I don't know from which planet Mr Forth came or which rock he crawled out from under, but in 25 years and 5 companies of experience as an embedded software engineer with an EE education I never had to use it.
It is also not mentioned in any top-10 list of embedded programming languages, so I just label it as trolling and personal preference.

https://www.fossmint.com/programming-languages-for-embedded-systems/ (https://www.fossmint.com/programming-languages-for-embedded-systems/)

Title: Re: Too many programming languages?
Post by: RoGeorge on September 25, 2019, 07:02:09 am
Arduino = toy
 :-DD
 :wtf:

Just to be clear, Arduino uses the GNU C/C++ compiler and a very light IDE, so Arduino is C/C++.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 07:19:12 am
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and experience. In interviews they, and I when I have been an interviewer, have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway decent degree.

The one exception turned out to be a company that produced and sold really shit software. Their unit tests executed code but didn't actually test anything, and whenever they encountered a NullPointerException, the response was the classic, wait for it, ...

Code: [Select]
try {
  // code where an NPE occurred
} catch (NullPointerException e) {
  // ignore it
}

Summary: having and using a decent degree was necessary for my career, but not sufficient.
Title: Re: Too many programming languages?
Post by: legacy on September 25, 2019, 07:41:30 am
To be pragmatic: no doubt my degree is just "a piece of toilet paper", but it has just saved me 20% off a computer programming course related to the Erlang language (1).

See it this way: your degree can open some doors, or can get you some discount on opening doors  :D



(1) Only if you are willing to spend two months near the Arctic ... emm ... during winter, when it's coldest.
Anyway, legal details ...
Title: Re: Too many programming languages?
Post by: Berni on September 25, 2019, 09:29:08 am
I'd say a degree is useful, but not the whole story.

Just because you have a degree does not mean you know how to program well. That comes from working on projects in your own time to gain experience. If you can't come up with enough personal projects, have a look at other people's work on places like GitHub. If you find their project interesting, don't be shy about making improvements and submitting a pull request; the project author will be happy that you helped the project along. It will also teach you good coding style, since the owner of the git repo will usually not accept shitty code to be merged in.

Perhaps not having a degree makes you fall through an early sorting pass in some interviews, but not all of them. Once you get the interview, talk about the personal projects you worked on, show that you have experience in getting stuff done, and they will likely take you.

If you can get a degree, get one; if not, don't worry about it. It's not worth putting yourself in tens of thousands of dollars of debt for one.
Title: Re: Too many programming languages?
Post by: SparkyFX on September 25, 2019, 10:23:39 am
I think one should master one specific programming language, rather than being an expert in all.
Don't be an "I only learnt xyz" employee. The profession requires you to learn new things constantly. Once you master one language well enough to get stuff working as a one-man show, the differences in syntax and concept to the next language are not that hard to understand (of course there are outliers and completely different concepts). You need a lot of time to master even one, but it helps to keep an eye open for what others can do. The point is: functional languages do not differ much among themselves, and object-oriented languages do not differ much among themselves; once the concepts are understood and you can work safely with them, that is usually quite sufficient, and the rest is defined by time spent in each area.

Interpreter languages especially, with their various frameworks, can be written in so many different ways that each framework becomes almost a language of its own, so it is hard to say anyone ever masters all of that.

20 years ago the question was which license most employers bought and then you'd be stuck with that.

Quote
The thing is, as an electronics engineer, which one should I learn most!?
- Assembler/VHDL/Verilog to understand the hard basics and solve problems in real-time environments
- A high-level language to understand the abstractions over Assembler and be more productive (functionality implemented per unit of time)
- An interpreter language to make tasks comfortable, work with huge amounts of data, and convert data between various formats and such
Title: Re: Too many programming languages?
Post by: techman-001 on September 25, 2019, 10:43:06 am
--------------------------------- ANTI TROLL MESSAGE ------------------------------
I don't know from which planet Mr Forth came or which rock he crawled out from under, but in 25 years and 5 companies of experience as an embedded software engineer with an EE education I never had to use it.
It is also not mentioned in any top-10 list of embedded programming languages, so I just label it as trolling and personal preference.

https://www.fossmint.com/programming-languages-for-embedded-systems/ (https://www.fossmint.com/programming-languages-for-embedded-systems/)

trolling noun (INSULTING)
the act of leaving an insulting message on the internet in order to annoy someone:
https://dictionary.cambridge.org/dictionary/english/trolling (https://dictionary.cambridge.org/dictionary/english/trolling)

Based on the definition of 'trolling' by dictionary.cambridge.org, I think the troll here is you.



Title: Re: Too many programming languages?
Post by: Kjelt on September 25, 2019, 11:11:38 am
Well, another definition of trolling is: the act of trying to catch fish by pulling a baited line through the water behind a boat.

You keep going on and on about Forth; no one is biting.
It is an interesting but not very popular language, not in the top 10 of embedded languages (perhaps somewhere around rank 25), and it is not worth mentioning to someone serious about learning one or two relevant embedded software languages.
Title: Re: Too many programming languages?
Post by: RoGeorge on September 25, 2019, 11:33:24 am
VHDL and Verilog are not programming languages; they are for describing digital schematics.

HDLs (Hardware Description Languages) do not execute line by line like a programming language, and they use different concepts than programming languages.  HDLs describe a digital schematic (logic gates, clocks, flip-flops, etc.), not an algorithm.  In a digital schematic everything runs at once, in parallel.  There is no order of execution that starts from the top of the first code line and proceeds to the bottom line, as there is in programming languages.

It is just a coincidence that HDLs are made out of text lines, similar to the source code of a program; they are very different and hard to learn.  Leave HDLs aside at the beginning.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 11:42:09 am
VHDL and Verilog are not programming languages; they are for describing digital schematics.

Ah, the Humpty Dumpty approach to "programming language", viz
Quote
“When I use a word,' Humpty Dumpty said in rather a scornful tone, 'it means just what I choose it to mean — neither more nor less.' 'The question is,' said Alice, 'whether you can make words mean so many different things.' 'The question is,' said Humpty Dumpty, 'which is to be master — that's all.”

You need to understand the difference between "structural" and "behavioural" HDL code.


Quote
HDLs (Hardware Description Languages) do not execute line by line, like a programming language.  They are very different and hard to learn.  Leave HDLs aside at the beginning.

HDLs are not hard. They are very different; that's precisely why learning them is beneficial.

The concepts and mentality embodied in HDLs are becoming more and more important in a multicore/multiprocessor/distributed world. It is vital that any electronic and embedded engineer understands FSMs, and the limits of what can and cannot be known in a world based on distributed computation.
Title: Re: Too many programming languages?
Post by: dave j on September 25, 2019, 11:50:32 am
Well, another definition of trolling is: the act of trying to catch fish by pulling a baited line through the water behind a boat.

You keep going on and on about Forth; no one is biting.
It is an interesting but not very popular language, not in the top 10 of embedded languages (perhaps somewhere around rank 25), and it is not worth mentioning to someone serious about learning one or two relevant embedded software languages.
I think it's more extreme fanboyism/evangelical zeal than trolling.

Suggesting Forth as the language to learn if someone wants a career in embedded development is like suggesting Smalltalk to someone who wants to work in financial services. Sure, some places use it, but you'll be severely limiting the number of companies who'd want to employ you.

Learn a mainstream language (C for embedded) so you've got something you can fall back on before you experiment with obscure stuff with limited job opportunities.
Title: Re: Too many programming languages?
Post by: techman-001 on September 25, 2019, 12:00:49 pm
Well, another definition of trolling is: the act of trying to catch fish by pulling a baited line through the water behind a boat.

You keep going on and on about Forth; no one is biting.
It is an interesting but not very popular language, not in the top 10 of embedded languages (perhaps somewhere around rank 25), and it is not worth mentioning to someone serious about learning one or two relevant embedded software languages.

It's a pity you seem unable to remember the OP's posts. I have reproduced them in full so you can see the OP does not mention "top 10 of embedded languages", "ranking", "relevant" or "serious". These are your interpretations of the OP's posts.

Forth is a good choice for an STM32 IMU, *especially* for troubleshooting I2C.

You could demonstrate technical reasons why you believe Forth is a bad choice and whatever you're advocating is a good one, as the OP may appreciate the information in his decision-making process. The OP may not be impressed with your mantra of "popular", "top-10 ranking from an Internet site" and so on.

At least you're not pushing Arduino.

OP Original:
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!
I think one should master one specific programming language, rather than being an expert in all.
The thing is, as an electronics engineer, which one should I learn most!?
OP Followup:
My purposes of learning programming languages:
1. To make my STM32 IMU project.
          - Schematic and PCB layout done.
          - Parts soldered
          - Troubleshooting its I2C to read RX data on my terminal window  :palm:

2. To make a game like Diablo III.
          - Do I learn C, C++ to make it? I know, it's a huge project. 
Title: Re: Too many programming languages?
Post by: Berni on September 25, 2019, 12:03:40 pm
Yep HDLs are not something to go into right away.

They are fundamentally different from normal programming languages. They compile into a circuit rather than a series of instructions. There is not a whole lot to be learned from them if you are writing programs to run on CPUs, but they do take a good bit of effort to properly understand because of how different they are.

Unless you are working with FPGAs and other programmable logic, there is very little reason to know an HDL at all. It is only useful for describing large digital circuits; for smaller, simpler digital circuits, regular schematics work better anyway. FPGAs in general are a more "late game" sort of thing in electronics engineering, as you can do a lot of things without using them. The only reason you tend to use an FPGA in a project is because you ran out of all the other options. They are expensive and annoying to develop for.

So if you ask me, only learn Verilog or VHDL if you have no other choice.
Title: Re: Too many programming languages?
Post by: techman-001 on September 25, 2019, 12:04:22 pm

... Suggesting Forth as the language to learn if someone wants a career in embedded development  ...

Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development"?
Title: Re: Too many programming languages?
Post by: dave j on September 25, 2019, 12:19:22 pm
Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development"?
OK, my bad.

Forth isn't the best language for a new programmer, though, simply because there is far less in the way of libraries, examples, tutorials, and other people who know it who could offer advice, etc. That applies whether the OP wants to do embedded or game development.
Title: Re: Too many programming languages?
Post by: Kjelt on September 25, 2019, 12:22:05 pm
Well, yes, my interpretation was that the OP would use it in his professional career, perhaps not as an embedded software engineer but, as he stated, as an electrical engineer.

And the answer to his question:
The thing is, as an electronics engineer, which one should I learn most!?

can only be a language that is most used in his field of work, used by most companies in his line of work,
and then it does not matter if you are in love with some obscure other language; the only relevant and to-the-point answer can be: C.

And I stop this discussion with you now, it is useless.
Let other engineers answer the OP's question.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 12:24:10 pm
Yep HDLs are not something to go into right away.

They are fundamentally different than normal programing languages.

Correct.

Quote
They compile into a circuit rather than a series of instructions.

Not true.

Importantly, only a subset can be compiled into a circuit.

Non-synthesisable constructs are often used when writing behavioural test suites. One overly simple example of the benefit is to define an adder circuit in terms of structurally interconnected gates, and the test suite in terms of "expectedOutput = inA + inB", and then "if output != expectedOutput, log the failure".

Quote
Not a whole lot is to be learned from it if you are programing programs to be run on CPUs, but does take a good bit of effort to properly understand due to how different it is.

FSMs are very beneficial in embedded systems, whether they are implemented in hardware or software. Too few softies understand FSMs, thinking they are "something to do with compilers"!
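To make that concrete, here is a tiny software FSM in C (a minimal illustrative sketch, names invented): a button debouncer with every state and transition explicit.

Code: [Select]
#include <stdbool.h>

typedef enum { RELEASED, MAYBE_PRESSED, PRESSED, MAYBE_RELEASED } state_t;

/* Call at a fixed tick rate with the raw pin level; returns the debounced level. */
bool debounce(bool raw)
{
    static state_t state = RELEASED;
    switch (state) {
    case RELEASED:       if (raw)  state = MAYBE_PRESSED;  break;
    case MAYBE_PRESSED:  state = raw ? PRESSED : RELEASED; break;
    case PRESSED:        if (!raw) state = MAYBE_RELEASED; break;
    case MAYBE_RELEASED: state = raw ? PRESSED : RELEASED; break;
    }
    return state == PRESSED || state == MAYBE_RELEASED;
}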

Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor devices (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 12:27:33 pm
Well, yes, my interpretation was that the OP would use it in his professional career, perhaps not as an embedded software engineer but, as he stated, as an electrical engineer.

And the answer to his question:
The thing is, as an electronics engineer, which one should I learn most!?

can only be a language that is most used in his field of work, used by most companies in his line of work,
and then it does not matter if you are in love with some obscure other language; the only relevant and to-the-point answer can be: C.

And I stop this discussion with you now, it is useless.
Let other engineers answer the OP's question.

That is a very blinkered attitude, and one that will probably do the OP a disservice in the long term.

The world is continually changing. Those that aren't used to changing will become irrelevant.

Those that choose an inappropriate tool, or use a tool inappropriately, will become irrelevant as more competent people eat their lunch.
Title: Re: Too many programming languages?
Post by: nfmax on September 25, 2019, 12:30:14 pm
HDL's are examples (pretty much) of declarative programming languages. These work by defining what needs to be done, not by explicitly stating a sequence of steps to be taken. There are declarative programming languages which do not compile into a circuit, but instead run on a normal computer: Prolog is one such.
Title: Re: Too many programming languages?
Post by: techman-001 on September 25, 2019, 12:44:52 pm
Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development"?
OK, my bad.

Forth isn't the best language for a new programmer, though, simply because there is far less in the way of libraries, examples, tutorials, and other people who know it who could offer advice, etc. That applies whether the OP wants to do embedded or game development.

It's perfectly natural to read into the OP's post the kinds of scenarios that we are familiar with; it happens all the time.

As for "Forth not being the best language to learn embedded on", are you speaking from experience, or quoting the usual opinions online?

Very experienced posters here have said "learn the bare metal" again and again in other posts, and specifically here in reply to the OP, and I quote:
"Being able to program deep embedded applications is more about understanding bare metal and critical timing"

Libraries are irrelevant to someone *really* learning embedded. What is important is the manufacturer's databook and something that allows real-time interactivity with the hardware (Forth is perfect for this) to obtain hands-on experience and confidence in the datasheets. An oscilloscope and/or logic analyzer is also very important.
To really LEARN deep embedded you MUST write your own low-level code for all the peripherals you plan on using, because these are essential exercises on the path to deep understanding.

Libraries are critical to Arduino, Python and Lua 'makers' who are after a fast LED show or a cat-door opener based on some cheap Chinese pre-made board. These users couldn't care less about the intricacies of bare-metal embedded design.

If you hate 2000-page databooks and love libraries, I believe that deep embedded is not for you and you will probably hate it. A career in embedded isn't for you, in my opinion.

I have zero clue about game development, as I'm an electronics technician, so I'm not qualified to offer any opinions there and will refrain from doing so.

Title: Re: Too many programming languages?
Post by: RoGeorge on September 25, 2019, 01:01:28 pm
Ah, the Humpty Dumpty approach to "programming language", viz
Quote
“When I use a word,' Humpty Dumpty said in rather a scornful tone, 'it means just what I choose it to mean — neither more nor less.' 'The question is,' said Alice, 'whether you can make words mean so many different things.' 'The question is,' said Humpty Dumpty, 'which is to be master — that's all.”

You need to understand the difference between "structural" and "behavioural" HDL code.

What did I do?  ;D
Title: Re: Too many programming languages?
Post by: Kjelt on September 25, 2019, 01:02:38 pm
That is a very blinkered attitude, and one that will probably do the OP a disservice in the long term.
The world is continually changing. Those that aren't used to changing will become irrelevant.
Of course you have to change and adapt; this is a practical answer to his question of which language he should learn.
I also learned multiple languages and use them now and then.

The same goes for natural languages: if you asked "I want to work as an electrical engineer in the Western world, which single language should I learn?", I would answer English.
In 10-15 years this might become Chinese, but if I said you should learn Chinese and you tried to get a job in the Western world without speaking any other language, you know how that finishes.

Quote
Those that choose an inappropriate tool, or use a tool inappropriately, will become irrelevant as more competent people eat their lunch.
So what language would you suggest to the OP as the single language to learn for an EE to do some embedded development?
Title: Re: Too many programming languages?
Post by: techman-001 on September 25, 2019, 01:06:28 pm
Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor devices (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Technology is moving so fast, it's a scramble to keep up!

I especially don't want to bore you with Forth, but we have a Forth chip called the GA144, which has been around since 2012 I believe. It has 144 independent native Forth computers on one chip and enables parallel or pipelined programming. The chip has no system clock and is available for $20 USD per pack of 10.

I admit that I have little clue how to develop on the GA144, but the father of Forth (Charles Moore) felt it was the way forward from single-MCU technologies at least 7 years ago.

For anyone interested in the technology: http://www.greenarraychips.com/home/products/index.html (http://www.greenarraychips.com/home/products/index.html) and a quality example heart rate monitor video developed with it: https://www.youtube.com/watch?v=1lcIn69umvU (https://www.youtube.com/watch?v=1lcIn69umvU)
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 01:16:59 pm
HDLs are examples (pretty much) of declarative programming languages.

A part of HDLs is declarative. Other parts are not, and are very similar to "conventional" languages.

Your statement is as valid as saying "C++ is a procedural language".
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 01:21:52 pm
Ah, the Humpty Dumpty approach to "programming language", viz
Quote
“When I use a word,” Humpty Dumpty said in rather a scornful tone, “it means just what I choose it to mean — neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” “The question is,” said Humpty Dumpty, “which is to be master — that's all.”

You need to understand the difference between "structural" and "behavioural" HDL code.

What did I do?  ;D

(1) you appear to not understand HDLs

(2) you snipped the context in which my statement was made. Here it is again, to refresh your mind and to allow others to follow the conversation:
VHDL and Verilog are not programming languages, those are for describing digital schematics.
In other words, you are using the term "programming language" in a limited sense, without bothering to indicate the limits.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 01:27:03 pm
That is a very blinkered attitude, and one that will probably do the OP a disservice in the long term.
The world is continually changing. Those that aren't used to changing will become irrelevant.
Of course you have to change and adapt; this is a practical answer to his question of which language he should learn.
I also learned multiple languages and use them now and then.

The same goes for natural languages: if you asked "I want to work as an electrical engineer in the Western world; which single language should I learn?", I would answer English.
In 10-15 years that might become Chinese, but if I told you to learn Chinese now and you tried to get a job in the Western world without speaking any other language, you know how that would end.

Quote
Those that choose an inappropriate tool, or use a tool inappropriately, will become irrelevant as more competent people eat their lunch.
So what language would you recommend to the OP as the single language to learn for an EE doing some embedded development?

Mu in the Buddhist sense, as popularised in "Gödel, Escher, Bach" and "Zen and the Art of Motorcycle Maintenance". https://en.wikipedia.org/wiki/Mu_(negative)#In_popular_culture

If you ask the wrong question, you won't get the right answer!
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 01:31:30 pm
Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor parts (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Technology is moving so fast, it's a scramble to keep up!

I especially don't want to bore you with Forth, but there is a Forth chip called the GA144, around since 2012 I believe. It has 144 independent native Forth computers on a chip, and it enables parallel or pipelined programming. The chip has no system clock and is available for US$20 per pack of 10.

I admit that I have little clue how to develop on the GA144, but the father of Forth (Charles Moore) felt it was the way forward from single-MCU technologies at least 7 years ago.

Quite; neither have I!

OTOH, there is a commercially important language and processor family based on key concepts from the 1970s (Hoare's CSP). The first processors and language were developed in the 80s; the modern variant is a delight to use and "just does what it says on the tin": xC and the xCORE processors from XMOS.

However, I wouldn't recommend that a beginner learns them, but a 30,000ft overview of their philosophy and capabilities is very useful to understand.
Title: Re: Too many programming languages?
Post by: dave j on September 25, 2019, 01:44:23 pm
Another imaginative poster putting words in the OP's mouth.

Please quote where the OP said "a career in embedded development"?
OK, my bad.

Forth isn't the best language for a new programmer though, simply because there is far less in the way of libraries, examples, tutorials, other people who know it who could offer advice, etc. That applies whether the OP wants to do embedded or game development.

It's perfectly natural to read into the OP's post the kind of scenarios that we are familiar with; it happens all the time.

As for "Forth not being the best language to learn embedded on", are you speaking from experience, or quoting the usual opinions online?
You criticized me earlier for "putting words in the OP's mouth". I would have thought you'd have avoided making the same mistake.


Quote
Very experienced posters here have said "learn the bare metal" again and again in other posts and specifically here in reply to the OP and I quote:
"Being able to program deep embedded applications is more about understanding bare metal and critical timing"

Libraries are irrelevant to someone *really* learning embedded. What is important is the manufacturer's databook and something that allows real-time interactivity with the hardware (Forth is perfect for this) to obtain hands-on experience and confidence in the datasheets. An oscilloscope and/or logic analyzer is also very important.
To really LEARN deep embedded you MUST write your own low-level code for all the peripherals you plan on using, because these are essential exercises on the path to deep understanding.

Libraries are critical to Arduino, Python, and Lua 'makers' who are after a quick LED show or a cat-door opener based on some cheap pre-made Chinese board. These users couldn't care less about the intricacies of bare-metal embedded design.

If you hate 2000-page databooks and love libraries, I believe deep embedded is not for you and you will probably hate it; in my opinion a career in embedded isn't for you.
Whilst understanding things from the databook upwards is essential to really understand embedded development, presenting someone who doesn't yet know how to program with a 2000-page MCU databook is just going to frustrate and discourage them. Frameworks and libraries provide a leg up early on, so you can get to grips with the programming without the distraction of understanding the complexities of an MCU. They can be discarded easily enough later.

Quote
I have zero clue about game development as I'm an electronics technician, so I'm not qualified to offer any opinions there and will refrain from doing so.
I've dabbled with graphics and game development as a hobby for nearly 40 years, so I know a bit about it. I'd advise the OP to focus on learning programming via game development rather than embedded. There is plenty of material available aimed at beginners, and drawing stuff on screen means you get lots of visual feedback to keep you interested - something that is lacking with embedded. You also just need your PC - you don't need to spend money on additional devices.
Title: Re: Too many programming languages?
Post by: SiliconWizard on September 25, 2019, 02:25:47 pm
Arduino ... just don't write code like this (http://www.ulisp.com/list?2RMK)  :palm: :palm: :palm:

Damn you. Made me want to puke! :-DD
Title: Re: Too many programming languages?
Post by: Berni on September 25, 2019, 04:13:33 pm

Not true.

Importantly, only a subset can be compiled into a circuit.

Non-synthesisable constructs are often used when writing behavioural test suites. One overly simple example of the benefit is to define an adder circuit in terms of structurally interconnected gates, and the test suite in terms of "expectedOutput = inA + inB", and then "if output != expectedOutput then log the failure".

Quote
Not a whole lot is to be learned from it if you are writing programs to be run on CPUs, but it does take a good bit of effort to properly understand due to how different it is.

FSMs are very beneficial in embedded systems, whether they are implemented in hardware or software. Too few softies understand FSMs, thinking they are "something to do with compilers"!

Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor parts (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Well yes, not everything can compile into a circuit. But these non-synthesisable features of an HDL are just an easily simulated adaptation of the language's core features, there to make it far easier to write testbenches. Having to use a different language to write the testbench would be annoying. Since simulators execute HDL code one line at a time, it was easy to support classical sequential execution, and that style of programming lends itself much better to describing complicated behaviour. Testbenches are such a key part of HDL development that the language reflects it (but the non-synthesisable features are still only a fraction of the language's total feature set).

My point is that HDL languages were designed from the beginning to look similar to circuits (much like C tries to work like a more readable, platform-independent assembler). The language itself calls things a "wire" or a "register" rather than a variable. Its variables can be tristate Z or undefined X, things that only make sense in circuits.

In fact, the tool vendors are so well aware that HDL is borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL. For that it just makes more sense to use a language built around classical sequential execution. But don't think all C code gets executed neatly in sequence, either. If you look at large DSPs, they have multiple ALUs and MACs while still having normal C compilers; the compiler is smart enough to recognise which operations don't depend on each other and to execute 5 or 10 things in parallel on the DSP core, joining back into regular sequential execution when the code requires it.
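As a toy illustration in plain C (nothing DSP-specific assumed), the four accumulators below don't depend on each other, so a compiler for a multi-MAC DSP is free to issue the four multiply-accumulates in the same cycle and only join them at the end:

Code: [Select]
/* Dot product written with four independent accumulators. None of the
   four multiply-accumulates in an iteration depends on another, so a
   compiler for a multi-MAC DSP can schedule them in parallel and only
   join the partial sums at the end. */
float dot(const float *a, const float *b, int n)
{
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    int i;

    for (i = 0; i + 3 < n; i += 4) {
        s0 += a[i + 0] * b[i + 0];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; i++)                /* leftover elements, sequential */
        s0 += a[i] * b[i];

    return (s0 + s1) + (s2 + s3);     /* join back together at the end */
}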

Oh, and as for xCORE: it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen for it uses the chip like an 8-core CPU, with each core running its own C code. It's more performant to simply fork execution out to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids it, but this circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU, with C shared-memory pointer hacks, is not just code written by users; it's how a lot of the example code provided by XMOS is written.
Title: Re: Too many programming languages?
Post by: SparkyFX on September 25, 2019, 04:33:53 pm
I think we can agree that you have to learn HDLs to actually use them. Maybe implement a state machine and make them do things sequentially?

Large parts of this discussion are pretty much irrelevant to someone asking what programming language to start with.

Should you ever work with PLCs (maybe an application that interacts with them), then you will probably come into contact with proprietary languages such as Simatic S7.
Title: Re: Too many programming languages?
Post by: brucehoult on September 25, 2019, 05:25:33 pm
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and my experience. In interviews they (and I, when I have been the interviewer) have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway-decent degree.

The *knowledge* of the theory is important. Whether you obtained it via self-study on the internet or via paying tens of thousands of dollars to a university is irrelevant.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 05:54:41 pm

Not true.

Importantly, only a subset can be compiled into a circuit.

Non-synthesisable constructs are often used when writing behavioural test suites. One overly simple example of the benefit is to define an adder circuit in terms of structurally interconnected gates, and the test suite in terms of "expectedOutput = inA + inB", and then "if output != expectedOutput then log the failure".

Quote
Not a whole lot is to be learned from it if you are writing programs to be run on CPUs, but it does take a good bit of effort to properly understand due to how different it is.

FSMs are very beneficial in embedded systems, whether they are implemented in hardware or software. Too few softies understand FSMs, thinking they are "something to do with compilers"!

Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor parts (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Well yes, not everything can compile into a circuit. But these non-synthesisable features of an HDL are just an easily simulated adaptation of the language's core features, there to make it far easier to write testbenches.

They are not "just" that; they are of far wider applicability.

They can be used to describe a design's intended behaviour before any structure exists, and to check a structural implementation against that behavioural model. Put those two together and you can have stepwise refinement of a system.
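To show the adder example from above in concrete terms, here is the same idea as toy C rather than an HDL (all names invented for illustration): a "structural" gate-level model checked against the "behavioural" expectedOutput = inA + inB:

Code: [Select]
#include <stdio.h>
#include <stdint.h>

/* "Structural" model: a 4-bit ripple-carry adder built from gate
   operations, one full adder per bit. */
static uint8_t adder4_structural(uint8_t a, uint8_t b)
{
    uint8_t sum = 0, carry = 0;
    for (int i = 0; i < 4; i++) {
        uint8_t ai = (a >> i) & 1u, bi = (b >> i) & 1u;
        uint8_t s  = ai ^ bi ^ carry;                       /* sum bit   */
        carry = (ai & bi) | (ai & carry) | (bi & carry);    /* carry out */
        sum |= (uint8_t)(s << i);
    }
    return sum & 0xFu;
}

int main(void)
{
    /* "Behavioural" test suite: expectedOutput = inA + inB. */
    for (uint8_t inA = 0; inA < 16; inA++)
        for (uint8_t inB = 0; inB < 16; inB++) {
            uint8_t expected = (uint8_t)((inA + inB) & 0xFu);
            if (adder4_structural(inA, inB) != expected)
                printf("FAIL: %u + %u\n", inA, inB);   /* log the failure */
        }
    return 0;
}

Exactly the same pattern - behavioural reference against structural implementation - is what an HDL testbench does, just with signals and time.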

Quote
Having to use a different language to write the testbench would be annoying.

Irritating, but not fundamental. Test vectors are often written in a different language simply because that is the easiest way to create them.

Quote
Since simulators execute HDL code one line at a time, it was easy to support classical sequential execution, and that style of programming lends itself much better to describing complicated behaviour. Testbenches are such a key part of HDL development that the language reflects it (but the non-synthesisable features are still only a fraction of the language's total feature set).

A simulator's run-time execution model is irrelevant, and is not specified as part of an HDL. It may or may not be sequential; consider high-end simulators where some of the simulation is executed in hardware.

I have no idea what you might mean by "fraction of the total feature set".

Quote
My point is that HDL languages were designed from the beginning to look similar to circuits (much like C tries to work like a more readable, platform-independent assembler). The language itself calls things a "wire" or a "register" rather than a variable. Its variables can be tristate Z or undefined X, things that only make sense in circuits.

Your view of HDLs is limited.

Have a look at VHDL; there are far more signal types than that, and you can create your own application-specific signal types.

Quote
In fact, the tool vendors are so well aware that HDL is borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL.

Completely wrong and irrelevant - simply substitute C/C++ for HDL, and bash for TCL, and you'll see how silly that is.

Quote
For that it just makes more sense to use a language built around classical sequential execution. But don't think all C code gets executed neatly in sequence, either. If you look at large DSPs, they have multiple ALUs and MACs while still having normal C compilers; the compiler is smart enough to recognise which operations don't depend on each other and to execute 5 or 10 things in parallel on the DSP core, joining back into regular sequential execution when the code requires it.

Now you are confusing a language definition and its semantics with its implementation by a particular compiler on a particular architecture.

Quote
Oh, and as for xCORE: it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen for it uses the chip like an 8-core CPU, with each core running its own C code. It's more performant to simply fork execution out to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids it, but this circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU, with C shared-memory pointer hacks, is not just code written by users; it's how a lot of the example code provided by XMOS is written.

You appear to contradict yourself in there! Apart from that, xC is irrelevant to HDLs; don't confuse them.

It has long been a truism that you can write Fortran in any language. It shouldn't be a surprise if that tradition continues :(

I hope xC isn't the end of the story, and I'm actively looking for improvements. But it is the best beginning of a story that I have seen.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 06:04:07 pm
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and my experience. In interviews they (and I, when I have been the interviewer) have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway-decent degree.

The *knowledge* of the theory is important. Whether you obtained it via self-study on the internet or via paying tens of thousands of dollars to a university is irrelevant.

Agreed.

But the probability of someone knowing the theory is far higher if they have been to a good course at a good university. It is very difficult to imbibe theory solely via self-study; in my career I can count the number of people I've seen achieve that on the fingers of one hand.

I've seen far too many candidates with a stunted view of theory. They often waste a lot of time and energy
Title: Re: Too many programming languages?
Post by: brucehoult on September 25, 2019, 07:17:53 pm
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and my experience. In interviews they (and I, when I have been the interviewer) have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway-decent degree.

The *knowledge* of the theory is important. Whether you obtained it via self-study on the internet or via paying tens of thousands of dollars to a university is irrelevant.

Agreed.

But the probability of someone knowing the theory is far higher if they have been to a good course at a good university. It is very difficult to imbibe theory solely via self-study; in my career I can count the number of people I've seen achieve that on the fingers of one hand.

Sure. It's difficult. I said that. That's what interviews are for. Or, better, trial periods or short term contracts leading to possible full-time employment later.

I studied some theory at university in 1981-1983, though actually 90% of what I learned was self-study in the library basement reading CACM and SIGPLAN and doing my own projects, NOT from the lectures and assignments.

Since then I have self-studied at least the following things that didn't exist in 1983:

- MIPS, SPARC, ARM (A32, T16, T32, A64), PA-RISC, i386, PowerPC, Alpha, x86_64, AVR, RISC-V programmer's models and assembly languages.

- C++, Java, C#, Perl, Python, Ruby, Dylan, ANSI Common Lisp, OCaml, Lua

- MacOS, Win32, Linux, Solaris (and others), NeXTStep/Rhapsody/OS X (all at home at first, then leading to jobs e.g. Solaris on an old SPARC ELC sold by the local university for $50)

- modern programming language and compiler theory (at home at first, later leading to jobs and learning more on the job)

- instruction set and microarchitecture design (at home at first, later leading to jobs and learning more on the job)


Well, I'm sure there's a whole lot more I can't think of right now.

Even if you get spoon fed some initial theory in three or four or six years at a university, there is going to be a WHOLE HEAP of new stuff invented in the 40+ years you're going to be working after university, so you'd BETTER be capable of learning it on your own if you don't want to become a dinosaur.
Title: Re: Too many programming languages?
Post by: Berni on September 25, 2019, 07:24:23 pm
Your view of HDLs is limited.

Have a look at VHDL; there are far more signal types than that, and you can create your own application-specific signal types.

Quote
In fact, the tool vendors are so well aware that HDL is borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL.
Completely wrong and irrelevant - simply substitute C/C++ for HDL, and bash for TCL, and you'll see how silly that is.

Well yes, you can make up new concepts in languages, and VHDL was designed to encourage that. But the basic things that HDL languages give you from the start are pretty circuit-oriented.

My point is that HDL languages are made to serve a niche application in digital circuitry design and are not really all that useful for developing general purpose software to run on computers. (Until perhaps Intel starts including user programmable Altera FPGA coprocessors in PCs)

Can you point out a few examples where an HDL is used in some context that at no point relates to digital circuitry? Yes, in theory you can write an entire operating system in Verilog and run it in an HDL simulator, since the language is Turing complete, but I mean examples where it actually makes sense to use an HDL versus a more typical programming or scripting language.


Quote
For that it just makes more sense to use a language built around classical sequential execution. But don't think all C code gets executed neatly in sequence, either. If you look at large DSPs, they have multiple ALUs and MACs while still having normal C compilers; the compiler is smart enough to recognise which operations don't depend on each other and to execute 5 or 10 things in parallel on the DSP core, joining back into regular sequential execution when the code requires it.
Now you are confusing a language definition and its semantics with its implementation by a particular compiler on a particular architecture.

I was just trying to show that a parallel architecture does not necessarily need a language designed for parallelism, and vice versa. There are compilers that can turn C code into FPGA gates, and an HDL simulator is essentially an interpreter that runs HDL code on a CPU. It's just that certain languages are better suited to describing certain things.

Quote
Oh, and as for xCORE: it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen for it uses the chip like an 8-core CPU, with each core running its own C code. It's more performant to simply fork execution out to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids it, but this circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU, with C shared-memory pointer hacks, is not just code written by users; it's how a lot of the example code provided by XMOS is written.

You appear to contradict yourself in there! Apart from that, xC is irrelevant to HDLs; don't confuse them.

It has long been a truism that you can write Fortran in any language. It shouldn't be a surprise if that tradition continues :(

I hope xC isn't the end of the story, and I'm actively looking for improvements. But it is the best beginning of a story that I have seen.

It's not the XC language that is at fault here.

It's more that the hardware it is being compiled for is not so great at executing the language's special multitasking features; as a result the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that gives better performance. In the end, the thing it is running on is pretty much a regular CPU that just happens to have the "uber-hyperthreading" ability to execute 8 threads on a single core, plus a fancy proprietary messagebox-based bus connecting it to other CPUs.

I was really excited about it when I first discovered XMOS processors, but after working with them, writing quite a bit of code for them, and seeing the new chips that came out, I eventually lost hope in it. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but has very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most-used application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for doing asynchronous USB audio.

Go ahead and try out one of their dev boards if you don't believe me.
Title: Re: Too many programming languages?
Post by: legacy on September 25, 2019, 07:58:20 pm
@tggzzz
Have you ever modeled anything with "Stood"?
My curiosity  :D
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 08:44:00 pm
@tggzzz
Have you ever modeled anything with "Stood"?
My curiosity  :D

Nope; never even heard of it!
Title: Re: Too many programming languages?
Post by: rstofer on September 25, 2019, 09:00:46 pm
Matlab is commercial, proprietary software.  A free and open-source alternative is Octave.  Most Matlab scripts can be run in Octave, too.

Recently, there is a trend in using Python instead of Matlab or Octave.

Another recent trend, but with a concept a little different, is Jupyter Notebook https://jupyter.org/

This is a script I wrote for plotting a differential equation using Euler's Method and it produces identical results in Octave and MATLAB.  At the university my grandson is attending, everything they do revolves around MATLAB.  The DE course can spend more time on applications and less time worrying about hand solving/plotting DEs.  What a tremendous improvement over the course I took about 48 years ago.  His course is actually fun!


Code: [Select]
steps=100;                              % number of steps per unit time
dt=1/steps;
t=0:dt:20;                              % t will have 2001 values
n=length(t);
y=zeros(1,n);                           % fill vectors with 0
dy=zeros(1,n);
y(1)=0;                                 % initial value of y(0)
                                        % but index starts at 1
dy(1)=-2*y(1)+2*(1+sin(2*t(1)));        % expression for y'(0) index = 1

for j=2:n                               % step through 2000 values
    y(j)=y(j-1)+dy(j-1)*dt;             % compute value of current y       
    dy(j)=-2*y(j)+2*(1+sin(2*t(j)));    % compute value of y'
end

plot(t,y,t,2*(1+sin(2*t)))
xlabel('t')
ylabel('y')
legend('y(t)','Forcing Function')
title(['Euler''s Method - ' num2str(steps) ' steps per unit time (t)'])
shg                                     % pull graph to top


Is this considered 'programming'?  I suspect it should be...
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 09:09:34 pm
Your view of HDLs is limited.

Have a look at VHDL; there are far more signal types than that, and you can create your own application-specific signal types.

Quote
In fact, the tool vendors are so well aware that HDL is borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL.
Completely wrong and irrelevant - simply substitute C/C++ for HDL, and bash for TCL, and you'll see how silly that is.

Well yes, you can make up new concepts in languages, and VHDL was designed to encourage that. But the basic things that HDL languages give you from the start are pretty circuit-oriented.

Naturally they do that reasonably well, but they are far from limited to doing that.

Quote
My point is that HDL languages are made to serve a niche application in digital circuitry design and are not really all that useful for developing general purpose software to run on computers. (Until perhaps Intel starts including user programmable Altera FPGA coprocessors in PCs)

Shrug. Nobody in their right mind would think of using VHDL as a Python replacement, nor vice versa.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.


Quote
Can you point out a few examples where an HDL is used in some context that at no point relates to digital circuitry? Yes, in theory you can write an entire operating system in Verilog and run it in an HDL simulator, since the language is Turing complete, but I mean examples where it actually makes sense to use an HDL versus a more typical programming or scripting language.

Don't ask stupid questions. Writing an entire operating system in an HDL is as nonsensical as suggesting implementing hardware logic in Java.

The OP is interested in embedded electronic systems. Those usually involve splitting functionality between hardware and software.

When implementing such systems, sometimes it is beneficial to model the system, i.e. both hardware and software. At that point there is merit to using an HDL to model both.


Quote
Quote
For that it just makes more sense to use a language built around classical sequential execution. But don't think all C code gets executed neatly in sequence, either. If you look at large DSPs, they have multiple ALUs and MACs while still having normal C compilers; the compiler is smart enough to recognise which operations don't depend on each other and to execute 5 or 10 things in parallel on the DSP core, joining back into regular sequential execution when the code requires it.
Now you are confusing a language definition and its semantics with its implementation by a particular compiler on a particular architecture.

I was just trying to show that a parallel architecture does not necessarily need a language designed for parallelism, and vice versa. There are compilers that can turn C code into FPGA gates, and an HDL simulator is essentially an interpreter that runs HDL code on a CPU. It's just that certain languages are better suited to describing certain things.

I make exactly that point - repeatedly.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.

Quote
Quote
Oh, and as for xCORE: it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen for it uses the chip like an 8-core CPU, with each core running its own C code. It's more performant to simply fork execution out to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids it, but this circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU, with C shared-memory pointer hacks, is not just code written by users; it's how a lot of the example code provided by XMOS is written.

You appear to contradict yourself in there! Apart from that, xC is irrelevant to HDLs; don't confuse them.

It has long been a truism that you can write Fortran in any language. It shouldn't be a surprise if that tradition continues :(

I hope xC isn't the end of the story, and I'm actively looking for improvements. But it is the best beginning of a story that I have seen.

It's not the XC language that is at fault here.

It's more that the hardware it is being compiled for is not so great at executing the language's special multitasking features;

Please explain that assertion.

Quote
as a result the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that gives better performance. In the end, the thing it is running on is pretty much a regular CPU that just happens to have the "uber-hyperthreading" ability to execute 8 threads on a single core, plus a fancy proprietary messagebox-based bus connecting it to other CPUs.

Correct, but what's your point?

If your point is that implementations have limitations, is that really news to anyone?

Quote
I was really excited about it when I first discovered XMOS processors, but after working with them, writing quite a bit of code for them, and seeing the new chips that came out, I eventually lost hope in it. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but has very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most-used application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for doing asynchronous USB audio.

It sounds as if you really don't understand how the boundaries between hardware and software are very grey and movable, especially in the context of system architecture and design. Given that, it doesn't surprise me if you continue to think in the old familiar ways.

A new generation will come and supplant that thinking; they won't have any choice since the existing enhancement techniques (based on scaling semiconductor processes) have run out of steam.

CSP-based concepts offer a way forward. I want to find others, but they haven't appeared yet.

Quote
Go ahead and try out one of their dev boards if you don't believe me.

I have done.

I found it did exactly what it was designed to do, without any strange "gotchas".

I found using it very easy, especially compared to other MCUs.
Title: Re: Too many programming languages?
Post by: rstofer on September 25, 2019, 09:57:36 pm
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include but not limited to C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?


Answering just this question, in the least expansive way, C would be my recommendation for a working EE, at least as a first language.  There are specific applications for the various languages as discussed above:  C++ for business logic, Java for web design, Fortran for number crunching and C for embedded systems.  Python is trying to hit all the bases.

But 'programming' as an art is somewhat independent of language.  It is more closely related to how you 'think' about the problem.  Niklaus Wirth's book 'Algorithms + Data Structures = Programs' is not misnamed, and Pascal is my favorite language.  I find myself blocking out programs and writing a kind of pseudo-code before I try putting fingers to keyboard.  I want to have the logic straight before I worry about writing code.

Do not be surprised if you have to become competent in several languages, and which is most important will change with every job you get.  But I would suggest starting with C.
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 10:02:29 pm
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include but not limited to C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?


Answering just this question, in the least expansive way, C would be my recommendation for a working EE, at least as a first language.  There are specific applications for the various languages as discussed above:  C++ for business logic, Java for web design, Fortran for number crunching and C for embedded systems.  Python is trying to hit all the bases.

But 'programming' as an art is somewhat independent of language.  It is more closely related to how you 'think' about the problem.  Niklaus Wirth's book 'Algorithms + Data Structures = Programs' is not misnamed, and Pascal is my favorite language.  I find myself blocking out programs and writing a kind of pseudo-code before I try putting fingers to keyboard.  I want to have the logic straight before I worry about writing code.

Do not be surprised if you have to become competent in several languages, and which is most important will change with every job you get.  But I would suggest starting with C.

That's sane, even if I might quibble with some details.

I'd go further in one respect: if you only need one language in a career, then you will have had a boring, repetitive career. You know the kind of thing: not "10 years' experience" but "1 year's experience repeated 10 times" :)
Title: Re: Too many programming languages?
Post by: rstofer on September 25, 2019, 10:12:07 pm
I think Octave lacks Simulink and that’s a really big deal.

MATLAB Personal Edition is just $149.

SciLab is for you.

It includes Xcos as a substitute for Simulink, so I will install it and play around a bit.  In MATLAB's Simulink, I can plunk down integrators and other devices and model an analog-computer solution to some set of equations.  I like analog computing!  That doesn't mean I'm any good at it (scaling time and magnitude are still not well understood), but I'm working on it.

I'll be hung up with MATLAB as long as my grandson is in college since the university uses it for all their math courses.  In fact, there is a first semester mandatory course on just MATLAB.  It was kind of fun!

ETA:  I installed SciLab and it works quite well.  The scripting language is somewhat different, and the authors acknowledge that scripts are not nearly as portable as they are between MATLAB and Octave, but it isn't a really big deal.  I haven't gotten to simulation yet.  I printed out the newcomers' document and I'll look at it tomorrow.
Title: Re: Too many programming languages?
Post by: Kjelt on September 25, 2019, 10:15:21 pm
I'd go further in one respect: if you only need one language in a career, then you will have had a boring repetitive career.
I disagree; it is not the language, the components or the tools that make a boring career, it is sticking to one job, one company, one domain that makes it repetitive and probably boring. I have been an embedded C programmer for six different companies in five different domains, and none of those jobs had much in common except for the language, although many of the jobs mixed in other skills, languages and multidisciplinary work such as mechatronics, mechanics, electronics and software.
Title: Re: Too many programming languages?
Post by: rstofer on September 25, 2019, 10:19:05 pm
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include but not limited to C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?


Answering just this question, in the least expansive way, C would be my recommendation for a working EE, at least as a first language.  There are specific applications for the various languages as discussed above:  C++ for business logic, Java for web design, Fortran for number crunching and C for embedded systems.  Python is trying to hit all the bases.

But 'programming' as an art is somewhat independent of language.  It is more closely related to how you 'think' about the problem.  Niklaus Wirth's book 'Algorithms + Data Structures = Programs' is not misnamed, and Pascal is my favorite language.  I find myself blocking out programs and writing a kind of pseudo-code before I try putting fingers to keyboard.  I want to have the logic straight before I worry about writing code.

Do not be surprised if you have to become competent in several languages, and which is most important will change with every job you get.  But I would suggest starting with C.

That's sane, even if I might quibble with some details.

I'd go further in one respect: if you only need one language in a career, then you will have had a boring, repetitive career. You know the kind of thing: not "10 years' experience" but "1 year's experience repeated 10 times" :)

Heck, I quibble with it too!  Here's what I really think:  there is more money in managing engineering than in doing engineering.  You probably need some familiarity with programming but, if you're clever, you BUY programming, you don't DO programming.  In fact, the last thing you want to do is become the company's best programmer!  You'll never get promoted that way!

The day you graduate EE school, you sign up for an MBA program.  If you're an overachiever, go ahead and get your MSEE but still cap it off with an MBA (it'll be easy after all the math in EE school).  Never work a day as an engineer, just buy it!
Title: Re: Too many programming languages?
Post by: tggzzz on September 25, 2019, 11:39:55 pm
Heck, I quibble with it too!  Here's what I really think:  there is more money in managing engineering than in doing engineering.  You probably need some familiarity with programming but, if you're clever, you BUY programming, you don't DO programming.  In fact, the last thing you want to do is become the company's best programmer!  You'll never get promoted that way!

The day you graduate EE school, you sign up for an MBA program.  If you're an overachiever, go ahead and get your MSEE but still cap it off with an MBA (it'll be easy after all the math in EE school).  Never work a day as an engineer, just buy it!

Do you work for Boeing? :)

If "programmer" means somebody that implements other people's designs, I agree that sounds terrible. I've never done that.

I have written contracts and been a project manager, but decided not to repeat that experience more than necessary :)

OTOH many times I have gone from initial concept, through hardware and software architecture and design (using analogue/FPGA/digital/hard&soft realtime/webshops/high availability telecoms/etc), through implementation, testing and acceptance trials. Great fun. (N.B. there was rarely a dedicated project manager per se, other than the engineers)
Title: Re: Too many programming languages?
Post by: emece67 on September 25, 2019, 11:55:07 pm
.
Title: Re: Too many programming languages?
Post by: rstofer on September 26, 2019, 12:34:28 am
If "programmer" means somebody that implements other people's designs, I agree that sounds terrible. I've never done that.

I think I'll go with the Bureau of Labor Statistics definition of programmer:

https://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm (https://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm)

As opposed to Software Developer

https://www.bls.gov/ooh/computer-and-information-technology/software-developers.htm (https://www.bls.gov/ooh/computer-and-information-technology/software-developers.htm)

As opposed to Computer Hardware Engineer

https://www.bls.gov/ooh/architecture-and-engineering/computer-hardware-engineers.htm (https://www.bls.gov/ooh/architecture-and-engineering/computer-hardware-engineers.htm)

In the Electrical Engineer class, the BLS conflates power engineering with electronics engineering

https://www.bls.gov/ooh/architecture-and-engineering/electrical-and-electronics-engineers.htm (https://www.bls.gov/ooh/architecture-and-engineering/electrical-and-electronics-engineers.htm)

The nice thing about the BLS site is that you can drill down by area to find out how much the code weenies are making in Seattle ($129K and that's the median).  Scroll down about 2/3 of the page:

https://www.bls.gov/oes/current/oes151131.htm#st (https://www.bls.gov/oes/current/oes151131.htm#st)

Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 12:40:58 am
About HDLs, well, I have used them (and still use them) a lot, and, although I must admit that they are in fact not only HDLs but also programming languages (hey, VHDL has pointers!!), I think their purpose is not to be used as such, but as HDLs. Maybe you will also need to learn one of them, but I doubt that you'll need them as a replacement for some programming language lacking ways to express parallelism.

Using an HDL as a replacement for a conventional general-purpose language would be perverse.

Even if you don't use HDLs for hardware designs, the normal HDL design patterns really emphasise parallel operation - you simply can't ignore it (without being very perverse!).

Being familiar with such "thinking parallel" will become an increasingly important aspect of software over the next few decades.

Quote
I think other languages/tools (say OpenCL/WebCL) are/will be better for that.

Most conventional languages have parallel execution as a bolt-on. Some (and I'm looking at you, C!) couldn't even express threading until the last 5 years!
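For the record, C11 did finally add an optional <threads.h>. A minimal sketch, assuming your toolchain actually ships it (glibc, for example, only gained it in version 2.28):

Code: [Select]
#include <stdio.h>
#include <threads.h>   /* C11 optional threads support */

static int worker(void *arg)
{
    printf("hello from thread %d\n", *(int *)arg);
    return 0;
}

int main(void)
{
    thrd_t t;
    int id = 1;

    if (thrd_create(&t, worker, &id) != thrd_success)
        return 1;
    thrd_join(t, NULL);   /* wait for the worker to finish */
    return 0;
}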

Quote
SW FSMs and systems comprised of various interconnected SW FSMs can be understood without previous knowledge of HW FSMs and written even in plain vanilla C; I can't see any benefit of HDLs here over bare programming languages. And, if you insist on HDLs, why not SystemC?

The key point is to understand how and when to use one or more FSMs. How you implement them is a completely different issue. Indeed, frequently you implement part of a single FSM in hardware and the rest in software.

In general purpose languages there are several useful design patterns, e.g. 2D event/state jump tables and state==class.
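A minimal sketch of the 2D event/state jump table in C, with the states, events and actions invented purely for illustration:

Code: [Select]
#include <stdio.h>

typedef enum { ST_IDLE, ST_RUN, NUM_STATES } state_t;
typedef enum { EV_START, EV_STOP, NUM_EVENTS } event_t;

typedef state_t (*handler_t)(state_t current);

static state_t on_start(state_t s) { (void)s; puts("starting"); return ST_RUN;  }
static state_t on_stop(state_t s)  { (void)s; puts("stopping"); return ST_IDLE; }
static state_t ignore(state_t s)   { return s; }   /* event not valid in this state */

/* The 2D jump table: table[state][event] -> handler. */
static const handler_t table[NUM_STATES][NUM_EVENTS] = {
    [ST_IDLE] = { [EV_START] = on_start, [EV_STOP] = ignore  },
    [ST_RUN]  = { [EV_START] = ignore,   [EV_STOP] = on_stop },
};

int main(void)
{
    state_t s = ST_IDLE;
    const event_t events[] = { EV_START, EV_START, EV_STOP };

    for (unsigned i = 0; i < sizeof events / sizeof events[0]; i++)
        s = table[s][events[i]](s);   /* dispatch on (state, event) */
    return 0;
}

The state==class variant expresses the same machine as one object per state with a method per event.
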
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 12:42:59 am
If "programmer" means somebody that implements other people's designs, I agree that sounds terrible. I've never done that.

I think I'll go with the Bureau of Labor Statistics definition of programmer:

They are contested and change over time.

When I was young "computer" was a job title.
Title: Re: Too many programming languages?
Post by: bjdhjy888 on September 26, 2019, 01:08:45 am
Forth?  :horse:

Mom: Son, what's with these huge C/C++ books in your room?  :o
Me: Mom, I'm learning Forth.  8)
Title: Re: Too many programming languages?
Post by: westfw on September 26, 2019, 02:57:31 am
Quote
There is more money in managing engineering than in doing engineering.
Probably.  Did the OP talk about making lots of money?  I've been a manager; it wasn't as fun as being an engineer.  I hired some good people, though!
If you do go into management, try to be a GOOD manager.
Former co-worker's recent FB post:
Quote
Last week, Facebook employee Qin Chen jumped from the 4th floor of a Facebook building in Menlo Park leaving behind a wife and young daughter. Facebook, Google and other Silicon Valley companies are full of horrible, inexperienced managers and incompetent, uncaring HR staff that are ruining people's lives. It's time for Silicon Valley to clean up their act. Below is a screenshot with my blood pressure from a year ago while working at Facebook and today, one month after leaving Facebook.
[See also]: https://www.youtube.com/watch?v=VbEQriZEfoI (https://www.youtube.com/watch?v=VbEQriZEfoI)

Quote
I think one should master one specific programming language, rather than being an expert of all.
Horrible attitude.  I wouldn't have hired anyone who came in saying that.  I did hire people who hadn't programmed much in the "target language", because understanding the problems being solved is more important than being an expert at the programming language...

Title: Re: Too many programming languages?
Post by: bjdhjy888 on September 26, 2019, 03:16:32 am
Quote
Quote
I think one should master one specific programming language, rather than being an expert of all.
Horrible attitude.  I wouldn't have hired anyone who came in saying that. I did hire people who hadn't programmed much in the "target language", because understanding the problems being solved is more important than being an expert at the programming language...
Yo dah boss!
 :-*
Title: Re: Too many programming languages?
Post by: brucehoult on September 26, 2019, 05:05:08 am
Quote
Last week, Facebook employee Qin Chen jumped from the 4th floor of a Facebook building in Menlo Park leaving behind a wife and young daughter. Facebook, Google and other Silicon Valley companies are full of horrible, inexperienced managers and incompetent, uncaring HR staff that are ruining people's lives. It's time for Silicon Valley to clean up their act. Below is a screenshot with my blood pressure from a year ago while working at Facebook and today, one month after leaving Facebook.

Yeah.

I just moved to Silicon Valley (and the US) this year to work for a startup. It's growing so quickly that inevitably some things are not as well coordinated as would be ideal, and customers constantly want things that are on the roadmap for 2020 or 2021 *tomorrow*, as well as asking for more documentation, more example code, etc.

This is of course much better than no one noticing or caring about what you are doing.

Fortunately there are plenty of customers happy enough with what we *do* have now that design-wins and revenue are snowballing.

There is pressure to get things done but, unlike some of the companies you mention (and Apple and Amazon, by reputation), I don't see any expectation for people to work crazy hours and burn themselves out. I don't know whether this has something to do with our high proportion of seasoned veterans in their 40s, 50s, and even 60s who have seen it all and won't take the same crap that new grads will take.


As for the original question of this thread, which language(s) to learn, I suggest two:

1) to learn about simple programming using libraries, and algorithms, you want a language like Scheme (especially the Racket system) or another similar language such as Lisp or Python or Ruby. For the more rigorous mathematically minded maybe go for Haskell or OCaml instead. Rust, Go, and D can also fit this bill.

2) to learn about how computers actually work, and best understand both their limitations and how to take maximum advantage of them, you should simultaneously and incrementally learn C, a reasonably sane assembly language you have a C compiler for (RISC-V is ideal; MIPS, PowerPC, or one of the ARM variants are the next best thing, and AVR is not bad either), the CPU programmer's model (registers etc.), and how instructions are encoded in binary.

Write some code in C. Compile it and read the assembly language. Single step in both C and assembly language. Write some assembly language functions and call them from C. Call C library functions from assembly language.

Pay particular attention to pointers, arrays, globals vs stack vs heap. You're never going to be a good programmer, in ANY language, without a good mental model of how those work. And you're very unlikely to come up with the right mental model without looking at the assembly language.
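For example, take something like the following, compile it with gcc -S -O1, and compare how the three kinds of storage are addressed in the generated assembly (the names here are just an illustration):

Code: [Select]
#include <stdio.h>
#include <stdlib.h>

int global_counter;          /* static storage: fixed address, in .bss */

int main(void)
{
    int on_stack = 42;       /* automatic storage: lives in this frame */
    int *on_heap = malloc(sizeof *on_heap);   /* dynamic storage */
    if (!on_heap)
        return 1;

    *on_heap = on_stack + global_counter;

    /* Three different kinds of storage, three different access
       patterns in the generated assembly. */
    printf("%p %p %p\n", (void *)&global_counter,
           (void *)&on_stack, (void *)on_heap);
    free(on_heap);
    return 0;
}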

For a really advanced understanding of algorithms and data structures and how they perform, you need to know about caches, TLBs, branch prediction and things like that.
Title: Re: Too many programming languages?
Post by: Berni on September 26, 2019, 05:58:05 am
Quote
My point is that HDL languages are made to serve a niche application in digital circuitry design and are not really all that useful for developing general purpose software to run on computers. (Until perhaps Intel starts including user programmable Altera FPGA coprocessors in PCs)
Shrug. Nobody in their right mind would think of using VHDL as a Python replacement, nor vice versa.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.
Quote
Can you point out a few examples where an HDL is used in some context that at no point relates to digital circuitry? Yes, in theory you can write an entire operating system in Verilog and run it in an HDL simulator, since the language is Turing complete, but I mean examples where it actually makes sense to use an HDL versus a more typical programming or scripting language.

Don't ask stupid questions. Writing an entire operating system in an HDL is as nonsensical as suggesting implementing hardware logic in Java.

The OP is interested in embedded electronic systems. Those usually involve splitting functionality between hardware and software.

When implementing such systems, sometimes it is beneficial to model the system, i.e. both hardware and software. At that point there is merit to using an HDL to model both.

Please decide on the point you are trying to make, because you appear to be arguing against your previous points.

I said at the beginning of this conversation that it doesn't make sense to use an HDL for anything other than describing complex digital circuits. I was explaining why that is the case.

As for modeling the whole system: I would love to do that, but have you had any luck getting the HDL source code from MCU or CPU vendors? What you tend to get, at most, is an executable simulator for the CPU core and maybe a few basic peripherals (like a timer) if you are lucky.

In typical electronics engineering work you will pretty much only encounter HDLs when working with FPGAs. That is, unless you work for an integrated circuit vendor (anywhere from TI to Intel) or an architecture vendor like ARM.

Quote
as a result the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that gives better performance. In the end, the thing it is running on is pretty much a regular CPU that just happens to have the "uber-hyperthreading" ability to execute 8 threads on a single core, plus a fancy proprietary messagebox-based bus connecting it to other CPUs.

Correct, but what's your point?

If your point is that implementations have limitations, is that really news to anyone?

Quote
I was really excited when I first discovered XMOS processors, but after working with them and writing quite a bit of code for them, seeing the new chips that came out, etc., I eventually lost hope. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most common application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for doing asynchronous USB audio.

It sounds as if you really don't understand how the boundaries between hardware and software are very grey and movable, especially in the context of system architecture and design. Given that, it doesn't surprise me if you continue to think in the old familiar ways.

A new generation will come and supplant that thinking; they won't have any choice since the existing enhancement techniques (based on scaling semiconductor processes) have run out of steam.

CSP-based concepts offer a way forward. I want to find others, but they haven't appeared yet.

I was only involved enough at XMOS to get a slightly earlier heads-up on new products and some early access to dev tools. So I don't know the long-term road map for their products, but what I've seen so far was not living up to my multiprocessing expectations; hopefully they can change that in the future, because it could be something truly innovative for computing. After all, the useful applications of the language are determined by the hardware running it, and so far XC only compiles for XMOS chips.

Can you name an end application for XMOS processors where they really have an advantage over existing solutions like MCUs, DSPs, or FPGAs?
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 07:40:52 am
Quote
My point is that HDLs are made to serve a niche application in digital circuit design and are not really all that useful for developing general-purpose software to run on computers. (Until perhaps Intel starts including user-programmable Altera FPGA coprocessors in PCs.)
Shrug. Nobody in their right mind would think of using VHDL as a Python replacement, nor vice versa.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.
Quote
Can you point out a few examples where an HDL is used in some context that at no point relates to digital circuitry? Yes, in theory you can write an entire operating system in Verilog and run it in an HDL simulator, since the language is Turing complete, but I mean examples where it actually makes sense to use an HDL versus other, more typical programming or scripting languages.

Don't ask stupid questions. Writing an entire operating system in an HDL is as nonsensical as suggesting implementing hardware logic in Java.

The OP is interested in embedded electronic systems. Those usually involve splitting functionality between hardware and software.

When implementing such systems, sometimes it is beneficial to model the system, i.e. both hardware and software. At that point there is merit to using an HDL to model both.

Please decide on the point you are trying to make, because you appear to be arguing against your previous points.

I'm not.

In my first post I wrote (new emphasis) "You should learn the concepts, advantages and disadvantages of one example of each type of programming language. That will enable you to choose the right tool for the job, just as you should know whether to use screws, or nails, or bolts. Once you know the concepts, picking up the next language of that type is simple."

Quote
I said at the beginning of this conversation that it doesn't make sense to use an HDL for anything other than describing complex digital circuits, and I was making my point as to why that is the case.

You put the point more strongly than that, and in any case, it is wrong.

Systems are more than merely digital bits, and HDLs can and do model other parts, e.g. analogue and (to stretch the point) humans.

Quote
As for modeling the whole system: I would love to do that, but have you had any luck getting the HDL source code from MCU or CPU vendors? What you tend to get, at most, is an executable simulator for the CPU core and maybe a few basic peripherals (like a timer) if you are lucky.

That would be simulation, and while it can be done with softcores, it is not very sensible.

What is done is to model a processor's computation. Where appropriate, while refining the design, some of the model can be moved into simulation, e.g. at the boundary between hardware and software. Such stepwise refinement of a design is normal.

Quote
In typical electronics engineering work you will pretty much only encounter HDLs when working with FPGAs. That is, unless you work for an integrated circuit vendor (anyone from TI to Intel) or an architecture vendor like ARM.

You should add in the concept of a system that will be partitioned between an FPGA, a processor, and everything else.

In all cases an engineer will use an appropriate tool for the job, and will swap between tools as the project progresses.

Quote
Quote
as a result the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that results in better performance. In the end, the thing it is running on is pretty much a regular CPU that just happens to have the "uber hyperthreading" ability to execute 8 threads on a single core, and has a fancy proprietary message-box-based bus connecting it to other CPUs.

Correct, but what's your point?

If your point is that implementations have limitations, is that really news to anyone?

Quote
I was really excited when I first discovered XMOS processors, but after working with them and writing quite a bit of code for them, seeing the new chips that came out, etc., I eventually lost hope. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most common application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for doing asynchronous USB audio.

It sounds as if you really don't understand how the boundaries between hardware and software are very grey and movable, especially in the context of system architecture and design. Given that, it doesn't surprise me if you continue to think in the old familiar ways.

A new generation will come and supplant that thinking; they won't have any choice since the existing enhancement techniques (based on scaling semiconductor processes) have run out of steam.

CSP-based concepts offer a way forward. I want to find others, but they haven't appeared yet.

I was only involved enough at XMOS to get a slightly earlier heads-up on new products and some early access to dev tools. So I don't know the long-term road map for their products, but what I've seen so far was not living up to my multiprocessing expectations; hopefully they can change that in the future, because it could be something truly innovative for computing. After all, the useful applications of the language are determined by the hardware running it, and so far XC only compiles for XMOS chips.

Can you name an end application for XMOS processors where they really have an advantage over existing solutions like MCUs, DSPs, or FPGAs?

The word "really" implies a subjective response, and there would be legitimate differences of opinion. That's true for all languages and technologies.

Most importantly for this thread, the application of concepts is most definitely not limited by the hardware. In particular, CSP concepts are embodied in many languages (and some hardware), of which xC is the most "pure" example.
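
For anyone who hasn't met CSP: the core idea is processes that share no state and communicate only over blocking channels. A rough sketch of that idea in plain C with POSIX threads follows; chan_send and chan_recv are made-up names for illustration, not any real API:

Code:

/* Compile with -pthread. One producer, one consumer, one blocking channel. */
#include <pthread.h>
#include <stdio.h>

typedef struct {
    pthread_mutex_t m;
    pthread_cond_t cv;
    int value, full;
} chan_t;

static void chan_send(chan_t *c, int v)
{
    pthread_mutex_lock(&c->m);
    while (c->full) pthread_cond_wait(&c->cv, &c->m);   /* wait for the reader */
    c->value = v; c->full = 1;
    pthread_cond_broadcast(&c->cv);
    pthread_mutex_unlock(&c->m);
}

static int chan_recv(chan_t *c)
{
    pthread_mutex_lock(&c->m);
    while (!c->full) pthread_cond_wait(&c->cv, &c->m);  /* wait for the writer */
    int v = c->value; c->full = 0;
    pthread_cond_broadcast(&c->cv);
    pthread_mutex_unlock(&c->m);
    return v;
}

static chan_t ch = { PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER, 0, 0 };

static void *producer(void *arg)
{
    (void)arg;
    for (int i = 0; i < 5; i++) chan_send(&ch, i * i);
    chan_send(&ch, -1);                                 /* end-of-stream marker */
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, producer, NULL);
    for (int v; (v = chan_recv(&ch)) != -1; )
        printf("received %d\n", v);
    pthread_join(t, NULL);
    return 0;
}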

The concepts will survive and be useful during an entire career, even though languages come and go and mutate to breaking point.

The OP would do well to understand the concepts.
Title: Re: Too many programming languages?
Post by: Kjelt on September 26, 2019, 08:19:22 am
Perhaps a good idea to go back on topic: programming languages?
Title: Re: Too many programming languages?
Post by: westfw on September 26, 2019, 09:43:15 am
OK.
There is no reason NOT to have exposed yourself at least briefly to a lot of languages.  Vastly Different languages.


Even if you were to become an embedded programmer who does all their products in C, you'll still need to understand "make" and "shell scripts" (or .bat files), possibly including a bunch of those obscure Unix tools like sed, awk, and perl (all part of "shell scripts", I guess). For Windows these days you should probably learn PowerShell.


People write Python programs to extract data from spreadsheets, XML, JSON, and so on, for use by their C programs.

There's no reason NOT to expose yourself to Forth. If nothing else, Forth-like interpreters are an easy way to put a relatively complex UI on your product. Even if you actually write the interpreter in C, keeping to some of the standard Forth word names is helpful to other people (and maybe you won't have to document as much).
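
As a flavour of how little that takes, here is a minimal sketch of such an interpreter in C, using the classic dispatch-table approach; the word names ("+", "dup", ".") follow Forth convention, and everything else is illustrative:

Code:

/* A Forth-like command interpreter: each "word" is looked up in a table
   and run against a data stack. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static long stack[32];
static int sp;                        /* no bounds checking: it's a sketch */

static void push(long v) { stack[sp++] = v; }
static long pop(void)    { return stack[--sp]; }

static void w_add(void) { long b = pop(); push(pop() + b); }
static void w_dup(void) { long a = pop(); push(a); push(a); }
static void w_dot(void) { printf("%ld\n", pop()); }

static const struct { const char *name; void (*fn)(void); } words[] = {
    { "+", w_add }, { "dup", w_dup }, { ".", w_dot },
};

static void interpret(char *line)
{
    for (char *t = strtok(line, " \t\n"); t; t = strtok(NULL, " \t\n")) {
        int found = 0;
        for (size_t i = 0; i < sizeof words / sizeof words[0]; i++)
            if (strcmp(t, words[i].name) == 0) { words[i].fn(); found = 1; break; }
        if (!found)
            push(strtol(t, NULL, 0)); /* not a known word: treat it as a number */
    }
}

int main(void)
{
    char line[128];                   /* try: 2 3 + dup . .   ->  5 then 5 */
    while (fgets(line, sizeof line, stdin))
        interpret(line);
    return 0;
}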
Your editor might have an extension language that isn't C-like (Emacs uses Lisp!), and you might need it.
You'd need to know the assembly language of your target CPU, occasionally. And since that target is likely to change over the years, you should probably be pretty familiar with more than one assembler. Assembly macros are their own language, and they're useful too.

You should figure out why everyone hates COBOL.  It wouldn't hurt to look at some of the other "historic" languages (Fortran, Algol, BASIC.)

You should figure out why the Aviation folk want Ada.

You should have at least an inkling of how to program a desktop and a cellphone, because those are frequently the user interface for an embedded system.
You should try a hardware definition language.  And a math tool that does symbolic manipulation (so you can cheat on your calculus homework!)

Specialization is for Insects.
Title: Re: Too many programming languages?
Post by: Kjelt on September 26, 2019, 09:53:45 am
OK.
There is no reason NOT to have exposed yourself at least briefly to a lot of languages.  Vastly Different languages.


Even if you were to become an embedded programmer who does all their products in C, you'll still need to understand "make" and "shell scripts" (or .bat files), possibly including a bunch of those obscure Unix tools like sed, awk, and perl (all part of "shell scripts", I guess). For Windows these days you should probably learn PowerShell.


People write Python programs to extract data from spreadsheets, XML, JSON, and so on, for use by their C programs.

There's no reason NOT to expose yourself to Forth. If nothing else, Forth-like interpreters are an easy way to put a relatively complex UI on your product. Even if you actually write the interpreter in C, keeping to some of the standard Forth word names is helpful to other people (and maybe you won't have to document as much).
Your editor might have an extension language that isn't C-like (Emacs uses Lisp!), and you might need it.
You'd need to know the assembly language of your target CPU, occasionally. And since that target is likely to change over the years, you should probably be pretty familiar with more than one assembler. Assembly macros are their own language, and they're useful too.

You should figure out why everyone hates COBOL.  It wouldn't hurt to look at some of the other "historic" languages (Fortran, Algol, BASIC.)

You should figure out why the Aviation folk want Ada.

You should have at least an inkling of how to program a desktop and a cellphone, because those are frequently the user interface for an embedded system.
You should try a hardware definition language.  And a math tool that does symbolic manipulation (so you can cheat on your calculus homework!)

Specialization is for Insects.
Are you a manager by any chance? They have no clue how much time it takes to sniff at a language and learn something significant.
You can always say you have to know everything about everything. I don't agree.

You want to know a few programming and scripting languages, and learn them when needed on the job; if they're not needed and you are not interested, don't bother.
Perhaps in the 60s you could know everything; nowadays that is really impossible, and you'd better be pretty good at one thing (a specialist) and moderately good at many other things, rather than moderately bad at everything. At least that is my experience. In our programming team we know who to contact for which subject, because that person knows more about it than the rest of the team; everyone has some kind of specialization (except the newbies, who get to choose their future expertise in a team decision).

If we had 8 specialists with the exact same specialization, we could not do our job.
If we had 8 generalists who knew a moderate amount about everything, we could not do our job either, especially not on time, because we need to dig in to solve some difficult issues.

Besides the programming and scripting languages, compilers, and other tools like version control etc., the most important thing for any company is the domain knowledge.
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 10:02:27 am
Perhaps a good idea to go back on topic: programming languages?

It remains on the topic of the thread's title, even if you don't recognise that!
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 10:21:23 am
Are you a manager by any chance? They have no clue how much time it takes to sniff at a language and learn something significant.

With experience and a halfway decent description of why a language exists (e.g. a whitepaper), it takes between 15 minutes and an hour to decide that language X is merely a minor variation on existing languages. Then language X can be ignored. That deals with 95% of "new" languages.

For the other 5%, it takes a couple of hours to understand the key benefits. You then remember the benefits and, when the benefits are sufficiently compelling for the current task, you start to learn and use that language. Today's example: I'm using OpenSCAD to create a collet for a 2465 fan, because it is a good fit for the problem at hand.

For beginners the timescale will be longer, of course.

Quote
You can always say you have to know everything about everything. I don't agree.

Strawman argument; nobody has said that.
Title: Re: Too many programming languages?
Post by: obiwanjacobi on September 26, 2019, 10:51:27 am
You are all so wrong!
 ;D

It is not the language you have to learn, it's the ecosystem (of a specific language/technology) you have to master.

Once you have some experience with any programming language syntax, learning other languages is really not a big deal (usually).
But learning the environment is what requires time and experience.
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 11:08:55 am
You are all so wrong!
 ;D

It is not the language you have to learn, it's the ecosystem (of a specific language/technology) you have to master.

Once you have some experience with any programming language syntax, learning other languages is really not a big deal (usually).
But learning the environment is what requires time and experience.

There's some truth in that, but the ecosystem is a secondary consideration since (to a large extent) the ecosystem comes with the language. Many ecosystems are similar in concept, but the details require learning.

Learning a new language is relatively easy provided that the language is of a similar type. Examples: Delphi/Pascal, Java/C#.

But knowing C isn't much help when it comes to learning R, OpenSCAD, Java, Erlang, VHDL, Matlab etc. Why? Because the underlying concepts and semantics are so radically different.

Knowing when not to use a language is an important skill.
Title: Re: Too many programming languages?
Post by: brucehoult on September 26, 2019, 11:26:21 am
You are all so wrong!
 ;D

It is not the language you have to learn, it's the ecosystem (of a specific language/technology) you have to master.

Once you have some experience with any programming language syntax, learning other languages is really not a big deal (usually).
But learning the environment is what requires time and experience.

Even if you know the language, even if you know the ecosystem of standard libraries, by far the biggest task at a new company is learning all the tens of thousands of lines of local code they've written, often much more poorly designed and certainly less well documented than any language or standard library.

If you *don't* know the language and standard library used at your new job, learning those is usually a much smaller job than learning the local environment; it's almost trivial by comparison. And you can google language and standard-library questions.
Title: Re: Too many programming languages?
Post by: Kjelt on September 26, 2019, 11:57:41 am
Even if you know the language, even if you know the ecosystem of standard libraries, by far the biggest task at a new company is learning all the tens of thousands of lines of local code they've written, often much more poorly designed and certainly less well documented than any language or standard library.
100% agree, and at my last job it was more like 36 MLOC without documentation; and no, in two years you don't master that.
Title: Re: Too many programming languages?
Post by: Berni on September 26, 2019, 03:09:43 pm
I'm not.

In my first post I wrote (new emphasis) "You should learn the concepts, advantages and disadvantages of one example of each type of programming language. That will enable you to choose the right tool for the job, just as you should know whether to use screws, or nails, or bolts. Once you know the concepts, picking up the next language of that type is simple."

Yep, exactly: the right tool for the right job.

And the right job for the tool called HDLs is describing the behavior of complex digital circuitry, as I pointed out in the very beginning. If you have sensible real-world examples of an HDL used for something else, I would love to see them.

Quote
I was only involved enough at XMOS to get a slightly earlier heads-up on new products and some early access to dev tools. So I don't know the long-term road map for their products, but what I've seen so far was not living up to my multiprocessing expectations; hopefully they can change that in the future, because it could be something truly innovative for computing. After all, the useful applications of the language are determined by the hardware running it, and so far XC only compiles for XMOS chips.

Can you name an end application for XMOS processors where they really have an advantage over existing solutions like MCUs, DSPs, or FPGAs?

The word "really" implies a subjective response, and there would be legitimate differences of opinion. That's true for all languages and technologies.

Most importantly for this thread, the application of concepts is most definitely not limited by the hardware. In particular, CSP concepts are embodied in many languages (and some hardware), of which xC is the most "pure" example.

The concepts will survive and be useful during an entire career, even though languages come and go and mutate to breaking point.

The OP would do well to understand the concepts.

Yes, that is all fair and good, but the end goal of using a programming language is to create some useful output. So far XC can only be compiled into machine code for XMOS chips, so the only reason you would want to learn this language is if your end goal is running your program on one of those chips. If you want to run your code on something else, like a PC, you can do similar multitasking things in C++ or C#, etc.

It makes no sense to learn new programming languages just for the sake of learning languages. Just go for what you are trying to do, have a look at what languages other people are using to do that, choose one of them, and learn it as you go. Once you know one language it's quick to pick up another one.

So since the main practical home-user use for an HDL is "FPGA stuff", it makes little sense to learn one until you actually get to using FPGAs and need it. Just don't fall into the "when you know how to use a hammer, every problem looks like a nail" trap, where you learn one single programming language and then use it for everything regardless of whether it is a good fit. Be open to learning new languages when there is good reason to do so. It usually takes longer to learn the environment and the useful libraries than the language itself anyway.

Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 04:15:20 pm
I'm not.

In my first post I wrote (new emphasis) "You should learn the concepts, advantages and disadvantages of one example of each type of programming language. That will enable you to choose the right tool for the job, just as you should know whether to use screws, or nails, or bolts. Once you know the concepts, picking up the next language of that type is simple."

Yep, exactly: the right tool for the right job.

And the right job for the tool called HDLs is describing the behavior of complex digital circuitry, as I pointed out in the very beginning. If you have sensible real-world examples of an HDL used for something else, I would love to see them.

You have a strange view of the world: selecting a job for a tool. Most people select a tool for the job.

I don't have any examples that are in the public domain; sorry. However, you could start acquainting yourself with a subset of the concepts, such as
https://www.vhdl-online.de/vhdl-ams/examples (https://www.vhdl-online.de/vhdl-ams/examples)
http://www.denverpels.org/Downloads/Denver_PELS_20071113_Cooper_VHDL-AMS.pdf (http://www.denverpels.org/Downloads/Denver_PELS_20071113_Cooper_VHDL-AMS.pdf)
https://www.doulos.com/content/training/vhdl-ams_training.php (https://www.doulos.com/content/training/vhdl-ams_training.php)

Quote
Quote
I was only involved enough at XMOS to get a slightly earlier heads-up on new products and some early access to dev tools. So I don't know the long-term road map for their products, but what I've seen so far was not living up to my multiprocessing expectations; hopefully they can change that in the future, because it could be something truly innovative for computing. After all, the useful applications of the language are determined by the hardware running it, and so far XC only compiles for XMOS chips.

Can you name an end application for XMOS processors where they really have an advantage over existing solutions like MCUs, DSPs, or FPGAs?

The word "really" implies a subjective response, and there would be legitimate differences of opinion. That's true for all languages and technologies.

Most importantly for this thread, the application of concepts is most definitely not limited by the hardware. In particular, CSP concepts are embodied in many languages (and some hardware), of which xC is the most "pure" example.

The concepts will survive and be useful during an entire career, even though languages come and go and mutate to breaking point.

The OP would do well to understand the concepts.

Yes, that is all fair and good, but the end goal of using a programming language is to create some useful output. So far XC can only be compiled into machine code for XMOS chips, so the only reason you would want to learn this language is if your end goal is running your program on one of those chips. If you want to run your code on something else, like a PC, you can do similar multitasking things in C++ or C#, etc.

It makes no sense to learn new programming languages just for the sake of learning languages.

Actually, it does make sense; careful choice of another different type of language can make you a better engineer.


Quote
Just go for what you are trying to do, have a look at what languages other people are using to do that, choose one of them, and learn it as you go. Once you know one language it's quick to pick up another one.

...of the same type of language.

It takes a long time to "pick up" OOP if you have only programmed in a procedural language like C. I could tell you some success stories and some horror stories, but this isn't the place!
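
To give a flavour of why the jump is conceptual rather than syntactic: the heart of OOP, dynamic dispatch, can be hand-rolled in C with function pointers, yet nothing in everyday C pushes you toward structuring code this way. A hedged sketch, with illustrative shape types:

Code:

/* Dynamic dispatch in plain C: a "vtable" of one function pointer per type. */
#include <stdio.h>

struct shape {                              /* the "base class" */
    double (*area)(const struct shape *self);
};

struct square { struct shape base; double side; };
struct circle { struct shape base; double r; };

static double square_area(const struct shape *s)
{
    const struct square *sq = (const struct square *)s;  /* base is first member */
    return sq->side * sq->side;
}

static double circle_area(const struct shape *s)
{
    const struct circle *c = (const struct circle *)s;
    return 3.141592653589793 * c->r * c->r;
}

int main(void)
{
    struct square sq = { { square_area }, 2.0 };
    struct circle ci = { { circle_area }, 1.0 };
    const struct shape *shapes[] = { &sq.base, &ci.base };
    for (int i = 0; i < 2; i++)             /* one call site, two behaviours */
        printf("area = %f\n", shapes[i]->area(shapes[i]));
    return 0;
}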

Quote
So since the main practical home-user use for an HDL is "FPGA stuff", it makes little sense to learn one until you actually get to using FPGAs and need it. Just don't fall into the "when you know how to use a hammer, every problem looks like a nail" trap, where you learn one single programming language and then use it for everything regardless of whether it is a good fit. Be open to learning new languages when there is good reason to do so. It usually takes longer to learn the environment and the useful libraries than the language itself anyway.

But if you have no concept of how the benefits and applicability of an HDL/statistics/OOP/functional/forward-chaining/backward-chaining/actor language differ from, say, C, then you will only ever use C. That leads to people who insert screws with hammers.
Title: Re: Too many programming languages?
Post by: SiliconWizard on September 26, 2019, 04:23:48 pm
I'm not.

In my first post I wrote (new emphasis) "You should learn the concepts, advantages and disadvantages of one example of each type of programming language. That will enable you to choose the right tool for the job, just as you should know whether to use screws, or nails, or bolts. Once you know the concepts, picking up the next language of that type is simple."

Yep, exactly: the right tool for the right job.

And the right job for the tool called HDLs is describing the behavior of complex digital circuitry, as I pointed out in the very beginning. If you have sensible real-world examples of an HDL used for something else, I would love to see them.

You have a strange view of the world: selecting a job for a tool. Most people select a tool for the job.

Well, that sentence looked weird. I agree with you. ;D

And yes, you can do a lot of things with an HDL apart from pure digital circuit design. VHDL in particular (and probably SV, although I don't know it much) is a full-fledged language with A LOT in common with Ada. There are even library functions for file I/O and other things. One obvious use is sophisticated simulation programs, but you can do pretty much whatever you want with it as long as you're not relying on a full-fledged runtime (as I said, you get limited functionality such as file I/O, but you can still do a lot).
Title: Re: Too many programming languages?
Post by: Berni on September 26, 2019, 05:44:42 pm
Okay, I will admit I worded that a bit backwards, since you typically select your language according to what you are trying to do.

That is some impressive analog circuitry simulation in VHDL. It works in a pretty elegant way too, though most people will tend to use SPICE for this.

I don't mean to ignore other languages until you find a problem that requires one. But you can get a pretty good idea of what a language is good at by googling around a bit and looking through some tutorials for it; there is no need to actually learn it by writing your own code. Yes, object-oriented programming is a significant step up from your usual raw C, but not so significant that a good C programmer couldn't wrap their head around the concept by reading a few quick tutorials. I'd say the important thing is just not to judge a programming language by how it looks at first glance; some languages are ugly to look at (the definition of ugly depends a lot on what you are used to) but are still great at getting the job done.

Especially if you get into web development, you enter a world of a gazillion different languages and libraries that are constantly changing, a lot of them doing the same thing in a slightly different way.

Knowing 2 or 3 languages well is better than knowing 10 languages badly.
Title: Re: Too many programming languages?
Post by: emece67 on September 26, 2019, 06:19:44 pm
.
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 06:33:31 pm
Okay, I will admit I worded that a bit backwards, since you typically select your language according to what you are trying to do.

That is some impressive analog circuitry simulation in VHDL. It works in a pretty elegant way too, though most people will tend to use SPICE for this.

For an analogue problem with a little digital control, Spice is probably better.

The analogue capabilities of VHDL are merely one small example of system design, where digital logic is a small part of the system and where the boundaries between technologies are fluid and will change during the course of a project's lifetime.

One extreme example of the boundaries being ill-defined is the high-frequency trading world. That mob puts the entire protocol stack in FPGAs: the MAC layer, the transport protocols, and the business rules.

Quote
I don't mean to ignore other languages until you find a problem that requires one. But you can get a pretty good idea of what a language is good at by googling around a bit and looking through some tutorials for it; there is no need to actually learn it by writing your own code.

Most language tutorials are poor. The last good white paper I read was in 1996, by James Gosling on the philosophy of Java.

Quote
Yes, object-oriented programming is a significant step up from your usual raw C, but not so significant that a good C programmer couldn't wrap their head around the concept by reading a few quick tutorials.

Quick tutorials are likely to omit key fundamentals. One example is Python's GIL, which prevents it from having any scalability on multiprocessor machines.

Quick tutorials are unlikely to show how well solutions can be maintained and enhanced with different staff.

Quote
I'd say the important thing is just not to judge a programming language by how it looks at first glance; some languages are ugly to look at (the definition of ugly depends a lot on what you are used to) but are still great at getting the job done.

Especially if you get into web development, you enter a world of a gazillion different languages and libraries that are constantly changing, a lot of them doing the same thing in a slightly different way.

That's a classic example of "languages" with minuscule differences. Know that they exist and why; learn one if the need arises.

Quote
Knowing 2 or 3 languages well is better than knowing 10 languages badly.

You'll never know many languages well. But you need to know enough to avoid hammering in screws.

The worst example of that was someone who had to get a value from one Unix process to another. He knew databases, so he did it by having one process add a row to a table and the other process poll it with database queries. Sockets? What are they?
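
For contrast, the value could have been passed with the tool made for the job in a couple of dozen lines. A minimal POSIX sketch (error handling pared down; the "42" payload is illustrative):

Code:

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <sys/wait.h>

int main(void)
{
    int fds[2];                              /* a connected pair of sockets */
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) < 0) return 1;

    if (fork() == 0) {                       /* child: the producer */
        close(fds[0]);
        const char *msg = "42";
        write(fds[1], msg, strlen(msg) + 1);
        close(fds[1]);
        _exit(0);
    }

    close(fds[1]);                           /* parent: the consumer */
    char buf[16] = {0};
    read(fds[0], buf, sizeof buf - 1);
    printf("got: %s\n", buf);
    close(fds[0]);
    wait(NULL);
    return 0;
}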
Title: Re: Too many programming languages?
Post by: tggzzz on September 26, 2019, 06:38:44 pm
I don't have any examples that are in the public domain; sorry. However, you could start acquainting yourself with a subset of the concepts, such as
https://www.vhdl-online.de/vhdl-ams/examples (https://www.vhdl-online.de/vhdl-ams/examples)
http://www.denverpels.org/Downloads/Denver_PELS_20071113_Cooper_VHDL-AMS.pdf (http://www.denverpels.org/Downloads/Denver_PELS_20071113_Cooper_VHDL-AMS.pdf)
https://www.doulos.com/content/training/vhdl-ams_training.php (https://www.doulos.com/content/training/vhdl-ams_training.php)

The first time I heard about VHDL-AMS was in 2005. Since then I have not met anyone using it. Obviously this doesn't mean that it is not used, but I am curious about real projects using VHDL-AMS. Does it really have any significant role in industry?

I don't know, and that question should always be evaluated. Mind you, when I first used Java commercially (early 1996), there were no examples of commercial usage, and even the best available IDE was written in C! Nonetheless, Gosling's whitepaper was a model of exposing the key features, and it was clear that Java was a significant advance and would become important. (Ditto C# five years later, except that its whitepaper made it clear there was no significant advance!)

The URLs are intended to dispel any notion that HDL == digital logic.
The URLs are not intended to illustrate how large complex systems can be modelled, implemented using stepwise refinement, and simulated.
Title: Re: Too many programming languages?
Post by: SparkyFX on September 27, 2019, 12:17:42 am
Are you a manager by any chance? They have no clue how much time it takes to sniff at a language and learn something significant.
Having seen code samples probably does not justify putting a language on your resume, but that has never been the question. The question is more or less whether learning a single language, and that alone, might be a good decision, as mentioned by the OP.

Quote
Perhaps in the 60s you could know everything, nowadays that is really impossible and you better be pretty good at one thing (specialist) and moderately good at many other things, then be moderately to bad in everything. At least that is my experience.
In my experience you can be good at anything you currently work with, as long as you keep learning. If you learned C once and haven't used it in 20 years, better not call that competence. That would be as bad as working in that single language but using outdated techniques all over the place and getting the complaints about it. So the question of what to learn is second to actually keeping learning.

Sometimes this can be as simple as feeling comfortable in an environment, looking up documentation and working within this ecosystem - in other words, enabling yourself to learn as fast as you can.

Quote
In our programming team
That's a matter of where to find work... later, kind of a goal - maybe, because I would not have interpreted the question to include whether electronics engineers should aim for the domain of CS and vice versa.
Title: Re: Too many programming languages?
Post by: beanflying on September 27, 2019, 12:35:17 am
Much as statistics lie, and ZDNet has been known not to help by stretching the truth: https://www.zdnet.com/article/programming-languages-python-predicted-to-overtake-c-and-java-in-next-4-years/ (https://www.zdnet.com/article/programming-languages-python-predicted-to-overtake-c-and-java-in-next-4-years/). This data 'claims' to be from 12 million IT heads.

It is going to be different from what a strict EE study would produce, but C and Python (quickly rising in use!) make sense. I am still looking, but Forth is clearly hidden, to protect its power, wide use and appeal from view :palm:

(https://zdnet3.cbsistatic.com/hub/i/2019/06/10/596a5dcd-6d21-4a00-ab24-d75f867f4aee/2ccefb0455c8e06ac22636d01f503501/tiobeindexjune19.jpg)
Title: Re: Too many programming languages?
Post by: brucehoult on September 27, 2019, 01:02:42 am
Assembly language up six places!!

It's clearly the Next Big Thing. Groovy!
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 01:49:53 am
Much as statistics lie, and ZDNet has been known not to help by stretching the truth: https://www.zdnet.com/article/programming-languages-python-predicted-to-overtake-c-and-java-in-next-4-years/ (https://www.zdnet.com/article/programming-languages-python-predicted-to-overtake-c-and-java-in-next-4-years/). This data 'claims' to be from 12 million IT heads.

It is going to be different from what a strict EE study would produce, but C and Python (quickly rising in use!) make sense. I am still looking, but Forth is clearly hidden, to protect its power, wide use and appeal from view :palm:

(https://zdnet3.cbsistatic.com/hub/i/2019/06/10/596a5dcd-6d21-4a00-ab24-d75f867f4aee/2ccefb0455c8e06ac22636d01f503501/tiobeindexjune19.jpg)

The OP wanted to debug the I2C peripheral on his STM32 MCU.

C is not real-time interactive, and Python won't run on the OP's STM32 MCU. :palm:

MicroPython will run on it interactively, provided the MCU has 256 KB flash and 16 KB RAM *just for MicroPython*. It's also 30 times slower than Forth.

Guess which language from your list produced this real-time STM32 I2C register contents printout picture?

And this real time resource stat ?

free (bytes)
FLASH.. TOTAL REPORTED: 65536 USED: 58368 FREE: 7168
RAM.... TOTAL PRESET: 8192 USED: 1180 FREE: 7012


Title: Re: Too many programming languages?
Post by: legacy on September 27, 2019, 01:58:49 am
And this real time resource stat ?

forget also ucLisp; this (http://www.ulisp.com/show?1AA0) is pure crap, and implementing a minimal interpreter doesn't suit a small memory model very well, because it consumes a lot of stack and CPU cycles  :-//
Title: Re: Too many programming languages?
Post by: beanflying on September 27, 2019, 02:11:43 am

The OP wanted to debug the I2C peripheral on his STM32 MCU.

C is not real-time interactive, and Python won't run on the OP's STM32 MCU. :palm:

MicroPython will run on it interactively, provided the MCU has 256 KB flash and 16 KB RAM *just for MicroPython*. It's also 30 times slower than Forth.

Guess which language from your list produced this real-time STM32 I2C register contents printout picture?

And this real time resource stat ?

free (bytes)
FLASH.. TOTAL REPORTED: 65536 USED: 58368 FREE: 7168
RAM.... TOTAL PRESET: 8192 USED: 1180 FREE: 7012

You reply like a true zealot: there is only one true programming language, and all the heretics must be burned at the stake.  :-DD

Tone it down; we don't need a fanatical sermon from your keyboard! Forth as a 'specific' language is NOT appropriate as a solution for everyone, and is NOT a solution for the project the OP is embarking on, much as it could be used for PART of it.

There are more widely used solutions that are much better supported for the OP.

The original data I linked above, https://www.tiobe.com/tiobe-index/ (https://www.tiobe.com/tiobe-index/), puts Forth and the others somewhere below 0.2%.
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 02:18:35 am
And this real time resource stat ?

forget also ucLisp; this (http://www.ulisp.com/show?1AA0) is pure crap, and implementing a minimal interpreter doesn't suit a small memory model very well, because it consumes a lot of stack and CPU cycles.

Perhaps Forth is the only choice in this case  :-//

I think Forth is probably a suitable choice in this case, given the STM32 I2C debugging issue and the fact that the OP asked what language to learn. Naturally I didn't imply that Forth should be the ONLY language he learns.

Another factor affecting all real-time interactive languages other than Forth is the small but regular time they spend garbage collecting, which can be an issue when the MCU is needed to do other things at that moment.
Title: Re: Too many programming languages?
Post by: legacy on September 27, 2019, 02:22:36 am
So probably a good solution, in this case, is something like a little GDB stub, or a true hardware ICE.
What do you think?
Title: Re: Too many programming languages?
Post by: Nusa on September 27, 2019, 02:25:19 am
Overall language statistics only matter if you have no focus on what you intend to do. Once you start talking about a particular field or subject matter, the choices become more refined. If you're involved in the hard sciences, FORTRAN is still very relevant to most of those fields. If you're doing work for large business/finance, COBOL is something to know about. One can still make a living working with both of those 60+ year-old languages, despite them not even being on your popular list. People have been predicting their death for half my life. Similarly, I predict C and C++ are going to still be relevant and useful languages after most of us are dead.
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 02:32:39 am

The OP wanted to debug the I2C peripheral on his STM32 MCU.
C is not real-time interactive, and Python won't run on the OP's STM32 MCU. :palm:

You reply like a true zealot: there is only one true programming language, and all the heretics must be burned at the stake.  :-DD

Tone it down; we don't need a fanatical sermon from your keyboard! Forth as a 'specific' language is NOT appropriate as a solution for everyone, and is NOT a solution for the project the OP is embarking on, much as it could be used for PART of it.

There are more widely used solutions that are much better supported for the OP.

The original data I linked above, https://www.tiobe.com/tiobe-index/ (https://www.tiobe.com/tiobe-index/), puts Forth and the others somewhere below 0.2%.

In other words, you didn't understand a single thing I wrote, as shown by your utter lack of technical argument.

Your use of the royal "we", in attempting to speak on behalf of the millions on this forum, doesn't bode well for your mental state either.

I recommend you take a Valium or two, settle down and get a life.

Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 02:48:53 am
So probably a good solution, in this case, is something like a little GDB stub, or a true hardware ICE.
What do you think?

In my opinion, for STM32, C and GDB are outstanding: they are free, available for many architectures, and usable in small environments such as gdb-tui or massive environments such as the Java-based Eclipse. I think it's the best of the non-interactive solutions available for small embedded work. It's not interactive, nor particularly trivial to learn and set up, but the C binaries GCC produces are about three times faster than Forth on the same hardware, and smaller, as no 19 KB on-chip Forth kernel is required.

I have gdb-tui set up here, and it runs from a single shell script, with all the usual setup done by the time the pseudo-GUI appears. It is very impressive compared to what I had 20 years ago!

I can't comment on a true hardware ICE, as I have never been able to afford one and have never used one anywhere I have worked. Perhaps someone with firsthand experience of them may comment.


 
Title: Re: Too many programming languages?
Post by: hamster_nz on September 27, 2019, 02:50:48 am
The OP wanted to debug the I2C peripheral on his STM32 MCU.

C is not real-time interactive, and Python won't run on the OP's STM32 MCU. :palm:

MicroPython will run on it interactively, provided the MCU has 256 KB flash and 16 KB RAM *just for MicroPython*. It's also 30 times slower than Forth.

Guess which language from your list produced this real-time STM32 I2C register contents printout picture?

And this real time resource stat ?

free (bytes)
FLASH.. TOTAL REPORTED: 65536 USED: 58368 FREE: 7168
RAM.... TOTAL PRESET: 8192 USED: 1180 FREE: 7012

BRING BACK BASIC!

(just kidding)

However, a derivative of my TinyBASIC port for the Arduino (http://hamsterworks.co.nz/mediawiki/index.php/Arduino_Basic (http://hamsterworks.co.nz/mediawiki/index.php/Arduino_Basic)) is in the masked ROM of every ESP32 chip.

Flash used - zero
RAM used - a tiny bit for the stack the rest is free for code.

I only found out about this from the Hackaday post - https://hackaday.com/2016/10/27/basic-interpreter-hidden-in-esp32-silicon/ (https://hackaday.com/2016/10/27/basic-interpreter-hidden-in-esp32-silicon/)
Title: Re: Too many programming languages?
Post by: beanflying on September 27, 2019, 03:01:34 am
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include but not limited to C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?

p.s.: and one programming language's introduction textbook is often as thick as a brick used in the Empire State Building.

  :horse: :popcorn:

It depends on what you want to do. You said you're an electronics engineer, so I'll assume you mean embedded control.

This is a contentious and flame-prone question, and it will no doubt soon devolve into rants, FUD, and being closed by the admins, especially if one offends the gods of C, but before it gets that far I recommend:

Low level you can't beat Forth
High level you can't beat Forth

So my advice is Forth. It's not 'popular' like C, but it's simple enough that you can learn all of Forth, and even write your own Forth, in a decade or so, providing you put in some effort every day. After that you can hold the entire concept in your head.

You will need to be fluent in assembler for your chip type(s), but you can learn assembler as you learn Forth; they complement each other.

It's quite unlikely you could learn all of C and write your own comprehensive C compiler in that same time frame.

Here is a Forth sample. I recommend asking other programming language advocates to submit their code for this same task so you can compare them. Note I didn't need any libraries, and there are no hidden include files; everything is present here for an STM32F0xx Cortex-M0 MCU.

This program calculates baud rates by reading the MCU configuration. It only works with USART1 16-bit oversampling, which is the reboot default, and will exit if 8-bit oversampling is in use.

It uses Mecrisp-Stellaris s31.32 fixed-point support to calculate the baud rate to two decimal places, so you can choose the best BRR integer to use when setting up your terminal baud rate.

More information about Forth is available in my SIG below.

.

You entered this thread preaching a language with a tiny following, trying to convert the unwashed masses, well before the OP added some more detail.

Get off the pulpit!  :palm:
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 03:05:51 am
BRING BACK BASIC!

(just kidding)

However, a derivative of my TinyBASIC port for the Arduino (http://hamsterworks.co.nz/mediawiki/index.php/Arduino_Basic (http://hamsterworks.co.nz/mediawiki/index.php/Arduino_Basic)) is in the masked ROM of every ESP32 chip.

Flash used - zero
RAM used - a tiny bit for the stack the rest is free for code.

I only found out about this from the Hackaday post - https://hackaday.com/2016/10/27/basic-interpreter-hidden-in-esp32-silicon/ (https://hackaday.com/2016/10/27/basic-interpreter-hidden-in-esp32-silicon/)

WHAT THE SH*T !!!!!

That's pretty darn cool!!!

When they pinch your code and it turns up in something as widespread as the ESP32 silicon, that's gotta be HUGE flattery :)

Nothing wrong with Basic for interactive embedded development either. I still have an Intel 8049 with Dartmouth Basic in ROM, from a battery-powered portable hardware testing unit I made around 1991. Using it in the field was the best thing since sliced bread back then.
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 03:19:01 am

It depends on what you want to do. You said you're an electronics engineer, so I'll assume you mean embedded control.

This is a contentious and flame-prone question, and it will no doubt soon devolve into rants, FUD, and being closed by the admins, especially if one offends the gods of C, but before it gets that far I recommend:

Low level you can't beat Forth
High level you can't beat Forth

So my advice is Forth. It's not 'popular' like C, but it's simple enough that you can learn all of Forth, and even write your own Forth, in a decade or so, providing you put in some effort every day. After that you can hold the entire concept in your head.

You entered this thread preaching a language with a tiny following, trying to convert the unwashed masses, well before the OP added some more detail.

Get off the pulpit!  :palm:

No, little Arduino troll, I didn't write that. You added the BOLD typeface and didn't bother to mention it to the readers.

But thank you for proving my second line was accurate.

Title: Re: Too many programming languages?
Post by: Berni on September 27, 2019, 05:28:29 am
The OP wanted to debug the I2C peripheral on his STM32 MCU.

C is not real-time interactive, and Python won't run on the OP's STM32 MCU. :palm:

MicroPython will run on it interactively, provided the MCU has 256 KB flash and 16 KB RAM *just for MicroPython*. It's also 30 times slower than Forth.

Guess which language from your list produced this real-time STM32 I2C register contents printout picture?

And this real time resource stat ?

free (bytes)
FLASH.. TOTAL REPORTED: 65536 USED: 58368 FREE: 7168
RAM.... TOTAL PRESET: 8192 USED: 1180 FREE: 7012

BRING BACK BASIC!

(just kidding)

However, a derivative of my TinyBASIC port for the Arduino (http://hamsterworks.co.nz/mediawiki/index.php/Arduino_Basic (http://hamsterworks.co.nz/mediawiki/index.php/Arduino_Basic)) is in the masked ROM of every ESP32 chip.

Flash used - zero
RAM used - a tiny bit for the stack the rest is free for code.

I only found out about this from the Hackaday post - https://hackaday.com/2016/10/27/basic-interpreter-hidden-in-esp32-silicon/ (https://hackaday.com/2016/10/27/basic-interpreter-hidden-in-esp32-silicon/)

I had no idea about Basic being inside the ESP32. It's likely meant just as an Easter egg, but it does appear to be functional enough to actually do something useful.

But yeah, these days if you want to program with an interactive shell, Python is the way to go. Yes, it's not exactly fast or resource-efficient, but when you want to quickly throw together a little tool it's hard to beat.

You are not going to want to build an entire product in Python. But the tons of community-maintained libraries and the simple, easy-to-read language make it excellent for getting a working program in as short a time as possible. The interactive feature also helps, because you can import your Python program into the interactive console and then call functions and set variables manually, to quickly see how a piece of code or library behaves, rather than iteratively changing a line of code and re-running the whole lot.

For example, if you need a tool that collects the pictures in a folder that have 'A' in their name, checks whether more than 50% of each picture is red, and then saves a thumbnail of it into another folder, you can do that in Python in about 10 or 20 lines, after installing a few libraries through the built-in "pip install" feature.

Running Python on an MCU, though... eww, please no.
Title: Re: Too many programming languages?
Post by: beanflying on September 27, 2019, 06:20:10 am

It depends on what you want to do. You said you're an electronics engineer, so I'll assume you mean embedded control.

This is a contentious and flame-prone question, and it will no doubt soon devolve into rants, FUD, and being closed by the admins, especially if one offends the gods of C, but before it gets that far I recommend:

Low level you can't beat Forth
High level you can't beat Forth

So my advice is Forth. It's not 'popular' like C, but it's simple enough that you can learn all of Forth, and even write your own Forth, in a decade or so, providing you put in some effort every day. After that you can hold the entire concept in your head.

You entered this thread preaching a language with a tiny following, trying to convert the unwashed masses, well before the OP added some more detail.

Get off the pulpit!  :palm:

No, little Arduino troll, I didn't write that. You added the BOLD typeface and didn't bother to mention it to the readers.

But thank you for proving my second line was accurate.

Smile when you say that: I am actually a Fortran and Motorola 6800 troll  :box: What part of the bold type is not preaching from your Forth-or-die mantra? It is one tool, like yourself, that has a purpose, but it is not a universal one!

https://www.embedded.com/electronics-blogs/break-points/4023811/I-Hate-Forth (https://www.embedded.com/electronics-blogs/break-points/4023811/I-Hate-Forth)
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 06:21:49 am
Running Python on an MCU, though... eww, please no.

I reviewed it on a Cortex-M4 with 1 MB flash, and MicroPython was pretty slick, though quite different to my Forth system.

Where I click on 'make' and my Forth source is serially uploaded to my MCU at 460800 baud, MicroPython required a pseudo /dev/usb-bulk storage facility on the target and a GUI on the PC, where one drags and drops the source file into the pseudo /dev/usb-bulk storage, which then flashes the MCU automatically.

It had a nice, well-designed feel about it; I think the MicroPython guys did a good job. For a Forth hater, MicroPython may be the interactive solution, provided they have the on-chip resources such as USB and memory, and don't mind a garbage collector stealing CPU cycles or a 30x slowdown compared to Forth. Of course, for interactive debugging the slowdown isn't usually an issue.

I note that ST now offer this same type of flashing facility in the STM32 Nucleos with SWD v2.1.

Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 06:36:31 am

No, little Arduino troll, I didn't write that. You added the BOLD typeface and didn't bother to mention it to the readers.

But thank you for proving my second line was accurate.

Smile when you say that: I am actually a Fortran and Motorola 6800 troll <snip>

Hard to believe, because you actually recommended Arduino to the OP.

Not that I cared at the time, before you declared yourself a ranting Forth hater.

Title: Re: Too many programming languages?
Post by: beanflying on September 27, 2019, 06:49:38 am
If you want to selectively read a paragraph out of context, yes, I absolutely did, but that is not my background, which is fairly clear from the first bit. Some of the world has moved on from the programming of 30+ years ago, and the OP asked which language he should learn. For good reason I didn't put up Fortran or Pascal, in spite of them still being in use at reasonable levels.

The sensible, even pragmatic, answer is still C or its derivatives and/or Python in the current times. Which micro you wrap them around is largely irrelevant, but the Arduino IDE and platform is an easy one to get into, though certainly not the most powerful.

Forth is niche and has a use, but it is in no way what you claimed it to be in your opening post!
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 07:36:58 am
If you want to selectively read a paragraph out of context, yes, I absolutely did, but that is not my background, which is fairly clear from the first bit. Some of the world has moved on from the programming of 30+ years ago, and the OP asked which language he should learn. For good reason I didn't put up Fortran or Pascal, in spite of them still being in use at reasonable levels.

The sensible, even pragmatic, answer is still C or its derivatives and/or Python in the current times. Which micro you wrap them around is largely irrelevant, but the Arduino IDE and platform is an easy one to get into, though certainly not the most powerful.

Forth is niche and has a use, but it is in no way what you claimed it to be in your opening post!

I only cut for brevity; everyone here can easily scroll down a few inches to see the full extent of your rantings if they so desire.

I'm still wondering why you say anything about Forth at all, as you clearly don't know anything about it. You haven't argued ONE single technical reason why your choice (Arduino) is better than Forth, or why Forth is unsuitable. All I've seen so far are emotive rants and mantras.

Fortran and Pascal are still in widespread use, but neither will run interactively on an STM32, so suggesting them would have been pretty stupid.

The Forth I use is a modern Forth; it runs on Cortex-M, namely the STM32, the same MCU the OP uses.

I wonder if you even realize that C and Forth are around the same age, both having been designed in the early 1970s?

I welcome any technical debate about Forth with you, but statements such as "Forth is niche" are well outside the OP's stated requirement. He asked for suggestions for an embedded programming language to learn, and to debug the I2C on his STM32 board.

Perhaps you would like to try to criticize me again for suggesting a programming language that ran on his STM32 board *before* anyone here knew what hardware he actually had?

You could start by demonstrating how your suggestion of Arduino will help him debug the I2C on his already designed and built STM32 board.

Title: Re: Too many programming languages?
Post by: beanflying on September 27, 2019, 08:03:01 am
Yawn "Which micro you wrap them around is largely irrelevant but the Arduino IDE and platform is an easy one to get into" This is not a which micro is better than yours so stop muddying the topic. Arduino vs the rest is another can of worms for another thread at another time.

You are still preaching for a Language with perhaps less than a 0.2% AGAINST user base against C and derivatives at over 20%. Nothing wrong with Forth and the article I linked from Jack Ganslle while old still holds up and even in the comments there the disciples of Forth are present to defend the faith. https://www.embedded.com/electronics-blogs/break-points/4023811/I-Hate-Forth (https://www.embedded.com/electronics-blogs/break-points/4023811/I-Hate-Forth)

Like suggesting an English speaker someone learn Latin first so they can then learn Italian...... Any guesses which Language I think Forth is?
Title: Re: Too many programming languages?
Post by: legacy on September 27, 2019, 09:43:51 am
BRING BACK BASIC!

(just kidding)

However, a derivative of my TinyBASIC port for the Arduino

I have a copy of "Dr. Dobb's Collection 1994", it comes with "lists" of assembly code. One of them is Basic68k, which has been recently reversed-coded from assembly/68k to C.

Yes, a full inverse processing through the tool Ida combined to a sort of "C language pattern matching", funny, ain't it?

It took several months, according to the author. Anyway, this way it's possible to recompile it for different CPU-targets rather than only for the old m68k, and one immediately consequence: ported to Arduino  :D
Title: Re: Too many programming languages?
Post by: legacy on September 27, 2019, 09:58:35 am
I still have an Intel 8049 with Dartmouth Basic in ROM

(http://www.downthebunker.com/chunk_of/stuff/public/boards/board-8051-elisa1.jpg)

like this one?  :D

8051 with Intel Basic in ROM. GW-Basic ("Basic with line numbers") is still a used programming language. I mean, some people still use it in industry.

In Japan there are still a couple of pocket computer calculators that use Basic. The Sharp PC-E500 comes with GW-Basic, and CASIO Basic is what you find in every graphing scientific calculator, etc., but I have recently seen even a big industrial embroidery machine taking its frame input from a Basic console  :o
Title: Re: Too many programming languages?
Post by: techman-001 on September 27, 2019, 10:49:45 am
I still have an Intel 8049 with Dartmouth Basic in ROM

(http://www.downthebunker.com/chunk_of/stuff/public/boards/board-8051-elisa1.jpg)

like this one?  :D

8051 with Intel Basic in ROM. GW-Basic ("Basic with line numbers") is still a used programming language. I mean, some people still use it in industry.

In Japan there are still a couple of pocket computer calculators that use Basic. The Sharp PC-E500 comes with GW-Basic, and CASIO Basic is what you find in every graphing scientific calculator, etc., but I have recently seen even a big industrial embroidery machine taking its frame input from a Basic console  :o

No, just a spare chip. I was mistaken: it's not an 8049, as you can see from the pic I just took, it's an 8052 with onboard Basic. I think I ordered two back in '85, built up my own board, and kept one as a spare.
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on September 27, 2019, 10:16:56 pm
C++ is an abortionate mess where knowledgeable people spend a lot of time arguing over what ought to happen.

Hee, hee, hee! We share the same opinion of C++.
Title: Re: Too many programming languages?
Post by: mrflibble on September 28, 2019, 08:03:12 pm
C++ is an abortionate mess where knowledgeable people spend a lot of time arguing over what ought to happen.

Hee, hee, hee! We share the same opinion of C++.

I like it! That one is going in the QUOTES file. ;D

And that is from someone who uses said abortionate mess on a voluntary basis. I'm sorry, but OpenMP is just too convenient. ;)

Oh oh oh, and since I vaguely hear some spirited Forth debate in the background ... Question for those more in the know in the land of Forth: Is there a framework along the lines of OpenMP for Forth? Reason I ask is I'd like to do a comparison of several prime sieve implementations written in different languages. And among the permutations are, yes, a few multi-threaded ones. So for example for C/C++ I use OpenMP. I wonder if there's something like it for Forth.

I noticed on the wikipedia page on threaded code (https://en.wikipedia.org/wiki/Threaded_code#Threading_models) that Forth does support the notion of threads, the exact flavor apparently depending on compiler X vs compiler Y. So there is some hope, but I could not find a Forth equivalent of "The OpenMP API specification for parallel programming". Having to explicitly code all the synchronization & blockers and whatnot is somewhat ... inefficient in terms of developer hours spent.
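(For illustration, a minimal sketch of the kind of OpenMP use meant here: a parallel prime count by trial division rather than a true sieve, since a classic sieve parallelizes less trivially. A made-up example, not anyone's actual benchmark; build with gcc -O2 -fopenmp.)

Code: [Select]
#include <stdio.h>

static int is_prime(long n)
{
    if (n < 2) return 0;
    for (long d = 2; d * d <= n; d++)
        if (n % d == 0) return 0;
    return 1;
}

int main(void)
{
    const long limit = 1000000;
    long count = 0;

    /* OpenMP farms the iterations out to a thread pool; the reduction
       clause merges the per-thread partial counts without explicit locks. */
    #pragma omp parallel for reduction(+:count) schedule(dynamic, 4096)
    for (long n = 2; n <= limit; n++)
        count += is_prime(n);

    printf("%ld primes <= %ld\n", count, limit);
    return 0;
}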


Title: Re: Too many programming languages?
Post by: SiliconWizard on September 28, 2019, 08:25:49 pm
Oh, many have said similar things about C++. A couple of famous ones:

http://harmful.cat-v.org/software/c++/linus (http://harmful.cat-v.org/software/c++/linus)

"C++ is a horrible language. It's made more horrible by the fact that a lot
of substandard programmers use it, to the point where it's much much
easier to generate total and utter crap with it. Quite frankly, even if
the choice of C were to do *nothing* but keep the C++ programmers out,
that in itself would be a huge reason to use C."

http://tex.loria.fr/litte/knuth-interview (http://tex.loria.fr/litte/knuth-interview)

"The problem that I have with them today is that... C++ is too
complicated. At the moment, it's impossible for me to write portable
code that I believe would work on lots of different systems, unless I
avoid all exotic features. Whenever the C++ language designers had two
competing ideas as to how they should solve some problem, they said
"OK, we'll do them both". So the language is too baroque for my taste."

(It has become much worse compared to when he said that.)

Bjarne Stroustrup is actually a smart guy.
Maybe too smart for us mere mortals.  ::)
Title: Re: Too many programming languages?
Post by: techman-001 on September 28, 2019, 09:32:55 pm
C++ is an abortionate mess where knowledgeable people spend a lot of time arguing over what ought to happen.

Hee, hee, hee! We share the same opinion of C++.

I like it! That one is going in the QUOTES file. ;D

And that is from someone who uses said abortionate mess on a voluntary basis. I'm sorry, but OpenMP is just too convenient. ;)

Oh oh oh, and since I vaguely hear some spirited Forth debate in the background ... Question for those more in the know in the land of Forth: Is there a framework along the lines of OpenMP for Forth? Reason I ask is I'd like to do a comparison of several prime sieve implementations written in different languages. And among the permutations are, yes, a few multi-threaded ones. So for example for C/C++ I use OpenMP. I wonder if there's something like it for Forth.

I noticed on the wikipedia page on threaded code (https://en.wikipedia.org/wiki/Threaded_code#Threading_models) that Forth does support the notion of threads, the exact flavor apparently depending on compiler X vs compiler Y. So there is some hope, but I could not find a Forth equivalent of "The OpenMP API specification for parallel programming". Having to explicitly code all the synchronization & blockers and whatnot is somewhat ... inefficient in terms of developer hours spent.

"Is there a framework along the lines of OpenMP for Forth?"  I very much doubt it.

The only multiprocessor parallel-processing Forth I know of is the GreenArrays 144-processor chip, the GA144. Development of this system started in 2009. The GA144 not only has 144 Forth processors, it is also fully clockless.

http://www.greenarraychips.com/index.html (http://www.greenarraychips.com/index.html)

There are some very slick videos on the GA144 by Daniel Kalny:
Forth Day 2016 presentation on Digital Image Processing Implemented in GA144: https://www.youtube.com/watch?v=iwM0qfQqmdE&t=1323s (https://www.youtube.com/watch?v=iwM0qfQqmdE&t=1323s)
Title: Re: Too many programming languages?
Post by: Kjelt on September 29, 2019, 03:18:12 pm
The OO way of thinking / designing is great for many IT projects but less so for small embedded projects IMO.
The good parts can also be done in C.
Each language has its benefits and drawbacks, and domains where it can be applied successfully or not.
Clean programming has nothing to do with the language itself but everything to do with the programmer and where and how he learned to program. I have reviewed MLOCs of code in my jobs and had many discussions with many programmers, and it always came down to where and how they learned to write code.
Title: Re: Too many programming languages?
Post by: bjdhjy888 on September 30, 2019, 09:22:16 pm
Why do they often put pictures of various animals on the cover of programming books? e.g.:
Title: Re: Too many programming languages?
Post by: SiliconWizard on September 30, 2019, 09:30:02 pm
Why do they often put pictures of various animals on the cover of programming books? e.g.:

Note that the Java book has a much more "aggressive/wild" cover, while the C# one looks completely innocuous. Go figure... ;D
Title: Re: Too many programming languages?
Post by: rstofer on September 30, 2019, 11:13:44 pm
Why do they often put pictures of various animals on the cover of programming books? e.g.:

Note that the Java book has a much more "aggressive/wild" cover, while the C# one looks completely innocuous. Go figure... ;D

O'Reilly started putting animals on the cover of their books a very long time ago.  It's just the way you can tell an O'Reilly book from everything else.

https://www.oreilly.com/ideas/a-short-history-of-the-oreilly-animals (https://www.oreilly.com/ideas/a-short-history-of-the-oreilly-animals)
Title: Re: Too many programming languages?
Post by: CatalinaWOW on September 30, 2019, 11:19:21 pm
Why do they often put pictures of various animals on the cover of programming books? e.g.:

Note that the Java book has a much more "aggressive/wild" cover, while the C# one looks completely innocuous. Go figure... ;D

O'Reilly started putting animals on the cover of their books a very long time ago.  It's just the way you can tell an O'Reilly book from everything else.

https://www.oreilly.com/ideas/a-short-history-of-the-oreilly-animals (https://www.oreilly.com/ideas/a-short-history-of-the-oreilly-animals)

Great story on how the animals came to be there.  I am sure that the fact that copyrights were expired or non-existent on these images was not a factor at all.   ;)
Title: Re: Too many programming languages?
Post by: nigelwright7557 on October 01, 2019, 02:12:53 am
It's horses for courses.
I wouldn't use a C compiler for code in a tiny 8-pin PIC micro.
And conversely I wouldn't use assembler for a 32-bit micro.

I first learned assembler, then C, then Delphi Pascal, then C#, then most of the web programming languages.

These days things have moved on a lot with MPLAB Harmony for the larger PIC micros.


Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on October 01, 2019, 04:24:51 am
The OO way of thinking / designing is great for many IT projects but less for small embedded projects IMO.

It depends on your definition of small and embedded. Right now in front of me there's a bunch of esp32s, and I've been writing a lot of C++ object::object()s for them lately. By the way, the esp32 rocks!
Title: Re: Too many programming languages?
Post by: Berni on October 01, 2019, 05:51:36 am
It's not that the OO stuff in C++ is bad.

It's more that people often don't use it right. They end up writing too many objects for every single little thing, stuffing them full of advanced C++ functionality, because why wouldn't you if you have it, and that's how you show off being a better programmer than everyone else. Then, as the project goes on, these objects get linked up to everything else more and more: they later figure out they need two objects to do something with each other, but there are like 4 other objects in the chain between them, so they say fuck it and call something directly in that object. In the end the whole thing becomes a huge tangled mess of objects that are no longer self-contained units at all, since trying to move one object into another project pulls half of the code of this program behind it in the form of dependencies.

This happens especially easily to people who learn C++ as their first language and are taught to embrace this OO idea as much as possible.

I like to look at C++ as more of a "syntax sugar for C". You can do most of the stuff in C that C++ lets you do, but in C the code to do it can sometimes be long, messy and annoying. So instead of clinging to the object-oriented ideology, just pretend you are writing C code, but when objects look like something that could simplify your code, use them.

You can write horrible code in any language; it's just that C++ makes it easier to do so compared to C.

Performance-wise, C++ can be just as fast if you know what you are doing and don't overuse its fancy features.
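(To make the "objects in plain C" point concrete, here is a minimal sketch: a struct carrying its state plus a function pointer for dispatch. The names are made up for illustration, and this is just one of several common patterns.)

Code: [Select]
#include <stdio.h>

/* A "base class": a function pointer acting as a one-slot vtable. */
typedef struct shape {
    double (*area)(const struct shape *self);
} shape;

/* A "derived class": embeds the base as its first member. */
typedef struct {
    shape  base;
    double w, h;
} rect;

static double rect_area(const shape *self)
{
    const rect *r = (const rect *)self;  /* valid: base is the first member */
    return r->w * r->h;
}

int main(void)
{
    rect r = { { rect_area }, 3.0, 4.0 };
    shape *s = &r.base;
    printf("area = %g\n", s->area(s));   /* dynamic dispatch, C style */
    return 0;
}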
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on October 01, 2019, 06:30:06 am
But what's the meaning of small today? Because a small (it's really tiny!) esp32 is orders of magnitude more powerful than a desktop^W "workstation" PC of the 90's, but it's the size of a stamp. Therefore C++ makes sense, because this is by no means the 8051, PIC or 328p µC of bygone days.

https://en.wikipedia.org/wiki/Macintosh_Quadra_950
Title: Re: Too many programming languages?
Post by: Kjelt on October 01, 2019, 07:19:06 am
If you are indeed using something like sockets with many instances, OO can become interesting; valid point.

What is small and what is not is a topic of its own. I have done embedded programming on a machine that incorporated 3 Sun workstations, each with 64GB of RAM and 2TB of storage, and I did not agree that this was an embedded software job at all  ;) Still, the manufacturer insisted on calling it embedded software, since it was running inside a machine (the size of an office space).  :o

I also worked with an 8-bit uC with 128 bytes of RAM and 2kB of ROM; now that I would not even call small but minuscule  :)

So yes, the definition has become blurry over time. My old definition, that anything with a display and GUI is not embedded, was valid till 2005 or so, when small LCD displays became attached to dedicated uCs. So I kinda extended my personal definition of embedded to anything that has no GPU card; very disputable I know, but I can't think of a better definition, perhaps someone else can.
Title: Re: Too many programming languages?
Post by: Berni on October 01, 2019, 08:10:24 am
Yeah, the line between what's embedded software and what's just software can be quite fuzzy.

I suppose you can call embedded software the stuff that runs on things the general public doesn't consider a 'computer'. But then again, where do smartphones belong? You could definitely call the software running on an old Nokia 3310 embedded software, but the YouTube app on a smartphone doesn't seem like it would count as that.

I guess one reason people sometimes cringe at OO on microcontrollers is that in a lot of cases the resource usage tends to be hard to determine at compile time. But you can write such software in C too, as soon as you start using malloc, and there it's even easier to hog resources if you forget to release memory. In my opinion there is nothing wrong with C++ on things with tens of KB of RAM, as long as it's good C++ code and not a mess of advanced C++ features shoehorned in for the sake of using them.
Title: Re: Too many programming languages?
Post by: Siwastaja on October 01, 2019, 09:13:03 am
C++ is still very problematic in libraries. (Although I do see why some see it appealing in application code.)

I just went through all the issues reported by people wanting to use my small piece of C++ code, and issues seen by myself in setting up a development environment for a project using two C++ libraries (SFML and TGUI, particularly).

For a C library, it typically goes: include headers and link against a pre-compiled library object file (or copy a shared object file somewhere where it's found by the loader).

But C++ still has no standardized or stable exception handling, name-mangling rules, std::string ABI, etc., and I'm sure the list goes on. It depends not only on the system, but also on the actual compiler version and which flags were used during compilation. And it's not about two or three options; the combinations are countless.

For SFML, for example, just downloading it pre-compiled is quite iffy: https://www.sfml-dev.org/download/sfml/2.5.1/ (https://www.sfml-dev.org/download/sfml/2.5.1/)
As they say, "The compiler versions have to match 100%!" and "It's recommended to use the SFML version from your package manager (if recent enough) or build from source to prevent incompatibilities." Yes, they give precompiled versions, but I can confirm their warnings about the problems are very real.


So it was a 20-hour job for me to get everything to "play together" in a combined Linux + Windows cross-compilation environment. Earlier, it either worked out by luck (some people reported success installing SFML and compiling our code; some reported they just couldn't do it with their skills and the documentation in existence), or else you had to compile all the libraries from scratch, ensuring ABI compatibility. I did the latter, but this basically prevents "novices" (however good at understanding programming and writing code) from getting into the project; I just need to provide binaries. Which is, of course, what most people want anyway, but if you think about the open-source community, then the only people who will jump in are some C++ "experts" who don't mind all the colossal hassle of being able to use C++.

I have never seen such colossal pain in any project using C libraries. The difference is about two orders of magnitude.
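(To make the contrast concrete, the whole C-side workflow can be as small as this; "foo" and foo_frobnicate() are made-up names, not a real library.)

Code: [Select]
/* main.c -- consuming a hypothetical precompiled C library "libfoo" */
#include <foo.h>                /* the header declares the stable C ABI */

int main(void)
{
    return foo_frobnicate(42);  /* symbol resolved from libfoo at link time */
}

/* Build against the precompiled library:
       cc main.c -lfoo -o main
   No compiler-version matching needed: the platform's C ABI is stable. */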
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 01, 2019, 06:37:59 pm
There are myriad ways of using OO as well... and various languages let you use the OO paradigm, including, actually, C!
The typical C++ way is certainly not the only one.

Beyond what you mentioned about interfacing libraries, one very common problem with how specific "features" are implemented as classes is the whole logic chosen by the developers, and all the dependencies. A given class may depend on a forest of other classes, which can make the whole thing a complete mess to maintain. Inheritance is also very easy to misuse. You may design your whole architecture around a pile of classes that may need to be broken up entirely if at some point you realize it has become way too rigid to be extended in any useful way.

One thing to note, and that Wirth understood a long time ago, is that a very useful construct for distributing software features is the module. C++ (like C, and many, even most, of the other popular languages) still doesn't have anything for modules. D introduced them, but D is still not much more than a curiosity at the moment; I wouldn't even call it a niche language, it's smaller than niche. If I'm not mistaken, modules have been discussed for a future C++ standard. Maybe it'll happen. Someday. In C++100? (2100? ;D )

Libraries in C++ (and in C of course) are just a joke. There's no such thing as a library actually. They are just piles of definitions and functions/methods stacked together. Modules are definitely a needed addition IMO to better standardize and structure all this.

I agree it's still usually easier to deal with C, but C also lacks modules. Handling libraries still just looks like messing around, not like serious work. ::)
Title: Re: Too many programming languages?
Post by: Berni on October 02, 2019, 05:27:42 am
Well, on Linux the package managers have sort of taken on the role of getting you the C libraries needed to compile something. It either works on the first go, or something goes wrong and you are in for hours of dependency-solving 'fun' before finally giving up on it.

One of the languages that has this down nicely is Python. Just go "pip install foo" and it will grab the correct version of foo for your version of Python and OS and install any other dependencies it needs automatically. Then just put an "import foo" at the top and that's it. But on the other hand, Python is awful in how dependent it is on its installation environment: a collection of installed modules might work for one program but not for another, because the latter is old and needs this very specific version of "foo 1.3.44" to work. So they just tacked on virtual environments, which basically fudge the paths to point to a different, separate installation of Python where you install the special library needed for that one program.

C# has its NuGet thing, basically an automated GitHub downloader under the hood; it worked the few times I used it, but I have no idea how well it works long term.
Title: Re: Too many programming languages?
Post by: legacy on October 02, 2019, 12:22:54 pm
Prolog, anyone?
If so, which Prolog compiler/interpreter?
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 02, 2019, 01:27:55 pm
Prolog, anyone?
If so, which Prolog compiler/interpreter?

Nope, sorry. I think the last time I heard about Prolog was in the early nineties... (and after that, just in occasional articles about programming languages.)  ;D
Title: Re: Too many programming languages?
Post by: legacy on October 02, 2019, 03:19:17 pm
Turbo Prolog? Yup, it was copyrighted by Borland in 1992-1994.  I have a copy running under a 486 guest card installed in a RiscPC/600 and handled by RiscOS v4.39. I had to install Microsoft DOS v5.0 (1) for that; v6.22 has problems with the Gemini BIOS regarding CON/IO (console I/O, BIOS stuff).

... a lot of trouble, but now it works, and it works better this way than what you can get running in a DosBox sandbox under Linux, due to the stupid CON/IO library required to support Borland's IDE.

This sucks, and under DosBox it sucks even more because you are not able to move between menu items  :palm: :palm: :palm:

Anyway, according to what the AADL guys say, it seems someone in the QA department does program in Prolog, therefore there must be modern versions of Prolog (but I don't know... which one?), because even the tool "Stood" is written in Prolog & C++ and is largely used in avionics to support the HOOD and HRT-HOOD methodologies.

Code: [Select]
dev-lang/gnuprologjava
      Homepage:      http://www.gnu.org/software/gnuprologjava
      Description:   GNU Prolog for Java is an implementation of ISO Prolog as a Java library
      License:       LGPL-3+

dev-lang/gprolog
      Homepage:      http://www.gprolog.org/
      Description:   A native Prolog compiler with constraint solving over finite domains (FD)
      License:       GPL-2 LGPL-3

dev-lang/interprolog
      Homepage:      http://www.declarativa.com/interprolog/
      Description:   InterProlog is a Java front-end and enhancement for Prolog
      License:       LGPL-2

dev-lang/qu-prolog
      Homepage:      http://www.itee.uq.edu.au/~pjr/HomePages/QuPrologHome.html
      Description:   Extended Prolog supporting quantifiers, object-variables and substitutions
      License:       Apache-2.0 GPL-2+

dev-lang/swi-prolog
      Homepage:      http://www.swi-prolog.org/
      Description:   versatile implementation of the Prolog programming language
      License:       BSD-2

dev-lang/tuprolog
      Homepage:      http://tuprolog.unibo.it/
      Description:   tuProlog is a light-weight Prolog for Internet applications and infrastructures
      License:       LGPL-3

I am going to try each of them  :D


p.s.
for Windows, there is a personal edition (free):
Code: [Select]
Visual Prolog Personal Edition
Version 5.2 Release Candidate 2



(1) cannot find a copy of PC-DOS by IBM
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 02, 2019, 03:53:16 pm
Turbo Prolog? Yup, it was copyrighted by Borland in 1992-1994.

Ahah, yep. We had it at uni, and I remember I fiddled with it for a while. I also remember the ads for it back then in computer magazines. That was funny.
Title: Re: Too many programming languages?
Post by: techman-001 on October 02, 2019, 06:57:47 pm
There is also https://sourceforge.net/projects/yap/

The Yap Prolog System is an ISO-compatible high-performance Prolog compiler. Yap is widely considered one of the fastest available Prolog systems. Yap supports coroutining, CLP(QR), CHR, and depth-bound search. Tabling and parallelism are in development.

Last updated 2 years ago.
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 02, 2019, 07:57:02 pm
The Yap Prolog System is an ISO-compatible high-performance Prolog compiler. Yap is widely considered one of the fastest available Prolog systems. Yap supports coroutining, CLP(QR), CHR, and depth-bound search. Tabling and parallelism are in development.

Yet another ancient language on life-support... Fine if you want to relive the 1980s, but no one in their right mind would use this for production work.
Title: Re: Too many programming languages?
Post by: Siwastaja on October 02, 2019, 08:44:11 pm
It's called a "hobby". Nothing wrong with that.
Title: Re: Too many programming languages?
Post by: techman-001 on October 02, 2019, 08:46:03 pm
The Yap Prolog System is an ISO-compatible high-performance Prolog compiler. Yap is widely considered one of the fastest available Prolog systems. Yap supports coroutining, CLP(QR), CHR, and depth-bound search. Tabling and parallelism are in development.

Yet another ancient language on life-support... Fine if you want to relive the 1980s, but no one in their right mind would use this for production work.

Yeah, it's a bit modern; I usually only use programming languages from the early '70s myself.

Forth-love? if honk then

One of my favorite languages goes all the way back to the year 1500, but it's still the most widespread language in use today  :)
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 02, 2019, 08:53:37 pm
(https://external-preview.redd.it/T600jxT-WVSoRvoj5gzOzLl_JGG1U_xR0tC1wIBoFfk.jpg?auto=webp&s=97c1b7396d21af70ace62663337c767681b174cb)
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 02, 2019, 08:58:29 pm
Ahah!

Although, I'm not sure Java fans really have grounds to make fun of Ruby users... :P
Title: Re: Too many programming languages?
Post by: legacy on October 02, 2019, 09:53:01 pm
Yeah it's a bit modern

Modern AI has recently split into two branches: the first uses Lisp, the second uses TensorFlow.
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 02, 2019, 11:02:11 pm
Yeah it's a bit modern

Modern AI has recently split into two branches: the first uses Lisp, the second uses TensorFlow.

Really?

Time to be afraid, if so. ;D
Title: Re: Too many programming languages?
Post by: legacy on October 03, 2019, 12:46:01 am
Really?
Time to be afraid, if so. ;D

Yup. I sold a couple of X11-terminals and other "weird" equipment to a couple of dudes who work for Google Switzerland, and they told me a bit about the TensorFlow project, released in 2017 as *stable*. Now they also have TensorFlow hardware, made by Google itself. It's an ASIC chip full of matrix accelerators.

You cannot do any serious AI without it, and without very large and super-fast storage. Hence, since Google is not interested in AI per se, but rather in classifying objects for their products and services, Skynet is not even near our future.

We don't have to worry. At least, not for now.
Title: Re: Too many programming languages?
Post by: legacy on October 03, 2019, 12:51:17 am
There is also https://sourceforge.net/projects/yap/

Thanks! Added to the list! :D
Title: Re: Too many programming languages?
Post by: legacy on October 03, 2019, 12:56:30 am
Code: [Select]
2019-09-29--12-37-59---2019-09-29--12-38-43 - [ dev-lang/lua ] - success - root@dev2.30/8.3.1
2019-09-29--12-40-56---2019-09-29--13-03-20 - [ dev-lang/ruby ] - success - root@dev2.30/8.3.1

These two have been emerged on Linux/HPPA and on Linux/PowerPC!
I have never played with ruby, but I have a couple of applications (not written by me) that I'd like to try.

Just to try, not to study. I am already full of new stuff until the end of 2021, with Erlang and Prolog  ;D
Title: Re: Too many programming languages?
Post by: techman-001 on October 03, 2019, 02:16:00 am
Just to try, not to study. I am already full of new stuff until the end of 2021, with Erlang and Prolog  ;D

I'm in awe; it would take me two decades to learn Erlang and Prolog  :wtf:
Title: Re: Too many programming languages?
Post by: tggzzz on October 03, 2019, 06:05:49 am
Just to try, not to study. I am already full of new stuff until the end of 2021, with Erlang and Prolog  ;D

Prolog is worth knowing for the concepts. You might use the concepts, but you are unlikely to use Prolog itself. If you try to use it, choose an application where the "closed world assumption" isn't a problem.

Erlang is also worth knowing for the concepts (Actors+Prolog :) ), and it is commercially important. You might well come across some of them in other environments.
Title: Re: Too many programming languages?
Post by: legacy on October 03, 2019, 08:45:36 am
Besides being worth knowing for the concepts:

  • Prolog: "Stood" is written in C++ and Prolog, and it's used in avionics and military avionics; tools for the HOOD and HRT-HOOD methodologies are also written in Prolog. And besides avionics, it's even in automotive FOM (Formula One Management), e.g. Red Bull uses Prolog-based software to tune their race strategy
  • Erlang: is used by Facebook and Google

These examples are from my direct experience, even if observed indirectly in the field  :-//

I am a mere C and Ada95 programmer ... even so, I am going to spend two or three months around the Arctic, during winter when it's coldest, to get a "free" course on Erlang in a practical environment.

They call it "learn by doing", and it's supposed to be a "cool" experience (just, I wonder, does "cool" mean "cold"? ... oh well)
Title: Re: Too many programming languages?
Post by: tggzzz on October 03, 2019, 09:17:11 am
Besides being worth knowing for the concepts:

  • Prolog: "Stood" is written in C++ and Prolog, and it's used in avionics and military avionics for Hood and C-hood metodology are also written in Prolog. And, besides avionics, even in automotive FOM (formula one managment), e.g. RedBull uses Prolog-baded software to tune their race strategy
  • Erlang: is used by Facebook and Google

... Prolog: interesting. My direct experience of Prolog ended three decades ago, and I only observed (interesting) academic demonstrations of principle. It is fun to see it being used in anger.
... Erlang: unsurprisingly telecoms is a major user. ISTR hearing about Amazon/IMDB using it, but I'm not sure.

I'd always wanted a reason to use Erlang, but never came across one that was sufficiently compelling.

Quote
These examples are from my direct experience, even if observed indirectly in the field  :-//

I am a mere C and Ada95 programmer ... even so, I am going to spend two or three months around the Arctic, during winter when it's coldest, to get a "free" course on Erlang in a practical environment.

They call it "learn by doing", and it's supposed to be a "cool" experience (just, I wonder, does "cool" mean "cold"? ... oh well)

I hope the connectivity is sufficient to allow you to download tutorials etc.

Have fun; the people I've known that have worked in the Antarctic have beards and a slightly different view of the world :)
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 03, 2019, 03:49:40 pm
You cannot do any serious AI without it, and without very large and super-fast storage. Hence, since Google is not interested in AI per se, but rather in classifying objects for their products and services, Skynet is not even near our future.

We don't have to worry. At least, not for now.

Oh, I think we do. ;D

But are you saying you can't do any serious AI work without TensorFlow, or did I misinterpret it? Because if so, ahem... I think you're going a bit overboard.
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 03, 2019, 03:53:25 pm
... Prolog: interesting. My direct experience of Prolog ended three decades ago, and I only observed (interesting) academic demonstrations of principle. It is fun to see it being used in anger.
... Erlang: unsurprisingly telecoms is a major user. ISTR hearing about Amazon/IMDB using it, but I'm not sure.

Agree, and similar experience (although for Prolog, a little less than three decades ago; I guess I'm probably just a little younger.)

Both have interesting concepts and are worth "studying", but I would never use either for any industrial project (unless of course it was a hard requirement). It's not like it's ever likely to happen for Prolog anyway. As for Erlang, it could, but since telecoms is not my area, it's extremely unlikely as well.
Title: Re: Too many programming languages?
Post by: techman-001 on October 04, 2019, 04:04:25 am
Besides being worth knowing for the concepts:

  • Prolog: "Stood" is written in C++ and Prolog, and it's used in avionics and military avionics for Hood and C-hood metodology are also written in Prolog. And, besides avionics, even in automotive FOM (formula one managment), e.g. RedBull uses Prolog-baded software to tune their race strategy
  • Erlang: is used by Facebook and Google

These three examples are my direct experiences, even if indirectly observed on the field  :-//

I am a simple mere C and Ada95 programmer ... even if I am going to spend two or three months around the Arctic pole, during winter when it's cold for the most, to get a "free" course about Erlang on a practical environment.

They call it "learn by doing", and it's supposed to be a "cool" experience (just, I wonder, does "cool" mean "cold"? ... oh well)

Here is a Prolog link you may find interesting, apologies if you have seen it already.

https://en.wikipedia.org/wiki/Fifth_generation_computer

" ... Prof. Ehud Shapiro invented Concurrent Prolog, a novel concurrent programming language that integrated logic programming and concurrent programming. Concurrent Prolog is a logic programming language designed for concurrent programming and parallel execution. It is a process oriented language, which embodies dataflow synchronization and guarded-command indeterminacy as its basic control mechanisms. Shapiro described the language in a Report marked as ICOT Technical Report 003,[3] which presented a Concurrent Prolog interpreter written in Prolog. Shapiro's work on Concurrent Prolog inspired a change in the direction of the FGCS from focusing on parallel implementation of Prolog to the focus on concurrent logic programming as the software foundation for the project. It also inspired the concurrent logic programming language Guarded Horn Clauses (GHC) by Ueda, which was the basis of KL1, the programming language that was finally designed and implemented by the FGCS project as its core programming language.
Implementation

So ingrained was the belief that parallel computing was the future of all performance gains that the Fifth-Generation project generated a great deal of apprehension in the computer field. After having seen the Japanese take over the consumer electronics field during the 1970s and apparently doing the same in the automotive world during the 1980s, the Japanese in the 1980s had a reputation for invincibility. Soon parallel projects were set up in the US as the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC), in the UK as Alvey, and in Europe as the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer‐Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

Five running Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, PIM/c. The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as applications to bioinformatics. ... "
Title: Re: Too many programming languages?
Post by: bjdhjy888 on October 04, 2019, 04:58:44 am
Given that my C# book has 1000+ pages, and so do my other programming books on Java and C++, when will I be able to master all the features of these programming languages?!

Reading them all would cost me years, not to mention using them the way I want.....

Should I focus on C only?!?

 :-[
Title: Re: Too many programming languages?
Post by: Berni on October 04, 2019, 05:12:08 am
For embedded programming on MCUs I'd say C is all you ever need, along with a tiny bit of assembler so that you know how your architecture works.

Not everything is about learning the language. Just learning how a particular MCU family works can take quite a bit of time. Some of the modern ones have gotten pretty advanced: they have various memory regions, so one piece of RAM might be slower than another; they have caches; they may have an MMU (Memory Management Unit); the DMA might only be able to work in certain RAM regions; perhaps there is external RAM connected to the MCU and you need to take care of initializing it, etc. It's all still less complex than trying to run bare-metal C code on something like a big multicore MPU, but it's going in that direction.
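(A sketch of the kind of memory-region care meant above, assuming a GCC toolchain and a linker script that provides a ".dma_ram" output section in DMA-reachable RAM. The section name and dma_start() are illustrative, not from any particular vendor SDK.)

Code: [Select]
#include <stddef.h>
#include <stdint.h>

/* Force this buffer into the RAM bank the DMA engine can actually reach. */
__attribute__((section(".dma_ram"), aligned(4)))
static uint8_t dma_buf[512];

/* Hypothetical driver call that hands the buffer to the DMA peripheral. */
void dma_start(const volatile void *addr, size_t len);

void send_block(void)
{
    dma_start(dma_buf, sizeof dma_buf);
}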
Title: Re: Too many programming languages?
Post by: brucehoult on October 04, 2019, 06:36:46 am
Given that my C# book has 1000+ pages, and so do my other programming books on Java and C++, when will I be able to master all the features of these programming languages?!

Reading them all would cost me years, not to mention using them the way I want.....

Should I focus on C only?!?

 :-[

C is certainly a lot simpler and easier to master, though even its official specification has gotten very long by now.

If you're a beginner to programming then Scheme is far simpler than C and is great for learning basic programming through to very advanced algorithms. In particular, the syntax of Scheme is trivial, and the semantics (meaning) are very easy to describe, learn, and reason about too. The "Racket" IDE and documentation is a great place to start. Once you know Scheme and programming concepts, it's easy to move to C. Probably easier than learning C directly.

One of the most important things about programming in C is understanding the underlying machine model.

What better way to learn the machine model than to program the machine in assembly language? Some assembly languages can be described in just a couple of pages. ARMv1 (https://en.wikichip.org/wiki/arm/armv1) for example (which still works on modern ARMs, except for the condition codes hidden in the upper bits of the PC), or the base RISC-V ISA.
Title: Re: Too many programming languages?
Post by: Kjelt on October 04, 2019, 06:48:21 am
Read the book's chapters on the main semantics, then start reading code, preferably at your company so it has some quality. Then write your own code step by step.
It really is like learning to ride a bicycle, driving, or learning a real spoken language: you can't do it by just reading books.
Title: Re: Too many programming languages?
Post by: tggzzz on October 04, 2019, 07:00:20 am
So ingrained was the belief that parallel computing was the future of all performance gains that the Fifth-Generation project generated a great deal of apprehension in the computer field. After having seen the Japanese take over the consumer electronics field during the 1970s and apparently doing the same in the automotive world during the 1980s, the Japanese in the 1980s had a reputation for invincibility. Soon parallel projects were set up in the US as the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC), in the UK as Alvey, and in Europe as the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer‐Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

I was there at the time.

The main reasons that "failed" were that cores/memory were expensive and big, and that semiconductor processes were advancing very fast (wait a few years and your SISD machine was 10x faster without you doing anything).

Neither of those conditions holds any more. Semiconductor processing has reached a speed barrier dictated by thermodynamics (processors' heat flux is higher than in a nuclear reactor core) and statistics (atoms/electrons in a junction). Cores are cheap. You can get a 32-core 4000 MIPS MCU for the cost of a 6800 in the late '70s.

Changed constraints imply changed innovations. Parallelism is now the future.
Title: Re: Too many programming languages?
Post by: tggzzz on October 04, 2019, 07:01:51 am
Read the book's chapters on the main semantics, then start reading code, preferably at your company so it has some quality.

The semantics of a language is indeed a key point to understand; syntax is comparatively simple.
Title: Re: Too many programming languages?
Post by: bjdhjy888 on October 04, 2019, 12:27:45 pm
I guess I'm crazy.

It's not that I won't learn a programming language, it's just that I want to learn them all!

Assembly, C, C++, C#, Java, Python, ASP.NET MVC, JavaScript, MySQL, SQL Server, LINQ, Lua, Verilog, VHDL, even things like Oracle, .NET,  ETC ETC!!!!!!!!!!!!!!!!!!!!!!!!!!!!  :scared: :scared: :scared: :scared:

I WANT TO LEARN THEM ALL, AS THEY ARE ALL COOL AND EMPLoYERS WANT ME TO HAVE THEM! BUT I JUST DON'T HAVE THE TIME!
 :scared: :scared: :scared: :scared: :scared: :scared: :scared: :scared: :horse: :horse: :horse: :horse:
Title: Re: Too many programming languages?
Post by: Kjelt on October 04, 2019, 01:37:53 pm
The HR departments of companies are crazy: they ask all their IT guys what knowledge a new candidate should have, then put it all together in an ad.
They kind of ask for a 22-year-old MSc with ten years of job experience knowing everything  :)
Title: Re: Too many programming languages?
Post by: SparkyFX on October 04, 2019, 02:54:14 pm
I guess I'm crazy.

It's not that I won't learn a programming language, it's just that I want to learn them all!
You don't have to; all you need is to understand how the machine that executes the code operates, and to abstract from there. Languages are only abstractions of it.
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 04, 2019, 03:13:00 pm
If you're a beginner to programming then Scheme is far simpler than C and is great for learning basic programming through to very advanced algorithms. In particular, the syntax of Scheme is trivial, and the semantics (meaning) are very easy to describe, learn, and reason about too.

Just make sure you use an editor that does parentheses matching, otherwise you'll drive yourself nuts.

Here's the definitive intro to Scheme that's used at MIT: https://web.mit.edu/alexmv/6.037/sicp.pdf
Title: Re: Too many programming languages?
Post by: tggzzz on October 04, 2019, 03:31:35 pm
I guess I'm crazy.

It's not that I won't learn a programming language, it's just that I want to learn them all!

Assembly, C, C++, C#, Java, Python, ASP.NET MVC, JavaScript, MySQL, SQL Server, LINQ, Lua, Verilog, VHDL, even things like Oracle, .NET,  ETC ETC!!!!!!!!!!!!!!!!!!!!!!!!!!!!  :scared: :scared: :scared: :scared:

I WANT TO LEARN THEM ALL, AS THEY ARE ALL COOL AND EMPLoYERS WANT ME TO HAVE THEM! BUT I JUST DON'T HAVE THE TIME!
 :scared: :scared: :scared: :scared: :scared: :scared: :scared: :scared: :horse: :horse: :horse: :horse:

Adult life (unlike a child's life) is about making choices that will limit the directions you can go in in the future. Deal with it.

The skills are to not make a choice until necessary, and not to unnecessarily preclude options that you want to preserve.

Bear in mind that you will have to change during your career, unless you are highly specialised, your specialism continues to be valuable, and you are content with one year of experience repeated ten times :)
Title: Re: Too many programming languages?
Post by: paul_g_787 on October 04, 2019, 04:15:25 pm
Sorry everybody. Couldn't resist this one  ;) Enjoy!
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 04, 2019, 05:06:54 pm
If you're a beginner to programming then Scheme is far simpler than C and is great for learning basic programming through to very advanced algorithms. In particular, the syntax of Scheme is trivial, and the semantics (meaning) are very easy to describe, learn, and reason about too.

Just make sure you use an editor that does parentheses matching, otherwise you'll drive yourself nuts.

Ahah, indeed.

As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

Title: Re: Too many programming languages?
Post by: tggzzz on October 04, 2019, 06:02:30 pm
If you're a beginner to programming then Scheme is far simpler than C and is great for learning basic programming through to very advanced algorithms. In particular, the syntax of Scheme is trivial, and the semantics (meaning) are very easy to describe, learn, and reason about too.

Just make sure you use an editor that does parentheses matching, otherwise you'll drive yourself nuts.

Ahah, indeed.

As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

Try MIT and "Structure and Interpretation of Computer Programs".

"It was formerly used as the textbook for MIT's introductory course in electrical engineering and computer science." https://en.wikipedia.org/wiki/Structure_and_Interpretation_of_Computer_Programs

But then MIT isn't a trade school :)
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 04, 2019, 06:09:16 pm
As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

This is MIT we're talking about here, and they probably do start with quantum mechanics in their intro to physics classes. The whole point of that class is to teach people to think like a computer scientist, not to teach something applicable to the real world--that comes later.
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 04, 2019, 06:23:36 pm
As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

This is MIT we're talking about here, and they probably do start with quantum mechanics in their intro to physics classes. The whole point of that class is to teach people to think like a computer scientist, not to teach something applicable to the real world--that comes later.

Sure, but even so.
I kind of doubt Knuth or Wirth, for instance, would agree with using Scheme to *introduce* programming.
I happen not to either. And it has nothing to do with the fact it would be applicable to the real world or not. It's just about teaching, especially computer science.
Just my opinion (and not just mine IMO) here. If MIT thought it was a proper way of teaching programming and introducing computer science, good for them, and I'm not saying they are stupid in the least either. That just sounded like an argument from authority though, so I took the liberty of using one as well, mentioning Knuth and Wirth ;D

Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 04, 2019, 07:37:04 pm
At a company I used to work for we set up a lunch-time seminar where every week we'd go through one chapter of that MIT Scheme book. All of the attendees of the seminar were experienced software engineers, some with decades of experience. We worked all of the exercises individually and as a group. At the end, everyone thought it was a worthwhile experience and everyone said they learned something.
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 04, 2019, 07:42:50 pm
At a company I used to work for we set up a lunch-time seminar where every week we'd go through one chapter of that MIT Scheme book. All of the attendees of the seminar were experienced software engineers, some with decades of experience. We worked all of the exercises individually and as a group. At the end, everyone thought it was a worthwhile experience and everyone said they learned something.

I'm sure it was! But aren't you kind of confirming (at least, your example doesn't confirm the opposite) what I just said above? That it may be an interesting learning experience -  as long as you're *already* experienced (and IMHO, a lot less so if you're not, as we were talking about an introductory course, unless I misunderstood something...) Are MIT students taking an introductory course experienced software engineers? My point was just pedagogical, but you may, of course, have a different view on pedagogy.
Title: Re: Too many programming languages?
Post by: tggzzz on October 04, 2019, 08:16:58 pm
As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

This is MIT we're talking about here, and they probably do start with quantum mechanics in their intro to physics classes. The whole point of that class is to teach people to think like a computer scientist, not to teach something applicable to the real world--that comes later.

Oh, mutter.

Scheme is simple and orthogonal, without lots of strange history that enables experienced people to discuss what they think ought to happen. Yes, C, I'm looking at you.

Basically I want new programmers to concentrate on getting the big picture fundamentals in decent order. Horrible corner cases can come later - and often but not always will come later. "Which button to push" knowledge has a very short half life, and is a waste of time.
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 04, 2019, 09:20:24 pm
I think Scheme is great as both an introductory language and to teach old dogs new tricks. It's a simple language that lets you concentrate on the concepts rather than details of syntax.
Title: Re: Too many programming languages?
Post by: techman-001 on October 05, 2019, 01:38:54 am
As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

This is MIT we're talking about here, and they probably do start with quantum mechanics in their intro to physics classes. The whole point of that class is to teach people to think like a computer scientist, not to teach something applicable to the real world--that comes later.

Sure, but even so.
I kind of doubt Knuth or Wirth, for instance, would agree with using Scheme to *introduce* programming.
I happen not to either. And it has nothing to do with the fact it would be applicable to the real world or not. It's just about teaching, especially computer science.
Just my opinion (and not just mine IMO) here. If MIT thought it was a proper way of teaching programming and introducing computer science, good for them, and I'm not saying they are stupid in the least either. That just sounded like an argument from authority though, so I took the liberty of using one as well, mentioning Knuth and Wirth ;D

Brucehoult said "The "Racket" IDE and documentation is a great place to start. Once you know Scheme and programming concepts, it's easy to move to C. Probably easier than learning C directly." and I agree with him.

DrRacket is designed to teach computer programming via examples and comes with "teachpacks" and tons of documentation for the beginner. The emphasis is on learning computer programming not the language. https://download.racket-lang.org (https://download.racket-lang.org)

“Racket” is more of an idea about programming languages than a language in the usual sense.  https://docs.racket-lang.org/guide/more-hash-lang.html (https://docs.racket-lang.org/guide/more-hash-lang.html)

I'm improving my programming skills and learning Lisp/Scheme myself using DrRacket. It's fun, and some of the concepts are actually beginning to sink in; I have even lost my "bracket phobia".
Title: Re: Too many programming languages?
Post by: brucehoult on October 05, 2019, 04:21:41 am
If you're a beginner to programming then Scheme is far simpler than C and is great for learning basic programming through to very advanced algorithms. In particular, the syntax of Scheme is trivial, and the semantics (meaning) are very easy to describe, learn, and reason about too.

Just make sure you use an editor that does parentheses matching, otherwise you'll drive yourself nuts.

Here's the definitive intro to Scheme that's used at MIT: https://web.mit.edu/alexmv/6.037/sicp.pdf

If you need deeply nested parentheses then you're doing it wrong. Except in arithmetic expressions (which are seldom very complicated anyway), the nesting depth of ((( in Scheme is generally no deeper than the total nesting of (, [, and { in C.

SICP is a great book, but it's designed for people with very strong mathematical knowledge, and there are much better Scheme choices for average programmers.
Title: Re: Too many programming languages?
Post by: brucehoult on October 05, 2019, 04:24:35 am
At a company I used to work for we set up a lunch-time seminar where every week we'd go through one chapter of that MIT Scheme book. All of the attendees of the seminar were experienced software engineers, some with decades of experience. We worked all of the exercises individually and as a group. At the end, everyone thought it was a worthwhile experience and everyone said they learned something.

I'm sure it was! But aren't you kind of confirming (at least, your example doesn't confirm the opposite) what I just said above? That it may be an interesting learning experience -  as long as you're *already* experienced (and IMHO, a lot less so if you're not, as we were talking about an introductory course, unless I misunderstood something...) Are MIT students taking an introductory course experienced software engineers? My point was just pedagogical, but you may, of course, have a different view on pedagogy.

This is about the contents and style of SICP, not about any difficulty in Scheme.
Title: Re: Too many programming languages?
Post by: brucehoult on October 05, 2019, 04:26:16 am
As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

This is MIT we're talking about here, and they probably do start with quantum mechanics in their intro to physics classes. The whole point of that class is to teach people to think like a computer scientist, not to teach something applicable to the real world--that comes later.

Oh, mutter.

Scheme is simple and orthogonal, without lots of strange history that enables experienced people to discuss what they think ought to happen. Yes, C, I'm looking at you.

Basically I want new programmers to concentrate on getting the big picture fundamentals in decent order. Horrible corner cases can come later - and often but not always will come later. "Which button to push" knowledge has a very short half life, and is a waste of time.

Exactly.

Scheme doesn't have corner cases. Everything you think *should* work, does.

C is an utter minefield in comparison.
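(A concrete example of the sort of C corner case at issue, using nothing but standard C: signed integer overflow is undefined behavior, so an intuitive-looking overflow check can legally be optimized away.)

Code: [Select]
#include <limits.h>
#include <stdio.h>

/* Looks like a sane way to ask "will x + 1 wrap?", but for signed int the
   overflow itself is undefined behavior, so the compiler may assume it
   never happens and fold this whole test to 0. */
static int will_wrap(int x)
{
    return x + 1 < x;
}

int main(void)
{
    /* May print 0 or 1 depending on compiler and optimization level. */
    printf("%d\n", will_wrap(INT_MAX));
    return 0;
}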
Title: Re: Too many programming languages?
Post by: techman-001 on October 05, 2019, 10:34:29 am
As to introducing programming with Scheme? Really? Ahem.
To use a funny parallel, I'd say it would be a bit like introducing physics with quantum mechanics. ::)

This is MIT we're talking about here, and they probably do start with quantum mechanics in their intro to physics classes. The whole point of that class is to teach people to think like a computer scientist, not to teach something applicable to the real world--that comes later.

Oh, mutter.

Scheme is simple and orthogonal, without lots of strange history that enables experienced people to discuss what they think ought to happen. Yes, C, I'm looking at you.

Basically I want new programmers to concentrate on getting the big picture fundamentals in decent order. Horrible corner cases can come later - and often but not always will come later. "Which button to push" knowledge has a very short half life, and is a waste of time.

Exactly.

Scheme doesn't have corner cases. Everything you think *should* work, does.

C is an utter minefield in comparison.

Lisp even has a song :)

 https://www.youtube.com/watch?v=u-7qFAuFGao (https://www.youtube.com/watch?v=u-7qFAuFGao)
Title: Re: Too many programming languages?
Post by: westfw on October 05, 2019, 11:22:47 am
Bob Kanefsky writes (wrote?) Lisp code for NASA/JPL Mars missions.
IIRC, it's sorta planning/verification code that parses and checks command streams intended to be sent to the landers, run way ahead of time.
It's a great song.  I really like the line about assembler...

Title: Re: Too many programming languages?
Post by: SiliconWizard on October 05, 2019, 04:15:28 pm
C is an utter minefield in comparison.

Never said C would be an appropriate language to teach programming either. ;D

Oh well. Never tickle LISP users...
Title: Re: Too many programming languages?
Post by: techman-001 on October 06, 2019, 11:59:03 am
Bob Kanefsky writes (wrote?) Lisp code for NASA/JPL Mars missions.
IIRC, it's sorta planning/verification code that parses and checks command streams intended to be sent to the landers, run way ahead of time.
It's a great song.  I really like the line about assembler...

It's a great song and kinda catchy.

The bit about assembler is generally true but I have a few Forth tools that make writing/testing assembly much faster and a lot easier than traditional assembly methods.

What you're looking at here is in fact Forth code; it looks a lot like assembly, doesn't it? Forth can do that :)

      : ms ( u -- )           \ Blocking delay for use on STM32F0xx @ 8MHz RC Clock. "1000 ms" = 1 second delay
l-:   ldr= r0 1913   
l-:   subs r0 #1
      bne -
      subs r6 #1
      bne --      
      drop
      ;

This is the actual "ms" Word disassembly after it has been compiled by Forth. The movs, lsls and adds instructions are the only way to load a Cortex-M0 register with a value greater than 8 bits without referencing the PC (using Thumb). The ldr= Word works out the shortest possible sequence of movs, lsls and adds for the given value. If you shift 0xEF left 3 places and then add 1, you get 1913.

: calc $ef 3 lshift 1 + . cr ;  ok.
 calc 1913

"ms" Word disassembly
--------------------------------
200005B8: 20EF  movs r0 #EF
200005BA: 00C0  lsls r0 r0 #3
200005BC: 3001  adds r0 #1
200005BE: 3801  subs r0 #1
200005C0: D1FD  bne 200005BE
200005C2: 3E01  subs r6 #1
200005C4: D1F8  bne 200005B8
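
Outside Forth, the same ldr= decomposition is easy to see in a few lines of Python. This is just my own illustrative sketch (the decompose helper is hypothetical, not part of Mecrisp-Stellaris, and it only handles the single movs/lsls/adds case):

def decompose(v):
    # Find (base, shift, add) with (base << shift) + add == v,
    # where base and add both fit in the 8-bit movs/adds immediates.
    if v <= 0xFF:
        return (v, 0, 0)          # a single movs is enough
    for shift in range(1, 25):
        base = v >> shift
        add = v - (base << shift)
        if base <= 0xFF and add <= 0xFF:
            return (base, shift, add)
    return None                   # needs a longer sequence

print(decompose(1913))            # -> (239, 3, 1): movs #0xEF, lsls #3, adds #1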

Once I've developed my delay word and tested it, I can then just inline the Machine Code into a Forth Word like this, which doesn't need the "Interactive Assembler" facility I used above:

: ms ( u -- )     \ millisecond blocking delay for Cortex-m0 with 8MHz rc clock (mecrisp-stellaris default)
   [       
   $20EF h,   \ movs r0 #$EF
   $00C0 h,   \ lsls r0 r0 #3
   $3001 h,   \ adds r0 #1
   $3801 h,   \ subs r0 #1
   $D1FD h,   \ bne back to the subs r0 #1
   $3E01 h,   \ subs r6 #1
   $D1F8 h,   \ bne back to the start
   ] drop
 ;
Title: Re: Too many programming languages?
Post by: legacy on October 06, 2019, 12:15:38 pm
I am really thinking about pushing a micro Forth or Lisp core inside the Linux kernel (yes, in kernel space), because Kgdb with "early console" sucks ass on the SGI/MIPS IP30(1), and because I need a way to reverse engineer a blasted IBM-SGI crossbar chip, whose behavior is weird and covered by no public documentation.

With ucLisp or Forth in kernel space, it would be possible to modify the kernel behavior around that blasted crossbar chip in real time and interactively, without recompiling anything.
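
To make the idea concrete, here is a toy sketch in Python of the kind of tiny interpreter core being described - purely illustrative (a real in-kernel Forth or Lisp would be written in C/assembly against kernel APIs, and the words shown here are made up):

# Toy Forth-style REPL: a dictionary of words plus a data stack.
stack = []
words = {
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    "dup": lambda: stack.append(stack[-1]),
    ".":   lambda: print(stack.pop()),
    # an in-kernel version would add words like peek/poke on MMIO addresses
}

def interpret(line):
    for token in line.split():
        if token in words:
            words[token]()                   # execute a known word
        else:
            stack.append(int(token, 0))      # otherwise parse a literal (0x... works)

interpret("2 3 + dup + .")                   # prints 10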



(1) the first serial console usable is behind a PCI bridge, which is behind the crossbar chip  :palm:
Title: Re: Too many programming languages?
Post by: legacy on October 06, 2019, 12:33:34 pm
(http://www.downthebunker.com/chunk_of/stuff/public/projects/workstations/mips/sgi-ip30/sgi-ip30-machines-A-B.jpg)

This is the hardware I am talking about. Note the hacked front plane: we added a mechanism to remotely reboot (and power on/off) the machine when the kernel screws up.

(http://www.downthebunker.com/chunk_of/stuff/public/projects/workstations/mips/sgi-ip30/sgi-ip30-kernel-exit.png)

We have recently developed two holy software functions, "machine_poweroff()" and "machine_reset()", which do their job, but ... you cannot trust them, because sometimes that blasted crossbar chip (which is also a "heart" chip) goes nuts with SMP and gets the CPUs completely out of control.

And here is where the hacked front panel plays its trick and saves the day.

So, we are afraid of no freak-ghost in kernel space  :D
Title: Re: Too many programming languages?
Post by: legacy on October 06, 2019, 12:37:24 pm
(to be continued ... in 2020, 2021 ... )
Title: Re: Too many programming languages?
Post by: tggzzz on October 06, 2019, 12:47:26 pm
C is an utter minefield in comparison.

Never said C would be an appropriate language to teach programming either. ;D

Oh well. Never tickle LISP users...

There is too much "inappropriate teaching", unfortunately. Prodding LISP users can be fun, provided care is taken to ensure everybody realises what is happening :) (Ditto Java/C++/xC/Ada/Smalltalk/Rust/VHDL/etc)
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 06, 2019, 03:47:37 pm
I am really thinking about pushing a micro Forth or Lisp core inside the Linux kernel (yes, in kernel space)

Whereas I'm sure some in your team are pushing for an Erlang runtime. ;D
Title: Re: Too many programming languages?
Post by: legacy on October 06, 2019, 04:02:58 pm
Whereas I'm sure some in your team are pushing for an Erlang runtime. ;D

We have already pushed it onto a MIPS/LE router, where it's experimentally profiled as a common userspace application; whereas uLisp or uForth in kernel space is really something *freak*, but it's necessary because we have no freak idea how to reverse engineer that freak hardware.

With freak hardware stuff, you need freak tools to fight your battles ;D


Title: Re: Too many programming languages?
Post by: legacy on October 06, 2019, 04:11:40 pm
(http://www.downthebunker.com/chunk_of/stuff/public/projects/sonoko-x11/router-board/pics/rb532-without-502-1.jpg)

A lot of work to make it happen, but this actually runs Erlang  :o :o :o
Title: Re: Too many programming languages?
Post by: SiliconWizard on October 06, 2019, 04:17:59 pm
A lot of work to make it happen, but this actually runs Erlang  :o :o :o

Nice. Now program this in Rust. :P
Title: Re: Too many programming languages?
Post by: legacy on October 06, 2019, 04:36:06 pm
Rust cannot be compiled for MIPS at the moment, because it depends on LLVM, which has some, emm, quirks and problems with MIPS  :o :o :o

(the above router is MIPS32-r2/LE:32bit, while the above big-iron machine is MIPS4-6/BE:64bit)

Sooner or later we will push Rust onto PowerPC (maybe also POWER9? any sponsor here?), while on HPPA we have to give up because LLVM support was never planned to happen.

Anyway, the above Erlang router has two purposes: our pleasure to play with it (which offers good psychological motivation to seriously learn Erlang), and to satisfy a new customer's request.

Title: Re: Too many programming languages?
Post by: techman-001 on October 06, 2019, 09:19:49 pm
Whereas I'm sure some in your team are pushing for an Erlang runtime. ;D

We have already pushed it onto a MIPS/LE router, where it's experimentally profiled as a common userspace application; whereas uLisp or uForth in kernel space is really something *freak*, but it's necessary because we have no freak idea how to reverse engineer that freak hardware.

With freak hardware stuff, you need freak tools to fight your battles ;D

Is there a uForth? Normal Forth is the smallest interactive binary you'll ever find, apart from Basic (ughh, brain damage awaits there), I think.
Title: Re: Too many programming languages?
Post by: Sal Ammoniac on October 07, 2019, 04:04:25 pm
sicp is a great book, but it's designed for people with very high mathematical knowledge and there are much better Scheme choices for average programmers.

That I would (partially) agree with. Scheme is simple, but the SICP book quickly gets difficult for average people. I would say that wasn't an issue for its original target audience--MIT students--who are hardly average.

When I went through that book, I had previously written thousands of lines of Elisp (EMACS' extension language), so I was already familiar with LISP (which Scheme is based on).
Title: Re: Too many programming languages?
Post by: westfw on October 08, 2019, 02:08:53 am
Quote
Is there a uForth ?
For a kernel-internal Forth like I think Legacy was talking about, I'd consider anything that doesn't interface directly with OS services as a sort of "MicroForth."  You know, for poking around in the OS data structures and such for debugging purposes, but not for actually being a larger-system programming language...
Title: Re: Too many programming languages?
Post by: bjdhjy888 on October 09, 2019, 03:19:30 am
Does westfw believe the earth is flat? Yes.
Does westfw believe NASA astronauts did not land on the moon? Yes.
Does westfw believe Forth is ranked as the number 1 programming language in 2019? YES!
Does westfw want to hire me as a programmer, where I only know C++? YES! YES!
 :popcorn:
Title: Re: Too many programming languages?
Post by: legacy on October 09, 2019, 06:03:26 am
[..] I'd consider anything that doesn't interface directly with OS services as a sort of "MicroForth."  You know, for poking around in the OS data structures and such for debugging purposes, but not for actually being a larger-system programming language...

yup, precisely. It's a crazy idea, I know, but I am out of alternatives  :-//
Title: Re: Too many programming languages?
Post by: techman-001 on October 09, 2019, 06:04:14 am
Does westfw believe the earth is flat? Yes.
Does westfw believe NASA astronauts did not land on the moon? Yes.
Does westfw believe Forth is ranked as the number 1 programming language in 2019? YES!
Does westfw want to hire me as a programmer, where I only know C++? YES! YES!
 :popcorn:

Whilst I can't presume to speak for Westfw, I feel it's more likely that he believes that you have poor fact retention because I'm the Forth guy around here, not him.

Westfw is merely your highly experienced embedded professional with serious credibility, very wide experience, impeccable manners and the patience of Job.

I  don't have any programming positions open, but if I need a clown, I'll bear you in mind.
Title: Re: Too many programming languages?
Post by: techman-001 on October 09, 2019, 06:06:34 am
[..] I'd consider anything that doesn't interface directly with OS services as a sort of "MicroForth."  You know, for poking around in the OS data structures and such for debugging purposes, but not for actually being a larger-system programming language...

yup, precisely. It's a crazy idea, I know, but I am out of alternatives  :-//

Why crazy ?
Freebsd uses Forth as a bootloader (FICL).
Title: Re: Too many programming languages?
Post by: legacy on October 09, 2019, 06:31:44 am
Why crazy ?
Freebsd uses Forth as a bootloader (FICL).

well, even the PowerMac's firmware comes with a Forth interpreter, but it doesn't operate inside any kernel, it's a bootloader, and it loads a kernel  :D

While I am going to push an interpreter inside the kernel, that is the crazy part, and it's so really crazy, that it only makes sense *IF* and only *IF* you cannot use kgdb or any hardware debugger (e.g. BDI2000? for IP30? forget it, there is no known and documented e/jtag) for debugging the kernel.
Title: Re: Too many programming languages?
Post by: techman-001 on October 09, 2019, 06:48:07 am
Why crazy ?
Freebsd uses Forth as a bootloader (FICL).

well, even the PowerMac's firmware comes with a Forth interpreter, but it doesn't operate inside any kernel, it's a bootloader, and it loads a kernel  :D

While I am going to push an interpreter inside the kernel, that is the crazy part, and it's so really crazy, that it only makes sense *IF* and only *IF* you cannot use kgdb or any hardware debugger (e.g. BDI2000? for IP30? forget it, there is no known and documented e/jtag) for debugging the kernel.

DOH! of course. Apologies.

Even so, as your aim is to obtain information on "that blasted crossbar chip", would not a self-contained Forth help to investigate and map that chip's operation within the hardware itself?
Title: Re: Too many programming languages?
Post by: legacy on October 09, 2019, 07:03:14 am
Even so, as your aim is to obtain information on "that blasted crossbar chip" would not a self contained Forth help to investigate and map that chips operation within the hardware itself ?

Precisely  :D

Consider that there is no directly exposed PCI bridge, hence you cannot open a PCI window anywhere to attach anything, nor hook up a PCI bus analyzer. Oh, and there is also no JTAG.

This explains why we are still unable to plug a PCI-USB card into the XIO-PCI cage with success, although the IP30 has been under reverse engineering for the last 15 years.

I don't want to appear arrogant, but I believe it's time to try a different approach  :D
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on October 09, 2019, 07:30:22 am
I run this sort-of-forth in all my esp32s: "Autodesk Threaded Language Application System Toolkit"

https://www.fourmilab.ch/atlast/ (https://www.fourmilab.ch/atlast/)
https://www.eevblog.com/forum/microcontrollers/somebody-goofed-up/msg2668650/#msg2668650 (https://www.eevblog.com/forum/microcontrollers/somebody-goofed-up/msg2668650/#msg2668650)

Just "nc esp32ip 23" and voilà, c'est magnifique!
Title: Re: Too many programming languages?
Post by: techman-001 on October 09, 2019, 08:49:02 am
I run this sort-of-forth in all my esp32s: "Autodesk Threaded Language Application System Toolkit"

https://www.fourmilab.ch/atlast/ (https://www.fourmilab.ch/atlast/)
https://www.eevblog.com/forum/microcontrollers/somebody-goofed-up/msg2668650/#msg2668650 (https://www.eevblog.com/forum/microcontrollers/somebody-goofed-up/msg2668650/#msg2668650)

Just "nc esp32ip 23" and voilà, c'est magnifique!

George, very nice, thanks for the link!

Every Forth I have seen is different, no two are alike. There really isn't a "standard Forth" in my opinion. Every one is a "sort of Forth".

Even Chuck Moore ( the father of Forth) when asked "what is Forth ?" replied "I can't say for sure, but I know it when I see it".

From the Atlast document:-

... "So what is Atlast? Well...it's FORTH, more or less. Now I'm well aware that the mere mention of FORTH stimulates a violent immune reaction in many people second, perhaps, only to that induced by the utterance of the dreaded word "LISP." Indeed, more that 12 years after my first serious encounter with FORTH, I am only now coming to feel that I am truly beginning to "get it"--to understand what it's really about, what its true strengths (and weaknesses) are, and to what problems it can offer uniquely effective solutions."

"... Atlast™ is a toolkit that makes applications programmable. Deliberately designed to be easy to integrate both into existing programs and newly-developed ones, Atlast provides any program that incorporates it most of the benefits of programmability with very little explicit effort on the part of the developer. Indeed, once you begin to “think Atlast” as part of the design cycle, you'll probably find that the way you design and build programs changes substantially. I'm coming to think of Atlast as the “monster that feeds on programs,” because including it in a program tends to shrink the amount of special-purpose code that would otherwise have to be written while resulting in finished applications that are open, extensible, and more easily adapted to other operating environments such as the event driven paradigm. .."

Compiled fast and easily and running on Freebsd:
atlast-2.0% ./atlast
ATLAST 2.0 (2014-07-04) [64-bit] This program is in the public domain.
-> 2 2 + .
4 -> words

STDERR
STDOUT
STDIN
+
-
*
/
MOD
/MOD
MIN
MAX
NEGATE
ABS
=
<>
>
<
>=
<=
AND
->
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on October 09, 2019, 09:01:27 am
Splendid!  :clap:

How does it compare to the other Forths you use? Is it speedy? Is the binary small-ish? How does the .bin size compare to the other ones you've got?

Edit:
I'd want to use the smallest one I can find, because I'm seeing quite a lot of flash cache misses. That slows down my esp32s.
Title: Re: Too many programming languages?
Post by: techman-001 on October 09, 2019, 10:26:59 am
Splendid!  :clap:

How does it compare to the other Forths you use? Is it speedy? Is the binary small-ish? How does the .bin size compare to the other ones you've got?

Edit:
I'd want to use the smaller one I can find, because I'm seeing quite a lot of flash cache misses. That slows down my esp32s.

Caveat: I'm a Forth user, not a Forth designer. Although I program, I'm not a programmer; I'm a 65 year old electronics technician.

Comparisons.
I can't really compare any two Forths; they're all different. Even the one I use every day, "Mecrisp-Stellaris", has differences between Cortex-M chips, even ones made by the same company, due to Flash controller designs, peripheral differences etc.

Speed
It's blindingly fast on this 8-core i7, but then so is Cortex-M Forth running under QEMU.

Binary Size
Atlast on FreeBSD X86: 123072 bytes, not particularly small.
Picolisp on FreeBSD X86 is 209608 bytes.
FICL Forth on FreeBSD X86 is 7232 bytes, REALLY TINY. http://ficl.sourceforge.net/ (http://ficl.sourceforge.net/)
   Ficl is a complete programming language interpreter designed to be embedded into other systems (including firmware based ones) as a command, macro, and development prototype language.  Ficl stands for "Forth Inspired Command Language".
Mecrisp-Stellaris on 32 bit Cortex-M0 is 16KB, written in Assembly.

General Comments
Documentation: excellent and professionally written
Resources: it has a TON of useful Words for X86 including file operations, maths, geometry etc.
General feel: it feels very well designed and slick, I like it

I have a question or two myself
1. How do you talk to Atlast on an ESP32? I assume you had to write a USART terminal interface?
2. I assume you use GCC and compile C for the ESP32, then use Atlast as a debugger for your program?
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on October 09, 2019, 11:01:41 am
Here it is: https://gitlab.com/xk/esp32_pseudo_forth

If you want to telnet or netcat that's not there, it's a later version I haven't uploaded there. This one works over the serial port only. I can try to update it soon-ish if you're interested. Just let me know.
Title: Re: Too many programming languages?
Post by: techman-001 on October 09, 2019, 11:39:31 am
Here it is: https://gitlab.com/xk/esp32_pseudo_forth

If you want to telnet or netcat that's not there, it's a later version I haven't uploaded there. This one works over the serial port only. I can try to update it soon-ish if you're interested. Just let me know.

That's very kind of you, but I won't need it as I'm pretty much only Cortex-M0 for the next decade or so :)

I do everything via serial and GnuScreen for automated remote source uploads, error detection etc.
 
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on October 09, 2019, 12:55:50 pm
That's very kind of you, but I won't need it as I'm pretty much only Cortex-M0 for the next decade or so :)

I do everything via serial and GnuScreen for automated remote source uploads, error detection etc.

You ought to play with an esp32 asap... Then you'll routinely do many more ncs than screens...

FICL Forth on FreeBSD X86 is 7232 bytes, REALLY TINY. http://ficl.sourceforge.net/ (http://ficl.sourceforge.net/)
   Ficl is a complete programming language interpreter designed to be embedded into other systems (including firmware based ones) as a command, macro,
and development prototype language.  Ficl stands for "Forth Inspired Command Language".

I'm going to have to try that one.
Title: Re: Too many programming languages?
Post by: westfw on October 10, 2019, 01:07:59 am
Quote
Whilst I can't presume to speak for Westfw, I feel it's more likely that he believes that you have poor fact retention because I'm the Forth guy around here, not him.
heh.  Pretty much.   And thank you for the kind words...
I hope that what the original poster gets from all these many pages of discussion is that there are a lot of "languages" that have a significant segment of "haters", but are still useful IN SOME SITUATIONS. Not many things hurt worse than trying to shoe-horn some function into a language where it just doesn't belong, just because it's the language you know (or worse, the language your employer forces you to use).
Title: Re: Too many programming languages?
Post by: legacy on October 10, 2019, 07:31:27 am
or worse, the language your employer forces you to use.

DTB is a sort of *language* for describing nodes in a device tree. I know where it makes sense, but PowerPC SoCs (by AMCC) cannot be listed among any valid motivations for it; besides, I find DTB an unnecessary extra layer of complexity (which makes things more prone to fail), hence I do not like it, but I am forced to learn and use it.

Title: Re: Too many programming languages?
Post by: legacy on October 10, 2019, 07:35:28 am
I do everything via serial and GnuScreen for automated remote source uploads, error detection etc.

Can you tell me more about this?
Title: Re: Too many programming languages?
Post by: techman-001 on October 10, 2019, 12:06:19 pm
I do everything via serial and GnuScreen for automated remote source uploads, error detection etc.

Can you tell me more about this?

Love to :)

Traditional Forth uses a serial terminal to interactively command the MCU (in real time) and to upload source to it. The Forth I use on Cortex-M is a traditional but modern Forth, which I use in the traditional way while making use of modern applications.

Forth serial Terminal use has a few variations not limited to my description below because anything is possible with Forth.

1. A terminal such as Minicom, Picocom, or even HyperTerminal on Windows is used to interactively command in real time and to upload source to the MCU, making use of the ASCII serial upload capability of the terminal or a 'helper' program called by the terminal for uploading.
In these cases the source code comments are read but rejected by the on-MCU Forth compiler, and end-of-line delays are inserted by the terminal (where supported) to prevent the on-MCU Forth compiler choking on long or complex source. There is usually no handshaking of any kind to use as an alternative to the EOL delays, and speeds are often 9600 baud so that errors and warnings from the on-MCU Forth compiler can be read by the programmer as they occur.

2. A purpose-made terminal with onboard smarts and file upload capability, such as eforthcom, is used instead. This special terminal is made especially to suit Forth: it strips the comments from the source, recognizes errors and colors them and/or stops the upload completely at the error, plus waits for the "OK," returned by the on-MCU Forth compiler, giving the fastest possible upload speed without EOL delays or traditional hardware handshaking.

3. What I use.
I use GNU Screen in my own IDE (which is designed for the fastest possible speed) to do the following:

a. Interactively command the on-MCU Forth in the GNU Screen window.
b. Upload source to the MCU when I click my editor 'make' button. It does this by opening a remote connection to GNU Screen and uploading commands and source via a Makefile. I do all this from the editor window, but I see the upload as it occurs in the GNU Screen window... sort of, because at 460800 baud with hardware handshaking it's hard to read anything as it flashes up the screen.
c. All comments are stripped from the source by SED before being uploaded in b. above.
d. Any errors from the on-MCU Forth compiler are highlighted in RED in GNU Screen using a sub-process involving configurable SED keywords and ANSI escape codes inserted into the RX stream. Warnings are coloured BLUE.
Warnings and errors RING the GNU Screen terminal bell as they occur. This is all needed because the uploads are too fast to read.
e. Stop the upload at the error if desired. This can be done with a toggle switch at the development hardware. I don't use this nowadays, as coloured errors and the beep of the terminal bell are all I need. GNU Screen has a long history buffer and I can easily scroll back to look for the coloured error after hearing a beep.
f. Suck the entire Dictionary from the MCU in Intel HEX and build a complete binary image of everything, including user Words from a project. This image can then be flashed to another MCU, which is then a 100% clone.

Personally I find traditional Forth, as in 1. above, far, far, far too slow, and I wouldn't blame anyone who tries Forth in that way for thinking it was a load of ancient crap compared to using arm-none-eabi and GDB via SWD on a Cortex-M.

I developed my system until it was faster than arm-none-eabi and GDB via SWD on a Cortex-M, using the usual write code, try, fix, repeat development cycle.

Only then was I happy.

My system is far more than just GNU Screen above, as it involves:
1. a project builder
2. integrated Fossil SCM with webserver
3. an integrated, configurable CMSIS-SVD memory map and bitfield generator for any STM32 Cortex-M MCU. This means I can easily write Forth code, as I have every peripheral memory mapped automatically. I never have to refer to the documentation for memory locations or register bitfields. Doing this manually is tedious, error prone and incredibly slow. No one should ever have to do it, and I'm amazed by Forth people who still do this manually on Cortex-M.
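
As a rough illustration of the ideas above (strip comments, send a line, wait for the compiler's acknowledgement before sending the next), here is a minimal Python/pyserial sketch. It is only a sketch: the port name, baud rate, prompt string and file name are assumptions, it only handles Forth's backslash line comments, and my real setup uses SED and GNU Screen rather than Python:

import serial  # pyserial

PROMPT = b" ok."                          # assumed Forth compiler acknowledgement

def strip_comment(line):
    code, _, _ = line.partition("\\")     # drop a trailing '\' line comment
    return code.rstrip()

with serial.Serial("/dev/ttyUSB0", 460800, timeout=2) as port:
    for line in open("app.fs"):
        code = strip_comment(line)
        if not code:
            continue                      # nothing left to send
        port.write(code.encode() + b"\r")
        reply = port.read_until(PROMPT)   # block until the prompt (or timeout)
        if PROMPT not in reply:
            print("error or timeout at:", code)
            break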
Title: Re: Too many programming languages?
Post by: nigelwright7557 on November 21, 2019, 04:33:35 am
I am currently working on an MVC website project.
It uses HTML, ASP.NET, the .NET Framework, C#, Razor, Ajax, JavaScript, JSON and jQuery.
It took a while to get into transferring data between the different languages.

Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 07:59:06 am
It uses HTML, ASP.NET, the .NET Framework, C#, Razor, Ajax, JavaScript, JSON and jQuery.

My condolences.
Title: Re: Too many programming languages?
Post by: Berni on November 21, 2019, 08:12:07 am
I am currently working on an MVC website project.
It uses HTML, ASP.NET, the .NET Framework, C#, Razor, Ajax, JavaScript, JSON and jQuery.
It took a while to get into transferring data between the different languages.

And this is one of the reasons I don't like web development.

Not only are there a gazillion languages for every single thing, but as time goes on browsers drop support for some technologies and introduce new ones. It's all frameworks running on top of frameworks that use a framework through another framework that runs inside a convoluted interpreter in a browser, while having a few unique sets of quirks depending on whether your browser is Firefox, Chrome or Edge.
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 09:17:27 am
I am currently working on an MVC website project.
It uses HTML, ASP.NET, the .NET Framework, C#, Razor, Ajax, JavaScript, JSON and jQuery.
It took a while to get into transferring data between the different languages.

And this is one of the reasons I don't like web development.

Not only are there a gazillion languages for every single thing, but as time goes on browsers drop support for some technologies and introduce new ones. It's all frameworks running on top of frameworks that use a framework through another framework that runs inside a convoluted interpreter in a browser, while having a few unique sets of quirks depending on whether your browser is Firefox, Chrome or Edge.

... and all to put a few pixels on a screen, while pretending the screen is a paper-like display :(

I exaggerate, but the layering of frameworks with poor documentation and unclear interactions, which obscure fundamental principles, is indeed a real issue.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 09:27:15 am
As there's a million ways to do a web, people are doing it in a million different manners. Both server and client side. That's a good thing if you ask me. I like it.
Title: Re: Too many programming languages?
Post by: westfw on November 21, 2019, 10:12:10 am
I had a co-worker/friend in college (~1980) whose resume trumpeted that 9 different programs (or something like that) had been used to format/print it.

I wasn't sure that was a good thing to put in a resume :-)

(OTOH, that was back when most resumes were just typed on an actual typewriter, and programmers were rarer...)

Title: Re: Too many programming languages?
Post by: Berni on November 21, 2019, 10:36:30 am
As there's a million ways to do a web, people are doing it in a million different manners. Both server and client side. That's a good thing if you ask me. I like it.

Yeah, but just because you can do it differently doesn't automatically mean you should.

A lot of these web technologies are quickly thrown together things that are not extensively tested through time, and they often have poor documentation. Making things worse, these things are layered on top of each other to the point where the developer has no idea what is actually going on under the hood. Weird nonsense bugs pop up from the interaction of all these, and the developer "fixes" the bug mostly by trying random things until it goes away.

If you look at the common languages used for developing classical Windows or Linux applications, you see there are a lot fewer of them, and it's rare for languages to be layered on top of each other. Sure, there are still multiple languages found in some apps, such as perhaps a C# app using a DLL written in C++, but it doesn't tend to go to nearly the same extent as web stuff. Also, finding bugs tends to be more methodical: you use a debugger to drill down all the way to the memory of a program and see if it's doing what it should.

Nothing wrong with reinventing the wheel when there is a reason to do it. But don't reinvent the same 4-spoked aluminium wheel with 9 bolt holes in a star pattern and secure it using Tri-wing screws, because you have a deep religious belief that things that are a multiple of 3 are better, despite this making you incompatible with the 4-spoked aluminium wheels with a standard 5-hole pattern and hex head bolts that people have been using for many years with great success.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 10:53:42 am
and hex head bolts that people have been using for many years with great success.

That's the thing: nobody has been doing webs (in the same way, with the same "bolts" as you say) for many years with great success, because everything is in permanent flux. No two webs are the same, yet still, it works. There's beauty in that.
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 11:07:06 am
Nothing wrong with reinventing the wheel when there is a reason to do it.

Merely one infamous example of that was SOAP, which was RPC over HTTP in XML format. Now, RPC just about works inside a single organisation that has fast, reliable intranets and can control all the machines and processes interconnected with it. Even then it is a real pain to keep going; ask anybody that has dealt with CORBA-based systems!

Unfortunately none of those prerequisites exist on the web, and SOAP completely ignores the positive features of the web.

That didn't stop those ignorant of the past from unthinkingly jumping onto the SOAP bandwagon, and it took a long time for that fad to decline.
Title: Re: Too many programming languages?
Post by: Berni on November 21, 2019, 11:21:40 am
Things don't have to be constantly changing.

A good example of it is e-mail. You could take an old 286 PC running MS-DOS, install an email client, dial up to the internet with a modem, and read emails that someone sent you from a phone running the latest Android release.

Sure, e-mail did have some tweaks and updates over the many years, but it's still fundamentally the same thing.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 01:09:40 pm
That didn't stop those ignorant of the past from unthinkingly jumping onto the SOAP bandwagon, and it took a long time for that fad to decline.
The utter cluelessness of the W3C is to blame. For more than a decade they tried to shove these and many other complicated, useless, insufferable masses of crap down our throats. Thankfully the WHATWG put an end to that, and most of it rests in peace now.

For example, did you know that for FOURTEEN years there was no way to programmatically draw a line or paint even a mere pixel on a web page? That's (the idiocy of) the W3C in action. No wonder people liked Flash!
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 01:42:56 pm
For example, did you know that for FOURTEEN years there was no way to programmatically draw a line or paint even a mere pixel on a web page? That's (the idiocy of) the W3C in action. No wonder people liked Flash!

No! A major point of the web and HTML is that the user's browser works out how best it can display the content on the user's device.  Note that there is no "server" in that sentence.

The concept that a web designer can specify individual pixels has led to some of the worst-designed and least functional web pages. Consider the web page shown below, which I have to contend with frequently:
(https://www.eevblog.com/forum/programming/too-many-programming-languages/?action=dlattach;attach=876232)

Alternatively, have a look at this idiocy, which I could read from the other side of the room...

(https://www.eevblog.com/forum/programming/too-many-programming-languages/?action=dlattach;attach=876236)
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 01:43:19 pm
Things don't have to be constantly changing.
The web gives you an empty virtual machine with an empty screen that's the web page, a low level API (the DOM), a fantastic language that's a chameleon, and there you are free to do whatever you want, however you like. Therefore: chaos. 7.7 billion people in the world => 7.7 billion ways to do it. But it works!
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 01:59:59 pm
No! A major point of the web and HTML is that the user's browser works out how best it can display the content on the user's device.  Note that there is no "server" in that sentence.

I meant the <canvas> tggzzz. Nothing to do with that! Have you seen the über cool web interface of the Rohde & Schwarz scopes? That's impossible to do without it.
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 02:05:56 pm
No! A major point of the web and HTML is that the user's browser works out how best it can display the content on the user's device.  Note that there is no "server" in that sentence.

I meant the <canvas> tggzzz. Nothing to do with that! Have you seen the über cool web interface of the Rohde & Schwarz scopes? That's impossible to do without it.

No, I haven't seen those but I doubt it is a benefit. Have a look at Digilent Waveforms. And no, I don't want to see a scope display on a cellphone, thank you.

Let's get the basics right before emulating TVs in a small part of the screen :)

Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 02:31:29 pm
You don't like web apps?
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 03:12:45 pm
You don't like web apps?

I've tried a WiFi scope on an 8" 1920*1024 tablet. Usable in special circumstances (e.g. up a ladder), not as a bench tool.

Q1: what's the USP benefit of a webapp over an app?

Q2: if running in a smartphone, what's the battery life?

Q3: if running in a smartphone, are the GUI controls usable on such a small screen?

Q4: how do you support the display so that it can be used hands free?

And most importantly, is it worth giving shiny new toys to web developer so they can ignorantly cock up finding information on the web in more ways? The latter is a resounding "no".
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 21, 2019, 04:35:30 pm
And most importantly, is it worth giving shiny new toys to web developer so they can ignorantly cock up finding information on the web in more ways? The latter is a resounding "no".

Oh, but that gives them the opportunity to flood Youtube with myriads of videos talking about the new web "technologies" and trying to teach (preach?) the whole world about those!  >:D
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 04:54:28 pm
And most importantly, is it worth giving shiny new toys to web developer so they can ignorantly cock up finding information on the web in more ways? The latter is a resounding "no".

Oh, but that gives them the opportunity to flood Youtube with myriads of videos talking about the new web "technologies" and trying to teach (preach?) the whole world about those!  >:D

Most such videos (and many blog posts) are little more than "look at me, I managed to install X and execute the equivalent of a 'Hello World' program".

The better ones go a bit deeper and try to replicate something that isn't a trivial canned example.

Very few do a useful "compare and contrast" exercise - probably because they haven't a clue about what already exists and is used in production systems.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 05:08:36 pm
Yes, the web hipsters are annoying, but aren't you being a bit narrow-minded? Because if I can grab any phone/tablet/desktop running any OS (iOS/Android/Linux/Windozes/MacOS/whatever) and connect to my scope in a sec with the browser, without having to find, buy, or download and install anything, I call that a win. And it works (if done properly) just as well as a native app, which in many cases is made with the same HTML+JS in a webview packaged as a normal app.
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 21, 2019, 05:24:43 pm
Because if I can grab any phone/tablet/desktop running any OS (iOS/Android/Linux/Windozes/MacOS/whatever) and connect to my scope in a sec with the browser, without having to find, buy, or download and install anything, I call that a win. And it works (if done properly) just as well as the proprietary/dedicated app, which in many cases is made with the same HTML+JS in a webview packaged as a normal app.

You're right that usefulness should be one of the main criteria here.

Now, with your example of a scope, it depends on what you're going to do. If it's just for capturing screenshots/retrieving sampled data, many modern scopes these days support appearing as a removable drive when plugged in over USB - so retrieving files is trivial, and no specific software needs to be installed (the recent scopes that don't support this just plain suck, come on vendors!). You won't gain any time (it will just add some) using a tablet or phone as an intermediary, as the files are still ultimately likely to be used on a computer, and not on a mobile device.

If it's to act as some kind of remote control (so not just retrieving files, but controlling the scope), I personally don't see the point. Any scope, even a lousy one, is a lot more usable directly, with its knobs and buttons, than via a remote touch-screen interface. Paradoxically, those mobile UIs for scopes (and lab equipment in general) often mimic actual physical knobs and buttons (but are of course a lot clunkier to use). What's the benefit?
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 05:38:58 pm
With the MicSig app you use it exactly in the same way: the touch interface in the tablet is identical. It isn't a web app, but could (should!) have been :) I often use it via WiFi just to take a screenshot and send it via whatsapp/telegram in a sec. No pendrives, no cables, no nothing needed.
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 06:14:49 pm
Yes, the web hipsters are annoying, but aren't you being a bit narrow-minded? Because if I can grab any phone/tablet/desktop running any OS (iOS/Android/Linux/Windozes/MacOS/whatever) and connect to my scope in a sec with the browser, without having to find, buy, or download and install anything, I call that a win. And it works (if done properly) just as well as the proprietary/dedicated app, which in many cases is made with the same HTML+JS in a webview packaged as a normal app.

I mostly use scopes with their own screen, in which case an extra screen is of no benefit.

For the one occasional case of a scope without a screen, at the back of my bench there is a large computer screen with the computer on a shelf. It does not move, so installing software is not a problem. Consequently having another display device on the bench is not an advantage - and it would take up precious space. Hence there can be no benefit in that use case.

The remaining use cases where I might use a separate screen are very niche: scope perched on top of a ladder, or wireless to ensure physical isolation from me.

I will never just walk up to a scope and connect to it with a mobile device!

So I'm struggling to see a USP with significant benefits; the same ends can be achieved by other simpler means.

Apart from that, it may be Kewl Tek with other uses, but I've found a good career progression is based on adopting technologies which enable you to achieve things you can't by other means.

BTW, I've been an early adopter of many technologies over the decades, e.g. micros in 76, C in 81, OOP in 86, Java in 96, the web in 93 when everything was announced on cern.ch, google.edu, and more. I've even developed half of an award-winning distributed web business :)

Hence I can reasonably claim that I'm not afraid of new tech :)
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 21, 2019, 06:48:35 pm
Sorry... what is USP? And Kewl Tek?
Title: Re: Too many programming languages?
Post by: tggzzz on November 21, 2019, 07:00:36 pm
Sorry... what is USP? And Kewl Tek?

Unique selling point, and cool technology :)
Title: Re: Too many programming languages?
Post by: DimitriP on November 21, 2019, 07:29:51 pm
I had a co-worker/friend in college (~1980) whose resume trumpeted that 9 different programs (or something like that) had been used to format/print it.

I wasn't sure that was a good thing to put in a resume :-)

(OTOH, that was back when most resumes were just typed on an actual typewriter, and programmers were rarer...)

You say this as if it stopped happening.
It still happens today!

How about when you receive an email with a link to a webpage, where once you get there you have to open a PDF to read the announcement/newsletter etc.?
Brilliant!



Title: Re: Too many programming languages?
Post by: FrankBuss on November 21, 2019, 07:42:54 pm
For MCU you are basically limited to C (8 bit) and C/C++ (ARM) for any serious use. And then there is Verilog/ VHDL if you want to use FPGA.

These days you can use Python for FPGAs with Migen. Simple example I just did:

https://twitter.com/frank_buss/status/1193395941069479936

Use MicroPython/CircuitPython for the microcontroller, and Python scripts for a GUI on a PC if you need it, then it is Python all the way down, if you want :)
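
For a taste of what Migen code looks like, here is a minimal sketch of my own (not the example from the tweet): a free-running counter whose top bit drives an LED, converted to Verilog. It assumes the migen package is installed; the Blinker name and the counter width are just illustrative choices.

from migen import Module, Signal
from migen.fhdl.verilog import convert

class Blinker(Module):
    def __init__(self):
        self.led = Signal()                     # output port
        counter = Signal(24)                    # free-running divider
        self.sync += counter.eq(counter + 1)    # increments every clock
        self.comb += self.led.eq(counter[23])   # MSB toggles slowly

m = Blinker()
print(convert(m, ios={m.led}))                  # emit synthesizable Verilog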
Title: Re: Too many programming languages?
Post by: obiwanjacobi on November 22, 2019, 06:49:47 am
Just ran into these:

https://ziglang.org/

https://nim-lang.org/

Read a bit about zig -and there are YT videos- and it sounds really good. Also targets embedded.
Nim I just discovered and looks a bit 'more of the same'... <disclaimer>

[2c]
Title: Re: Too many programming languages?
Post by: Cerebus on November 22, 2019, 12:52:40 pm
Q1: what's the USP benefit of a webapp over an app?
For web-apps in general:
  • Scalability. Sometimes. If well architected.
  • Targeting a single platform (the browser) instead of multiple platforms (Windows, Linux etc, etc.). In theory.
  • Naturally oriented to a scalable 'cloud' platform, lending itself to lucrative on-going subscription revenue.
  • Provides useful employment to "full stack" programmers, thus keeping them off the streets and out of the hair of proper programmers.
  • That is all.

Quote
Q2: if running in a smartphone, what's the battery life?
Until you get to the office/next coffee shop/the inbuilt charger in your car/the train/the plane, just.
Quote
Q3: if running in a smartphone, are the GUI controls usable on such a small screen?
Yes, it'll be "responsive". So you only have to flick down the page 5 times to find the (undocumented) control you're looking for and flick back up. If you don't use that control often enough it'll be removed to the hamburger menu for your convenience and to keep your screen nice and uncluttered.
Quote
Q4: how do you support the display so that it can be used hands free?
Siri, Alexa or Cortana are your friends. You just teach them a new skill.
Quote
And most importantly, is it worth giving shiny new toys to web developer so they can ignorantly cock up finding information on the web in more ways?
See my earlier point on keeping the streets and offices Hipster free.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 22, 2019, 01:41:00 pm
Q2: if running in a smartphone, what's the battery life?
Until you get to the office/next coffee shop/the inbuilt charger in your car/the train/the plane, just.

LOL. But the web apps I make don't suck any more battery than native apps. Those of the hipsters with earrings and blue/green/pink hair who don't know any better than to script jQuery, on the other hand... well, maybe, yeah.
Title: Re: Too many programming languages?
Post by: tggzzz on November 22, 2019, 05:37:34 pm
Q1: what's the USP benefit of a webapp over an app?
For web-apps in general:
  • Scalability. Sometimes. If well architected.
  • Targeting a single platform (the browser) instead of multiple platforms (Windows, Linux etc, etc.). In theory.
  • Naturally oriented to a scalable 'cloud' platform, lending itself to lucrative on-going subscription revenue.
  • Provides useful employment to "full stack" programmers, thus keeping them off the streets and out of the hair of proper programmers.
  • That is all.

Quote
Q2: if running in a smartphone, what's the battery life?
Until you get to the office/next coffee shop/the inbuilt charger in your car/the train/the plane, just.
Quote
Q3: if running in a smartphone, are the GUI controls usable on such a small screen?
Yes, it'll be "responsive". So you only have to flick down the page 5 times to find the (undocumented) control you're looking for and flick back up. If you don't use that control often enough it'll be removed to the hamburger menu for your convenience and to keep your screen nice and uncluttered.
Quote
Q4: how do you support the display so that it can be used hands free?
Siri, Alexa or Cortana are your friends. You just teach them a new skill.
Quote
And most importantly, is it worth giving shiny new toys to web developer so they can ignorantly cock up finding information on the web in more ways?
See my earlier point on keeping the streets and offices Hipster free.

I try not to be a cynic, but I find it difficult to disagree with any of that.

I'm still interested in whether GeorgeOfTheJungle (or anyone else) can indicate a benefit of a webapp over a plain app in this case.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 22, 2019, 06:05:24 pm
I'm still interested in whether GeorgeOfTheJungle (or anyone else) can indicate a benefit of a webapp over a plain app in this case.

1) No need to buy
2) No need to install
3) Runs on any platform/OS
4) As fast as a native app
5) As good as a native app

What's not to like about it?
Title: Re: Too many programming languages?
Post by: Berni on November 22, 2019, 06:43:43 pm
I will admit I am also a bit allergic to web apps.

It is indeed very impressive what is now possible in HTML5. Entire apps including hardware 3D are no problem at all. So what's not to like about it? Well, all of this is the reason why browsers have become such famous resource hogs. More and more of these features are used just to make websites look a little bit prettier or to show ads in some novel new way. As a result, having an older computer with any less than 4GB of RAM will cause the browser to crash once a handful of such wasteful websites are open. Things with limited computational power, like slightly older phones, take 0.1 seconds to download the page and then spend 5 seconds rendering it. Sometimes these features are used to implement "innovative" new ways of navigating websites, such as infinite scrolling that loads things as you scroll down forever, often done in a bad way where the scripts break occasionally, or where following a link to something and going back resets you to the beginning of the page. All of this makes the website incredibly frustrating to use compared to a typical website from 2010. I'm not saying using these new browser technologies is bad, you can do some cool stuff, but don't shove them into everything for no reason.

The other thing is that web apps live in a browser window. I personally am very messy in the use of my browser. I keep opening new tabs, middle-clicking lots of links and then moving through tabs in order to have pages ready for me to see rather than waiting, keeping a lot of tabs open in case I need them again in the near future, quickly firing off Google searches and opening the first 5 results in tabs. All of this is to optimize the limited performance of my "human machine interface" so that I get the information needed with as little latency as possible. I have limited time in my day and I don't need it unnecessarily wasted on looking for information. This means that web apps living in a browser tab get buried among everything else and sit under the same taskbar icon as the browser. This wastes more time, because it takes more than 1 click to bring up the application from any possible state of my Windows desktop. Since tabs also sometimes need closing, I tend to run "garbage collection" on them in batches, where I start mass-closing tabs related to things I am finished with, easily also inadvertently closing a web app and losing unsaved work. This is one of the main reasons why I always have at least 2 browsers installed (my primary is Firefox), so that I can have a browser window that is quickly accessible but does not get lost among other browser windows and tabs, as it's a separate application in Windows.

It might be fine if you only do one thing at a time on your PC, but I don't.
Title: Re: Too many programming languages?
Post by: Cerebus on November 22, 2019, 06:44:05 pm
1) No need to buy
That's not a necessary or even true attribute of a web app. There are plenty of non-web apps that you don't have to buy (or otherwise pay for) and plenty of web apps for which you will have to part company with currency to use.
Quote
2) No need to install
True in most cases, meantime you can enjoy the supply chain attacks from all the libraries and frameworks that you dynamically side-download from npm and god knows where else at the behest of your web app, watch while it makes API calls to graph.facebook.com, google-analytics.com, gravatar and all the other web peeping toms.
Quote
3) Runs on any platform/OS
In your dreams. Web apps are just as fussy about browser/platform as any native app. I am totally sick to the back teeth of the number of browsers and specific browser setups/profiles that I have to keep around to satisfy the non-portable heap of crap web apps that I have to use on a daily basis (e.g. HPE iLO remote console, VMware vSphere web client, various Juniper web clients).
Quote
4) As fast as a native app
Rarely in my experience. Witness the laggy, chaotic heaps of crap that are Jira and Confluence.
Quote
5) As good as a native app
Again, rarely in my experience.
Title: Re: Too many programming languages?
Post by: tggzzz on November 22, 2019, 07:16:53 pm
I'm still interested if GeorgeOfTheJungle (or anyone else) can indicate a benefit of a webapp over an plain app in this case.

1) No need to buy
2) No need to install
3) Runs on any platform/OS
4) As fast as a native app
5) As good as a native app

What's not to like about it?

1: many apps are free, many webapps aren't
2-5: as previously outlined, they aren't benefits to me in my use case
Title: Re: Too many programming languages?
Post by: tggzzz on November 22, 2019, 07:19:25 pm
1) No need to buy
That's not a necessary or even true attribute of a web app. There are plenty of non-web apps that you don't have to buy (or otherwise pay for) and plenty of web apps for which you will have to part company with currency to use.
Quote
2) No need to install
True in most cases, meantime you can enjoy the supply chain attacks from all the libraries and frameworks that you dynamically side-download from npm and god knows where else at the behest of your web app, watch while it makes API calls to graph.facebook.com, google-analytics.com, gravatar and all the other web peeping toms.
Quote
3) Runs on any platform/OS
In your dreams. Web apps are just as fussy about browser/platform as any native app. I am totally sick to the back teeth of the number of browsers and specific browser setups/profiles that I have to keep around to satisfy the non-portable heap of crap web apps that I have to use on a daily basis (e.g. HPE iLO remote console, VMware vSphere web client, various Juniper web clients).
Quote
4) As fast as a native app
Rarely in my experience. Witness the laggy, chaotic heaps of crap that are Jira and Confluence.
Quote
5) As good as a native app
Again, rarely in my experience.

Pretty much, but many Android (etc) apps "call home", or rather "call the neighbourhood and prison". Just look at how many apps want permission to look at your call records etc!
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 23, 2019, 11:41:44 am
Quote
2) No need to install
True in most cases, meantime you can enjoy the supply chain attacks from all the libraries and frameworks that you dynamically side-download from npm and god knows where else at the behest of your web app, watch while it makes API calls to graph.facebook.com, google-analytics.com, gravatar and all the other web peeping toms.

If you really believe the RTB2004 web app does that you're completely mistaken.

Quote
Quote
3) Runs on any platform/OS
In your dreams. Web apps are just as fussy about browser/platform as any native app. I am totally sick to the back teeth of the number of browsers and specific browser setups/profiles that I have to keep around to satisfy the non-portable heap of crap web apps that I have to use on a daily basis (e.g. HPE iLO remote console, VMware vSphere web client, various Juniper web clients).

See for example docs.google.com, which killed MS Office. It runs perfectly everywhere, even with older browsers. Or maps.google.com: idem.

Quote
Quote
4) As fast as a native app
Rarely in my experience. Witness the laggy, chaotic heaps of crap that are Jira and Confluence.

I don't know what Jira and Confluence are. If they're crap, don't use them.

Quote
Quote
5) As good as a native app
Again, rarely in my experience.

Better, because it runs as fast, runs on any platform, and you just have to type a URL. Done.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 23, 2019, 11:51:10 am
I will admit i am also a bit allergic to web apps.

Me too, not because they can't be as good or better, but because many are crap, apparently made by retards. On the other hand, there are lots of native apps that are crap too. But I'm sure that in the future all instruments will come with a web app for remote control, not a native app. And then people will have to learn to program them properly.
Title: Re: Too many programming languages?
Post by: Cerebus on November 23, 2019, 01:55:56 pm
I don't know what Jira and Confluence are. If they're crap, don't use them.

Ubiquitous task-tracking and project-documentation tools used by perhaps 50% of the development teams in the world and perhaps 80-90% of the DevOps world. If you haven't had them foisted on you, you're lucky. In the real world one often doesn't get the luxury of saying 'I won't use that'; the management at the multi-billion-dollar company that was paying my grossly inflated day rate last year wouldn't have taken kindly to me just saying 'nope' - if they want to waste that day rate on slowing me down with 'orrible tools, that's their prerogative. It doesn't mean I have to like it, but being overpaid to lump it is some consolation.
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 25, 2019, 03:27:41 pm
https://ziglang.org/
https://nim-lang.org/

Oh, I knew about nim, and I have seen tens (hundreds maybe, although that's a bit of a stretch) of other languages, but I had never heard of zig. Thanks for pointing it out. Apparently a nice candidate for "yet another prog language to replace C". As I often say, there seem to be more of them than people who know how to program (just slightly kidding :-DD )

I'll have a look anyway. Always fun, and hey, what do I know, it could actually be different.
Title: Re: Too many programming languages?
Post by: obiwanjacobi on November 26, 2019, 03:07:41 pm
I like Zig better than C. C should have been replaced a couple of decades ago. A sure sign to me that there is little software innovation in hardware land... Of course you guys will disagree - and that is fine.
Title: Re: Too many programming languages?
Post by: Mechatrommer on November 26, 2019, 03:52:07 pm
still has that stupid ";"
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 03:56:18 pm
Even more stupid yet is to require indentation and indent with a character that's invisible...  >:D
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 26, 2019, 04:24:26 pm
Require? One of the very few languages that *requires* indentation is Python. Talk about completely fucked. Of course there are so many Python lovers that they probably don't see any problem with this.

C, as most other languages, doesn't require indentation, nor any specific way of indenting. There are just coding styles. You can write however you see fit and that won't make a difference in the code itself. Indenting has been proven a simple and readable way of presenting code, so I don't see any problem using it, but it's certainly not "required".
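As a trivial sketch: these two functions compile to the same thing, since the compiler only sees a token stream and whitespace merely separates tokens:
Code: [Select]
#include <stdio.h>

/* Conventionally indented... */
int sum(int n) {
    int s = 0;
    for (int i = 1; i <= n; i++) {
        s += i;
    }
    return s;
}

/* ...and the same logic with no indentation at all.
   The compiler sees an identical token stream. */
int sum2(int n){int s=0;for(int i=1;i<=n;i++){s+=i;}return s;}

int main(void) {
    printf("%d %d\n", sum(10), sum2(10)); /* prints "55 55" */
    return 0;
}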
Title: Re: Too many programming languages?
Post by: obiwanjacobi on November 26, 2019, 05:16:45 pm
If you require indenting you can drop the {}'s. If you already think a ; is too much work, this should appeal to you!  ;D

The language I am designing (hobby project) also requires indenting in an effort to minimize the noise (characters)...
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 05:17:54 pm
Require? One of the very few languages that *requires* indentation is Python. Talk about completely fucked. Of course there are so many Python lovers that they probably don't see any problem with this.
Yes, Python is what I had in mind when I was writing that. I don't think making whitespace significant is a good idea.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 05:21:33 pm
If you require indenting you can drop the {}'s.

Yes, indentation lets you replace two characters with... tens, hundreds or even thousands. Good idea!
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 26, 2019, 05:24:51 pm
If you require indenting you can drop the {}'s.

Yes, indentation lets you replace two characters with... tens, hundreds or even thousands. Good idea!

You'd also better have any "diff" tool you're going to use well set up, because it's very common for those to ignore whitespace by default. ;D
Oh, but I guess the big boys don't use diff tools?
Title: Re: Too many programming languages?
Post by: Mechatrommer on November 26, 2019, 06:41:55 pm
a modern ide or text editor can do automated indentation, so we're basically not typing it, or very little of it. but we don't have an automated ";" and have to type it on every line of code. programmers have willingly and happily wasted time and effort on this due to some ancient nomenclature.

If you require indenting you can drop the {}'s. If you already think a ; is too much work, this should appeal to you!  ;D
Basic has done that, but people still love wasting time. granted, Basic has crippled features such as no pointers and no true OOP, so that's why it didn't gain popularity.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 06:43:23 pm
ancient nomenclature => bad ?
Title: Re: Too many programming languages?
Post by: tggzzz on November 26, 2019, 06:45:15 pm
Even more stupid yet is to require indentation and indent with a character that's invisible...  >:D

There's worse: having a significant difference between two invisible characters!

I remember swearing regularly when hand creating makefiles. It mattered whether the indentation was spaces or tabs.

Having used ASR33s and stored source code on paper tape, I can imagine how that came about.
Title: Re: Too many programming languages?
Post by: Mechatrommer on November 26, 2019, 06:46:50 pm
ancient nomenclature => bad ?
just obsolete, some sort of "teletype command machine"? i can't remember, but it hasn't existed for decades...
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 06:54:52 pm
ancient nomenclature => bad ?
just obsolete, some sort of "tele text command type machine?" i cant remember, but its not in existence today since decades...
And how old is the language you're writing in now?
Title: Re: Too many programming languages?
Post by: Mechatrommer on November 26, 2019, 07:01:40 pm
we need the updated version: more powerful, not less, with at least similar efficiency and features. i'm yet to find one. something as simple as an upgrade that treats ";" the same as the line feed char (0x0A), with carriage return (0x0D) ignored. hence ";" is not required if 0A is in place. how hard can it be? we don't need new nor too many languages, just the most efficient one.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 07:06:17 pm
But those two are invisible. Why are they better than ; ?
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 26, 2019, 07:38:25 pm
There are also cases where splitting a statement over several lines can be useful. How do you do this without an end-of-statement token (such as ';')?
An example would be a calculation that doesn't fit on one line (or would be harder to read on one line only). If you don't have any end-of-statement token, you'll have to split the statement itself into several statements, thus actually requiring more keystrokes! (Because I guess the point against ';' is the extra keystroke.)

Oh well. I consider the number of keystrokes to type an extremely MINOR factor when writing code. This is not where the crux of the work lies (and if it is for you, you probably need to seriously question your abilities or methods....)

But as I often say, purely stylistic issues can't be debated, so this is pointless. For those not liking the C syntax, use something else and be done with it.

Title: Re: Too many programming languages?
Post by: brucehoult on November 26, 2019, 08:02:55 pm
There are also cases where splitting a statement over several lines can be useful. How do you do this without an end-of-statement token (such as ';')?

A tiny bit of imagination, or five seconds with google would answer this :-)

Various languages allow you to put a mark on a line indicating that it is a continuation of the previous line. FORTRAN's "C" in column 1 for example.

Various languages allow you to put a mark at the end of a line indicating that the next line is a continuation of this one. "\" in C and .. oh yes .. Python, for example. Notionally "escaping" the following newline character.

Various languages that don't require statement separators will try to continue parsing of a statement onto the next line if finishing at the end of the current line would be a syntax error, for example because the line ends with an operator such as + or *,  or because some bracketing structure hasn't been closed. Python, for example. Or C's ancestor BCPL.


I *strongly* dislike Python's use of whitespace as significant to the semantics, but it's silly to pretend there's no possibility to break a statement up over multiple lines if you don't use a semicolon to terminate them.
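To make the C cases concrete, a minimal sketch: the backslash splice is really only *needed* in preprocessor macros, which are line-oriented; ordinary statements continue freely because ';' terminates them:
Code: [Select]
#include <stdio.h>

/* Backslash-newline: required here, because a macro
   definition otherwise ends at the end of the line. */
#define SQUARE(x) \
    ((x) * (x))

int main(void) {
    /* No continuation marker needed: the statement simply
       isn't finished until the ';' is reached. */
    int y = 1 +
            2 +
            3;
    printf("%d %d\n", SQUARE(4), y); /* prints "16 6" */
    return 0;
}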
Title: Re: Too many programming languages?
Post by: Berni on November 26, 2019, 08:04:38 pm
While i do find the way indentation works as part of the syntax in Python a bit questionable, it still works just fine as long as you tell your code editor to use only tabs or only spaces. At least it forces sloppy programmers to properly indent their stuff, i suppose.

Python still has a ; for the case where you DO want to terminate a line early, but doesn't require one where its obvious you finished your line.
Code: [Select]
# Here is some code
y = 3
x = 5
print(x+y)

# This is functionally the same
y = 3; x = 5; print(x+y)

# This is also the same
y\
=\
3
x = 5; print(x+y)

# This is a more sensible use of it on a not so sensible function
x = really_long_named_function_with_plenty_of_arguments( \
      1,2,3,4 \
      ,really_long_varriable_name - 10 \
      ,really_long_varriable_name + 110 )
print(x)

Same thing, different syntax. If you compare how many lines of C code are a single statement with a ; on the end versus how many are split over multiple lines, i'd say it makes more sense to have a character to continue on another line than a character to terminate a line. Just like it makes more sense for the clutch in your car to be engaged when the pedal is released, since that's the state it spends most of its time in.

It's also annoying when a compiler gets confused about a missing ; and spits out a nonsense error. Tho C compilers tend to be fairly robust and error out on the next line, or even go straight out and tell you "Hey dumbass, did you forget a semicolon there?". But i've seen Verilog or VHDL compilers get so confused that they start throwing nonsense errors in code 300 lines before or after the missing semicolon, resulting in a good bit of cursing and shouting at the compiler until you figure out what actually made it so upset about compiling this file.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 26, 2019, 08:15:49 pm
OTOH you've got to put those ugly backslashes there where C would not have needed them
    And this is a very old and boring discussion
       like tabs vs spaces.

You can continue; I quit;  :)
Title: Re: Too many programming languages?
Post by: Berni on November 26, 2019, 08:25:11 pm
I'm not trying to claim one or the other is superior.

It's just two different ways of doing things. Both work, both have their pros and cons. As long as the syntax rules don't force you into writing code in some ugly unreadable way, it's fine.

My personal preference is for not having ; be required on ends of lines, but i don't mind them that much. If the language wants them, i'll put them there. There are plenty of other things in languages that annoy me more than this minor thing.
Title: Re: Too many programming languages?
Post by: tggzzz on November 26, 2019, 08:41:18 pm
It strikes me some of the contributors to this thread need to understand the concept of "syntactic sugar", and why those experienced in more than one language regard it as uninteresting.

For myself, I only care about whether syntax makes it easier or more difficult to make mistakes. The semantics of a language are far more interesting and important.
Title: Re: Too many programming languages?
Post by: brucehoult on November 26, 2019, 09:23:45 pm
It strikes me some of the contributors to this thread need to understand the concept of "syntactic sugar", and why those experienced in more than one language regard it as uninteresting.

I agree. I can program in C, Python, Lisp, Forth/Postscript, various assembly languages, shell, perl .. I don't even know what else.

Quote
For myself, I only care about whether syntax makes it easier or more difficult to make mistakes. The semantics of a language are far more interesting and important.

It's always easy to make mistakes.

I care about how hard it is to find 'em.

Python makes it far too easy to make editing mistakes with the delete key, or to accidentally mix spaces and tabs in a way that *looks* the same but gives you a program that has no syntactic errors and silently does the wrong thing. Copying & pasting code via emails or web forums can also be problematic. With C code you can run a "pretty print" program (or command in your editor) to fix the formatting. That concept doesn't and can't apply to Python.
Title: Re: Too many programming languages?
Post by: Cerebus on November 26, 2019, 10:03:29 pm
There are also cases where splitting a statement over several lines can be useful. How do you do this without an end-of-statement token (such as ';')?

Well obviously you punch column 6 to mark it as a continuation card.  >:D
Title: Re: Too many programming languages?
Post by: Cerebus on November 26, 2019, 10:05:24 pm
Various languages allow you to put a mark on a line indicating that it is a continuation of the previous line. FORTRAN's "C" in column 1 for example.

Nope, that's a comment card.
Title: Re: Too many programming languages?
Post by: Nominal Animal on November 26, 2019, 10:29:01 pm
Python make it far too easy to make editing mistakes with the delete key or accidentally mixing spaces and tabs in a way that *looks* the same give you a program that has no syntactic errors but silently does the wrong thing.
I enforce a no-tab-indentation in source code rule.

In Gedit/Pluma, highlighting all \t (with backslash escape sequences enabled in the search) is very useful in highlighting suspicious tabs. 'expand -t 8 source.py > fixed-source.py' is also quite useful.

Just for fun, I ran
Code: [Select]
reset ; find /usr -name '*.py' -print0 | xargs -r0n 1 awk '/^  *\t[\t ]*[^\t #]/||/^\t\t* [\t ]*[^\t #]/ { mixed++; next } /^\t\t*[^\t #]/ { tabs++; next } /^  *[^\t #]/ { spaces++; } END { if (mixed>0 || (spaces>0 && tabs>0)) printf "%4d %4d %4d %s\n", mixed, tabs, spaces, FILENAME }'
to see which Python files under /usr (installed modules etc.) mix spaces and tabs in indentation, ignoring empty and comment lines.  (The output lists the number of lines with mixed space/tab indentation, the number of lines indented with tabs, and the number of lines indented with spaces, ignoring comment and whitespace lines, followed by the file name, skipping files with just space OR tab indentation.)
A lot of offenders...

I consider mixing spaces and tabs in Python indentation one of the many footguns available in various programming languages.
Title: Re: Too many programming languages?
Post by: tycz on November 26, 2019, 10:38:56 pm
There are also cases where splitting a statement over several lines can be useful. How do you do this without an end-of-statement token (such as ';')?

PureBasic does multi-line statements in an interesting way, without either end-of-statement or line-continuation tokens. It simply assumes you want to continue the statement on the next line if you end the line with an operator or a function parameter.

For example, these three statements are all equivalent.
Code: [Select]
If CheckValves(Arg1, Arg2, Arg3) And DetectLight(Arg1, Arg2, Arg3) And AnotherFunction(Arg1, Arg2, Arg3)
  ;code here
EndIf

If CheckValves(Arg1, Arg2, Arg3) And
  DetectLight(Arg1, Arg2, Arg3) And
  AnotherFunction(Arg1, Arg2, Arg3)
  ;code here
EndIf

If CheckValves(Arg1,
               Arg2,
               Arg3) And
  DetectLight(Arg1,
              Arg2,
              Arg3) And
  AnotherFunction(Arg1,
                  Arg2,
                  Arg3)
  ;code here
EndIf

Title: Re: Too many programming languages?
Post by: brucehoult on November 27, 2019, 12:04:04 am
Various languages allow you to put a mark on a line indicating that it is a continuation of the previous line. FORTRAN's "C" in column 1 for example.

Nope, that's a comment card.

Oh yes, quite correct. The continuation mark was anything in column 6, between the label and the code. It's (thankfully) been almost 40 years since I did fortran.
Title: Re: Too many programming languages?
Post by: Cerebus on November 27, 2019, 12:24:56 am
Oh yes, quite correct. The continuation mark was anything in column 6, between the label and the code. It's (thankfully) been almost 40 years since I did fortran.

More like 30 in my case, and even by then you could discard all the 'this language was written assuming it was going on punch cards' stuff. Oh, and that FORTRAN I was writing back in the late 80s? AI! AI in FORTRAN!*

*For those who don't get why this is 'a thing' and gains me serious street cred points: Real Programmers Don't Eat Quiche (https://www.kimballlarsen.com/2007/10/26/real-programmers-dont-eat-quiche/)
Title: Re: Too many programming languages?
Post by: brucehoult on November 27, 2019, 12:45:47 am
Oh yes, quite correct. The continuation mark was anything in column 6, between the label and the code. It's (thankfully) been almost 40 years since I did fortran.

More like 30 in my case, and even by then you could discard all the 'this language was written assuming it was going on punch cards' stuff. Oh, and that FORTRAN I was writing back in the late 80s? AI! AI in FORTRAN!*

The FORTRAN programming I did was on actual punched cards. Pre-scored ones such that you pushed out the chads with a paper clip or ballpoint pen or similar. The pre-scored stuff was only in every 2nd column of a standard card so you only got 40 columns. A utility was run between the card reader and the compiler to remove every second character from each line. But you could do that at the comfort of your own desk rather than booking time on one of the rare card punch machines. And 40 columns was usually more than sufficient anyway.
Title: Re: Too many programming languages?
Post by: Cerebus on November 27, 2019, 01:07:58 am
Back in my punch card days I had the luxury of proper punch operators to turn my scribbles into crisp 80 column cards. I might get to punch a few corrections myself but I used to work at times when there wasn't a queue for the punch machines so it wasn't too bad. Worst case I might have to use one of the manual desktop 'chord' punches. Slow overall, but it made you careful and you never ran out of bookmarks, or something to write your shopping list on.
Title: Re: Too many programming languages?
Post by: westfw on November 27, 2019, 02:18:30 am
It's interesting the way that improvement in editors has affected programming languages.  Fortran's column-oriented approach worked swell on the card punches of the day, which were easily configured with "tab stops" at appropriate columns.  DEC's early fortran compilers supported use of the TAB character in source code, so you didn't have to type all those spaces, and could easily enter code on those interactive terminals like ASR33s that didn't have the keypunch tab stops.  (Probably you can blame DEC for the use of actual tab characters in source code.  And the 8-column default.   Saved space, you know.)
We eventually gave up on the "tabs vs spaces" arguments about source code in favor of "tab stops are every 8 spaces, and indent level is 4 spaces, but feel free to use whatever combination of tabs and spaces or just spaces or just tabs that you like, and configure your editor appropriately!"

In theory, python's use of whitespace to indicate syntax "bothers" me, but with any modern syntax-aware editor, it's pretty much a non-issue.  And Python seems to be very good at detecting indentation mistakes (much better than C's reaction to a missing brace, for instance.)

Title: Re: Too many programming languages?
Post by: Berni on November 27, 2019, 06:22:51 am
Yes, Python will generally throw errors at you if your indentation does not make sense. Also, if you are using any proper code editing software (I like Sublime Text myself), it won't even allow you to put a tab into the file (on default settings), as it will always save as spaces, yet it still looks like a tab in the editor.

Even worse bugs can be caused in C by putting too many ; in your code.

For example:
Code: [Select]
i = 10;
while(i);
{
   printf("%d",i);
    i--;
}
What do you think this code will do?
Title: Re: Too many programming languages?
Post by: Mechatrommer on November 27, 2019, 06:57:33 am
For myself, I only care about whether syntax makes it easier or more difficult to make mistakes. The semantics of a language are far more interesting and important.
It's always easy to make mistakes.
afaik, there is no cure for semantics errors. for example, when you derive a formula by mistake: instead of the correct formula y = 2x + 30, you derive y = 3x + 30. you punch that into your IDE and, eeek, wrong output. that is just a simple random example; in reality there are more elusive semantics errors. it's not a computer language thing, it's more of a human error thing ("procedural" errors are another matter). no amount of programming language and syntax can fix this.

although C/C++ does have syntax/operators that are prone to producing semantics errors, such as when we just want to compare a variable to a value instead of assigning to it: the correct syntax is if (x == 1), but if we type if (x = 1) in C, something will screw up. programmers just need to be cautious in this matter. a language like Basic is more bulletproof here: it just knows when we want to compare or assign. but we can't blame C for this, because it was designed to be compact, to provide shortcuts (less typing, except the ; thing) and multiple purposes in one line of code, and there will always be workarounds, or rules and coding styles imposed to avoid errors. the drawback is we need to be really careful with precedence and the meaning of it if we decide to use the compactness of C, but we are still free to use the bloated one-line-for-each-purpose style. if a man can't cope with this, or with things like buffer overflow handling, he should not touch C/C++ with a barge pole.

otoh regarding the punch card... i have no objection to ";" if we are planning to go back to the stone age.
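a minimal sketch of that footgun (modern gcc/clang will warn about it with -Wall and suggest extra parentheses if an assignment is really intended):
Code: [Select]
#include <stdio.h>

int main(void) {
    int x = 0;

    /* Intended: compare x with 1. Actual: assign 1 to x, then
       test the assigned value (non-zero), so the branch is
       always taken, and x has been silently clobbered. */
    if (x = 1)
        printf("always taken, x is now %d\n", x);

    /* The comparison that was meant: */
    if (x == 1)
        printf("x equals 1\n");

    /* A traditional mitigation: constant on the left, so the
       typo "1 = x" becomes a compile error instead of a bug. */
    if (1 == x)
        printf("x still equals 1\n");

    return 0;
}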
Title: Re: Too many programming languages?
Post by: bpiphany on November 27, 2019, 08:52:22 pm
Even worse bugs can be caused in C by putting too many ; in your code.

For example:
Code: [Select]
i = 10;
while(i);
{
   printf("%d",i);
    i--;
}
What do you think this code will do?

Oh my god, how that one has bitten me =P That works in Java too, right? I'm not sure. I mostly do python nowadays.

And btw, tabs are an abomination. They should be avoided at all cost. And don't even get started on putting curly braces on their own row =D
Title: Re: Too many programming languages?
Post by: legacy on November 27, 2019, 11:07:44 pm
The language I am designing (hobby project)

can you tell me more?
Title: Re: Too many programming languages?
Post by: legacy on November 27, 2019, 11:10:40 pm
Python seems to be very good at detecting indentation mistakes (much better than C's reaction to a missing brace, for instance.)

Sci-tools Understand and Stood are able to detect a missing brace better than gcc and llvm do.
Title: Re: Too many programming languages?
Post by: Nusa on November 27, 2019, 11:26:40 pm
Don't forget that whitespace in C/C++ is a lot more than just tabs and spaces:

Code: [Select]
int test()
{
    return
    (

        2

        +

        2

        +

        3

    )

    *

    10
    ;
}

Perfectly legal code, even if it looks silly. Of course, with longer expressions instead of simple numbers, it might be done on purpose for clarity/self-documentation purposes.
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 12:21:09 am
In theory, python's use of whitespace to indicate syntax "bothers" me, ...

The reason for that is that somewhere, even if only subconsciously, you know that there's a horrible bodge going on, and instead of a well-defined formal grammar for the language there's some god awful hack going on somewhere.

Out of curiosity I 'git cloned' the cpython repository today and went off to find that hack. In the grammar for the language there are two terminals 'INDENT' and 'DEDENT'. Then buried in the tokenizing code is a 700 line monster procedure 'tok_get' that deals with the business of processing the current indent level and that's only part of the whole tokenizer, which is 1850 lines. By contrast, the tokenizer for most well specified languages is a few hundred lines in total. Pushing that context sensitivity (to indents) down into the tokenizer means that the grammar does not tell the whole story, and consequently the actual grammar of the language: (1) is not formally specified, (2) probably hides some nasty surprises.
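For illustration only, this is not the CPython code, just a minimal sketch of the indent-stack mechanism behind those INDENT/DEDENT terminals (one token per level entered, one per level left):
Code: [Select]
#include <stdio.h>

enum { MAX_DEPTH = 64 };

static int stack[MAX_DEPTH] = {0}; /* stack[0] = column 0 */
static int top = 0;

/* Called with the leading-space count of each logical line. */
static void handle_line(int indent) {
    if (indent > stack[top]) {      /* deeper: push, emit INDENT */
        stack[++top] = indent;
        printf("INDENT(%d) ", indent);
    }
    while (indent < stack[top]) {   /* shallower: pop, emit DEDENTs */
        top--;
        printf("DEDENT ");
    }
    if (indent != stack[top])       /* no matching level on the stack */
        printf("(inconsistent indent!) ");
    printf("LINE\n");
}

int main(void) {
    int sample[] = {0, 4, 8, 4, 0}; /* indents of five lines */
    for (int i = 0; i < 5; i++)
        handle_line(sample[i]);
    return 0;
}
The real tokenizer also has to deal with tabs vs spaces, blank lines, comments, bracketed continuations and so on, which is where the hundreds of extra lines go.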
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 28, 2019, 12:30:11 am
It's interesting to know a bit about the history of Python to understand where it comes from, how and why it was designed.
https://en.wikipedia.org/wiki/Python_(programming_language)

As explained, its roots lie in the ABC language: https://en.wikipedia.org/wiki/ABC_(programming_language)

I doubt the author anticipated the traction that it would get.
Title: Re: Too many programming languages?
Post by: emece67 on November 28, 2019, 12:45:50 am
.
Title: Re: Too many programming languages?
Post by: brucehoult on November 28, 2019, 01:19:42 am
Code: [Select]
i = 10;
while(i);
{
   printf("%d",i);
    i--;
}
What do you think this code will do?

Another reason to put opening braces not on a new line. Nobody will write:
Code: [Select]
i = 10;
while(i) {;
   printf("%d",i);
    i--;
}

But if they do, it will be fine.

The more important thing is no one will write:

Code: [Select]
i = 10;
while(i); {
   printf("%d",i);
    i--;
}

I fully support C# (I think?) forcing you to type the {} even if there is only one statement.

Where I depart from them is making it compulsory to write "break" at the end of every option in a switch. If fall-through is banned then just make it happen automatically and not need the break. What they've done goes I think too far in keeping nominal compatibility with C and Java syntax when it's not actually compatible at all.
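For reference, a sketch of the C behaviour that motivates the rule: the classic missing-break footgun:
Code: [Select]
#include <stdio.h>

static void describe(int n) {
    switch (n) {
    case 0:
        printf("zero ");
        /* missing break: control falls through into case 1 */
    case 1:
        printf("one ");
        break;
    default:
        printf("other ");
        break;
    }
    printf("\n");
}

int main(void) {
    describe(0); /* prints "zero one", probably not intended */
    describe(1); /* prints "one" */
    describe(7); /* prints "other" */
    return 0;
}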
Title: Re: Too many programming languages?
Post by: westfw on November 28, 2019, 01:52:44 am
Quote
detect missing brace better than gcc
gcc seems to be particularly horrible.  IIRC, one of the "teaching advantages" of University implementations (PL/C, WatFor, etc) was supposed to be MUCH BETTER error messaging than the industry equivalents.  We could do with another round of that sort of thinking.  Maybe Python "got it."

Quote
instead of a well-defined formal grammar for the language there's some god awful hack going on somewhere [in Python]
Yes, that could be it.  Well-defined Grammars were a big thing when I went to school; I think the world was essentially working on its first generation of languages that could be described that way (Pascal!)
OTOH, perhaps the strict separation of lexical analysis and parsing is partially responsible for the apparent inability to unwind things and produce reasonable error messages...
Title: Re: Too many programming languages?
Post by: brucehoult on November 28, 2019, 02:41:14 am
Quote
detect missing brace better than gcc
gcc seems to be particularly horrible.  IIRC, one of the "teaching advantages" of University implementations (PL/C, WatFor, etc) was supposed to be MUCH BETTER error messaging than the industry equivalents.  We could do with another round of that sort of thinking.  Maybe Python "got it."

That, plus fast compiling to code that was not close to the best possible, but much faster than an interpreted language and fine for something that was typically only run once or twice.

I thought I'd try knocking my HiFive Unleashed (64 bit RISC-V) back from its usual 1.5 GHz to 1 MHz and compiling gcc -O0 hello.c -o hello. Sadly, it turns out any setting slower than 37.75 MHz gives 37.75. Grr. I'd have thought it could run slower than that. I'm sad because the otherwise very similar (but 32 bit, and 180nm instead of 28nm) FE310  very happily runs at 16 MHz by default in the Arduino environment, and I bet it would go slower.

However, instead of the normal 0.275 real 0.2 user, at 37.75 MHz I get 6.1 seconds real and 4.6 seconds user time. So a 39.74x slower clock gives 22.2x slower execution. I guess because cache misses and disk access get a whole heck of a lot cheaper, relatively.

I don't remember exactly, but I think five or six seconds was about the time for compiling a small Pascal program on the VAX 11/780 or PDP 11/70. The PDP 11/70 ran the microcode engine at 6.7 MHz and the fastest instruction, register to register move, took 2 clock cycles. So effectively 3.3 MHz in modern RISC terms.

So, to a first approximation, modern gcc/as/ld is 10x less efficient than compilers on the PDP 11, even at -O0.

gcc -S (just producing assembly language), takes 2.35 real, 1.85 user at that same 37.75 MHz clock on the HiFive Unleashed, and gcc -c (running the compiler and assembler but not linker) takes 2.85 real, 2.0 user. So it's the gnu linker taking most of the time.
Title: Re: Too many programming languages?
Post by: chickenHeadKnob on November 28, 2019, 03:06:11 am
  IIRC, one of the "teaching advantages" of University implementations (PL/C, WatFor, etc) was supposed to be MUCH BETTER error messaging than the industry equivalents.  We could do with another round of that sort of thinking.  Maybe Python "got it."

Looks at Westfw flag, blinks.

Dude! you used Waterloo Fortran during your learn-denings? RESPECT

I thought that was only a Canadian thing. On punched cards I bet; I was pretty much in the last cohort of punched-card students. Maybe the year after me as well. We must be of similar ripeness antiquity.
Title: Re: Too many programming languages?
Post by: brucehoult on November 28, 2019, 03:41:55 am
  IIRC, one of the "teaching advantages" of University implementations (PL/C, WatFor, etc) was supposed to be MUCH BETTER error messaging than the industry equivalents.  We could do with another round of that sort of thinking.  Maybe Python "got it."

Looks at Westfw flag, blinks.

Dude! you used Waterloo Fortran during your learn-denings? RESPECT

I thought that was only a Canadian thing. On punched cards I bet, I was pretty much in the last cohort punched card students. Maybe the year after me as well. We must be of similar ripeness antiquity.

I used both WATFOR and WATBOL as a student in around 1981 to 1982. In New Zealand.

That was using VT52 terminals. I only used punched cards for Burroughs FORTRAN IV at high school, and for a statistics course at university using some home-grown array-processing stats language developed by Bill Rogers (who seems to still be there 38 years later... https://www.cms.waikato.ac.nz/people/coms0108 (https://www.cms.waikato.ac.nz/people/coms0108))
Title: Re: Too many programming languages?
Post by: westfw on November 28, 2019, 05:46:29 am
Quote
you used Waterloo Fortran during your learn-denings? RESPECT
I thought that was only a Canadian thing. On punched cards I bet
University of Pennsylvania, EE81.  So my first Fortran class was '77? IIRC, we had one or two assignments to be done on punched cards, and then they started letting us use the CRT terminals (with a whopping 2400bps local connection!  Still, there were a lot more CRTs than card punches at the time.)
They had a Univac 90/70, which was an "IBM 360 Compatible" of some sort.  I was blissfully unaware of purchasing politics, but I imagine that that means they didn't have the IBM compilers (and, I guess, they had cheaper CRT support, and JCL was easier.)  I don't know if there were any native Univac tools.  So they had PL/C from Cornell, WatFor from Waterloo, a homegrown APL, a PDP/11 emulator from somewhere, and so on...
The DEC-10 I worked on was in the business school, and had a pretty full suite of DEC tools...
Post-GNU folks don't seem to realize how much of a hotbed of "open source" the University environment used to be.  Vendors would license source code to the universities, and they'd improve stuff and send it back.  OS, shells, networking code, languages, editors - all passed around the keg at a frat party...  (Businesses too.  It's not like there were enough SW engineers to work at the actual vendors!)
Title: Re: Too many programming languages?
Post by: obiwanjacobi on November 28, 2019, 06:18:14 am
The language I am designing (hobby project)

can you tell me more?

WIP:
https://obiwanjacobi.github.io/Zlang/
Title: Re: Too many programming languages?
Post by: obiwanjacobi on November 28, 2019, 06:32:40 am
In theory, python's use of whitespace to indicate syntax "bothers" me, ...

The reason for that is that somewhere, even if only subconsciously, you know that there's a horrible bodge going on, and instead of a well-defined formal grammar for the language there's some god awful hack going on somewhere.

Out of curiosity I 'git cloned' the cpython repository today and went off to find that hack. In the grammar for the language there are two terminals 'INDENT' and 'DEDENT'. Then buried in the tokenizing code is a 700 line monster procedure 'tok_get' that deals with the business of processing the current indent level and that's only part of the whole tokenizer, which is 1850 lines. By contrast, the tokenizer for most well specified languages is a few hundred lines in total. Pushing that context sensitivity (to indents) down into the tokenizer means that the grammar does not tell the whole story, and consequently the actual grammar of the language: (1) is not formally specified, (2) probably hides some nasty surprises.

It is not about the number of characters needed to make an indent (mentioned earlier) or how complex the code is in the lexer. It is about how readable and easy it is for the developer/user. More advanced features will take more code to implement.

I don't see why the grammar would not tell the whole story when the tokenizer creates the INDENT and DEDENT tokens? These tokens are still used by the grammar...
But then, I am just starting in this whole compiler stuff...
I have seen an INDENT/DEDENT impl within <guess> 50 lines of code, so perhaps your example was not the best?
Title: Re: Too many programming languages?
Post by: Berni on November 28, 2019, 08:14:37 am
But if they do, it will be fine.

The more important thing is no one will write:

Code: [Select]
i = 10;
while(i); {
   printf("%d",i);
    i--;
}

I fully support C# (I think?) forcing you to type the {} even if there is only one statement.

Where I depart from them is making it compulsory to write "break" at the end of every option in a switch. If fall-through is banned then just make it happen automatically and not need the break. What they've done goes I think too far in keeping nominal compatibility with C and Java syntax when it's not actually compatible at all.

Yeah, it's common to see same-line curly brackets for control statements, but it depends on what kind of coding style they are sticking to, as there are so many of them out there. Some coding style guidelines impose things that make the code more readable and less error prone; some make it ugly by forcing weird naming conventions and unusual ways of formatting.

And yes, things like the break; in the switch statement annoy me more than having to put semicolons everywhere. But overall C is reasonably good in terms of syntax. Python's use of indentation is just a different way of avoiding mistakes about which code belongs to which statement; it also throws errors if the indentation does not make sense, and it works just fine once you know not to mix tabs and spaces, but it has plenty of other annoying syntax quirks.

Where syntax really bothers me is HDL languages like Verilog or VHDL.  They both use "begin" and "end" keywords instead of curly braces, require ; but get very confused if one is missing, and get even more confused if there is one where it should not be. VHDL tries to be very universal and as a result is really wordy: you constantly have to explain to it what kind of number something is and even tell it what it should do with a + sign. I prefer Verilog because it's less wordy, but then it still has the begin end if then else end... while making you write down numbers in the strangest syntax ever, like 16'hBAAD or 8'b01101001, and using <= as an assignment operator that gets used more than the usual = assignment operator, while the operator for less-than-or-equal in an if statement is also <=, yet equality comparison is still ==, while introducing even more variations on the equality operator, such as the === case equality operator.

But with languages that run on CPUs i have plenty of choice to go to a different language if i don't like it. Don't like Python? Use Java. Still no? Maybe C? Or C++? Or C#? Maybe Go? Can still use Pascal or Delphi too... the list goes on and on. But as far as HDL languages go, you get VHDL or Verilog, and that's it.
Title: Re: Too many programming languages?
Post by: legacy on November 28, 2019, 11:57:08 am
Programming on a VT525 vt-terminal with VIM is different from programming with Geany on an NSC 400 X-terminal  :-//
Title: Re: Too many programming languages?
Post by: legacy on November 28, 2019, 12:13:24 pm
Code: [Select]
i = 10;
while(i); {
   printf("%d",i);
    i--;
}

SafeC checkers have the rule that ")" must be followed by a block {}.
Hence the above line ");" is detected as a mistake.
Title: Re: Too many programming languages?
Post by: brucehoult on November 28, 2019, 12:36:13 pm
But with languages that run on CPUs i have plenty of choice to go to a different language if i don't like it. Don't like Python? Use Java, still no? Maybe C? Or C++? or C#? Maybe Go? Can still use Pascal or Delphi too... etc list goes on and on. But as far as HDL languages go you get VHDL or Verilog, that's it.

We use Chisel for pretty much everything. It is essentially a library with some classes and stuff in the Scala language, and eventually it outputs Verilog. Actually, it outputs FIRRTL which is basically an explicit low level netlist, and then FIRRTL goes through a bunch of optimization passes (much as you get in the middle part of gcc or llvm, but for netlists), and then the FIRRTL is eventually converted to Verilog (or potentially other things).

Looks like this:

https://github.com/chipsalliance/rocket-chip/blob/master/src/main/scala/rocket/ALU.scala
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 01:23:25 pm
We use Chisel for pretty much everything.

Pah! Only using 1/3 of your toolkit? Don't forget there's "screwdriver" and "4lb club hammer".  :)
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 28, 2019, 04:03:09 pm
Code: [Select]
i = 10;
while(i); {
   printf("%d",i);
    i--;
}

SafeC checkers have the rule ")" must be followed by a block {}.
Hence the above line ");" is detected as mistake.

GCC (9.2.0 here) has no problem flagging this as suspect (at least with -Wall):

Quote
warning: this 'while' clause does not guard... [-Wmisleading-indentation]
   47 |  while(i); {
      |  ^~~~~
Case_Var.c:47:12: note: ...this statement, but the latter is misleadingly indented as if it were guarded by the 'while'
   47 |  while(i); {

Although GCC doesn't issue a warning for it, some other static analysis tools will flag the second problem here: as written, 'while (i);' is additionally an infinite loop, as the condition will always be true.
(For instance, CppCheck gives: "style: Condition 'i' is always true".)
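For the record, a sketch of what was presumably intended: without the stray ';' the block becomes the loop body and the loop terminates:
Code: [Select]
#include <stdio.h>

int main(void) {
    int i = 10;
    while (i) {      /* no stray ';': the block is the body */
        printf("%d", i);
        i--;
    }
    return 0;        /* prints 10987654321 and exits */
}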

Title: Re: Too many programming languages?
Post by: legacy on November 28, 2019, 04:11:43 pm
some other static analysis tools will

Which ones? Stood? Understand? Lint? yes, they do.
But, technically I see it as a defect of the standard C grammar.

"while | if () statement" is always a potential mistake
hence it should always be "while | if { block }"
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 28, 2019, 04:32:15 pm
"while | if () statement" is always a potential mistake
hence it should always be "while | if { block }"

No Sir, there's nothing wrong in
Code: [Select]
while (condition) ;
It has always been and still is widely used everywhere.

How or why is
Code: [Select]
while (condition) { ; }
any better?
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 28, 2019, 05:30:32 pm
How or why is
Code: [Select]
while (condition) { ; }
any better?

You don't need the ';' in there.

As to whether it's better, it can be a topic for infinite debate. I don't know about "better" per se, but it would make the language more consistent. As an example, the two constructs (individual statements and code blocks) were also both allowed, AFAIR, in Pascal's 'IF' construct. Wirth's later refined languages got rid of this out of consistency (and a simpler grammar!).

I personally don't have a problem with this in C, but I can understand the point about it not being completely consistent, or just that it could be simplified.
But I also don't buy into the associated potential issues due to mistyping. Any kind of mistyping can lead to catastrophic bugs. You added a ';' where it didn't belong? Yeah? You could also have typed 'i+2' where 'i+1' was meant.

Additionally, some basic coding styles considerably lower the possibility of the above happening. I make it a rule NEVER to put an 'if' ('while', ...) body on the same line. Thus:
Code: [Select]
if (i == 0) i++;
NOPE. This:
Code: [Select]
if (i == 0)
    i++;
is OK.

If anything, the above example with 'while (i); {' is WAY easier to spot (both by humans and automated tools) than the infinite possibilities of mistyping, so "fixing" this would be as easy as it would be pointless, all in all.

Now if you want a more consistent language with a simpler grammar, use something else. But choose wisely, because mistakes are easy to make in just any of them.
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 05:34:26 pm
some other static analysis tools will

Which ones? Stood? Understand? Lint? yes, they do.
But, technically I see it as a defect of the standard C grammar.

"while | if () statement" is always a potential mistake
hence it should always be "while | if { block }"

C by and large follows Algol-60 syntax, with a sprinkling of influence from FORTRAN, with the occasional nod to some Algol-68 features (for instance the C "-=" operator borrows from the Algol-68 "-:=" operator). We'd be a lot better off if they had followed (and hence popularised) the Algol-68 way of doing things, which would have been

Code: [Select]
WHILE <serial clause delivering BOOL> DO <serial clause> OD
IF <serial clause delivering BOOL> THEN <serial clause> [ ELSE <serial clause> ] FI

And the offending example would have looked like this:
Code: [Select]
i := 10;
WHILE i != 0
DO
    printf ("%d", i);
    i -:= 1
OD
and a semicolon after the conditional phrase would have been flagged as a syntax error. Notice that the semicolon is a statement separator in Algol-68, not a statement terminator as in C.

FWIW, the idiomatic Algol-68 would not use the separate variable and auto-decrement and would have looked like this:
Code: [Select]
FOR i FROM 10 DOWNTO 1
DO
    printf ("%d", i)
OD

Edited to add: I forgot to say that the legitimate 'no op' statement (which is what gets us into trouble with the C 'while (x) ;' above), which in C has to be awkwardly expressed as an empty statement, was an explicit 'SKIP' operator in Algol-68. Heck, even python has 'pass'.
Title: Re: Too many programming languages?
Post by: legacy on November 28, 2019, 05:39:39 pm
Code: [Select]
while (condition) { ; }
any better?

Yes, it's better. There is also a rule to avoid empty statements.
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 05:42:43 pm
Code: [Select]
while (condition) { ; }
any better?

Yes, it's better. There is also a rule to avoid empty statements.

It's better, but better still would be the rather more explicit:
Code: [Select]
while (condition) {  /* deliberate no op */; }
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 28, 2019, 05:50:05 pm
Code: [Select]
while (condition) { ; }
any better?
Yes, it's better. There is also a rule to avoid empty statements.
It's better, but better still would be the rather more explicit:
Code: [Select]
while (condition) {  /* deliberate no op */; }

It's better because... what?
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 06:41:48 pm
Code: [Select]
while (condition) { ; }
any better?
Yes, it's better. There is also a rule to avoid empty statements.
It's better, but better still would be the rather more explicit:
Code: [Select]
while (condition) {  /* deliberate no op */; }

It's better because... what?

 :palm: It's better because it draws attention to the fact that an obscure 'feature', spot-able only by a semicolon being in an odd place, has been used - so that the poor sod who comes after you has a chance to spot what you're doing, that you're deliberately doing it, and that it's not a mistake.
Title: Re: Too many programming languages?
Post by: Nusa on November 28, 2019, 06:56:08 pm
And just for more fun with loops:
Code: [Select]
  do;
  while (condition);

Title: Re: Too many programming languages?
Post by: tggzzz on November 28, 2019, 07:49:15 pm
How about this (rather trivial) makefile not working as expected because of the error in the second line
Code: [Select]
clean:
rm -f blah.o blah.c blah
What's the error?



...








A Makefile consists of a set of rules. A rule generally looks like this:
targets : prerequisites
   command
   command
   command

    The targets are file names, separated by spaces. Typically, there is only one per rule.
    The commands are a series of steps typically used to make the target(s). These need to start with a tab character, not spaces.
    The prerequisites are also file names, separated by spaces. These files need to exist before the commands for the target are run.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 28, 2019, 07:54:30 pm
:palm: It's better because it draws attention to the fact that an obscure 'feature', spot-able only by a semicolon being in an odd place, has been used - so that the poor sod who comes after you has a chance to spot what you're doing, that you're deliberately doing it, and that it's not a mistake.

There's nothing obscure in while (condition) ; IMO any poor sod that can't understand that shouldn't be programming. And what makes you believe that she would understand while (condition) { } but not while (condition) ; ? Is she going to believe every line she sees and doesn't understand is a mistake, or what?

What about

Code: [Select]
while (condition) this(), that();
while (condition) this() && that();

Look ma, no braces!  >:D

Linus says that lesser programmers can't understand the C he writes... I am not saying that obfuscated C is better, but I'm against the lowest common denominator just because there are dumb people out there. That while() we're talking about isn't rocket science!
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 08:38:34 pm
Look ma, no braces!  >:D

It's not about the braces, it's about drawing attention to the empty statement before the semicolon. Relying on an invisible empty statement is dangerous, drawing attention to your intent in using it seems only wise. If one wants to feel artificially smarter than other people, then by all means use 'clever' code, especially code that leverages the weak features of a language, but if one's intent is to produce maintainable code that other people can quickly and clearly understand then something else is called for.

Personally I'd rather be known for producing code that works and is easy for other people to maintain rather than for 'clever' code that falls to pieces as soon as someone less experienced, or less attentive, than me gets their hands on it.
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 08:43:13 pm
How about this (rather trivial) makefile not working as expected because of the error in the second line
Code: [Select]
clean:
rm -f blah.o blah.c blah
What's the error?

Obvious to anyone that has used make for more than, ooh 15 minutes? And that's even without scrolling down. I always have "Show Invisible Characters" turned on in my editor(s) when editing makefiles (or, for that matter, when working in python).
Title: Re: Too many programming languages?
Post by: hamster_nz on November 28, 2019, 09:04:00 pm
Quote
What about

Code: [Select]
while (condition) this(), that();
while (condition) this() && that();

Look ma, no braces!  >:D

All good and well, until this() returns -1 on error and 0 on success, breaking my "do this() and if that succeeds then do that()" internal dialogue on reading this code.

Looking at pretty much any stdlib function which returns a pure status code:

Code: [Select]
RETURN VALUE
       On success, zero is returned.  On error, -1 is returned, and errno is
       set appropriately.

And if you can't use the pattern against the stdlib then it isn't a good pattern.

I still think that the goal isn't to write the most concise code, but the most descriptive and least ambiguous code (to humans). That is why some people think I am silly for putting "if(somepointer == NULL) ..." rather than "if(!somepointer) ..." - those people aren't the people I am leaving the hints for.

"{ }" is much better than ";" in my book, as I try to not lay landmines for the future readers.
Title: Re: Too many programming languages?
Post by: tggzzz on November 28, 2019, 10:17:06 pm
How about this (rather trivial) makefile not working as expected because of the error in the second line
Code: [Select]
clean:
rm -f blah.o blah.c blah
What's the error?

Obvious to anyone that has used make for more than, ooh 15 minutes? And that's even without scrolling down. I always have "Show Invisible Characters" turned on in my editor(s) when editing makefiles (or, for that matter, when working in python).

Remind me what you were writing in your post before that response. Something like  "...Relying on an invisible empty statement is dangerous...", perhaps?
Title: Re: Too many programming languages?
Post by: Cerebus on November 28, 2019, 10:37:21 pm
How about this (rather trivial) makefile not working as expected because of the error in the second line
Code: [Select]
clean:
rm -f blah.o blah.c blah
What's the error?

Obvious to anyone that has used make for more than, ooh 15 minutes? And that's even without scrolling down. I always have "Show Invisible Characters" turned on in my editor(s) when editing makefiles (or, for that matter, when working in python).

Remind me what you were writing in your post before that response. Something like  "...Relying on an invisible empty statement is dangerous...", perhaps?

Well, I can't actually check your makefile for actual characters present as the SMF editor doesn't actually take tabs, at least the WYSIWYG editor won't take them from me; so I'm quite happy to stipulate that I'm making an assumption about invisible characters. And 'rm -f <any old rubbish>' ought to work (or silently fail if a named file is not there). 'clean', as presented, is the first dependent in the file, so should be the default target, and make doesn't need a dependency to make either the default target or a target named on the command line. The only thing I can think of that remains a bit suspicious is the rm of blah.c as well as blah and blah.o but not that suspicious as I've written a  lot of makefiles with generated code where that was appropriate. So: I have bupkas otherwise.
Title: Re: Too many programming languages?
Post by: Berni on November 29, 2019, 07:30:17 am
Yeah, makefiles, used pretty much universally to drive compilation of C code, are indeed very picky about whitespace types.

But using indentation for grouping in makefiles doesn't really make much sense, since you don't really nest things in one another unless you make overly complex makefiles with a bunch of control statements and all that. Usually they are just a list of things to do along with definitions of a bunch of paths, so you never get past indentation level 1.

But again, as long as you make sure you have your whitespace in order it all works fine; it's not the only invisible character that can mess things up. Newlines being LF or CR+LF (even CR alone was used in some systems) are also invisible in editors but can cause a lot of weird errors when they get mixed into source code. We still don't agree on what the "correct" way to do a newline is.
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 29, 2019, 10:13:54 am
Yeah makefiles used pretty much universally to drive compilation of C code are indeed very picky about whitespace types.

But using indentation for grouping in makefiles doesn't really make much sense since you don't really nest things in one another unless you make overly complex makefiles with a bunch of control statements and all that. Usually they are just a list of things to do along with defining a bunch of paths, so you never get past indentation level 1.

But again as long as you make sure you have your whitespace in order it all works fine, its not the only invisible character that can mess things up. Newlines being LF or CR+LF (Even CR alone was used in some systems) is also invisible in editors but can cause a lot of weird errors when they get mixed into source code. We still don't agree what the "correct" way to do a newline is

Never had a single problem writing Makefiles. The indentation thing is peculiar, but pretty simple. Admittedly not the wisest choice they made, but 'make' is not a programming language, and can't be compared to what we talked about above.

As to end-of-line characters, only very ill-written or old tools have a problem with that. And yes, that includes 'make' (at least up to recently).

If you're using C and an std lib that is reasonable (this has been the default behavior on at least both Windows and Linux for at least 20 years), opening the file in text mode ("t") and using fgets() to read text files will transform any kind of EOL combination into a single '\n' character. Nothing special to do. Of course if you're reading the file character by character, which was very common in the old days, then you'll have to deal with it by hand.
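For the paranoid, a minimal sketch (file name invented) that strips the terminator by hand, so the code behaves identically even for files that came from another platform:
Code: [Select]
#include <stdio.h>
#include <string.h>

int main(void) {
    char line[1024];
    FILE *f = fopen("input.txt", "r"); /* text mode is the default */
    if (!f)
        return 1;
    while (fgets(line, sizeof line, f)) {
        /* Trim any trailing '\n' and/or '\r', whatever the
           platform or the file's origin left in there. */
        size_t n = strlen(line);
        while (n > 0 && (line[n-1] == '\n' || line[n-1] == '\r'))
            line[--n] = '\0';
        printf("[%s]\n", line);
    }
    fclose(f);
    return 0;
}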

Title: Re: Too many programming languages?
Post by: Nominal Animal on November 29, 2019, 10:35:30 am
I still think that the goal isn't to write the most concise code, but the most descriptive and least ambiguous code (to humans). That is why some people think I am silly for putting "if(somepointer == NULL) ..." rather than "if(!somepointer) ..." - those people aren't the people I am leaving the hints for.
I fully agree, and don't think it silly, even though I do use if (!somepointer) { error_case(); } myself.
Thing is, I tell learners to read that pattern as "if no somepointer, then error case".
Our objective is the same, just different approaches.

Whenever I write a spinning while loop with an empty body, I have a comment block explaining its purpose preceding it.

What's the error?
Like I said, highlighting all tabs in Gedit/Pluma (by searching for \t with backslash escapes enabled in the search dialog) makes the tabs visible.

Because of the sheer number of existing Makefiles, we cannot really change the syntax anymore; we're stuck with it.  Much like the oddities in English itself, with lead, lead, and lead all pronounced differently.

However, what I do myself, and recommend others do as well, is make sure they only have one indentation level, with a single tab.  Then, running
  sed -e 's|^[\t ][\t ]*|\t|' -i Makefile
is safe, and will fix any indentation issues.

(Before anyone points it out, I do know that GNU sed supports \+ in the "one or more of" sense.  However, that's not strictly BRE syntax.  Only EREs support + in the "one or more of" sense.  A trap for new players; one that I prefer to avoid.)
Title: Re: Too many programming languages?
Post by: SiliconWizard on November 29, 2019, 10:46:45 am
Whenever I write a spinning while loop with an empty body, I have a comment block explaining its purpose preceding it.

This is wise. Additionally, my coding style includes adding a blank line AFTER the loop, so it's even more readable: makes it even clearer at first sight that the loop is not meant to include anything below it.
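For instance, a minimal sketch of that layout:
Code: [Select]
#include <stdio.h>

volatile int ready; /* set elsewhere, e.g. by an interrupt handler */

void wait_then_go(void) {
    /* Spin until 'ready' becomes non-zero; body deliberately empty. */
    while (!ready)
        ;

    printf("go\n"); /* the blank line above makes it obvious this
                       statement is outside the loop */
}

int main(void) {
    ready = 1; /* pretend the event already happened */
    wait_then_go();
    return 0;
}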
Title: Re: Too many programming languages?
Post by: Nominal Animal on November 29, 2019, 11:32:26 am
Definitely; I tend to sprinkle empty lines to separate groups of logical operations as well, with a comment before each group.

I mostly use Gedit/Pluma with a dark theme, and syntax highlighting, so I've found that "style" makes it easiest/fastest for me to revisit old code.  I'd like to claim its almost like two different parts of my brain working in tandem, one looking at the code, and another at the comment, but that's just how it feels.  Gets me into the flow state.

It's not like I invented the style, though.  I have a habit of examining projects' sources before using or recommending them, and have just stolen the approaches I like best for my own use.
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 29, 2019, 10:13:26 pm
I still think that the goal isn't to write the most concise code, but the most descriptive and least ambiguous code (to humans). That is why some people think I am silly for putting "if(somepointer != NULL) ..." rather than "if(!somepointer) ..." - those people aren't the people I am leaving the hints for.

"{ }" is much better than ";" in my book, as I try to not lay landmines for the future readers.

I would expect a competent programmer to understand the syntax of the language:
kbuf = getFreeBuffer();
if(kbuf != NULL)

kbuf = getFreeBuffer();
if(kbuf)

if(kbuf = getFreeBuffer())

if((kbuf = getFreeBuffer()) != NULL)

Whether a noob likes it or not is irrelevant: if the code is out there (and there is a lot of it out there) you need to be able to work on it. Dumbing down development to cater to the incompetent noob does not help to build more competency.

I don't need non-observant programmers maintaining code, and again, there is a lot of code out there that may not be written to a personal preference:
if(!getFreeBuffer()) requestNewBuffer(); versus if(!getFreeBuffer()) {requestNewBuffer();}

If you're working on poorly indented code, then use a code formatter.

Finally, I would prefer a programmer ask why this was done:
writeAddress = (void *) ((long) (ioAddress + offset) & (~(0x03)));
.....

Hide it and the incompetent programmer will someday use an iowrite8(), but again my preference is that the incompetent does not work on this type of code anyway.
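For the curious reader, unpacking that line is a good exercise: the mask rounds the computed address down to a 4-byte boundary. A self-contained sketch (the addresses are made up purely for illustration):

Code: [Select]
#include <stdint.h>
#include <stdio.h>

int main(void)
{
  /* Hypothetical I/O base and offset, just to show what the mask does. */
  uintptr_t ioAddress = 0x4000A002u;
  uintptr_t offset    = 0x7u;

  /* ~0x03 clears the two lowest bits, rounding the address DOWN to the
     previous 4-byte boundary, e.g. for a bus that only accepts
     word-aligned accesses: 0x4000A009 -> 0x4000A008. */
  void *writeAddress = (void *)((ioAddress + offset) & ~(uintptr_t)0x03);

  printf("%p\n", writeAddress);
  return 0;
}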


Title: Re: Too many programming languages?
Post by: Mechatrommer on November 29, 2019, 10:50:06 pm
Code: [Select]
while (condition) { ; } any better?
Yes, it's better. There is also a rule to avoid empty statements.
It's better, but better still would be the rather more explicit:
Code: [Select]
while (condition) {  /* deliberate no op */; }
It's better because... what?
:palm: It's better because it draws attention to the fact that an obscure 'feature'...
Code: [Select]
while (condition) {
// do nothing or whatever
};
this is better, why? because i say so... why extra ";"? because i can. no, actually because i want every line to be the same (uniform), since the rest are terminated with ";" this must be too. and how the hell i set the damn tab to be double space distance? in this SMF IDE? not 4 spaces apart? maybe i just replace a tab with "double space" instead..
[absolutionist's hat off] :palm:
Title: Re: Too many programming languages?
Post by: westfw on November 29, 2019, 11:53:44 pm
Quote
Whether a noob likes it or not is irrelevant
I have come to believe that writing code in a way that "noob" can understand it is VERY IMPORTANT.
I mean, I've updated programs for new OSes, where I didn't "know" the language it was written in (because all of the original programmers were gone, and I at least understood the OS.)   And I've worked for companies that pretty much had to abandon promising but weird high-performance VLIW/bitslice architectures at least partially because they couldn't find enough people who could write (or fix) code for them.

Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 29, 2019, 11:54:01 pm
Code: [Select]
while (condition) {
// do nothing or whatever
};
this is better, why? because i say so... why extra ";"? because i can. no, actually because i want every line to be the same (uniform), since the rest are terminated with ";" this must be too. and how the hell i set the damn tab to be double space distance? in this SMF IDE? not 4 spaces apart? maybe i just replace a tab with "double space" instead..
[absolutionist's hat off] :palm:

for(;myFunction(); );

although perhaps more correctly
for(;!myFunction(); );

but I could write
while(myFunction() == OK)

and of course if {} make it more readable there is
{}while(!myFunction() != OK)

Would a comment help?
{/* doing nothing until the cows come home */}while(!myFunction() != OK)

But just remember to change the comment when writing :
while(!myFunction()) {/* when the cows come home I will have something to do */};


{} makes code more readable.... Seems like a personal preference to me.
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 12:02:15 am
Quote
Whether a noob likes it or not is irrelevant
I have come to believe that writing code in a way that "noob" can understand it is VERY IMPORTANT.
I mean, I've updated programs for new OSes, where I didn't "know" the language it was written in (because all of the original programmers were gone, and I at least understood the OS.)   And I've worked for companies that pretty much had to abandon promising but weird high-performance VLIW/bitslice architectures at least partially because they couldn't find enough people who could write (or fix) code for them.

My point however is that you don't always get to work on pristine code, so you need to learn the language. I've worked on code bases with several million lines, spanning decades, with very large development teams. You might see or do some refactoring here or there, but with the inherent risks associated with refactoring, and time pressure, mostly it is: resolve the defect and move on to the next.

Title: Re: Too many programming languages?
Post by: Nusa on November 30, 2019, 12:02:50 am
Noobish, eh? Sometimes the noobish way is actually very clear and unambiguous, not to mention identical in function:

Code: [Select]
waitloopy: if (condition) goto waitloopy;
The screaming may commence.
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 12:42:09 am
Noobish, eh? Sometimes the noobish way is actually very clear and unambiguous, not to mention identical in function:

Code: [Select]
waitloopy: if (condition) goto waitloopy;

Firstly, you're asserting that
while(condition()) ;

is not clear and unambiguous.

I'm having a difficult time understanding why you think that a loop construct defined in the language is not clear and unambiguous. I would tend to side with using the construct rather than working around it, purely on the basis of expected language coding practices. It would be very uncommon in a C code base to come across your proposed style of loop implementation, particularly when formal code reviews are part of the process.


waitloop: if(condition()) goto waitloop;

Aside from the above, although not more or less readable (it's used in assembly), this style can be more difficult from a maintenance perspective, and it leads to less structured code, for example when there is more than one waitloop, or when additional waitloops are inserted. In addition, 'I can just go back to that other waitloop' thinking infects the code.

If the language supports a loop construct, generally it's going to have preferred use. Again my point was not about personal preference (or how many different ways code can implement a requirement) but ability and competency in the language. A noob implementing blinky on their own time is quite different to working on an existing code base where your personal style preference may not be the same as those that went before you or those that will come after you.

Title: Re: Too many programming languages?
Post by: brucehoult on November 30, 2019, 01:50:15 am
Firstly, you're asserting that
while(condition()) ;

is not clear and unambiguous.

I'm having a difficult time understanding why you think that a loop construct defined in the language is not clear and unambiguous.

It is of course clear and unambiguous if you examine it carefully. However ";" is small and easily missed when scanning code quickly, while a "{}" stands out like a boar's bollocks.
Title: Re: Too many programming languages?
Post by: westfw on November 30, 2019, 01:59:32 am
Quote
but with the inherent risks associated with refactoring, and time pressure, mostly it is: resolve the defect and move on to the next.

Sure.  I agree that FIXING code just to make it more readable is not nearly as worthwhile as writing it to be readable in the first place.
Alas, "old code" tends to get modified for perfectly good reasons, in ways that are detrimental to readability.  (A good argument for making things as readable as possible to start with.)

I've yet to look at a C++ STL source file that I consider "readable" - all those "advanced" C++ features piled in to make them work under various obscure circumstances (I guess?)  And I have some ~1985 code that someone sensibly decided should get function prototypes.  That's a fine idea, but it now looks like:

Code: [Select]
static int
#if HAVE_STDC
prsnum(char *text, int textlen, flag_t flags, int radix, int *sign, char **digits, int *ndig, char **term)
#else /* K&R style */
prsnum(text,textlen,flags,radix,sign,digits,ndig,term)
char *text, **digits, **term;
int textlen;
flag_t flags;
int radix, *sign, *ndig;
#endif /* HAVE_STDC */
{
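(For contrast, and purely as a sketch reusing the same made-up names: once every compiler in sight speaks standard C, the whole preprocessor dance collapses to a single prototype.)

Code: [Select]
static int prsnum(char *text, int textlen, flag_t flags, int radix,
                  int *sign, char **digits, int *ndig, char **term);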
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 02:46:32 am
I've yet to look at a C++ STL source file that I consider "readable" - all those "advanced" C++ features piled in to make them work under various obscure circumstances (I guess?)  And I have some ~1985 code that someone sensibly decided should get function prototypes.  That's a fine idea, but it now looks like:

Please do not bring C++ template suffering into this discussion. I've had some very traumatic experiences working with abstracting C++ template zealots (aka customers). Ever tried to figure out which Fck'in C++ source line actually caused the optimizer to write out that sequence of instructions, because some a-hole loved templates (made the code more readable my ass) and sent you the 10K line test case? OK, now I'm triggered, I should have given that job to the intern.
Title: Re: Too many programming languages?
Post by: tggzzz on November 30, 2019, 08:31:35 am
no, actually because i want every line to be the same (uniform), since the rest are terminated with ";" this must be too.

You need to understand the difference between a "terminator" and a "separator" - and switch to a language that uses terminators.

Learningabitofpunctuationandcapitalisationwouldalsobehelpful to readers. But maybe you aren't worried about people reading your posts?
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 30, 2019, 10:08:47 am
Noobish, eh? Sometimes the noobish way is actually very clear and unambiguous, not to mention identical in function:
Code: [Select]
waitloopy: if (condition) goto waitloopy;

 :-+

If the moar verbose the betterer... how about this:

Code: [Select]
while (1) if (condition) continue; else break;
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 30, 2019, 10:25:02 am
Code: [Select]
while (1) if (condition) continue; else break;

Oh, sorry! With the compulsory braces and newlines:

Code: [Select]
while (1) {
  if (condition) {
    continue;
  }
  else {
    break;
  }
}

The more the LoCs the better the bill.
Title: Re: Too many programming languages?
Post by: Nusa on November 30, 2019, 10:47:52 am
Noobish, eh? Sometimes the noobish way is actually very clear and unambiguous, not to mention identical in function:
Code: [Select]
waitloopy: if (condition) goto waitloopy;

 :-+

If the moar verbose the betterer... how about this:

Code: [Select]
while (1) if (condition) continue; else break;

Interesting. I went and played with several popular compilers on godbolt.org, enabling one line at a time.
Code: [Select]
bool condition;
int test()
{
  //waitloopy: if (condition) goto waitloopy;
  //while (condition);
  //while (1) if (condition) continue; else break;
  //while (1) if (!condition) break;
}
The first and second lines always compile to the same output.
The third and fourth lines always produced larger output than first and second lines. Continue/break branches inside loops clearly aren't fully optimized for this case.
The third and fourth lines did not always produce the same output. It depended on the compiler I chose.
Title: Re: Too many programming languages?
Post by: hamster_nz on November 30, 2019, 11:29:09 am
Code: [Select]
while (1) if (condition) continue; else break;

Oh, sorry! With the compulsory braces and newlines:

Code: [Select]
while (1) {
  if (condition) {
    continue;
  }
  else {
    break;
  }
};

The more the LOCs the better the bill.

I don't find that abhorrent, except for the redundant 'else' and the trailing semicolon. Heck, we have 4k screens now, so LOC isn't the issue it was...

Code: [Select]
// Need to retry SQL for some conditions
while (1) {
  try_to_run_sql("SELECT_FOO_FROM_BAR");

  if (database_was_locked) {
    log("Retrying due to a locked database");
    continue;
  }

  if (database_was_busy) {
    log("Retrying due to a busy database");
    continue;
  }

  if (database_commit_failed) {
    log("Retrying due to commit failed");
    continue;
  }

  // No more retry conditions - has either succeeded or failed.
  break;
}

if(database_had_error) {
  log("SQL failed with error blar");
  return -1;
}
log("SQL successfully completed");
return 0;

Then you come back a few days later and change:

Code: [Select]
while (1) {
to
Code: [Select]
for(retries_left =  MAX_SQL_ATTEMPTS; retries_left > 0; retries_left--) {

Hey, but each to their own... you could write:

Code: [Select]
do try_to_run_sql("SELECT_FOO_FROM_BAR");
while(database_was_locked || database_was_busy || database_commit_failed);
return database_had_error ? -1 : 0;

And then the next guy will have to refactor the code if you wanted to add any sort of logging, or wanted to limit the number of retries... or any sort of actual improvement.

PS. Excuse any typos / errors in code, just ranting, not compiling.  :D
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 30, 2019, 11:57:53 am
Code: [Select]
while (1) if (condition) continue; else break;
[..] the redundant 'else' [..]

It isn't redundant!
Title: Re: Too many programming languages?
Post by: brucehoult on November 30, 2019, 12:37:46 pm
Noobish, eh? Sometimes the noobish way is actually very clear and unambiguous, not to mention identical in function:
Code: [Select]
waitloopy: if (condition) goto waitloopy;

 :-+

If the moar verbose the betterer... how about this:

Code: [Select]
while (1) if (condition) continue; else break;

Interesting. I went and played with several popular compilers on godbolt.org, enabling one line at a time.
Code: [Select]
bool condition;
int test()
{
  //waitloopy: if (condition) goto waitloopy;
  //while (condition);
  //while (1) if (condition) continue; else break;
  //while (1) if (!condition) break;
}
The first and second lines always compile to the same output.
The third and fourth lines always produced larger output than first and second lines. Continue/break branches inside loops clearly aren't fully optimized for this case.
The third and fourth lines did not always produce the same output. It depended on the compiler I chose.

This would be surprising from gcc or llvm on any CPU at any optimization level except -O0, although behaviour is undefined as you wrote it. You might get more consistent results with "condition" volatile. That might help consistency at -O3, otherwise I could see -O3 doing something like unrolling and then optimizing and getting something like:

Code: [Select]
while (condition){
  int i=32;
  while(--i){}
}
Title: Re: Too many programming languages?
Post by: brucehoult on November 30, 2019, 12:43:08 pm
Code: [Select]
while (1) if (condition) continue; else break;

Oh, sorry! With the compulsory braces and newlines:

Code: [Select]
while (1) {
  if (condition) {
    continue;
  }
  else {
    break;
  }
};

The more the LOCs the better the bill.

I don't find that abhorrent, except for the redundant 'else' and the trailing semicolon.

Oh, Michael :-(

Without the "else" it's an infinite loop, even if "condition" is volatile.

(If "condition" isn't volatile then the compiler is entitled to do all kinds of things: delete all code following this loop as unreachable, mark the function itself "noreturn", delete all code following any *call* of this function as unreachable etc etc)
Title: Re: Too many programming languages?
Post by: Nusa on November 30, 2019, 01:21:48 pm
Ok, you were correct; rookie mistake. Once optimization was turned on, all cases compiled the same. The volatile keyword was also required or the test got optimized out and it became a simple infinite loop.
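For anyone repeating the experiment, the whole difference is one qualifier. A minimal sketch (the function name is made up):

Code: [Select]
/* Without 'volatile' the compiler may hoist the load, turning this into
   either a no-op or an unconditional infinite loop; with it, 'condition'
   is re-read on every iteration and the busy-wait survives optimization. */
volatile int condition;

void wait_while_set(void)
{
  while (condition)
    ;  /* spin */
}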
Title: Re: Too many programming languages?
Post by: hamster_nz on November 30, 2019, 06:53:35 pm
Code: [Select]
while (1) if (condition) continue; else break;

Oh, sorry! With the compulsory braces and newlines:

Code: [Select]
while (1) {
  if (condition) {
    continue;
  }
  else {
    break;
  }
};

The more the LOCs the better the bill.

I don't find that abhorrent, except for the redundant 'else' and the trailing semicolon.

Oh, Michael :-(

Without the "else" it's an infinite loop, even if "condition" is volatile.

(If "condition" isn't volatile then the compiler is entitled to do all kinds of things: delete all code following this loop as unreachable, mark the function itself "noreturn", delete all code following any *call* of this function as unreachable etc etc)

Um, explain more... because I'm missing something...

Code: [Select]
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
  //code copy and pasted, with condition replaced
  while (1) {
    if (rand() < 100) {
      continue;
    }
    // else // this keyword makes no difference
    {
      break;
    }
  };
  printf("Done\n!");
  return 0;
}

Seems to work fine...
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 08:41:08 pm
(If "condition" isn't volatile then the compiler is entitled to do all kinds of things: delete all code following this loop as unreachable, mark the function itself "noreturn", delete all code following any *call* of this function as unreachable etc etc)
Um, explain more... because I'm missing something...
Seems to work fine...

The condition was invariant in the first example, in the second example the condition is not invariant.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on November 30, 2019, 08:47:42 pm
(If "condition" isn't volatile then the compiler is entitled to do all kinds of things: delete all code following this loop as unreachable, mark the function itself "noreturn", delete all code following any *call* of this function as unreachable etc etc)
Um, explain more... because I'm missing something...
Seems to work fine...
The condition was invariant in the first example, in the second example the condition is not invariant.

I think he means that you can remove the else and leave the break.
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 09:29:06 pm
I think he means that you can remove the else and leave the break.

Sorry, yes that as well - the else, along with two sets of {}, is redundant.

The behavior of the optimizer should be dependent on condition being variant/invariant.

Refactored it could also be written:
while(1)
{
   if (!condition()) break;
}

but it's sugar, as we come back to:
while(!condition()) ;

Or if you like you {} :
{}while(!condition());
while(!condition()) { ; }
Title: Re: Too many programming languages?
Post by: brucehoult on November 30, 2019, 09:57:54 pm
Code: [Select]
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
  //code copy and pasted, with condition replaced
  while (1) {
    if (rand() < 100) {
      continue;
    }
    // else // this keyword makes no difference
    {
      break;
    }
  };
  printf("Done\n!");
  return 0;
}

Seems to work fine...

"I don't [find] that abhorrent, except the redundant 'else', and the trailing semicolon."

Ugh. Removing only the word "else" and leaving the brackets makes it even more abhorrent! But certainly that will work ok.

In my experience, when someone says they will "remove the else" they mean removing also the code controlled by it -- the "else clause".
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 09:59:29 pm
I think he means that you can remove the else and leave the break.

OK, I should have looked at your example before the {} altered it:
while (1) if (condition) continue; else break;

As written, the else is needed when condition is variant.
Title: Re: Too many programming languages?
Post by: Cerebus on November 30, 2019, 10:21:35 pm
The condition was invariant in the first example, in the second example the condition is not invariant.

I think he means that you can remove the else and leave the break.

Erm, no.

Quote from: dictionary
invariant | ɪnˈvɛːrɪənt |
adjective
never changing: the pattern of cell divisions was found to be invariant.
noun Mathematics
a function, quantity, or property which remains unchanged when a specified transformation is applied.
Title: Re: Too many programming languages?
Post by: legacy on November 30, 2019, 11:05:21 pm
when someone says they will "remove the else" they mean removing also the code controlled by it -- the "else clause".

avionic rule #13: "{ }" must always be preceded by if() | else | do | while() | switch() | for() | function()
Title: Re: Too many programming languages?
Post by: legacy on November 30, 2019, 11:12:02 pm
Like I said, highlighting all tabs in Gedit/Pluma (by searching for \t with backslash escapes enabled in the search dialog) makes the tabs visible.

Nice trick  :D

I wrote a text editor years ago. It looks like "nano"; it's very simple and limited in features. After your comment, yesterday I implemented a way to highlight all tabs. So now they are visible.
Title: Re: Too many programming languages?
Post by: blacksheeplogic on November 30, 2019, 11:41:04 pm
avionic rule #13: "{ }" must always be preceded by if() | else | do | while() | switch() | for() | function()

It would be interesting to know why that got written into the code standard. I'm guessing someone abused it to work around a scope issue, and in typical corporate fashion all problems are fixed by updates to the documentation.
Title: Re: Too many programming languages?
Post by: legacy on December 01, 2019, 12:09:00 am
Be interesting to know why that got written into the code standard

Mostly because it simplifies the team working activities. Especially in the testing squad.
It's a de-facto benefit.
Title: Re: Too many programming languages?
Post by: westfw on December 01, 2019, 12:36:16 am
Quote
[many ways of expressing an empty while loop. almost all for C/C++]

Ok, now I'm curious.  The empty while loop is a staple of embedded programming.  Does anyone have a newer favorite language that has a syntax that they like BETTER than their favorite C syntax?

Quote
I implemented a way to highlight all tabs.

I have an EMACS hack that continually highlights tabs, and any trailing whitespace on a line (dark gray instead of a black background).  It's handy (especially since trailing spaces can break things "mysteriously" if you don't see them).

(https://www.eevblog.com/forum/programming/too-many-programming-languages/?action=dlattach;attach=882390)
Title: Re: Too many programming languages?
Post by: Cerebus on December 01, 2019, 01:39:35 am
Ok, now I'm curious.  The empty while loop is a staple of embedded programming.  Does anyone have a newer favorite language that has a syntax that they like BETTER than their favorite C syntax?

To me the obvious construct to use is an explicit no-op of some sort. C doesn't have one, just the implicit null statement, which is what leads to difficult to read code.

Ironically, older languages than C had explicit no-ops (Algol-68: SKIP, FORTRAN: CONTINUE) but you've asked for newer languages. Some example of explicit no-ops from newer languages, not exhaustive: Ada - null, Python - pass. I'm sure the list is longer but I can't be bothered to scrape my memory/the net for other examples.

I've no problem with something like this (in C) that is effectively making up an explicit no-op (Which, let's face it, is part of the grand tradition of extending C in ad-hoc ways, but at least I haven't #def'd a made up SPIN_WHILE() operator.):
Code: [Select]
void spin_no_op ()
{
    /* As well as usefully doing nothing, acting as a marker for a no-op and probably getting optimized away, writing */
    /* this as a procedure also has the advantage that, if the particular platform we are on has a wait-for-interrupt */
    /* instruction or something similar that reduces power consumption in a busy-wait loop we can drop it in here     */
    /* and automatically get it used by any code that calls here. e.g.: */
#if defined(__GNUC__) && defined(__x86_64__)
    asm volatile ("PAUSE");
#endif
}

void do_work()
{
// ...
extern volatile int waiting_for_interrupt_from_turboencabulator;
// ...
    while (waiting_for_interrupt_from_turboencabulator) spin_no_op();
}

Title: Re: Too many programming languages?
Post by: brucehoult on December 01, 2019, 03:12:34 am
I have an EMACS hack that continually highlights tabs, and any trailing whitespace on a line (dark gray instead of a black background).  It's handy (especially since trailing spaces can break things "mysteriously" if you don't see them).

M-x whitespace-mode

.. will basically do that. It doesn't interfere with your editing mode, and you can toggle it on and off at will.

By default it shows spaces as a . on light yellow background, tabs as >> on a khaki background, and trailing whitespace as angry red. But you can tune that to your heart's content, changing styles or what elements are highlighted at all e.g. "don't show me spaces"
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 01, 2019, 08:28:02 am
Mostly because it simplifies the team working activities. Especially in the testing squad.
It's a de-facto benefit.

At companies I've worked at, change occurs because one of 5,000 people working on the product does something not considered kosher by some opinionated bastard, so we all get punished with a process update. I don't recall off-hand anything being done for beneficial reasons.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 01, 2019, 09:04:16 am
[...] something not considered kosher by some opinionated bastard [...]
:-+

This is how Linus does it: https://github.com/git/git/blob/29d76d4b05bc537ac59fd1e6f849ab3386c01502/rev-tree.c#L141-L146

Code: [Select]
static unsigned long parse_commit_date(const char *buf)
{
  if (memcmp(buf, "author", 6))
    return 0;
  while (*buf++ != '\n')
    /* nada */;
  if (memcmp(buf, "committer", 9))
    return 0;
  while (*buf++ != '>')
    /* nada */;
  return parse_time(buf);
}
Title: Re: Too many programming languages?
Post by: Mechatrommer on December 01, 2019, 09:16:29 am
that is a kind of poor or lazy programming practise (if not taken with caution) that gave C/C++ a bad name, who wrote that? it expects correctly formatted buf input, otherwise the famous buffer overrun... ps: added/corrected.
Title: Re: Too many programming languages?
Post by: brucehoult on December 01, 2019, 09:40:11 am
that is a kind of poor or lazy programming that gave C/C++ a bad name, who wrote that? it expects correctly formatted buf input, otherwise the famous buffer overrun...

That code is unchanged since April 2005.

I guess you could craft a malicious commit blob, but what are you going to achieve on your own machine, in your local git repo? I don't think that would survive packing, which happens before sending to a server.
Title: Re: Too many programming languages?
Post by: Mechatrommer on December 01, 2019, 10:15:55 am
that is a kind of poor or lazy programming that gave C/C++ a bad name, who wrote that? it expects correctly formatted buf input, otherwise the famous buffer overrun...
That code is unchanged since April 2005.
doesnt mean its a good practice (added/corrected my previous msg) esp to those who dont know what they are doing. will open up opportunity to some interesting exercises for those who are interested (crackers) i dont care its not my business. for this reason, coupled with negative zero days or whaever hat color exploits news... some swore by things such as ADA, Python or whatever so called safe programming (without they knowing boundary checks are performed in every single lines of memory access). however, this (no boundary check) practice can be intentional by some serious analysts to trade for performance, this is also usually missed by those "safe programming" fanboyz, meh because we can and you cant, anyway.. ignorance is the enemy :P
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 01, 2019, 10:47:31 am
That is a kind of poor or lazy programming [...] who wrote that?

Some ignorant git it seems: https://github.com/git/git/commit/40e88b95cdce955059e94d288e8ebdb7d1e9fa88   :popcorn:
Title: Re: Too many programming languages?
Post by: brucehoult on December 01, 2019, 11:27:00 am
that is a kind of poor or lazy programming that gave C/C++ a bad name, who wrote that? it expects correctly formatted buf input, otherwise the famous buffer overrun...
That code is unchanged since April 2005.

Oops, my mistake. I didn't look closely and that's actually a snapshot from April 2005. I don't even know where the equivalent code is now.

Quote
doesnt mean its a good practice (added/corrected my previous msg) esp to those who dont know what they are doing. will open up opportunity to some interesting exercises for those who are interested (crackers) i dont care its not my business. for this reason, coupled with negative zero days or whaever hat color exploits news... some swore by things such as ADA, Python or whatever so called safe programming (without they knowing boundary checks are performed in every single lines of memory access). however, this (no boundary check) practice can be intentional by some serious analysts to trade for performance, this is also usually missed by those "safe programming" fanboyz, meh because we can and you cant, anyway.. ignorance is the enemy :P

I don't disagree that it's bad practice. I'd personally write it more safely, even as a q&d prototype, and I'd ask for changes before commit if I was reviewing the code.

The calling parse_commit() code actually gets the length of the input back from read_sha1_file, and then *ignores* it.

I'd change the arguments to parse_commit_date(buffer) so it would be called as parse_commit_date(buffer, limit) (where limit would be earlier calculated as (char*)buffer+size).

And then:

Code: [Select]
static unsigned long parse_commit_date(const char *buf, const char *limit)
{
  if (buf > limit-6 || memcmp(buf, "author", 6))
    return 0;
  while (buf < limit && *buf++ != '\n')
    /* nada */;
  if (buf > limit-9 || memcmp(buf, "committer", 9))
    return 0;
  while (buf < limit && *buf++ != '>')
    /* nada */;
  return parse_time(buf, limit);
}

I'd challenge anyone to notice any speed difference.

But more likely I'd save myself a lot of typing and write it using some auxiliary functions.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 01, 2019, 12:13:36 pm
We've seen Torvalds' style. Let's see Fabrice Bellard's:

Code: [Select]
while (*p && *p++ != '\n')
  continue;
https://gitlab.com/xk/quickjs/blob/master/quickjs.c?expanded=true&viewer=simple#L26930-26931

=>

Fabrice Bellard:
Code: [Select]
while (condition) continue;
Linus Torvalds:
Code: [Select]
while (condition) /* nada */;
I like both! (And kudos to @Nominal Animal for proposing Torvalds' style above in this thread)

Quote
I don't disagree that it's bad practice. I'd personally write it more safely

The rule you guys are trying to apply doesn't apply here. One thing is not to trust anything you've got from somebody/somewhere else (especially the net), => be extra careful, and another thing is not to trust the data you're creating yourself, that's silly! If you know you've put something in a string, you can rest assured it's there unless the computer is broken.
Title: Re: Too many programming languages?
Post by: brucehoult on December 01, 2019, 12:33:24 pm
We've seen Torvalds' style. Let's see Fabrice Bellard's:

Code: [Select]
while (*p && *p++ != '\n')
  continue;
https://gitlab.com/xk/quickjs/blob/master/quickjs.c?expanded=true&viewer=simple#L26930-26931

Yes, good. Checking for the terminating null or for a limit pointer: the two are pretty much as good as each other.

Quote
Quote
I don't disagree that it's bad practice. I'd personally write it more safely

The rule you guys are trying to apply doesn't apply here. One thing is not to trust anything you've got from somebody/somewhere else (especially the net), => be extra careful, and another thing is not to trust the data you're creating yourself, that's silly! If you know you've put something in a string, you can rest assured it's there unless the computer is broken.

Or, if the code you yourself wrote to create it is broken. It's good to be defensive.

In the git case it's parsing disk files that are normally created by git, but it's perfectly possible for other software to create them or even to do it by hand -- there are tutorials for that.
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 01, 2019, 05:26:48 pm
We've seen Torvalds' style. Let's see Fabrice Bellard's:

Code: [Select]
while (*p && *p++ != '\n')
  continue;

Yeah, although I still find the above typical of bad C style.

I would write the above as, for instance, such:

Code: [Select]
while (*p != '\0')
{
    if (*p++ == '\n')
        break;
}

Isn't that much easier to read, without needing any kind of empty statement?
Title: Re: Too many programming languages?
Post by: Mechatrommer on December 01, 2019, 05:46:10 pm
That is a kind of poor or lazy programming [...] who wrote that?
Some ignorant git it seems: https://github.com/git/git/commit/40e88b95cdce955059e94d288e8ebdb7d1e9fa88   :popcorn:
i kind of know the answer before i asked the question, so dont take it personally, that was a partial joke, its kind of troll feed :P for "safe program" fanboysm. thats why i said some "serious analysts" did that deliberately, its just the noobs who follows the style can get into problems.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 01, 2019, 08:18:58 pm
IMO, proper style is utterly context-dependent.

If it is an existing project, then the style it uses (hopefully consistently) is the proper one.

If you are teaching new programmers, or showing new programmers how to do things, you want to use a style that helps them get the correct intuitive grasp of things.  They will eventually shift away from that initial/learning style, and that's perfectly okay too.

If you write example code for someone coming from another programming language, you can use examples written in their "intuitive" style and in some other style that is context-appropriate, to help them understand the differences between the languages.  You can even use styles you personally detest here, to positive effect.

So, I really don't see the purpose of comparing styles outside a specific context.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 01, 2019, 10:27:51 pm
Yeah, although I still find the above typical of bad C style.

That's fine, you can write C your own way :-+. But in my book, Torvalds and Bellard are among the best C programmers in the world.
Title: Re: Too many programming languages?
Post by: Mechatrommer on December 02, 2019, 02:10:56 am
Yeah, although I still find the above typical of bad C style.
That's fine, you can write C your own way :-+. But in my book, Torvalds and Bellard are among the best C programmers in the world.
i never interested to have a look at his code before, but to my surprised, he's the first guy i think to use "my" style of coding ie..

Code: [Select]
while() {
  // codes here
}
instead of...
Code: [Select]
while()
{
  // codes here
}

so we must have something in common somewhere 8) (at least we both have 2 eyes) but whats not in common is he still use the "old" style on function name

Code: [Select]
func()
{
  // codes here
}
instead of...
Code: [Select]
func() {
  // codes here
}

so i must be more "uniformed" (consistent) in term of coding style :P
Title: Re: Too many programming languages?
Post by: legacy on December 02, 2019, 04:25:21 am
That's fine, you can write C your own way :-+. But in my book, Torvalds and Bellard are among the best C programmers in the world.

Torvalds?   :palm:
Title: Re: Too many programming languages?
Post by: legacy on December 02, 2019, 04:40:02 am
serious analysts to trade for performance, this is also usually missed by those "safe programming" fanboyz, meh because we can and you cant, anyway.. ignorance is the enemy :P

"safe C" aka "avionic C" is mostly a ruleset to facilitate "code instrumentation" for (e.g.) dynamic coverage analysis.  Performance comes into consideration after this, and it's code running not in instrumented mode (it runs in profiling mode), nor in debug mode (under ICE).
Title: Re: Too many programming languages?
Post by: brucehoult on December 02, 2019, 04:51:10 am
That's fine, you can write C your own way :-+. But in my book, Torvalds and Bellard are among the best C programmers in the world.

Torvalds?   :palm:

Torvalds has prototyped several very influential ideas/projects, but I'd say his main talent is then managing contributions by others.

Fabrice Bellard has done a ton of very cool stuff solo. I mean ... three-time winner of the Obfuscated C contest, breaking the record for digits of Pi, FFMPEG, QEMU, and other emulators, such as ones that run in a web browser, e.g. RISC-V Fedora Linux in your browser here:

https://bellard.org/jslinux/vm.html?cpu=riscv64&url=https://bellard.org/jslinux/fedora29-riscv-2.cfg&mem=256

Or, a version with X GUI here (takes longer to start):

https://bellard.org/jslinux/vm.html?cpu=riscv64&url=https://bellard.org/jslinux/fedora29-riscv-xwin.cfg&graphic=1&mem=256


Other candidates for top programmer based on multiple amazing projects include:

Andrew Tridgell: rsync, samba, SourcePuller (the reason indirectly that Torvalds started git lol), huge improvements to ArduPilot.

Julian Seward : bzip2, valgrind
Title: Re: Too many programming languages?
Post by: legacy on December 02, 2019, 05:58:44 am
Torvalds has prototyped several very influential ideas/projects, but I'd say his main talent is then managing contributions by others.

Precisely. Torvalds is definitely not known for his code style(1), and his monolithic kernel is a very bad idea in the long term. But his main talent is managing contributions by others, and his GIT is a very great tool.


(1) last time we analyzed the possibility of certifying Linux for avionics (and I am talking about DO178B level E, the lowest), partly due to his code style, it would have cost more money than rewriting it from scratch.
Title: Re: Too many programming languages?
Post by: obiwanjacobi on December 02, 2019, 06:35:23 am
Code: [Select]
while() {
  // codes here
}
instead of...
Code: [Select]
while()
{
  // codes here
}

See, if you would use indents (ala Python) you would not be having this discussion in the first place!  :-DD
Title: Re: Too many programming languages?
Post by: Berni on December 02, 2019, 07:26:51 am
Well this code style argument really escalated quickly.

There is no ONE right way to format C code. Different styles have different benefits that might be more relevant for a particular use case, hence why you find so many styles of C code. These "style politics" happen in all languages, but C seems to have particularly diverse styles, due to the syntax not being too picky about where things go and due to the popularity of the C language in general.

However, the practice of cramming as much functionality as possible into one line benefits nobody. It's mostly just showing off programming skills for the sake of feeling superior. Having a big ego is usually a bad thing in programming, especially when multiple people work on the same project and this one guy forces his own ideas on everyone. Compacting code using lesser-known features not only makes it difficult to understand for a programmer not familiar with that exact feature, but it can also trip up an experienced programmer, who may understand it wrong at a quick glance (because they are used to using that feature in a slightly different way that causes a different effect). It can slow down people reading the code, as you actually have to actively think about what is going on in that line rather than just glancing at it. Your brain gets wired for your particular style of code; be nice to the people whose brains are wired for a slightly different style by writing clear, concise code.

I'm NOT saying to never use any fancy features in a language. Just use them in appropriate moderation. This is especially true for larger languages like C++, C#, Java, etc. that have so many features. At the same time, programmers go object-crazy and make objects for every little piece of crap while naming those objects confusingly.
Title: Re: Too many programming languages?
Post by: brucehoult on December 02, 2019, 07:45:29 am
However, the practice of cramming as much functionality as possible into one line benefits nobody. It's mostly just showing off programming skills for the sake of feeling superior.

I would certainly hope that anyone who calls themselves a C programmer would understand a very common idiom such as Bellard's ...

Code: [Select]
while (*p && *p++ != '\n')
  continue;

... at a glance.

What do you need to understand?

- what *p and *p++ do
- that text data is often terminated with a 0 byte
- that '\n' is a newline
- that in C 0 is false and everything else is true
- how a short-circuit operator such as && works
- what continue and break do in loops

C is a very small language and those are all completely fundamental things.
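If it helps to see where the pointer lands, here is a tiny self-contained demo of exactly those mechanics (the string is made up):

Code: [Select]
#include <stdio.h>

int main(void)
{
  const char *text = "line one\nline two";
  const char *p = text;

  /* Stop either at the 0 byte (false in C) or just PAST the first '\n';
     && short-circuits, so *p++ only executes while *p is nonzero. */
  while (*p && *p++ != '\n')
    continue;

  printf("%s\n", p);  /* prints "line two" */
  return 0;
}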

C++ is a completely different matter. I think I understood pretty much all of C++ around 1989 and up to and including C++98. What we've got now though is an absolute horror of complexity and interactions and just Too Many Features.
Title: Re: Too many programming languages?
Post by: Berni on December 02, 2019, 08:20:19 am
I would certainly hope that anyone who calls themselves a C programmer would understand a very common idiom such as Bellard's ...

Code: [Select]
while (*p && *p++ != '\n')
  continue;

... at a glance.

What do you need to understand?

- what *p and *p++ do
- that text data is often terminated with a 0 byte
- that '\n' is a newline
- that in C 0 is false and everything else is true
- how a short-circuit operator such as && works
- what continue and break do in loops

C is a very small language and those are all completely fundamental things.

C++ is a completely different matter. I think I understood pretty much all of C++ around 1989 and up to and including C++98. What we've got now though is an absolute horror of complexity and interactions and just Too Many Features.

This particular snippet of code is not that bad. I'd say it's about as far as one would want to go with code compaction.

If i was to write this code i would have probably done something like this:
Code: [Select]
while (*p != 0 && *p != '\n')
   p++;
My reasoning behind this is that the while condition is only checking the condition, while the body of the loop is what is actually doing the work. So that when i look at it i go "So this loop is incrementing p, repeating until *p is zero or a linefeed". But again that's just how my particular brain works. Yes i know the part with "!= 0" is completely unnecessary, but it just looks more sensible to me because I'm looking for the zero character. If i am looping until an unsigned variable x decrements to zero i might instead use "while(x > 0)", even tho again the "> 0" does nothing. But if i have something used as a flag and i don't care about its value i might just go "if(isEmpty)". It's like commenting your code without actually using a comment.

I have seen code much worse than this, where for example the loop condition is 40 characters long and contains all sorts of things, including function calls. Something inside there might be made using #define directives, but that #define contains 3 other #defines that each go 5 levels deep, resulting in hunting all over the place just to figure out what the top #define is even doing... and this is before we even get to what that loop condition is doing. :scared:
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 02, 2019, 08:36:22 am
Code: [Select]
while (*p && *p++ != '\n') continue;

If i was to write this code i would have probably done something like this:
Code: [Select]
while (*p != 0 && *p != '\n') p++;

But yours doesn't do the same thing...  :)

Code: [Select]
#include <stdio.h>

int main () {
  char s[]= "abc\n";
  char* p= s;
  char* q= s;
  while (*p && *p++ != '\n') continue;
  while (*q && *q != '\n') q++;
  printf("%u\n", p == q);
}

=> 0
Title: Re: Too many programming languages?
Post by: Berni on December 02, 2019, 07:27:57 pm
But yours doesn't do the same thing...  :)

Sorry. I see i missed the fact that the original code has a special case for line feed characters where it skips over the first occurrence of a line feed.

This is exactly my point about why it is not a good idea to cram as much functionality as possible into a line of code. The special case is embedded among all the other logic and is easy to miss. Okay i might not exactly be an expert in C (I'm a hardware engineer that only does programming when i have no other choice) but this is something that could have also easily gone unnoticed by a much more experienced programmer.

In such a case i would add "if(*p == '\n') p++;" on the end of my loop so that it's clear to anyone reading this that \n is treated differently, and also make it clear in what way it is treated differently. An extra line or two of code is a small price to pay for having your code be clear in its intentions and difficult to interpret the wrong way at a glance. Code is written to be read by humans, not machines (they would prefer it if you just gave them machine code rather than this inefficient ASCII stuff)
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 02, 2019, 09:02:02 pm
and his monolithic kernel is a very bad idea in the long term.

You sound like Andy Tanenbaum :-) https://www.oreilly.com/openbook/opensources/book/appa.html (https://www.oreilly.com/openbook/opensources/book/appa.html) . But time has proven him wrong. The most desirable feature of microkernels is modularity, and Linux got loadable kernel modules ages ago: modprobe, lsmod, insmod, etc.

Quote
I believe you have some valid points, although I am not sure that a
microkernel is necessarily better. It might make more sense to allow some
combination of the two. As part of the IPC code I'm writting for Linux I am
going to include code that will allow device drivers and file systems to run
as user processes. These will be significantly slower though, and I believe it
would be a mistake to move everything outside the kernel (TCP/IP will be
internal).
 
Actually my main problem with OS theorists is that they have never tested
there ideas!
Title: Re: Too many programming languages?
Post by: legacy on December 02, 2019, 10:45:41 pm
But time has proven him wrong.

no, he was not wrong.

Do you know how big k5.4 is on my HPPA2? ____24 MB___  :palm:
(with all modules static compiled)

Monolithic kernels are a pain in the ass to debug: if there is a bug here or there, the whole kernel crashes. And I know it not because I read a book, but rather because I am tired of dealing with Linux on my SGI/IP30 (on which ... there is no EJTAG port, hence I have to debug via kgdb, on a hacked thing attached to the PCI behind a XIO port).

Macrokernel "were" good only because they minimize the costs of context switches to bundle system calls in batches. This was a must-have for the previous hardware, but times have changed in the last twenty years, and nowadays modern hardware does not need this compromise, and this means it's the end of Linux.
Title: Re: Too many programming languages?
Post by: brucehoult on December 02, 2019, 11:49:05 pm
But yours doesn't do the same thing...  :)

Sorry. I see i missed the fact that the original code has a special case for line feed characters where it skips over the first occurrence of a line feed.

This is exactly my point about why it is not a good idea to cram as much functionality as possible into a line of code. The special case is embedded among all the other logic and is easy to miss.

But it's not a special case! The whole point of this code is to skip to the start of the next line of text, including skipping over the newline.

Hopefully you'd find it clear as:

Code: [Select]
while(*p++ != '\n') continue;

The "*p &&" is there just to prevent sailing on past the end of the input string as a whole.

It's really, I would say, totally normal and idiomatic C and not in the least bit "tricky".

Quote
In such a case i would add "if(*p == '\n') p++;" on the end of my loop so that it's clear to anyone reading this that \n is treated differently, and also make it clear in what way it is treated differently. An extra line or two of code is a small price to pay for having your code be clear in its intentions and difficult to interpret the wrong way at a glance. Code is written to be read by humans, not machines (they would prefer it if you just gave them machine code rather than this inefficient ASCII stuff)

Ugh. That's just unnecessarily bulking up the amount of stuff to be read. And the amount of machine code generated.

If it was my code I'd be doing it a little differently. I'd make a function "skipToNextLine(char **p)", with a comment or two, and maybe marked as "inline" (or maybe leave that up to the compiler).
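A minimal sketch of what such a helper might look like (the name and double-pointer signature follow the suggestion above; the body is just the idiom already discussed):

Code: [Select]
/* Advance *p past the next '\n', or to the terminating NUL if there is
   none. Assumes *p points at NUL-terminated text. */
static inline void skipToNextLine(char **p)
{
  char *s = *p;
  while (*s && *s++ != '\n')
    continue;  /* deliberately empty */
  *p = s;
}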
Title: Re: Too many programming languages?
Post by: Berni on December 03, 2019, 08:09:37 am
But it's not a special case! The whole point of this code is to skip to the start of the next line of text, including skipping over the newline.

Hopefully you'd find it clear as:

Code: [Select]
while(*p++ != '\n') continue;

The "*p &&" is there just to prevent sailing on past the end of the input string as a whole.

It's really, I would say, totally normal and idiomatic C and not in the least bit "tricky".

Yes, but i was looking at the code out of context, just looking at what that one loop is doing, not thinking about what it does for more than a few seconds or actually trying to run it.

It could also be used to find the length of some input string later on, by subtracting away the original pointer, and in that case you generally would not include newlines.


Ugh. That's just unnecessarily bulking up the amount of stuff to be read. And the amount of machine code generated.

If it was my code I'd be doing it a little differently. I'd make a function "skipToNextLine(char **p)", with a comment or two, and maybe marked as "inline" (or maybe leave that up to the compiler).

Yep, doing it as a named function is also a good way to make it clear what it is doing, as long as the name is descriptive like that. Way too often i see overly abbreviated names for functions and variables in C code, like for example naming this function "nextln(char **p)", in which case it only makes things worse.

This move-to-next-line code sounds like it might be used in a lot of places, so it makes sense to have a function for it. But if it's used only once, then i think just putting a comment "\\ Skip pointer to start of next line" in front of the loop is a nice way to do it.

To be honest i don't always comment my code as well as i think i should. But i try to have a single line comment in front of any significant loops or large nested "if" statement blocks. I might also add comments to single lines of code if the action has an effect down the line but has no obvious purpose within this function (Like setting some unrelated flag in a bufferClear() function that is needed to make sure that some other part of the program doesn't fall on its face because the buffer it was using suddenly disappeared).

I just try to be as nice as possible to whoever reads my code; chances are that someone is going to be me >5 years from now, trying to quickly fix a bug or tweak some functionality while having forgotten everything about how this thing even works.
Title: Re: Too many programming languages?
Post by: Kjelt on December 03, 2019, 08:51:42 am
Macrokernel "were" good only because they minimize the costs of context switches to bundle system calls in batches. This was a must-have for the previous hardware, but times have changed in the last twenty years, and nowadays modern hardware does not need this compromise, and this means it's the end of Linux.
I might be misreading your statement, and it could be you are talking about the cloud, for instance. But on the larger IoT devices I actually only see more embedded Linux systems running on SoCs popping up.
Could you elaborate on the non-OS alternatives you see?
Title: Re: Too many programming languages?
Post by: Tepe on December 03, 2019, 10:01:58 am
We've seen Torvalds' style. Let's see Fabrice Bellard's:

Code: [Select]
while (*p && *p++ != '\n')
  continue;

Yeah, although I still find the above typical of bad C style.
It is, except for the superfluous "continue" (which is actually cool), idiomatic C.

I would write the above as, for instance, such:

Code: [Select]
while (*p != '\0')
{
    if (*p++ == '\n')
        break;
}
And this is not.
Title: Re: Too many programming languages?
Post by: Tepe on December 03, 2019, 10:12:02 am
... just putting a comment "\\ Skip pointer to start of next line" in front of the loop is a nice way to do it.
That's not a comment  :box:
(I have always wondered how some people manage to mix up forward and backward slashes)
Title: Re: Too many programming languages?
Post by: brucehoult on December 03, 2019, 10:35:51 am
... just putting a comment "\\ Skip pointer to start of next line" in front of the loop is a nice way to do it.
That's not a comment  :box:
(I have always wondered how some people manage to mix up forward and backward slashes)

One of the tenets of the "Extreme Programming" people back around 2000 was "no comments -- the variable and function names should make the code self-explanatory"

I've actually worked at a place that didn't go quite that far, but had the rule "No comments except possibly one before a function explaining the purpose of the function". A corollary of that was "If you feel you need to add a comment to explain a block of code then extract it into its own function".
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 03, 2019, 11:13:05 am
Yeah, the style "books" are full of dogmas. E.g. JSLint, a famous JavaScript linter by a famous JavaScript guru, that ~ nobody uses. The ones everybody use are configurable, so you can cherry pick only the dogmas you like/want. To each his own.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 03, 2019, 11:24:34 am
But time has proven him wrong.

no, he was not wrong.

Do you know how big k5.4 is on my HPPA2? ____24 MB___
(with all modules static compiled)

Now try with make allnoconfig ... then add one by one only the things you really need.

and this means it's the end of Linux.

Sorry, but I don't see the end of Linux anytime soon. By the way, AST, the minix guy, 20 years later got a 2.5 million grant from the EU and made minix 3:

https://wiki.minix3.org/doku.php?id=www:documentation:features (https://wiki.minix3.org/doku.php?id=www:documentation:features)

https://www.youtube.com/watch?v=vlOsy0PZZyc (https://www.youtube.com/watch?v=vlOsy0PZZyc)

https://www.youtube.com/channel/UCGE79COc35OCPu7UQJYILQw/videos (https://www.youtube.com/channel/UCGE79COc35OCPu7UQJYILQw/videos)
Title: Re: Too many programming languages?
Post by: legacy on December 03, 2019, 12:01:29 pm
Now try with make allnoconfig ... then add one by one only the things you really need.

That was not the point. On machines like the rb532, when the minimal config of the kernel (minimal meaning you cannot remove anything else) takes more than 8 Mbyte, you are completely unable to use the machine, because the firmware says "out of range" and refuses to load the kernel.  There are a lot of examples of this kind. My old C1xxx Japanese PDA? Yet again, the firmware cannot load a kernel bigger than 5 Mbyte, and I have to use a 2.4 kernel to load a 2.6.* kernel, which introduces a lot of problems and makes it impossible to load a 4.* kernel, so the machine is doomed to be unable to run a modern rootfs due to the kernel obsolescence.

This sucks.

But THE point was: kernel modules are a pain in the ass to debug this way, because you have to assume the storage is working, and in certain cases neither the network (NFS? forget it) nor the SATA, PATA, SCSI, whatever, works. On machines like the SGI/IP30 this means you waste a lot of time, trying and retrying in the dark, and it takes longer than with a microkernel, where you can add things progressively in the daylight.

Besides, if the kernel is so big, your ICE also needs more RAM to emulate it. Do you know how much an ICE with 8 MB of real-time RAM costs? How much 64 MB costs?

For my HPPA2, I don't even want to look at the price list, and if you do not have the ICE or a debugger cable, you are f***d up. Indeed, after 20 years of hacking, the SGI/IP30 is still unable to properly use the PCI.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 03, 2019, 12:25:57 pm
What I don't get is: if you don't like it, why are you trying to shoehorn it into those vintage machines? Aren't there any microkernel alternatives?
Title: Re: Too many programming languages?
Post by: legacy on December 03, 2019, 12:29:10 pm
Sorry, but I don't see the end of Linux anytime soon

The last 5.* releases have a lot of code regressions. This must mean something: from "devs" (like Alan Cox and Miller) being tired, to ... probably nobody being willing any more to go ahead with the increased level of complexity stuffed monolithically into the kernel.

Even Linus T. stood up and admitted it: "Linux is too bloated", he said. Yes, it is.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 03, 2019, 12:42:23 pm
Sorry, but I don't see the end of Linux anytime soon
The last 5.* releases have a lot of code regressions. This must mean something: from "devs" (like Alan Cox and Miller) being tired, to ... probably nobody being willing any more to go ahead with the increased level of complexity stuffed monolithically into the kernel.

Who said that "reports of my death were greatly exaggerated" ?

Even Linus T. stood up and admitted it: "Linux is too bloated", he said. Yes, it is.

I googled that, and see what I stumbled upon :)

Quote
One thing that I forgot to mention, but which is critical to the success of Linux, is that there really is no such thing as monolithic "Linux." Linux is highly modular and can be trimmed down/beefed up to fit a wide variety of applications...on the developers' terms, not Red Hat's, Novell's, Canonical's, etc.

So, unlike Windows, which can only be what Microsoft dictates, Linux can truly be all things to all people, as "fat" or as "skinny" as the developer wants it to be. Ubuntu is obese compared to sub-100 KB uClinux distributions, for example. Both serve different, and useful, purposes.

https://www.cnet.com/news/linus-torvalds-linux-is-bloated/ (https://www.cnet.com/news/linus-torvalds-linux-is-bloated/)
Title: Re: Too many programming languages?
Post by: Cerebus on December 03, 2019, 01:55:36 pm
Sorry, but I don't see the end of Linux anytime soon

The last 5.* releases have a lot of code regressions. This must mean something: from "devs" (like Alan Cox and Miller) being tired, to ... probably nobody being willing any more to go ahead with the increased level of complexity stuffed monolithically into the kernel.

When a particular codebase has been around long enough, and has had enough changes made to it over the years, there comes a moment when it is time to rip it up and start again. This applies to anything: accounting software, monolithic OS kernels, microkernels, you name it. It has nothing to do with the application at hand, the methodology used or any other similar variable. It's just one of the great unwritten laws of software. The cost of comprehending and maintaining, and the risks of additional changes to, the accumulated cruft outweighs the cost and risks of starting afresh.

There's always a list of reasons, often spurious, why this shouldn't/can't be done - sunk costs, backward compatibility, etc. - but every competent developer knows that all codebases eventually reach the point where it's time to take it around the back of the barn with a shotgun and get a new codebase.

Whether the linux kernel has reached this point is another question; but it's a question of when, not if, it becomes true.

I googled that, and see what I stumbled upon :)

Quote
One thing that I forgot to mention, but which is critical to the success of Linux, is that there really is no such thing as monolithic "Linux." Linux is highly modular and can be trimmed down/beefed up to fit a wide variety of applications...on the developers' terms, not Red Hat's, Novell's, Canonical's, etc.

So, unlike Windows, which can only be what Microsoft dictates, Linux can truly be all things to all people, as "fat" or as "skinny" as the developer wants it to be. Ubuntu is obese compared to sub-100 KB uClinux distributions, for example. Both serve different, and useful, purposes.

https://www.cnet.com/news/linus-torvalds-linux-is-bloated/ (https://www.cnet.com/news/linus-torvalds-linux-is-bloated/)

I think you're misinterpreting that. To me that reads as talking about a complete linux distribution/implementation, not just the kernel.

Changing tack:

I would not characterise the linux kernel as modular. Yes, it has some modules, but that's not the same thing as being properly modular.

Can you do a build of the current kernel and move all individual arbitrary features to loadable modules or eliminate them entirely? If you can, then that feature is modular; if you can't, and it has spread its tentacles into other features, then it is not modular. Unless you can do this with all features, I don't think you can characterise it as properly modular, just as having (some) modules.

Whether the distinction (fully modular/has modules) is useful depends on where something sits on the scale from monolithic ... has modules ... fully modular, and on what you're doing with it.

In particular, can you exclude all the features that you're not going to be using from a particular kernel build? If you can't, and you end up with a kernel that is, for your purposes, carrying unwanted baggage then it is a useful distinction.

Again, if you're developing a kernel feature, then the ability to exclude all other features is a useful distinction, as it allows you to test your code in as much isolation as possible, or in controlled combination with other features, one at a time, to see what breaks what.

If you're trying to shoehorn a minimal kernel onto a microcontroller where every extra useless byte of code is an imposition then it clearly is a useful distinction.

If you're using the kernel on a general purpose computer for general purpose uses then it's probably not a useful distinction. There I think you'll be perfectly happy with sufficient modularity that the kernel doesn't get bloated with device drivers for devices you don't have.
Title: Re: Too many programming languages?
Post by: brucehoult on December 03, 2019, 02:25:52 pm
What I don't get is: if you don't like it, why are you trying to shoehorn it into those vintage machines? Aren't there any microkernel alternatives?

"Microkernel" is a technical term which means that as little as possible code runs with elevated privileges and everything else talks to it and each other via the kernel. It doesn't make the total RAM impact smaller. Probably it makes it bigger.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 03, 2019, 02:46:09 pm
What I don't get is: if you don't like it, why are you trying to shoehorn it into those vintage machines? Aren't there any microkernel alternatives?

"Microkernel" is a technical term which means that as little as possible code runs with elevated privileges and everything else talks to it and each other via the kernel. It doesn't make the total RAM impact smaller. Probably it makes it bigger.

Bigger, clumsier and slower, yes. But I'm not the one who criticizes Linux and advocates for microkernels.
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 03, 2019, 02:52:25 pm
If anybody wants to play with an awesome, tiny and fast microkernel for embedded, FreeRTOS is it. For $5 any ESP32 comes with it for you to try. The kernel is about 6 kB or so. With two CPU cores and SMP!

https://grouper.freertos.org/features.html (https://grouper.freertos.org/features.html)

https://www.youtube.com/watch?v=1oagM_tEyeA (https://www.youtube.com/watch?v=1oagM_tEyeA)
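
For anyone curious what that looks like in practice, here's a minimal sketch against the standard FreeRTOS task API (the task body and names are my own; on an ESP32 the ESP-IDF wraps the entry point a bit differently):

Code: [Select]
#include "FreeRTOS.h"
#include "task.h"

/* A task is just a function that never returns; the kernel schedules it. */
static void blink_task(void *params)
{
    (void)params;
    for (;;) {
        /* toggle_led();  -- hardware-specific, omitted here */
        vTaskDelay(pdMS_TO_TICKS(500));   /* sleep 500 ms, yielding the CPU */
    }
}

int main(void)
{
    /* function, name, stack depth (words), argument, priority, handle */
    xTaskCreate(blink_task, "blink", configMINIMAL_STACK_SIZE, NULL, 1, NULL);
    vTaskStartScheduler();   /* hands control to the kernel; normally never returns */
    for (;;) {}              /* only reached if the scheduler could not start */
}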
Title: Re: Too many programming languages?
Post by: bpiphany on December 03, 2019, 06:52:38 pm
On subject, I feel this thing is worth some flaunting. I can't vouch for its correctness - I haven't compiled the compiler. But allegedly it solves the first day's problem in this year's Advent of Code (https://adventofcode.com/2019/day/1)

From knowing some programming, it is kind of not completely impossible to interpret the general idea of most of the constructs. I think it serves as a nice reminder of how foreign programming probably is to someone outside the profession/hobby.

Code: [Select]
Sadness is loneliness
The programmer was anticipating
Advent is extraordinary
The rush is uncomparable

Christmas takes joy and kindness
    Your spirit is incredible
    While joy is as high as kindness
        Build your spirit up
        Let joy be without kindness

    Give back your spirit

AdventOfCode takes time (but it's plenty of fun)
    Let fun be Christmas taking time, and Advent without the rush
    If fun is as low as sadness
        Give back sadness

    Give back fun with AdventOfCode taking fun

The elves are thoughtful
Santa is overseeing
The time is now
While the time is right
    Listen to the jingles
    If the jingles ain't ok
        Break it down

    Let the jingles be without sadness
    Let the elves be with Christmas taking the jingles, and Advent without the rush
    Let Santa be with AdventOfCode taking the jingles

Shout the elves
Shout Santa
https://github.com/vstrimaitis/aoc-2019/blob/master/day1/sol_formatted.rock

Rockstar Lang (https://github.com/RockstarLang/rockstar/tree/master/examples)
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 03, 2019, 07:27:39 pm
On subject, I feel this thing is worth some flaunting. I can't vouch for its correctness - I haven't compiled the compiler. But allegedly it solves the first day's problem in this year's Advent of Code (https://adventofcode.com/2019/day/1)

From knowing some programming, it is kind of not completely impossible to interpret the general idea of most of the constructs. I think it serves as a nice reminder of how foreign programming probably is to someone outside the profession/hobby.

Ahah, nice one!

I wonder what Boeing's MCAS code would look like in Rockstar.  >:D
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 03, 2019, 08:50:06 pm
One of the tenets of the "Extreme Programming" people back around 2000 was "no comments -- the variable and function names should make the code self-explanatory"

There's a lot of crap put forward by academics who don't actually work in the industry. This was a prime example.

When you're implementing complex business logic, comments explain the why; I get the how from the code. The why can be deduced from the code, but it's a slower process than reading a comment explaining it. Academics also assume that the BAs have nothing better to do with their time than talk about business rules implemented 5 years prior, and that documentation is at every developer's fingertips when they need it.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 03, 2019, 09:58:41 pm
I would certainly hope that anyone who calls themselves a C programmer would understand a very common idiom such as Bellard's ...
Code: [Select]
while (*p && *p++ != '\n')
  continue;
... at a glance.
To understand ≠ to recognize at a glance.  This is where a descriptive comment provides the latter, giving the reader the correct context to immediately understand the code correctly.

The problem is, whenever you see code like that that relies on side effects like post-increment after the test, you can never know whether it is as intended or an error.  The comment fixes that.  The same goes for all other idioms in all programming languages.

To those who do not immediately recognize the idiom, consider the following:
Code: [Select]
    /* Skip to next line, but stopping at end of string. */
    while (*p && *p++ != '\n')
        continue /* or Nothing */ ;
See how your mind takes a different approach when seeing both the code and the comment, compared to seeing only the code?


I've actually worked at a place that [...] had the rule "No comments except possibly one before a function explaining the purpose of the function".
Sounds like a footgun factory to me.


Linux is too bloated
Yes, but it has nothing to do with monolithic/microkernel architecture differences, and everything to do with featuritis.

As others have mentioned, a microkernel (with the same range of features supported) would probably be even larger.  You see, just adding the interfaces for optional modules to be compiled in or omitted requires code and data structures, and this applies more to microkernels than monolithic kernels (but it does apply to all code, it just varies by degree).  A good example of this is the Smoothieware (https://github.com/Smoothieware/Smoothieware) firmware for CNC and G-code devices running on LPC17xx/ARM Cortex M3.  It uses (GCC) C++ classes to modularize the firmware, simpler (no privilege boundaries, only namespace boundaries) but conceptually similar to microkernels.  Its binaries are rather large compared to other projects with similar features.  Many users find the modularity worth it, though.
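
The "interfaces cost code and data" point is concrete: making a feature pluggable typically means a table of function pointers plus registration plumbing that a hard-wired call would not need. A generic sketch (the struct and function names are mine, not Linux's or Smoothieware's):

Code: [Select]
#include <stddef.h>

/* The price of modularity: every optional module needs an interface
 * struct, and the core needs registration/dispatch code for it. */
struct module_ops {
    const char *name;
    int  (*init)(void);
    void (*handle)(int event);
};

#define MAX_MODULES 8
static const struct module_ops *modules[MAX_MODULES];
static size_t module_count;

int register_module(const struct module_ops *ops)
{
    if (module_count >= MAX_MODULES || ops->init() != 0)
        return -1;
    modules[module_count++] = ops;
    return 0;
}

void dispatch_event(int event)
{
    for (size_t i = 0; i < module_count; i++)
        modules[i]->handle(event);   /* indirect call: flexibility isn't free */
}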
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 03, 2019, 11:21:08 pm
Its binaries are rather large, compared to other projects with similar features.

I used to get comments like these from customers regarding the optimizer (speed versus size). With a new processor we would make changes, and often those changes were equally good or better on the older processors. The first response was "It's faster, but here's how to make it run slower".
Title: Re: Too many programming languages?
Post by: legacy on December 04, 2019, 03:42:53 pm
The problem is, whenever you see code like that that relies on side effects like post-increment after the test

indeed, *p++ is banned by avionics rule #210  :D
Title: Re: Too many programming languages?
Post by: legacy on December 04, 2019, 03:54:52 pm
Linux is too bloated
Yes, but it has nothing to do with monolithic/microkernel architecture differences, and everything to do with featuritis.

It has to do with *how* you debug the kernel! If it's monolithic, you cannot debug a feature or a module without crashing the whole kernel, and you cannot load just a module inside an ICE either, at least not without overcomplicating things.

With monolithic kernels, when you add a feature and that feature crashes, the whole kernel crashes, and you are not able to investigate further. Things get even worse if you can only debug via kgdb, because it crashes too. You need to reload the kernel and prepare test cases, which consumes a lot of time.

Microkernels have isolation, privilege isolation; you can even put the debugger tap at the highest privilege level, so if a feature crashes you can immediately investigate why, without wasting more time.

To better understand what I mean, try developing with VxWorks (or on Neutrino), and note how fast the whole debugging process goes.
Title: Re: Too many programming languages?
Post by: legacy on December 04, 2019, 04:29:07 pm
As others have mentioned, a microkernel (with the same range of features supported) would probably be even larger.

No doubt microkernels are larger, but only with VxWorks and Neutrino am I able to split features and load one of them inside an ICE, properly mapped to the kernel memory, without any problem  :)

None of them will crash if I make a mistake, whereas if I try to do the same with Linux, the whole kernel crashes at the first mistake. With a microkernel I can unload and reload a feature dynamically and investigate iteratively, while with Linux I have to put the whole kernel inside the ICE and reload it at every attempt, to avoid working on corrupted memory. This was the case with the rb532 (debugged via kgdb, because it's a hobby project and I don't have the money for a better debugging cable anyway): if you remember, when the miniPCI UART module crashed, not only had it previously corrupted a lot of internal kernel structures (this might happen even with a microkernel; it depends on how the MMU can isolate parts), but ... well ... when it crashed I got just a few clues about the reason, most of the information was lost, and I had to speculate ... was it due to this? due to that? Let's reload the whole kernel, let's spend 48 hours to see how it goes, let's verify hypotheses in the dark ...

A lot of time wasted this way.

I had a similar problem with VxWorks at work, and the kernel isolation helped avoid it: only the UART module crashed, not the whole kernel. Result? Fixed in two days instead of two months! Mostly because when the Linux kernel crashes you lose a lot of precious information.
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 04, 2019, 04:50:02 pm
The problem is, whenever you see code like that that relies on side effects like post-increment after the test

indeed, *p++ is banned by avionics rule #210  :D

Banning it altogether may be a bit much, but it is sound. '*p++' is pretty idiomatic and rather clear in itself; the main problem, as I see it and as we talked about earlier, is using it inside a condition.
IMO, the really slippery and unreadable thing here is using conditions with side effects, and that should simply be banned outright.

Single statements like 'var = *p++;' or '*p++ = var;' are OK IMO. 'if (*p++ != 0)' is NOT. 'if (*p++)' is even worse, as IMO a condition should clearly be written as a condition instead of taking shortcuts. The latter would be acceptable (without the '++') if '*p' were a boolean, but only in that case.

And that said, banning '*p++' in any situation would be fine with me. I wouldn't miss it much.

(We have to remember the old days, when C compilers did not optimize much and constructs such as '*p++' would map directly to a post-increment addressing mode on CPUs supporting it, and thus were more efficient. That may have mattered a long time ago, but it's been pretty much irrelevant for probably three decades, as this is a very basic optimization.)
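
To make the distinction concrete, here is the same test written both ways in a small runnable sketch (the names are mine):

Code: [Select]
#include <stdio.h>

static void handle_char(char c) { printf("got %c\n", c); }

static const char *both_ways(const char *p)
{
    /* 1. Side effect buried in the condition: the test reads *and* advances p. */
    if (*p++ != '\n')
        handle_char(p[-1]);

    /* 2. Same behaviour, with the side effect separated from the condition. */
    char c = *p;
    p++;
    if (c != '\n')
        handle_char(c);

    return p;
}

int main(void)
{
    both_ways("ab");   /* prints "got a" then "got b" */
    return 0;
}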
Title: Re: Too many programming languages?
Post by: tggzzz on December 04, 2019, 05:28:35 pm
The problem is, whenever you see code like that that relies on side effects like post-increment after the test

indeed, *p++ is banned by avionics rule #210  :D

Banning it altogether may be a bit much, but it is sound. '*p++' is pretty idiomatic and rather clear in itself; the main problem, as I see it and as we talked about earlier, is using it inside a condition.
IMO, the really slippery and unreadable thing here is using conditions with side effects, and that should simply be banned outright.

Single statements like 'var = *p++;' or '*p++ = var;' are OK IMO. 'if (*p++ != 0)' is NOT. 'if (*p++)' is even worse, as IMO a condition should clearly be written as a condition instead of taking shortcuts. The latter would be acceptable (without the '++') if '*p' were a boolean, but only in that case.

And that said, banning '*p++' in any situation would be fine with me. I wouldn't miss it much.

(We have to remember the old days, when C compilers did not optimize much and constructs such as '*p++' would map directly to a post-increment addressing mode on CPUs supporting it, and thus were more efficient. That may have mattered a long time ago, but it's been pretty much irrelevant for probably three decades, as this is a very basic optimization.)

*p++ isn't really a problem, even inside a conditional.

But things like "*p && *p++ != '\n'" bring in two additional issues: operator binding and sequence points. The latter can be important because, depending on the compiler flags, the C compiler may or may not re-order and optimise out statements in "unexpected" ways.

Those who know the relevant C standard and the (current) compiler intimately might not fall into a trap, but most developers are working under time pressure and aren't that savvy :(
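
For reference, a fully parenthesised rendering of the binding in that expression ('!=' binds tighter than '&&'); the sequencing comment reflects the standard '&&' sequence-point rule, and the function name is mine:

Code: [Select]
/* Parses as: (*p) && ((*p++) != '\n').
 * '&&' evaluates and sequences its left operand before its right, and
 * short-circuits, so the increment runs only when *p is non-zero.
 * The trap for readers is the binding and the hidden side effect,
 * not undefined behaviour -- this particular form is well-defined. */
static const char *skip_to_next_line(const char *p)
{
    while ((*p) && ((*p++) != '\n'))
        continue;
    return p;
}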
Title: Re: Too many programming languages?
Post by: legacy on December 04, 2019, 05:43:23 pm
*p++ is pretty idiomatic and rather clear in itself

Nope, pointer arithmetic as a whole is also banned, except for senior engineers, who are a super special case and only involved in critical code. Clear or not to "common" people (like me; my badge is a simple level 4 ... level 6 is for guests), it was demonstrated that it caused too much trouble with too many people and too much equipment: *p++ can confuse the ICE, as well as humans, and even the artificial intelligence of Stood is usually confused when it sees a *p++ inside a conditional branch during dynamic coverage.

Hence the drastic decision to ban it, once and for all, so the dev guys don't have to fight with testers, the testing team doesn't have to fight with their ICEs, the QA office doesn't have to fight against all of us and against the AI of their Stood program, and the whole team benefits.

Which benefit? Well, we have an aspirin dispenser on each floor; it's located in the infirmary, which is somehow right next to the beverage and coffee machine.  Aspirin is a synthetic compound used medicinally to relieve mild or chronic pain and to reduce fever and inflammation. In our case, it was used to reduce the headaches indirectly caused by the incompatibility between that C language feature and our ICEs and AI.

After banning *p++, the QA guys noticed that the number of aspirins taken dropped, as did the number of coffees consumed at the beverage machine, and they deduced that by banning that C feature they had substantially improved the efficiency and the health of the whole group :D
Title: Re: Too many programming languages?
Post by: ebastler on December 04, 2019, 06:52:22 pm
Nope, pointer arithmetic as a whole is also banned, except for senior engineers, who are a super special case and only involved in critical code.

Double-00 agents, with the "license to point".  :D
Title: Re: Too many programming languages?
Post by: CatalinaWOW on December 04, 2019, 10:54:32 pm
The only thing that exceeds the number of programming languages is the number of opinions about style.  All completely correct of course.  With measurable data to support the opinion.  Which just goes to show that experiment design is as poorly understood on the software side of engineering as it is on the hardware side.
Title: Re: Too many programming languages?
Post by: tggzzz on December 04, 2019, 10:59:49 pm
The only thing that exceeds the number of programming languages is the number of opinions about style.  All completely correct of course.  With measurable data to support the opinion.

Nah. I'm the only person with impeccably good taste.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 04, 2019, 11:24:20 pm
Linux is too bloated
Yes, but it has nothing to do with monolithic/microkernel architecture differences, and everything to do with featuritis.
It has to do with *how* you debug the kernel!
No need to yell.  I'm only saying that Linux being bloated is not due to monolithic architecture.  The architecture does impact a lot of things, including debuggability and long-term maintenance, yes; but binary or run-time size is usually smaller for monolithic kernels compared to microkernels (for the same feature set on the same hardware architecture).

In my experience, all privilege barriers have a run-time cost, too.  The more complex the hardware architecture, the higher the cost; well shown by the mitigations to the recent Intel exploits for example.  It also looks like it is quite hard to get a microkernel architecture to be as efficient/performant as monolithic kernels on current complex processor architectures; see eg. the History chapter in the Wikipedia article on L4 microkernels (https://en.wikipedia.org/wiki/L4_microkernel_family#History).
Title: Re: Too many programming languages?
Post by: Nusa on December 04, 2019, 11:52:32 pm
The problem is, whenever you see code like that that relies on side effects like post-increment after the test

indeed, *p++ is banned by avionics rule #210  :D

Is there a list of these avionics rules accessible to the general public?
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 05, 2019, 12:25:26 am
Which just goes to show that experiment design is as poorly understood on the software side of engineering as it is on the hardware side.
I'd claim the opposite: we're so good at experiment design, we intuitively design the tests to support our preconceptions.

The end result is obviously the same.  :)
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 05, 2019, 01:10:56 am
Which just goes to show that experiment design is as poorly understood on the software side of engineering as it is on the hardware side.
I'd claim the opposite: we're so good at experiment design, we intuitively design the tests to support our preconceptions.

That we are good at it doesn't mean we fully understand it either. ;D
But anyway, although some are basically sound, most of the code "style" rules are only backed by an assorted set of cases in which the world falls apart if you don't follow said rules, each rule feeding the next.

The number one rule of having rules is to be fully comfortable following them (which is why most of us are only comfortable with the rules we have chosen to follow ourselves). Because if you're not, it usually doesn't end well. (And that's true for any kind of rules! ;D )

Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 05, 2019, 04:07:53 am
*p++ is pretty idiomatic and rather clear in itself
Nope, pointer arithmetic as a whole is also banned, except for senior engineers, who are a super special case and only involved in critical code.

There were some architectures where ++/-- were supported by specific inc/dec instructions. However, for the most part this does not make a lot of sense. If you're optimizing critical code, ++/-- gives the optimizer more work to resolve dependencies. In those areas I would be looking to reduce the dependencies, and specifically at the code structure, to remove barriers to optimization.

it was demonstrated that it caused too much trouble with too many people and too much equipment: *p++ can confuse the ICE, as well as humans, and even the artificial intelligence of Stood is usually confused when it sees a *p++ inside a conditional branch during dynamic coverage.

Of course it reduced problems; in today's world we want everyone to pass the test without putting any work in. Lower the competency needed and now every dog can feel smart. Interesting that this is mission-critical code; the critical systems I've worked on quickly weeded out those who took up valuable real estate.

Which benefit? Well, we have an aspirin dispenser on each floor; it's located in the infirmary, which is somehow right next to the beverage and coffee machine.  Aspirin is a synthetic compound used medicinally to relieve mild or chronic pain and to reduce fever and inflammation. In our case, it was used to reduce the headaches indirectly caused by the incompatibility between that C language feature and our ICEs and AI.

After banning *p++, the QA guys noticed that the number of aspirins taken dropped, as did the number of coffees consumed at the beverage machine, and they deduced that by banning that C feature they had substantially improved the efficiency and the health of the whole group :D

Sure, the inept and incompetent were catered to, and this lowered their stress.
Title: Re: Too many programming languages?
Post by: Berni on December 05, 2019, 06:24:25 am
Sure, the inept and incompetent were catered to, and this lowered their stress.

It's not about making life easy for the people who barely know programming; it's actually about making it hard for the people who do know what they are doing to make a mistake.

The people who barely know what programming looks like are certainly everywhere, but they don't even make it through the hiring interview if it's done right. That being said, you are still typically not going to have a team consisting of nothing but epic programmers. Sure, companies like Google or Microsoft can put together such teams for the most important projects, because they are so big, have so many employees, and attract so much talent that they can sift out the top 1% and still end up with a sizable team. Most places don't have this luxury, so they have to make do with the talent they can get for the given wages.

These "okay but nothing special" programmers are more likely to get tripped up by some funky code wizardry, resulting in wasted time debugging a stupid mistake or even worse introducing a bug that goes unnoticed for a long time and then trips up someone else who then has no idea what is going on. Heck even the rockstar programmer might have a bad day, maybe they have been playing games or maintaining there personal git project or whatever until 3 in the morning and are now sleepy at the job, maybe they caught a bad cold and are not feeling that well but still come to work, or maybe they just haven't yet gotten above there minimum coffee threshold. Everyone is still human after all (Even if some people in this profession are strange enough to have you wondering if they even are from this planet).

Having a big ego in programming is usually only a bad thing. Thinking you are the smartest guy on the team, throwing everyone else's ideas out the window, and writing code in a complex way that only makes sense to you is only going to hurt the overall project.

It's not all about being an epic programmer; some social skills are still helpful.
Title: Re: Too many programming languages?
Post by: westfw on December 05, 2019, 07:52:12 am
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

And everything that copied or claimed to copy the PDP/11 instruction set; the survivor is the MSP430.  ARM has this too, I guess (some versions of ARM, anyway).  (But not x86, MIPS, or RISC-V? Results of a quick check; I may not be up to date!)


I think this turned out to be relatively cheap, hardware-wise.  You needed something similar for the PC anyway, and for stack instructions if you have them...
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 05, 2019, 08:17:56 am
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

How could that work for pointers of different sizes: char*, int*, long*, double*, struct*, etc.?
Title: Re: Too many programming languages?
Post by: nfmax on December 05, 2019, 08:59:55 am
Only for operands of Byte and (16-bit) Word type. The pointer register was incremented by 1 or 2 respectively (but R6 & R7 are special and always increment by 2).
Title: Re: Too many programming languages?
Post by: Tepe on December 05, 2019, 09:19:24 am
Yup, the PowerPC has auto-incrementing index opcodes, but the ICE doesn't like this much, for *a lot* of reasons, and, worse still, people can make a mess with the GreenHills C compiler.
Why are you using C at all in those critical projects?
Title: Re: Too many programming languages?
Post by: brucehoult on December 05, 2019, 10:34:13 am
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

And everything that copied or claimed to copy the PDP/11 instruction set; the survivor is the MSP430.  ARM has this too, I guess (some versions of ARM, anyway).  (But not x86, MIPS, or RISC-V? Results of a quick check; I may not be up to date!)

x86 has worse, with {LODS|STOS|MOVS}{_|B|W|D} which can only use SI for loads and DI for stores, and either increments or decrements SI and/or DI depending on a flag in the EFLAGS register.  Not to mention the REP{_|E|Z|NE|NZ} prefixes to make them a 1-instruction loop.

RISC-V has no auto-increment or auto-decrement, not even for the stack. It also has no notion of "the stack" at the base ISA level -- it's only at the software ABI level that a particular stack pointer register is designated.

It's not impossible that in future a "store with writeback of the effective address to the base register" might be added, like in PowerPC. But not loads. Loads might in future get a "form the effective address by adding two registers (maybe with scaling)". Both fit neatly in the "read two registers and write one register" pipeline model, and despite being asymmetric they actually can work together very neatly in a loop, where you often want to read two (or more) values from memory, calculate something, and write one value back, e.g.:

Code: [Select]
  addi rD,rD,-4
  sub rS1,rS1,rD
  sub rS2,rS2,rD
loop:
  lw rA, (rS1, rD)
  lw rB, (rS2, rD)
  add rC,rA,rB
  sww rC, 4(rD) ; mem <- rC, rD <- rD+4
  blt rD,rLimit,loop

At present this code would be:

Code: [Select]
loop:
  lw rA, 0(rS1)
  lw rB, 0(rS2)
  add rC,rA,rB
  sww rC, 0(rD)
  addi rD,rD,4
  addi rS1,rS1,4
  addi rS2,rS2,4
  blt rD,rLimit,loop

This eliminates having to explicitly increment rD, rS1 and rS2 inside the loop, at the cost of an instruction to set each one up before the loop. So the code size is identical, but it helps speed if the loop executes more than once.

sww could use the "spare" opcodes that result from loads having load and load-unsigned (e.g. LW, LWU) but only needing one kind of store. You could use SWU (Store Word with Update) as the mnemonic instead of the SWW (Writeback) above, but that could cause confusion between Unsigned and Update :-)
Title: Re: Too many programming languages?
Post by: legacy on December 05, 2019, 10:56:29 am
Why are you using C at all in those critical projects?

The OS and the firmware are written in C.
The application is written in Ada.
C++ cannot be used (yet), for reasons I haven't understood.
Title: Re: Too many programming languages?
Post by: legacy on December 05, 2019, 11:30:28 am
in today's world we want everyone to pass the test without putting any work in.

Yup. Our automatic and autonomous tests aim for this. But also consider that a part of our "test-cases" is recycled and used to help guys and robots in production to test what they build.
Title: Re: Too many programming languages?
Post by: legacy on December 05, 2019, 03:20:13 pm
new challenge: dev-lang/ghc on a Japanese PDA
ghc-what? give me "H" "a" "s" "k" "e" "l" "l"

Haskell :D
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 05, 2019, 04:14:03 pm
No Rust though? ;D
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 05, 2019, 04:15:10 pm
Yup, the PowerPC has auto-incrementing index opcodes, but the ICE doesn't like this much, for *a lot* of reasons, and, worse still, people can make a mess with the GreenHills C compiler.
Why are you using C at all in those critical projects?

As he said, they are routinely using Ada as well.

But out of curiosity, what language would you use instead?
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 05, 2019, 04:22:23 pm
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

And everything that copied or claimed to copy the PDP/11 instruction set; the survivor is the MSP430.  ARM has this too, I guess (some versions of ARM, anyway).  (But not x86, MIPS, or RISC-V? Results of a quick check; I may not be up to date!)

Yup. And has everyone forgotten about the 68k already? The pre-dec/post-inc addressing modes were handy on those and used a lot. Separating the access and the inc/dec of the address register would instead take significantly more cycles; it was not just a luxury. I remember the earlier C compilers for 68k explicitly mapping the C pre-dec/post-inc statements onto those modes when the size of the pointed-to object allowed it. That was available way before decent optimizers came out, and it was a pretty straightforward task for compilers, not requiring any fancy algorithm.

I think many have just forgotten what it meant to code for < 1 MIPS CISC CPUs with basic tools. We now take for granted what GCC (and most modern compilers) produce, but even with -O0, GCC produces code (for most targets) that is light-years more efficient than the compilers of the early days...
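
The canonical example those early 68k compilers handled well is the classic copy loop; a minimal sketch (the function name is mine):

Code: [Select]
/* On a 68k, '*dst++ = *src++' maps straight onto MOVE.B (A0)+,(A1)+,
 * so even a naive, non-optimizing compiler emitted near-optimal code
 * for this loop. */
static void copy_string(char *dst, const char *src)
{
    while ((*dst++ = *src++) != '\0')
        continue;
}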
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 05, 2019, 08:53:23 pm
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).
And everything that copied or claimed to copy the PDP/11 instruction set; the survivor is the MSP430.  ARM has this too, I guess (some versions of ARM, anyway).  (But not x86, MIPS, or RISC-V? Results of a quick check; I may not be up to date!)
I think this turned out to be relatively cheap, hardware-wise.  You needed something similar for the PC anyway, and for stack instructions if you have them...

Often instructions are cracked, or have group restrictions or other execution-unit dependencies.  So, for example, store-with-update instructions were not used, as the alternatives are equally good and can be better sequenced. Grabbing a stack frame and using a multi-word store to save the non-volatile registers looked good on paper, but the instruction was cracked and slower than the alternatives; it also had a first-in-group restriction.

Unfortunately, a lot of the documentation at this level is under NDA.
Title: Re: Too many programming languages?
Post by: brucehoult on December 05, 2019, 09:05:18 pm
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

And everything that copied or claimed to copy the PDP/11 instruction set; the survivor is the MSP430.  ARM has this too, I guess (some versions of ARM, anyway).  (But not x86, MIPS, or RISC-V? Results of a quick check; I may not be up to date!)

Yup. And has everyone forgotten about the 68k already?

Who could forget it? When I need to do accounting I still use a Mac 68000 accounting program I bought in 1992. In an emulator, of course.

M68000 is clearly "one that copied the PDP/11". It's even somewhat "surviving" in the form of Coldfire, which I see the manual describes as both "variable-length RISC" and "A majority of the instructions are binary compatible or optimized 68K opcodes".

You really can't have both :-)

I've never actually had a look at it before now .. you've got the same D0..7 and A0..7 (which was a good hack to double the PDP-11 register set with still 3 bit register and addressing mode fields). And yes, still has (An)+ and -(An) modes.

Aha. "The ColdFire instruction set is a simplified version of the M68000 instruction set. The removed instructions include BCD, bit field, logical rotate, decrement and branch, and integer multiply with a 64-bit result. Nine new MAC instructions have been added." Ok, wouldn't miss most of those.

And, most vitally .. "doesn't include all the useless fancy crud introduced in the 68020". (I made that quote up)

Leaving out the rotate instructions (ROL, ROR, ROXL, ROXR) is interesting. We get a certain amount of shit with RISC-V not having rotate. But it's not used all that often and is easily implemented with a left shift, a right shift, and an OR.

But, yeah, it's a 68k pretty much. I'm guessing it's only the omission of the DBcc instructions that would make many or most 68k binaries incompatible.

Quote
I think many have just forgotten what it meant to code for < 1 MIPS CISC CPUs with basic tools. We now take for granted what GCC (and most modern compilers) produce, but even with -O0, GCC produces code (for most targets) that is light-years more efficient than the compilers of the early days...

I don't agree with that. Apple's MPW compilers produced pretty good code. Certainly better than gcc -O0 ! THINK C and Pascal were not bad either.

Early compilers for x86 were really awful.

gcc -O1 though, yeah it would beat any of those old compilers.
Title: Re: Too many programming languages?
Post by: legacy on December 05, 2019, 09:17:14 pm
No Rust though? ;D

It depends on LLVM, which has some problem with Catalyst at the moment  :-//
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 05, 2019, 09:24:37 pm
in today's world we want everyone to pass the test without putting any work in.
Yup. Our automatic and autonomous tests aim for this. But also consider that a part of our "test-cases" is recycled and used to help guys and robots in production to test what they build.

My point, though, is that competent programmers should understand basic operators and precedence without needing packets of aspirin and a special senior classification. This is irrespective of the preferred style and coding standards of the specific organization. Some styles of code I feel are much nicer to maintain than others, but style should not prevent a programmer from doing their job, and it should not require a special title or be considered otherwise too difficult.

It reminds me of a discussion I had with a tutor at a local technical college. They stopped teaching C because students were having too much difficulty with the language; they now teach using Python and BASIC. One of his KPIs is the number of students passing. We teach our children by lowering the standards until they too pass the test, rather than having the hard conversation; everyone is a winner, and then we have to put fences in to keep all but a select few out.

There are posts that talk about the 'Rockstar' programmer as if a competent programmer nowadays were something special. The real 'Rockstars' in the team are the ones with domain knowledge.

Getting back to the original question of this long topic, I think it's not the number of programming languages that is the problem. The language is just a tool; some, like 'C', are general purpose and some, like IL, are job specific. A good developer will pick up a new programming language and become productive. The root problem is the number and quality of the programmers. Writing a script on the weekend and calling yourself a programmer is analogous to changing a light bulb and calling yourself an electrician.
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 05, 2019, 09:33:00 pm
No Rust though? ;D

It depends on LLVM, which has some problem with Catalyst at the moment  :-//

I see. Anyway, I'm still not convinced by Rust for a number of reasons.

And now, this last piece seems to be a potential additional reason: https://msrc-blog.microsoft.com/2019/11/07/using-rust-in-windows/
Be very afraid!  :-DD
Title: Re: Too many programming languages?
Post by: legacy on December 05, 2019, 10:46:49 pm
competent programmers should understand basic operators and precedence without needing packets of aspirin and a special senior classification [...] a tutor at a local technical college. They stopped teaching C because students were having too much difficulty with the language

bah, ... I am a bit perplexed. Do students have to deal with complex hardware, ICEs, dynamic coverage, DO178B supersonic certifications, voters, AI, autonomous testing machines, high responsibility for human life, and high psychological pressure due to deadlines to be respected?

Sure, we had too many difficulties with the C language, but these are a different kind of difficulty, part of the product "life cycles" we develop, rather than the "lazy mind-attitude" of your example with students.

Was it not clear? Probably my fault; perhaps I didn't explain it properly :-//
Title: Re: Too many programming languages?
Post by: Tepe on December 05, 2019, 10:54:45 pm
But out of curiosity, what language would you use instead?
I would probably be a bit cautious and lean towards Ada.
Title: Re: Too many programming languages?
Post by: legacy on December 05, 2019, 11:30:09 pm
Quote
competent programmers should understand basic operators and precedence without needing packets of aspirin

The packets-of-aspirin anecdote was not saying that people need aspirin because they do not correctly understand basic operators and precedence, but rather because our ICEs and AI can get so confused by certain features of the C language, combined with the weird avionics testing ecosystem, that people have to find "workarounds" to avoid extra problems. But your task is not finding workarounds. Sure, you might think it's a creative task, and ... well, it is a creative thing the first and second time, but in the long term most people perceive it as stressful and frustrating, especially if you have deadlines to respect.

Due to the human ego, before it became a "banned rule" (like "don't use recursion"), most guys in the dev team refused to stop using pointer arithmetic, even though it is a notorious source of confusion for our testing ecosystem; they probably wanted to underline their competence with the C language, their ego, and as a result my boss continuously asked my squad to edit their files.

You can imagine the extra work for the testing squad. It's like when devs program wild (without respecting anything), and then someone else has to make the code MISRA compliant.

Now I am in the dev squad, but did it make sense? I know the answer from experience.

Hence I appreciate that a simple "starting from now it's banned(1), do not question it" solved the issue; it also improved efficiency and reduced frustration.


(1) Since it's now a rule, "do not use pointer arithmetic", their ego feels comfortable. They know how to use pointer arithmetic, but they are not allowed to, hence if they do not use it, it's not because they do not know the C language, but rather because it's imposed.
Human psychology is ... funny  :D
Title: Re: Too many programming languages?
Post by: tggzzz on December 05, 2019, 11:51:02 pm
It reminds me of a discussion I had with a tutor at a local technical college. They stopped teaching C because students were having too much difficulty with the language; they now teach using Python and BASIC.

A tool should solve real-world problems without introducing unnecessary problems that hinder solving the real-world problem.

So, the college avoided using such problematic tools when teaching neophytes. Good.

Quote
One of his KPIs is the number of students passing. We teach our children by lowering the standards until they too pass the test, rather than having the hard conversation; everyone is a winner, and then we have to put fences in to keep all but a select few out.

That, of course, is objectionable and a real problem.

Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 06, 2019, 12:47:16 am
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.
Title: Re: Too many programming languages?
Post by: tggzzz on December 06, 2019, 08:14:32 am
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.

Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 06, 2019, 04:08:49 pm
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.

Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)

The ability to both keep one's goals in mind when it's most needed, and to assess, even roughly, how much time a given task will take you, is an essential part of engineering indeed. (Heck, it's important in many areas of life, actually.) Lacking this ability would definitely cause you recurring problems in your professional life.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 06, 2019, 08:18:37 pm
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.
Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)
Fully agreed!  ^-^

Fortunately, in real life -- as opposed to exams -- it is easier to find out the objective.  In exams, the lecturer may have any number of different objectives, and it is not always easy to tell which.

The ability to both keep one's goals in mind when it's most needed, and to assess, even roughly, how much time a given task will take you, is an essential part of engineering indeed.
I definitely agree.  (Overstrikes mine, because it really is essential nowadays in just about every task.  Even research.)

I have an unfortunate tendency to underestimate the effort needed for my own part (a personality flaw), and must remember to compensate by a factor of two or so.  It does not affect comparison of tasks, though.  Outside work, in idle chat or discussions like in this forum, I often forget to compensate.

Like I said, I am a researcher/problem solver, not an engineer.
Title: Re: Too many programming languages?
Post by: tggzzz on December 06, 2019, 10:09:02 pm
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.
Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)
Fully agreed!  ^-^

Fortunately, in real life -- as opposed to exams -- it is easier to find out the objective.  In exams, the lecturer may have any number of different objectives, and it is not always easy to tell which.

I find it much more difficult to define "success" in normal life!

Quote
The ability to both keep one's goals in mind when it's most needed, and to assess, even roughly, how much time a given task will take you, is an essential part of engineering indeed.
I definitely agree.  (Overstrikes mine, because it really is essential nowadays in just about every task.  Even research.)

I have an unfortunate tendency to underestimate the effort needed for my own part (a personality flaw), and must remember to compensate by a factor of two or so.  It does not affect comparison of tasks, though.  Outside work, in idle chat or discussions like in this forum, I often forget to compensate.

Like I said, I am a researcher/problem solver, not an engineer.

Many people estimate how long it will take to do a task assuming they will be productive for 100% of that time. In reality many people are only directly productive for 30-40% of the time. Hence a multiplication factor of 2.5-3.3 is appropriate. Or just convert to the next higher unit :)
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 06, 2019, 11:08:14 pm
It does not affect comparison of tasks, though.

Comparing tasks relative to each other is not helpful, though, when you need to decide which one to tackle, unless of course you have a hard rule of always picking the one which will take the least time (or the most).

Like in the "exam" example, you may have perfectly identified that one of the questions was definitely the hardest one, but you may still have chosen to pick it because you still felt you could manage in the available time.

The same can happen when working on projects. You may have a choice to make between two options, one you know for sure is harder than the other, but you could still think it's manageable (and of course a lot more interesting)... This is in fact part of a manager's nightmare. ;D Oh, and more often than not, the person the manager let take this decision will end up resenting the manager for allowing it, if it ever fails or the schedule slips awfully... (yes, engineering management can be tough!)


Title: Re: Too many programming languages?
Post by: Nominal Animal on December 06, 2019, 11:26:28 pm
Comparing tasks relative to each other is not helpful, though, when you need to decide which one to tackle, unless of course you have a hard rule of always picking the one which will take the least time (or the most).
Not just a hard rule, it can be useful in other situations too.  For example, if you know that one of the tasks is intentionally too complex to perform in the given time, or when you need to split a set of tasks among a varied group of workers.

You may have a choice to make between two options, one you know for sure is harder than the other, but you could still think it's manageable (and of course a lot more interesting)...
Yup, a classic footgun situation.  Reliable absolute estimates are much more often needed than comparisons.

I find it much more difficult to define "success" in normal life!
True; I associate that sort of thing with loved ones and carrying meaningful responsibilities reliably, but other than that, I wouldn't much bother with the definition.

(Above, I meant "successfully" only in the sense of not failing horribly, causing wasted time and resources.)

Many people estimate how long it will take to do a task assuming they will be productive for 100% of that time. In reality many people are the only directly productive for 30-40% of the time. Hence a multiplication factor of 2.5-3.3 is appropriate. Or just convert to the next higher unit :)
I was once advised to use Pi as the factor. :)

Also, I'm one of the people who do not multitask well, and have had to learn to be specific and clear about priorities and timetables.  It is not always clear from the context whether you're asked to switch to do thing B first, to do things A and B at the same time, or to do A first and then B.  This bit me badly when I was younger.

Which, by the way, gets us back to programming languages.

As I've mentioned before, I'd like to have a C-like programming language, but tailored more closely to the types of processors and microcontrollers we actually have.  One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.  This is somewhat useful in initialization loops, but would be really useful in hard optimization of complex non-loop math operations (typical in certain types of simulations, especially force-field models in chemistry).

I really liked the Fortran 95 FORALL loop (where the iteration order is undefined), but they're deprecating that in future Fortran standards.  :(

While I don't like many (most?) of the current programming languages, every one of them has provided further insight into which features benefit a programming language and which detract from it.  So, while there are lots of languages I would recommend against using when writing widely-used tools or end-user applications, I do see them having a valid use in research and experimentation: without them, our future programming languages would be worse.
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 06, 2019, 11:58:07 pm
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.

You'd probably need to be more specific though. At high optimization levels, good C compilers already re-order operations when they can safely do so. You probably had something more advanced in mind; would you care to give an example?

I really liked the Fortran 95 FORALL loop (where the iteration order is undefined), but they're deprecating that in future Fortran standards.  :(

I've seen that kind of loop in ParaSail (the language I created a thread for - didn't get much traction), I think. Of course it's just a curiosity more than anything else at the moment.

For the little I've explored with OpenMP, I think you can definitely do that with it. It will parallelize iterations for things that don't depend on one another. OK, OpenMP is more an "extension" than an integral part of a language.
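For illustration, a minimal sketch of that usage (the function and names are made up; compile with e.g. gcc -fopenmp): each iteration touches only its own element, so the runtime is free to split and order the iterations however it likes.
Code: [Select]
#include <stddef.h>

/* No cross-iteration dependencies: OpenMP may chunk, order, and
   run the iterations across threads however it sees fit. */
void scale(double *restrict a, const double *restrict b, double k, size_t n)
{
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        a[i] = k * b[i];
}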
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 07, 2019, 01:01:30 am
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.
You'd probably need to be more specific though. At high optimization levels, good C compilers already re-order operations when they can safely do so. You probably had something more advanced in mind; would you care to give an example?
I'd like to allow reordering and especially interleaving of operations, ignoring sequence points and the order of side effects.  (If you consider C++ atomics and their memory orderings, you'll immediately see how this is essentially extending that model to sequence points and the order of side effects.)
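(For reference, a minimal C11 sketch of what I mean by that: atomics already give you a per-operation choice of ordering; I want the same kind of choice for ordinary side effects.)
Code: [Select]
#include <stdatomic.h>

atomic_int x, y;

void publish(void)
{
    /* relaxed ordering: the compiler and CPU may reorder these
       two stores with respect to each other */
    atomic_store_explicit(&x, 1, memory_order_relaxed);
    atomic_store_explicit(&y, 2, memory_order_relaxed);
}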

Essentially, I'd like to be able to tell the compiler that if I have code like
    {
        v1 = complex_expression_one();
        v2 = complex_expression_two();
    }
the compiler is allowed to evaluate the contents of the block in parallel (interleaved, using a single thread of execution), completely ignoring sequence points or the order of side effects.

These code segments are rare, but very hot, and I'd like to also annotate them so the compiler knows to do lots of extra work to optimize this particular chunk of code, even as far as brute-forcing some details.  Technically, you could do that via attributes or pragmas, but I'd like the language to natively support it.
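To make the intent concrete, something like the following -- the pragma is entirely hypothetical, no compiler implements it; it only illustrates the kind of annotation I mean:
Code: [Select]
    /* HYPOTHETICAL syntax: no such pragma exists in C or in GCC. */
    #pragma unsequenced  /* side effects below may be reordered/interleaved */
    {
        v1 = complex_expression_one();
        v2 = complex_expression_two();
    }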

I really liked the Fortran 95 FORALL loop (where the iteration order is undefined), but they're deprecating that in future Fortran standards.  :(
I've seen that kind of loop in ParaSail (the language I created a thread for - didn't get much traction), I think. Of course it's just a curiosity more than anything else at the moment.
Right.  It is in the same category as memrep(buffer, offset, length), the complement of memcpy()/memmove(), in the sense that it fills buffer with repeated copies of its first offset bytes.  (It is the inverse of memcpy() with respect to memmove(), implementation-wise.  It is very useful for initializing arrays of structures and/or floating-point members, since only the storage representation matters.  While you can trivially implement it yourself, it should be something the compiler can optimize for the target architecture; i.e., in GCC, a built-in function, and not just a library function.)
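For concreteness, the naive library version of that idea -- exactly the trivial implementation a compiler built-in should beat on any given target (the name and signature follow the description above; this is not a standard function):
Code: [Select]
#include <string.h>

/* Fill the first 'length' bytes of 'buffer' with repeated copies
   of its first 'offset' bytes, by doubling the initialized prefix.
   All memcpy() source/destination ranges are non-overlapping. */
static void *memrep(void *buffer, size_t offset, size_t length)
{
    unsigned char *const p = buffer;
    size_t have = offset;

    if (offset == 0 || length <= offset)
        return buffer;

    while (have <= length - have) {
        memcpy(p + have, p, have);  /* double the filled prefix */
        have += have;
    }
    if (have < length)
        memcpy(p + have, p, length - have);  /* final partial copy */

    return buffer;
}
Typical use: write one double (or one struct) at the start of the array, then memrep(array, sizeof array[0], count * sizeof array[0]).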

For the little I've explored with OpenMP, I think you can definitely do that with it.
OpenMP parallelizes the loops with multiple threads.  That's not what I am talking about.  I'm talking about generating machine code that interleaves several operations on superscalar architectures with enough registers.

Also, like I said, I want this even closer to the metal than C, and OpenMP is quite a complicated abstraction, with a lot of hidden costs in the thread management.  When writing kernels, or simulators using MPI and a fixed number of threads per node (which is typical), OpenMP is a square peg in a round hole.

Does anyone know of a programming language that exposes the flag register after operations or function calls?  The standard flags (Zero, Carry, Negative, Overflow) would be rather useful.  I do realize that it would require emulation on MIPS and RISC-V at least, as they don't have them in a dedicated register.
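(The closest thing I've found in C: the GCC/Clang overflow builtins, which on most targets compile straight down to a flags test.  A sketch, with made-up wrapper names:)
Code: [Select]
#include <stdbool.h>
#include <stdint.h>

/* For unsigned operands, "overflow" is exactly the Carry flag.
   __builtin_add_overflow() is a real GCC/Clang builtin. */
static inline bool add_carry(uint64_t a, uint64_t b, uint64_t *sum)
{
    return __builtin_add_overflow(a, b, sum);  /* true => carry out */
}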
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 07, 2019, 01:10:52 am
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.

Compile-time optimization can only go so far in determining 'best', and best varies depending on resource availability. FDPR is much more interesting and potentially a better direction to explore and extend. I've used FDPR primarily for its quick win with branch prediction, but it, along with tools like CTRACE, can give a lot of insight into the run-time behavior.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 07, 2019, 01:43:03 am
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.
Compile-time optimization can only go so far in determining 'best'
In this case 'best' is defined as using the minimum number of cycles in the operating thread, including statistical latency estimates, possibly interleaving the operations.  These are extremely rare but extremely hot (typically simulation calculations; I do not recall seeing them in application-type software). Like I said, I'd love the compiler to even brute-force the various implementation possibilities.  I could even accept if this was only possible for native compilation, and disabled for cross-compilation.

A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant, typical in initialization.  There, the intent is for the compiler to choose how to implement the machine code; to allow optimization if such is available on the target architecture.  Currently, in C, the sequence points and therefore the order of side effects are dictated by the standard, and the compiler is quite restricted in its ability to reorder operations (and still stay standards-compliant).

Neither of these involve branches (conditionals sometimes yes, but only very rarely branches), so runtime profiling is useless here.
(The code profiling above, for brute-forcing different implementations, is done as part of the compilation, and there a statistical answer suffices; no code needs to be "run" at all.)
Title: Re: Too many programming languages?
Post by: GeorgeOfTheJungle on December 07, 2019, 01:55:15 am
A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant

I think one way to do that is with , instead of ;

If you write complex_expression_one(); complex_expression_two(); _one()'s side effects must have happened before executing _two(), but if you write instead complex_expression_one(), complex_expression_two(); you're telling the compiler that it doesn't matter. I might be wrong as always, though.
Title: Re: Too many programming languages?
Post by: blacksheeplogic on December 07, 2019, 03:15:35 am
In this case 'best' is defined as using the minimum number of cycles in the operating thread, including statistical latency estimates, possibly interleaving the operations.

Perhaps I'm not quite understanding your requirement here or where/how you see this being done without some form of execution analysis. But I might also be too narrowly focused as I was primarily involved with (and spent most of my time doing) instruction sequencing which does not require execution. Also, parallelism would look at functional blocks and it's not clear if that's what you are referring to by operation. It's an area I had little to no involvement in.

A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant, typical in initialization.  There, the intent is for the compiler to choose how to implement the machine code;

I'm a little lost here. Instruction sequencing (which includes scheduling, dependencies, instruction groups, available execution units, store queues, etc.) is considered during optimization.  Again, you seem to be using operations in a broader context so my response may be too narrow.

Neither of these involve branches (conditionals sometimes yes, but only very rarely branches), so runtime profiling is useless here.

Perhaps, but as I said my use of/interest in FDPR was primarily to look at branching. Run-time analysis can provide a lot more insight than that use case, even though it is not fed back to the optimizer.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 07, 2019, 04:23:24 am
A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant
I think one way to do that is with , instead of ;
No, comma is a sequence point in C: the side effects of the left side must be complete before the side effects of the right side start.

However, there is one workaround with GCC: splitting individual operations into (often static inline) functions marked pure or const (https://gcc.gnu.org/onlinedocs/gcc/Common-Function-Attributes.html#Common-Function-Attributes), and using local scope const variables to hold the temporary results.  Then, GCC knows there are no side effects, and is free to reorder and interleave the machine code implementation.  For example:
Code: [Select]
static inline double op1(const double arg1, const double arg2) __attribute__((const));
static inline double op1(const double arg1, const double arg2)
{
   /* Implementation depending only on arg1 and arg2; does not change any variables */
}

static inline double op2(double arg1, double arg2, double arg3) __attribute__((pure));
static inline double op2(double arg1, double arg2, double arg3)
{
   /* Implementation depending on arg1, arg2, arg3, and optionally global variables,
       but may only modify its local copies of arg1, arg2, arg3. */
}
then a sequence similar to say
Code: [Select]
    /* double arg1, arg2, arg3; */
    double result;
    {
        const double temp1 = op1(arg1, arg2);
        const double temp2 = op1(arg1, arg3);
        const double temp3 = op2(arg1, arg2, arg3);
        result = op2(temp1, temp2, temp3);
    }
has maximal opportunities for optimizing the code used to calculate result, since GCC knows the only side effect is the value of result itself.

(It might be better to declare result in the same scope with current versions of GCC; the above code is for illustration only, and not from a real world project.)

However, as you can see, when splitting a complicated formula, say something like the ones used in the embedded atom model (https://en.wikipedia.org/wiki/Embedded_atom_model) (EAM, often used in molecular dynamics simulations involving metals), the code does become completely undebuggable and write-only.  Especially so when you vectorize the calculation (for two or four pairs of atoms) using SSE or AVX.

After all, we could write the code in assembly right now.  It just isn't worth the time and especially maintenance effort.  I believe, but do not know for sure, that a new low-level programming language not completely different to C, could solve this and other system-level programming issues.

Another problem occurs when the calculation does have side effects (for example, you keep a min/max tally, reorder the neighbour list to keep pairs within the interaction distance at the beginning of the slice for a particular atom, and so on), but the programmer knows their order does not matter.  I can cheat by implementing the function calls in a separate compilation unit while declaring them const/pure in the compilation unit they're used in, but then the compiler cannot inline the machine code.

I'd rather not write an assembly generator and test harness to implement optimized versions of these hot code paths, as they're not worth it -- maintainability is much more important than the few percentage points of speed increase.  But, if there were a low-level language in which such concepts could be expressed, with the compiler itself understanding the SIMD vector concept and atomic ops (including support for both CAS and LL/SC, and preprocessing/choosing an implementation at compile time depending on which one is used), this kind of high-performance code might become somewhat more readable.

This is not, however, just for HPC, but also for low-level libraries, and kernel or freestanding code.  Stuff like privilege boundaries (like between the kernel and userspace, and between userspace processes) needs atomics and exact control of when side effects are to be visible.  Only the compiler knows exactly which hardware architecture and processor type the code is compiled for (and ELF binaries at least have a native way for functions to provide the linkage for other functions at run time), so all this stuff really needs to be part of the language provided by the compiler, and not a "library" in the C sense.

I do not know the "best" feature set myself yet, but looking at other languages (and especially their syntax: how they express intent, side effects, and so on), gives new ideas to me at least.

A similar-ish issue exists in userspace programming, especially GUI applications.  Desktop environments and GUIs seem to be best implemented using an event-based approach, and threads and queues can be used to make responsive, efficient applications.  GTK+ shows that you do not need to have an OOP language to implement one, even C suffices, but Python shows that you can do it in simple, concise code.  (Python's downside is that it has slowish I/O, and is an interpreted language where the standard interpreter can only run bytecode in one thread at a time; I'd much rather see a compiled language with similar features, and a modular, lightweight runtime.)
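For instance, a complete event-based GTK+ program in plain C is about this small (a sketch against the GTK+ 2/3 API; gtk_main() just dispatches events to the connected callbacks):
Code: [Select]
#include <gtk/gtk.h>

/* Called by the main loop whenever the button emits "clicked". */
static void on_clicked(GtkWidget *widget, gpointer data)
{
    g_print("clicked\n");
}

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *btn = gtk_button_new_with_label("Press me");

    g_signal_connect(btn, "clicked", G_CALLBACK(on_clicked), NULL);
    g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_container_add(GTK_CONTAINER(win), btn);
    gtk_widget_show_all(win);
    gtk_main();  /* the event loop: everything else is callbacks */
    return 0;
}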

In this case 'best' is defined as using the minimum number of cycles in the operating thread, including statistical latency estimates, possibly interleaving the operations.
Perhaps I'm not quite understanding your requirement here or where/how you see this being done without some form of execution analysis.
No, I thought you were referring to standard runtime analysis tools, which instead of analysing the machine code, simply run it, measuring the time taken, sampling the instruction pointer.

On superscalar processors, C's sequence points (in the presence of side effects) mean that the pipelines are not fully utilized, especially for simple code.
I would like to mark code so the compiler gets to ignore the order of side effects, and thus sequence points altogether, and try its hardest (even use extraordinary amounts of time) to optimize these rare code sequences.

Also, parallelism would look at functional blocks and it's not clear if that's what you are referring to by operation.
I am referring to parallelism; specifically, instruction-level parallelism on superscalar architectures, not thread- or process-based parallelism, for chunks of code whose order of execution or order of side effects are irrelevant for the correctness of the program; that being up to the programmer, and not for the compiler to detect.

 (I do not know how much you know about the subject, and anyway always try to write in a way that keeps anyone interested in the subject along with the ride.  So, please do not read my tone as condescending or anything.  I have no subtext, me no English wrote that good.)

A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant, typical in initialization.  There, the intent is for the compiler to choose how to implement the machine code;
I'm a little lost here. Instruction sequencing (which includes scheduling,  dependencies, instruction groups, available execution units, store queues etc) are considered during optimization.
Yes, but the first problem occurs at a higher stage.  For example, how do you tell the compiler that you have two or more program sequences that may be executed in any order, even simultaneously (instruction-level parallelized on superscalar architectures, not using different threads of execution)?

The secondary issue is that when you have a very hot (takes much of the total CPU time) set of e.g. mathematical operations, there are many different ways to turn that sequence of operations into machine code.  A brute-force (i.e., trial-and-error) method applied to the entire program would make the compiler insufferably slow, but it might be worth spending such effort on specifically marked blocks of code.  I'm not even sure how one would implement such an optimizer, but I'm pretty sure it'd have to include generating variants of the relevant abstract syntax tree, and so on.

The mechanism used for this could also be used to e.g. tell the compiler that code doing graphics with single-precision floating-point numbers does not need to be fully IEEE-754 compliant (only finite normal values, and errors larger than 0.5 ULP allowed), while some other code doing e.g. statistics in the same compilation unit does need to be.  (I know: why would you have both in the same compilation unit in the first place? But this is just the example that popped into my mind.)
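(GCC can approximate the per-function version of this today with the optimize attribute -- a real attribute, though the manual cautions it is intended mainly for debugging, and support is patchy.  A sketch:)
Code: [Select]
/* Relaxed floating point for this graphics helper only; the rest
   of the compilation unit keeps strict IEEE-754 semantics. */
__attribute__((optimize("fast-math")))
static float approx_length(float x, float y)
{
    return __builtin_sqrtf(x*x + y*y);
}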

Neither of these involve branches (conditionals sometimes yes, but only very rarely branches), so runtime profiling is useless here.
Perhaps, but as I said my use/interest of FPDR was primarily to look at branching. Run-time analysis can provide a lot more insight than that use case even tho it is not feed back to the optimizer.
Sure, I didn't mean to imply it is not useful; again, just that run-time analysis in this particular case shows that the block(s) of code at hand are indeed very "hot", with well over 50% of the CPU time spent there.

Optimization is kind of a funny problem: trying to find an efficient expression, without using excessive resources (especially time) while doing so.  There are some very rare, often complex but branchless, chunks of code I'd like the compiler to go beast-mode on when generating the code, because I already know that the CPU will spend most of its time in those particular chunks of code, no matter what the dataset.

I wonder how many optimization methods have been ruled out as inapplicable because they were deemed too slow in practice?  Or how many have not been explored, because they are known not to scale well enough to be practical?
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 07, 2019, 04:26:32 pm
A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant
I think one way to do that is with , instead of ;
No, comma is a sequence point in C: the side effects of the left side must be complete before the side effects of the right side start.

Yep.

Anyway, regarding the order of execution, I get your point (sort of), but I think it can get hairy real fast. Whereas the "simple" cases (in which a compiler can safely infer that operations can be re-ordered because there are no dependencies between them, and no "side effect" in the sense that said side effect is unknown - external calls, for instance) are handled pretty well by most decent optimizing compilers, the more complex cases can be VERY hard to handle. A complete optimization (in the sense that you'd get the *optimal* execution) is, I think, an NP-hard problem.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 07, 2019, 06:55:25 pm
Anyway, regarding the order of execution, I get your point (sort of), but I think it can get hairy real fast. Whereas the "simple" cases (in which a compiler can safely infer that operations can be re-ordered because there are no dependencies between them, and no "side effect" in the sense that said side effect is unknown - external calls, for instance) are handled pretty well by most decent optimizing compilers, the more complex cases can be VERY hard to handle. A complete optimization (in the sense that you'd get the *optimal* execution) is, I think, an NP-hard problem.
Yes, and this is exactly why I'd like a low-level language with some kind of annotation to mark those rare chunks of code.

In a very real sense, this is a human problem, and not a machine one.  What kind of syntax is needed for humans to intuitively indicate these things?

One problem is that text is essentially a one-dimensional stream for us humans.  I don't like visual programming, which just simplifies things even further, but working in environments like Macromedia Director over two decades ago showed me that by attaching textual code (Lingo (https://en.wikipedia.org/wiki/Lingo_(programming_language))) to visual elements, even non-programmers can easily grasp event-based programming.  (Although it is described as OOP, at the core it is much more about events than objects.)

I believe the way many experienced programmers use editors -- having several windows of code -- is related; that is, a way to overcome the one-dimensional nature of text.  Is there a syntax that would help?  Or do we simply need better editors?  Or is there a syntax that is human-readable, but at the same time allows much better editors?  Could we embed code within a markup language for this; would it work for humans, cognitively?

I am personally comfortable with both indent-based (Python) and delimiter-based (C, C++) code scoping, but only the latter gives an obvious way to attach compiler directives or attributes to the code block (by extending the delimiter syntax).  However, one of the new experimental languages might have found an even better way, and this is a big part of why I do not mind there being a lot of programming languages: programmers may discover interesting ways to do things.

An odd wrinkle to this is that when familiarising oneself with a new project, or examining one, the linear one-dimensional stream is basically the easiest, most effective way.  Our books (both dead tree and electronic) are essentially one-dimensional, and we use those when we learn stuff.  However, when editing and modifying, the linear one-dimensional stream is a hindrance -- at least if you are one of those who use several windows to the same codebase when editing.  Other than developing new programming languages, and creating new abstractions in an effort to move programming away from the hardware and "closer to human ways of thinking", we haven't really explored the human cognitive aspects of programming for low-level languages; we still use the same ones we used thirty, forty years ago!

Personally, I do not mind using different programming languages in the same project.  I would much prefer having a nice simple event-based language to implement graphical user interfaces, and a low-level/systems programming language to implement the heavy-duty data mangling part.  I kinda like Python and C for this, especially because you can keep the Python part open source (even if proprietary; for users to edit or fix visual issues), and the secret sauce hard work in the C part, and have the entire project surprisingly easy to port between the current operating systems and architectures.  For intermediate-level stuff, like daemons and services dealing with passing data and privilege boundaries, a yet another language (likely OOP like C++), could be useful.
Title: Re: Too many programming languages?
Post by: tggzzz on December 07, 2019, 08:52:21 pm
Many people estimate how long it will take to do a task assuming they will be productive for 100% of that time. In reality many people are the only directly productive for 30-40% of the time. Hence a multiplication factor of 2.5-3.3 is appropriate. Or just convert to the next higher unit :)
I was once advised to use Pi as the factor. :)

Also, I'm one of the people who do not multitask well, and have had to learn to be specific and clear about priorities and timetables.

Pi works well, but it is nice to have at least a little justification for using it :)

Far more people think they can multitask than can actually multitask effectively.

Not being specific about priorities is at best a sign of a confused mind, and at worst a sign that somebody is being set up as the project scapegoat.
Title: Re: Too many programming languages?
Post by: tggzzz on December 07, 2019, 08:59:55 pm
While I don't like many (most?) of the current programming languages, every one of them has provided further insight into which features benefit a programming language and which detract from it.  So, while there are lots of languages I would recommend against using when writing widely-used tools or end-user applications, I do see them having a valid use in research and experimentation: without them, our future programming languages would be worse.

I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

Nowadays there is a similar point to be made about languages. Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.
Title: Re: Too many programming languages?
Post by: tggzzz on December 07, 2019, 09:05:16 pm
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.

You'd probably need to be more specific though. At high optimization levels, good C compilers already re-order operations when they can safely do so.

Not quite true. It is more helpful to state "...when the C language specification indicates they can safely do so and the programmer understands the C language specification and the programmer has appropriately written C code".

Yes, there are quite a few real and practical pitfall traps there.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 07, 2019, 10:17:26 pm
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.
For the same reason, I'd like a mid-level systems language (for services and daemons) with a heavy emphasis on asynchronous I/O, queues, message passing, basic lockless atomic ops (CAS or LL/SC), and thread-local state (as opposed to global state).  (Essentially the same thing, I think, but different emphasis or approach.)

At this point, even a router (at least my Asus RT-AC51U running OpenWRT trunk) has a couple of dozen processes, most of which are single-threaded, but almost all would benefit from async I/O and message passing (events, state changes at minimum).  The multithreaded ones (services!) even more so.

It seems to me that heterogeneous processing is becoming more and more common.  I've watched with interest how the "packet filtering" mechanisms in the Linux kernel -- essentially a minimal bytecode to examine data and make decisions based on it, passed from an unprivileged context, verified, then executed in a privileged context -- have evolved, and something very much like that will eventually end up in one of the embedded network-I/O-gadget processors.  Definitely a superior approach to IP filtering, for example.
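For a concrete taste of that mechanism, here is a classic BPF filter (the old cBPF, not eBPF) that an unprivileged process hands to the kernel via setsockopt(); the kernel verifies the bytecode before running it in privileged context.  A minimal sketch for a packet socket, error handling omitted:
Code: [Select]
#include <linux/filter.h>
#include <sys/socket.h>

/* Accept only IPv4 frames (EtherType 0x0800), drop everything else.
   On an AF_PACKET socket, offset 12 is the EtherType field. */
static struct sock_filter ipv4_only[] = {
    BPF_STMT(BPF_LD  | BPF_H   | BPF_ABS, 12),          /* load EtherType */
    BPF_JUMP(BPF_JMP | BPF_JEQ | BPF_K, 0x0800, 0, 1),  /* IPv4?          */
    BPF_STMT(BPF_RET | BPF_K, 0xFFFFU),                 /* accept packet  */
    BPF_STMT(BPF_RET | BPF_K, 0),                       /* drop packet    */
};

static int attach_ipv4_filter(int fd)
{
    struct sock_fprog prog = {
        .len    = sizeof ipv4_only / sizeof ipv4_only[0],
        .filter = ipv4_only,
    };
    return setsockopt(fd, SOL_SOCKET, SO_ATTACH_FILTER, &prog, sizeof prog);
}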

So, I do think we need several programming languages, to cater for the different contexts.

there are quite a few real and practical pitfall traps [in reordering operations when compiling C code].
Especially when there are side effects whose order does not matter at all.  I really haven't found a good way to tell that to even GCC.

Because of this discussion, I did think of one possibility: writing static inline helper functions marked pure, implementing basic atomic operations (to storage only accessed atomically) in inline assembly -- essentially cheating, telling the compiler there are no side effects.  This would actually suffice, as these functions do modify the program state, but at an unspecified time at any point of the scope they are used in, so the compiler is free to reorder them as it sees fit within that scope.  Good enough for me!

This also indicates to me that such annotation is not needed for arbitrary expressions, only for functions and (entire) scopes.  Surprising!
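A sketch of that cheat for x86-64 GCC (names made up; note it deliberately lies to the compiler, so e.g. a call whose result is unused may be dropped entirely -- the result must always be consumed):
Code: [Select]
/* Atomic fetch-and-add in inline asm, falsely declared pure so GCC
   may reorder/interleave calls freely within the scope.  The lie is
   deliberate: the update happens at some unspecified point in the
   enclosing scope, which is exactly what is wanted here. */
static inline long tally_add(long *addr, long value) __attribute__((pure));
static inline long tally_add(long *addr, long value)
{
    long old = value;
    __asm__ ("lock xaddq %0, %1" : "+r" (old), "+m" (*addr) : : "cc");
    return old;  /* previous value of *addr */
}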
Title: Re: Too many programming languages?
Post by: brucehoult on December 07, 2019, 10:47:47 pm
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

That's true now, but in 1978?

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

Everything else was restricted to writing in assembly language, or in a high level language and using a bytecode or token interpreter which enabled reasonably compact programs, but that ran 10 or 20 times slower than the (already slow) machine was capable of.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 07, 2019, 11:32:54 pm
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.
That's true now, but in 1978?
Perhaps the interviewer didn't care about technical accuracy, but the attitude/approach to tools?

I mean, you probably have asked an interviewee which programming language they'd use to solve problem X, then which language they like best, and drawn conclusions about whether they answer with the same programming language or environment for both, instead of which particular languages they mention?  ;)
Title: Re: Too many programming languages?
Post by: Nusa on December 08, 2019, 12:06:07 am
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

That's true now, but in 1978?

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

Everything else was restricted to writing in assembly language, or in a high level language and using a bytecode or token interpreter which enabled reasonably compact programs, but that ran 10 or 20 times slower than the (already slow) machine was capable of.

C didn't have a lot of penetration in 1978 (first K&R edition was published that year) except in Bell Labs and university groups. Pascal was more established, but not for micros. Assembly was still the way to go for most at that time. Development environment and tools were primitive compared to today, but they still mattered.
Title: Re: Too many programming languages?
Post by: westfw on December 08, 2019, 02:24:19 am
Quote
That's true now, but in 1978?
I was going to mention that...
BDS C for CP/M didn't come out until 1979, Turbo Pascal not until 1983. Still, I guess some chips had both integer AND floating-point BASICs, and the bigger CP/M systems were advertising Cobol, Fortran, and Pascal support, and ASM quality (as well as BASIC quality) was pretty variable (all of which were rather expensive, BTW.) There were also mainframe-based cross compilers and assemblers for some of the micros, which was probably pretty helpful in some cases.  All of my early code was compiled on something relatively big and expensive (including the 8086 SDK Senior Design Project in ~1980 - I think I used a cross-assembler on a "big" CP/M system.  8086 penetrated really slowly.  It wasn't until IBM did the PC and Andy chose the 68k for the Sun-1 (both in ~81) that things really started to take off.)
Title: Re: Too many programming languages?
Post by: brucehoult on December 08, 2019, 06:48:30 am
Quote
That's true now, but in 1978?
I was going to mention that...
BDS C for CP/M didn't come out until 1979, Turbo Pascal not until 1983. Still, I guess some chips had both integer AND floating-point BASICs, and the bigger CP/M systems were advertising Cobol, Fortran, and Pascal support, and ASM quality (as well as BASIC quality) was pretty variable (all of which were rather expensive, BTW.) There were also mainframe-based cross compilers and assemblers for some of the micros, which was probably pretty helpful in some cases.  All of my early code was compiled on something relatively big and expensive (including the 8086 SDK Senior Design Project in ~1980 - I think I used a cross-assembler on a "big" CP/M system.  8086 penetrated really slowly.  It wasn't until IBM did the PC and Andy chose the 68k for the Sun-1 (both in ~81) that things really started to take off.)

That's for full-strength languages, yeah.

Wirth's "Algorithms + Data Structures = Programs" was out in 1976 and contained a byte code compiler for a simple subset of Pascal in the back. I'm sure a lot of people typed that into a bigger computer and made an interpreter for the bytecode on a micro. I know I did.

UCSD Pascal came out in 1977. A full Pascal running *on* the micro, built on a portable byte-code interpreter written in assembly language for various micros. Apple started selling it for the Apple ][ in 1979. It made a fairly usable self-hosted system even at 1 MHz if you had the full 64 KB of RAM and two floppy disk drives. I certainly wrote quite a few medium-sized programs using it. You could call out to assembly language for parts where speed was critical, and the assembler and linker were integrated.

You really can't do much better than that for the 6502 for large programs unless you have a *really* sophisticated compiler, which I don't think anyone has ever written anyway.
Title: Re: Too many programming languages?
Post by: tggzzz on December 08, 2019, 08:38:43 am
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.
For the same reason, I'd like a mid-level systems language (for services and daemons) with a heavy emphasis on asynchronous I/O, queues, message passing, basic lockless atomic ops (CAS or LL/SC), and thread-local state (as opposed to global state).  (Essentially the same thing, I think, but different emphasis or approach.)

Yes, but I would be content with a language that enabled the above and came with a standard library of components implementing it. That way people would be guided towards doing things properly, while still allowing capabilities to be easily expanded later. In that vein, Java enabled Doug Lea to produce his wonderful[1] concurrency library, which was later incorporated into the Java standard libraries.

Never had the chance to use Erlang :(

[1] because it stole the good concepts that had been developed and field-proven over the decades, and implemented them in a form that was easily usable. But then "reuse proven good ideas" was at the heart of the Java culture, cf. C++'s continual reinvention of oblong wheels.

Quote
At this point, even a router (at least my Asus RT-AC51U running OpenWRT trunk) has a couple of dozen processes, most of which are single-threaded, but almost all would benefit from async I/O and message passing (events, state changes at minimum).  The multithreaded ones (services!) even more so.

Multithreading is overused.

Frequently it is better to have a small number (i.e. one per core) of "worker threads" that (asynchronously) pick "jobs" from an input queue,  process the job and (synchronously) put the result in another queue.
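For instance (a minimal pthreads sketch of that pattern; names are made up, and initialization, shutdown and error handling are omitted):
Code: [Select]
#include <pthread.h>

struct job {
    struct job *next;
    void (*run)(void *arg);   /* the work itself */
    void *arg;
};

struct queue {
    pthread_mutex_t lock;
    pthread_cond_t  nonempty;
    struct job *head, *tail;
};

static void queue_push(struct queue *q, struct job *j)
{
    j->next = NULL;
    pthread_mutex_lock(&q->lock);
    if (q->tail) q->tail->next = j; else q->head = j;
    q->tail = j;
    pthread_cond_signal(&q->nonempty);
    pthread_mutex_unlock(&q->lock);
}

static struct job *queue_pop(struct queue *q)
{
    pthread_mutex_lock(&q->lock);
    while (!q->head)
        pthread_cond_wait(&q->nonempty, &q->lock);
    struct job *j = q->head;
    q->head = j->next;
    if (!q->head) q->tail = NULL;
    pthread_mutex_unlock(&q->lock);
    return j;
}

/* One of these per core: block on the input queue, run the job,
   hand the processed job to the output queue as the result. */
static void *worker(void *arg)
{
    struct queue **io = arg;      /* io[0] = input, io[1] = output */
    for (;;) {
        struct job *j = queue_pop(io[0]);
        j->run(j->arg);
        queue_push(io[1], j);
    }
    return NULL;
}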

Quote
It seems to me that heterogeneous processing is becoming more and more common.  I've watched with interest how the "packet filtering" mechanisms in the Linux kernel -- essentially a minimal bytecode to examine data and make decisions based on it, passed from an unprivileged context, verified, then executed in a privileged context -- have evolved, and something very much like that will eventually end up in one of the embedded network-I/O-gadget processors.  Definitely a superior approach to IP filtering, for example.

So, I do think we need several programming languages, to cater for the different contexts.

Yes, but all things can be taken too far. In this case the trap is a grotty little domain-specific language (DSL) that would be better implemented as a library in a decent standard language.

DSLs almost always catch feature creep, so that even the originators don't understand/predict all the interactions[2]. Much better to have a standard language with good general purpose tool support.

[2] Happens to some mainstream languages too, e.g. C++

Quote
there are quite a few real and practical pitfall traps [in reordering operations when compiling C code].
Especially when there are side effects whose order does not matter at all.  I really haven't found a good way to tell that to even GCC.

Because of this discussion, I did think of one possibility: writing static inline helper functions marked pure, implementing basic atomic operations (to storage only accessed atomically) in inline assembly -- essentially cheating, telling the compiler there are no side effects.  This would actually suffice, as these functions do modify the program state, but at an unspecified time at any point of the scope they are used in, so the compiler is free to reorder them as it sees fit within that scope.  Good enough for me!

This also indicates to me that such annotation is not needed for arbitrary expressions, only for functions and (entire) scopes.  Surprising!

And then we consider maintenance in 5 years' time by new staff, and/or changes to compiler implementations and capabilities.
Title: Re: Too many programming languages?
Post by: tggzzz on December 08, 2019, 08:47:22 am
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

That's true now, but in 1978?

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

In early 1978 the 8086 didn't exist. We are thinking about processors such as 8080, 8085, Z80, 1802, 6800, 6502, SCMP, F100.


Quote
Everything else was restricted to writing in assembly language, or in a high level language and using a bytecode or token interpreter which enabled reasonably compact programs, but that ran 10 or 20 times slower than the (already slow) machine was capable of.

In 1978 the only Pascal implementation was UCSD p-code. C was not generally known, and the tools would have been primitive. Not interesting to an electronic engineer.

The realistic choice for an HLL was Intel's PL/M running in their "blue boxes".
Title: Re: Too many programming languages?
Post by: tggzzz on December 08, 2019, 08:50:31 am
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.
That's true now, but in 1978?
Perhaps the interviewer didn't care about technical accuracy, but the attitude/approach to tools?

I mean, you probably have asked an interviewee which programming language they'd use to solve problem X, then which language they like best, and drawn conclusions about whether they answer with the same programming language or environment for both, instead of which particular languages they mention?  ;)

Basically yes.

When, as an interviewer, I've asked such open-ended questions, I've been looking to see to what extent the interviewee can justify their answers. That can reveal a lot about their thought processes, depth/breadth of understanding, and flexibility.
Title: Re: Too many programming languages?
Post by: chickenHeadKnob on December 08, 2019, 11:13:19 am

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

In early 1978 the 8086 didn't exist. We are thinking about processors such as 8080, 8085, Z80, 1802, 6800, 6502, SCMP, F100.

In 1978 the only Pascal implementation was UCSD p-code. C was not generally known, and the tools would have been primitive. Not interesting to an electronic engineer.

The realistic choice for an HLL was Intel's PL/M running in their "blue boxes".

All of what you wrote conforms to my memory as well. I believe there may have been mainframe implementations of Pascal extant in that year; it was rapidly gaining popularity with academics as the language you should be teaching beginners.  For micros the choice was assembler or PL/M. Motorola had MPL, their PL/M equivalent, for the 6800 family. I ended up using both PL/M and MPL on concurrent projects right after graduation.

MPL manual http://bitsavers.trailing-edge.com/components/motorola/6800/exorciser/MPL_Language_Reference_Manual_1976.pdf (http://bitsavers.trailing-edge.com/components/motorola/6800/exorciser/MPL_Language_Reference_Manual_1976.pdf)
Title: Re: Too many programming languages?
Post by: legacy on December 08, 2019, 01:55:22 pm
human cognitive aspects of programming

Well, a lot of scientists on my team continually say that "if aircraft could copy the way geese fly, they would save fuel!". But our best AI has never tried to solve this, because fuel cost is not our problem.

Google scientists say they've achieved 'quantum supremacy'(1), which means that a programmable quantum device can solve a problem that classical computers practically cannot.

One month ago, their quantum computer "Sycamore" solved a problem in 200 seconds that the massive IBM supercomputer "Summit" would supposedly have taken 10K years to solve; but the IBM guys replied that a "patch" would have accelerated the time to solution -- "even aggressive centipedes will co-operate if they have to" -- which is their way of explaining what the IBM patch does on the supercomputer's nodes: it solves the problem in just 3 days rather than in thousands of years.

That's a milestone, because until recently, every computer on the planet, from a 1960s mainframe to the iPhone, has operated on the same rules. These were rules that Charles Babbage understood in the 1830s and that Alan Turing codified in the 1930s.

Human beings need a solid purpose to do things. Their nature is to be lazy, and I am afraid this also applies to the human cognitive aspects of programming  :-//

(1) This phrase was coined by the physicist John Preskill in 2012.
Title: Re: Too many programming languages?
Post by: Nominal Animal on December 08, 2019, 04:53:46 pm
Human beings need a solid purpose to do things.
Well, no, just instant gratification.

Consider Foldit (https://en.wikipedia.org/wiki/Foldit). The retroviral protease of M-PMV (a monkey HIV/AIDS-like virus) has a crystal structure that went unsolved for 15 years.  Foldit gamified it in 2011, and the players found the enzyme structure in ten days.

True engineers, scientists, and tradesmen who look at things in the longer term are the oddballs.
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 08, 2019, 05:29:05 pm
Human beings need a solid purpose to do things.
Well, no, just instant gratification.

If you define "instant gratification" by getting an instant result, I agree.

But I think both are more closely related than you seem to think. Having a solid purpose and acting upon it gives us a form of instant gratification, even if we can't see the result. It's all related to how our brain's reward system works.

So the relatively small fraction of people able to act on the long term rather than the short term is not quite, IMO, due to them not needing "instant gratification" per se; it's more likely due to how their reward system specifically works (it's probably more "efficient"!)

Human beings need a solid purpose to do things.

You make it sound as though it were a defect. Actually, all living beings need a purpose for doing anything at all. It's called motivation, and it goes from our very basic needs up to more complex/abstract ones. Doing things without purpose is the oddball. Life does not like wasting energy for no reason. Call that laziness if you will. In that definition, life is essentially lazy.

The whole point, I think, is not the problem of needing a solid purpose or not for doing things; it's about our capacity for anticipation.
Title: Re: Too many programming languages?
Post by: hazeanderson on December 17, 2019, 02:44:19 pm
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.

We have moved on from that ... now it is about highly available micro-services via clustering and node balancing in "The Cloud." Same concept ... just networking many instances (each of which can in turn employ highly parallel processing, be it threads, message passing, or forked processes) instead of one instance.
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 18, 2019, 04:27:52 pm
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.

We have moved on from that ... now it is about highly available micro-services via clustering and node balancing in "The Cloud."

Oh, really. ;D
Title: Re: Too many programming languages?
Post by: tggzzz on December 18, 2019, 08:06:33 pm
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.

We have moved on from that ... now it is about highly available micro-services via clustering and node balancing in "The Cloud."

Oh, really. ;D

Indeed.

Far be it from me to observe that this gives youngsters new opportunities to repeat old mistakes. And these are old mistakes with known solutions, where solutions exist at all. Too few youngsters grok the eight laws of distributed programming.

As I tried to teach my daughter, it is OK to make new mistakes.
Title: Re: Too many programming languages?
Post by: Nusa on December 18, 2019, 09:05:15 pm
Don't you mean the 8 fallacies of distributed programming?
Title: Re: Too many programming languages?
Post by: tggzzz on December 18, 2019, 09:27:45 pm
Don't you mean the 8 fallacies of distributed programming?

Oh, picky, picky, picky :)
Title: Re: Too many programming languages?
Post by: Kjelt on December 19, 2019, 03:06:35 pm
Don't you guys mean the eight fallacies of distributed COMPUTING  :-//

https://en.wikipedia.org/wiki/Fallacies_of_distributed_computing
Title: Re: Too many programming languages?
Post by: SiliconWizard on December 19, 2019, 03:21:04 pm
Yep. Anyway... these days, kiddies are convinced there is no life outside of being constantly tied to servers on the Internet, so not surprising the old daemons are creeping back...

Good (for some?) thing is that they are excellent clients for Google, MS and Amazon.
Title: Re: Too many programming languages?
Post by: tggzzz on December 19, 2019, 05:52:14 pm
Yep. Anyway... these days, kiddies are convinced there is no life outside of being constantly tied to servers on the Internet, so not surprising the old daemons are creeping back...

Good (for some?) thing is that they are excellent clients for Google, MS and Amazon.

Oldies remember the collective sigh of relief when people could control their own data and computers, and weren't reliant on "the cloud". Of course "the cloud" had a different name: "timesharing bureaux".

While history doesn't repeat, it does rhyme :)