Author Topic: Too many programming languages?  (Read 49234 times)


Offline Berni

  • Super Contributor
  • ***
  • Posts: 4957
  • Country: si
Re: Too many programming languages?
« Reply #50 on: September 25, 2019, 12:03:40 pm »
Yep HDLs are not something to go into right away.

They are fundamentally different from normal programming languages. They compile into a circuit rather than a series of instructions. Not a whole lot is to be learned from them if you are writing programs to be run on CPUs, but they do take a good bit of effort to properly understand due to how different they are.

Unless you are working with FPGAs or other programmable logic there is very little reason to know an HDL at all. HDLs are only useful for describing large digital circuits; for smaller, simpler digital circuits regular schematics work better anyway. FPGAs in general are a more "late game" sort of thing in electronics engineering, as you can do a lot without them. The only reason you tend to use an FPGA in a project is that you have run out of all the other options. They are expensive and annoying to develop for.

So if you ask me, only learn Verilog or VHDL if you have no other choice.
 

Offline techman-001

  • Frequent Contributor
  • **
  • !
  • Posts: 748
  • Country: au
  • Electronics technician for the last 50 years
    • Mecrisp Stellaris Unofficial UserDoc
Re: Too many programming languages?
« Reply #51 on: September 25, 2019, 12:04:22 pm »

... Suggesting Forth as the language to learn if someone wants a career in embedded development  ...

Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development" ?
 

Offline dave j

  • Regular Contributor
  • *
  • Posts: 128
  • Country: gb
Re: Too many programming languages?
« Reply #52 on: September 25, 2019, 12:19:22 pm »
Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development" ?
OK, My bad.

Forth isn't the best language for a new programmer, though, simply because there is far less in the way of libraries, examples, tutorials, other people who know it who could offer advice, etc. That applies whether the OP wants to do embedded or game development.
I'm not David L Jones. Apparently I actually do have to point this out.
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: Too many programming languages?
« Reply #53 on: September 25, 2019, 12:22:05 pm »
Well yes, my interpretation was that the OP would use it in his professional career, perhaps not as an embedded software engineer but, as he stated, as an electrical engineer.

And the answer to his question:
The thing is, as an electronics engineer, which one should I learn most!?

can only be a language that is most used in his field of work, used by most companies in his line of work,
and then it does not matter if you are in love with some obscure other language; the only relevant and to-the-point answer can be: C.

And I will stop this discussion with you now; it is useless.
Let other engineers answer the OP's question.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #54 on: September 25, 2019, 12:24:10 pm »
Yep HDLs are not something to go into right away.

They are fundamentally different from normal programming languages.

Correct.

Quote
They compile into a circuit rather than a series of instructions.

Not true.

Importantly, only a subset can be compiled into a circuit.

Non-synthesisable constructs are often used when writing behavioural test suites. One overly simple example of the benefit is to define an adder circuit in terms of structurally interconnected gates, and the test suite in terms of "expectedOutput = inA + inB", and then "if output != expectedOutput then log the failure".

Quote
Not a whole lot is to be learned from them if you are writing programs to be run on CPUs, but they do take a good bit of effort to properly understand due to how different they are.

FSMs are very beneficial in embedded systems, whether they are implemented in hardware or software. Too few softies understand FSMs, thinking they are "something to do with compilers"!

Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor systems (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #55 on: September 25, 2019, 12:27:33 pm »
Well yes, my interpretation was that the OP would use it in his professional career, perhaps not as an embedded software engineer but, as he stated, as an electrical engineer.

And the answer to his question:
The thing is, as an electronics engineer, which one should I learn most!?

can only be a language that is most used in his field of work, used by most companies in his line of work,
and then it does not matter if you are in love with some obscure other language; the only relevant and to-the-point answer can be: C.

And I will stop this discussion with you now; it is useless.
Let other engineers answer the OP's question.

That is a very blinkered attitude, and one that will probably do the OP a disservice in the long term.

The world is continually changing. Those that aren't used to changing will become irrelevant.

Those that choose an inappropriate tool, or use a tool inappropriately, will become irrelevant as more competent people eat their lunch.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: techman-001

Offline nfmax

  • Super Contributor
  • ***
  • Posts: 1562
  • Country: gb
Re: Too many programming languages?
« Reply #56 on: September 25, 2019, 12:30:14 pm »
HDLs are (pretty much) examples of declarative programming languages. These work by defining what needs to be done, not by explicitly stating a sequence of steps to be taken. There are declarative programming languages which do not compile into a circuit but instead run on a normal computer: Prolog is one such.
 

Offline techman-001

  • Frequent Contributor
  • **
  • !
  • Posts: 748
  • Country: au
  • Electronics technician for the last 50 years
    • Mecrisp Stellaris Unofficial UserDoc
Re: Too many programming languages?
« Reply #57 on: September 25, 2019, 12:44:52 pm »
Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development" ?
OK, My bad.

Forth isn't the best language for a new programmer, though, simply because there is far less in the way of libraries, examples, tutorials, other people who know it who could offer advice, etc. That applies whether the OP wants to do embedded or game development.

It's perfectly natural to read into the OP's post the kind of scenarios that we are familiar with; it happens all the time.

As for  "Forth not being the best language to learn embedded on", are you speaking from experience, or quoting the usual opinions online ?

Very experienced posters here have said "learn the bare metal" again and again in other posts and specifically here in reply to the OP and I quote:
"Being able to program deep embedded applications is more about understanding bare metal and critical timing"

Libraries are irrelevant to someone *really* learning embedded. What is important is the manufacturer's databook and something that allows real-time interactivity with the hardware (Forth is perfect for this) to obtain hands-on experience and confidence in the datasheets. An oscilloscope and/or logic analyzer is also very important.
To really LEARN deep embedded you MUST write your own low-level code for all the peripherals you plan on using, because these are essential exercises on the path to deep understanding.

Libraries are critical to Arduino, Python and Lua 'makers' who are after a fast LED show or a cat door opener based on some cheap Chinese pre-made board. These users couldn't care less about the intricacies of bare-metal embedded design.

If you hate 2000-page databooks and love libraries, deep embedded is not for you; in my opinion you would probably hate a career in it.

I have zero clue about game development as I'm an electronics technician, so I'm not qualified to offer any opinions there and will refrain from doing so.

 

Offline RoGeorge

  • Super Contributor
  • ***
  • Posts: 6203
  • Country: ro
Re: Too many programming languages?
« Reply #58 on: September 25, 2019, 01:01:28 pm »
Ah, the Humpty Dumpty approach to "programming language", viz
Quote
“When I use a word,' Humpty Dumpty said in rather a scornful tone, 'it means just what I choose it to mean — neither more nor less.' 'The question is,' said Alice, 'whether you can make words mean so many different things.' 'The question is,' said Humpty Dumpty, 'which is to be master — that's all.”

You need to understand the difference between "structural" and "behavioural" HDL code.

What did I do?  ;D

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: Too many programming languages?
« Reply #59 on: September 25, 2019, 01:02:38 pm »
That is a very blinkered attitude, and one that will probably do the OP a disservice in the long term.
The world is continually changing. Those that aren't used to changing will become irrelevant.
Of course you have to change and adapt; this is a practical answer to his question of which language he should learn.
I also learned multiple languages and use them now and then.

The same goes for natural languages: if you ask "I want to work as an electrical engineer in the Western world, which single language should I learn?", I would answer English.
In 10-15 years this might become Chinese, but if I told you to learn Chinese and you tried to get a job in the Western world without speaking any other language, you know how that would finish.

Quote
Those that choose an inappropriate tool, or use a tool inappropriately, will become irrelevant as more competent people eat their lunch.
So what language would you suggest to the OP as the single language for an EE to learn in order to do some embedded development?
 

Offline techman-001

  • Frequent Contributor
  • **
  • !
  • Posts: 748
  • Country: au
  • Electronics technician for the last 50 years
    • Mecrisp Stellaris Unofficial UserDoc
Re: Too many programming languages?
« Reply #60 on: September 25, 2019, 01:06:28 pm »
Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor systems (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Technology is moving so fast, it's a scramble to keep up!

I especially don't want to bore you with Forth, but we have a Forth chip called the GA144, which has been around since 2012, I believe. It has 144 independent native Forth computers on one chip and enables parallel or pipelined programming. The chip has no system clock and sells for US$20 per pack of 10.

I admit that I have little clue how to develop on the GA144, but the father of Forth (Charles Moore) felt it was the way forward from single-MCU technologies at least 7 years ago.

For anyone interested in the technology: http://www.greenarraychips.com/home/products/index.html and a quality example heart rate monitor video developed with it:
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #61 on: September 25, 2019, 01:16:59 pm »
HDL's are examples (pretty much) of declarative programming languages.

A part of HDLs is declarative. Other parts are not, and are very similar to "conventional" languages.

Your statement is as valid as saying "C++ is a procedural language".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #62 on: September 25, 2019, 01:21:52 pm »
Ah, the Humpty Dumpty approach to "programming language", viz
Quote
“When I use a word,' Humpty Dumpty said in rather a scornful tone, 'it means just what I choose it to mean — neither more nor less.' 'The question is,' said Alice, 'whether you can make words mean so many different things.' 'The question is,' said Humpty Dumpty, 'which is to be master — that's all.”

You need to understand the difference between "structural" and "behavioural" HDL code.

What did I do?  ;D

(1) You appear not to understand HDLs

(2) you snipped the context in which my statement was made. Here it is again, to refresh your mind and to allow others to follow the conversation:
VHDL and Verilog are not programming languages, those are for describing digital schematics.
In other words, you are using the term "programming language" in a limited sense, without bothering to indicate the limits.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #63 on: September 25, 2019, 01:27:03 pm »
That is a very blinkered attitude, and one that will probably do the OP a disservice in the long term.
The world is continually changing. Those that aren't used to changing will become irrelevant.
Of course you have to change and adapt; this is a practical answer to his question of which language he should learn.
I also learned multiple languages and use them now and then.

The same goes for natural languages: if you ask "I want to work as an electrical engineer in the Western world, which single language should I learn?", I would answer English.
In 10-15 years this might become Chinese, but if I told you to learn Chinese and you tried to get a job in the Western world without speaking any other language, you know how that would finish.

Quote
Those that choose an inappropriate tool, or use a tool inappropriately, will become irrelevant as more competent people eat their lunch.
So what language would you suggest to the OP as the single language for an EE to learn in order to do some embedded development?

Mu in the Buddhist sense, as popularised in "Gödel, Escher, Bach" and "Zen and the Art of Motorcycle Maintenance". https://en.wikipedia.org/wiki/Mu_(negative)#In_popular_culture

If you ask the wrong question, you won't get the right answer!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #64 on: September 25, 2019, 01:31:30 pm »
Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor systems (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Technology is moving so fast, it's a scramble to keep up!

I especially don't want to bore you with Forth, but we have a Forth chip called the GA144, which has been around since 2012, I believe. It has 144 independent native Forth computers on one chip and enables parallel or pipelined programming. The chip has no system clock and sells for US$20 per pack of 10.

I admit that I have little clue how to develop on the GA144, but the father of Forth (Charles Moore) felt it was the way forward from single-MCU technologies at least 7 years ago.

Quite; neither have I!

OTOH, there is a commercially important language and processor family based on key concepts from the 1970s (Hoare's CSP). The first processors and language were developed in the 80s, the modern variant is a delight to use and "just does what it says on the tin": xC and the xCORE processors from XMOS.

However, I wouldn't recommend that a beginner learn them, but the 30,000 ft overview of their philosophy and capabilities is very useful to understand.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline dave j

  • Regular Contributor
  • *
  • Posts: 128
  • Country: gb
Re: Too many programming languages?
« Reply #65 on: September 25, 2019, 01:44:23 pm »
Another imaginative poster putting words in the OP's mouth.

Please quote where the OP has said "a career in embedded development" ?
OK, My bad.

Forth isn't the best language for a new programmer, though, simply because there is far less in the way of libraries, examples, tutorials, other people who know it who could offer advice, etc. That applies whether the OP wants to do embedded or game development.

It's perfectly natural to read into the OP's post the kind of scenarios that we are familiar with; it happens all the time.

As for  "Forth not being the best language to learn embedded on", are you speaking from experience, or quoting the usual opinions online ?
You criticized me earlier for "putting words in the OP's mouth". I would have thought you'd have avoided making the same mistake.


Quote
Very experienced posters here have said "learn the bare metal" again and again in other posts and specifically here in reply to the OP and I quote:
"Being able to program deep embedded applications is more about understanding bare metal and critical timing"

Libraries are irrelevant to someone *really* learning embedded. What is important is the manufacturer's databook and something that allows real-time interactivity with the hardware (Forth is perfect for this) to obtain hands-on experience and confidence in the datasheets. An oscilloscope and/or logic analyzer is also very important.
To really LEARN deep embedded you MUST write your own low-level code for all the peripherals you plan on using, because these are essential exercises on the path to deep understanding.

Libraries are critical to Arduino, Python and Lua 'makers' who are after a fast LED show or a cat door opener based on some cheap Chinese pre-made board. These users couldn't care less about the intricacies of bare-metal embedded design.

If you hate 2000-page databooks and love libraries, deep embedded is not for you; in my opinion you would probably hate a career in it.
Whilst understanding things from the databook upwards is essential to really understand embedded development, handing someone who doesn't yet know how to program a 2000-page MCU databook is just going to frustrate and discourage them. Frameworks and libraries provide a leg up early on, so you can get to grips with the programming without the distraction of understanding the complexities of an MCU. They can be discarded easily enough later.

Quote
I have zero clue about game development as I'm an electronics technician, so I'm not qualified to offer any opinions there and will refrain from doing so.
I've dabbled with graphics and game development as a hobby for nearly 40 years, so I know a bit about it. I'd advise the OP to focus on learning programming via game development rather than embedded. There is plenty of material available aimed at beginners, and drawing stuff on screen means you get lots of visual feedback to keep you interested - something that is lacking with embedded. You also just need your PC - you don't need to spend money on additional devices.
I'm not David L Jones. Apparently I actually do have to point this out.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14482
  • Country: fr
Re: Too many programming languages?
« Reply #66 on: September 25, 2019, 02:25:47 pm »
Arduino ... just don't write code like this  :palm: :palm: :palm:

Damn you. Made me want to puke! :-DD
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4957
  • Country: si
Re: Too many programming languages?
« Reply #67 on: September 25, 2019, 04:13:33 pm »

Not true.

Importantly, only a subset can be compiled into a circuit.

Non-synthesisable constructs are often used when writing behavioural test suites. One overly simple example of the benefit is to define an adder circuit in terms of structurally interconnected gates, and the test suite in terms of "expectedOutput = inA + inB", and then "if output != expectedOutput then log the failure".

Quote
Not a whole lot is to be learned from them if you are writing programs to be run on CPUs, but they do take a good bit of effort to properly understand due to how different they are.

FSMs are very beneficial in embedded systems, whether they are implemented in hardware or software. Too few softies understand FSMs, thinking they are "something to do with compilers"!

Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor systems (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Well yes, not everything can compile into a circuit. But these non-synthesizable features of HDLs are just an easily simulatable adaptation of the language's core features, there to make it WAY easier to write testbenches. Having to use a different language to write the testbench would be annoying. Since simulators execute HDL code one line at a time, this was easily used to implement classical sequential execution, since that way of programming lends itself much better to complicated behaviour. So since testbenches are such a key part of HDL development, the language reflects that (but non-synthesizable features are still only a fraction of the language's total feature set).

My point is that HDLs were designed from the beginning to look similar to circuits (much like C tries to work like a more readable, platform-independent assembler). The language itself calls things a "wire" or "register" rather than a variable. Its variables can be tristate Z or undefined X, things that only make sense in circuits.

In fact it is so well understood that HDLs are borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL. For this it just makes more sense to use a language built around the classical sequential execution idea. But don't think all C code gets executed nicely in sequence. If you have a look at large DSPs, they have multiple ALUs and MACs while having normal C compilers; the compiler is smart enough to recognize which operations don't depend on each other and executes 5 or 10 things in parallel on the DSP core, joining back into regular sequential execution when the code requires it.

Oh, and as for xCORE, it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the xC software I have seen for it uses the chip like an 8-core CPU, with each core running its own C code. It's more performant to simply fork execution out to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (xC forbids it, but this circumvents that), since this again gives faster performance. I like how elegantly xC extends C with native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU with C shared-memory pointer hacks is not just code written by users; it's how a lot of the example code provided by XMOS is written.
 

Offline SparkyFX

  • Frequent Contributor
  • **
  • Posts: 676
  • Country: de
Re: Too many programming languages?
« Reply #68 on: September 25, 2019, 04:33:53 pm »
I think we can agree that you have to learn HDLs to actually use them. Maybe implement a state machine and make them do things sequentially?

Large parts of this discussion are pretty much irrelevant to someone asking what programming language to start with.

Should you ever work with PLCs (maybe an application that interacts with them), then you will probably come into contact with proprietary languages such as Simatic S7.
Support your local planet.
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4039
  • Country: nz
Re: Too many programming languages?
« Reply #69 on: September 25, 2019, 05:25:33 pm »
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and experience. In interviews they (and I, when I have been an interviewer) have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway decent degree.

The *knowledge* of the theory is important. Whether you obtained it via self-study on the internet or via paying tens of thousands of dollars to a university is irrelevant.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #70 on: September 25, 2019, 05:54:41 pm »

Not true.

Importantly, only a subset can be compiled into a circuit.

Non-synthesisable constructs are often used when writing behavioural test suites. One overly simple example of the benefit is to define an adder circuit in terms of structurally interconnected gates, and the test suite in terms of "expectedOutput = inA + inB", and then "if output != expectedOutput then log the failure".

Quote
Not a whole lot is to be learned from them if you are writing programs to be run on CPUs, but they do take a good bit of effort to properly understand due to how different they are.

FSMs are very beneficial in embedded systems, whether they are implemented in hardware or software. Too few softies understand FSMs, thinking they are "something to do with compilers"!

Thinking in terms of a "single program counter" stepping through lines of code is already dangerously outdated and will become more so in the future. Examples:
  • computation spread across multiple processors distributed across the globe (telecoms, IoT, cloud computing, etc)
  • multicore/multiprocessor systems (e.g. Zynq, xCORE, Intel/AMD x86-64 machines)
The discipline of thinking in terms of parallel computation is mandatory, and HDLs force such thinking. The concepts can be - and are being - embodied in modern languages.

Well yes, not everything can compile into a circuit. But these non-synthesizable features of HDLs are just an easily simulatable adaptation of the language's core features, there to make it WAY easier to write testbenches.

They are not "just" that; they are of far wider applicability.

They can be used to:
  • enable modelling of the system
  • provide alternative implementations of a function
Put those two together and you can have stepwise refinement of a system.

Quote
Having to use a different language to write the testbench would be annoying.

Irritating, but not fundamental. Test vectors are often written in a different language simply because that is the easiest way to create them.

Quote
Since simulators execute HDL code one line at a time, this was easily used to implement classical sequential execution, since that way of programming lends itself much better to complicated behaviour. So since testbenches are such a key part of HDL development, the language reflects that (but non-synthesizable features are still only a fraction of the language's total feature set).

A simulator's run-time execution model is irrelevant, and is not specified as part of an HDL. It may or may not be sequential; consider high-end simulators where some of the simulation is executed in hardware.

I have no idea what you might mean by "fraction of the total feature set".

Quote
My point is that HDLs were designed from the beginning to look similar to circuits (much like C tries to work like a more readable, platform-independent assembler). The language itself calls things a "wire" or "register" rather than a variable. Its variables can be tristate Z or undefined X, things that only make sense in circuits.

Your view of HDLs is limited.

Have a look at VHDL; there are far more signal types than that, and you can create your own application-specific signal types.

Quote
In fact it is so well understood that HDLs are borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL.

Completely wrong and irrelevant - simply substitute C/C++ for HDL, and bash for TCL, and you'll see how silly that is.

Quote
For this it just makes more sense to use a language built around the classical sequential execution idea. But don't think all C code gets executed nicely in sequence. If you have a look at large DSPs, they have multiple ALUs and MACs while having normal C compilers; the compiler is smart enough to recognize which operations don't depend on each other and executes 5 or 10 things in parallel on the DSP core, joining back into regular sequential execution when the code requires it.

Now you are confusing a language definition and semantics with its implementation by a particular compiler on a particular architecture.

Quote
Oh and as for xCORE, it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen uses the chip as if it were an 8-core CPU with each core running its own C code. It's more performant to simply fork out execution to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids this, but that circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU with C shared-memory pointer hacks is not just code written by the users; it's how a lot of the example code provided by XMOS is written.

You appear to contradict yourself in there! Apart from that, xC is irrelevant to HDLs; don't confuse them.

It has long been a truism that you can write Fortran in any language. It shouldn't be a surprise if that tradition continues :(

I hope xC isn't the end of the story, and I'm actively looking for improvements. But it is the best beginning of a story that I have seen.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #71 on: September 25, 2019, 06:04:07 pm »
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and experience. In interviews they, and I when I have been an interviewer, have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway decent degree.

The *knowledge* of the theory is important. Whether you obtained it via self-study on the internet or via paying tens of thousands of dollars to a university is irrelevant.

Agreed.

But the probability of someone knowing the theory is far higher if they have been to a good course at a good university. It is very difficult to imbibe theory solely via self-study; in my career I can count the number of people I've seen achieve that on the fingers of one hand.

I've seen far too many candidates with a stunted view of theory. They often waste a lot of time and energy
  • trying to do something that can never be achieved
  • reinventing wheels, or rather a part of the circumference of wheels
  • creating "brittle" things that work in a test environment, but intermittently fail out in the real world
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4039
  • Country: nz
Re: Too many programming languages?
« Reply #72 on: September 25, 2019, 07:17:53 pm »
I'm with John. Degrees are pretty much worthless. As a programmer I don't think I've ever had an employer who cared whether or not I had a degree, or what degree it was.

Every decent employer I have had has been extremely interested in both my degree and experience. In interviews they, and I when I have been an interviewer, have always included some questions that rely on the candidate understanding and applying the theory they should have learned in any halfway decent degree.

The *knowledge* of the theory is important. Whether you obtained it via self-study on the internet or via paying tens of thousands of dollars to a university is irrelevant.

Agreed.

But the probability of someone knowing the theory is far higher if they have been to a good course at a good university. It is very difficult to imbibe theory solely via self-study; in my career I can count the number of people I've seen achieve that on the fingers of one hand.

Sure. It's difficult. I said that. That's what interviews are for. Or, better, trial periods or short term contracts leading to possible full-time employment later.

I studied some theory at university in 1981-1983, though actually 90% of what I learned was self-study in the library basement reading CACM and SIGPLAN and doing my own projects, NOT from the lectures and assignments.

Since then I have self-studied at least the following things that didn't exist in 1983:

- MIPS, SPARC, ARM (A32, T16, T32, A64), PA-RISC, i386, PowerPC, Alpha, x86_64, AVR, RISC-V programmer's models and assembly languages.

- C++, Java, C#, Perl, Python, Ruby, Dylan, ANSI Common Lisp, OCaml, lua

- MacOS, Win32, Linux, Solaris (and others), NeXTStep/Rhapsody/OS X (all at home at first, then leading to jobs e.g. Solaris on an old SPARC ELC sold by the local university for $50)

- modern programming language and compiler theory (at home at first, later leading to jobs and learning more on the job)

- instruction set and microarchitecture design (at home at first, later leading to jobs and learning more on the job)


Well, I'm sure there's a whole lot more I can't think of right now.

Even if you get spoon fed some initial theory in three or four or six years at a university, there is going to be a WHOLE HEAP of new stuff invented in the 40+ years you're going to be working after university, so you'd BETTER be capable of learning it on your own if you don't want to become a dinosaur.
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4957
  • Country: si
Re: Too many programming languages?
« Reply #73 on: September 25, 2019, 07:24:23 pm »
Your view of HDLs is limited.

Have a look at VHDL; there are far more signal types than that, and you can create your own application-specific signal types.

Quote
In fact they are so well aware that HDL is borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs rather than Verilog or VHDL.
Completely wrong and irrelevant - simply substitute C/C++ for HDL, and bash for TCL, and you'll see how silly that is.

Well, yes, you can make up new concepts in these languages, and VHDL was designed to encourage that. But the basic things that HDL languages give you from the start are pretty circuit-oriented.

My point is that HDL languages are made to serve a niche application in digital circuitry design and are not really all that useful for developing general-purpose software to run on computers. (Until perhaps Intel starts including user-programmable Altera FPGA coprocessors in PCs.)

Can you point out a few examples where HDL is used in some context that at no point relates to digital circuitry? Yes, in theory you can write an entire operating system in Verilog and run it in an HDL simulator, since the language is Turing complete, but I mean examples where it actually makes sense to use an HDL versus more typical programming or scripting languages.


Quote
For this it just makes more sense to use a language built around the classical sequential execution idea. But don't think all C code gets executed nicely in sequence: if you look at large DSPs, they have multiple ALUs and MACs while still having normal C compilers, and the compiler is smart enough to recognize which operations don't depend on each other, executing 5 or 10 things in parallel on the DSP core before joining back into regular sequential execution when the code requires it.
Now you are confusing a language definition and semantics with its implementation by a particular compiler on a particular architecture.

I was just trying to show that a parallel architecture does not necessarily need a language designed for parallelism, and vice versa. There are compilers that can turn C code into FPGA gates, and an HDL simulator is essentially an interpreter that runs HDL code on a CPU. Certain languages are just better suited to describing certain things.
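The DSP point can be illustrated in plain C. In the hypothetical dot product below, the four partial sums have no data dependence on each other, so a compiler for a VLIW DSP with several MAC units is free to issue the four multiply-accumulates in parallel; the function name and unroll factor are purely illustrative.

```c
/* Dot product with the parallelism made explicit: s0..s3 are independent,
   so a suitably smart compiler may execute the four MACs concurrently.
   Assumes n is divisible by 4. */
int dot4(const int *a, const int *b, int n) {
    int s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (int i = 0; i < n; i += 4) {
        s0 += a[i]     * b[i];          /* these four lines carry no   */
        s1 += a[i + 1] * b[i + 1];      /* dependence on one another   */
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    return s0 + s1 + s2 + s3;           /* the only sequential join    */
}
```

The semantics are still ordinary sequential C; the parallelism lives entirely in the compiler's freedom to reorder independent operations.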

Quote
Oh and as for xCORE, it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen uses the chip as if it were an 8-core CPU with each core running its own C code. It's more performant to simply fork out execution to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids this, but that circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU with C shared-memory pointer hacks is not just code written by the users; it's how a lot of the example code provided by XMOS is written.

You appear to contradict yourself in there! Apart from that, xC is irrelevant to HDLs; don't confuse them.

It has long been a truism that you can write Fortran in any language. It shouldn't be a surprise if that tradition continues :(

I hope xC isn't the end of the story, and I'm actively looking for improvements. But it is the best beginning of a story that I have seen.

It's not the XC language that is at fault here.

It's more that the hardware it is being compiled for is not so great at executing the language's special multitasking features; as a result, users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that gives better performance. In the end, the thing it runs on is pretty much a regular CPU that happens to have the "uber hyperthreading" ability to execute 8 threads on a single core, plus a fancy proprietary message-box bus connecting it to other CPUs.
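As a rough POSIX analogy (plain pthreads, not xC, and not XMOS code): the pattern described, forking a fixed set of workers at the top of the program and sharing state through memory, looks like the sketch below. On xCORE the xC `par` construct plays the role of `pthread_create`; the shared-memory part is precisely what xC disallows and the pointer trick reintroduces.

```c
#include <pthread.h>
#include <stdint.h>
#include <stddef.h>

#define NWORKERS 8                      /* one per hardware thread */

static long total;                      /* shared state, guarded by `lock` */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    long n = (long)(intptr_t)arg;       /* units of work for this thread */
    for (long i = 0; i < n; i++) {
        pthread_mutex_lock(&lock);
        total++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

/* Fork all workers up front, then join; returns NWORKERS * per_worker. */
long run_workers(long per_worker) {
    pthread_t t[NWORKERS];
    total = 0;
    for (int i = 0; i < NWORKERS; i++)
        pthread_create(&t[i], NULL, worker, (void *)(intptr_t)per_worker);
    for (int i = 0; i < NWORKERS; i++)
        pthread_join(t[i], NULL);
    return total;
}
```

All the workers exist from the start and never grow in number, which is the "fork 8 threads as the first thing in main() and stop" shape described above.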

I was really excited when I first discovered XMOS processors, but after working with them, writing quite a bit of code for them, and seeing the new chips that came out, I eventually lost hope. It ended up being just another MCU that sort of has a built-in "hardware RTOS" but very little in the way of peripherals, ending up used in applications where other MCUs could be used too. So far its most common application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for asynchronous USB audio.

Go ahead and try out one of their devboards if you don't believe me.
« Last Edit: September 25, 2019, 07:26:50 pm by Berni »
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Too many programming languages?
« Reply #74 on: September 25, 2019, 07:58:20 pm »
@tggzzz
Have you ever modeled anything with "Stood"?
My curiosity  :D
 

