Author Topic: Too many programming languages?  (Read 49234 times)


Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #75 on: September 25, 2019, 08:44:00 pm »
@tggzzz
Have you ever modeled anything with "Stood"?
My curiosity  :D

Nope; never even heard of it!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: Too many programming languages?
« Reply #76 on: September 25, 2019, 09:00:46 pm »
Matlab is commercial, proprietary software.  A free and open-source alternative is Octave.  Most Matlab scripts can be run in Octave, too.

Recently there has been a trend of using Python instead of Matlab or Octave.

Another recent trend, with a somewhat different concept, is the Jupyter Notebook: https://jupyter.org/

This is a script I wrote for plotting a differential equation using Euler's Method and it produces identical results in Octave and MATLAB.  At the university my grandson is attending, everything they do revolves around MATLAB.  The DE course can spend more time on applications and less time worrying about hand solving/plotting DEs.  What a tremendous improvement over the course I took about 48 years ago.  His course is actually fun!


Code: [Select]
steps=100;                              % number of steps per unit time
dt=1/steps;
t=0:dt:20;                              % t will have 2001 values
n=length(t);
y=zeros(1,n);                           % fill vectors with 0
dy=zeros(1,n);
y(1)=0;                                 % initial value of y(0)
                                        % but index starts at 1
dy(1)=-2*y(1)+2*(1+sin(2*t(1)));        % expression for y'(0) index = 1

for j=2:n                               % step through 2000 values
    y(j)=y(j-1)+dy(j-1)*dt;             % compute value of current y       
    dy(j)=-2*y(j)+2*(1+sin(2*t(j)));    % compute value of y'
end

plot(t,y,t,2*(1+sin(2*t)))
xlabel('t')
ylabel('y')
legend('y(t)','Forcing Function')
title(['Euler''s Method - ' num2str(steps) ' steps per unit time (t)'])
shg                                     % pull graph to top
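Given the trend mentioned above of Python displacing Matlab/Octave, the same integration loop translates almost line for line into plain Python. A rough sketch (plotting omitted), not a drop-in replacement for the script above:

```python
# Euler's method for y' = -2*y + 2*(1 + sin(2*t)),
# same scheme as the Octave/MATLAB script above.
import math

steps = 100                       # number of steps per unit time
dt = 1.0 / steps
t = [i * dt for i in range(20 * steps + 1)]   # 2001 values of t in [0, 20]
n = len(t)
y = [0.0] * n                     # initial value y(0) = 0
dy = [0.0] * n
dy[0] = -2 * y[0] + 2 * (1 + math.sin(2 * t[0]))   # y'(0)

for j in range(1, n):             # step through the remaining 2000 values
    y[j] = y[j - 1] + dy[j - 1] * dt               # current y from previous slope
    dy[j] = -2 * y[j] + 2 * (1 + math.sin(2 * t[j]))
```

With NumPy and Matplotlib installed, the plotting lines would carry over nearly unchanged (e.g. `plt.plot(t, y)`), but that part is left out here.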


Is this considered 'programming'?  I suspect it should be...
« Last Edit: September 25, 2019, 09:12:17 pm by rstofer »
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #77 on: September 25, 2019, 09:09:34 pm »
Your view of HDLs is limited.

Have a look at VHDL; there are far more signal types than that, and you can create your own application-specific signal types.

Quote
In fact they are so well aware that HDL is borderline useless for things that are not circuits that development environments for HDL code pretty much all use TCL scripts for their scripting needs, rather than Verilog or VHDL.
Completely wrong and irrelevant - simply substitute C/C++ for HDL, and bash for TCL, and you'll see how silly that is.

Well, yes, you can make up new concepts in languages, and VHDL was designed to encourage that. But the basic things that HDLs give you from the start are pretty circuit-oriented.

Naturally they do that reasonably well, but they are far from limited to doing that.

Quote
My point is that HDL languages are made to serve a niche application in digital circuitry design and are not really all that useful for developing general purpose software to run on computers. (Until perhaps Intel starts including user programmable Altera FPGA coprocessors in PCs)

Shrug. Nobody in their right mind would think of using VHDL as a Python replacement, nor vice versa.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.


Quote
Can you point out a few examples where HDL is used in some context that at no point relates to digital circuitry? Yes, you can in theory write an entire operating system in Verilog that can run in an HDL simulator, since it is Turing complete, but I mean examples where it actually makes sense to use an HDL versus other more typical programming or scripting languages.

Don't ask stupid questions. Writing an entire operating system in an HDL is as nonsensical as suggesting implementing hardware logic in Java.

The OP is interested in embedded electronic systems. Those usually involve splitting functionality between hardware and software.

When implementing such systems, sometimes it is beneficial to model the system, i.e. both hardware and software. At that point there is merit to using an HDL to model both.


Quote
Quote
For this it just makes more sense to use a language built around the classical sequential-execution idea. But don't think all C code gets executed nicely in sequence. If you look at large DSPs, they have multiple ALUs and MACs while having normal C compilers; the compiler is smart enough to recognize which operations don't depend on each other and executes 5 or 10 things in parallel on the DSP core, before joining back into regular sequential execution when the code requires it.
Now you are confusing a language definition and semantics with its implementation by a particular compiler on a particular architecture.

I was just trying to show that a parallel architecture does not necessarily need a language designed for parallelism, and vice versa. There are compilers that can turn C code into FPGA gates, and an HDL simulator is essentially an interpreter that runs HDL code on a CPU. It's just that certain languages are better suited to describing certain things.

I make exactly that point - repeatedly.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.

Quote
Quote
Oh, and as for xCORE, it's not quite as glorious a futuristic multiprocessor system as it looks at first glance. I have been a moderator on their official forums for a long time, and pretty much all the XC software I have seen for it uses the chip like it's an 8-core CPU, with each core running its own C code. It's more performant to simply fork out execution to 8 threads as the first thing in main() and stop creating more threads. The interprocessor communication bus is impressive, but quite often the software includes some regular C code that uses pointers to make shared memory between threads work (XC forbids this, but that circumvents it), since this again gives faster performance. I like how elegantly XC extends C to native multithreading, but it didn't quite work out in practice. And all of this software that uses it as a glorified 8-core CPU, with C shared-memory pointer hacks, is not just code written by users; it's how a lot of the example code provided by XMOS is written.

You appear to contradict yourself in there! Apart from that, xC is irrelevant to HDLs; don't confuse them.

It has long been a truism that you can write Fortran in any language. It shouldn't be a surprise if that tradition continues :(

I hope xC isn't the end of the story, and I'm actively looking for improvements. But it is the best beginning of a story that I have seen.

It's not the XC language that is at fault here.

It's more that the hardware it is being compiled for is not so great at executing the language's special multitasking features,

Please explain that assertion.

Quote
as a result, the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that results in better performance. In the end, the thing it is running on is pretty much a regular CPU that just happens to have the "uber hyperthreading" ability to execute 8 threads on a single core, plus a fancy proprietary message-box-based bus connecting it to other CPUs.

Correct, but what's your point?

If your point is that implementations have limitations, is that really news to anyone?

Quote
I was really excited about it when I first discovered XMOS processors, but after working with them and writing quite a bit of code for them, seeing the new chips that came out, etc., I eventually lost hope in it. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but has very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most common application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for asynchronous USB audio.

It sounds as if you really don't understand how the boundaries between hardware and software are very grey and movable, especially in the context of system architecture and design. Given that, it doesn't surprise me if you continue to think in the old familiar ways.

A new generation will come and supplant that thinking; they won't have any choice since the existing enhancement techniques (based on scaling semiconductor processes) have run out of steam.

CSP-based concepts offer a way forward. I want to find others, but they haven't appeared yet.
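For readers who haven't met CSP: the core idea — sequential processes that share nothing and communicate only over channels, as in occam and xC — can be sketched in any language with threads and blocking queues. A minimal Python illustration (the pipeline itself is invented for the example; `queue.Queue` stands in for a channel, though a real CSP channel is an unbuffered rendezvous, which `Queue(maxsize=1)` would approximate more closely):

```python
# CSP-style pipeline: three "processes" share no state and communicate
# only by sending values over channels.
import threading
import queue

def producer(chan_out):
    for i in range(5):
        chan_out.put(i)          # send a value down the channel
    chan_out.put(None)           # end-of-stream marker

def doubler(chan_in, chan_out):
    while True:
        v = chan_in.get()        # receive (blocks, like a channel read)
        if v is None:
            chan_out.put(None)   # propagate end-of-stream
            return
        chan_out.put(v * 2)

def collector(chan_in, results):
    while True:
        v = chan_in.get()
        if v is None:
            return
        results.append(v)

a, b = queue.Queue(), queue.Queue()
results = []
threads = [
    threading.Thread(target=producer, args=(a,)),
    threading.Thread(target=doubler, args=(a, b)),
    threading.Thread(target=collector, args=(b, results)),
]
for th in threads:
    th.start()
for th in threads:
    th.join()
# results now holds [0, 2, 4, 6, 8]
```

The point is not performance but structure: because no state is shared, there is nothing to lock and no data race to reason about, which is exactly the property xC enforces at the language level.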

Quote
Go ahead and try out one of their devboards if you don't believe me.

I have done.

I found it did exactly what it was designed to do, without any strange "gotchas".

I found using it very easy, especially compared to other MCUs.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: Too many programming languages?
« Reply #78 on: September 25, 2019, 09:57:36 pm »
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?


Answering just this question, in the least expansive way, C would be my recommendation for a working EE, at least as a first language.  There are specific applications for the various languages as discussed above:  C++ for business logic, Java for web design, Fortran for number crunching and C for embedded systems.  Python is trying to hit all the bases.

But 'programming' as an art is somewhat independent of language.  It is more closely related to how you 'think' about the problem.  Niklaus Wirth's book 'Algorithms + Data Structures = Programs' is not misnamed, and Pascal is my favorite language.  I find myself blocking out programs and writing a kind of pseudo-code before I try putting fingers to keyboard.  I want to have the logic straight before I worry about writing code.

Do not be surprised if you have to become competent in several languages; which one is most important will change with every job you get.  But I would suggest starting with C.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #79 on: September 25, 2019, 10:02:29 pm »
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?


Answering just this question, in the least expansive way, C would be my recommendation for a working EE, at least as a first language.  There are specific applications for the various languages as discussed above:  C++ for business logic, Java for web design, Fortran for number crunching and C for embedded systems.  Python is trying to hit all the bases.

But 'programming' as an art is somewhat independent of language.  It is more closely related to how you 'think' about the problem.  Niklaus Wirth's book 'Algorithms + Data Structures = Programs' is not misnamed, and Pascal is my favorite language.  I find myself blocking out programs and writing a kind of pseudo-code before I try putting fingers to keyboard.  I want to have the logic straight before I worry about writing code.

Do not be surprised if you have to become competent in several languages; which one is most important will change with every job you get.  But I would suggest starting with C.

That's sane, even if I might quibble with some details.

I'd go further in one respect: if you only need one language in a career, then you will have had a boring, repetitive career. You know the kind of thing: not "10 years' experience" but "1 year's experience repeated 10 times" :)
« Last Edit: September 25, 2019, 10:04:02 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: Too many programming languages?
« Reply #80 on: September 25, 2019, 10:12:07 pm »
I think Octave lacks Simulink and that’s a really big deal.

MATLAB Personal Edition is just $149.

SciLab is for you.

It includes Xcos as a substitute for Simulink, so I will install it and play around a bit.  In MATLAB Simulink, I can plunk down integrators and other devices and model an analog-computer solution to some set of equations.  I like analog computing!  That doesn't mean I'm any good at it; scaling time and magnitude are still not well understood, but I'm working on it.

I'll be hung up with MATLAB as long as my grandson is in college since the university uses it for all their math courses.  In fact, there is a first semester mandatory course on just MATLAB.  It was kind of fun!

ETA:  I installed SciLab and it works quite well.  The scripting language is somewhat different, and the authors acknowledge that scripts are not nearly as portable as they are between MATLAB and Octave, but it isn't a really big deal.  I haven't gotten to simulation yet.  I printed out the newcomers document and I'll look at it tomorrow.
« Last Edit: September 25, 2019, 11:35:43 pm by rstofer »
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: Too many programming languages?
« Reply #81 on: September 25, 2019, 10:15:21 pm »
I'd go further in one respect: if you only need one language in a career, then you will have had a boring repetitive career.
I disagree. It is not the language, the components, or the tools that make a boring career; it is sticking to one job, one company, one domain that makes it repetitive and probably boring. I have been an embedded C programmer for six different companies in five different domains, and none of those jobs had much in common except the language, although many of the jobs mixed in other skills and disciplines, such as mechatronics, mechanics, electronics, and software.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: Too many programming languages?
« Reply #82 on: September 25, 2019, 10:19:05 pm »
When it comes to learning programming languages, I'm often overwhelmed. The currently popular ones include, but are not limited to, C, C++, C#, Java, Python, etc. You name it, you gotta learn them!

I think one should master one specific programming language, rather than being an expert of all.

The thing is, as an electronics engineer, which one should I learn most!?


Answering just this question, in the least expansive way, C would be my recommendation for a working EE, at least as a first language.  There are specific applications for the various languages as discussed above:  C++ for business logic, Java for web design, Fortran for number crunching and C for embedded systems.  Python is trying to hit all the bases.

But 'programming' as an art is somewhat independent of language.  It is more closely related to how you 'think' about the problem.  Niklaus Wirth's book 'Algorithms + Data Structures = Programs' is not misnamed, and Pascal is my favorite language.  I find myself blocking out programs and writing a kind of pseudo-code before I try putting fingers to keyboard.  I want to have the logic straight before I worry about writing code.

Do not be surprised if you have to become competent in several languages; which one is most important will change with every job you get.  But I would suggest starting with C.

That's sane, even if I might quibble with some details.

I'd go further in one respect: if you only need one language in a career, then you will have had a boring, repetitive career. You know the kind of thing: not "10 years' experience" but "1 year's experience repeated 10 times" :)

Heck, I quibble with it too!  Here's what I really think:  There is more money in managing engineering than in doing engineering.  You probably need some familiarity with programming but, if you're clever, you BUY programming, you don't DO programming.  In fact, the last thing you want to do is become the company's best programmer!  You'll never get promoted that way!

The day you graduate EE school, you sign up for an MBA program.  If you're an overachiever, go ahead and get your MSEE but still cap it off with an MBA (it'll be easy after all the math in EE school).  Never work a day as an engineer, just buy it!
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #83 on: September 25, 2019, 11:39:55 pm »
Heck, I quibble with it too!  Here's what I really think:  There is more money in managing engineering than in doing engineering.  You probably need some familiarity with programming but, if you're clever, you BUY programming, you don't DO programming.  In fact, the last thing you want to do is become the company's best programmer!  You'll never get promoted that way!

The day you graduate EE school, you sign up for an MBA program.  If you're an overachiever, go ahead and get your MSEE but still cap it off with an MBA (it'll be easy after all the math in EE school).  Never work a day as an engineer, just buy it!

Do you work for Boeing? :)

If "programmer" means somebody that implements other people's designs, I agree that sounds terrible. I've never done that.

I have written contracts and been a project manager, but decided not to repeat that experience more than necessary :)

OTOH many times I have gone from initial concept, through hardware and software architecture and design (using analogue/FPGA/digital/hard&soft realtime/webshops/high availability telecoms/etc), through implementation, testing and acceptance trials. Great fun. (N.B. there was rarely a dedicated project manager per se, other than the engineers)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline emece67

  • Frequent Contributor
  • **
  • !
  • Posts: 614
  • Country: 00
Re: Too many programming languages?
« Reply #84 on: September 25, 2019, 11:55:07 pm »
.
« Last Edit: August 19, 2022, 02:30:48 pm by emece67 »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: Too many programming languages?
« Reply #85 on: September 26, 2019, 12:34:28 am »
If "programmer" means somebody that implements other people's designs, I agree that sounds terrible. I've never done that.

I think I'll go with the Bureau of Labor Statistics definition of programmer:

https://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm

As opposed to Software Developer

https://www.bls.gov/ooh/computer-and-information-technology/software-developers.htm

As opposed to Computer Hardware Engineer

https://www.bls.gov/ooh/architecture-and-engineering/computer-hardware-engineers.htm

In the Electrical Engineer class, the BLS conflates power engineering with electronics engineering:

https://www.bls.gov/ooh/architecture-and-engineering/electrical-and-electronics-engineers.htm

The nice thing about the BLS site is that you can drill down by area to find out how much the code weenies are making in Seattle ($129K and that's the median).  Scroll down about 2/3 of the page:

https://www.bls.gov/oes/current/oes151131.htm#st

« Last Edit: September 26, 2019, 12:36:20 am by rstofer »
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #86 on: September 26, 2019, 12:40:58 am »
About HDLs: well, I have used them (and still use them) a lot, and although I must admit that they are, in fact, not only HDLs but also programming languages (hey, VHDL has pointers!!), I think their purpose is to be used as HDLs, not as general programming languages. Maybe you will also need to learn one of them, but I doubt you'll need them as a replacement for some programming language lacking ways to express parallelism.

Using an HDL as a replacement for a conventional general-purpose language would be perverse.

Even if you don't use HDLs for hardware designs, the normal HDL design patterns really emphasise parallel operation - you simply can't ignore it (without being very perverse!).

Being familiar with such "thinking parallel" will become a continually more important aspect of software over the next few decades.

Quote
I think other languages/tools (say OpenCL/WebCL) are/will be better for that.

Most conventional languages have parallel execution as a bolt-on. Some (and I'm looking at you, C!) couldn't even express threading until the last 5 years!

Quote
SW FSMs, and systems comprised of various interconnected SW FSMs, can be understood without previous knowledge of HW FSMs and written even in plain vanilla C; I can't see any benefit here of HDLs over bare programming languages. And, if you insist on HDLs, why not SystemC?

The key point is to understand how and when to use one or more FSMs. How you implement them is a completely different issue. Indeed, frequently you implement part of a single FSM in hardware and the rest in software.

In general purpose languages there are several useful design patterns, e.g. 2D event/state jump tables and state==class.
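As an illustration of the first of those patterns, here is a minimal 2D event/state jump table sketched in Python (the turnstile states, events, and actions are invented purely for the example):

```python
# 2D event/state jump table: TABLE[state][event] -> (next_state, action).
# Hypothetical turnstile FSM, used only to illustrate the pattern.
LOCKED, UNLOCKED = 0, 1          # states (row index)
COIN, PUSH = 0, 1                # events (column index)

def unlock():    return "unlock"      # actions return a label here;
def lock():      return "lock"        # in real code they would drive
def alarm():     return "alarm"       # hardware or emit events
def thank_you(): return "thank_you"

TABLE = [
    #  event COIN              event PUSH
    [(UNLOCKED, unlock),    (LOCKED, alarm)],     # state LOCKED
    [(UNLOCKED, thank_you), (LOCKED, lock)],      # state UNLOCKED
]

def step(state, event):
    """One FSM transition: look up the cell, run its action."""
    next_state, action = TABLE[state][event]
    return next_state, action()

# Drive the machine through a short event sequence.
state = LOCKED
state, out = step(state, COIN)   # coin while locked -> unlocked
state, out = step(state, PUSH)   # push while unlocked -> locked again
```

The appeal of the pattern is that every (state, event) pair is forced to be handled explicitly - missing transitions show up as gaps in the table rather than as forgotten `if` branches.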
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #87 on: September 26, 2019, 12:42:59 am »
If "programmer" means somebody that implements other people's designs, I agree that sounds terrible. I've never done that.

I think I'll go with the Bureau of Labor Statistics definition of programmer:

They are contested and change over time.

When I was young "computer" was a job title.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline bjdhjy888Topic starter

  • Regular Contributor
  • *
  • Posts: 62
  • Country: ca
Re: Too many programming languages?
« Reply #88 on: September 26, 2019, 01:08:45 am »
Forth?  :horse:

Mom: Son, what's with these huge C/C++ books in your room?  :o
Me: Mom, I'm learning Forth.  8)
 

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4199
  • Country: us
Re: Too many programming languages?
« Reply #89 on: September 26, 2019, 02:57:31 am »
Quote
These is more money in managing engineering than in doing engineering.
Probably.  Did the OP talk about making lots of money?  I've been a manager; it wasn't as fun as being an engineer.  I hired some good people, though!
If you do go into management, try to be a GOOD manager.
Former co-worker's recent FB post:
Quote
Last week, Facebook employee Qin Chen jumped from the 4th floor of a Facebook building in Menlo Park leaving behind a wife and young daughter. Facebook, Google and other Silicon Valley companies are full of horrible, inexperienced managers and incompetent, uncaring HR staff that are ruining people's lives. It's time for Silicon Valley to clean up their act. Below is a screenshot with my blood pressure from a year ago while working at Facebook and today, one month after leaving Facebook.
[See also]:

Quote
I think one should master one specific programming language, rather than being an expert of all.
Horrible attitude.  I wouldn't have hired anyone who came in saying that.  I did hire people who hadn't programmed much in the "target language", because understanding the problems being solved is more important than being an expert at the programming language...

 

Offline bjdhjy888Topic starter

  • Regular Contributor
  • *
  • Posts: 62
  • Country: ca
Re: Too many programming languages?
« Reply #90 on: September 26, 2019, 03:16:32 am »
Quote
Quote
I think one should master one specific programming language, rather than being an expert of all.
Horrible attitude.  I wouldn't have hired anyone who came in saying that. I did hire people who hadn't programmed much in the "target language", because understanding the problems being solved is more important than being an expert at the programming language...
Yo dah boss!
 :-*
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4039
  • Country: nz
Re: Too many programming languages?
« Reply #91 on: September 26, 2019, 05:05:08 am »
Quote
Last week, Facebook employee Qin Chen jumped from the 4th floor of a Facebook building in Menlo Park leaving behind a wife and young daughter. Facebook, Google and other Silicon Valley companies are full of horrible, inexperienced managers and incompetent, uncaring HR staff that are ruining people's lives. It's time for Silicon Valley to clean up their act. Below is a screenshot with my blood pressure from a year ago while working at Facebook and today, one month after leaving Facebook.

Yeah.

I just moved to Silicon Valley (and the US) this year to work for a startup. It's growing so quickly that inevitably some things are not as well coordinated as would be ideal, and customers are constantly wanting things that are on the roadmap for 2020 or 2021 *tomorrow*, as well as asking for more documentation, more example code, etc.

This is of course much better than no one noticing or caring about what you are doing.

Fortunately there are plenty of customers happy enough with what we *do* have now that design-wins and revenue are snowballing.

There is pressure to get things done but, unlike some of the companies you mention (and Apple and Amazon, by reputation), I don't see any expectation for people to work crazy hours and burn themselves out. I don't know whether this has something to do with our high proportion of seasoned veterans in their 40s, 50s, and even 60s who have seen it all and won't take the same crap that new grads will take.


As for the original question of this thread, which language(s) to learn, I suggest two:

1) to learn about simple programming using libraries, and algorithms, you want a language like Scheme (especially the Racket system) or another similar language such as Lisp, Python, or Ruby. For the more rigorous, mathematically minded, maybe go for Haskell or OCaml instead. Rust, Go, and D can also fit this bill.

2) to learn about how computers actually work, and to best understand both their limitations and how to take maximum advantage of them, you should simultaneously and incrementally learn: C; a reasonably sane assembly language you have a C compiler for (RISC-V is ideal; MIPS, PowerPC, or one of the ARM variants are the next best thing, and AVR is not bad either); the CPU programmer's model (registers etc.); and how instructions are encoded in binary.

Write some code in C. Compile it and read the assembly language. Single step in both C and assembly language. Write some assembly language functions and call them from C. Call C library functions from assembly language.

Pay particular attention to pointers, arrays, globals vs stack vs heap. You're never going to be a good programmer, in ANY language, without a good mental model of how those work. And you're very unlikely to come up with the right mental model without looking at the assembly language.

For a really advanced understanding of algorithms and data structures and how they perform, you need to know about caches, TLBs, branch prediction, and things like that.
 
The following users thanked this post: bjdhjy888

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4957
  • Country: si
Re: Too many programming languages?
« Reply #92 on: September 26, 2019, 05:58:05 am »
Quote
My point is that HDL languages are made to serve a niche application in digital circuitry design and are not really all that useful for developing general purpose software to run on computers. (Until perhaps Intel starts including user programmable Altera FPGA coprocessors in PCs)
Shrug. Nobody in their right mind would think of using VHDL as a Python replacement, nor vice versa.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.
Quote
Can you point out a few examples where HDL is used in some context that at no point relates to digital circuitry? Yes, you can in theory write an entire operating system in Verilog that can run in an HDL simulator, since it is Turing complete, but I mean examples where it actually makes sense to use an HDL versus other more typical programming or scripting languages.

Don't ask stupid questions. Writing an entire operating system in an HDL is as nonsensical as suggesting implementing hardware logic in Java.

The OP is interested in embedded electronic systems. Those usually involve splitting functionality between hardware and software.

When implementing such systems, sometimes it is beneficial to model the system, i.e. both hardware and software. At that point there is merit to using an HDL to model both.

Please decide on the point you are trying to make, because you appear to be arguing against your previous points.

I said at the beginning of this conversation that it doesn't make sense to use an HDL for anything other than describing complex digital circuits. I was making my point about why this is the case.

As for modeling the whole system: I would love to do that, but have you had any luck getting the HDL source code from MCU or CPU vendors? The most you tend to get is an executable simulator for the CPU core and maybe a few basic peripherals (like a timer), if you are lucky.

In typical electronics engineering work you will pretty much only encounter HDLs when working with FPGAs - unless, that is, you work for an integrated-circuit vendor (anywhere from TI to Intel) or an architecture vendor like ARM.

Quote
as a result, the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that results in better performance. In the end, the thing it is running on is pretty much a regular CPU that just happens to have the "uber hyperthreading" ability to execute 8 threads on a single core, plus a fancy proprietary message-box-based bus connecting it to other CPUs.

Correct, but what's your point?

If your point is that implementations have limitations, is that really news to anyone?

Quote
I was really excited about XMOS processors when I first discovered them, but after working with them and writing quite a bit of code for them, seeing the new chips that came out, etc., I eventually lost hope. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most-used application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for doing asynchronous USB audio.

It sounds as if you really don't understand how the boundaries between hardware and software are very grey and movable, especially in the context of system architecture and design. Given that, it doesn't surprise me if you continue to think in the old familiar ways.

A new generation will come and supplant that thinking; they won't have any choice since the existing enhancement techniques (based on scaling semiconductor processes) have run out of steam.

CSP-based concepts offer a way forward. I want to find others, but they haven't appeared yet.

I was only involved enough at XMOS to get a slightly earlier heads-up on new products and some early access to dev tools, so I don't know the long-term roadmap for their products. But what I've seen so far has not lived up to my multiprocessing expectations; hopefully they can change that in the future, because it could be something truly innovative for computing. After all, the useful applications of the language are determined by the hardware running it, and so far XC only compiles for XMOS chips.

Can you name an end application for XMOS processors where they really have an advantage over existing solutions like MCUs, DSPs, or FPGAs?
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #93 on: September 26, 2019, 07:40:52 am »
Quote
My point is that HDLs are made to serve a niche application in digital circuitry design and are not really all that useful for developing general-purpose software to run on computers. (Until, perhaps, Intel starts including user-programmable Altera FPGA coprocessors in PCs.)
Shrug. Nobody in their right mind would think of using VHDL as a Python replacement, nor vice versa.

But that's the whole point of recommending that the OP also learns the key features of an HDL. Then they will know which tool is relevant to each circumstance.
Quote
Can you point out a few examples where an HDL is used in some context that at no point relates to digital circuitry? Yes, in theory you can write an entire operating system in Verilog and run it in an HDL simulator, since the language is Turing complete, but I mean examples where it actually makes sense to use an HDL versus other, more typical programming or scripting languages.

Don't ask stupid questions. Writing an entire operating system in an HDL is as nonsensical as suggesting implementing hardware logic in Java.

The OP is interested in embedded electronic systems. Those usually involve splitting functionality between hardware and software.

When implementing such systems, sometimes it is beneficial to model the system, i.e. both hardware and software. At that point there is merit to using an HDL to model both.

Please decide on the point you are trying to make, because you appear to be arguing against your previous points.

I'm not.

In my first post I wrote (new emphasis) "You should learn the concepts, advantages and disadvantages of one example of each type of programming language. That will enable you to choose the right tool for the job, just as you should know whether to use screws, or nails, or bolts. Once you know the concepts, picking up the next language of that type is simple."

Quote
I said in the beginning of this conversation that it doesn't make sense to use HDL for anything other than describing complex digital circuits. I was making my point why this is the case.

You put the point more strongly than that, and in any case, it is wrong.

Systems are more than merely digital bits, and HDLs can and do model other parts, e.g. analogue and (to stretch the point) humans.

Quote
As for modeling the whole system: I would love to do that, but have you had any luck getting the HDL source code from MCU or CPU vendors? What you tend to get, at most, is an executable simulator for the CPU core and maybe a few basic peripherals (like a timer) if you are lucky.

That would be simulation, and while it can be done with softcores, it is not very sensible.

What is done is to model a processor's computation. Where appropriate, while refining the design, some of the model can be moved into simulation, e.g. at the boundary between hardware and software. Such stepwise refinement of a design is normal.
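As a toy illustration of modelling computation before committing to a hardware/software split (a Python sketch with invented names, not any vendor's flow), one can describe a countdown-timer peripheral behaviourally, exercise it, and only later decide whether it becomes RTL, an MCU timer plus ISR, or pure software:

```python
# Toy behavioural model of a countdown-timer peripheral.
# All names are illustrative; this sketches "model first, partition
# into hardware/software later", not any real vendor's API.

class TimerModel:
    def __init__(self, reload_value):
        self.reload_value = reload_value
        self.count = reload_value
        self.expired_events = 0

    def tick(self):
        """Advance the model by one clock cycle."""
        if self.count == 0:
            self.expired_events += 1   # would raise an interrupt in hardware
            self.count = self.reload_value
        else:
            self.count -= 1

# Drive the model for 25 cycles and observe its behaviour.
timer = TimerModel(reload_value=9)
for _ in range(25):
    timer.tick()
print(timer.expired_events)   # expires twice in 25 cycles
```

The same tick-level behaviour can then be checked against the eventual HDL or firmware implementation, which is the point of the stepwise refinement described above.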

Quote
In typical electronics engineering work you will pretty much only encounter HDLs when working with FPGAs. That is, unless you work for an integrated circuit vendor (anyone from TI to Intel) or an architecture vendor like ARM.

You should add in the concept of a system that will be partitioned between an FPGA, a processor, and everything else.

In all cases an engineer will use an appropriate tool for the job, and will swap between tools as the project progresses.

Quote
Quote
as a result the users of the language avoid its innovative functionality and end up doing things the same way they were done in regular old-school C, since on this particular hardware that results in better performance. In the end the thing it is running on is pretty much a regular CPU that just happens to have the "uber hyperthreading" ability of executing 8 threads on a single core, and has a fancy proprietary message-box based bus connecting it to other CPUs.

Correct, but what's your point?

If your point is that implementations have limitations, is that really news to anyone?

Quote
I was really excited about XMOS processors when I first discovered them, but after working with them and writing quite a bit of code for them, seeing the new chips that came out, etc., I eventually lost hope. It ended up being just another MCU that sort of has a built-in 'hardware RTOS' but very little in the way of peripherals, ending up used in applications where other MCUs can be used too. So far its most-used application is USB audio, because XMOS is one of the rare few that provide a good working driver and code for doing asynchronous USB audio.

It sounds as if you really don't understand how the boundaries between hardware and software are very grey and movable, especially in the context of system architecture and design. Given that, it doesn't surprise me if you continue to think in the old familiar ways.

A new generation will come and supplant that thinking; they won't have any choice since the existing enhancement techniques (based on scaling semiconductor processes) have run out of steam.

CSP-based concepts offer a way forward. I want to find others, but they haven't appeared yet.

I was only involved enough at XMOS to get a slightly earlier heads-up on new products and some early access to dev tools, so I don't know the long-term roadmap for their products. But what I've seen so far has not lived up to my multiprocessing expectations; hopefully they can change that in the future, because it could be something truly innovative for computing. After all, the useful applications of the language are determined by the hardware running it, and so far XC only compiles for XMOS chips.

Can you name an end application for XMOS processors where they really have an advantage over existing solutions like MCUs, DSPs, or FPGAs?

The word "really" implies a subjective response, and there would be legitimate differences of opinion. That's true for all languages and technologies.

Most importantly for this thread, the application of concepts is most definitely not limited by the hardware. In particular, CSP concepts are embodied in many languages (and some hardware), of which xC is the most "pure" example.
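For readers who want a feel for the CSP style without XMOS hardware, here is a rough Python analogue: two sequential processes that share nothing and communicate only over a blocking channel. This only sketches the idea; xC/occam channels are synchronous and compiler-checked, which a Python queue is not.

```python
# CSP-flavoured sketch: two sequential "processes" (threads) that
# communicate only via a blocking channel (queue.Queue stands in for
# an xC/occam channel; maxsize=1 approximates a rendezvous).
import queue
import threading

def producer(channel):
    for value in range(5):
        channel.put(value)        # "send" on the channel
    channel.put(None)             # end-of-stream marker

def consumer(channel, results):
    while True:
        value = channel.get()     # "receive": blocks until data arrives
        if value is None:
            break
        results.append(value * 2)

channel = queue.Queue(maxsize=1)
results = []
threads = [threading.Thread(target=producer, args=(channel,)),
           threading.Thread(target=consumer, args=(channel, results))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)                    # [0, 2, 4, 6, 8]
```

The key CSP property on display is that the two processes have no shared mutable state; all coordination happens at the channel, which is what makes such designs tractable to reason about.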

The concepts will survive and be useful during an entire career, even though languages come and go and mutate to breaking point.

The OP would do well to understand the concepts.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: Too many programming languages?
« Reply #94 on: September 26, 2019, 08:19:22 am »
Perhaps a good idea to go back on topic: programming languages?
 

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4199
  • Country: us
Re: Too many programming languages?
« Reply #95 on: September 26, 2019, 09:43:15 am »
OK.
There is no reason NOT to have exposed yourself at least briefly to a lot of languages.  Vastly Different languages.


Even if you were to become an embedded programmer who does all their products in C, you'll still need to understand "make" and "shell scripts" (or .bat files), possibly including a bunch of those obscure Unix tools like sed, awk, and perl (all part of "shell scripts", I guess). For Windows these days you should probably learn PowerShell.


People write python programs to extract data from spreadsheets, xml, json, and so on, for use by their C programs.
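For instance, a few lines of Python can turn a JSON config into a C header for the firmware build (a made-up example; the settings and macro names are invented for illustration):

```python
# Hypothetical example: generate a C header from a JSON config so the
# C firmware can consume build-time settings. Names are invented.
import json

config_json = '{"baud_rate": 115200, "buffer_size": 256}'
config = json.loads(config_json)

lines = ["/* Auto-generated: do not edit */"]
for key, value in sorted(config.items()):
    lines.append("#define CFG_%s %d" % (key.upper(), value))
header = "\n".join(lines)
print(header)
```

In a real build this kind of script would read a file and write the header as a pre-build step in the Makefile.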

There's no reason NOT to expose yourself to Forth.  If nothing else, Forth-like interpreters are easy ways to put a relatively complex UI on your product.  Even if you actually write the interpreter in C, keeping to some of the Forth standard word names is "helpful" to other people (and maybe you won't have to document as much.)
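To give a flavour of how small such an interpreter can be, here is a toy Forth-style sketch (in Python for brevity rather than C, with only a handful of standard-looking words; a real product UI would add its own command words the same way):

```python
# Toy Forth-style interpreter: a data stack plus a word dictionary.
# Only a few words ("dup", "+", "*", ".") are shown for illustration.
def interpret(source, out):
    stack = []
    words = {
        "+":   lambda: stack.append(stack.pop() + stack.pop()),
        "*":   lambda: stack.append(stack.pop() * stack.pop()),
        "dup": lambda: stack.append(stack[-1]),
        ".":   lambda: out.append(stack.pop()),   # pop and "print"
    }
    for token in source.split():
        if token in words:
            words[token]()
        else:
            stack.append(int(token))              # anything else is a number
    return stack

out = []
interpret("3 dup * 4 + .", out)   # computes 3*3 + 4
print(out)                        # [13]
```

The dictionary-of-words structure is the point: extending the UI is just adding entries, which is why Forth-like command interpreters remain popular for product consoles.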
Your editor might have an extension language that isn't C-like (EMACS uses lisp!)  And you might need it.
You'd need to know the assembly language of your target CPU, occasionally.   And since that target is likely to change over years, you should probably be pretty familiar with more than one assembler.  Assembly macros are their own language, and they're useful too.

You should figure out why everyone hates COBOL.  It wouldn't hurt to look at some of the other "historic" languages (Fortran, Algol, BASIC.)

You should figure out why the Aviation folk want Ada.

You should have at least an inkling of how to program a desktop and a cellphone, because those are frequently the user interface for an embedded system.
You should try a hardware description language.  And a math tool that does symbolic manipulation (so you can cheat on your calculus homework!)

Specialization is for Insects.
 
The following users thanked this post: SparkyFX, techman-001, bjdhjy888

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: Too many programming languages?
« Reply #96 on: September 26, 2019, 09:53:45 am »
Quote
OK.
There is no reason NOT to have exposed yourself at least briefly to a lot of languages.  Vastly Different languages.


Even if you were to become an embedded programmer who does all their products in C, you'll still need to understand "make" and "shell scripts" (or .bat files), possibly including a bunch of those obscure Unix tools like sed, awk, and perl (all part of "shell scripts", I guess). For Windows these days you should probably learn PowerShell.


People write python programs to extract data from spreadsheets, xml, json, and so on, for use by their C programs.

There's no reason NOT to expose yourself to Forth.  If nothing else, Forth-like interpreters are easy ways to put a relatively complex UI on your product.  Even if you actually write the interpreter in C, keeping to some of the Forth standard word names is "helpful" to other people (and maybe you won't have to document as much.)
Your editor might have an extension language that isn't C-like (EMACS uses lisp!)  And you might need it.
You'd need to know the assembly language of your target CPU, occasionally.   And since that target is likely to change over years, you should probably be pretty familiar with more than one assembler.  Assembly macros are their own language, and they're useful too.

You should figure out why everyone hates COBOL.  It wouldn't hurt to look at some of the other "historic" languages (Fortran, Algol, BASIC.)

You should figure out why the Aviation folk want Ada.

You should have at least an inkling of how to program a desktop and a cellphone, because those are frequently the user interface for an embedded system.
You should try a hardware description language.  And a math tool that does symbolic manipulation (so you can cheat on your calculus homework!)

Specialization is for Insects.
Are you a manager by any chance? They have no clue how much time it takes to sniff at a language and learn something significant.
You can always say you have to know everything about everything. I don't agree.

You want to know a few programming and scripting languages, and learn them when needed on the job; if they are not needed and you are not interested, don't bother.
Perhaps in the '60s you could know everything; nowadays that is really impossible, and you are better off being pretty good at one thing (a specialist) and moderately good at many other things than being moderately-to-poor at everything. At least that is my experience. In our programming team we know who to contact for which subject, because that person knows more about it than the rest of the team; everyone has some kind of specialization (except the newbies, who get to choose their future expertise in a team decision).

If we had 8 specialists with the exact same specialization, we could not do our job.
If we had 8 generalists who knew a moderate amount about everything, we could not do our job either, especially not in time, because we need to dig in deep to solve some difficult issues.

Besides the programming and scripting languages, compilers, and other tools like version control, the most important thing for any company is the domain knowledge.
 
The following users thanked this post: bjdhjy888

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #97 on: September 26, 2019, 10:02:27 am »
Perhaps a good idea to go back on topic: programming languages?

It remains on the topic of the thread's title, even if you don't recognise that!
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19517
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #98 on: September 26, 2019, 10:21:23 am »
Quote
Are you a manager by any chance? They have no clue how much time it takes to sniff at a language and learn something significant.

With experience and a halfway decent description of why a language exists (e.g. a whitepaper), it takes between 15 minutes and 1 hour to decide that language X is merely a minor variation on existing languages. Then language X can be ignored. That deals with 95% of "new" languages.

For the other 5%, it takes a couple of hours to understand the key benefits. You then remember the benefits and, when the benefits are sufficiently compelling for the current task, you start to learn and use that language. Today's example: I'm using OpenSCAD to create a collet for a 2465 fan, because it is a good fit for the problem at hand.

For beginners the timescale will be longer, of course.

Quote
You can always say you have to know everything of everything. I don't agree.

Strawman argument; nobody has said that.
 

Offline obiwanjacobi

  • Frequent Contributor
  • **
  • Posts: 988
  • Country: nl
  • What's this yippee-yayoh pin you talk about!?
    • Marctronix Blog
Re: Too many programming languages?
« Reply #99 on: September 26, 2019, 10:51:27 am »
You are all so wrong!
 ;D

It is not the language you have to learn, it's the ecosystem (of a specific language/technology) you have to master.

Once you have some experience with any programming language syntax, learning other languages is really not a big deal (usually).
But learning the environment is what requires time and experience.
Arduino Template Library | Zalt Z80 Computer
Wrong code should not compile!
 

