Author Topic: Too many programming languages?

Offline blacksheeplogic

  • Frequent Contributor
  • **
  • Posts: 532
  • Country: nz
Re: Too many programming languages?
« Reply #475 on: December 07, 2019, 03:15:35 am »
In this case 'best' is defined as using the minimum number of cycles in the operating thread, including statistical latency estimates, possibly interleaving the operations.

Perhaps I'm not quite understanding your requirement here or where/how you see this being done without some form of execution analysis. But I might also be too narrowly focused, as I was primarily involved with (and spent most of my time doing) instruction sequencing, which does not require execution. Also, parallelism would look at functional blocks and it's not clear if that's what you are referring to by operation. It's an area I had little to no involvement in.

A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant, as is typical in initialization.  There, the intent is for the compiler to choose how to implement the machine code;

I'm a little lost here. Instruction sequencing (which includes scheduling, dependencies, instruction groups, available execution units, store queues, etc.) is considered during optimization.  Again, you seem to be using operations in a broader context, so my response may be too narrow.

Neither of these involves branches (conditionals sometimes, yes, but only very rarely branches), so runtime profiling is useless here.

Perhaps, but as I said, my use of/interest in FPDR was primarily to look at branching. Run-time analysis can provide a lot more insight than that use case, even though it is not fed back to the optimizer.
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #476 on: December 07, 2019, 04:23:24 am »
A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant
I think one way to do that is with , instead of ;
No, comma is a sequence point in C: the side effects of the left side must be complete before the side effects of the right side start.
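A minimal illustration:
Code: [Select]
#include <stdio.h>

int main(void)
{
    int i = 0;
    /* Each comma is a sequence point: the left operand's side effects
       must complete before the right operand is evaluated. */
    int a = (i++, i++, i);
    printf("%d %d\n", a, i);  /* prints "2 2", guaranteed */
    return 0;
}
(The compiler may still reorder things under the as-if rule, but only when the result is indistinguishable from the fully sequenced one.)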

However, there is one workaround with GCC: splitting individual operations into (often static inline) functions marked pure or const, and using local scope const variables to hold the temporary results.  Then, GCC knows there are no side effects, and is free to reorder and interleave the machine code implementation.  For example:
Code: [Select]
static inline double op1(const double arg1, const double arg2) __attribute__((const));
static inline double op1(const double arg1, const double arg2)
{
   /* Implementation depends only on arg1 and arg2, and does not modify any variables. */
}

static inline double op2(double arg1, double arg2, double arg3) __attribute__((pure));
static inline double op2(double arg1, double arg2, double arg3)
{
   /* Implementation may depend on arg1, arg2, arg3, and global variables,
       but may only modify (its local copies of) arg1, arg2, arg3. */
}
then a sequence similar to, say,
Code: [Select]
    /* double arg1, arg2, arg3; */
    double result;
    {
        const double temp1 = op1(arg1, arg2);
        const double temp2 = op1(arg1, arg3);
        const double temp3 = op2(arg1, arg2, arg3);
        result = op2(temp1, temp2, temp3);
    }
has maximal opportunities for optimizing the code used to calculate result, since GCC knows the only side effect is the value of result itself.

(It might be better to declare result in the same scope with current versions of GCC; the above code is for illustration only, and not from a real world project.)

However, as you can see, splitting up a complicated formula this way, say something like the one used in the embedded atom model (EAM, often used in molecular dynamics simulations involving metals), makes the code completely undebuggable and write-only.  Especially so when you vectorize the calculation (for two or four pairs of atoms) using SSE or AVX.
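(For reference, the EAM total energy has the well-known form E = Σi F(Σj≠i ρ(rij)) + ½ Σi Σj≠i φ(rij): an embedding function F of a summed electron density ρ, plus a pair potential φ.  Splitting every sub-expression of something like that into pure helper functions is exactly what kills readability.)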

After all, we could write the code in assembly right now.  It just isn't worth the time and especially the maintenance effort.  I believe, but do not know for sure, that a new low-level programming language, not completely different from C, could solve this and other system-level programming issues.

Another problem occurs when the calculation does have side effects (for example, you keep a min/max tally, reorder the neighbour list to keep pairs within the interaction distance at the beginning of the slice for a particular atom, and so on), but the programmer knows their order does not matter.  I can cheat by implementing the functions in a separate compilation unit but declaring them const/pure in the compilation unit they're used in; then, however, the compiler cannot inline the machine code.
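A minimal sketch of that cheat (illustrative names, not from a real project):
Code: [Select]
/* tally.c -- the real implementation, which does have side effects. */
double tally(double *minmax, double value)
{
    if (value < minmax[0]) minmax[0] = value;
    if (value > minmax[1]) minmax[1] = value;
    return value;
}

/* main.c -- the lie: declared pure, so GCC is free to reorder the calls;
   but, living in another unit, the function can no longer be inlined. */
double tally(double *minmax, double value) __attribute__((pure));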

I'd rather not write an assembly generator and test harness to implement optimized versions of these hot code paths, as they're not worth it -- maintainability is much more important than a few percentage points of speed increase.  But if there were a low-level language in which such concepts could be expressed, with the compiler itself understanding the SIMD vector concept and atomic ops (including support for both CAS and LL/SC, and preprocessing/choosing an implementation at compile time depending on which one is used), this kind of high-performance code might become somewhat more readable.

This is not, however, just for HPC, but also for low-level libraries, and kernel or freestanding code.  Stuff like privilege boundaries (such as between the kernel and userspace, or between userspace processes) needs atomics and exact control of when side effects are to be visible.  Only the compiler knows exactly which hardware architecture and processor type the code is compiled for (and ELF binaries at least have a native way for functions to provide the linkage for other functions at run time), so all this stuff really needs to be part of the language provided by the compiler, and not a "library" in the C sense.
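C11's <stdatomic.h> is a step in that direction -- despite the header, GCC implements it in the compiler proper.  A minimal compare-and-swap sketch (the names are mine):
Code: [Select]
#include <stdatomic.h>

/* Add delta to *v, saturating at limit.  On failure,
   atomic_compare_exchange_weak() reloads 'old' and we retry; the weak form
   typically compiles to LL/SC on ARM/POWER and to a CMPXCHG loop on x86,
   with the compiler choosing the implementation for the target. */
static void add_clamped(atomic_uint *v, unsigned delta, unsigned limit)
{
    unsigned old = atomic_load(v);
    unsigned val;
    do {
        val = (delta > limit - old) ? limit : old + delta;  /* assumes old <= limit */
    } while (!atomic_compare_exchange_weak(v, &old, val));
}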

I do not know the "best" feature set myself yet, but looking at other languages (and especially their syntax: how they express intent, side effects, and so on), gives new ideas to me at least.

A similar-ish issue exists in userspace programming, especially GUI applications.  Desktop environments and GUIs seem to be best implemented using an event-based approach, and threads and queues can be used to make responsive, efficient applications.  GTK+ shows that you do not need an OOP language to implement one, even C suffices, but Python shows that you can do it in simple, concise code.  (Python's downsides are its slowish I/O, and that it is an interpreted language whose standard interpreter can only run bytecode in one thread at a time; I'd much rather see a compiled language with similar features, and a modular, lightweight runtime.)
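The GTK+ point in practice: event-based C, with plain functions as signal handlers (GTK+ 3; a minimal sketch):
Code: [Select]
#include <gtk/gtk.h>

/* A signal handler is just a C function attached to an event. */
static void on_clicked(GtkButton *button, gpointer user_data)
{
    g_print("Button clicked.\n");
}

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);
    GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *btn = gtk_button_new_with_label("Click me");
    gtk_container_add(GTK_CONTAINER(win), btn);
    g_signal_connect(btn, "clicked", G_CALLBACK(on_clicked), NULL);
    g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(win);
    gtk_main();  /* the event loop; everything else happens in callbacks */
    return 0;
}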

In this case 'best' is defined as using the minimum number of cycles in the operating thread, including statistical latency estimates, possibly interleaving the operations.
Perhaps I'm not quite understanding your requirement here or where/how you see this being done without some form of execution analysis.
No, I thought you were referring to standard runtime analysis tools, which, instead of analysing the machine code, simply run it, measuring the time taken and sampling the instruction pointer.

On superscalar processors, C's sequence points (in the presence of side effects) mean that the pipelines are not fully utilized, especially for simple code.
I would like to mark code so that the compiler gets to ignore the order of side effects, and thus sequence points altogether, and try its hardest (even using extraordinary amounts of time) to optimize these rare code sequences.

Also, parallelism would look at functional blocks and it's not clear if that's what you are referring to by operation.
I am referring to parallelism; specifically, instruction-level parallelism on superscalar architectures, not thread- or process-based parallelism, for chunks of code whose order of execution or order of side effects is irrelevant to the correctness of the program; ensuring that is up to the programmer, and not for the compiler to detect.

(I do not know how much you know about the subject, and anyway I always try to write in a way that keeps anyone interested in the subject along for the ride.  So, please do not read my tone as condescending or anything.  I have no subtext, me no English wrote that good.)

A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant, as is typical in initialization.  There, the intent is for the compiler to choose how to implement the machine code;
I'm a little lost here. Instruction sequencing (which includes scheduling, dependencies, instruction groups, available execution units, store queues, etc.) is considered during optimization.
Yes, but the first problem occurs at a higher stage.  For example, how do you tell the compiler that you have two or more program sequences that may be executed in any order, even simultaneously (instruction-level parallelized on superscalar architectures, not using different threads of execution)?

The secondary issue is that when you have a very hot (i.e. consuming much of the total CPU time) set of e.g. mathematical operations, there are many different ways to turn that sequence of operations into machine code.  A brute-force, trial-and-error method applied to the entire program would make the compiler insufferably slow, but it might be worth spending such effort on specifically marked blocks of code.  I'm not even sure how one would implement such an optimizer, but I'm pretty sure it'd have to include generating variants of the relevant abstract syntax tree, and so on.

The mechanism used for this could also be used to e.g. tell the compiler that code doing graphics with single-precision floating-point numbers does not need to be fully IEEE-754 compliant (only finite normal values, and a larger-than-0.5-ULP error allowed), while some other code doing e.g. statistics in the same compilation unit does need to be.  (I know: why would you have them both in the same compilation unit in the first place?  But this is just the example that popped into my mind.)
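GCC can already approximate this per function with its optimize attribute -- its documentation warns it is meant mainly for debugging, so take this as a sketch of the idea rather than a recommendation:
Code: [Select]
#include <stddef.h>

/* Graphics path: relaxed floating-point semantics are acceptable here. */
__attribute__((optimize("fast-math")))
float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* Statistics path in the same unit: compiled with the default,
   IEEE-754-conforming rules. */
double mean(const double *x, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += x[i];
    return (n > 0) ? sum / (double)n : 0.0;
}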

Neither of these involves branches (conditionals sometimes, yes, but only very rarely branches), so runtime profiling is useless here.
Perhaps, but as I said, my use of/interest in FPDR was primarily to look at branching. Run-time analysis can provide a lot more insight than that use case, even though it is not fed back to the optimizer.
Sure, I didn't mean to imply it is not useful; again, just that run-time analysis in this particular case shows that the block(s) of code at hand are indeed very "hot", with well over 50% of the CPU time spent there.

Optimization is kind of a funny problem: trying to find an efficient expression, without using excessive resources (especially time) while doing so.  There are some very rare, often complex but branchless, chunks of code I'd like the compiler to go beast-mode on when generating the code, because I already know that the CPU will spend most of its time in those particular chunks of code, no matter what the dataset.

I wonder how many optimization methods have been dismissed as inapplicable because they were too slow in practice?  Or how many have not been explored, because they are known not to scale well enough to be practical?
« Last Edit: December 07, 2019, 04:33:04 am by Nominal Animal »
 
The following users thanked this post: GeorgeOfTheJungle

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #477 on: December 07, 2019, 04:26:32 pm »
A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant
I think one way to do that is with , instead of ;
No, comma is a sequence point in C: the side effects of the left side must be complete before the side effects of the right side start.

Yep.

Anyway, regarding the order of execution, I get your point (sort of), but I think it can get hairy real fast. Whereas the "simple" cases (in which the compiler can safely infer that operations can be re-ordered because there are no dependencies between them, and no "side effect" in the sense that said side effect is unknown -- external calls, for instance) are handled pretty well by most decent optimizing compilers, the more complex cases can be VERY hard to handle. A complete optimization (in the sense that you'd get the *optimal* execution) is, I think, an NP-hard problem.
« Last Edit: December 07, 2019, 04:34:04 pm by SiliconWizard »
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #478 on: December 07, 2019, 06:55:25 pm »
Anyway, regarding the order of execution, I get your point (sort of), but I think it can get hairy real fast. Whereas the "simple" cases (in which the compiler can safely infer that operations can be re-ordered because there are no dependencies between them, and no "side effect" in the sense that said side effect is unknown -- external calls, for instance) are handled pretty well by most decent optimizing compilers, the more complex cases can be VERY hard to handle. A complete optimization (in the sense that you'd get the *optimal* execution) is, I think, an NP-hard problem.
Yes, and this is exactly why I'd like a low-level language with some kind of annotation to mark those rare chunks of code.

In a very real sense, this is a human problem, and not a machine one.  What kind of syntax is needed for humans to intuitively indicate these things?

One problem is that text is essentially a one-dimensional stream for us humans.  I don't like visual programming, which just simplifies things even further, but working in environments like Macromedia Director over two decades ago showed me that when textual code (Lingo) is attached to visual elements, even non-programmers can easily grasp event-based programming.  (Although it is described as OOP, at its core it is much more about events than objects.)

I believe the way many experienced programmers use editors -- having several windows of code -- is related; that is, a way to overcome the one-dimensional nature of text.  Is there a syntax that would help?  Or do we simply need better editors?  Or is there a syntax that is human-readable, but at the same time allows much better editors?  Could we embed code within a markup language for this; would it work for humans, cognitively?

I am personally comfortable with both indent-based (Python) and delimiter-based (C, C++) code scoping, but only the latter gives an obvious way to attach compiler directives or attributes to the code block (by extending the delimiter syntax).  However, one of the new experimental languages might have found an even better way, and this is a big part of why I do not mind there being a lot of programming languages: programmers may discover interesting ways to do things.

An odd wrinkle to this is that when familiarising oneself with a new project, or examining one, the linear one-dimensional stream is basically the easiest, most effective way.  Our books (both dead tree and electronic) are essentially one-dimensional, and we use those when we learn stuff.  However, when editing and modifying, the linear one-dimensional stream is a hindrance -- at least if you are one of those who use several windows to the same codebase when editing.  Other than developing new programming languages, and creating new abstractions in an effort to move programming away from the hardware and "closer to human ways of thinking", we haven't really explored the human cognitive aspects of programming for low-level languages; we still use the same ones we used thirty, forty years ago!

Personally, I do not mind using different programming languages in the same project.  I would much prefer having a nice simple event-based language to implement graphical user interfaces, and a low-level/systems programming language to implement the heavy-duty data mangling part.  I kinda like Python and C for this, especially because you can keep the Python part open source (even if proprietary; for users to edit or fix visual issues) and the secret sauce in the C part, and have the entire project surprisingly easy to port between current operating systems and architectures.  For intermediate-level stuff, like daemons and services dealing with passing data and privilege boundaries, yet another language (likely an OOP one, like C++) could be useful.
« Last Edit: December 07, 2019, 06:58:41 pm by Nominal Animal »
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #479 on: December 07, 2019, 08:52:21 pm »
Many people estimate how long it will take to do a task assuming they will be productive for 100% of that time. In reality, many people are only directly productive for 30-40% of the time. Hence a multiplication factor of 2.5-3.3 is appropriate. Or just convert to the next higher unit :)
I was once advised to use Pi as the factor. :)

Also, I'm one of the people who do not multitask well, and have had to learn to be specific and clear about priorities and timetables.

Pi works well, but it is nice to have at least a little justification for using it :)

Far more people think they can multitask than can actually multitask effectively.

Not being specific about priorities is at best a sign of a confused mind, and at worst a sign that somebody is being set up as the project scapegoat.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #480 on: December 07, 2019, 08:59:55 pm »
While I don't like many (most?) of the current programming languages, every one of them has provided further insight into what features benefit and what detract from a programming language.  So, while there are lots of languages I would recommend against using when writing widely-used tools or end-user applications, I do see them having a valid use in research and experimentation: without them, our future programming languages would be worse.

I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

Nowadays there is a similar point to be made about languages. Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #481 on: December 07, 2019, 09:05:16 pm »
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.

You'd probably need to be more specific though. At high optimization levels, good C compilers already re-order operations when they can safely do so.

Not quite true. It is more helpful to state "...when the C language specification indicates they can safely do so and the programmer understands the C language specification and the programmer has appropriately written C code".

Yes, there are quite a few real and practical pitfall traps there.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #482 on: December 07, 2019, 10:17:26 pm »
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.
For the same reason, I'd like a mid-level systems language (for services and daemons) with a heavy emphasis on asynchronous I/O, queues, message passing, basic lockless atomic ops (CAS or LL/SC), and thread-local state (as opposed to global state).  (Essentially the same thing, I think, but different emphasis or approach.)

At this point, even a router (at least my Asus RT-AC51U running OpenWRT trunk) has a couple of dozen processes, most of which are single-threaded, but almost all would benefit from async I/O and message passing (events, state changes at minimum).  The multithreaded ones (services!) even more so.

It seems to me that heterogeneous processing is becoming more and more common.  I've watched with interest how the "packet filtering" mechanisms in the Linux kernel -- essentially a minimal bytecode to examine data and make decisions based on it, passed from an unprivileged context, verified, then executed in a privileged context -- have evolved, and something very much like that will eventually end up in one of the embedded network-I/O-gadget processors.  Definitely a superior approach to IP filtering, for example.
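For a taste of that bytecode, a minimal classic-BPF sketch: a filter that accepts only IPv4/UDP frames on a packet socket (the kernel verifies the program before running it in privileged context):
Code: [Select]
#include <arpa/inet.h>
#include <linux/filter.h>
#include <linux/if_ether.h>
#include <sys/socket.h>

static struct sock_filter udp_only[] = {
    BPF_STMT(BPF_LD  | BPF_H   | BPF_ABS, 12),            /* load EtherType   */
    BPF_JUMP(BPF_JMP | BPF_JEQ | BPF_K, ETH_P_IP, 0, 3),  /* IPv4? else drop  */
    BPF_STMT(BPF_LD  | BPF_B   | BPF_ABS, 23),            /* load IP protocol */
    BPF_JUMP(BPF_JMP | BPF_JEQ | BPF_K, 17, 0, 1),        /* UDP? else drop   */
    BPF_STMT(BPF_RET | BPF_K, 0xFFFF),                    /* accept, 64 KiB   */
    BPF_STMT(BPF_RET | BPF_K, 0),                         /* drop             */
};

static int open_udp_sniffer(void)
{
    struct sock_fprog prog = {
        .len    = sizeof udp_only / sizeof udp_only[0],
        .filter = udp_only,
    };
    int fd = socket(AF_PACKET, SOCK_RAW, htons(ETH_P_ALL));
    if (fd != -1)
        setsockopt(fd, SOL_SOCKET, SO_ATTACH_FILTER, &prog, sizeof prog);
    return fd;
}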

So, I do think we need several programming languages, to cater for the different contexts.

there are quite a few real and practical pitfall traps [in reordering operations when compiling C code].
Especially when there are side effects whose order does not matter at all.  I really haven't found a good way to tell that to even GCC.

Because of this discussion, I did think of one possibility: writing static inline helper functions marked pure, implementing basic atomic operations (on storage only accessed atomically) in inline assembly -- essentially cheating, by telling the compiler there are no side effects.  This would actually suffice, as these functions do modify the program state, but at an unspecified time at any point in the scope they are used in, so the compiler is free to reorder them as it sees fit within that scope.  Good enough for me!
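Something like this, say, for an x86-64 fetch-and-add (hypothetical names; the pure attribute is the deliberate lie):
Code: [Select]
/* The tally slot must only ever be accessed through this helper. */
static inline unsigned long tally_add(unsigned long *slot, unsigned long delta)
    __attribute__((pure));

static inline unsigned long tally_add(unsigned long *slot, unsigned long delta)
{
    unsigned long old = delta;
    /* LOCK XADD performs the atomic read-modify-write.  Note the missing
       "memory" clobber: that is the cheat.  The compiler believes there are
       no side effects, so it may reorder calls within the scope at will. */
    __asm__ volatile ("lock xaddq %0, %1" : "+r" (old), "+m" (*slot));
    return old;
}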

This also indicates to me that such annotation is not needed for arbitrary expressions, only for functions and (entire) scopes.  Surprising!
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4039
  • Country: nz
Re: Too many programming languages?
« Reply #483 on: December 07, 2019, 10:47:47 pm »
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

That's true now, but in 1978?

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

Everything else was restricted to writing in assembly language, or in a high level language and using a bytecode or token interpreter which enabled reasonably compact programs, but that ran 10 or 20 times slower than the (already slow) machine was capable of.
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #484 on: December 07, 2019, 11:32:54 pm »
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.
That's true now, but in 1978?
Perhaps the interviewer didn't care about technical accuracy, but about the attitude/approach to tools?

I mean, you probably have asked an interviewee which programming language they'd use to solve problem X, then which language they like best, and drawn conclusions about whether they answer with the same programming language or environment for both, instead of which particular languages they mention?  ;)
 

Offline Nusa

  • Super Contributor
  • ***
  • Posts: 2416
  • Country: us
Re: Too many programming languages?
« Reply #485 on: December 08, 2019, 12:06:07 am »
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

That's true now, but in 1978?

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

Everything else was restricted to writing in assembly language, or in a high level language and using a bytecode or token interpreter which enabled reasonably compact programs, but that ran 10 or 20 times slower than the (already slow) machine was capable of.

C didn't have a lot of penetration in 1978 (the first K&R edition was published that year) except in Bell Labs and university groups. Pascal was more established, but not for micros. Assembly was still the way to go for most at that time. Development environments and tools were primitive compared to today, but they still mattered.
 

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4199
  • Country: us
Re: Too many programming languages?
« Reply #486 on: December 08, 2019, 02:24:19 am »
Quote
That's true now, but in 1978?
I was going to mention that...
BDS C for CP/M didn't come out until 1979, Turbo Pascal not until 1983.  Still, I guess some chips had both integer AND floating-point BASICs, and the bigger CP/M systems were advertising COBOL, Fortran, and Pascal support, and ASM quality (as well as BASIC quality) was pretty variable (all of which were rather expensive, BTW).  There were also mainframe-based cross compilers and assemblers for some of the micros, which was probably pretty helpful in some cases.  All of my early code was compiled on something relatively big and expensive (including the 8086 SDK Senior Design Project in ~1980; I think I used a cross-assembler on a "big" CP/M system).  The 8086 penetrated really slowly: it wasn't until IBM did the PC and Andy chose the 68k for the Sun-1 (both in ~'81) that things really started to take off.
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4039
  • Country: nz
Re: Too many programming languages?
« Reply #487 on: December 08, 2019, 06:48:30 am »
Quote
That's true now, but in 1978?
I was going to mention that...
BDS C for CP/M didn't come out until 1979, Turbo Pascal not until 1983.  Still, I guess some chips had both integer AND floating-point BASICs, and the bigger CP/M systems were advertising COBOL, Fortran, and Pascal support, and ASM quality (as well as BASIC quality) was pretty variable (all of which were rather expensive, BTW).  There were also mainframe-based cross compilers and assemblers for some of the micros, which was probably pretty helpful in some cases.  All of my early code was compiled on something relatively big and expensive (including the 8086 SDK Senior Design Project in ~1980; I think I used a cross-assembler on a "big" CP/M system).  The 8086 penetrated really slowly: it wasn't until IBM did the PC and Andy chose the 68k for the Sun-1 (both in ~'81) that things really started to take off.

That's for full-strength languages, yeah.

Wirth's "Algorithms + Data Structures = Programs" was out in 1976 and contained a byte code compiler for a simple subset of Pascal in the back. I'm sure a lot of people typed that into a bigger computer and made an interpreter for the bytecode on a micro. I know I did.

UCSD Pascal came out in 1977. A full Pascal running *on* the micro, built on a portable byte-code interpreter written in assembly language for various micros. Apple started selling it for the Apple ][ in 1979. It made a fairly usable self-hosted system even at 1 MHz if you had the full 64 KB of RAM and two floppy disk drives. I certainly wrote quite a few medium-sized programs using it. You could call out to assembly language for parts where speed was critical, and the assembler and linker were integrated.

You really can't do much better than that for the 6502 for large programs unless you have a *really* sophisticated compiler, which I don't think anyone has ever written anyway.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #488 on: December 08, 2019, 08:38:43 am »
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.
For the same reason, I'd like a mid-level systems language (for services and daemons) with a heavy emphasis on asynchronous I/O, queues, message passing, basic lockless atomic ops (CAS or LL/SC), and thread-local state (as opposed to global state).  (Essentially the same thing, I think, but different emphasis or approach.)

Yes, but I would be content with a language that enabled the above and came with a standard library of components implementing it. That way people would be guided towards doing things properly, while still allowing capabilities to be easily expanded later. In that vein, Java enabled Doug Lea to produce his wonderful[1] concurrency library, which was later incorporated into the Java standard libraries.

Never had the chance to use Erlang :(

[1] because it stole the good concepts that had been developed and field-proven over the decades, and implemented them in a form that was easily usable. But then "reuse proven good ideas" was at the heart of the Java culture, cf. C++'s continual reinvention of oblong wheels.

Quote
At this point, even a router (at least my Asus RT-AC51U running OpenWRT trunk) has a couple of dozen processes, most of which are single-threaded, but almost all would benefit from async I/O and message passing (events, state changes at minimum).  The multithreaded ones (services!) even more so.

Multithreading is overused.

Frequently it is better to have a small number (i.e. one per core) of "worker threads" that (asynchronously) pick "jobs" from an input queue, process each job, and (synchronously) put the result in another queue.
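The skeleton of that pattern in plain C and POSIX threads, for illustration (names are mine; a sketch rather than production code):
Code: [Select]
#include <pthread.h>
#include <unistd.h>

#define QSIZE 64

/* A bounded blocking queue of opaque job pointers. */
struct queue {
    void           *item[QSIZE];
    int             head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t  nonempty, nonfull;
};

static void queue_init(struct queue *q)
{
    q->head = q->tail = q->count = 0;
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->nonempty, NULL);
    pthread_cond_init(&q->nonfull, NULL);
}

static void queue_put(struct queue *q, void *p)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == QSIZE)
        pthread_cond_wait(&q->nonfull, &q->lock);
    q->item[q->tail] = p;
    q->tail = (q->tail + 1) % QSIZE;
    q->count++;
    pthread_cond_signal(&q->nonempty);
    pthread_mutex_unlock(&q->lock);
}

static void *queue_get(struct queue *q)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == 0)
        pthread_cond_wait(&q->nonempty, &q->lock);
    void *p = q->item[q->head];
    q->head = (q->head + 1) % QSIZE;
    q->count--;
    pthread_cond_signal(&q->nonfull);
    pthread_mutex_unlock(&q->lock);
    return p;
}

static struct queue jobs, results;

static void *worker(void *arg)
{
    (void)arg;
    for (;;) {
        void *job = queue_get(&jobs);   /* block until work arrives */
        /* ... process the job here ... */
        queue_put(&results, job);       /* hand the result onwards  */
    }
    return NULL;
}

int main(void)
{
    long n = sysconf(_SC_NPROCESSORS_ONLN);  /* one worker per core */
    queue_init(&jobs);
    queue_init(&results);
    for (long i = 0; i < n; i++) {
        pthread_t t;
        pthread_create(&t, NULL, worker, NULL);
        pthread_detach(t);
    }
    /* ... a producer then fills 'jobs' and a consumer drains 'results' ... */
    return 0;
}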

Quote
It seems to me that heterogeneous processing is becoming more and more common.  I've watched with interest how the "packet filtering" mechanisms in the Linux kernel -- essentially a minimal bytecode to examine data and make decisions based on it, passed from an unprivileged context, verified, then executed in a privileged context -- have evolved, and something very much like that will eventually end up in one of the embedded network-I/O-gadget processors.  Definitely a superior approach to IP filtering, for example.

So, I do think we need several programming languages, to cater for the different contexts.

Yes, but all things can be taken too far. In this case the trap is a grotty little domain-specific language (DSL) that would be better implemented as a library in a decent standard language.

DSLs almost always suffer feature creep, so that even the originators don't understand/predict all the interactions[2]. Much better to have a standard language with good general-purpose tool support.

[2] Happens to some mainstream languages too, e.g. C++

Quote
there are quite a few real and practical pitfall traps [in reordering operations when compiling C code].
Especially when there are side effects whose order does not matter at all.  I really haven't found a good way to tell that to even GCC.

Because of this discussion, I did think of one possibility: writing static inline helper functions marked pure, implementing basic atomic operations (on storage only accessed atomically) in inline assembly -- essentially cheating, by telling the compiler there are no side effects.  This would actually suffice, as these functions do modify the program state, but at an unspecified time at any point in the scope they are used in, so the compiler is free to reorder them as it sees fit within that scope.  Good enough for me!

This also indicates to me that such annotation is not needed for arbitrary expressions, only for functions and (entire) scopes.  Surprising!

And then we consider maintenance in 5 years' time by new staff, and/or changes to compiler implementations and capabilities.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #489 on: December 08, 2019, 08:47:22 am »
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.

That's true now, but in 1978?

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

In early 1978 the 8086 didn't exist. We are thinking about processors such as 8080, 8085, Z80, 1802, 6800, 6502, SCMP, F100.


Quote
Everything else was restricted to writing in assembly language, or in a high level language and using a bytecode or token interpreter which enabled reasonably compact programs, but that ran 10 or 20 times slower than the (already slow) machine was capable of.

In 1978 the only Pascal implementation was UCSD p-code. C was not generally known, and the tools would have been primitive. Not interesting to an electronic engineer.

The realistic choice for an HLL was intel's PL/M running in their "blue boxes".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #490 on: December 08, 2019, 08:50:31 am »
I remember a job interview in 1978 where I was asked about which microprocessor I thought was best. The interviewer liked my response that they were all pretty similar, and that a more important point was the development environment and tools.
That's true now, but in 1978?
Perhaps the interviewer didn't care about technical accuracy, but about the attitude/approach to tools?

I mean, you probably have asked an interviewee which programming language they'd use to solve problem X, then which language they like best, and drawn conclusions about whether they answer with the same programming language or environment for both, instead of which particular languages they mention?  ;)

Basically yes.

When. as an interviewer, I've asked such open-ended questions, I've been looking to see to what extent the interviewee can justify their answers. That can reveal a lot about their thought processes, depth/breadth of understanding, and flexibility.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline chickenHeadKnob

  • Super Contributor
  • ***
  • Posts: 1055
  • Country: ca
Re: Too many programming languages?
« Reply #491 on: December 08, 2019, 11:13:19 am »

In 1978 the only microprocessors that were at all reasonable targets for running languages such as Pascal or C and actually getting most of the available performance capability (and with a reasonable code size) were the 8086 and 6809, both introduced in that year. Or the TMS9900 and LSI-11, but they were quite uncommon.

In early 1978 the 8086 didn't exist. We are thinking about processors such as 8080, 8085, Z80, 1802, 6800, 6502, SCMP, F100.

In 1978 the only Pascal implementation was UCSD p-code. C was not generally known, and the tools would have been primitive. Not interesting to an electronic engineer.

The realistic choice for an HLL was intel's PL/M running in their "blue boxes".

All of what you wrote conforms to my memory as well. I believe there may have been mainframe implementations of Pascal extant in that year; it was rapidly gaining popularity with academics as the language you should be teaching beginners. For micros the choice was assembler or PL/M. Motorola had MPL, their PL/M equivalent, for the 6800 family. I ended up using both PL/M and MPL on concurrent projects right after graduation.

MPL manual http://bitsavers.trailing-edge.com/components/motorola/6800/exorciser/MPL_Language_Reference_Manual_1976.pdf
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Too many programming languages?
« Reply #492 on: December 08, 2019, 01:55:22 pm »
human cognitive aspects of programming

Well, a lot of scientists on my team continuously say that "if aircraft could copy the way geese fly, they would save fuel!". But our best AI has never tried to solve this, because fuel cost is not our problem.

Google scientists say they've achieved 'quantum supremacy'(1), which means that a programmable quantum device can solve a problem that classical computers practically cannot.

One month ago, their quantum computer "Sycamore" solved in 200 seconds a problem that IBM's massive supercomputer "Summit" would have taken 10K years to solve; but the IBM guys replied that a "patch" would have accelerated the time to solution ("even aggressive centipedes will co-operate if they have to"), so that the patched supercomputer's nodes could solve the problem in just 3 days rather than in thousands of years.

That's a milestone, because until recently, every computer on the planet, from a 1960s mainframe to the iPhone, has operated on the same rules. These were rules that Charles Babbage understood in the 1830s and that Alan Turing codified in the 1930s.

Human beings need a solid purpose to do things. Their nature is to be lazy, and I am afraid this also applies to the human cognitive aspects of programming  :-//

(1) This phrase was coined by the physicist John Preskill in 2012.
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #493 on: December 08, 2019, 04:53:46 pm »
Human beings need a solid purpose to do things.
Well, no, just instant gratification.

Consider Foldit. The retroviral protease of M-PMV (a monkey HIV/AIDS-like virus) had a crystal structure that went unsolved for 15 years.  Foldit gamified it in 2011, and the players found the enzyme structure in ten days.

True engineers, scientists, and tradesmen who look at things in the longer term are the oddballs.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #494 on: December 08, 2019, 05:29:05 pm »
Human beings need a solid purpose to do things.
Well, no, just instant gratification.

If you define "instant gratification" by getting an instant result, I agree.

But I think the two are more closely related than you seem to think. Having a solid purpose and acting upon it gives us a form of instant gratification, even if we can't see the result. It's all related to how our brain's reward system works.

So the relatively small fraction of people able to act on the long term rather than the short term are not, IMO, people who don't need "instant gratification" per se; it's more likely down to how their reward system specifically works (it's probably more "efficient"!).

Human beings need a solid purpose to do things.

You make it sound as though it were a defect. Actually, all living beings need a purpose for doing anything at all. It's called motivation, and it goes from our very basic needs up to more complex/abstract ones. Doing things without purpose is the oddball. Life does not like wasting energy for no reason. Call that laziness if you will. By that definition, life is essentially lazy.

The whole point, I think, is not whether we need a solid purpose for doing things; it's about our capacity for anticipation.
 

Offline hazeanderson

  • Newbie
  • Posts: 9
  • Country: us
Re: Too many programming languages?
« Reply #495 on: December 17, 2019, 02:44:19 pm »
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.

We have moved on from that ... now it is about highly available micro-services via clustering and node balancing in "The Cloud." Same concept ... just networking many instances (which can in turn employ highly parallel processing, be it threads, message passing, or forked processes) instead of one instance.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #496 on: December 18, 2019, 04:27:52 pm »
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.

We have moved on from that ... now it is about highly available micro-services via clustering and node balancing in "The Cloud."

Oh, really. ;D
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #497 on: December 18, 2019, 08:06:33 pm »
Personally I like languages that directly address the issues of highly parallel processing, since they will become increasingly important now that Moore's "Law" can no longer be used as an excuse for not thinking.

We have moved on from that ... now it is about highly available micro-services via clustering and node balancing in "The Cloud."

Oh, really. ;D

Indeed.

Far be it from me to observe that this gives youngsters new opportunities to repeat old mistakes; old mistakes that have known solutions, if there are any solutions at all. Too few youngsters grok the eight laws of distributed programming.

As I tried to teach my daughter, it is OK to make new mistakes.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Nusa

  • Super Contributor
  • ***
  • Posts: 2416
  • Country: us
Re: Too many programming languages?
« Reply #498 on: December 18, 2019, 09:05:15 pm »
Don't you mean the 8 fallacies of distributed programming?
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19516
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #499 on: December 18, 2019, 09:27:45 pm »
Don't you mean the 8 fallacies of distributed programming?

Oh, picky, picky, picky :)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

