Author Topic: Too many programming languages?  (Read 49220 times)


Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Too many programming languages?
« Reply #450 on: December 05, 2019, 03:20:13 pm »
new challenge: dev-lang/ghc on a Japanese PDA
ghc-what? give me "H" "a" "s" "k" "e" "l" "l"

Haskell :D
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #451 on: December 05, 2019, 04:14:03 pm »
No Rust though? ;D
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #452 on: December 05, 2019, 04:15:10 pm »
yup, the PowerPC has auto-increment indexed opcodes, but the ICE doesn't like this too much, for *a lot* of reasons, and, worse still, people can make a mess with the Green Hills C compiler.
Why are you using C at all in those critical projects?

As he said, they are routinely using Ada as well.

But out of curiosity, what language would you use instead?
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #453 on: December 05, 2019, 04:22:23 pm »
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

And everything that copied or claimed to copy the PDP/11 instruction set; the surviving one is the MSP430.  ARM has this too, I guess (some versions of ARM, anyway).   (But not x86, MIPS, or RISC-V?  Results of a quick check; I may not be up to date!)

Yup. And has everyone forgotten about the 68k already? The pre-dec/post-inc addressing modes were handy on those and used a lot. Separating the access and the inc/dec of the address register would instead take up significantly more cycles, so it was not just a luxury. I remember that the earlier C compilers for 68k explicitly mapped the C pre-dec/post-inc expressions onto those modes when the size of the pointed-to object allowed it. This was available way before decent optimizers came out, and it was a pretty straightforward task for compilers, not requiring any fancy algorithm.

I think many have just forgotten what it meant to code for < 1MIPS CISC CPUs with basic tools. We now take for granted what GCC produces for instance (and most modern compilers), but even with -O0, GCC produces code (for most targets) that is light-years more efficient than the compilers of the early days...
 

Offline blacksheeplogic

  • Frequent Contributor
  • **
  • Posts: 532
  • Country: nz
Re: Too many programming languages?
« Reply #454 on: December 05, 2019, 08:53:23 pm »
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).
And everything that copied or claimed to copy the PDP/11 instruction set; surviving is MSP430.  ARM has this too, I guess (some versions of ARM, anyway.)   (But not X86, MIPS, or RISC-V?  (results of a quick check; I may not be up to date!))
I think this turned out to be relatively cheap, hardware-wise.  You needed something similar for the PC anyway, and for stack instructions if you have them...

Often instructions are cracked, or have group restrictions or other execution-unit dependencies.  So, for example, store-with-update instructions were not used, as the alternatives are equally good and can be sequenced better. Grabbing a stack frame and using a multi-word store to save the non-volatile registers looked good on paper, but the instruction was cracked and slower than the alternatives; it also had a first-in-group restriction.

Unfortunately, a lot of the documentation at this level is under NDA.
« Last Edit: December 05, 2019, 08:54:56 pm by blacksheeplogic »
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4039
  • Country: nz
Re: Too many programming languages?
« Reply #455 on: December 05, 2019, 09:05:18 pm »
Quote
There were some architectures where ++/-- were supported by specific inc/dec instructions.
Umm.  By the addressing modes available on nearly ALL instructions, on some CPUs. Notably the PDP/11, which supported pre-decrement and post-increment of a memory index register on nearly all of the instructions (and which had some influence on the design of C!).

And everything that copied or claimed to copy the PDP/11 instruction set; surviving is MSP430.  ARM has this too, I guess (some versions of ARM, anyway.)   (But not X86, MIPS, or RISC-V?  (results of a quick check; I may not be up to date!))

Yup. And has everyone forgotten about the 68k already?

Who could forget it? When I need to do accounting I still use a Mac 68000 accounting program I bought in 1992. In an emulator, of course.

M68000 is clearly "one that copied the PDP/11". It's even somewhat "surviving" in the form of ColdFire, which I see the manual describes as both "variable-length RISC" and "A majority of the instructions are binary compatible or optimized 68K opcodes".

You really can't have both :-)

I've never actually had a look at it before now .. you've got the same D0..7 and A0..7 (which was a good hack to double the PDP-11 register set with still 3 bit register and addressing mode fields). And yes, still has (An)+ and -(An) modes.

Aha. "The ColdFire instruction set is a simplified version of the M68000 instruction set. The removed instructions include BCD, bit field, logical rotate, decrement and branch, and integer multiply with a 64-bit result. Nine new MAC instructions have been added." Ok, wouldn't miss most of those.

And, most vitally .. "doesn't include all the useless fancy crud introduced in the 68020". (I made that quote up)

Leaving out the rotate instructions (ROL, ROR, ROXL, ROXR) is interesting. We get a certain amount of shit with RISC-V not having rotate. But it's not used all that often and is easily implemented with a left shift, a right shift, and an OR.

But, yeah, it's a 68k pretty much. I'm guessing it's only the omission of the DBcc instructions that would make many or most 68k binaries incompatible.

Quote
I think many have just forgotten what it meant to code for < 1MIPS CISC CPUs with basic tools. We now take for granted what GCC produces for instance (and most modern compilers), but even with -O0, GCC produces code (for most targets) that is light-years more efficient than the compilers of the early days...

I don't agree with that. Apple's MPW compilers produced pretty good code. Certainly better than gcc -O0! THINK C and THINK Pascal were not bad either.

Early compilers for x86 were really awful.

gcc -O1 though, yeah it would beat any of those old compilers.
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Too many programming languages?
« Reply #456 on: December 05, 2019, 09:17:14 pm »
No Rust though? ;D

It depends on LLVM, which has some problems with Catalyst at the moment  :-//
 

Offline blacksheeplogic

  • Frequent Contributor
  • **
  • Posts: 532
  • Country: nz
Re: Too many programming languages?
« Reply #457 on: December 05, 2019, 09:24:37 pm »
in today's world we want everyone to pass the test without putting any work in.
Yup. Our automatic and autonomous tests aim for this. But also consider that part of our "test cases" is recycled and used to help the guys and robots in production test what they build.

My point, though, is that competent programmers should understand basic operators and precedence without needing packets of aspirin or a special senior classification. This is irrespective of the preferred style and coding standards of the specific organization. Some styles of code I feel are much nicer to maintain than others, but style should not prevent a programmer from doing their job, and understanding operators should not require a special title or be considered otherwise too difficult.

It reminds me of a discussion I had with a tutor at a local technical college. They stopped teaching C because students were having too much difficulty with the language; they now teach using Python and BASIC. One of his KPIs is the number of students passing. We teach our children by lowering the standards until they too pass the test, rather than having the hard conversation. Therefore everyone is a winner, and we have to put fences in to keep all but a select few out.

There are posts that talk about the 'Rockstar' programmer as if a competent programmer were nowadays considered special. The real 'Rockstars' in a team are the ones with domain knowledge.

Getting back to the original question of this long topic, I think it's not the number of programming languages that is the problem. A language is just a tool: some, like C, are general purpose, and some, like IL, are job specific. A good developer will pick up a new programming language and become productive. The root problem is the number and quality of the programmers. Writing a script on the weekend and calling yourself a programmer is analogous to changing a light bulb and calling yourself an electrician.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #458 on: December 05, 2019, 09:33:00 pm »
No Rust though? ;D

It depends on LLVM, which has some problem with Catalyst at the moment  :-//

I see. Anyway, I'm still not convinced by Rust for a number of reasons.

And now, this last piece seems to be a potential additional reason: https://msrc-blog.microsoft.com/2019/11/07/using-rust-in-windows/
Be very afraid!  :-DD
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Too many programming languages?
« Reply #459 on: December 05, 2019, 10:46:49 pm »
competent programmers should understand basic operators and precedence without needing packets of aspirin and a special senior classification [...] a tutor at a local technical college. They stopped teaching C because students were having too much difficulty with the language

bah ... I am a bit perplexed. Do students have to deal with complex hardware, ICEs, dynamic coverage, DO-178B supersonic certifications, voters, AI, autonomous testing machines, high responsibility for human life, and high psychological pressure from deadlines that must be respected?

Sure, we had too many difficulties with the C language, but these are a different kind of difficulty: they are part of the product "life cycles" we develop, rather than a "lazy mind-attitude" like in your example with students.

Was it not clear? Probably my fault; perhaps not properly explained :-//
 
The following users thanked this post: blacksheeplogic

Offline Tepe

  • Frequent Contributor
  • **
  • Posts: 572
  • Country: dk
Re: Too many programming languages?
« Reply #460 on: December 05, 2019, 10:54:45 pm »
But out of curiosity, what language would you use instead?
I would probably be a bit cautious and lean towards Ada.
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Too many programming languages?
« Reply #461 on: December 05, 2019, 11:30:09 pm »
Quote
competent programmers should understand basic operators and precedence without needing packets of aspirin

The packets-of-aspirin anecdote was not saying that people need aspirin because they do not correctly understand basic operators and precedence, but rather because our ICEs and AI can get so confused by certain features of the C language, combined with the weird avionics testing ecosystem, that people have to find "workarounds" to avoid extra problems. But your task is not finding workarounds. Sure, you might think it's a creative task, and it is the first and second time, but in the long term most people perceive it as stressful and frustrating, especially with deadlines to respect.

Due to human ego, before it became a "banned rule" (like "don't use recursion"), most guys on the dev team refused to give up pointer arithmetic, even though it is a notorious source of confusion for our testing ecosystem; they probably wanted to underline their competence with the C language, their ego, and as a result my boss continuously asked my squad to edit their files.

You can imagine the extra work for the testing squad. It's like when devs program wild (without respecting anything), and then someone has to make the code MISRA compliant.

Now I am on the dev squad, but did it make sense? I know the answer from experience.

Hence I appreciate that a simple "starting from now, it's banned(1), do not question it" solved the issue; it also improved efficiency and reduced frustration.


(1) Since it's now a rule, "do not use pointer arithmetic", their egos feel comfortable. They know how to use pointer arithmetic, but they are not allowed to; hence, if they do not use it, it's not because they do not know the C language, but because it's imposed.
Human psychology is ... funny  :D
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19515
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #462 on: December 05, 2019, 11:51:02 pm »
It reminds me of a discussion I had with a tutor at a local technical college. They stopped teaching C because students were having too much difficulty with the language. They now teach using Python and BASIC.

A tool should solve real-world problems without introducing unnecessary problems that hinder solving the real-world problem.

So, the college avoided using such problematic tools when teaching neophytes. Good.

Quote
One of his KPI's is the number of students passing. We teach our children by lowering the standards until they too pass the test rather than having the hard conversation, therefore everyone is a winner and we have to put fences in to keep all but a select few out.

That, of course, is objectionable and a real problem.

Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #463 on: December 06, 2019, 12:47:16 am »
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19515
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #464 on: December 06, 2019, 08:14:32 am »
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.

Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: legacy, blacksheeplogic

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #465 on: December 06, 2019, 04:08:49 pm »
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.

Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)

The ability to both keep one's goals in mind when it's most needed, and to assess, even roughly, how much time a given task will take you, is an essential part of engineering indeed. (Heck, it's important in many areas of life, actually.) Lacking this ability would definitely cause you recurring problems in your professional life.
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #466 on: December 06, 2019, 08:18:37 pm »
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.
Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)
Fully agreed!  ^-^

Fortunately, in real life -- as opposed to exams -- it is easier to find out the objective.  In exams, the lecturer may have any number of different objectives, and it is not always easy to tell which.

The ability to both keep one's goals in mind when it's most needed, and to assess, even roughly, how much time a given task will take you, is an essential part of engineering indeed.
I definitely agree.  (Overstrikes mine, because it really is essential nowadays in just about every task.  Even research.)

I have an unfortunate tendency to underestimate the effort needed for my own part (a personality flaw), and must remember to compensate by a factor of two or so.  It does not affect comparison of tasks, though.  Outside work, in idle chat or discussions like in this forum, I often forget to compensate.

Like I said, am a researcher/problem solver, not an engineer.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19515
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Too many programming languages?
« Reply #467 on: December 06, 2019, 10:09:02 pm »
Professor Eric Laithwaite at Imperial College used to set exams where one question was easy and sufficient to get you a pass mark, one was more challenging and could get you a good degree, and one could not be answered adequately in the time available. He expected his undergraduate engineers to be able to determine which questions to avoid. If they couldn't, they wouldn't make good engineers anyway.
I'd have failed immediately; I've always been a sucker for interesting hard questions.  Probably a reason I'm more into science/research than engineering.
Yes.

But in an exam you are up to your neck in alligators, and if you can't remember that your objective is to drain the swamp then you don't deserve to be an engineer :)
Fully agreed!  ^-^

Fortunately, in real life -- as opposed to exams -- it is easier to find out the objective.  In exams, the lecturer may have any number of different objectives, and it is not always easy to tell which.

I find it much more difficult to define "success" in normal life!

Quote
The ability to both keep one's goals in mind when it's most needed, and to assess, even roughly, how much time a given task will take you, is an essential part of engineering indeed.
I definitely agree.  (Overstrikes mine, because it really is essential nowadays in just about every task.  Even research.)

I have an unfortunate tendency to underestimate the effort needed for my own part (a personality flaw), and must remember to compensate by a factor of two or so.  It does not affect comparison of tasks, though.  Outside work, in idle chat or discussions like in this forum, I often forget to compensate.

Like I said, am a researcher/problem solver, not an engineer.

Many people estimate how long a task will take by assuming they will be productive for 100% of that time. In reality, many people are only directly productive for 30-40% of the time. Hence a multiplication factor of 2.5-3.3 is appropriate. Or just convert to the next higher unit :)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #468 on: December 06, 2019, 11:08:14 pm »
It does not affect comparison of tasks, though.

Comparing tasks relative to one another is not helpful, though, when you need to decide which one to tackle, unless of course you have a hard rule of always picking the one that will take the least time (or the most).

Like in the "exam" example, you may have perfectly identified that one of the questions was definitely the hardest one, but you may still have chosen to pick it because you still felt you could manage in the available time.

The same can happen when working on projects. You may have a choice to make between two options, one you know for sure is harder than the other, but you could still think it's manageable (and of course a lot more interesting)... This is in fact part of a manager's nightmare. ;D Oh, and more often than not, this guy that the manager let take this decision will end up resenting their manager for letting them do it if it ever fails or time slips awfully... (yes engineering management can be tough!)


« Last Edit: December 06, 2019, 11:11:28 pm by SiliconWizard »
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #469 on: December 06, 2019, 11:26:28 pm »
Relatively comparing tasks is not helpful, though, when you need to make a decision which one to tackle, unless of course you have a hard rule of always picking the one which will take the least time (or the most).
Not just a hard rule, it can be useful in other situations too.  For example, if you know that one of the tasks is intentionally too complex to perform in the given time, or when you need to split a set of tasks to a varied group of workers.

You may have a choice to make between two options, one you know for sure is harder than the other, but you could still think it's manageable (and of course a lot more interesting)...
Yup, a classic footgun situation.  Reliable absolute estimates are much more often needed than comparisons.

I find it much more difficult to define "success" in normal life!
True; I associate that sort of thing with loved ones and carrying meaningful responsibilities reliably, but other than that, I wouldn't much bother with the definition.

(Above, I meant "successfully" only in the sense of not failing horribly, causing wasted time and resources.)

Many people estimate how long it will take to do a task assuming they will be productive for 100% of that time. In reality many people are the only directly productive for 30-40% of the time. Hence a multiplication factor of 2.5-3.3 is appropriate. Or just convert to the next higher unit :)
I was once advised to use Pi as the factor. :)

Also, I'm one of the people who do not multitask well, and have had to learn to be specific and clear about priorities and timetables.  It is not always clear from the context whether you're asked to switch to do thing B first, to do things A and B at the same time, or to do A first and then B.  This has bitten me bad when I was younger.

Which, by the way, gets us back to programming languages.

As I've mentioned before, I'd like to have a C-like programming language, but one more restricted to the types of processors and microcontrollers we have.  One ability I'd like to have is to tell the compiler to accomplish unrelated things in whichever order is best.  This is somewhat useful in initialization loops, but would be really useful in hard optimization of complex non-loop math operations (typical in certain types of simulations, especially force-field models in chemistry).

I really liked the Fortran 95 FORALL loop (where the iteration order is undefined), but they're deprecating that in future Fortran standards.  :(

While I don't like many (most?) of the current programming languages, every one of them has provided further insight into what features benefit and what detract from a programming language.  So, while there are lots of languages I would recommend against using when writing widely-used tools or end-user applications, I do see them having a valid use in research and experimentation: without them, our future programming languages would be worse.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14481
  • Country: fr
Re: Too many programming languages?
« Reply #470 on: December 06, 2019, 11:58:07 pm »
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.

You'd probably need to be more specific though. At high optimization levels, good C compilers already re-order operations when they can safely do so. You probably had something more advanced in mind, would you care to give an example?

I really liked the Fortran 95 FORALL loop (where the iteration order is undefined), but they're deprecating that in future Fortran standards.  :(

I've seen that kind of loop in ParaSail (the language I created a thread for - didn't get much traction), I think. Of course it's just a curiosity more than anything else at the moment.

From the little I've explored of OpenMP, I think you can definitely do that with it. It will parallelize iterations that don't depend on one another. OK, OpenMP is more an "extension" than an integral part of a language.
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #471 on: December 07, 2019, 01:01:30 am »
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.
You'd probably need to be more specific though. At high optimization levels, good C compilers already re-order operations when they can safely do so. You probably had something more advanced in mind, would you care to give an example?
I'd like to allow reordering and especially interleaving of operations, ignoring sequence points and order of side effects.  (If you consider C++ atomics, you'll immediately see how this is essentially extending the models to sequence points and order of side effects.)

Essentially, I'd like to be able to tell the compiler that if I have code like
    {
        v1 = complex_expression_one();
        v2 = complex_expression_two();
    }
the compiler is allowed to evaluate the contents of the block in parallel (interleaved, using a single thread of execution), completely ignoring sequence points or the order of side effects.

These code segments are rare, but very hot, and I'd like to also annotate them so the compiler knows to do lots of extra work to optimize this particular chunk of code, even as far as brute-forcing some details.  Technically, you could do that via attributes or pragmas, but I'd like the language to natively support it.

I really liked the Fortran 95 FORALL loop (where the iteration order is undefined), but they're deprecating that in future Fortran standards.  :(
I've seen that kind of loop in ParaSail (the language I created a thread for - didn't get much traction), I think. Of course it's just a curiosity more than anything else at the moment.
Right.  It is in the same category as memrep(buffer, offset, length), the complement of memcpy()/memmove(), in the sense that it fills buffer with repeated copies of its first offset bytes.  (It is the inverse of memcpy() with respect to memmove(), implementation-wise.  It is very useful for initializing arrays with structure and/or floating-point members, since only the storage representation matters.  While you can trivially implement it yourself, it should be something the compiler can optimize for the target architecture; i.e., in GCC, a built-in function, and not just a library function.)

For the little I've explored with OpenMP, I think you can definitely do that with it.
OpenMP parallelizes the loops with multiple threads.  That's not what I am talking about.  I'm talking about generating machine code that interleaves several operations on superscalar architectures with enough registers.

Also, like I said, I want this even closer to the metal than C, and OpenMP is quite a complicated abstraction, with a lot of hidden costs in the thread management.  When writing kernels, or simulators using MPI and a fixed number of threads per node (which is typical), OpenMP is a square peg in a round hole.

Does anyone know of a programming language that exposes the flag register after operations or function calls?  The standard flags (Zero, Carry, Negative, Overflow) would be rather useful.  I do realize that it would require emulation on MIPS and RISC-V at least, as they don't have them in a dedicated register.
 

Offline blacksheeplogic

  • Frequent Contributor
  • **
  • Posts: 532
  • Country: nz
Re: Too many programming languages?
« Reply #472 on: December 07, 2019, 01:10:52 am »
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.

Compile-time optimization can only go so far in determining 'best', and 'best' varies depending on resource availability. FDPR is much more interesting and potentially a better direction to explore and extend. I've used FDPR primarily for its quick win with branch prediction, but it, along with tools like CTRACE, can give a lot of insight into run-time behavior.
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6264
  • Country: fi
    • My home page and email address
Re: Too many programming languages?
« Reply #473 on: December 07, 2019, 01:43:03 am »
One ability I'd like to have, is to tell the compiler to accomplish unrelated things in whichever order is best.
Compile-time optimization can only go so far in determining 'best'
In this case 'best' is defined as using the minimum number of cycles in the operating thread, including statistical latency estimates, possibly interleaving the operations.  These are extremely rare but extremely hot (typically simulation calculations; I do not recall seeing them in application-type software). Like I said, I'd love the compiler to even brute-force the various implementation possibilities.  I could even accept if this was only possible for native compilation, and disabled for cross-compilation.

A secondary use case is to interleave unrelated operations because their order and side effects are irrelevant, typical in initialization.  There, the intent is for the compiler to choose how to implement the machine code; to allow optimization if such is available on the target architecture.  Currently, in C, the sequence points and therefore the order of side effects are dictated by the standard, and the compiler is quite restricted in its ability to reorder operations (and still stay standards-compliant).

Neither of these involves branches (conditionals sometimes, yes, but only very rarely branches), so runtime profiling is useless here.
(The code profiling above, for brute-forcing different implementations, is done as part of the compilation, and there a statistical answer suffices; no code needs to be "run" at all.)
« Last Edit: December 07, 2019, 01:45:54 am by Nominal Animal »
 

Offline GeorgeOfTheJungle

  • Super Contributor
  • ***
  • !
  • Posts: 2699
  • Country: tr
Re: Too many programming languages?
« Reply #474 on: December 07, 2019, 01:55:15 am »
A secondary use case is to interleave unrelated operations because their order and side effects is irrelevant

I think one way to do that is with , instead of ;

If you write complex_expression_one(); complex_expression_two(); _one()'s side effects must have happened before executing _two(), but if you write instead complex_expression_one(), complex_expression_two(); you're telling the compiler that it doesn't matter. I might be wrong as always, though.
The further a society drifts from truth, the more it will hate those who speak it.
 

