Author Topic: how is the job prospect of FPGA or Labview?  (Read 6020 times)


Offline david-ITTopic starter

  • Contributor
  • !
  • Posts: 17
  • Country: us
  • never stop learning
how is the job prospect of FPGA or Labview?
« on: March 29, 2016, 10:05:38 am »
I am a beginner with FPGAs and I also know LabVIEW; now I want to improve myself and become proficient at one of them. Can anyone tell me what the job prospects are for FPGA or LabVIEW?
Thanks very much
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #1 on: March 29, 2016, 10:33:07 am »
I am a beginner with FPGAs and I also know LabVIEW; now I want to improve myself and become proficient at one of them. Can anyone tell me what the job prospects are for FPGA or LabVIEW?
Thanks very much

The half-life of knowledge about any particular tool is a few years at most. If, say, it is 3 years, then after 9 years 87.5% of your knowledge is obsolete. (FPGA example: how much use is knowledge of Xilinx's ISE, now that Vivado is "the future"?)

Knowledge of fundamental principles can and does last a lifetime.

If you want to find out how many companies are hiring people with experience of X, look at the job adverts. Of course, by the time you have learned X, the market may have moved on to X2 or Y.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: denimdragon

Offline danadak

  • Super Contributor
  • ***
  • Posts: 1875
  • Country: us
  • Reactor Operator SSN-583, Retired EE
Re: how is the job prospect of FPGA or Labview?
« Reply #2 on: March 29, 2016, 11:23:12 am »
The C language has had a 44+ year life, but the prior poster is right on concerning IDEs. Assembly language for some processors - like the Intel 8051 that will not die, and now ARM - stays current for a large number of years.

The good news is that vendors' tools in any specific technology seem to "merge" in adaptability for the user. That is to say, once experienced in one tool, a user can jump from tool to tool and enjoy a fast learning curve. That's because the tools are increasingly GUI-based and provide a lot of the "mundane" IP that no longer has to be re-invented every time a new project is started.

LabVIEW seems to have taken on a new life as a low-cost, rapid-turnaround interface to instruments and sensors - a quick way of getting a custom GUI together for product operation. Newer is Python, but on a wider application basis.

Regards, Dana.
« Last Edit: March 29, 2016, 11:27:40 am by danadak »
Love Cypress PSOC, ATTiny, Bit Slice, OpAmps, Oscilloscopes, and Analog Gurus like Pease, Miller, Widlar, Dobkin, obsessed with being an engineer
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #3 on: March 29, 2016, 11:57:52 am »
The C language has had a 44+ year life, but the prior poster is right on concerning IDEs.

As someone who started writing C in 1981, when there were a total of two books on the language, I can assure you that C has mutated almost out of all recognition - and not always for the better.

Quote
The good news is that vendors' tools in any specific technology seem to "merge" in adaptability for the user. That is to say, once experienced in one tool, a user can jump from tool to tool and enjoy a fast learning curve. That's because the tools are increasingly GUI-based and provide a lot of the "mundane" IP that no longer has to be re-invented every time a new project is started.

GUIs have very little to do with it.

The fundamentals are the same no matter which IDE tool is used. All that changes is where the buttons are, or what the command line options are.

Even languages are very similar to each other, within broad groups. For example, there is a strong common thread from Smalltalk->Objective C->Java->C#. The syntax changes, but the semantics have remained largely unchanged over 30 years. Once you know one proficiently, it is very quick to use another proficiently.

OTOH, there is more of a problem switching to any of the FSM languages, or Lisp, or Prolog, or C.
 

Offline danadak

  • Super Contributor
  • ***
  • Posts: 1875
  • Country: us
  • Reactor Operator SSN-583, Retired EE
Re: how is the job prospect of FPGA or Labview?
« Reply #4 on: March 29, 2016, 01:41:36 pm »
Quote
GUIs have very little to do with it.

Interesting. I started with Fortran in the early '70s, then several assembly languages, minis, and test systems (more than once I had to code the system to discover its HW language - horrible docs), then some BASIC, and started late on C. Now working on Verilog and Python, at a snail's pace I might add.

I found that GUIs largely eliminated the need for command-line execution, or the need to learn each tool's own language - which frankly, after punching tons of cards and paper tape, was a blessing. Then we got good editors, then finally drag-and-drop components with a code base attached to the GUIs. And at about the same time came debuggers/JTAG/on-chip debug, again all wrapped in GUIs. So for me, going from one non-GUI toolset to another is like trying to climb Everest, as opposed to the ease with which I adapt to a language supported by a GUI.

So I guess our definition of what a GUI does is probably an expression of experience.

Quote
As someone who started writing C in 1981, when there were a total of two books on the language, I can assure you that C has mutated almost out of all recognition - and not always for the better.

For me, learning C was a disappointment, but a necessity. So strongly typed. I am fascinated by your comment that it is unrecognizable from its origin; I thought it was so simple already, how could it have evolved much? I will take your word for it.

Regards, Dana.
« Last Edit: March 29, 2016, 01:43:21 pm by danadak »
 

Offline krivx

  • Frequent Contributor
  • **
  • Posts: 765
  • Country: ie
Re: how is the job prospect of FPGA or Labview?
« Reply #5 on: March 29, 2016, 02:58:30 pm »
For me, learning C was a disappointment, but a necessity. So strongly typed. I am fascinated by your comment that it is unrecognizable from its origin; I thought it was so simple already, how could it have evolved much? I will take your word for it.

Regards, Dana.

Yeah, I agree. I learnt C relatively recently (at least compared to people in this thread) but I can read K&R just fine.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #6 on: March 29, 2016, 04:23:20 pm »
Quote
GUIs have very little to do with it.

Interesting. I started with Fortran in the early '70s, then several assembly languages, minis, and test systems (more than once I had to code the system to discover its HW language - horrible docs), then some BASIC, and started late on C. Now working on Verilog and Python, at a snail's pace I might add.

I found that GUIs largely eliminated the need for command-line execution, or the need to learn each tool's own language - which frankly, after punching tons of cards and paper tape, was a blessing. Then we got good editors, then finally drag-and-drop components with a code base attached to the GUIs. And at about the same time came debuggers/JTAG/on-chip debug, again all wrapped in GUIs. So for me, going from one non-GUI toolset to another is like trying to climb Everest, as opposed to the ease with which I adapt to a language supported by a GUI.

So I guess our definition of what a GUI does is probably an expression of experience.

Well, you have chosen to omit the point I was making, and have started down a different track. I agree with all you say about GUIs, except that command-line options are preferable for automated build and test systems where multiple people are working on the same codebase. However, that doesn't invalidate the point I was making.

BTW, FWIW I actually started with Algol-60, five-channel paper tape, 5 cps teletypes, and a 2 kIPS 39-bit machine.


Quote
Quote
As someone who started writing C in 1981, when there were a total of two books on the language, I can assure you that C has mutated almost out of all recognition - and not always for the better.

For me, learning C was a disappointment, but a necessity. So strongly typed.

C data is completely untyped; that is one of the reasons C is so useful and is still used. C variables are typed, but the typing can be - and often is - overridden.
Contrast that with Smalltalk and Java (and the decent bits of C#) where the data is strongly and unbreakably typed.
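To make the C half of that concrete, a minimal sketch (my own illustration, assuming the usual 4-byte IEEE-754 float and 4-byte unsigned int):
Code: [Select]
#include <stdio.h>
#include <string.h>

int main(void)
{
    float f = 1.0f;
    unsigned int bits;

    /* f is declared as a float, but nothing stops the code from re-reading
       its bytes as an unsigned integer.  memcpy is the well-defined way;
       the classic *(unsigned *)&f cast also compiles, but it breaks the
       strict-aliasing rules of the later C standards. */
    memcpy(&bits, &f, sizeof bits);
    printf("1.0f reinterpreted as an integer: 0x%08X\n", bits); /* 0x3F800000 on IEEE-754 machines */
    return 0;
}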

Quote
I am fascinated by your comment that it is unrecognizable from its origin; I thought it was so simple already, how could it have evolved much? I will take your word for it.

A starting point would be to look at the number of reserved keywords. Then add their interaction. And the latest variant finally has a memory model, although who knows when that will be completely implemented and used.
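(For the memory-model point, a minimal sketch using C11's <stdatomic.h> - only an illustration of what having a memory model buys you, not code from any real project:)
Code: [Select]
#include <stdatomic.h>
#include <stdio.h>

/* With the C11 memory model, concurrent access to this counter is well
   defined; with a plain int it would be a data race, i.e. undefined
   behaviour. */
static atomic_int hits;

void record_hit(void)
{
    /* Relaxed ordering is enough for a simple statistics counter. */
    atomic_fetch_add_explicit(&hits, 1, memory_order_relaxed);
}

int main(void)
{
    record_hit();
    printf("hits = %d\n", atomic_load(&hits));
    return 0;
}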

For amusement, have a look at the FQA http://yosefk.com/c++fqa/ which has a lot in common with C, of course. I'm particularly amused by the "const" sections, since I remember the unresolved and unresolvable discussions as to whether it should/must be possible to "cast away constness".
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #7 on: March 29, 2016, 04:33:30 pm »
For me, learning C was a disappointment, but a necessity. So strongly typed. I am fascinated by your comment that it is unrecognizable from its origin; I thought it was so simple already, how could it have evolved much? I will take your word for it.

Regards, Dana.

Yeah, I agree. I learnt C relatively recently (at least compared to people in this thread) but I can read K&R just fine.

I haven't seen K&R C in decades. You know, the C where
Code: [Select]
int foo(int baz, int bar) {
   /* body omitted */
  return 0;
}
won't compile due to a syntax error,  whereas
Code: [Select]
int foo(baz, bar)
  int baz;
  int bar;
  {
   /* body omitted */
  return 0;
}
is OK.
 

Offline danadak

  • Super Contributor
  • ***
  • Posts: 1875
  • Country: us
  • Reactor Operator SSN-583, Retired EE
Re: how is the job prospect of FPGA or Labview?
« Reply #8 on: March 29, 2016, 07:11:58 pm »
What did you mean by this -

Quote
C data is completely untyped; that is one of the reasons C is so useful and is still used. C variables are typed, but the typing can be - and often is - overridden.

I understand the overriding, e.g. compiler intervention, and I think of the wiki definition of typing as function-call variables having defined types, but "data is completely untyped"? Do you mean that, irrespective of declared typing, data by itself has no meaning until we the programmers define what it is? Or am I missing the meaning altogether?

Back to the original point,

Quote
GUIs have very little to do with it.

The fundamentals are the same no matter which IDE tool is used. All that changes is where the buttons are, or what the command line options are.

My way of thinking is that the GUI is like the original horseless carriage: without a body it was a pain to drive in a snow- or rainstorm, and when the body came along so did many additional capabilities. I agree the fundamentals did not change, e.g. engine, wheels, etc., but how much better the ride is now. Maybe we are arguing the horse-before-the-carriage problem, about which I am still severely confused...... :o

Quite familiar with the teletype - it was the main I/O on a PDP-8 I used to test ICs. Not to exclude the switch/light panel.

Regards, Dana.
« Last Edit: March 29, 2016, 07:14:51 pm by danadak »
 
The following users thanked this post: david-IT

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #9 on: March 29, 2016, 07:33:58 pm »
What did you mean by this -
Quote
C data is completely untyped; that is one of the reasons C is so useful and is still used. C variables are typed, but the typing can be - and often is - overridden.
I understand the overriding, e.g. compiler intervention, and I think of the wiki definition of typing as function-call variables having defined types, but "data is completely untyped"? Do you mean that, irrespective of declared typing, data by itself has no meaning until we the programmers define what it is? Or am I missing the meaning altogether?

In C, the data is one or more bytes, and the meaning of the bytes is determined by whatever is "looking at" and "interpreting" the bytes. Hence 0x01, 0x02 could be a Unicode character, 258 decimal, 513 decimal, a pointer to a function, or.... That is most clearly visible when one thinks of casts, of attempting to interpret core dumps, or when you point your debugger at a random piece of memory. In the debugger you have to tell the debugger how to interpret the bytes, usually by accessing them in the context of a typed variable - but if the type is wrong, you simply won't know it is wrong.

In contrast, in Java/Smalltalk/C# etc. every piece of data has a unique meaning defined by an associated "class pointer" or similar. Thus it is impossible to cast a horse into a camel, and there is no way to "downcast" an animal into a horse unless the animal really is a horse - if it was a camel there would be a runtime "ClassCastException". Requiring every piece of data to also have a class pointer increases the memory footprint, of course, but in many circumstances that is of less concern than correct program operation and graceful recovery when the unexpected occurs. When in the debugger, you are offered the exact set of valid operations on the data.
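A minimal sketch of that point (my own example; the printed values assume the usual 8-bit bytes):
Code: [Select]
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Two raw bytes with no inherent meaning of their own. */
    unsigned char bytes[2] = { 0x01, 0x02 };

    /* Whatever "looks at" them decides what they mean. */
    uint16_t as_u16;
    memcpy(&as_u16, bytes, sizeof as_u16);

    printf("read as a little-endian 16-bit integer: %u\n", (unsigned)as_u16); /* 513 on x86 */
    printf("read as a big-endian 16-bit integer:    %u\n",
           (unsigned)((bytes[0] << 8) | bytes[1]));                           /* 258 */
    printf("read as two separate bytes:             %u %u\n", bytes[0], bytes[1]);
    return 0;
}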
 

Offline david-ITTopic starter

  • Contributor
  • !
  • Posts: 17
  • Country: us
  • never stop learning
Re: how is the job prospect of FPGA or Labview?
« Reply #10 on: March 30, 2016, 10:26:04 am »
I am a beginner with FPGAs and I also know LabVIEW; now I want to improve myself and become proficient at one of them. Can anyone tell me what the job prospects are for FPGA or LabVIEW?
Thanks very much

The half-life of knowledge about any particular tool is a few years at most. If, say, it is 3 years, then after 9 years 87.5% of your knowledge is obsolete. (FPGA example: how much use is knowledge of Xilinx's ISE, now that Vivado is "the future"?)

Knowledge of fundamental principles can and does last a lifetime.

If you want to find out how many companies are hiring people with experience of X, look at the job adverts. Of course, by the time you have learned X, the market may have moved on to X2 or Y.
Yes, you are right, thanks very much. I will work hard to learn.
 

Offline david-ITTopic starter

  • Contributor
  • !
  • Posts: 17
  • Country: us
  • never stop learning
Re: how is the job prospect of FPGA or Labview?
« Reply #11 on: March 30, 2016, 10:28:29 am »
The C language has had a 44+ year life, but the prior poster is right on concerning IDEs. Assembly language for some processors - like the Intel 8051 that will not die, and now ARM - stays current for a large number of years.

The good news is that vendors' tools in any specific technology seem to "merge" in adaptability for the user. That is to say, once experienced in one tool, a user can jump from tool to tool and enjoy a fast learning curve. That's because the tools are increasingly GUI-based and provide a lot of the "mundane" IP that no longer has to be re-invented every time a new project is started.

LabVIEW seems to have taken on a new life as a low-cost, rapid-turnaround interface to instruments and sensors - a quick way of getting a custom GUI together for product operation. Newer is Python, but on a wider application basis.

Regards, Dana.
Yes, Python is widely used now. I just learned front-end languages and PHP; the knowledge changes so fast.
 

Offline david-ITTopic starter

  • Contributor
  • !
  • Posts: 17
  • Country: us
  • never stop learning
Re: how is the job prospect of FPGA or Labview?
« Reply #12 on: March 30, 2016, 10:30:47 am »
The C language has had a 44+ year life, but the prior poster is right on concerning IDEs.

As someone who started writing C in 1981, when there were a total of two books on the language, I can assure you that C has mutated almost out of all recognition - and not always for the better.

Quote
The good news is that vendors' tools in any specific technology seem to "merge" in adaptability for the user. That is to say, once experienced in one tool, a user can jump from tool to tool and enjoy a fast learning curve. That's because the tools are increasingly GUI-based and provide a lot of the "mundane" IP that no longer has to be re-invented every time a new project is started.

GUIs have very little to do with it.

The fundamentals are the same no matter which IDE tool is used. All that changes is where the buttons are, or what the command line options are.

Even languages are very similar to each other, within broad groups. For example, there is a strong common thread from Smalltalk->Objective C->Java->C#. The syntax changes, but the semantics have remained largely unchanged over 30 years. Once you know one proficiently, it is very quick to use another proficiently.

OTOH, there is more of a problem switching to any of the FSM languages, or Lisp, or Prolog, or C.
C is interesting sometimes, but sometimes it is a little difficult. Now software seems more popular.
 

Offline krivx

  • Frequent Contributor
  • **
  • Posts: 765
  • Country: ie
Re: how is the job prospect of FPGA or Labview?
« Reply #13 on: March 30, 2016, 10:41:11 am »
For me, learning C was a disappointment, but a necessity. So strongly typed. I am fascinated by your comment that it is unrecognizable from its origin; I thought it was so simple already, how could it have evolved much? I will take your word for it.

Regards, Dana.

Yeah, I agree. I learnt C relatively recently (at least compared to people in this thread) but I can read K&R just fine.

I haven't seen K&R C in decades. You know, the C where
Code: [Select]
int foo(int baz, int bar) {
   /* body omitted */
  return 0;
}
won't compile due to a syntax error,  whereas
Code: [Select]
int foo(baz, bar)
  int baz;
  int bar;
  {
   /* body omitted */
  return 0;
}
is OK.

I meant reading the book, but these examples are very similar. I don't see how one has mutated out of recognition; it usually seems more like project-specific coding styles/standards have more impact on the code than which C language standard it was written against.  :-//
 

Offline david-ITTopic starter

  • Contributor
  • !
  • Posts: 17
  • Country: us
  • never stop learning
Re: how is the job prospect of FPGA or Labview?
« Reply #14 on: March 30, 2016, 10:59:05 am »
What did you mean by this -
Quote
C data is completely untyped; that is one of the reasons C is so useful and is still used. C variables are typed, but the typing can be - and often is - overridden.
I understand the overriding, e.g. compiler intervention, and I think of the wiki definition of typing as function-call variables having defined types, but "data is completely untyped"? Do you mean that, irrespective of declared typing, data by itself has no meaning until we the programmers define what it is? Or am I missing the meaning altogether?

In C, the data is one or more bytes, and the meaning of the bytes is determined by whatever is "looking at" and "interpreting" the bytes. Hence 0x01, 0x02 could be a Unicode character, 258 decimal, 513 decimal, a pointer to a function, or.... That is most clearly visible when one thinks of casts, of attempting to interpret core dumps, or when you point your debugger at a random piece of memory. In the debugger you have to tell the debugger how to interpret the bytes, usually by accessing them in the context of a typed variable - but if the type is wrong, you simply won't know it is wrong.

In contrast, in Java/Smalltalk/C# etc. every piece of data has a unique meaning defined by an associated "class pointer" or similar. Thus it is impossible to cast a horse into a camel, and there is no way to "downcast" an animal into a horse unless the animal really is a horse - if it was a camel there would be a runtime "ClassCastException". Requiring every piece of data to also have a class pointer increases the memory footprint, of course, but in many circumstances that is of less concern than correct program operation and graceful recovery when the unexpected occurs. When in the debugger, you are offered the exact set of valid operations on the data.


For software, PHP seems popular now.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #15 on: March 30, 2016, 01:25:04 pm »
I meant reading the book, but these examples are very similar. I don't see how one has mutated out of recognition; it usually seems more like project-specific coding styles/standards have more impact on the code than which C language standard it was written against.  :-//

Sigh. That was a trivial example of one difference.

Please find "little things" like volatile, sequence points, the memory model [sic], const (and many many more) anywhere in the first two books.

Most importantly, in the first two books there were never any arguments (let alone interminable arguments) about what the code would do after compilation. Now the standards are a bloated mess that require language lawyers. And even then they often misinterpret and/or fail to agree on what is stated.
 

Offline krivx

  • Frequent Contributor
  • **
  • Posts: 765
  • Country: ie
Re: how is the job prospect of FPGA or Labview?
« Reply #16 on: March 30, 2016, 05:27:42 pm »
I meant reading the book, but these examples are very similar. I don't see how one has mutated out of recognition; it usually seems more like project-specific coding styles/standards have more impact on the code than which C language standard it was written against.  :-//

Sigh. That was a trivial example of one difference.

Please find "little things" like volatile, sequence points, the memory model [sic], const (and many many more) anywhere in the first two books.

Most importantly, in the first two books there were never any arguments (let alone interminable arguments) about what the code would do after compilation. Now the standards are a bloated mess that require language lawyers. And even then they often misinterpret and/or fail to agree on what is stated.

I think any failure of the standards is mostly due to the many cases that result in undefined behaviour. I think a lot of that could be deliberate for performance reasons but I don't really know enough about the compilers to be sure.

As for volatile, sequence points, const etc., I guess I don't find them too cryptic, as they were already part of the language when I learnt it, so I have the opposite view.  :-// Most additions seem necessary; I can't see embedded programming working too well with optimizing compilers without a volatile model, for example. cdecl (http://cdecl.org/) and static-analysis tools get me over the rest of it.
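A minimal sketch of the embedded case I mean (the register address and status bit below are made up, purely for illustration):
Code: [Select]
#include <stdint.h>

/* Hypothetical memory-mapped UART status register; the address and bit are
   invented for this sketch.  Without volatile, an optimizing compiler may
   read the register once, hoist the load out of the loop, and spin forever
   on a stale value. */
#define UART_STATUS  (*(volatile uint32_t *)0x40001000u)
#define TX_READY     (1u << 5)

void wait_for_tx_ready(void)
{
    while ((UART_STATUS & TX_READY) == 0) {
        /* busy-wait: volatile forces a fresh read of the hardware register
           on every iteration */
    }
}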

C++ is a different kettle of fish.
 

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
Re: how is the job prospect of FPGA or Labview?
« Reply #17 on: March 30, 2016, 06:27:53 pm »
Totally off topic, but this thread has made me feel young! I started programming in FORTRAN in 1979 (punched cards handed over to computer operators) and didn't switch to C until the mid 1980s!

It is good to see others who started even earlier. :)
 
The following users thanked this post: david-IT

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #18 on: March 30, 2016, 07:00:36 pm »
I think any failure of the standards is mostly due to the many cases that result in undefined behaviour. I think a lot of that could be deliberate for performance reasons but I don't really know enough about the compilers to be sure.

There are various reasons, but deliberately leaving something undefined for performance reasons would be perverse. It would lead to getting the answer faster, but without being sure that it was the right answer or a repeatable answer. And if you don't have to get the right answer, you can make the program infinitely fast :)

One of the prime reasons for leaving something undefined is that there are irreconcilable differences between what it does on one processor and another. Another is that it would be excessively slow on some architectures. Another is that getting it right would break some existing programs. And of course, sometimes it is just too difficult to get all parts of the spec to agree with each other.

And be careful about the difference between "undefined" and "implementation dependent". The former is the cause of nasal demons, the latter is the cause of subtle portability problems.
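A minimal sketch of that distinction (my own example, not taken from any standard text):
Code: [Select]
#include <limits.h>
#include <stdio.h>

int main(void)
{
    int x = INT_MAX;

    /* Undefined behaviour: signed integer overflow.  The compiler may assume
       it never happens, so literally anything can result (commented out). */
    /* x = x + 1; */

    /* Implementation-defined behaviour: right-shifting a negative value.
       Each implementation must document what it does (usually an arithmetic
       shift), but the answer may differ between compilers and targets. */
    int y = -8;
    printf("%d\n", y >> 1);

    (void)x;
    return 0;
}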

Quote
As for volatile, sequence points, const etc., I guess I don't find them too cryptic, as they were already part of the language when I learnt it, so I have the opposite view.  :-// Most additions seem necessary; I can't see embedded programming working too well with optimizing compilers without a volatile model, for example. cdecl (http://cdecl.org/) and static-analysis tools get me over the rest of it.

Judging by the statements of those who were on various national committees, you think you understand them, but you don't. The problems come with the corner cases, with optimisation, and with feature "interaction".

First example that comes to mind... C (until possibly the latest spec) defined anything to do with multithreading as outside the language spec and the libraries' responsibility. Therefore POSIX has to rely on C features that are deliberately not defined.

Remind me, can you "cast away constness"? There were endless (i.e. years-long) debates about whether it must be allowed (for libraries) or must be prohibited (to allow known-correct optimisations). I believe it is allowed, but the existence of the debate demonstrated that the language was being pulled in two incompatible directions and was out of control.

Quote
C++ is a different kettle of fish.

Yes. See the FQA, which is most entertaining - unless you have to deal with it. (Bits are relevant to C, of course).

For myself, I decided in 1988 that if C++ was the answer, the question was wrong ;} Cue the story about how the designers didn't understand that the template language was itself Turing-complete, until someone wrote a short program that caused the compiler to output the sequence of prime numbers during compilation. If the designers don't understand their creation, what hope do the rest of us have?
 

Offline krivx

  • Frequent Contributor
  • **
  • Posts: 765
  • Country: ie
Re: how is the job prospect of FPGA or Labview?
« Reply #19 on: March 30, 2016, 07:13:53 pm »
Remind me, can you "cast away constness"? There were endless (i.e. years-long) debates about whether it must be allowed (for libraries) or must be prohibited (to allow known-correct optimisations). I believe it is allowed, but the existence of the debate demonstrated that the language was being pulled in two incompatible directions and was out of control.

I understand it's not the point you were making but "casting away const" seems like a pretty bad idea. I can't think of a reason to do so other than legacy libraries.

I just tried it in https://gcc.godbolt.org/ and it appears to compile OK. So it seems it can be done, but I probably shouldn't do it. Interesting example.
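A minimal sketch of the sort of thing that compiles (my own illustration; whether the write is legal depends on what the pointer really points at):
Code: [Select]
#include <stdio.h>

static void set_to_zero(const int *p)
{
    /* Casting away const: the compiler accepts it.  It is only valid if the
       pointed-to object was not originally defined const; writing to a
       genuinely const object this way is undefined behaviour. */
    *(int *)p = 0;
}

int main(void)
{
    int writable = 42;            /* not const, so the write below is legal */
    set_to_zero(&writable);
    printf("%d\n", writable);     /* prints 0 */
    return 0;
}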
 
The following users thanked this post: david-IT

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: how is the job prospect of FPGA or Labview?
« Reply #20 on: March 30, 2016, 07:44:47 pm »
Remind me, can you "cast away constness"? There were endless (i.e. years-long) debates about whether it must be allowed (for libraries) or must be prohibited (to allow known-correct optimisations). I believe it is allowed, but the existence of the debate demonstrated that the language was being pulled in two incompatible directions and was out of control.

I understand it's not the point you were making but "casting away const" seems like a pretty bad idea. I can't think of a reason to do so other than legacy libraries.

I just tried it in https://gcc.godbolt.org/ and it appears to compile OK. So it seems it can be done, but I probably shouldn't do it. Interesting example.

I was using the term "library" far too loosely. Think of systems-level or "meta" programming, e.g. a debugger where you want to change the value of something declared const. If you prefer a higher-level topic, consider reflection in a dependency-injection framework.

Of course, more coherent languages manage that in a typesafe way, which is one of the reasons they are so popular and commercially important.
 
The following users thanked this post: david-IT

