Author Topic: No more code-size-limited version of IAR embedded workbench for ARM?  (Read 12885 times)


Offline mark03Topic starter

  • Frequent Contributor
  • **
  • Posts: 750
  • Country: us
Does anyone know if IAR got rid of the code-size-limited (32 kB) free version of embedded workbench for ARM?  I used the full version of IAR in a previous job and got in the [bad] habit of doing my personal projects on the free code-size-limited version for a number of years.  I should have just used gcc but I dislike Eclipse and the other free alternatives like VSCode hadn't yet gained popularity.

Now I'd like to install the free version of IAR on a new laptop, but as far as I can tell on their web site, there is no evaluation version except a 14-day time-limited license.  I'm 90% certain they've axed the code-size-limited "loophole" :(  but I guess it could still be buried on some page they've effectively hidden.  Anyone know for sure?

Edit:  It took 135 posts to get there, but @mwb1100 got the definitive answer from IAR:  this license type has been discontinued.  Therefore IAR is no longer an option for hobby / student / nonprofit projects.
« Last Edit: December 05, 2024, 12:56:01 am by mark03 »
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 10143
  • Country: gb
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #1 on: November 18, 2024, 01:35:01 am »
Does anyone know if IAR got rid of the code-size-limited (32 kB) free version of embedded workbench for ARM?  I used the full version of IAR in a previous job and got in the [bad] habit of doing my personal projects on the free code-size-limited version for a number of years.  I should have just used gcc but I dislike Eclipse and the other free alternatives like VSCode hadn't yet gained popularity.

Now I'd like to install the free version of IAR on a new laptop, but as far as I can tell on their web site, there is no evaluation version except a 14-day time-limited license.  I'm 90% certain they've axed the code-size-limited "loophole" :(  but I guess it could still be buried on some page they've effectively hidden.  Anyone know for sure?
I thought IAR only offered code size limited versions when they had a deal with a silicon vendor. Were you using the generic IAR for ARM, with support libraries for a wide range of vendors, or something vendor specific?
 

Offline mark03Topic starter

  • Frequent Contributor
  • **
  • Posts: 750
  • Country: us
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #2 on: November 18, 2024, 04:49:54 pm »
I thought IAR only offered code size limited versions when they had a deal with a silicon vendor. Were you using the generic IAR for ARM, with support libraries for a wide range of vendors, or something vendor specific?
Yes, I believe this was the full-featured EW-ARM product (minus a few extras like their code safety checker, if I remember correctly).  It would have been 5-6 years ago now.
 

Offline neil555

  • Contributor
  • Posts: 42
  • Country: gb
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #3 on: November 18, 2024, 05:18:22 pm »
I would recommend Segger Embedded Studio; it's free for non-commercial use and has no size limits.
 

Offline Doctorandus_P

  • Super Contributor
  • ***
  • Posts: 4009
  • Country: nl
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #4 on: November 18, 2024, 05:29:17 pm »
Why use crippled software when you can use GCC?

Sure, it can be a nuisance to get started with GCC, but once you get through that, you've got a very wide landscape of options.
But that said, the commercial compiler vendors also need something to compete with, and they tend to have bundled libraries for USB & Ethernet stacks, MP3 players, LCD libraries and such.
 
The following users thanked this post: Siwastaja

Offline cgroen

  • Supporter
  • ****
  • Posts: 642
  • Country: dk
    • Carstens personal web
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #5 on: November 18, 2024, 05:34:02 pm »
Keil (ARM) has a community version of their tool, free for hobby use.
 

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #6 on: November 18, 2024, 05:40:53 pm »
they tend to have bundled libraries for USB & Ethernet stacks, MP3 players, LCD libraries and such.

Enabling a quick proof-of-concept, and then the total destruction of the company when no one knows how to carry it on to an actual saleable product.
 
The following users thanked this post: bson

Offline mark03Topic starter

  • Frequent Contributor
  • **
  • Posts: 750
  • Country: us
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #7 on: November 18, 2024, 08:20:14 pm »
Without getting into a full-blown discussion of IDE alternatives, I will only say that yes, I use gcc, and my go-to IDE solution at the moment would probably be Visual Studio Code with appropriate plugins.  I wish the process of getting that up and running was about 95% less painful than it is, but this seems to be a constant in the embedded universe (like printing from Linux: negligible probability of being satisfactorily resolved in my lifetime).

Some people like vendor code and vendor IDEs.  Maybe there will come a day when they don't all suck.  I'm still waiting.

Regarding gcc (and veering a bit off topic), I have been curious to see how gcc, clang/llvm, and proprietary compilers stack up, especially on the new SIMD instructions (Helium), and also on RISC-V and its equivalent extensions.  I have seen benchmarks which seem to indicate that gcc in particular is falling behind.  ARM apparently now targets clang for all of its improvements, and says that they may or may not ever make it into gcc.  (ARM's own commercial compiler is significantly better than both.)  Certainly, gcc is "good enough" for 99% of embedded work, but I do hope we are not regressing from the paid/free performance ratio that existed ten years ago; my impression is that at that time, it was pretty close to 1.0.  Also, are there sufficiently "big guns" behind compiler development for RISC-V?  Or will it end up with a performance penalty merely due to the lack of compilers as good as ARM has?
« Last Edit: November 18, 2024, 08:24:04 pm by mark03 »
 

Offline mikerj

  • Super Contributor
  • ***
  • Posts: 3398
  • Country: gb
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #8 on: November 18, 2024, 08:30:53 pm »
I would recommend Segger Embedded Studio; it's free for non-commercial use and has no size limits.

I'd second this; ES is very good.  We used Keil for many years at work and transitioned over to ES with no regrets at all.
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 10143
  • Country: gb
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #9 on: November 18, 2024, 10:51:44 pm »
Why use crippled software when you can use GCC?

Sure, it can be a nuisance to get started with GCC, but once you get through that, you've got a very wide landscape of options.
But that said, the commercial compiler vendors also need something to compete with, and they tend to have bundled libraries for USB & Ethernet stacks, MP3 players, LCD libraries and such.
GCC itself is a fine tool. The problems come when you try debugging. That's kinda weak for a lot of embedded targets, compared to the better commercial tools.
 
The following users thanked this post: tooki

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #10 on: November 19, 2024, 07:15:22 am »
GCC itself is a fine tool. The problems come when you try debugging. That's kinda weak for a lot of embedded targets, compared to the better commercial tools.

Which forces you to use real, production-ready debugging strategies from day one, which becomes a huge timesaver in the long run.

Even if you are just a hobbyist, the time and grief saved by not having to learn a new point-and-click tool every 3 years or for every microcontroller brand is invaluable. Not to mention that you can't reproduce all bugs on a lab table with a probe attached.

Start from "printf debugging" and extend from there as needed. The bottom line is, this is not the poor man's alternative; it's the other way around: these fancy "debug tools" are the poor man's alternatives to real software practices. You do a colossal disservice to yourself by learning the wrong thing first; it is much harder to unlearn later.

Single-stepping and watching memory in a debugger, trying to figure out what the code does, is an absolutely archaic strategy and should be discouraged, not touted as some kind of professional way of working (even though professional software development processes are sometimes ineffective).

For example, pretty much the whole internet with all of its complexity (the Linux kernel, networking stacks etc.) is developed and managed using tools like GCC and not those "better commercial tools" (and trust me, if they were better, they would be used; e.g. Linus T. insisted on using a commercial versioning system, BitKeeper, when a suitable open source tool did not exist). And one of the classic mistakes young players (me included, in the past) make is thinking that developing embedded software is somehow fundamentally different from developing something like the Linux kernel.

99% of your problems are higher level (than some peripheral register reacting unexpectedly to a write), so you should instrument and log at a higher level, but these fancy IDEs have no idea what your code means or how it is supposed to work. Single-stepping or adding breakpoints in a debugger is like digging a hole with a toothpick when you could use an excavator. Total stone age.

Validate function inputs. Log the calls and arguments. Log state changes. In your code, not depending on some point-and-click tool, because trying to reproduce a difficult problem on the lab bench with a probe attached is a colossal waste of time.

Trust me. You don't need any of that tool hell. gcc + binutils and learning suitable software practices is all you need.
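To make it concrete, a minimal sketch of the kind of thing I mean (uart_write() and the motor-control names are made-up placeholders; adapt to whatever output path and state machine the target actually has):

Code: [Select]
/* "Log, don't single-step": everything goes through one macro that stamps
   file and line. uart_write() is a placeholder for the target's output path
   (UART, RTT, radio, ...). */
#include <stdint.h>
#include <stdio.h>

void uart_write(const char *s, size_t len);   /* assumed to exist for the target */

#define LOG(fmt, ...) do { \
    char _buf[96]; \
    int _n = snprintf(_buf, sizeof _buf, "%s:%d: " fmt "\n", \
                      __FILE__, __LINE__, ##__VA_ARGS__); \
    if (_n > 0) uart_write(_buf, (size_t)_n); \
} while (0)

/* Validate inputs and log state changes instead of poking at them in a debugger. */
typedef enum { MOTOR_IDLE, MOTOR_RAMP, MOTOR_RUN, MOTOR_FAULT } motor_state_t;
static motor_state_t motor_state = MOTOR_IDLE;

static void motor_set_state(motor_state_t next)
{
    LOG("motor state %d -> %d", motor_state, next);   /* log every state change */
    motor_state = next;
}

int motor_set_speed(int32_t rpm)
{
    if (rpm < 0 || rpm > 6000) {                /* validate the argument...       */
        LOG("rejected rpm=%ld", (long)rpm);     /* ...and log why it was rejected */
        return -1;
    }
    LOG("set rpm=%ld", (long)rpm);
    if (motor_state == MOTOR_IDLE)
        motor_set_state(MOTOR_RAMP);
    return 0;
}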
« Last Edit: November 19, 2024, 07:31:25 am by Siwastaja »
 
The following users thanked this post: AndersJ

Online nctnico

  • Super Contributor
  • ***
  • Posts: 28497
  • Country: nl
    • NCT Developments
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #11 on: November 19, 2024, 12:38:26 pm »
GCC itself is a fine tool. The problems come when you try debugging. That's kinda weak for a lot of embedded targets, compared to the better commercial tools.

Which forces you to use real, production-ready debugging strategies from day one, which becomes a huge timesaver in the long run.

Even if you are just a hobbyist, the time and grief saved by not having to learn a new point-and-click tool every 3 years or for every microcontroller brand is invaluable. Not to mention that you can't reproduce all bugs on a lab table with a probe attached.

Start from "printf debugging" and extend from there as needed. The bottom line is, this is not the poor man's alternative; it's the other way around: these fancy "debug tools" are the poor man's alternatives to real software practices. You do a colossal disservice to yourself by learning the wrong thing first; it is much harder to unlearn later.

Single-stepping and watching memory in a debugger, trying to figure out what the code does, is an absolutely archaic strategy and should be discouraged, not touted as some kind of professional way of working (even though professional software development processes are sometimes ineffective).
I disagree with not needing a debugger at all. Especially when I work on inherited code or third party libraries, I find a debugger a useful tool every now and then to just see where a crash occurs or how data flows through the code. For example: at the moment I have a project where I need to modify code for a product developed somewhere in China which has a boatload of communication layers stacked on top of each other. Setting a breakpoint at the data reception point and stepping through the code gives a good insight into what the hell is going on. For other software issues in this product I use the communication interface to output status messages to check program flow.

All in all I like to use both methods. In some cases using a debugger is more convenient and in other cases using printf is more convenient. But this also depends on how well the debugger works. For STM32, debugging from Eclipse (CubeIDE) works pretty well using ST's own ST-Link. For ESP32 I'm more inclined to use printfs even for cases where using a debugger would be more efficient, as debugging the ESP32 from the SDK (Eclipse / GCC based) provided by Espressif is super flaky. Which circles back to the (IMHO valid) point coppice made that having good software tools to begin with is beneficial. Still, GCC and the associated tools are very good; in my experience most of the problems when debugging microcontrollers are in the software layer & hardware (JTAG/SWD interface) between GDB and the microcontroller hardware.

I do strongly agree with you though that good software starts with good coding practices.
« Last Edit: November 19, 2024, 12:45:34 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #12 on: November 19, 2024, 12:49:53 pm »
I disagree with not needing a debugger at all.

Me too. I seem to use it approx. once a year. For that, I use gdb's console interface, because it does not need much setup (but I have to spend an hour googling and trying to remember how it is used, because in a year I forget all the commands except run, print and quit).

Quote
Especially when I work on inherited code or third party libraries, I find a debugger a useful tool every now and then to just see where a crash occurs or how data flows through the code. For example: at the moment I have a project where I need to modify code for a product developed somewhere in China which has a boatload of communication layers stacked on top of each other. Setting a breakpoint at the data reception point and stepping through the code gives a good insight into what the hell is going on. For other software issues in this product I use the communication interface to output status messages to check program flow.

Sure, yeah, but note that this is more like reverse-engineering, or coping with a failed process out of necessity, than a description of how you should usually develop your own projects if you have the choice. As a pragmatist I understand very well how this is sometimes needed, of course.
« Last Edit: November 19, 2024, 12:55:04 pm by Siwastaja »
 

Offline elektryk

  • Regular Contributor
  • *
  • Posts: 145
  • Country: pl
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #13 on: November 19, 2024, 02:05:51 pm »
I use a debugger almost every time; the only exception is MCUs which don't support it.
Code with printf() everywhere reminds me of some Arduino programs (especially those for 8-bit AVR) where it adds a lot of delay.
I experienced this when a program stopped working after commenting out some printf() calls.

All in all I like to use both methods. In some cases using a debugger is more convenient and in other cases using printf is more convenient. But this also depends on how well the debugger works. For STM32, debugging from Eclipse (CubeIDE) works pretty well using ST's own ST-Link. For ESP32 I'm more inclined to use printfs even for cases where using a debugger would be more efficient, as debugging the ESP32 from the SDK (Eclipse / GCC based) provided by Espressif is super flaky.

That's why I only use the ESP32 when I really need wireless connectivity; as a general-purpose MCU I still prefer the STM32.
« Last Edit: November 19, 2024, 02:14:07 pm by elektryk »
 

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #14 on: November 19, 2024, 02:41:08 pm »
Code with printf() everywhere reminds me of some Arduino programs (especially those for 8-bit AVR) where it adds a lot of delay.

Log into memory, and it's just a few instructions...

Quote
That's why I only use the ESP32 when I really need wireless connectivity; as a general-purpose MCU I still prefer the STM32.

... and define some sort of protocol to get that log through the radio. Bang, you are already much better off than using SWD probe.

It isn't rocket science, but it requires a little bit of creativity to get used to. Then you can do pretty much anything you need. I like to write simple wrappers which insert the C file, line number and ancillary data for a trace of events, then transfer it over radio or Ethernet.
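Roughly like this, if anyone wants a starting point (a sketch; the entry layout, the 256-entry depth and the event names are made up):

Code: [Select]
/* In-RAM event trace: a few instructions per event, drained later over
   UART/radio/Ethernet, or just read out post-mortem. */
#include <stdint.h>

typedef struct {
    const char *file;   /* __FILE__ lives in flash, storing the pointer is cheap */
    uint16_t    line;
    uint16_t    tag;    /* event id, e.g. EV_RX_FRAME (made-up name) */
    uint32_t    arg;    /* ancillary data */
} trace_entry_t;

#define TRACE_DEPTH 256u                 /* power of two so wrapping is a mask */
static volatile trace_entry_t trace_buf[TRACE_DEPTH];
static volatile uint32_t      trace_head;

/* Not ISR-safe as written; wrap in a critical section if logging from interrupts. */
#define TRACE(tag_, arg_) do { \
    uint32_t _i = trace_head++ & (TRACE_DEPTH - 1u); \
    trace_buf[_i].file = __FILE__; \
    trace_buf[_i].line = (uint16_t)__LINE__; \
    trace_buf[_i].tag  = (uint16_t)(tag_); \
    trace_buf[_i].arg  = (uint32_t)(arg_); \
} while (0)

/* Usage: TRACE(EV_RX_FRAME, frame_len); then periodically ship trace_buf out
   through whatever link the board happens to have. */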

And remember you have no truly non-interfering debugger available anyway. The debugger competes for memory access cycles on a single-port RAM, and worse, reads peripheral registers where the read operation itself triggers an action (e.g. a FIFO pop), with the user wondering what the fuck is happening, when all you really needed to do was store the value you read into a variable and print that variable out when you have time.
« Last Edit: November 19, 2024, 02:44:54 pm by Siwastaja »
 

Online 5U4GB

  • Frequent Contributor
  • **
  • Posts: 639
  • Country: au
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #15 on: November 19, 2024, 02:48:18 pm »
GCC itself is a fine tool.

... as long as you don't use it for any kind of mission-critical code.  That writeup actually mentions an issue I've run into on an RTOS-controlled device which, on critical errors, would drop into a while(1) until the watchdog restarted the system, thus providing rejuvenation-based error recovery.  Except that at some point gcc decided to silently remove the while(1) code so that on error it continued on in the error state.  There are plenty of related writeups that go into this, e.g. this one for security-specific issues and this for just outright WTF-ery.

Do the embedded-targeted compilers like Keil/Arm have this problem, or do they create object code that follows the programmer's intent?  Segger ES AFAIK is based on clang so would have the problems mentioned in the linked articles.
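(FWIW, the belt-and-braces way I have seen that halt-until-the-watchdog-bites loop written is to give its body an observable side effect, so no compiler is allowed to delete it. A sketch, assuming a Cortex-M target and GCC/Clang asm syntax:)

Code: [Select]
/* Fatal-error handler: park here until the watchdog resets the system.
   The volatile asm statements are observable side effects, so the loop
   cannot legally be optimized away. */
static void fatal_halt(void)
{
    __asm__ volatile ("cpsid i");             /* mask interrupts (Cortex-M)       */
    for (;;) {
        __asm__ volatile ("" ::: "memory");   /* compiler barrier inside the loop */
        __asm__ volatile ("wfi");             /* doze until the watchdog bites    */
    }
}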
« Last Edit: November 19, 2024, 02:58:03 pm by 5U4GB »
 

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #16 on: November 19, 2024, 03:01:23 pm »
GCC itself is a fine tool.

... as long as you don't use it for any kind of mission-critical code.

Pretty extreme opinion, given that GCC is used in mission-critical code (whatever that could mean) all the time. Not a huge fan of GCC myself; everything has downsides and defects, and the alternatives probably seem better only because they are less scrutinized due to seeing much less use. The safety-critical industry in particular uses esoteric stuff all the time and carefully limits itself to specific constructs, workarounds and known limitations. Most importantly, they don't update their tools overnight without extensive testing.

Besides, I fail to see what the link you provided has to do with GCC. Plus the author clearly does not have the slightest clue about how high-level optimized languages like C and C++ are supposed to work. Did you post a wrong link accidentally, maybe?

C and C++ have clear defects as languages, but many people and companies seem to be able to cope with them, and this is getting quite off-topic already. The suggested "better" languages would also optimize out the maybeStop example, and possibly offer a standardized way to make the stop variable something hardware is allowed to modify, just like C does, so  :-//
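(The standardized C way being volatile; a trivial sketch, with the flag name made up:)

Code: [Select]
#include <stdbool.h>

/* volatile tells the compiler this flag can change behind its back (ISR, DMA,
   debugger write), so the polling loop below must re-read it every iteration
   and cannot be optimized into an unconditional infinite loop. */
static volatile bool stop_requested;

void wait_for_stop(void)
{
    while (!stop_requested) {
        /* poll */
    }
}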

this for just outright WTF-ery.

Do the embedded-targeted compilers like Keil/Arm have this problem, or do they create object code that follows the programmer's intent?  Segger ES AFAIK is based on clang so would have the problems mentioned in the linked articles.

Problems like this surface from time to time and GCC developer attitude is sometimes very shitty when it comes to this "this is UB, we can break existing code in any way because it's not standard compliant" phenomenon. But you can rest assured every other compiler has occasional issues as well. I don't agree problems like this completely prevent the usage of gcc for "mission critical stuff".
« Last Edit: November 19, 2024, 03:39:25 pm by Siwastaja »
 
The following users thanked this post: newbrain, SparkMark

Offline elektryk

  • Regular Contributor
  • *
  • Posts: 145
  • Country: pl
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #17 on: November 19, 2024, 03:45:48 pm »
... and define some sort of protocol to get that log through the radio. Bang, you are already much better off than using SWD probe.

Good idea but that's also not always possible.

And remember you have no truly non-interfering debugger available anyway. The debugger competes for memory access cycles on a single-port RAM, and worse, reads peripheral registers where the read operation itself triggers an action (e.g. a FIFO pop), with the user wondering what the fuck is happening, when all you really needed to do was store the value you read into a variable and print that variable out when you have time.

Also, code compiled with -Og/-O0 may behave differently than with -Os/-O3, but it is nice to know the limitations of various debugging methods.
 

Online JPortici

  • Super Contributor
  • ***
  • Posts: 3578
  • Country: it
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #18 on: November 19, 2024, 04:10:16 pm »
Why use crippled software when you can use GCC?

Sure, it can be a nuisance to get started with GCC, but once you get through that, you've got a very wide landscape of options.
But that said, the commercial compiler vendors also need something to compete with, and they tend to have bundled libraries for USB & Ethernet stacks, MP3 players, LCD libraries and such.

Wish Clang would get more love...
GCC is pretty weak these days when it comes to C. Microchip's new compilers come with a clangd server for integration with VS Code; it's such a welcome addition. Much better and more useful warnings, which you simply can't get from GCC because of how it's implemented... such as knowing which functions are actually never called (whereas gcc's linker option to remove unused sections can backfire very badly on indirect calls) or typos in sizeof, pointer arithmetic, ...
Which we also had, similarly, on commercial compilers, just not with GCC.

I wonder how many bugs we would not have if we used Clang + static analysis instead of GCC + static analysis.
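(For reference, the linker garbage-collection setup in question is the -ffunction-sections / --gc-sections combo; KEEP() in the linker script and __attribute__((used)) are the usual guards, and the function name below is just an example:)

Code: [Select]
/* GCC dead-code elimination as discussed above:
 *   compile with:  -ffunction-sections -fdata-sections
 *   link with:     -Wl,--gc-sections
 *
 * Anything that is only reached indirectly (a vector table placed by the
 * linker script, callbacks registered in tables the linker can't see) has
 * to be kept explicitly, e.g. in the linker script:
 *
 *   KEEP(*(.isr_vector))
 *
 * and on the C side, to stop the compiler discarding a symbol it believes
 * is unused: */
__attribute__((used)) void board_init_hook(void)
{
    /* reached only through a function-pointer table elsewhere */
}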
 

Offline KE5FX

  • Super Contributor
  • ***
  • Posts: 2113
  • Country: us
    • KE5FX.COM
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #19 on: November 19, 2024, 05:35:33 pm »
Problems like this surface from time to time and GCC developer attitude is sometimes very shitty when it comes to this "this is UB, we can break existing code in any way because it's not standard compliant" phenomenon. But you can rest assured every other compiler has occasional issues as well. I don't agree problems like this completely prevent the usage of gcc for "mission critical stuff".

It doesn't rule out gcc, but what it does do is make people compile at less-than-maximal levels of optimization, just to avoid being bitten by their "LOL that's UB we'll do whatever we want" attitude. 

At a time when safer code is called for, gcc has elected to compete by producing faster code, egged on by benchmark fanatics rather than actual customers.  Good job, guys... we'll all be forced to write Ada by the time you're done.  But hey, those elided while(1) loops will really run fast.
 
The following users thanked this post: 5U4GB

Online coppice

  • Super Contributor
  • ***
  • Posts: 10143
  • Country: gb
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #20 on: November 19, 2024, 05:40:19 pm »
I disagree with not needing a debugger at all.
I agree that debugging itself is something not everyone needs. However, these days the ability to get code in and out of an MCU is usually embedded in the debugger, so you at least need that.
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 28497
  • Country: nl
    • NCT Developments
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #21 on: November 19, 2024, 05:52:18 pm »
Problems like this surface from time to time and GCC developer attitude is sometimes very shitty when it comes to this "this is UB, we can break existing code in any way because it's not standard compliant" phenomenon. But you can rest assured every other compiler has occasional issues as well. I don't agree problems like this completely prevent the usage of gcc for "mission critical stuff".

It doesn't rule out gcc, but what it does do is make people compile at less-than-maximal levels of optimization, just to avoid being bitten by their "LOL that's UB we'll do whatever we want" attitude. 

At a time when safer code is called for, gcc has elected to compete by producing faster code, egged on by benchmark fanatics rather than actual customers.  Good job, guys... we'll all be forced to write Ada by the time you're done.  But hey, those elided while(1) loops will really run fast.
Mistakes like this happen at every level. People just don't foresee the full effect some code has.

Years ago I reported a bug in the Linux kernel. I had a problem with a SOC which wouldn't always restart after a reset. It turned out that the reset code didn't reset the power management to supply the nominal voltage to the CPU (no de-init on reset). So when a reset happened during low frequency + low voltage, the voltage would remain too low and the CPU would not start reliably. The reply I got from one of the kernel maintainers was: hmm, we removed the de-init calls before reset because we assumed this would not be needed, but it does explain some weird effects we see on various platforms (including PCs).
« Last Edit: November 19, 2024, 05:56:38 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #22 on: November 19, 2024, 07:22:54 pm »
It doesn't rule out gcc, but what it does do is make people compile at less-than-maximal levels of optimization, just to avoid being bitten by their "LOL that's UB we'll do whatever we want" attitude. 

At a time when safer code is called for, gcc has elected to compete by producing faster code, egged on by benchmark fanatics rather than actual customers.  Good job, guys... we'll all be forced to write Ada by the time you're done.  But hey, those elided while(1) loops will really run fast.

Typical alarmism. There is really no need to resort to defaulting to worse optimization levels due to this kind of fearmongering. I mean, bugs happen; this is why we have processes to write better software, and testing to catch these bugs. It's not like we are talking about GCC being buggy; we are talking about gcc being unhelpful at detecting some bugs caused by programming errors, a huge difference to begin with. And usually gcc is quite helpful, but there have been a few cases where they have gone too far and been assholes about it, but still the right thing to do is to fix your broken code and go on with your life. And of course report bugs if you find actual compiler bugs, but that is quite rare.

But really, usually the story is some horrible, randomly cobbled together, untested, unmaintained spaghetti codebase which can fail on any compiler and any optimization setting. Then, instead of fixing it, it's easier to throw a temper tantrum at compiler optimizations, since compiling at -O0 seemingly fixes it, and the tantrum is easily fueled by some googling that turns up blog posts with critique against GCC, some of it deserved. But this is all unnecessary rationalization; you should be spending the time fixing the code, not explaining it away.

Plus, of course, the good old tale about C supposedly being a "portable assembler" still lives strong, regardless of that idea having been completely dismissed by the C abstract machine concept as early as the 1989 standard.

C is like any other high-level language: for example, a C compiler will optimize away a non-volatile-qualified variable which holds a constant value, like any other sensible modern language. Deal with it; any other compiler will do it too, except some super archaic one.

I have used GCC* for various projects for years and have never needed to decrease the optimization level to solve a bug, and I'm not a particularly excellent programmer. I don't even use the "decrease the optimization just to see if the bug goes away" faultfinding strategy; I think it's a horrible strategy because it never leads to any particular point in the code. Just good old debugging strategies: think, validate, log, follow the leads, and you will find any bug. And if necessary, rewrite and simplify.

*) because I have been too lazy to give clang a try, I probably should
« Last Edit: November 19, 2024, 07:29:39 pm by Siwastaja »
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 10143
  • Country: gb
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #23 on: November 19, 2024, 07:27:34 pm »
But really, usually the story is some horrible randomly cobbled together untested, unmaintained spaghetti codebase which can fail on any compiler and any optimization setting, and then instead of fixing it, it's easier to throw a temper tantrum at compiler optimizations, as compiling at -O0 seemingly fixes it, and this tantrum is easily fueled by some googling revealing blog posts with well-deserved critique against GCC.
This is really common, and really annoying. So often, especially with open source projects for some reason, the only explanation for code that used to work not working with a newer compiler is the compiler is buggy. No verification at all. No introspection at all. Even when you submit a proper fix, they will often still be in denial, and reject the fix.
 
The following users thanked this post: Siwastaja

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 9439
  • Country: fi
Re: No more code-size-limited version of IAR embedded workbench for ARM?
« Reply #24 on: November 19, 2024, 07:32:41 pm »
This is really common, and really annoying. So often, especially with open source projects for some reason, the only explanation for code that used to work not working with a newer compiler is the compiler is buggy. No verification at all. No introspection at all. Even when you submit a proper fix, they will often still be in denial, and reject the fix.

This is similar to marginal electronic designs which work on a lab table but stop working when you get a different batch of ICs or transistors. At that point the engineer either accuses the supplier or manufacturer of supplying "bad parts", or looks in the mirror.

The "we must use -O0 from now on" tantrum equivalent would be noticing that running the thing in a fridge makes it work again and then complain all over the internets that manufacturer X is crap because they force us to run our electronics inside fridges.
 

