Author Topic: ST or NXP  (Read 29104 times)


Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
ST or NXP
« on: November 07, 2015, 12:46:02 pm »
Hi to all,

I'm close to starting a project that will require intensive floating-point calculations (I will need a Cortex-M4 core) and I cannot decide what to use... What can you suggest about the pros and cons of the STM32xxxx and LPCxxxx families (not only the chips but the whole development experience: the IDE, libraries, drivers, RTOS choices, etc.)?
 

Offline Yansi

  • Super Contributor
  • ***
  • Posts: 3893
  • Country: 00
  • STM32, STM8, AVR, 8051
Re: ST or NXP
« Reply #1 on: November 07, 2015, 01:40:12 pm »
We cannot choose the best one for you, you have to do it yourself, as we don't know your background or what you will need.

If the FPU is the only thing you need, then it seems it doesn't matter which one, just pick one and use it.

As for the IDE, both of these can be KEILed. I don't recommend mucking around with that GCC stuff. It is inefficient, with poor IDEs that will drive you nuts or simply cannot offer most of the functionality professional IDEs do. - But this also depends on what you need to do.

You should be more specific about your goal, as both ST and NXP have extensive sets of products with CM4 cores, from small things up to some pretty big beasts.
 

Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
Re: ST or NXP
« Reply #2 on: November 07, 2015, 01:51:48 pm »
I will need 2 or 3 UARTs and an SD-card interface. I'm worried about the usability of the libraries/drivers for these interfaces! Is it better to write my own code for the peripherals or to use some of the libraries? Also, NXP has their own IDE while ST relies on third-party IDEs - it's a big mess with so many options...
 

Offline zapta

  • Super Contributor
  • ***
  • Posts: 6190
  • Country: us
Re: ST or NXP
« Reply #3 on: November 07, 2015, 03:02:48 pm »
Last year I faced a similar choice between ST and NXP (for an M0) and ended up with NXP, in large part because of the free toolchain. LPCXpresso is a single-package-install IDE (Eclipse based) that works out of the box.
 

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4227
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: ST or NXP
« Reply #4 on: November 07, 2015, 03:05:42 pm »
ST's Hardware Abstraction Layer has few friends here. IMHO it fails at that basic task; each library call requires a complete set of parameters which match up reasonably well with the actual underlying registers, and all it really does is sanity-check them before writing them to the hardware.
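To make that concrete, here is a minimal sketch assuming the STM32F4 Cube HAL headers (the pin choice and function name are arbitrary): the init struct fields map almost one-to-one onto the GPIO registers, so the "abstraction" is thin.

Code:
#include "stm32f4xx_hal.h"

static void led_pin_init(void)
{
    GPIO_InitTypeDef cfg = {0};

    __HAL_RCC_GPIOA_CLK_ENABLE();            /* clock for port A                  */

    cfg.Pin   = GPIO_PIN_5;                  /* each field mirrors a register...  */
    cfg.Mode  = GPIO_MODE_OUTPUT_PP;         /* ...MODER/OTYPER                   */
    cfg.Pull  = GPIO_NOPULL;                 /* ...PUPDR                          */
    cfg.Speed = GPIO_SPEED_FREQ_LOW;         /* ...OSPEEDR                        */
    HAL_GPIO_Init(GPIOA, &cfg);              /* parameter checks + register writes */

    /* The bare-register equivalent of "PA5 as push-pull output" is roughly:  */
    /* GPIOA->MODER = (GPIOA->MODER & ~(3u << 10)) | (1u << 10);              */
}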

For an IDE I use CrossWorks, which is a great deal less costly than Keil but has excellent device support. It scores well in the "just works" stakes, but the basic package gives you basic start-up code and C libraries, and that's about it. If you want driver code for things like storage devices, then it's extra.

Offline Jeroen3

  • Super Contributor
  • ***
  • Posts: 4078
  • Country: nl
  • Embedded Engineer
    • jeroen3.nl
Re: ST or NXP
« Reply #5 on: November 07, 2015, 03:08:24 pm »
NXP does not offer their own IDE. They acquired Code Red, whose IDE is based on Eclipse, as is the competition's.
Yansi made some dangerous statements by saying GCC is inefficient, ships with poor IDEs and does not offer professional features.
While partly true for average or new users, it can outperform expensive toolkits when used by highly skilled people. However, you need to be highly skilled before the project starts in order not to waste any time on it.

When you actually buy Keil, you get a lot of middleware with it, instead of relying on the various open-source alternatives, which can also take a lot of time for first-time users. Compare the cost of a proprietary IDE with your hourly rate and see how many hours you could work for that money. I'm not saying Keil middleware is of the utmost quality, I have no experience with it. Both vendors ship some version of Segger emWin as a binary library.

The main difference between ST and NXP is the level of complexity of the peripherals. ST offers complex peripherals which can do all kinds of tricks in hardware, while NXP keeps things more basic - but nonetheless very versatile if you can afford a few lines of code in an interrupt.
For FPU parts you can use either the LPC4xxx or LPC5xxxx, or the STM32F4 or STM32F7.
Unique points on the NXP side are the dual-core and flashless parts, and the state-configurable timer.
Unique points on the ST side are the camera interface, and fancy DMA for display purposes.
« Last Edit: November 07, 2015, 03:10:22 pm by Jeroen3 »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #6 on: November 07, 2015, 03:18:14 pm »
Yansi made some dangerous statements by saying GCC is inefficient, ships with poor IDEs and does not offer professional features.
While partly true for average or new users, it can outperform expensive toolkits when used by highly skilled people. However, you need to be highly skilled before the project starts in order not to waste any time on it.
I agree. A vendor-provided IDE like Keil's uVision can be a quick way to get a blinking LED going, but for serious software development it falls short quickly. GCC + Eclipse are powerful tools for serious software engineering, but they do take time to learn. For example: I had to use Freescale's own compiler for a certain microcontroller. There are many things which Freescale's linker can't do but GCC's ld (linker) has no problem with. For an LED-blinking program that is not a problem, but if you want to map variables onto an internal EEPROM, a good linker is a huge aid in keeping things simple and having some self-checking (warnings for overlaps) mechanisms in place.
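As a minimal sketch of what that looks like with GNU ld (the section name, region address and variable names are made up for illustration): declare an EEPROM region in the linker script and the linker refuses to build if the section overflows it, which is the self-checking mentioned above.

Code:
/* In the linker script (excerpt):
 *   MEMORY   { EEPROM (rw) : ORIGIN = 0x00400000, LENGTH = 2K }
 *   SECTIONS { .eeprom (NOLOAD) : { KEEP(*(.eeprom)) } > EEPROM }
 * ld reports an error if .eeprom grows past 2K, so overlapping or
 * overflowing "EEPROM variables" are caught at build time.
 * The C side then only needs a section attribute: */
#include <stdint.h>

__attribute__((section(".eeprom"))) uint32_t calibration_offset;
__attribute__((section(".eeprom"))) uint8_t  serial_number[16];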

Regarding NXP and ST: I have looked at both in the past and did hands-on evaluations. I quickly found the devices from NXP more mature (running from flash at full speed, for example) and the peripherals have no hidden surprises. Either way: make sure to check the errata sheets and look very closely at whether the function you want is actually supported in the hardware and doesn't interfere with other functions.

I'm also not a fan of vendor-provided libraries. They are often bloated and written by interns. For dealing with an SD card I can recommend the FatFs library. A UART is simple enough to come up with something yourself or re-use something from the past. I think my own UART handling routines (with circular buffers) are about two decades old by now.
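For what it's worth, a minimal sketch of that kind of home-grown UART handling (generic C, no vendor code; the ISR hook name and buffer size are placeholders): a single-producer/single-consumer circular RX buffer.

Code:
#include <stdint.h>

#define RX_BUF_SIZE 128u                  /* power of two: wrap is a cheap mask */

static volatile uint8_t  rx_buf[RX_BUF_SIZE];
static volatile uint32_t rx_head;         /* written only by the ISR            */
static volatile uint32_t rx_tail;         /* written only by the main loop      */

/* Call this from the UART receive interrupt with each received byte. */
void uart_rx_push(uint8_t byte)
{
    uint32_t next = (rx_head + 1u) & (RX_BUF_SIZE - 1u);
    if (next != rx_tail) {                /* drop the byte if the buffer is full */
        rx_buf[rx_head] = byte;
        rx_head = next;
    }
}

/* Call this from the main loop; returns 1 if a byte was available. */
int uart_rx_pop(uint8_t *out)
{
    if (rx_tail == rx_head)
        return 0;                         /* buffer empty */
    *out = rx_buf[rx_tail];
    rx_tail = (rx_tail + 1u) & (RX_BUF_SIZE - 1u);
    return 1;
}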
« Last Edit: November 07, 2015, 03:41:50 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #7 on: November 07, 2015, 08:45:51 pm »
For an alternative to the vendor libraries etc., you may take a look at our JumpStart API:
http://c4everyone.com/index.php/technologies/jumpstart-api

It gets you up and going with the initial stuff fast and continues to help you write code throughout the project. It works for the STM32F0xx series and we will release the STM32F4xx support today. Once we add the STM32F1xx and STM32F2xx, we will probably add support for the LPC series afterward.
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline zapta

  • Super Contributor
  • ***
  • Posts: 6190
  • Country: us
Re: ST or NXP
« Reply #8 on: November 07, 2015, 09:16:31 pm »
For an alternative to the vendor libraries etc., you may take a look at our JumpStart API:
http://c4everyone.com/index.php/technologies/jumpstart-api

"mbed also relies on a cloud compiler, which engineers cannot rely on for production."

FYI, that's inaccurate. Mbed supports local toolchains. For example, I export my mbed projects to LPCXpresso and do all development locally. It's a single-file project export.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #9 on: November 07, 2015, 09:46:22 pm »

"mbed also relies on a cloud compiler, which engineers cannot rely on for production."

FYI, that's inaccurate. Mbed supports local toolchains. For example, I export my mbed projects to LPCXpresso and do all development locally. It's a single-file project export.

Thanks. The points are that, by itself, no engineer will ship products using the mbed compiler, and that once you export mbed to a local project, you are back to either using GCC+Eclipse or paying $$$ for IAR/Keil - but I will modify the wording so it's less ambiguous.

Actually, it's a bit surprising to me that NXP bought Code Red and nothing came of it. I mean, what's the point?!
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline Yansi

  • Super Contributor
  • ***
  • Posts: 3893
  • Country: 00
  • STM32, STM8, AVR, 8051
Re: ST or NXP
« Reply #10 on: November 07, 2015, 10:07:28 pm »
Yansi made some dangerous statements by saying GCC is inefficient, ships with poor IDEs and does not offer professional features.
While partly true for average or new users, it can outperform expensive toolkits when used by highly skilled people. However, you need to be highly skilled before the project starts in order not to waste any time on it.
I agree. A vendor-provided IDE like Keil's uVision can be a quick way to get a blinking LED going, but for serious software development it falls short quickly. GCC + Eclipse are powerful tools for serious software engineering, but they do take time to learn. For example: I had to use Freescale's own compiler for a certain microcontroller. There are many things which Freescale's linker can't do but GCC's ld (linker) has no problem with. For an LED-blinking program that is not a problem, but if you want to map variables onto an internal EEPROM, a good linker is a huge aid in keeping things simple and having some self-checking (warnings for overlaps) mechanisms in place.

Regarding NXP and ST: I have looked at both in the past and did hands-on evaluations. I quickly found the devices from NXP more mature (running from flash at full speed, for example) and the peripherals have no hidden surprises. Either way: make sure to check the errata sheets and look very closely at whether the function you want is actually supported in the hardware and doesn't interfere with other functions.

Keil or IAR fails at what? GCC IS inefficient - in performance, in the IDEs, in everything. I know quite a bunch of, as you say, "skilled" programmers who hate proprietary toolchains, so they use GCC and similar freeware. They rarely use on-chip debugging, as their crappy IDEs don't support it well; they know next to nothing about SWO/trace capabilities (still debugging over a UART and an FT232 cable... sigh); they have problems controlling the programmers (everything is a problem, like "connect under reset" on the ST-Link)... now tell me what is efficient. I'd rather learn how to code than spend time learning how to use the toolchain I code with.
And you should also explain why most bigger design companies use proprietary toolchains rather than open-source stuff with ZERO support and no warranty on it.
But that's just pure opinion, yours might differ. I am a big fan of proprietary toolchains, as they work better for me, every time. Nothing has convinced me yet to switch from them to open source. (Maybe I am just used to them, as I work in one of the big corporations that prefers Keil/IAR.)


Also, ST has a free toolchain. It is called "System Workbench for STM32" (www.openstm32.org). But I have never tried it. Is it worth a try?  ???

Regarding the second paragraph: What do you mean by "run from flash at full speed, for example"? I've never heard of flash with such low access times. I strongly doubt NXP has its own flash technology that's so much better and faster than everyone else's. Maybe they just didn't tell you that there are wait states  ;D

PS: Mbed is c*p for kids, not for serious production.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #11 on: November 07, 2015, 10:17:26 pm »
Indeed, some companies have their own flash technology as a competitive advantage. Back in the ARM7 days, one differentiator between vendors' offerings was the flash access speed: some claimed they fetch 128 bits at once, others claimed other speed-up measures. With the STM32F401/411, when you run at 84 MHz you have to program the flash for 5 wait states, and it scales up when you run at 168 MHz.
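For reference, a minimal sketch of that wait-state setup, assuming the CMSIS device header for an STM32F4 part (the exact latency value depends on clock frequency and supply voltage per the datasheet table, so treat the value below as a placeholder):

Code:
#include "stm32f4xx.h"

static void flash_config_before_clock_switch(void)
{
    /* Program wait states and enable the ART prefetch/caches before
     * raising SYSCLK; the datasheet table gives the value for your VDD/clock. */
    FLASH->ACR = FLASH_ACR_LATENCY_2WS     /* placeholder latency value      */
               | FLASH_ACR_PRFTEN          /* prefetch                       */
               | FLASH_ACR_ICEN            /* instruction cache              */
               | FLASH_ACR_DCEN;           /* data cache                     */

    /* Read back: the new latency only takes effect once the write sticks. */
    while ((FLASH->ACR & FLASH_ACR_LATENCY) != FLASH_ACR_LATENCY_2WS) { }
}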
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #12 on: November 07, 2015, 10:31:40 pm »
@Yansi: I have used IAR's, Keil's, Analog Devices', Freescale's, etc. proprietary IDEs and they all suck big time compared to Eclipse. Ever wondered why so many vendors are switching to Eclipse? I have used my fair share of on-target debugging tools as well, but have discovered they are a huge waste of my time. Most are buggy/unstable as hell and, besides that, there is very little use in debugging software running inside a micro. IRQs make it almost impossible to really halt somewhere (you are basically debugging an application with multiple parallel threads!) and some things you actually don't want to halt. At the other end of the spectrum is the higher-level code, which is much easier and more efficient to debug / unit test on a PC. Now, unlike any microcontroller-only compiler, GCC is also available for x86, so moving code between a PC and a microcontroller is a piece of cake. Add the excellent debug capabilities of Eclipse CDT to that and you are all set.

Yes, I do know NXP's controllers can add wait states for flash accesses, but the flash in the NXP devices has a pre-fetcher so the performance is still better than ST's flash.
« Last Edit: November 07, 2015, 10:33:34 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline hans

  • Super Contributor
  • ***
  • Posts: 1637
  • Country: nl
Re: ST or NXP
« Reply #13 on: November 07, 2015, 10:33:19 pm »
My experience is the complete opposite. I need to use Google as my primary support tool for Keil as well, and chances are I get more search results for the GNU tools than for Keil or IAR.
We used to work with IAR at work, but switched to Keil. I needed to compile (several firmware builds we want to try in the field) and download (a released bootloader image) a few SREC images for a few test units. I couldn't find the "Download SREC" and "Compile to SREC" buttons in Keil. I was thoroughly disappointed.

I have now set up my hobby toolchain with Qt Creator + GCC/GDB + OpenOCD. It has full debugging support, I can write batch scripts for anything I need (like production programming of any file), and, IMHO most importantly, I wrote my QBS project file in such a way that I can seamlessly switch between an x86 target and an ARM target. Once the hardware is fully abstracted or mocked it's pretty darn easy to test program logic & protocols on a desktop machine. And all that with the click of one button, staying in the exact same IDE, file views, etc.
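A minimal sketch of that kind of abstraction (the names, the register address and the TARGET_ARM define are all illustrative, not from any real project): one interface, two back ends, and the build decides which one gets compiled in.

Code:
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

void hal_led_set(bool on);                 /* the only thing higher layers see */

#if defined(TARGET_ARM)
/* MCU build: poke the real GPIO output register (placeholder address). */
#define GPIO_ODR (*(volatile uint32_t *)0x48000014u)
void hal_led_set(bool on)
{
    if (on) GPIO_ODR |=  1u;
    else    GPIO_ODR &= ~1u;
}
#else
/* Desktop build: just log, so the protocol/logic code above this layer
 * can be unit-tested on x86 with the same compiler family. */
void hal_led_set(bool on)
{
    printf("LED -> %s\n", on ? "on" : "off");
}
#endif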


As for the OP's question: I agree that NXP products seem very mature and robust, IMHO tailored more towards that than towards the absolute lowest price or highest banner specs. ST seems more feature-rich, but consider the front page just a brochure (a good general tip for any datasheet anyway). Sometimes certain combinations are not possible on the ST devices; for example, I was a bit disappointed in how DMA works on the STM32F4 compared to the PICs or AVRs (which are much simpler to set up). STM32Cube is a nice program, however, for laying out peripheral pinning on the bigger chips, where finding a path through 6 SPIs, 5 I2Cs, 8 UARTs, SDRAM, LCD, Ethernet and 3 ADCs is quite dizzying.

I think both boards (LPCXpresso vs ST-Link) have pretty similar debuggers, which only work with their own brand (NXP/ST). Watch out for the older LPCXpresso boards though; they only work with Code Red's IDE (which also has code size limits, I believe).
« Last Edit: November 07, 2015, 10:42:05 pm by hans »
 

Offline zapta

  • Super Contributor
  • ***
  • Posts: 6190
  • Country: us
Re: ST or NXP
« Reply #14 on: November 07, 2015, 11:51:52 pm »
Actually, it's a bit surprising to me that NXP bought Code Red and nothing came of it. I mean, what's the point?!

Don't they now have control over LPCXpresso, which they provide for free under some restrictions?

Personally I find it very useful: an easy-to-install, full-featured IDE with seamless support for the hardware debugger, and a toolchain that can be used from the IDE or the command line (it generates standard makefiles) and runs on all three OSes.


 

Offline zapta

  • Super Contributor
  • ***
  • Posts: 6190
  • Country: us
Re: ST or NXP
« Reply #15 on: November 07, 2015, 11:54:29 pm »
As for the IDE, both of these can be KEILed. I don't recommend mucking around with that GCC stuff. It is inefficient, with poor IDEs that will drive you nuts or simply cannot offer most of the functionality professional IDEs do. - But this also depends on what you need to do.

Are you referring to LPCXpresso?

Works great for me, no need to muck with anything, just download and install a single package.

 

Offline exmadscientist

  • Frequent Contributor
  • **
  • Posts: 342
  • Country: us
  • Technically A Professional
Re: ST or NXP
« Reply #16 on: November 08, 2015, 05:40:31 am »
I've bought fully into NXP's LPC line here, and I haven't really regretted it. I obviously haven't got the same level of experience with ST, so I can't say how green the grass is on that side of the fence, but here are some thoughts about the LPC ecosystem:

The good:
  • LPCXpresso is a good free IDE. Eclipse has its own pros and cons, but everything does and LPCXpresso is, on balance, as good as anything you'll find.
  • GCC versions are usually kept fairly up to date. In particular, recent LPCXpresso releases have direct interface support and access to GCC's link-time optimizations (LTO). If you haven't used it, LTO is amazing for most embedded work; it cuts the final sizes of some of my binaries in half! (A sketch of the build flags follows after this list.)
  • You can always go to a bog-standard GCC-based setup if you exceed the capacity of the free LPCXpresso version, rather than being forced to pay to upgrade.
  • Their HAL/interface libraries, LPCOpen, are fairly comprehensive and mostly work. Mostly. There's surprisingly good commonality between the large (Cortex-M4) and small (M0) chips.
  • Few severe chip errata. I've hit a couple (including a nasty 5V tolerance issue on USB pins that got "fixed" by clarifying the datasheet...), but for the most part all those complicated peripherals work right.
  • Good selections of MCU variants to cover most needs, from M0 to triple-core M4F+2xM0.
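A minimal sketch of the LTO setup referred to above (generic GCC flags shown as comments, since they live in the build files; the helper function is only an illustration):

Code:
/* Build both compile and link steps with LTO enabled, e.g.
 *   CFLAGS  += -Os -flto -ffunction-sections -fdata-sections
 *   LDFLAGS += -flto -Wl,--gc-sections
 * With -flto the optimizer sees every translation unit at link time, so a
 * small helper like this one, even if defined in another .c file, can be
 * inlined into its callers and dropped entirely when unused. */
#include <stdint.h>

uint32_t adc_to_millivolts(uint32_t raw)      /* e.g. defined in adc_util.c  */
{
    return (raw * 3300u) >> 12u;              /* 12-bit ADC, 3.3 V reference */
}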

The bad:
  • Peripherals are often complicated and sometimes (very) poorly documented. My life got a lot easier when I figured out which ARM peripherals were licensed in certain chips and could just go direct to ARM's documentation for that IP core.
  • Pin assignment sucks. But I think that's true of all complicated microcontrollers these days. (I have not been blessed enough to try one of the newer designs with a full I/O routing matrix... someday one of those parts will be the right chip!)
  • Some LPCOpen interfaces are brain-damaged. Often it's less work to just roll your own than figure out how to use the crap in the library.
  • No M7 core options (yet).

The ugly:
  • LPCOpen's example code was written by what I can only assume were outsourced wage slaves. It's terrible in pretty much every respect.
  • Trying to use the baked-in ROM drivers is a waste of time. Just write your own code -- that way you can actually debug it. The ROM drivers are buggy as hell, and source code is not provided.
  • Good luck figuring out which of their USB stacks to use. Navigating the USB maze to figure out where to start and what is and isn't obsolete is actually harder than the work to get USB going on the damned things.
  • Errata don't get fixed quickly. Yeah, I know masks are expensive and they're minor things. I still like seeing errata sheet entries get crossed off!
 

Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
Re: ST or NXP
« Reply #17 on: November 08, 2015, 05:43:34 am »
You say that the flash cannot be fast enough to run at full speed. Have you ever reached the point where you had to copy a function or group of functions into RAM in order to guarantee they execute at full speed?
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #18 on: November 08, 2015, 09:01:13 am »
The good:
  • Their HAL/interface libraries, LPCOpen, are fairly comprehensive and mostly work. Mostly. There's surprisingly good commonality between the large (Cortex-M4) and small (M0) chips.
That is one thing I have always liked about NXP's ARM (ARM7 / Cortex) portfolio: they used the same peripherals for all their chips so existing code is easy to port and you can get a 'new' controller going quickly.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #19 on: November 08, 2015, 09:29:45 am »
That is one thing I have always liked about NXP's ARM (ARM7 / Cortex) portfolio: they used the same peripherals for all their chips so existing code is easy to port and you can get a 'new' controller going quickly.

Very good to know; it will make our lives easier when we add support for them :-) Did NXP ever release their dual-core LPC4350? I did some work for it as part of their push-out, but I think they re-spun the silicon.
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4227
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: ST or NXP
« Reply #20 on: November 08, 2015, 10:36:36 am »
You say that the flash cannot be fast enough to run at full speed. Have you ever reached the point where you had to copy a function or group of functions into RAM in order to guarantee they execute at full speed?

ST advertise something called "ART Accelerator", which as far as I can tell means it reads enough bits in parallel from the Flash to execute code at full speed without having to copy to RAM first.

Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
Re: ST or NXP
« Reply #21 on: November 08, 2015, 10:42:31 am »
I read about this ART Accelerator and it's a combination of prefetch and cache. Does anyone know how big this cache is?
 

Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
Re: ST or NXP
« Reply #22 on: November 08, 2015, 11:11:12 am »
After a short search I found that it's 256 entries of 128 bits each: 256 × 128 bits = 32,768 bits = 4 KB.

So if the part of my program that needs to run fast is < 4 KB, I don't have to do anything special. Otherwise, copy it into RAM and run it from there.
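A minimal sketch of the copy-to-RAM approach with GCC (the section name assumes the linker script already provides a RAM-resident code section with a startup copy loop, as most vendor scripts do; the function body is a placeholder):

Code:
#include <stdint.h>

/* Placed in a RAM section; the startup code copies it out of flash at boot,
 * so the hot loop runs with zero flash wait states. */
__attribute__((section(".ramfunc"), long_call, noinline))
void hot_loop(const int16_t *in, int16_t *out, uint32_t n)
{
    for (uint32_t i = 0; i < n; i++)
        out[i] = in[i];                   /* stand-in for the real DSP work */
}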
 

Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
Re: ST or NXP
« Reply #23 on: November 08, 2015, 11:21:47 am »
Just for comparison - NXP also has a "flash accelerator" that reads 256 bits at once.
 

Offline Jeroen3

  • Super Contributor
  • ***
  • Posts: 4078
  • Country: nl
  • Embedded Engineer
    • jeroen3.nl
Re: ST or NXP
« Reply #24 on: November 08, 2015, 11:50:20 am »
Flash is wider than 32 bits, so one fetch catches multiple instructions, since instructions are 16-bit Thumb.
Both vendors have accelerators for internal flash, to prefetch for sequential code and to cache loops. But you'll get cache misses when you go to interrupts (put these in RAM) or far-away functions.
The flash accelerators do not work for external memories, if I recall correctly.

Yes, they did release the LPC4350, after several silicon revisions. The SGPIO peripheral is one of a kind, not yet seen in the competitors' parts. Yet it's so simple.

Read the errata sheet before using any of the chips in this class.
 

Offline zapta

  • Super Contributor
  • ***
  • Posts: 6190
  • Country: us
Re: ST or NXP
« Reply #25 on: November 09, 2015, 10:30:08 pm »
One nice feature in some NXP MCUs is the USB bootloader. You reset the MCU with a special pin pulled down and it mounts itself on your computer as a removable USB drive. All it takes to update the firmware is a drag and drop of the new binary file. No special cables, no special adapters, no need to install any software on the host. Perfect for field firmware updates.
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: ST or NXP
« Reply #26 on: November 09, 2015, 11:33:04 pm »
Also, ST has a free toolchain. It is called "System Workbench for STM32" (www.openstm32.org). But I have never tried it. Is it worth a try?  ???

It's not an official ST toolchain, it's just another Eclipse-based IDE.
I tried it for a short time only to go back to IAR. So many problems, some of which have you hitting your head against the wall, and not-so-good community support: starting with compatibility with the ST-Link and that under-reset crap error, slow flashing and debugging (a few seconds more than IAR), impractical debugging (you have to perform 4 or 5 mouse clicks to enter or exit debug mode), NO RESET BUTTON ON THE DEBUG INTERFACE (I have to restart it all from the beginning), the target stops running when you exit debug mode, and so on.
The nice thing though is that it supports the standard peripheral library, so you can use the same files you wrote with IAR under OpenSTM32 with no trouble; just put them in the workspace and you can compile them quite easily.

As I haven't used any NXP parts, I can't help on that side, but ST's uCs are pretty awesome - just check the errata first. And avoid using the HAL library... the one who invented that HAL crap deserves a place in hell; the standard peripheral library is much more readable and easier to follow.
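For flavour, a minimal sketch of the standard peripheral library style being praised here, assuming the STM32F4xx StdPeriph headers (GPIO/alternate-function pin setup omitted for brevity):

Code:
#include "stm32f4xx.h"
#include "stm32f4xx_rcc.h"
#include "stm32f4xx_usart.h"

static void usart1_init(void)
{
    USART_InitTypeDef cfg;

    RCC_APB2PeriphClockCmd(RCC_APB2Periph_USART1, ENABLE);   /* clock the UART */

    USART_StructInit(&cfg);                                  /* sane defaults  */
    cfg.USART_BaudRate = 115200;
    cfg.USART_Mode     = USART_Mode_Rx | USART_Mode_Tx;
    USART_Init(USART1, &cfg);
    USART_Cmd(USART1, ENABLE);
}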
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #27 on: November 10, 2015, 12:21:44 am »
You say that the flash cannot be fast enough to run at full speed. Have you ever reached the point where you had to copy a function or group of functions into RAM in order to guarantee they execute at full speed?

In flashless devices like the LPC4370 that use external quad SPI flash, there is a proprietary and (publicly) largely undocumented flash cache/prefetch interface. Depending on what you're doing, you might suffer a 50% degradation in speed compared to running in a carefully selected RAM segment. Sometimes you won't notice any problems at all. If you move your pinch points and ISRs to RAM segment(s), you'll get up to zero-wait-state performance. I had to do this for some demo DSP code I wrote to demodulate RF fed directly into the chip's high-speed ADC. There was barely 5% CPU left.

http://youtu.be/lD3WSCN-5u0
 

Offline MT

  • Super Contributor
  • ***
  • Posts: 1616
  • Country: aq
Re: ST or NXP
« Reply #28 on: November 10, 2015, 12:38:02 am »
Quad SPI is interesting; it's also supported on the new F446. I had an idea for a design that was originally intended to use an F429 and SDRAM,
but after reading that quad SPI can run code I figured it could be done with an F446. I have to compare the ST and NXP devices.

F446 quad SPI spec:
• Three functional modes: indirect, status-polling, and memory-mapped
• Dual-flash mode, where 8 bits can be sent/received simultaneously by accessing two Flash memories in parallel.
• SDR and DDR support
• Fully programmable opcode for both indirect and memory mapped mode
• Fully programmable frame format for both indirect and memory mapped mode
• Integrated FIFO for reception and transmission
• 8, 16, and 32-bit data accesses are allowed
• DMA channel for indirect mode operations
• Interrupt generation on FIFO threshold, timeout, operation complete, and access error
« Last Edit: November 10, 2015, 12:47:15 am by MT »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #29 on: November 10, 2015, 12:41:47 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #30 on: November 10, 2015, 12:46:16 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.

Why would you get memory corruption (other than shit code!)?
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #31 on: November 10, 2015, 12:52:54 am »
Quad SPI is interesting; it's also supported on the new F446. I had an idea for a design that was originally intended to use an F429 and SDRAM,
but after reading that quad SPI can run code I figured it could be done with an F446. I have to compare the ST and NXP devices.

F446 spec:
• Three functional modes: indirect, status-polling, and memory-mapped
• Dual-flash mode, where 8 bits can be sent/received simultaneously by accessing two Flash memories in parallel.
• SDR and DDR support
• Fully programmable opcode for both indirect and memory mapped mode
• Fully programmable frame format for both indirect and memory mapped mode
• Integrated FIFO for reception and transmission
• 8, 16, and 32-bit data accesses are allowed
• DMA channel for indirect mode operations
• Interrupt generation on FIFO threshold, timeout, operation complete, and access error

It's been a few months, but ISTR that the LPC4370 will run quad SPI flash at up to 102 MHz DDR. While it's interesting, one problem is if you want to protect your IP. Newer "S" devices have built-in AES fuses which you may be able to use in some way, but I don't think the chip directly supports encrypted images; you'd probably need a bootloader, which sounds messy.
 

Offline MT

  • Super Contributor
  • ***
  • Posts: 1616
  • Country: aq
Re: ST or NXP
« Reply #32 on: November 10, 2015, 01:01:13 am »
I'm not going to use encrypted material such as pictures, so that's one stone off my chest, I presume! :phew:
Thanks for the info!
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #33 on: November 10, 2015, 01:08:24 am »
One last thing about the Quad SPI is that NXP only certify a limited subset of devices, most of which I found were difficult to come by in non-production quantities.
 

Offline MT

  • Super Contributor
  • ***
  • Posts: 1616
  • Country: aq
Re: ST or NXP
« Reply #34 on: November 10, 2015, 01:17:09 am »
Oops! Does NXP mention this officially somewhere?
« Last Edit: November 10, 2015, 01:23:53 am by MT »
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #35 on: November 10, 2015, 01:19:53 am »
Oops! Does NXP mention this officially somewhere?

The specific devices they support are in the datasheet, but not that some of them are difficult to source!
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #36 on: November 10, 2015, 01:22:37 am »
I'm not going to use encrypted material such as pictures, so that's one stone off my chest, I presume! :phew:
Thanks for the info!

By "images" I meant binary code blobs, not pictures!
 

Offline MT

  • Super Contributor
  • ***
  • Posts: 1616
  • Country: aq
Re: ST or NXP
« Reply #37 on: November 10, 2015, 01:25:05 am »
Yes! That was what I thought you meant! Porn images! Binary code boobs! Stone off my chest! :)
« Last Edit: November 10, 2015, 01:29:03 am by MT »
 

Offline Jeroen3

  • Super Contributor
  • ***
  • Posts: 4078
  • Country: nl
  • Embedded Engineer
    • jeroen3.nl
Re: ST or NXP
« Reply #38 on: November 10, 2015, 06:49:11 am »
The SPIFI boot mode has seen some bugs requiring a double reset to work. Make sure you test this before sending the BOM to the assembler.
Generally, the LPC18/43 family suffered a lot of bugs in its early days, from a corrupted ADC when using CAN (or vice versa, I can't remember) to non-functional bootloaders.
Please read the errata sheets for these parts.

This does not make it a useless part. It's the only dual-core ARM Cortex-M on the market.
 

Offline BoomerangTopic starter

  • Regular Contributor
  • *
  • Posts: 52
Re: ST or NXP
« Reply #39 on: November 10, 2015, 06:51:06 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.

Is this from your experience or is this information from some other source? Flash ECC is reasonable, but if there is a chance of memory corruption - why don't they make RAM ECC?!? And the most interesting thing (for me) is how exactly running code from RAM leads to memory corruption?!? Is this limited to NXP, or is it valid for all ARM MCUs?
« Last Edit: November 10, 2015, 07:13:08 am by Boomerang »
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8637
  • Country: gb
Re: ST or NXP
« Reply #40 on: November 10, 2015, 07:19:20 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.

Is this from your experience or is this information from some other source? Flash ECC is reasonable, but if there is a chance of memory corruption - why don't they make RAM ECC?!?
A dirty little secret (it's not really a secret, but few engineers bother to read the details) of the flash MCU business is that a very small percentage of flash cells will leak rapidly and lose their contents. The semiconductor physics people tell me this is unavoidable and applies to everyone's flash MCU process. Depending on various factors, like the flash size, maybe 1 in 10k devices will suffer this failure. Some MCUs use error-correcting flash to work around this issue. It adds to the cost, so only a limited number of devices are built this way. In the devices I have seen with error-correcting flash, there is no similar protection for the RAM.
 

Offline andersm

  • Super Contributor
  • ***
  • Posts: 1198
  • Country: fi
Re: ST or NXP
« Reply #41 on: November 10, 2015, 07:36:31 am »
Running code from RAM is not an automatic performance improvement. Not all devices can fetch code and data from RAM simultaneously, and instead have to insert extra wait cycles.

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8637
  • Country: gb
Re: ST or NXP
« Reply #42 on: November 10, 2015, 07:41:44 am »
Running code from RAM is not an automatic performance improvement. Not all devices can fetch code and data from RAM simultaneously, and instead have to insert extra wait cycles.
That depends on which performance parameter you are measuring. Running from RAM almost always reduces current consumption. Many of the cheating ULP demos run from RAM, while running from flash on the same device might give very unimpressive power figures.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #43 on: November 10, 2015, 08:11:08 am »
Running code from RAM is not an automatic performance improvement. Not all devices can fetch code and data from RAM simultaneously, and instead have to insert extra wait cycles.

In my note about running from RAM in the LPC4370, I stated "carefully selected RAM segment" for precisely this reason.  :D
 

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4199
  • Country: us
Re: ST or NXP
« Reply #44 on: November 10, 2015, 08:32:06 am »
Quote
Running from RAM almost always reduces current consumption.
Presumably because you can turn the flash section completely off, rather than running from RAM actually being more power efficient.
(Pram + 0 < Pram + Pflash, for any positive Pflash)
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8637
  • Country: gb
Re: ST or NXP
« Reply #45 on: November 10, 2015, 09:18:30 am »
Quote
Running from RAM almost always reduces current consumption.
Presumably because you can turn the flash section completely off, rather than running from RAM actually being more power efficient.
(Pram + 0 < Pram + Pflash, for any positive Pflash)
During cycles when the RAM isn't used, its consumption is minuscule. During cycles when most types of flash are not read, their consumption is very small. This isn't about the consumption from having these modules turned on; it's the energy per cycle from reading them which is important in ULP applications. The energy consumption of a read cycle from flash is bigger than that of a read from RAM. For some ULP flash processes the ratio isn't too high. For other flash processes the difference is huge.
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: ST or NXP
« Reply #46 on: November 10, 2015, 09:22:47 am »
Flash news ...
Atollic just announced the removal of the code size limitation from Atollic TrueSTUDIO Lite.  :clap:
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #47 on: November 10, 2015, 10:33:09 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.
Why would you get memory corruption (other than shit code!)?
Just apply Murphy's law. It is not only bad code which can change memory, but also alpha radiation and other mayhem, which is why servers usually have ECC memory. Google did quite a bit of research on this. IIRC the MISRA rules also forbid having anything in RAM which can affect program flow (like function pointers).
« Last Edit: November 10, 2015, 10:36:07 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8637
  • Country: gb
Re: ST or NXP
« Reply #48 on: November 10, 2015, 10:35:46 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.
Why would you get memory corruption (other than shit code!)?
Just apply Murphy's law. It is not only bad code which can change memory, but also alpha radiation and other mayhem, which is why servers usually have ECC memory. Google did quite a bit of research on this. IIRC MISRA also forbids having anything in RAM which can affect program flow (like function pointers).
MISRA doesn't prohibit function calls, and the return address goes in RAM.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #49 on: November 10, 2015, 10:39:15 am »
IIRC MISRA only allows function pointers in ROM. There is also a statistical aspect: a return address is only stored briefly on the stack, while a piece of code may have to reside in RAM for days or even decades, so corruption due to radiation becomes a much more likely scenario (besides common problems such as stack and buffer overflows).
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #50 on: November 10, 2015, 11:02:19 am »
I'd be wary of running code from RAM. There's too much chance of memory corruption, so you'd have to implement some kind of self-checking. Another fun fact: NXP's flash has error correction built in.
Why would you get memory corruption (other than shit code!)?
Just apply Murphy's law. It is not only bad code which can change memory, but also alpha radiation and other mayhem, which is why servers usually have ECC memory. Google did quite a bit of research on this. IIRC the MISRA rules also forbid having anything in RAM which can affect program flow (like function pointers).

I would say it comes down to a risk analysis. I had not been aware of how bad the problem really is in terrestrial applications: the only place I deal with this is in space applications, where you plan and design around failure, both hard and soft; it's not a matter of "if", more a matter of "when".

For the terrestrial applications I design, frankly, it wouldn't matter; the device is orders of magnitude more likely to fail for some other reason - I get three or four devices returned to me for repair after a lightning strike each year, for example, and that's definitely a hard error!
 

Offline Bassman59

  • Super Contributor
  • ***
  • Posts: 2501
  • Country: us
  • Yes, I do this for a living
Re: ST or NXP
« Reply #51 on: November 10, 2015, 04:43:57 pm »
Generally, the LPC18/43 family suffered a lot of bugs in its early days, from a corrupted ADC when using CAN (or vice versa, I can't remember) to non-functional bootloaders.
Please read the errata sheets for these parts.

The LPC18xx has a bug which makes using the external memory interface problematic. See here. I have not checked to see if that was fixed in a later rev of the silicon.
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: ST or NXP
« Reply #52 on: November 11, 2015, 08:10:44 am »
Most has been said already; I'm just missing two points that might be important for the OP:

- Choice of (especially) ROM sizes in the manufacturer's portfolio. ST wins hands down on that. If you choose a certain microcontroller, ST offers multiple packages with multiple ROM sizes, so if you run out of code space you can often find a compatible package, often even the same micro, with a larger ROM. Not always of course, there are limits, but IMO NXP has far less choice; if you run out you can have a problem.
- Price in quantities: there are big differences between ST and NXP.
 

Offline Sal Ammoniac

  • Super Contributor
  • ***
  • Posts: 1668
  • Country: us
Re: ST or NXP
« Reply #53 on: November 13, 2015, 06:10:04 am »
To me, one of the key decision factors is the quality of a vendor's documentation. I refuse to use parts from a vendor that has poorly written, confusing, and/or inaccurate data sheets and user's guides even if the part itself is technically excellent. It's just not worth my time to figure out how the part works by what amounts to trial and error.

I use a third-party toolset (Rowley Crossworks), so I'm not tied to any particular vendor's tools. This gives me the freedom to use a Cortex-M part from any vendor, whereas vendor-specific toolsets are often tied to that vendor's parts and won't work with anyone else's parts. I also avoid anything based on Eclipse, primarily because I just don't like it. We use both Keil and IAR at work, and I like both of them, but for my own hobby use at home I just can't justify the high cost of these tools.

With respect to NXP versus ST, I find NXP's documentation better than ST's, and ST's peripherals a little better (and simpler) than NXP's. So for me, it's a toss-up between NXP and ST, and I use both. Right now I'm primarily using ST, however, because NXP doesn't have any Cortex-M7 offerings yet.

I never use vendor-supplied libraries. I prefer to write my own, from scratch. I've found that vendor libraries are often buggy as hell. My code is usually simpler and more robust anyway (primarily due to my thirty years of experience writing embedded code).
« Last Edit: November 13, 2015, 06:12:45 am by Sal Ammoniac »
Complexity is the number-one enemy of high-quality code.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #54 on: November 13, 2015, 04:49:01 pm »
I also avoid anything based on Eclipse, primarily because I just don't like it.
I agree Eclipse looks odd at first and seems convoluted. It took me a while to grasp the concepts behind it, because Eclipse has been built from a viewpoint where managing source code and dealing with large projects (with sub-projects) is important for productivity and maintainability. Many of the typical microcontroller IDEs are poorly equipped editors with a compile button and some debugging bolted on. These IDEs are totally useless when it comes to navigating through code and figuring out how a piece of software has been put together. In Eclipse this is a trivial task.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Omicron

  • Regular Contributor
  • *
  • Posts: 142
  • Country: be
Re: ST or NXP
« Reply #55 on: November 13, 2015, 05:28:17 pm »
I use a third-party toolset (Rowley Crossworks), so I'm not tied to any particular vendor's tools.

I use Crossworks as well. I have the same sentiment towards Eclipse. I've used it in the past to do Java development (non-embedded) and it worked really great for that. But every time I've tried to use the C/C++ tools it has frustrated the hell out of me. Not because of lacking functionality, it has everything you could wish for, but more because of its convoluted settings architecture and weird behaviour with the indexer getting out of sync, causing the strangest errors (like it all compiles just fine in the evening but when you return the next morning all of a sudden there are tons of compile errors because somehow it's forgotten where to find the headers). I'm on OSX, maybe CDT is better behaved on other platforms? Crossworks works a lot better for me. It also has a very responsive debugger. I always thought embedded debuggers were inherently slow until I tried Crossworks. That debugger just flies in comparison to GDB on Eclipse (like single stepping that can actually keep up). And it has good support for debugging NXP's multi-core parts.


We use both Keil and IAR at work, and I like both of them, but for my own hobby use at home I just can't justify the high cost of these tools.

I'd be interested to learn what your experience is with IAR and Keil compared to Crossworks. Would you still go for Keil or IAR in a professional setting?
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #56 on: November 13, 2015, 09:46:51 pm »
..Many of the typical microcontroller IDEs are poorly equipped editors with a compile button and some debugging bolted on. These IDEs are totally useless when it comes to navigating through code and figuring out how a piece of software has been put together. In Eclipse this is a trivial task.

I don't know where you get this idea from; most third-party IDEs are quite competitive with Eclipse. Ours is based on CodeBlocks, and Atmel Studio is based on Visual Studio, and both of these have features as good as Eclipse's. I bet IAR's workbench is quite decent as well.
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #57 on: November 13, 2015, 11:24:58 pm »
..Many of the typical microcontroller IDEs are poorly equipped editors with a compile button and some debugging bolted on. These IDEs are totally useless when it comes to navigating through code and figuring out how a piece of software has been put together. In Eclipse this is a trivial task.
I don't know where you get this idea from
Simple: from using those IDEs (of course!) and needing to go through / understand / modify really large pieces of software I didn't write myself. Perhaps Visual Studio has caught up in the last couple of years, but the latest version I used (2008) was not an improvement and still needed lots of add-ons to add things which are standard in Eclipse. No doubt there are IDEs around nowadays that are equal to or better than Eclipse, but you'd better look for a product which has been designed to be an IDE and not a notepad plus a cool-looking compile button to get a blinky example going quickly.

You'd better look at IAR's product before assuming! It takes a lot of work to write a good IDE, while the heavy power users will use their own editors, build systems and code management. In other words: the actual use for a vendor-specific IDE is very limited, and it is mostly there as beads & mirrors for managers who need to decide what to buy. Many of the vendor-provided IDEs lack support for managing code, projects and version control, or are at least behind the latest trends in that area.
« Last Edit: November 13, 2015, 11:31:52 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #58 on: November 14, 2015, 12:26:33 am »
...
You'd better look at IAR's product before assuming! It takes a lot of work to write a good IDE, while the heavy power users will use their own editors, build systems and code management. In other words: the actual use for a vendor-specific IDE is very limited, and it is mostly there as beads & mirrors for managers who need to decide what to buy. Many of the vendor-provided IDEs lack support for managing code, projects and version control, or are at least behind the latest trends in that area.

Well, it's not my place to advocate for IAR  :-DD. In any case, this is one of the reasons we chose CodeBlocks in 2011 - it has the latest features, is used by an active community AND is built on a lightweight framework (i.e. not Java).
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #59 on: November 14, 2015, 01:35:34 am »
In any case, this is one of the reasons we chose CodeBlocks in 2011 - it has the latest features, is used by an active community AND is built on a lightweight framework (i.e. not Java).
I'm not a fan of Java either. Based on seeing lots of quirks in Java applications, I had serious doubts whether Eclipse would be a good choice for me to invest time in. But it seems the creators of Eclipse have them under control, although once in a while (about 5 times per year with almost daily use) a typical Java-ism rears its ugly head.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: ST or NXP
« Reply #60 on: November 14, 2015, 01:43:41 pm »
The only problem I have with Eclipse is that it runs on Java, which is continuously updated/attacked, and that makes it unstable and slow.
 

Offline zapta

  • Super Contributor
  • ***
  • Posts: 6190
  • Country: us
Re: ST or NXP
« Reply #61 on: November 14, 2015, 01:55:36 pm »
I don't know where you get this idea from; most third-party IDEs are quite competitive with Eclipse. Ours is based on CodeBlocks, and Atmel Studio is based on Visual Studio, and both of these have features as good as Eclipse's.

... and neither of them runs on Linux or Mac OSX. :)
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #62 on: November 14, 2015, 05:21:18 pm »
Java, which is continuously updated/attacked, and that makes it unstable and slow.
Now that is twisted logic. Needing to update Java is total overkill for a user-space application which runs from a standard (protected) user account. Besides that, the code in Eclipse is signed as well, and if you install an unsigned module into Eclipse it will show a big fat warning.

It is a totally different story if you have an internet-facing (web) server where people use all kinds of Java applications. But that is a very special case.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #63 on: November 14, 2015, 08:43:52 pm »
I don't know where you get this idea from; most third-party IDEs are quite competitive with Eclipse. Ours is based on CodeBlocks, and Atmel Studio is based on Visual Studio, and both of these have features as good as Eclipse's.

... and neither of them runs on Linux or Mac OSX. :)

Plain CodeBlocks does run on Linux and Mac OSX. We did/do not have the resources to port our debugger to Linux/OSX... yet, but it's a doable job.
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #64 on: November 14, 2015, 08:45:53 pm »
These days Java apps seem to be increasingly distributed with their own "integrated" JVM runtime.

Somewhat perversely, one of the selling points of Java was a shared framework, but it also became one of its downfalls. As well as having to tailor various mutually incompatible environment parameters for each application, there are so many backward-compatibility issues between versions that relying on a single JVM for all the Java-based apps from different vendors has become a practical impossibility. At one time we even resorted to Citrix and other virtualisation and sandboxing technologies. So nowadays, many Java applications ship with their own preconfigured sandboxed JVM, warts and all.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26896
  • Country: nl
    • NCT Developments
Re: ST or NXP
« Reply #65 on: November 14, 2015, 08:58:57 pm »
As far as I can see the shared-framework philosophy has been a big fail in every case. Too much bloat and too many incompatibilities.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #66 on: November 14, 2015, 10:09:41 pm »
As far as I can see the shared-framework philosophy has been a big fail in every case. Too much bloat and too many incompatibilities.

Agreed. .NET has very similar traits, some worse due to its inherent distribution and integration with the OS, and the massive maintenance overhead in the form of security updates for multiple versions in multiple locales. Sadly, at the development level, too few folks are interested in understanding the cost of the deployment foobars this introduces until it is too late.
 

Offline richardman

  • Frequent Contributor
  • **
  • Posts: 427
  • Country: us
Re: ST or NXP
« Reply #67 on: November 14, 2015, 10:36:16 pm »
 It's just the modern versions of DLL Hell.
// richard http://imagecraft.com/
JumpStart C++ for Cortex (compiler/IDE/debugger): the fastest easiest way to get productive on Cortex-M.
Smart.IO: phone App for embedded systems with no app or wireless coding
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: ST or NXP
« Reply #68 on: November 14, 2015, 10:59:16 pm »
It's just the modern versions of DLL Hell.

Agreed, but worse! Ever tried dealing with an SxS DLL cock up?
 

