
NXP LPC series better than an STM32? Which is easier/better for PIC migration?


In an earlier, similar topic, everyone ignored the suggestion that an experienced PIC coder might find it easier to first try migrating to NXP rather than ST... why?

STM bandwagon post: https://www.eevblog.com/forum/microcontrollers/from-pic-to-stm32/msg5107758/#msg5107758

In that topic, according to user nctnico:
"May I suggest not to blindly follow everyone into STM32. Especially for a beginner, NXP's LPC series of ARM microcontrollers are much easier to get going with. Better (correct!) and clearer documentation, peripherals that are designed so that they are easy to configure and all of the controllers have a serial bootloader besides SWD. I have been using both and I'd take an NXP LPC series over an STM32 any day because the latter just take more time to get going."

Can't really tell as I never used NXP.
I have two LPCXpresso LPC1114 boards I got for free in 2010 or so, but never used them! (OMG how time flies!).
But no matter how experienced you are with PICs (unless you did plenty of PIC32), ARM is a completely different architecture with lots of quirks; it's a steep learning curve!

I have not experienced great difficulty in first using Infineon's 5V Vdd PSoC 4200 series of ARM parts; there was not much difference in getting things to work properly, apart from the tiresome necessity of downloading many separate datasheets for every peripheral/topic needed to set up the MCU. Having a break-off USB-to-serial bootloader module built into the PSoC 4200 developer stamp made breadboarding and programming seamless and quick, and since the programmer is the break-off part of the stamp, programming is free of charge.

What I dislike about using MCU dev boards is their size, while a stamp dev board is sometimes small enough to fit into a small enclosure when soldered into/onto a proto PCB. A revised PCB after debugging/modifying can then easily shrink the MCU into the smallest footprint on the final board. With a stamp, I can quickly show a customer a working proto without a scary array of flying wires!
I have three large Arduino Mega2560 boards gathering dust because they are so damn big and clumsy to work with.

When I compare this with the gauntlet shown in the previous topic of getting CubeIDE and the STM device to blink an LED, the Infineon was a piece of cake.

What I dislike about the Infineon is the dev stamp cost: 20x more expensive than an STM board on AliExpress. However, the Infineon part is only a day away when ordered from Mouser, while the cheap AliExpress part takes a month to arrive.

What about the NXP?

I think the statement was "ignored" because, objectively, it's not true.

Most of the idiosyncrasies I hit when switching to ARM had more to do with the CPU core itself.
For example, having to enable power or clocks to each peripheral before using it. Or: an 8-bit CPU nicely matches the smallest addressable data size in C, making it easy to program, whereas a 32-bit CPU may introduce gotchas such as unaligned memory access and whatnot. Likewise, debugging HardFaults is one of those things.
The longest bug I had to chase down took 1.5 weeks to find. It was on an LPC1768, but it had nothing to do with NXP's work: it was a bit we forgot to manipulate inside the ARM CPU core itself, which led to stack problems.

And even where there are idiosyncrasies with peripherals, in my experience they mostly concern the chip scaffolding (buses, memory, clocks, power, DMA) rather than the individual peripherals. It's not as if setting up a UART or SPI controller on an STM32, LPC or PIC makes you jump through half a dozen hoops: a single pass through the peripheral registers is typically enough to make it work. ST reuses its peripherals among chips and may add a feature here or there, but typically "I don't care about this feature" => "keep the bit at zero" works, just like on a PIC, AVR or any other sanely designed MCU.

Now that doesn't mean there are no issues with, e.g., the peripherals of a particular vendor. STM32s, for example, (used to) have a frustrating-to-use DMA controller: each peripheral request could only be mapped to one particular DMA channel, which could lead to conflicts. This "firmware driver work" is in actuality hardware design, similar to how you would allocate peripherals depending on, e.g., I/O pad locations. Similarly, some internal signals only have a handful of routing options on STM32 where a more generic solution would be desirable.
Other vendors like SiLabs and Atmel have tried to solve this with PRS and the like, which are easy to use; however, for some of these chips the silicon itself may not tick all the boxes. E.g. if you need an industrial peripheral set (many CAN buses), then you'll probably skip SiLabs' offerings. If you need ultra low power, then you probably won't find many suitable options from NXP.

There just isn't a way to "have it all".

All these "easy to configure" aspects are similar across STM32s, LPCs, SiLabs, etc. I've yet to see any vendor do things drastically differently.
And the question is whether that's even possible or desirable. As soon as a vendor calls something "smart", I translate that into non-transparent, frustrating and a can of worms, similar to how a software ecosystem is a design lock-in, or a "prison".

The reason I think PICs and AVRs have none of these adoption issues is that those chips contain very, very little hardware. There is only so much that can go wrong with a handful of registers for a UART! And also only so many things to use it for...

Regarding documentation: I think all vendors try to do their best, but one way or another they will fall short; they cannot possibly document every possible configuration of their chips. I've been looking at an (admittedly old: rev 2) user manual of the LPC1768, and I'm not that impressed. The DMA controller chapter is only 29 pages long, of which 22 are register documentation (i.e. some auto-generated tables with bitfield annotations). The chapter says the DMA controller supports scatter-gather using linked lists, but then doesn't mention it once in the remaining 7 pages. It's basically undocumented, as the register listing doesn't specify how to use it either. Probably better to mail their sales/applications team about how it works... but that requires human interaction. Eek.

Now, STM32 documents can err in the opposite direction: an enormous amount of redundant blah-blah, which then (forgets to) reference content from half a dozen other places in a 2000+ page document. But at this point I also get the impression that using those chips bare-metal is the non-mainstream way: many people use ST's HAL and CubeMX software to generate driver initialization code.
Which one is better? Again: can't have it all, pick your poison.

My point is: in the end these 32-bit ARM chips have similar complexity; some chips have more quality-of-life features in the silicon (such as remappable I/O) and some in the datasheet. Which one you like more is subjective, and that's fine. This statement, however, has been repeatedly posted by this user. Subjectively speaking, such a statement cannot be false, but I don't think it's true either.

I've spent the last few years of my life on an arm32 (STM32F417) product. If you read through the many threads I started, you will get the idea :)

And I went from asm programming (decades of it) on the Z80/Z180 and similar chips to C on the arm32.

I'd say the hardest bits have not been related to the chip, but to the fact that in modern times a product is 10% basic function and 90% connectivity, and the latter is so complex (TCP/IP i.e. LwIP, TLS, etc.) that you have to use open-source code for it, which is often buggy, usually comes with no support, and is huge; e.g. TLS at 200k is ~half of my whole product! And yes, I started a thread on the crap situation with open-source code, too :)

If I was merely moving from a Z180 or H8 to the 32f417, I would have probably spent a week or two to get it going and then it would be downhill all the way. The chip is so fast that typical bottlenecks just disappear.

FreeRTOS integration is also easy enough (and highly desirable).

It is, however, true that in certain cases the open-source code is supplied pre-integrated; the classic one is the ESP32, where Espressif paid someone to package the whole thing into something that works fully out of the box. The downside? Chinese political risk, and doubtful production life (nobody is likely to beat ST, or Hitachi/Renesas, on production life). The Chinese do not really care about shafting customers; their entire culture is opportunistic.

All that said, this project will be my last one in terms of hardware and software integration. I plan to use this as a base for all future products, and with 168MHz I am confident it will do everything I need.

