> Then stop talking about some very specific, obscure, presumably APPLICATION PROCESSORS targeted at a specific single purpose, as the thread is about MCUs. See the difference?
> I don't want to be mean, but it is just not true at all, considering general MCUs. Some MPC5xxx or Renesas is not in any way a general purpose MCU. That is just some automotive specific whatever strange part, and I strongly doubt that the OP was asking for these.
Lots of microcontrollers have external memory interfaces. Maybe not on every model or package variant, but it is quite common, and not just on "automotive" stuff. Automotive is a *huge* market for microcontrollers, and many parts are designed to support automotive applications even though they are also used elsewhere, so "automotive specific strange part" makes no sense.
Anyway, there are tons of reasons to have external DRAM in a microcontroller, even if you think the application wouldn't require it.
* One reason might be to store log data. Think of a monitoring device that normally sends data immediately to some remote host but needs to be able to keep functioning for a few days without access to that host.
* You might want to support safe(r) firmware upgrades. Most microcontrollers have far more flash than SRAM. If you have external DRAM, you can load the new firmware image into it, checksum it, and only then start programming the flash. That reduces the risk of corruption if the upload connection is broken or has data errors. In this case the DRAM might not even be used during normal operation.
* More and more microcontrollers are being asked to host web interfaces. Whether that is a good idea or not, rendering those pages can take a lot of RAM. So it makes sense to keep your real-time stuff in SRAM and put the web stuff in DRAM.
* A CNC motor control system might need DRAM to store a large program while using SRAM to hold only the currently executing tool path.
* In a low volume application it might make sense to simply add the DRAM for future growth, since it is easier than retrofitting systems.
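The firmware-staging idea above can be sketched in C. This is a minimal simulation, not a real driver: the `dram_staging` and `flash_sim` buffers, and the `firmware_update` function, are hypothetical names standing in for external DRAM and on-chip flash, and a real port would replace the final `memcpy` with the part's flash erase/program sequence.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical buffers standing in for external DRAM and on-chip flash. */
static uint8_t dram_staging[4096];
static uint8_t flash_sim[4096];

/* Plain reflected CRC-32 (polynomial 0xEDB88320), bitwise variant. */
static uint32_t crc32(const uint8_t *p, size_t n)
{
    uint32_t crc = 0xFFFFFFFFu;
    while (n--) {
        crc ^= *p++;
        for (int k = 0; k < 8; k++)
            crc = (crc >> 1) ^ (0xEDB88320u & (uint32_t)(-(int32_t)(crc & 1u)));
    }
    return ~crc;
}

/* Stage the whole image in DRAM, verify it, and only then touch flash.
 * If the checksum fails, the old firmware in flash is never disturbed. */
static int firmware_update(const uint8_t *image, size_t len, uint32_t expected_crc)
{
    if (len > sizeof dram_staging)
        return -1;                          /* image too large for staging area */
    memcpy(dram_staging, image, len);       /* "receive" the upload into DRAM */
    if (crc32(dram_staging, len) != expected_crc)
        return -1;                          /* corrupt upload: flash untouched */
    memcpy(flash_sim, dram_staging, len);   /* real code: erase + program flash */
    return 0;
}
```

The point of the structure is that the only irreversible step (writing flash) happens after the whole image has been received and verified, which is exactly what a DRAM-sized staging buffer buys you.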
Also, I want to steadfastly and strenuously object to NorthGuy's characterization of "bad programmers." A bad programmer is one who wastes time writing the smallest, least featureful program to fit into a tiny microcontroller when the real-time requirements don't demand it and the volume cost savings don't justify it. Frequently, microcontroller developers do really crappy things like impose arbitrary limits or skip proper validation of input values just to fit in memory, since SRAM is still precious. If your application can support DRAM and that lets you write more functional, less brittle code, that is a big win.

Frankly, many microcontroller applications, especially IoT nonsense, should not even use microcontrollers, but small application processors. There are now affordable ARM application processors with a reasonable complement of IO peripherals, and if you are going to run TCP/IP you should really do it on a real OS. Those WIZnet chips and tiny IP stacks are cool, but ultimately quite limited. Obviously, if you need very accurate real-time behavior or a huge number of IO pins, you may still need a microcontroller, but many applications don't require that.