I think some of the confusion may stem from where the terms originated versus what we have today, which has begun to blur some of the lines.
As stated, the original definitions were things like:
microprocessor - just the CPU. Useless on its own.
microcontroller - a CPU with all the required support to make it useful (some ROM, RAM, I/Os of various flavors, etc.); see the firmware sketch just after this list for what that self-sufficiency looks like. Typically the CPU has much less horsepower than its contemporary "microprocessor" cousins.
SOC - I'll get to that in a minute, especially since it's a comparatively new term.
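To make the microcontroller definition concrete, here's a minimal bare-metal sketch of the kind of firmware that self-sufficiency enables: CPU, memory, and I/O all live on one die, so the code just pokes memory-mapped registers directly. The register addresses and pin number below are hypothetical, purely for illustration; real values come from the specific part's datasheet.

```c
/* Minimal bare-metal sketch: toggle an LED on a microcontroller.
 * No OS, no external chips needed -- everything is on the die.
 * Register addresses are HYPOTHETICAL; consult your part's datasheet. */
#include <stdint.h>

#define GPIO_DIR  (*(volatile uint32_t *)0x40020000u) /* direction register (hypothetical) */
#define GPIO_OUT  (*(volatile uint32_t *)0x40020004u) /* output data register (hypothetical) */
#define LED_PIN   (1u << 5)

int main(void)
{
    GPIO_DIR |= LED_PIN;              /* configure the pin as an output */

    for (;;) {
        GPIO_OUT ^= LED_PIN;          /* toggle the LED */
        for (volatile uint32_t i = 0; i < 100000u; i++)
            ;                         /* crude busy-wait delay */
    }
}
```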
I think nowadays one may also need to consider the applications of said devices to help clarify the definition. (I'm saying this through gritted teeth because I KNOW someone will argue the point. Yeah... whatever. It's here to help illustrate my point.)
Now, to take the argument that some of the latest "CPUs" from Intel include a lot of that said support: I would agree it's starting to blur the line. However, can these same processors run completely on their own? (Asking because I honestly don't know. I've got better things to do with my time than track the latest offerings from Int-hell.) I suspect they can't. I suspect there isn't any on-board flash/ROM, and if there is any RAM, it's not enough to be useful. If my suspicions are wrong, then yeah, I guess in the strictest sense one could technically call such a chip a "microcontroller".
However, I think when you look at how and where they're put into application, you begin to see the real difference. These "microprocessors" are pretty much always used in something that makes up some flavor of a mainstream PC/server sort of thing. And yes, I'll even toss things like DVRs into this realm, since a DVR is really just a PC with a dedicated purpose.
Now on the flip side, our microcontroller friends are pretty much always used in small, specialized, embedded-systems-type applications. You don't find anyone trying to make a general-purpose PC out of a microcontroller. Having said that, do you ever see a generic Intel/AMD "microprocessor" used in a small, dedicated embedded system? Sure you do. But as I mentioned, the application of said devices only helps clarify the definition; it doesn't dictate it.
So where do SOCs fit in? Well, you have to look at their origins: they came about as descendants of ASICs. Again, in the strictest sense, could they be defined as "microcontrollers"? Most of the time, probably so (you'd need to evaluate them on a case-by-case basis). However, they are usually built to serve a dedicated purpose and contain a lot of specialized blocks to fit that purpose. I've worked on SOCs where the only thing the chip knew how to do was act as a print server, or a storage network bridge, etc. Yes, that was in part due to the firmware/software written for it, but even if you tried to get it to do something else, it wouldn't be very good at it. The general architecture was tuned to facilitate that one specific application (as their ASIC roots imply).
So could you force-feed a "generic" microcontroller to do what an SOC does? Possibly, but you'd most likely need lots of other supporting hardware, and it wouldn't be very good at it. For example: when was the last time you saw a microcontroller with a SerDes capable of supporting FC/SAS? Even if you could get all the hardware together, it would have really poor performance. (One of the key architectural aspects of any SOC is its datapath, which has been optimized to move data around and through the various functional blocks as efficiently as possible.)
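To illustrate the datapath point, here's a hedged sketch (the DMA register map below is hypothetical, not any real part's) contrasting the CPU hand-copying every byte against programming an on-chip DMA engine that streams the data through a dedicated datapath while the core does other work. A real SOC takes this much further, chaining data through its specialized functional blocks without the CPU ever touching it.

```c
/* Sketch of CPU-driven vs. DMA-driven data movement.
 * The DMA register layout and base address are HYPOTHETICAL,
 * for illustration only. */
#include <stdint.h>
#include <stddef.h>

typedef struct {
    volatile uint32_t src;   /* source address              */
    volatile uint32_t dst;   /* destination address         */
    volatile uint32_t len;   /* transfer length in bytes    */
    volatile uint32_t ctrl;  /* bit 0 = start, bit 1 = busy */
} dma_regs_t;

#define DMA ((dma_regs_t *)0x40030000u)  /* hypothetical base address */

/* CPU-driven copy: every byte passes through the core's registers. */
void copy_with_cpu(uint8_t *dst, const uint8_t *src, size_t len)
{
    while (len--)
        *dst++ = *src++;
}

/* DMA-driven copy: the CPU only programs the engine; the dedicated
 * datapath moves the data while the core is free to do other work. */
void copy_with_dma(uint32_t dst, uint32_t src, uint32_t len)
{
    DMA->src  = src;
    DMA->dst  = dst;
    DMA->len  = len;
    DMA->ctrl = 1u;              /* kick off the transfer */
    while (DMA->ctrl & 2u)
        ;                        /* wait for the busy bit to clear */
}
```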
Could you take a generic microprocessor and have it do what an SOC does? Sure, but it would require even more additional hardware and wouldn't fit in your pocket as nicely as your cell phone does. We won't even talk about battery life.
The inverse of all this is also true when trying to make an SOC fit the applications of the others.
I will agree that a convergence is beginning. Is it possible that someday we reach a point where you simply have a generic "compute core chip"? Possibly.
Now, just a footnote on the following reply:
These days a Microprocessor could just be software.
It is a computing core without peripherals.
No, that's for simulation purposes only. Any useful uP is (or runs on) physical hardware.
You can buy or write Code to embed a Microprocessor into an FPGA along with whatever other peripherals you require.
Is the resulting FPGA then a MicroController?
No. If that's your thought process, then you don't have a clear understanding of what's happening. The language you use is an HDL; by definition, it is used to describe hardware. You actually need to be thinking hardware when you write this code, because that's what it truly turns into. The microprocessors you're speaking of are what's known in the industry as "soft cores". They're called that because you're given the HDL code for the core, and it's left up to you to run it through synthesis, place and route, etc. to generate the hardware.

If you put all the appropriate pieces together in an FPGA, could it then be called a "microcontroller"? In the strictest sense, yes. Why? Because what is a microcontroller but a bunch of logic gates arranged to provide a CPU and all the supporting infrastructure in a single package? The difference with doing it in an FPGA is simply that the FPGA can also be reconfigured to do other things. (There are also trade-offs in power, size, speed, etc., but we'll ignore those for this discussion.)
The advances in technology have blurred the old distinctions.
Agreed. 100%. The lines are definitely getting greyer.
EDIT: I almost forgot one last thing that sets SOCs apart from "microcontrollers". As most SOCs are built on ASIC processes, they often don't have on-board flash; the fab process technologies really aren't compatible (at least not enough to make it practical in most cases). So what does that mean? It means that SOCs almost always require an external ROM/flash part to get their code from.
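To show what that implies in practice, here's a minimal sketch of the kind of first-stage boot loader that typically lives in a small mask ROM on the SOC: it shadows the firmware image from an external, memory-mapped flash part into on-chip SRAM and jumps to it. All addresses and sizes below are hypothetical.

```c
/* Sketch of a first-stage boot loader on a flash-less SOC.
 * It copies the real firmware from external flash into on-chip
 * SRAM, then jumps to it. Addresses and sizes are HYPOTHETICAL. */
#include <stdint.h>

#define EXT_FLASH_BASE ((const uint32_t *)0x90000000u) /* external flash, memory-mapped */
#define SRAM_BASE      ((uint32_t *)0x20000000u)       /* on-chip SRAM */
#define IMAGE_WORDS    (64u * 1024u / 4u)              /* 64 KiB firmware image */

void boot(void)
{
    const uint32_t *src = EXT_FLASH_BASE;
    uint32_t *dst = SRAM_BASE;

    /* Shadow the firmware image from external flash into SRAM. */
    for (uint32_t i = 0; i < IMAGE_WORDS; i++)
        dst[i] = src[i];

    /* Jump to the entry point of the copied image. */
    void (*entry)(void) = (void (*)(void))SRAM_BASE;
    entry();
}
```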