If you're going to use an MCU anyway, you might as well.
But use it only for assigning setpoints; let the analog circuitry do its thing, safe and robust.
Best method: build a circuit with voltage and current inputs that regulates to whichever is highest. It doesn't know anything about power and doesn't care (except that the maximum input values can only deliver V*I at most). Run two DAC outputs from the MCU to drive these inputs, and read back the actual V/I with an ADC. (Note that Vo != Vset when in current limit, and vice versa, so you always have an "I want this much V/I at the output" command, and an "okay, this is how much V/I we're getting" response.) Do the multiplication in software, and use a PID loop to adjust the V or I setpoint as needed.
This way, even if the MCU completely shits itself and the supply gets stuck on full output, you can't draw any more than Vmax * Imax watts, and the analog circuit is always 100% in control, regulating or limiting as it does. No transistors get harmed in the process, though things may heat up a bit (you might add a thermistor to regulate or protect against overheating, or a fuse on the supply input to eventually burn out if the current draw is, say, >3 times normal). Presumably the power limit doesn't need to respond very quickly, so it can live in a secondary loop, probably 3+ times slower than the analog V and I loops.
Tim