Most replies seem to me to say that an MCU chip is too noisy to fabricate a proper monotonic DAC, and that clever schemes such as dithering and sigma-delta modulation are needed to attempt 8-bit DACs.
No, noise is not limiting the analog performance of an MCU to 5 bits, unless the design is a complete failure. Either you misunderstood some comment, or whoever said that had no idea what they were talking about.
You could say that noise starts to become a concern very roughly above 10 bits. The more noise margin you want, the more difficult and expensive it becomes, but going from 5 bits to, say, 8 bits definitely isn't a problem defined by noise; it's something else.
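To put rough numbers on that, here is a back-of-the-envelope calculation of LSB sizes at different resolutions, assuming (purely for illustration) a 3.3 V full-scale reference. A few millivolts of on-chip noise is comparable to a 10-bit LSB, but tiny next to a 5-bit or even an 8-bit step.

```c
/* Illustrative LSB sizes: LSB = Vref / 2^N.
 * The 3.3 V reference is an assumption for the example, not a spec. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double vref = 3.3;                    /* assumed full scale          */
    const int bits[] = {5, 8, 10, 12};

    for (int i = 0; i < (int)(sizeof bits / sizeof bits[0]); i++) {
        double lsb = vref / pow(2.0, bits[i]);  /* step size for N-bit DAC     */
        printf("%2d bits: LSB = %6.2f mV\n", bits[i], lsb * 1000.0);
    }
    return 0;
}
```

This prints roughly 103 mV per step at 5 bits, 13 mV at 8 bits, 3.2 mV at 10 bits and 0.8 mV at 12 bits, which is why noise only enters the picture around 10 bits and beyond.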
I don't get it. Trying to design with a 5-bit DAC today is like designing circuits using 20% tolerance resistors and paper capacitors.
The likely key is that it comes practically for free. A 12-bit DAC on that 30-year-old architecture would have increased the product price and moved it to a different product category altogether. You can come up with a lot of practical use cases for a free 5-bit DAC. So it's not a key feature; that MCU is not meant to be used for precision analog circuits, and the DAC is a useful bonus.
If you need 2000 arbitrary analog voltage levels, then the 5-bit DAC is of course the wrong tool for the job. Use an external DAC. The noise characteristics will be better, too.
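If it helps, driving an external DAC is not much work either. Below is a minimal sketch of writing one sample to a 12-bit SPI DAC using an MCP4921-style 16-bit frame; spi_write16() is a hypothetical helper standing in for whatever SPI driver your platform provides, and the command bits should be checked against the datasheet of the actual part you pick.

```c
/* Sketch: write one 12-bit value to an external SPI DAC (MCP4921-style frame).
 * spi_write16() is a hypothetical platform SPI helper, not a real library call. */
#include <stdint.h>

extern void spi_write16(uint16_t frame);     /* assumed to clock out 16 bits, MSB first */

static void dac_write(uint16_t value12)
{
    uint16_t frame = 0x3000u                 /* DAC A, unbuffered, 1x gain, output active */
                   | (value12 & 0x0FFFu);    /* 12-bit data in the low bits               */
    spi_write16(frame);
}
```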
Sometimes you are fine with a 20% tolerance resistor (a heater, for example!); sometimes you are fine with 32 voltage levels. One use case would be setting a current limit to 32 different levels. Another would be to inject a fine-tuning offset into the voltage feedback network, so if you have, say, a +12V supply, you could fine-tune it in 0.1V steps from 11.0 to 14.1V (32 codes times 0.1V gives a 3.1V span).
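As a sketch of that last use case, the mapping from target voltage to DAC code is just arithmetic; the 11.0 V floor and 0.1 V-per-step scaling are assumptions from the example above, and in a real design they fall out of the feedback-divider and injection-resistor values.

```c
/* Map a target supply voltage to a 5-bit DAC code, assuming (as in the example)
 * that code 0 corresponds to 11.0 V and each step raises the output by 0.1 V,
 * so codes 0..31 cover 11.0 V ... 14.1 V. */
#include <stdint.h>

static uint8_t supply_code_from_volts(double v_target)
{
    double code = (v_target - 11.0) / 0.1;   /* steps above the 11.0 V floor */
    if (code < 0.0)  code = 0.0;             /* clamp to the 5-bit range     */
    if (code > 31.0) code = 31.0;
    return (uint8_t)(code + 0.5);            /* round to the nearest code    */
}
```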