In what way would a digital output from an MCU, toggling up and down at somewhere around the mains frequency, actually qualify as 'more stable' than a dedicated DAC designed for the purpose?
What characteristics are you expecting of the digital device's output in each of its two states? When it's at logic '1', it'll sit at the local supply voltage (whatever that happens to be doing), minus some temperature- and load-dependent figure, plus whatever interference couples out from the inside of the chip. Logic '0' doesn't have the same PSU dependency, but it's still never exactly 0 V.
How are you powering this digital device of yours? If you're relying on the logic '1' output level being constant, then a supply that's meant to be 3.3 V but drifts up to 3.300003 V has already cost you one LSB of error at 20 bits. Where is your PSU getting its voltage reference from, and what is the drift in its control loop?
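Just to put a number on that, here's a quick back-of-envelope sketch in C, using the same assumed figures as above (3.3 V full scale, 20 bits):

```c
/* Rough sketch: size of one LSB at 20 bits if full scale is the 3.3 V rail. */
#include <stdio.h>

int main(void)
{
    const double vref = 3.3;                /* assumed full scale = supply rail */
    const double lsb  = vref / (1u << 20);  /* 2^20 codes across full scale     */
    printf("1 LSB = %.3g V\n", lsb);        /* ~3.1 uV, so 3.3 V -> 3.300003 V is ~1 LSB */
    return 0;
}
```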
Let's suppose you can generate a PWM signal at 100 Hz, which you can do to 20-bit precision with an FPGA running at 100 MHz or thereabouts. Let's also suppose that you have an "ideal" digital buffer on the output which delivers absolutely precise, repeatable logic levels, with no noise or drift whatsoever. What then? You need a filter whose output drifts by less than one LSB over each 10 ms PWM period in order to keep the output stable.
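The clock figure is just f_clk = f_pwm × 2^N; a small sketch with the numbers assumed above (100 Hz, 20 bits):

```c
/* Sketch: counter clock needed for an N-bit PWM at a given repetition rate. */
#include <stdio.h>

int main(void)
{
    const double f_pwm = 100.0;                /* Hz, PWM rate assumed above */
    const double steps = (double)(1u << 20);   /* 2^20 counts per period     */
    printf("Required clock: %.1f MHz\n", f_pwm * steps / 1e6);  /* ~104.9 MHz */
    return 0;
}
```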
Full scale at 20 bits is about a million LSBs. If your filter takes more than 10 ms to change its output by 1 LSB, then a full-scale change of output will take a million × 10 ms, which is nearly 3 hours.
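Same assumptions, same arithmetic, for the settling time (one 10 ms PWM period per LSB of movement):

```c
/* Sketch: worst-case full-scale settling if the filter moves ~1 LSB per 10 ms period. */
#include <stdio.h>

int main(void)
{
    const double period_s = 0.010;               /* one PWM period at 100 Hz      */
    const double steps    = (double)(1u << 20);  /* full scale ~ a million LSBs   */
    const double t        = steps * period_s;    /* seconds for a full-scale step */
    printf("%.0f s, about %.1f hours\n", t, t / 3600.0);  /* ~10486 s, ~2.9 h */
    return 0;
}
```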