I would first start by answering the question of the trim goal. Is it to compensate for LTZ output voltage variation? Is it to make a "perfect" 10V output? Or is it to tweak the output tempco?
If it's the first one, then you would need a much wider range than a few ppm. If it's to make a perfect 10V, then one should worry more about long-term stability than about tempco.
There is a reason why most commercial designs do not have a digital trim: it reduces the possibility of added instability and noise in the reference voltage. Also, a precise trim is often not required, because the reference's recipient (DAC, ADC, PA output stage) can itself be trimmed to the desired output result, rather than the reference. Having a DAC at the reference side, however, is a good way to pick up ground loops, digital noise injection and gain instability, and long-term stability is not typically specified for these components. Sure, one can design in digital isolation (including a low-noise PS), careful current path control, etc., but at best you just reach the stability of the output resistor network. So it all sounds like a rather expensive feat without visible gains.
If doing a trim, maybe the better option is to make the 7V-to-10V stage fixed and as stable as possible, and trim the actual LTZ output instead. You can do so by injecting an offset current at the bias resistors (70K, etc.) or by making minor adjustments to the zener current (120R, etc.). So you trim the LTZ side to match the fixed output stage, rather than trying the opposite. It sounds easier to me, as you have more control points for output voltage and tempco trim on the LTZ circuit itself. I recall our LTZ veteran Dr.Frank did something similar on his original reference design, bringing the LTZ output to 7.00000V and then boosting that trimmed level to the desired 10V.
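To put rough numbers on that approach, here is a minimal back-of-envelope sketch in Python. The 7.2 V nominal is an assumed typical LTZ1000 value for illustration, not a figure from Dr.Frank's actual schematic, and the gain stage is taken as an exact 10/7 ratio:

```python
# Back-of-envelope numbers for the "fix the gain, trim the LTZ" approach.
# All voltages are assumptions for illustration, not values from any
# particular schematic -- check your own part and datasheet.

V_TARGET_OUT = 10.00000        # desired boosted output, V
GAIN_FIXED = 10.0 / 7.0        # fixed-ratio 7V -> 10V gain stage
V_LTZ_TYPICAL = 7.2            # assumed typical raw LTZ output, V

# LTZ voltage the fixed gain stage must see for an exact 10 V output
v_ltz_required = V_TARGET_OUT / GAIN_FIXED
print(f"required LTZ voltage: {v_ltz_required:.5f} V")

# Trim needed to pull an assumed typical part down to that value
delta = v_ltz_required - V_LTZ_TYPICAL
print(f"trim from {V_LTZ_TYPICAL:.2f} V: {delta * 1e3:+.1f} mV "
      f"({delta / V_LTZ_TYPICAL * 1e6:+.0f} ppm)")
```

With those assumed numbers the LTZ has to be pulled by a couple of hundred mV (tens of thousands of ppm), which also illustrates the earlier point: absorbing the initial voltage spread needs far more range than a few-ppm fine trim.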
There are also alternative ways for which a DAC trim wouldn't be much help. E.g. multiply the LTZ 7V with an LTC1043 or a similar flying-cap block, then divide the result back down to a 10V-ish level by a fixed ratio. That would also be noisier, but still acceptable if the injected noise is well under the LTZ's own output noise.
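For a rough feel of the ratios involved, a simple sketch, assuming a plain x2 flying-cap doubler (only one of the ways such a block can be wired) and an assumed 7.15 V raw LTZ output:

```python
# Illustrative ratio arithmetic: double the raw LTZ voltage with a
# flying-cap stage, then resistively divide back down to 10 V.
# The 7.15 V input and the x2 topology are assumptions, not a specific design.

V_LTZ = 7.15                       # assumed raw LTZ output, V
v_doubled = 2 * V_LTZ              # flying-cap doubler output
divider_ratio = 10.0 / v_doubled   # required output divider ratio

print(f"doubled output: {v_doubled:.3f} V")
print(f"divider ratio:  {divider_ratio:.5f}  (10 V / {v_doubled:.3f} V)")
```

The divide-back ratio then lives in a fixed resistor network, whose stability again ends up being the limiting factor, same as in the DAC-trim case.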