As it happens, I'm finishing up work on a calibrator right now, a Fluke 5101B, so I can tell you in great detail how this one works--but that would be a long post. I'll just give you a quick run-through, and you can ask about any part that isn't detailed enough.
To start, for smaller outputs, the reference is supplied to a DAC that outputs a specific fraction of that reference. That fraction is supplied as a reference to a control board, which compares two signals--the DAC output and the output of the calibrator--and produces a control signal. The control signal goes to a power amplifier, which responds and produces the calibrator output. The initial state, of course, is zero output, so the control board ramps up the control signal until the power amplifier output matches the DAC signal. That works for the range that corresponds exactly to the DAC output range, which on the 5101B is 0.2 to 2.0 volts.
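If it helps to see the principle in code, here's a minimal sketch of that servo idea. Everything in it is an illustrative assumption--the gains, names, and step size are made up, not the 5101B's actual circuitry--it just shows an integrating loop ramping the amplifier drive from zero until the output matches the DAC's fraction of the reference:

```python
# Sketch of the control-loop principle only; all names and numbers
# here are assumptions for illustration, not the real 5101B design.

REFERENCE_V = 2.0    # assumed full-scale reference
AMP_GAIN = 3.7       # arbitrary power-amplifier gain (unknown to the servo)
LOOP_GAIN = 0.1      # integrator step size
TOLERANCE_V = 1e-6   # settling tolerance

def amplifier(drive: float) -> float:
    """Stand-in for the power amplifier: output is some gain times drive."""
    return AMP_GAIN * drive

def settle(dac_fraction: float) -> float:
    """Ramp the drive until the amplifier output matches the DAC reference."""
    target = dac_fraction * REFERENCE_V  # e.g. 0.6 -> 1.2 V on the 0.2-2.0 V range
    drive = 0.0                          # initial state: zero output
    output = amplifier(drive)
    while abs(target - output) > TOLERANCE_V:
        drive += LOOP_GAIN * (target - output)  # integrate the error
        output = amplifier(drive)
    return output

print(settle(0.6))  # settles to ~1.2 V
```

The point being that the loop never needs to know the amplifier's gain; it just keeps integrating the error between the DAC reference and the fed-back output until they agree.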
For higher or lower signals, the output is scaled through the ranging board, which is a complex series of shunts, voltage dividers, and relays. So for the 2.0 to 20 volt range, it divides the actual output by 10 and sends that back to the control board. For voltages over 20 volts, the system switches to HV mode, which changes a bunch of relays so that the power amplifier now produces a modified 2 kHz square wave. That is fed to a large transformer that steps it up by a factor of approximately 10 or 50 depending on the range; the transformer output is then rectified and filtered by an especially tricky little circuit, and that voltage--as high as 1100 VDC--is sent to the ranging board, where it is directed to the output terminals and also divided down and sent back to the control board.
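Again as a rough sketch only--the actual breakpoints, divider ratios, and relay logic in the 5101B will differ from these assumed values--the ranging board's job is to divide whatever the real output is back into the DAC's 0.2 to 2.0 volt window so the same servo can be reused on every range:

```python
# Illustrative ranging logic; the ratios and range breakpoints here
# are assumptions, not the real 5101B relay/divider configuration.

def feedback_divider(output_v: float) -> float:
    """Return the divided-down sample of the output that the control
    board sees, keeping the feedback inside the 0.2-2.0 V DAC window."""
    if output_v <= 2.0:
        return output_v           # base range: output fed back directly
    elif output_v <= 20.0:
        return output_v / 10.0    # 2-20 V range: divide by 10
    else:
        # HV mode: amplifier drives a step-up transformer (~x10 or ~x50),
        # the result is rectified and filtered, and a larger divider
        # brings it back into the DAC window.
        return output_v / 100.0 if output_v <= 200.0 else output_v / 1000.0

for v in (1.2, 12.0, 120.0, 1100.0):
    print(v, "->", feedback_divider(v))  # every sample lands in 0.2-2.0 V
```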
So the short version is that the reference is used as just that: a reference that controls the output of an entirely separate voltage source, by comparison, with scaling in between.
Edit: The way I've described it is actually not correct; I've mishmashed the AC and DC functions. I won't bother correcting it here because the point is the same, even if the numbers are wrong.