The output hangs out at about -0.2 V when it is disabled (I cleaned the HV output relay contacts; they seem to be fine). No, the Zero controls do nothing; they are for much finer adjustment than this.
i read this as "when the unit is set to STANDBY/RESET, the output settles to about -0.2 V"
that's ok, the output isn't well defined (i.e., regulated) while it is in standby. however, the output must indeed read 0 V when the unit is set to "METER VOLTAGE" _and_ the output voltage setpoint is dialled in as 0.000000 V (example: 343A, RANGE = 10 V)
Outputs are a few percent low (varying across ranges and with the phase of the moon).
The reference is spot-on and solid. The supply voltages are basically OK, but the two high-voltage supplies are each about 50 V high (I wonder what I could have done to... something?... when I briefly ran without the big filter caps to keep the RMS value correct - but you'd think the result there would be a collapsed, or at least lower-voltage, supply, right?). I think the real problem here is that I can't get the chopper amplifier to behave. Adjusting it moves the output around from a few percent under, to approximately correct (but unstable), to runaway high voltage. Even after I cleaned out the fancy WW pot R162 (it's now quite stable when rapped with a screwdriver), I can't get the spikes to stabilize at anything like the right level, and adjusting the Drive doesn't do anything obvious.
The manual helpfully suggests that a pair of FETs or an IC might be at fault, but these are 1969-era parts and unobtanium. Well, I could retrofit a DIP-package 709 in place of the TO-39, but the FE0654C and FT704 transistors can't be had. I wish they were socketed; I have yet to very carefully remove them for curve tracing.
consider your unit as uncalibrated/unadjusted for now, so no worries about missing accuracy. what to note here is that the output values don't seem stable when repeatedly measured: say, dial in 10 V, then measure the output 5 times at one-minute intervals and find 5 grossly (i.e., in the range of multiple tens of µV) differing readings. if that's the case, there might be some serious issue going on in the amplifier stage, or the divider stages, for that matter. test the output with appropriate equipment: anything less than 6 1/2 digits isn't up to the job. (EDIT: better reach for your 7 1/2!)
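to put a number on "grossly differing", here's a minimal sketch of that 5-reading check - the readings are hypothetical, just to show the arithmetic:

```python
# five readings of a nominal 10 V setpoint, taken one minute apart
# (hypothetical values, in volts - substitute your own meter log)
readings = [9.999973, 9.999941, 10.000008, 9.999962, 9.999990]

# peak-to-peak spread, expressed in microvolts
spread_uV = (max(readings) - min(readings)) * 1e6
print(f"peak-to-peak spread: {spread_uV:.1f} uV")

# multiple tens of uV of spread at 10 V points at instability
# (amplifier/divider trouble), not a mere calibration offset
```

a stable-but-miscalibrated unit would show a small spread around a wrong mean; it's a large spread that's the red flag here.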
doing any re-adjustment _before_ you have the entire supply back in spec isn't a good idea, no matter how good the ref's output actually looks.
R162 is not "adjusting" the chopper amp, it's meant to suppress switching noise of the chopper. hence, this adjustment has to be done before any attempt to bring the unit back to spec, because it _will_ influence the calibration, but it isn't _meant_ for calibration.
so, before you try to check transistors on a curve tracer or to swap out that war-weary 709 veteran, i'd suggest you first give the unit a full qualification run: let it sit powered on for 24 hrs, then dial in some cardinal values (say, 0.2, 2, 20, 200, 800 - whatever fits best into your meter's resolution) and note down all settings and readings (and the room temperature). give the instrument ample time to settle between readings - at least a minute, and on the HV range at least 10 minutes.
once done, dial it back to your lowest cardinal setpoint, let it sit another 24 hrs (powered up and _not_ in standby), and then re-do all measurements. make sure you are using the same cables, at the same position on the workbench, with the very same meter, at the same temperature. you get the picture.
if you find that your measurements deviate from the ones taken the day before, but are still within the ppm deviations stated in the manual, then you have a winner. if not, well...
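the "still in spec" call is just the day-to-day shift expressed in ppm of the nominal setpoint. a sketch - the spec value here is hypothetical, take the real figure from the manual:

```python
def ppm_shift(day1_v, day2_v, nominal_v):
    """day-to-day shift of a reading, in ppm of the nominal setpoint."""
    return (day2_v - day1_v) / nominal_v * 1e6

# hypothetical pair of readings at the 10 V setpoint, 24 hrs apart
shift = ppm_shift(9.999962, 9.999981, 10.0)
print(f"24-hr shift: {shift:+.2f} ppm")

SPEC_PPM = 15.0  # hypothetical stability spec; look up your manual's number
print("in spec" if abs(shift) <= SPEC_PPM else "out of spec")
```

do this per cardinal value and per range; a shift that's in spec on one range but way out on another also tells you which divider stage to look at.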