One thing that drives me nuts working on old meters (and new meters, I guess) is the crazy requirements for AC calibration. You need things like a 1000 V AC 400 Hz source, and (I think) a 20 kHz source.
Is there any reasonably priced way to get these signals? I am always on the lookout for a Fluke calibrator, but they usually go for at least $500, which is way too much to spend on something I would only use to repair old stuff.
Old thread, but I'll answer anyway. Back around 1968, I was working at Wavetek, when they were making differential voltmeters, trying to compete with Fluke. Initially, for AC calibration, we were using an audio oscillator feeding a McIntosh audio power amp, with a transformer to step up the voltage. We used a Holt thermal transfer standard to determine what the output voltage actually was. We eventually replaced the kludge with an Optomation AC calibrator, which was a 4-foot-tall rack-mount oscillator, 6-digit ratio transformer, and power amp. It still required the Holt to determine the AC voltage.
So there are ways to fake it with non-standard signal sources. Not too hard to get the high voltage.
And unless you need to measure high voltage at high frequency with accuracy, skip calibrating the 1 kV range, or calibrate it at a lower voltage. Something like the AnEng AN870 for $30 can measure AC true RMS with 0.3% accuracy up to 3 kHz. A scope could be used to verify the signal source is flat with frequency. A scope with a differential plug-in (7603/7A13, 545/W, etc.) could measure the signal directly with accuracy: peak voltage x 0.707 = RMS, for a clean sine wave. I'm using the 7A13 to measure an AC signal source to verify my new HP 3456 is reasonably accurate on the 1 and 10 V AC ranges. A DC measurement of the comparison voltage terminal on the 7A13 verifies its accuracy.
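For anyone who wants the peak-to-RMS arithmetic spelled out, here's a minimal Python sketch (the function names are mine, just for illustration). The 0.707 factor is 1/sqrt(2) and only holds for an undistorted sine wave; a scope usually gives you peak-to-peak, so halve that first:

    import math

    def sine_rms_from_peak(v_peak):
        # RMS of an ideal sine wave: Vrms = Vpeak / sqrt(2) ~= Vpeak * 0.7071
        return v_peak / math.sqrt(2)

    def sine_rms_from_pp(v_pp):
        # Scopes typically read peak-to-peak; peak is half of that
        return sine_rms_from_peak(v_pp / 2.0)

    # Example: a 2.828 V p-p sine should read about 1.000 V RMS on the DMM
    print("%.4f V RMS" % sine_rms_from_pp(2.828))

If the source has visible distortion, the 0.707 shortcut is off and you're back to needing a true-RMS measurement.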