I am (eventually) going to have to calibrate a Keithley 2000 multimeter that I am currently repairing. I don't have access to a calibrator, which is what the calibration manual calls for. However, I do have access to another, calibrated Keithley 2000. Could I basically just take a power supply, generate +10 V, -10 V and +100 V, verify each voltage with the calibrated multimeter, and then enter those values into the multimeter I am calibrating? That is, if I measure 10.01000 V with the calibrated multimeter, I then connect the uncalibrated multimeter to the same source and key in that value using the arrow keys?
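In case it helps make the idea concrete, this is roughly how I'd log the reference meter's reading over GPIB before keying it into the DUT; a minimal Python/pyvisa sketch where the GPIB address, range and averaging count are just my guesses, not anything from the manual:

```python
import statistics
import pyvisa  # needs a VISA backend installed (e.g. NI-VISA or pyvisa-py)

rm = pyvisa.ResourceManager()
ref = rm.open_resource("GPIB0::16::INSTR")  # guessed address of the calibrated K2000

ref.write("*RST")
ref.write(":SENS:FUNC 'VOLT:DC'")
ref.write(":SENS:VOLT:DC:RANG 10")  # pin the range so autoranging doesn't move
ref.write(":SENS:VOLT:DC:NPLC 10")  # slow integration for quiet readings

# Average a handful of readings before typing the value into the
# uncalibrated meter with the front-panel arrow keys.
readings = [float(ref.query(":READ?")) for _ in range(10)]
print(f"Enter into the DUT: {statistics.mean(readings):.6f} V "
      f"(stdev {statistics.stdev(readings):.2e} V)")
```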
Would I be able to do the same with resistance (1 kΩ, 10 kΩ, 100 kΩ, 1 MΩ) by digging resistors of those values out of my parts drawers, measuring each one with the calibrated multimeter, and then entering those numbers into the uncalibrated multimeter? What about DC current?
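To convince myself the drawer-resistor idea is sound, here's the quick back-of-envelope I did; the tempco and the temperature drift are assumptions on my part, not measured values:

```python
# Error budget for carrying a drawer resistor between two meters.
nominal = 10_000.0   # ohm (the 10k point)
tempco_ppm = 100.0   # ppm/degC, typical cheap metal-film part (assumed)
delta_t = 0.5        # degC change between the two measurements (assumed)

drift_ohm = nominal * tempco_ppm * 1e-6 * delta_t
print(f"Drift: {drift_ohm * 1000:.0f} mohm = {tempco_ppm * delta_t:.0f} ppm")
# ~50 ppm, which is already in the same ballpark as a 6.5-digit
# meter's one-year ohms spec, so measure on both meters back-to-back
# and use 4-wire connections so lead resistance drops out.
```

The upshot seems to be that the resistors don't need to be anywhere near nominal; they just need to stay put between the two measurements.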
If the above solution works, I guess I could use the same principle to generate the AC voltages and currents with a power amplifier fed by a signal generator, provided I can find something that outputs 50 kHz, which I think I can procure. The only remaining problem would be generating the 700 V point. Any ideas?
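My current back-of-envelope for the 700 V point is a step-up transformer after the amplifier; all the numbers below are my assumptions, and I haven't checked whether 700 V is needed at 50 kHz or only at low frequency:

```python
# Back-of-envelope for the 700 V AC point: power amplifier into a
# step-up transformer. All numbers here are assumptions, not specs.
v_out = 700.0   # V RMS called for by the cal procedure
v_amp = 20.0    # V RMS I assume the amplifier can deliver
ratio = v_out / v_amp
print(f"Required step-up ratio: 1:{ratio:.0f}")   # 1:35

# The meter's AC input is roughly 1 Mohm; reflected through the
# transformer it looks ratio^2 times smaller to the amplifier.
r_reflected = 1e6 / ratio**2
print(f"Reflected load: {r_reflected:.0f} ohm")   # ~800 ohm
# That seems fine at 1 kHz, but an ordinary mains transformer run
# backwards won't pass 50 kHz, so any high-frequency high-voltage
# points would need a different approach.
```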
Also, finally, you are supposed to use a Keithley 8610 low-thermal shorting plug, which basically shorts the four inputs (input low, input high, sense high, sense low) during calibration. I have seen that someone has had a go at making one themselves, but what would be the consequences of just soldering four banana plugs together, perhaps via a blank PCB?
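For scale, here are the ballpark thermal-EMF figures I've seen quoted in low-level measurement literature (e.g. Keithley's handbook); treat the exact numbers as rough, and the temperature gradient is assumed:

```python
# Why "low thermal" matters: approximate Seebeck coefficients for the
# junctions in a home-made short (ballpark figures, not guaranteed).
junctions_uv_per_degC = {
    "Cu-Cu (clean, crimped)": 0.2,
    "Cu-Pb/Sn solder":        3.0,
    "Cu-brass banana plug":   3.0,
}
delta_t = 1.0  # assumed degC gradient across the plug

for name, s in junctions_uv_per_degC.items():
    print(f"{name}: ~{s * delta_t:.1f} uV of offset")
# A few uV is tens of counts on the K2000's 100 mV range
# (0.1 uV per count), and it would get baked into the zero cal.
```

So I suspect a soldered banana-plug short would work, but only if the solder joints are kept at a uniform temperature while zeroing.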
Thank you for reading!