I have a QuadTech 1880 milliohm meter that I would like to calibrate, but I can find nothing online about the exact procedure.
So far, it seems to be:
1) Turn unit on
2) Move calibration switch to enable position
3) Unit shows "0"; short the probes and press the "Man/Ext" key. Unit does the zero cal.
4) Press "Man/Ext" again; unit shows the required reference value. Connect that reference to the probes and press "Man/Ext". Unit does the cal.
5) Repeat steps 3 & 4 for each range through the 1.9 M range; unit then resets.
6) Move calibration switch back to the disabled position.
7) Cycle power, done.
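For what it's worth, steps 3 and 4 look like a standard two-point (zero/gain) calibration on each range: the shorted-probe reading gives the offset, and the reference reading gives the gain. A minimal sketch of that math, assuming one offset and one gain constant per range (all names here are hypothetical; the 1880's internal storage isn't documented):

```python
# Two-point (zero/gain) calibration, as steps 3 and 4 suggest.
# Function and variable names are hypothetical illustrations only.

def calibrate_range(raw_at_short, raw_at_reference, reference_value):
    """Derive offset and gain from the shorted-probe and reference readings."""
    offset = raw_at_short                                  # step 3: zero cal
    gain = reference_value / (raw_at_reference - offset)   # step 4: gain cal
    return offset, gain

def corrected(raw, offset, gain):
    """Apply the stored constants to a raw reading."""
    return (raw - offset) * gain

# Example: say the ADC reads 12 counts shorted and 1012 counts with a
# 1.000-ohm reference on a 1.9-ohm range.
offset, gain = calibrate_range(12, 1012, 1.000)
print(corrected(512, offset, gain))  # mid-range reading -> 0.5
```

If the meter really works this way, a bad reference value or a noisy gain-cal reading on one range would corrupt only that range, which might help narrow down cal versus hardware.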
What is happening is that after this, the meter only reads correctly when the measured value's first digit is in the lower part of the current range. Values of 1.5 milliohms, 1.5 ohms, 1.5 K, or 1.5 M measure correctly; values such as 4 milliohms, 6 milliohms, or 4 K give erratic readings.
Question is, is this due to a bad cal, or are there hardware issues with the meter? If I knew the exact calibration procedure for certain, it would at least tell me where to start troubleshooting.
TIA,
Hal