EEVblog Electronics Community Forum
Products => Test Equipment => Topic started by: mojoe on June 16, 2015, 02:44:10 am
-
Using an EDC 330 DC voltage standard, I was checking my HP3457A. The full scale DC ranges on the 3457A are 30 mV, 300 mV, 3 V, 30 V and 300 V. I set the EDC 330 for a full scale value for each range. These are the readings I got on the 3457A.
29.99336 mV
(30 - 29.99336) / 30 = .00022% error = 2.2 ppm
299.9348 mV
(300 - 299.9348) / 300 = .00022% error = 2.2 ppm
2.999443 V
(3 - 2.999443) / 3 = .00019% error = 1.9 ppm
29.99526 V
(30 - 29.99526) / 30 = .00016% error = 1.6 ppm
299.9384 V
(300 - 299.9384) / 300 = .00021% error = 2.1 ppm
I set NPLC = 100 and DIGITS DISP = 7, then waited until I got a stable reading. When the least significant digit (LSD) was changing, I took the worst-case value.
OK, this should be very simple math, so please tell me if I did something incredibly stupid. If these numbers are correct, then both the voltage standard and the DMM are more than ten times better than spec. This seems too good to be true.
-
Yes, your percentages are off by a factor of 100.
(30 - 29.99336) / 30 = .00022
Percent would be .00022 x 100 or .022%
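To make the unit conversion concrete, here is a small illustrative Python sketch (not part of the thread) that recomputes the OP's readings as a dimensionless fraction, then as percent and ppm:

```python
# Recompute the error for each reading posted above.
# fraction = (nominal - measured) / nominal  (dimensionless)
# percent  = fraction * 100
# ppm      = fraction * 1e6
readings = {
    "30 mV":  (30.0,  29.99336),
    "300 mV": (300.0, 299.9348),
    "3 V":    (3.0,   2.999443),
    "30 V":   (30.0,  29.99526),
    "300 V":  (300.0, 299.9384),
}

for rng, (nominal, measured) in readings.items():
    fraction = (nominal - measured) / nominal
    percent = fraction * 100
    ppm = fraction * 1e6
    print(f"{rng}: {fraction:.6f} = {percent:.4f}% = {ppm:.0f} ppm")
```

Run as-is, this shows each range lands near 200 ppm, i.e. comfortably within the instrument's class but a factor of 100 larger than the 2 ppm figures in the first post.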
If your meter is anything like my two, then either your standard is slightly off or your meter isn't calibrated. Did you run autocal before testing?
-
220 ppm, not 2.2 ppm. You should use only one range on your DMM, the most linear one; there will be an offset between ranges.