I have a 6.5-digit multimeter that is calibrated quite well on its 19.99999 V range. On the 0-199.9999 V range, however, it is not faring so well. So, I got to thinking.
I don't have a voltage standard I can use to calibrate the 199.9999 V range. For that, I would need a precise voltage of at least 60% of the range (i.e., at least 120 V DC), and an accurate (transfer) voltage reference at that level is quite expensive. So an idea struck me: what if I take 14 9 V batteries and measure them in pairs on my multimeter's 19.99999 V range? I sum the seven pair voltages (say 129.5 V total if each battery measures 9.25 V on average) and use that total to calibrate my 199.9999 V range.

Since each pair measurement should be very accurate (say 20-30 ppm), I expect my final calibration to be accurate to 140-210 ppm (i.e., 7 times 20-30 ppm) at the 129.5 V level (all batteries in series). That level of accuracy (140-210 ppm) is fine for me, since most of the high-voltage measurements I need to take will be in the 20-60 V range, where I expect much lower errors because that is closer to the voltage of a single battery pair (with its 20-30 ppm error).
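To make my arithmetic explicit, here is my back-of-the-envelope calculation as a short Python snippet. The 9.25 V per-battery figure is illustrative, and the linear error scaling (multiplying the per-pair relative error by the number of pairs) is exactly the assumption I am asking you to check:

    # My calibration scheme, as I currently understand it.
    # ASSUMPTION under question: the relative error of the summed voltage
    # equals the per-pair relative error times the number of pairs.

    n_pairs = 7                 # 14 nominal-9 V batteries, measured two at a time
    v_pair = 2 * 9.25           # ~18.5 V per pair, within the 19.99999 V range
    v_total = n_pairs * v_pair  # ~129.5 V transfer point for the 200 V range

    for rel_err_ppm in (20, 30):            # assumed per-pair measurement error
        total_ppm = n_pairs * rel_err_ppm   # my (possibly flawed) propagation
        abs_err_mv = v_total * total_ppm * 1e-6 * 1e3
        print(f"{rel_err_ppm} ppm per pair -> {total_ppm} ppm "
              f"({abs_err_mv:.2f} mV) at {v_total:.1f} V")

Running this prints 140 ppm (about 18 mV) and 210 ppm (about 27 mV) at 129.5 V, which is where my 140-210 ppm figure comes from.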
Am I correct in my assessment, or is there some fundamental flaw in this scheme?