At work I often use my meter to set the charging voltage on the small battery chargers that form part of our kit to 13.65 volts. The other day I brought my home meter in, compared the two, and found about 0.15 volts difference! The big question was which one, if either, was right… So I chucked together this $12 voltage-reference kit just for this situation. It puts out 2.5000 volts, more or less exactly.

The work meter has a 1999-count display and the home meter a 3999-count one. I adjusted the work meter to read 2.50 volts and the home meter to read 2.500 volts. All good, so I thought. Then I connected them both to a 12 volt battery that was just sitting on the bench: the work one read 12.86 volts and the (better) home one 12.91 volts. Arrrgh!

That bothered my inner OCD enough that I opened up the work one again and tweaked it to match the home one's reading. The bottom line: trimming the work meter against a 2.50 volt display gives only 1/10 the resolution of trimming the home meter against 2.500 volts, so once the home meter was set from the reference, using its now-known, higher and more accurate 12.91 volt reading to adjust the work one was the better approach. All in a day's work.
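To put numbers on that 1/10: here's a quick back-of-the-envelope sketch, assuming the usual one-count resolution of 10 mV on the work meter's 20 volt range and 1 mV on the home meter's 4 volt range.

```python
# Resolution of the 2.5 V calibration point on each meter, and what the
# same fractional uncertainty becomes when scaled up to the 13.65 V set-point.
# Assumes the work meter shows 2.50 on a 20 V range (10 mV per count)
# and the home meter shows 2.500 on a 4 V range (1 mV per count).

REF_V = 2.5        # reference output, volts
TARGET_V = 13.65   # charger set-point, volts

meters = {
    "work (1999 count, 20 V range)": 0.010,  # volts per displayed count
    "home (3999 count, 4 V range)": 0.001,
}

for name, step in meters.items():
    frac = step / REF_V              # one-count uncertainty as a fraction
    at_target = frac * TARGET_V      # same fractional error at 13.65 V
    print(f"{name}: ±{step * 1000:.0f} mV at 2.5 V "
          f"(±{frac * 100:.2f} %), ±{at_target * 1000:.0f} mV at 13.65 V")
```

One count of slop at 2.50 volts is about 0.4 %, which works out to roughly 50 mV at a 12.9 volt battery, which is about the size of the disagreement I saw.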
------
Upon further reflection, I have realised that the work meter was reading 2.50 volts on its 20 volt range while the home meter was reading 2.500 volts on its 4 volt range. And now that I have compared and matched them at 12-odd volts, it has occurred to me that the home meter was calibrated on its 4 volt range, yet I was (wrongly) expecting a dead-on reading from its 40 volt range. Sigh… back to square one.
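For a rough feel of why the two ranges can disagree even after one of them has been calibrated: the 4 volt and 40 volt ranges sit on different taps of the meter's input divider, so any mismatch between the taps shows up directly in the reading. The percentages below are only illustrative assumptions, not specs for my meter.

```python
# Rough sketch: if the 40 V range's divider ratio differs from the 4 V
# range's by a small fraction, a reading taken at ~12.9 V shifts by that
# same fraction. The mismatch values are purely illustrative.

BATTERY_V = 12.91                       # what the home meter showed on its 40 V range
for mismatch in (0.001, 0.002, 0.004):  # 0.1 %, 0.2 %, 0.4 % range-to-range mismatch
    shift = BATTERY_V * mismatch
    print(f"{mismatch * 100:.1f} % mismatch -> "
          f"about {shift * 1000:.0f} mV shift at {BATTERY_V} V")
```

Even a few tenths of a percent between range dividers is enough to account for the sort of difference I was chasing.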