
A tale of two meters


Circlotron:
At work I often use my meter for setting the charging voltage of these small battery chargers that are part of our stuff to 13.65 volts. The other day I brought my home meter in and compared it and there was about 0.15 volts difference! The big question is, which one, if either, was right… So I chucked together this \$12 kit just for this situation. It puts out 2.5000 volts more or less exactly. The work meter is a 1999 max reading thing and the home meter is 3999 max. Adjusted the work meter to 2.50 volts reading and the home meter to 2.500 volts. All good, so I thought. Then I connected them both up to a 12 volts battery that was just sitting on the bench and the work one read 12.86 volts and the other  (better) home one 12.91 volts. Arrrgh! That bothered my inner OCD enough that I opened up the work one again and tweaked it to the same reading as the home one. The bottom line was adjusting the work one to read 2.50 volts has only 1/10 the accuracy of adjusting the home one to read 2.500 volts but setting it to a now known but higher accurate reference voltage is better. All in a day’s work.
------
Upon further reflection, I realised that the work meter was reading 2.50 volts on its 20 V range while the home meter was reading 2.500 volts on its 4 V range. Now that I have compared and set them at 12-odd volts, it occurs to me that the home meter was calibrated on its 4 V range but I was (wrongly) expecting a dead-on reading on its 40 V range. Sigh… back to square one.
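The resolution argument above can be put into numbers. This is a hedged sketch, not from the original posts: the count values are the meters as described (1999 and 3999 count), and the half-count worst-case error is an assumed idealization that ignores the meters' own gain and linearity errors.

```python
# Resolution of one least-significant digit for each meter on the
# range used during calibration (hypothetical idealized figures).

def lsd_volts(full_scale_volts, counts):
    """Volts represented by one count (least-significant digit)."""
    return full_scale_volts / counts

work = lsd_volts(20.0, 2000)  # 1999-count work meter, 20 V range
home = lsd_volts(4.0, 4000)   # 3999-count home meter, 4 V range

# Worst-case quantization error when adjusting to read "2.50" / "2.500"
# is half a count: the display cannot distinguish anything finer.
print(work / 2)  # 0.005  -> +/- 5 mV uncertainty on the work meter
print(home / 2)  # 0.0005 -> +/- 0.5 mV on the home meter
```

That factor of ten is the "1/10 the resolution" point above: calibrating the 1999-count meter at 2.50 V on its 20 V range leaves up to ±5 mV of pure display quantization, before any other error source.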

joeqsmith:
Surprised.  Most places I have worked will cal the equipment with a NIST-traceable standard, after which it is tagged and sealed.

Circlotron:
It is in fact a SLA battery charger as part of a larger piece of equipment. The charging voltage is not temperature compensated so the difference between ideal and actual charging voltage is actually greater than the meter error. The batteries are replaced on a 12 month schedule, but it is a bit seat-of-the-pants around here, thus my efforts at progressively establishing some sort of order in this small company...

joeqsmith:

--- Quote from: DiligentMinds.com on July 01, 2016, 01:36:46 am ---
--- Quote from: joeqsmith on July 01, 2016, 01:30:37 am ---Surprised.  Most places I have worked will cal the equipment with a NIST-traceable standard, after which it is tagged and sealed.

--- End quote ---

If you are manufacturing something that has a critical setting [and from the OP's description it sounds like a lead-acid battery charger], then you need to figure out what is your error band, and then choose an instrument that is at least 4X better than that to perform the adjustments.  THIS IS THE VERY REASON why people calibrate things!

It seems that (now) the OP is going to have to find an appropriate instrument, and then RECALL all of the equipment that has been manufactured, and then properly adjust it.  This can end up being quite a bit more expensive than a decent calibrated instrument would have cost in the first place [and the OP will have to buy one of those anyway]...

{...sheesh!...}   :palm:

--- End quote ---
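The "at least 4X better" rule quoted above is the usual test-accuracy-ratio (TAR) guideline. A minimal sketch of the arithmetic, with an assumed charger tolerance that is not stated anywhere in the thread:

```python
# Sketch of the 4:1 test-accuracy-ratio rule (assumed tolerance value).

def required_meter_error(product_tolerance, ratio=4.0):
    """Maximum allowable instrument error under an N:1 TAR."""
    return product_tolerance / ratio

tol = 0.05  # hypothetical: charger must hold 13.65 V within +/- 50 mV
print(required_meter_error(tol))  # 0.0125 -> meter must be good to ~12.5 mV
```

Under that assumed tolerance, the 0.15 V disagreement between the two meters is roughly twelve times the error budget the adjusting instrument would be allowed.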

johansen:
I have spent a fair number of minutes tweaking the four divider resistors in a certain meter, adjusting them by a few hundredths of a percent to bring the ranges into line.

I would not immediately junk a meter if it's out by 1%. You get what you pay for… but you also get what you bought based on hype. Even some of the cheapest DMMs have better than 1%-matched resistors, and the same DMM chip and circuitry as a meter that costs 2-3 times as much (they are 6000-count meters, to give you a hint).