
% of reading


DavidTheElectrician:
Hi out there,

At the laboratory where I work, we always apply the DUT's "% of reading" spec to our reference value rather than to the DUT's reading itself.

For example, let's say I have a multimeter with a spec of +/- 1% of its reading:
If I then go ahead and apply 100 V to the DUT with my reference and the DUT reads 110 V, then I would assume that the specifications (tolerance) of that particular reading would be 110 V x 0.01 = +/- 1.1 V.

However, what we do instead in the laboratory is to apply those specs to our reference so that the specs (tolerance) in turn would be 100 V x 0.01 = +/- 1.0 V.
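
Here is a minimal sketch (Python) of the two calculations side by side, using the same hypothetical 100 V / 110 V numbers:

--- Code: ---
# Hypothetical numbers from the example above
applied = 100.0    # reference (true) value in volts
displayed = 110.0  # what the DUT shows in volts
spec = 0.01        # +/- 1% of reading

tol_from_reading = displayed * spec    # 1.1 V, spec applied to the DUT's reading
tol_from_reference = applied * spec    # 1.0 V, spec applied to the reference value

print(f"Tolerance from reading:   +/- {tol_from_reading:.2f} V")
print(f"Tolerance from reference: +/- {tol_from_reference:.2f} V")
--- End code ---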

What is the logic behind doing this?

Thanks

mendip_discovery:
Ok, I have yet to have my first brew so I may have got the wrong end of the stick.

The unit under test has had 100 V applied to it. The UUT should read 100 V if it's working well. So the % of reading is applied to the 100 V that was applied, not to what is displayed.

If you applied the % of reading to the displayed value of the UUT, you would only gain a very small amount of wiggle room, often less than the resolution. That room is usually taken up by your uncertainty anyway, which should be used in conjunction with the spec if you are making a pass/fail decision.
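
To put a rough, made-up number on how small that wiggle room is compared with the resolution (assuming a 1% meter with 0.01 V resolution reading close to nominal):

--- Code: ---
# Hypothetical 1% meter with 0.01 V resolution, reading close to nominal
applied = 100.00    # volts applied by the reference
displayed = 100.12  # hypothetical in-tolerance reading
spec = 0.01         # +/- 1% of reading
resolution = 0.01   # volts per least significant digit

tol_on_applied = applied * spec      # 1.0000 V
tol_on_displayed = displayed * spec  # 1.0012 V

extra_wiggle = tol_on_displayed - tol_on_applied
print(f"Extra wiggle room: {extra_wiggle * 1000:.1f} mV "
      f"(resolution is {resolution * 1000:.0f} mV)")
--- End code ---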

Kleinstein:
It should not make a significant difference whether the uncertainty is applied to the actual reading or to the actual value. Normally the two are very close together, and the specs themselves are not super accurate / sharp. So they give something like 1% error limits, but not 1.02%, because the limits are rather difficult numbers to estimate. Even a 1.1% limit would be an oddity.
With small errors one usually assumes symmetric errors. Only with high tolerances might one get things like +50% / -30% to take that into account.

If the reading and reference values are that far apart, something is seriously wrong anyway.

In many cases the relevant quantity is the ratio of the reading to the reference value, so it does not matter where you apply the factor.
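
As a rough sketch of that point: checking the ratio of reading to reference against the spec is the same test as applying the spec to the reference, and applying the spec to the reading instead only changes the outcome in a thin sliver right at the limit (the example values below are made up):

--- Code: ---
def pass_ratio(reading, reference, spec=0.01):
    # Spec applied to the ratio of reading to reference; this is algebraically
    # the same test as |reading - reference| <= spec * reference.
    return abs(reading / reference - 1.0) <= spec

def pass_spec_on_reading(reading, reference, spec=0.01):
    # Spec applied to the DUT's own reading instead.
    return abs(reading - reference) <= spec * reading

# The two only disagree in a sliver right at the 1% limit (around 101.0 V
# here), which is smaller than any realistic measurement uncertainty.
for reading in (99.2, 100.9, 101.005, 110.0):
    print(reading, pass_ratio(reading, 100.0), pass_spec_on_reading(reading, 100.0))
--- End code ---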

alm:
I agree with Kleinstein and mendip-discovery that if the difference matters, you're probably already so far out of tolerance that it doesn't make a difference. But in a strict interpretation I'd say the spec is X% of the DUT reading, not X% of the true value.

bdunham7:

--- Quote from: alm on October 09, 2022, 02:46:23 pm ---I agree with Kleinstein and mendip-discovery that if the difference matters, you're probably already so far out of tolerance that it doesn't make a difference. But in a strict interpretation I'd say the spec is X% of the DUT reading, not X% of the true value.

--- End quote ---

There is no globally correct answer to this question AFAIK, but as stated already it usually doesn't matter.  Here is a direct quote from a Fluke calibration manual for the 289.  Other instruments may specify differently, but they almost always have [% of reading] + [digits / % of range / similar] as their format.  From the user perspective, they don't have a reference to think about--just the meter.  So if you have a 1% tolerance meter, you expect the actual value to be +/- 1% of what you are seeing.

Detailed Specifications
Accuracy:
Accuracy is specified for a period of one year after calibration, at 18 °C to 28 °C (64 °F to 82 °F), with relative humidity to 90%. Accuracy specifications are given as: ±( [% of reading] + [number of least significant digits] ).

However, if you go into the Performance Test section of the same calibration manual, you will find that the test specifies an accurate reference input and acceptable limits for the meter to read.  Those limits are calculated by simply multiplying the specified tolerances (both parts) by the reference value.  I've never seen any exception to this, but the difference for any reasonable meter is going to be vanishingly small in most instances.  Even with a 2% analog meter, the difference in calculated test limits using either method would be less than the thickness of the paint on the needle.

You have to look at cases where meters might have very wide tolerances in certain functions.  Again, in the same manual you will see that the 400 mA AC current range >20 kHz is specified as 5% + 40 counts.  The performance test for this range is 300 mA at 30 kHz and the limits are 284.60 to 315.40 mA.  The limits calculated as an error from the reading would be different enough to notice.
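
Making that arithmetic explicit (the 5% + 40 counts figures are quoted from the manual; the 0.01 mA per count resolution and the "limits from the reading" variant are my own back-of-envelope assumptions):

--- Code: ---
reference = 300.0   # mA applied by the calibrator
spec_pct = 0.05     # 5% of reading
counts = 40         # least significant digits
resolution = 0.01   # assumed mA per count on the 400 mA range

# Limits as the performance test computes them: spec applied to the reference
tol = spec_pct * reference + counts * resolution
print(f"From reference: {reference - tol:.2f} to {reference + tol:.2f} mA")  # 284.60 to 315.40

# Limits if the spec were instead applied to the reading: solve
# |reading - reference| <= spec_pct * reading + counts * resolution for reading
lo = (reference - counts * resolution) / (1 + spec_pct)
hi = (reference + counts * resolution) / (1 - spec_pct)
print(f"From reading:   {lo:.2f} to {hi:.2f} mA")  # roughly 285.33 to 316.21
--- End code ---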

On a Fluke 5100-series calibrator, there is an error function.  Instead of supplying a fixed reference and trying to calculate the error from the DUT reading, you can raise and lower the reference until your DUT reads the specified amount, then read the error% display on the calibrator.  So if you start with a 100V reference and then adjust the error control to 110V, the error% display reads "-10.0000%".  I'll leave you to decide what that means!
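
Purely as back-of-envelope arithmetic on that teaser (the two candidate error definitions below are my own guesses, not anything taken from a Fluke manual):

--- Code: ---
nominal = 100.0  # V, the value the DUT is supposed to indicate
output = 110.0   # V, calibrator output after adjusting the error control

# Two plausible definitions of the displayed error:
err_vs_nominal = (nominal - output) / nominal * 100  # -10.0000 %
err_vs_output = (nominal - output) / output * 100    # about -9.09 %

print(f"Relative to the nominal 100 V: {err_vs_nominal:+.4f} %")
print(f"Relative to the actual output: {err_vs_output:+.4f} %")
--- End code ---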
