Author Topic: % of reading  (Read 2560 times)


Offline DavidTheElectrician (Topic starter)

  • Newbie
  • Posts: 4
  • Country: dk
% of reading
« on: October 09, 2022, 07:00:45 am »
Hi out there,

At the laboratory where I work, we always apply the DUT's "% of reading" spec to our reference rather than to the DUT itself.

For example, let's say I have a multimeter with a spec of +/- 1% of its reading:
If I then go ahead and apply 100 V to the DUT with my reference and the DUT reads 110 V, then I would assume that the specifications (tolerance) of that particular reading would be 110 V x 0.01 = +/- 1.1 V.

However, what we do instead in the laboratory is to apply those specs to our reference so that the specs (tolerance) in turn would be 100 V x 0.01 = +/- 1.0 V.
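
To make it concrete, here is a quick sketch of the two calculations in Python (the 1% spec, 100 V reference and 110 V reading are just the example numbers above):

Code:
SPEC = 0.01          # hypothetical +/- 1% "of reading" spec

reference = 100.0    # volts applied by the reference standard
reading = 110.0      # volts indicated by the DUT

tol_from_reading = SPEC * reading      # 1.10 V: spec taken on the DUT reading
tol_from_reference = SPEC * reference  # 1.00 V: spec taken on the reference (our lab's method)

print(f"Tolerance from DUT reading: +/- {tol_from_reading:.2f} V")
print(f"Tolerance from reference:   +/- {tol_from_reference:.2f} V")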

What is the logic behind doing this?

Thanks

 

Offline mendip_discovery

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: % of reading
« Reply #1 on: October 09, 2022, 07:51:30 am »
Ok, I have yet to have my first brew so I may have got the wrong end of the stick.

The unit under test has had 100 V applied to it. The UUT should read 100 V if it's working well. So the % of reading is applied to the 100 V that is applied, not to what is displayed.

If you applied the % of reading to the displayed value of the UUT, you would only gain a very small amount of wiggle room, often less than the resolution. That room is usually taken up by your uncertainty anyway, and should be used in conjunction with the spec if you are saying pass/fail.
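
A rough sketch of what I mean, in Python (the 100.5 V reading and the 0.1 V resolution are made-up but plausible numbers):

Code:
SPEC = 0.01          # 1% of reading
applied = 100.0      # V from the calibrator
displayed = 100.5    # V, a plausible in-tolerance reading
resolution = 0.1     # V, one least-significant digit on this range

# Extra "wiggle room" gained by taking the % on the reading instead of the applied value
extra = SPEC * abs(displayed - applied)
print(f"Extra wiggle room: {extra * 1000:.1f} mV vs. resolution {resolution * 1000:.0f} mV")
# -> 5.0 mV of extra room against a 100 mV digit: lost in the noise.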
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14192
  • Country: de
Re: % of reading
« Reply #2 on: October 09, 2022, 10:04:24 am »
It should not make a significant difference whether the uncertainty is applied to the actual reading or the actual value. Normally the two are very close together, and the specs themselves are not super accurate / sharp. So they give something like 1%, but not 1.02% error limits, because the limits are rather difficult numbers to estimate. Even a 1.1% limit would be an oddity.
With small errors one usually assumes symmetric errors. Only with high tolerances may one get things like +50% / -30% to take that into account.

If the reading and reference values are that far apart, something is seriously wrong anyway.

In many cases the relevant point is the ratio of reading to reference value, so it does not matter where you apply the factor.
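
As a small illustration (the numbers are arbitrary), the relative error computed either way differs only in second order:

Code:
reference = 100.0    # true / applied value
reading = 100.5      # DUT indication

err_vs_reference = (reading - reference) / reference  # 0.50000%
err_vs_reading = (reading - reference) / reading      # 0.49751%

print(f"Error vs reference: {err_vs_reference:.5%}")
print(f"Error vs reading:   {err_vs_reading:.5%}")
# The difference is far below the sharpness of a "1%" spec.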
 

Online alm

  • Super Contributor
  • ***
  • Posts: 2881
  • Country: 00
Re: % of reading
« Reply #3 on: October 09, 2022, 02:46:23 pm »
I agree with Kleinstein and mendip-discovery that if the difference matters, you're probably already so far out of tolerance that it doesn't make a difference. But in a strict interpretation I'd say the spec is X% of the DUT reading, not X% of the true value.

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7849
  • Country: us
Re: % of reading
« Reply #4 on: October 09, 2022, 04:09:21 pm »
Quote
I agree with Kleinstein and mendip-discovery that if the difference matters, you're probably already so far out of tolerance that it doesn't make a difference. But in a strict interpretation I'd say the spec is X% of the DUT reading, not X% of the true value.

There is no globally correct answer to this question AFAIK, but as stated already it usually doesn't matter.  Here is a direct quote from a Fluke calibration manual for the 289.  Other instruments may specify differently, but they almost always have [% of reading] + [digits / % of range / similar] as their format.  From the user perspective, they don't have a reference to think about--just the meter.  So if you have a 1% tolerance meter, you expect the actual value to be +/- 1% of what you are seeing.

Quote
Detailed Specifications
Accuracy:
Accuracy is specified for a period of one year after calibration, at 18 °C to 28 °C (64 °F to 82 °F), with relative humidity to 90%. Accuracy specifications are given as: ±( [ % of reading ] + [ number of least significant digits ] ).


However, if you go into the Performance Test section of the same calibration manual, you will find that the test specifies an accurate reference input and acceptable limits for the meter to read.  Those limits are calculated by simply applying the specified tolerances (both parts) to the reference value.  I've never seen any exception to this, but the difference for any reasonable meter is going to be vanishingly small in most instances.  Even with a 2% analog meter the difference in calculated test limits using either method would be less than the thickness of the paint on the needle.  You have to look at cases where meters might have very wide tolerances in certain functions.  Again, in the same manual you will see that the 400 mA AC current range >20 kHz is specified as 5% + 40 counts.  The performance test for this range is 300 mA at 30 kHz and the limits are 284.60 to 315.40 mA.  The limits calculated as an error from the reading would be different enough to notice.
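
Here's a rough sketch of those 400 mA numbers (the 0.01 mA count weight is my assumption, back-figured from the published limits):

Code:
spec_pct = 0.05         # 5% of reading
counts = 40 * 0.01      # 40 counts, assumed 0.01 mA each -> 0.40 mA
applied = 300.0         # mA from the reference

# Limits with the % taken on the reference value (the manual's method):
lo_ref = applied - (spec_pct * applied + counts)   # 284.60
hi_ref = applied + (spec_pct * applied + counts)   # 315.40

# Limits with the % taken on the DUT reading: solve |R - 300| <= 0.05*R + 0.40
lo_rdg = (applied - counts) / (1 + spec_pct)       # ~285.33
hi_rdg = (applied + counts) / (1 - spec_pct)       # ~316.21

print(f"From reference: {lo_ref:.2f} to {hi_ref:.2f} mA")
print(f"From reading:   {lo_rdg:.2f} to {hi_rdg:.2f} mA")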

On a Fluke 5100-series calibrator, there is an error function.  Instead of supplying a fixed reference and trying to calculate the error from the DUT reading, you can raise and lower the reference until your DUT reads the specified amount, then read the error% display on the calibrator.  So if you start with a 100V reference and then adjust the error control to 110V, the error% display reads "-10.0000%".  I'll leave you to decide what that means!
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline mendip_discovery

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: % of reading
« Reply #5 on: October 09, 2022, 04:36:09 pm »
Even my Transmille does the error thing for you: ppm to start with, then it moves on to %. I have found it handy for dial meters and scopes. Our Bradly 192 has % dials for when we're calibrating scopes.

I suspect the wording is written for the user and not for a calibration lab. So the spec is % of reading, as in: the displayed reading is within 1% of the real value. This is because the user of the meter will be measuring an unknown value. A calibration lab is using a calibrator that is, for this example, a "known[1]" voltage, so the spec needs to be applied to the voltage source.

If you are applying a PASS/FAIL decision, you are expected to take your uncertainty into account, so you may see a PASS* or FAIL* when the result is close enough to the limits (spec) that the uncertainty touches them and you end up not being able to say with 95% confidence that it is a pass (rough sketch of that kind of rule below).


[1] YMMV
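
A rough sketch of that kind of guard-banded decision rule (the names and the guard band equal to the full expanded uncertainty are illustrative, not any standard's actual rule):

Code:
def decide(error, limit, uncertainty):
    """Classify a result against a +/- limit using a 95% expanded uncertainty."""
    if abs(error) <= limit - uncertainty:
        return "PASS"    # inside the limit even allowing for uncertainty
    if abs(error) <= limit + uncertainty:
        # uncertainty straddles the limit: can't claim 95% confidence either way
        return "PASS*" if abs(error) <= limit else "FAIL*"
    return "FAIL"        # outside the limit even allowing for uncertainty

for err in (0.2, 0.95, 1.05, 1.5):
    print(err, decide(err, limit=1.0, uncertainty=0.1))
# -> PASS, PASS*, FAIL*, FAIL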
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline boggis the cat

  • Regular Contributor
  • *
  • Posts: 218
  • Country: nz
Re: % of reading
« Reply #6 on: October 09, 2022, 10:24:52 pm »
It depends on the convention.

For example, Fluke will typically state "% of reading + digits".  So if the DUT indicates 110 V with 100 V applied, the "% of reading" component should apply to the 110 V 'reading', not the 100 V 'nominal'.  (Note that this use of the DUT reading produces a slightly wider window for compliant results compared to using the nominal.)

You will find, however, that Fluke will give tolerances in their calibration verification documentation that apply the "% of reading" component to the 'nominal'.  In effect, this tightens the specification for compliance – the window for compliant results is narrower (quick sketch of both windows below).
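
Code:
spec = 0.01
nominal = 100.0   # known applied value

# Window when the % is taken on the nominal: |R - 100| <= 1.00
lo_nom, hi_nom = nominal * (1 - spec), nominal * (1 + spec)

# Window when the % is taken on the reading: |R - 100| <= 0.01 * R
lo_rdg, hi_rdg = nominal / (1 + spec), nominal / (1 - spec)

print(f"On nominal: {lo_nom:.4f} to {hi_nom:.4f} V (width {hi_nom - lo_nom:.4f})")
print(f"On reading: {lo_rdg:.4f} to {hi_rdg:.4f} V (width {hi_rdg - lo_rdg:.4f})")
# -> 99.0000..101.0000 vs 99.0099..101.0101: the reading-based window is slightly wider.

(A pure 1% term with no digits, purely illustrative.)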

We set our systems up to use the DUT 'reading' as the reference for any 'reading' component.  This is based upon our customers looking at the data and (correctly, because it is what is written) calculating the specification from the reported DUT indication.  Other labs (and Fluke's MET/CAL system, by default) will apply such components to the 'nominal' value.  The latter is more conservative ('safer') but does not align with the written specifications.

The accreditation organisations don't have a rule on how to approach this, so each lab has to decide.
 

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7849
  • Country: us
Re: % of reading
« Reply #7 on: October 09, 2022, 11:09:30 pm »
Quote
The latter is more conservative ('safer') but does not align with the written specifications.

The accreditation organisations don't have a rule on how to approach this, so each lab has to decide.

There's no reason that the manufacturer-specified calibration procedure has to somehow 'align' with the published performance specifications.  Obviously the calibration tolerances should be the same or tighter, but there's no inherent logical conflict just because they differ.  I suppose you could make the argument that you'd be set up for false failures, but I would argue that calibration practices, as they generally are, are prone to certifying instruments that may not meet their specifications under all (specified) conditions.
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline mendip_discovery

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: % of reading
« Reply #8 on: October 10, 2022, 05:20:22 am »
Quote
The accreditation organisations don't have a rule on how to approach this, so each lab has to decide.

It's not for the lab to decide but the customer. As long as it's made clear what you are doing, there isn't an issue. I don't agree with the way you are doing the maths, but I am not going to argue with you.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline RYcal

  • Contributor
  • Posts: 31
  • Country: nz
Re: % of reading
« Reply #9 on: October 10, 2022, 07:21:27 am »
Good question

Here's my take on it.

When, say, a meter comes in for "Performance Verification" (note I'm not using the word calibration here), it simply changes from being a "meter" to a device under test (DUT, UUT, etc.). And as a good calibration tech, I should lean toward the more conservative method to avoid potential issues.

"Performance Verification" is simply comparing one instrument against another, but to do this successfully one of these instruments has to be better than the other, and for that reason I will base my decision rules around my trust in that instrument. Not to mention that being accredited means my more accurate instruments have to be traceable to the SI.

Another thing to note is that Fluke has a user manual with specifications to be used at user level, i.e. if you were using your meter spec'd at 1% at 100 volts, you can be assured to a certain extent that the actual voltage will be between 99 and 101 volts. That assurance presumes, of course, a currently valid calibration and good handling.

Then there is a calibration manual, which is most likely a good place to start when building a robust calibration system for that device.

But as others have said, the difference is most likely irrelevant in terms of passing or failing; personally, though, I'd sway toward the less risky approach.

Each to their own though...


 

Offline boggis the cat

  • Regular Contributor
  • *
  • Posts: 218
  • Country: nz
Re: % of reading
« Reply #10 on: October 12, 2022, 11:55:13 pm »
Quote
It's not for the lab to decide but the customer.
Yes, but as I'm sure you know, very few customers have any real idea of what this means.

That leaves the decision on interpretation to the lab, in practice.  If you use the "% of reading" really means "% of known applied value" approach then you are not technically following what the documented specification states.  This is why Fluke have the option, built into MET/CAL, to choose which value to base the calculation on.
 

Offline RYcal

  • Contributor
  • Posts: 31
  • Country: nz
Re: % of reading
« Reply #11 on: October 13, 2022, 07:09:36 pm »
Quote
It's not for the lab to decide but the customer.
Yes, but as I'm sure you know, very few customers have any real idea of what this means.

That leaves the decision on interpretation to the lab, in practice.  If you use the "% of reading" really means "% of known applied value" approach then you are not technically following what the documented specification states.  This is why Fluke have the option, built into MET/CAL, to choose which value to base the calculation on.

And most MET/CAL users would have no idea how to set up MET/CAL to do what they want. Just grab the "Gold" procedure and run it hahaha.
 

Offline mendip_discovery

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: % of reading
« Reply #12 on: October 14, 2022, 06:40:21 pm »
It's Friday night...da da do...music is in my head.

So after pondering this, it finally got to me and I had to power up Excel to see how much of a difference it makes. I am just shocked, it's like night and day. I can't see why I never spotted this before.

Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14192
  • Country: de
Re: % of reading
« Reply #13 on: October 14, 2022, 07:15:54 pm »
Quote
There's no reason that the manufacturer-specified calibration procedure has to somehow 'align' with the published performance specifications.  Obviously the calibration tolerances should be the same or tighter, but there's no inherent logical conflict just because they differ.  I suppose you could make the argument that you'd be set up for false failures, but I would argue that calibration practices, as they generally are, are prone to certifying instruments that may not meet their specifications under all (specified) conditions.
This is a very good point - for the calibration there should be separate, in general (with very few exceptions - maybe for lower-grade weights) more stringent limits than the normal specs.
This is about more than just using the accuracy of the standards to define the gray zone.
The difference between % of reading and % of standard is often less than the accuracy of the standard, so nothing to really worry about. The tricky part is more how to estimate cal limits when you don't have an extra calibration manual.  Often one does not have to go into the details, as many instruments are still way better than their specs. Still, one should be prepared for how to handle cases when it gets close.
 

Offline boggis the cat

  • Regular Contributor
  • *
  • Posts: 218
  • Country: nz
Re: % of reading
« Reply #14 on: January 31, 2023, 12:08:52 am »
Quote
This is a very good point - for the calibration there should be separate, in general (with very few exceptions - maybe for lower-grade weights) more stringent limits than the normal specs.
This becomes an issue when you have a lot of instruments where no calibration procedure is available, or there are defects in procedures.  (Then there are defective specifications to deal with.)

Fluke tend to adjust whenever anything exceeds 70% of the specification (based on actual applied values), except where they don't because the TUR is poor.

While the ideal situation is to have all of the information available and easily comprehensible, this is very often not the case.  I have in the past queried specifications with manufacturers, and had a case where one series of instruments could not be clarified for some functions because the person who derived the specification was no longer there and the engineers couldn't figure out how they were derived.  Very large Japanese company, by the way (anti-Chinese comments invalid).
 

Offline mendip_discovery

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: % of reading
« Reply #15 on: January 31, 2023, 12:59:59 pm »
The way it has been put to me is that it's for the customer to define what limits they need it to achieve. Manufacturers' specifications are a guide but not the rule.

I kinda agree with it, but at the same time most of my customers don't care about working it out, because it's quite a lot of work that is just too much for most people to wrap their heads around. They just get a branded meter, that keeps their customer happy, and they can go on to the next issue. Though I did have to ask a customer to clarify their needs, as they gave me an insulation tester that happens to have a normal 1 kΩ range with a resolution of 0.1 Ω, and they wanted it tested at 0.5, 1 and 10 Ω and still be within the manufacturer's specifications.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

