Author Topic: Uncertainty Budgets and Decision Rules


Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #50 on: August 11, 2021, 10:08:51 am »
So with resolution, it's about the fact that at any time you may be half a digit away from the next digit. Let's say you have 10.001 V on the display: is it measuring 10.0006 or 10.0014? There is always half a digit of ambiguity in there. My Unc hedges on the safe side by using 1 mV, but I could improve it by saying we are really only looking at half a digit, and therefore 0.5 mV. You will often see on a cert a statement that, in addition to the Uncertainty, there is a further 1 or 2 lsd, which I'm sure is there to give a little extra unc cover to the UUT.
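As a rough sketch of how that half digit turns into a number in the budget, assuming the usual treatment of resolution as a rectangular (uniform) distribution (my illustration, not from the original post):

Code:
import math

def resolution_std_uncertainty(resolution):
    """Half of the least significant digit, treated as a rectangular
    (uniform) distribution: u = (resolution / 2) / sqrt(3)."""
    return (resolution / 2.0) / math.sqrt(3)

# Example: a display resolving 1 mV
print(resolution_std_uncertainty(0.001))  # ~0.00029 V, i.e. about 0.29 mV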

Sorry about the abbreviations; if there are any you don't know, I am happy to explain them. It's easy to get confused, as it depends on what industry you are used to.


Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #51 on: August 11, 2021, 10:27:24 am »
The limiting factor is the resolution and that does give you a bit of a hit with regards to your imported ppm.

I wouldn't call it a "limiting factor" as much as an "invalidating factor".

In your example you calculate the standard deviation using formulas which assume a bell-shaped noise process, but your lack of resolution relative to the noise in the signal gives you a step-noise process (i.e. "±1" noise).

When resolution is insufficient for the noise, the stddev thus determined is useless, because it depends on the magnitude of the measured artifact.

Assume a perfect digital voltmeter which measures volts with 100 mV resolution: it can show 4.8, 4.9, 5.0, 5.1, 5.2 and so on, and it rounds perfectly.

If you measure a 5.0000… V reference, it will constantly show "5.0"; it will do that all the way from 4.9500…1 to 5.04999…9 V of input.

Congratulations: You have a meter with zero stddev!

If instead you measure a 5.05000… reference, the meter will show "5.0" half the time and "5.1" the other half, and your stddev is now about 0.05 V.
...snip

I'm on the phone so I can't do neat quoting.

Remember, I am not looking at its error from nominal but at whether it repeats that error, and that could be caused by my 5 V source having a bit of wobble. That is why you need to repeat these tests every year or so, to see if you can improve on them with more stable kit. I could do it over more tests to give myself a lower ppm of error, or I could do a sweep and take a sum of all the errors, but see below about the risks of driving it low. There is another issue: it does weight the budget, but it's not the biggest thing on the budget. Go for the big numbers first and then you can chase the small ones.

I did it with 5 V as that is the only reference I have at home. Tbh, at work we use the top of the range, as if it's going to get any wobble it's going to be there, and this is testing our ability to repeat a reading. But if our repeatability came out as 0 ppm because we have such a low-resolution device, the adding of half a digit should still give us some cover for errors. There is lots you can add, but at the end of the day, if you quote an amazing uncertainty then you will be expected to back that up, and if it goes wrong it could be very damaging for the lab.
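A quick simulation sketch of the quantisation effect bsdphk describes in the quote above (my illustration, not part of the thread): an ideal 100 mV-resolution meter shows zero stddev on a 5.000 V input, but a stddev of about half a count on a 5.050 V input, even though the meter itself hasn't changed.

Code:
import statistics

def read_meter(true_volts):
    """Ideal 100 mV-resolution meter: perfect rounding to the nearest 0.1 V."""
    return round(true_volts * 10) / 10

# 100 noiseless readings of a 5.000 V reference: every reading is "5.0"
ref_a = [read_meter(5.000) for _ in range(100)]

# 100 readings of a 5.050 V reference with a tiny alternating dither,
# so half the readings round up and half round down
ref_b = [read_meter(5.050 + (1e-6 if i % 2 else -1e-6)) for i in range(100)]

print(statistics.pstdev(ref_a))  # 0.0   -- the "zero stddev" meter
print(statistics.pstdev(ref_b))  # ~0.05 -- readings flip between 5.0 and 5.1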

Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline FransW

  • Frequent Contributor
  • **
  • Posts: 270
  • Country: nl
Re: Uncertainty Budgets and Decision Rules
« Reply #52 on: August 11, 2021, 11:40:56 am »
"
I would have expected a different, lengthier and deeper explanation that allows one to understand the formula, instead of learning it by heart and applying it without understanding.
That is disappointing!
Kindergarten somehow.
"
I rest my case.

Regards, Frans
PE1CCN, Systems Engineering, HP, Philips, TEK, BRYMAN, Fluke, Keithley
 

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 7825
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #53 on: August 11, 2021, 03:21:45 pm »
In practice it is probably not much of a problem, because nobody really cares about the actual stddev of low resolution instruments

Yes, I think it would be pretty unusual for the assumed (worst-case) uncertainty based on resolution to be a large portion of the UUT's specified tolerances.  I have an old 4.5-digit DMM that is specified to +/- 1 count offset plus +/- 1 count at full range on its 1 V and 10 V ranges.  In one part of the calibration procedure they actually have you provide a small stimulus halfway between two counts and verify that it flickers back and forth between the two with approximately equal time (intensity).  Apparently there's just enough noise or other random variation for that to sort of work, and I can actually get it so that it meets the specification.  Even in this extreme case, having a lower STDDEV than an assumed 0.5 digit just barely moves the needle  :)  on the specified uncertainty.

Quote
I reacted to this because the example you used was almost a school-book example of the problem:  When your measurements have a ±1 nature, i.e. digital presentation, it should have no effect on the resulting uncertainty if you change any single measurement one step up or down, and your example failed that test.

As a general rule of thumb: If you only have one, two or three different sequential numbers in your data, stddev is not what you want.

Which example are you referring to?  Is it something I linked?  In any case, yes STDDEV on data like that has issues but is generally presented just to demonstrate the arithmetic involved. 
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline Echo88

  • Frequent Contributor
  • **
  • Posts: 826
  • Country: de
Re: Uncertainty Budgets and Decision Rules
« Reply #54 on: August 11, 2021, 04:35:05 pm »
Please elaborate on the role of modern physics in uncertainty calculations for test gear, FransW.
 

Offline FransW

  • Frequent Contributor
  • **
  • Posts: 270
  • Country: nl
Re: Uncertainty Budgets and Decision Rules
« Reply #55 on: August 12, 2021, 10:54:51 am »
This is probably not what you would expect.
It is however my world.

Physics:
There are no absolute measurements. All measurements relate to a traceable path of “modern” physical phenomena.
The basis of this can be verified and validated.
International & national standards are in the custody of e.g. the BIPM and NIST.
All standards are attempts to unify the measurement approaches. Nothing more.

Physics today (as we perceive it) is an open subject. We have to realise that we, mankind, are not all-knowing. There are too many subjects we do not have a complete understanding of.
We have to keep this in mind when we select a piece of measurement equipment, which should fulfill the realistic basic requirements of the person doing the measurements, who has a thorough understanding of exactly what and why he/she is trying to measure.

Unifying the physical forces has so far proven impossible, due to mankind's shortcomings.
We use our own limitations to explain our actions. Therefore everything needs a “beginning” and an “end”. Why? Again, our own limitations.

Wikipedia:
In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.

Uncertainty:
Uncertainty surrounding the values of the input variables compounds the difficulty of determining the accuracy of the output.

Verification and validation :
Verification and validation are independent procedures that are used together for checking that a product, service, or system meets requirements and specifications and that it fulfills its intended purpose. These are critical components of a quality management system such as ISO 9000. The words "verification" and "validation" are sometimes preceded with "independent", indicating that the verification and validation is to be performed by a disinterested third party.

Limitations of mathematics:
Nature is not mathematico-logically perfect.
Gödel proved that any (sufficiently complex) formal system cannot prove its own consistency, and that any such consistent system contains statements that can neither be proved nor disproved. That may be a limitation of formal systems, but only in a way that is a limitation on everything.
By formalising mathematics one can create one's own prison (all the historical attempts at unifying the physical forces).

Thinking and believing:
Einstein: “The word God is for me nothing more than the expression and product of human weaknesses, the Bible a collection of honorable, but still primitive legends which are nevertheless pretty childish.”

TEA:
A subject I thoroughly enjoy.

Frans
« Last Edit: August 12, 2021, 10:57:41 am by FransW »
PE1CCN, Systems Engineering, HP, Philips, TEK, BRYMAN, Fluke, Keithley
 

Offline mzzj

  • Super Contributor
  • ***
  • Posts: 1245
  • Country: fi
Re: Uncertainty Budgets and Decision Rules
« Reply #56 on: August 12, 2021, 02:25:10 pm »

This is my problem with calibration services. Calibrating once a year with the 1-year specs as the criterion, especially if no hard numbers are provided on the measured margin to those specs, leaves the UUT possibly outside the 1-year specs for the whole year. Well, the next calibration, the year after, should catch that the UUT is no longer in the 1-year specs and thus indicate that the measurements during the past year were less accurate than assumed. This makes calibration mostly useful for verifying past measurements, not guaranteeing performance in the coming year. In contrast, naively reading the datasheet would have you believe that a 1-year calibration schedule awards you the 1-year accuracy specs. Anyone care to comment on whether this weakness is well-known?

I'm not sure if this is what you were after, but this is quite well covered in the 17025 standard and by 17025-accredited labs.

17025-accredited labs are not allowed to (or at least should not automatically) mark the next calibration date or a recommended calibration interval on calibration stickers or in certificates, as this is something the customer should define themselves depending on the case.
(The customer should base the decision on the risk/reward ratio of drifting outside specifications and on the specific use, i.e. how heavy-handedly the instruments are used.)

Depending on the case, you can make your calibration period shorter or longer, or "over-specify" your equipment so that it stays within your requirements for longer periods.

And after all, it's still probabilities: e.g. your meter stays within its 1-year specifications with 99.8% probability, the calibration results are correct with 95% probability, and so on.
You won't get anything absolute, but you can squeeze the probabilities to a level that is acceptable for your operations.
Sometimes that might mean ten 3458A DVMs in parallel, calibrated every 3 months. Or only one, calibrated once in 5 years.
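As a rough back-of-the-envelope sketch (hypothetical figures in the spirit of the ones above, assuming the two events are independent), the probabilities simply multiply:

Code:
# Hypothetical figures, as in the post above
p_in_tolerance_over_year = 0.998   # meter stays within its 1-year spec
p_calibration_correct    = 0.95    # the calibration result itself was good

# Probability both hold at once (assuming independence)
p_overall = p_in_tolerance_over_year * p_calibration_correct
print(f"{p_overall:.3f}")  # ~0.948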

 
 
The following users thanked this post: Anders Petersson, mendip_discovery

Offline FransW

  • Frequent Contributor
  • **
  • Posts: 270
  • Country: nl
Re: Uncertainty Budgets and Decision Rules
« Reply #57 on: August 12, 2021, 03:55:13 pm »
So what is the probability of finding the true value?
PE1CCN, Systems Engineering, HP, Philips, TEK, BRYMAN, Fluke, Keithley
 

Offline bsdphk

  • Regular Contributor
  • *
  • Posts: 198
  • Country: dk
Re: Uncertainty Budgets and Decision Rules
« Reply #58 on: August 12, 2021, 08:59:16 pm »
So what is the probability of finding the true value?

Practically nil, because the true value depends on a lot of noise processes which we generally do not control, even in calibration.

For instance, all grid-powered instruments are sensitive to the grid at some level; it may be the voltage, the frequency, the harmonic distortion, random noise, shot noise, EMC etc. etc. etc.

In a well designed instrument, these effects are at least an order of magnitude below all the effects you calibrate for, but that does not mean that they stop existing, merely that they do not matter.

But because they are still there, the least significant digit dithers, telling us that there is no "true value".
 

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 7825
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #59 on: August 12, 2021, 10:02:36 pm »
So what is the probability of finding the true value?

Are you positing that there is a true value?  8)
 
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline FransW

  • Frequent Contributor
  • **
  • Posts: 270
  • Country: nl
Re: Uncertainty Budgets and Decision Rules
« Reply #60 on: August 13, 2021, 08:44:03 am »
Yes, I do.
The significance might escape us today (& tomorrow) but behind it all there is an explanation.
As an example I often use the Planck time which can be calculated as defined.
As a consequence time-nuttery seems misplaced and irrelevant as the definition of 1 Volt.
The agreement for usage is clear. We need to talk about the same thing if we want to understand each other.
It is however quite a task to extend the current limits.
The tesseract in Interstellar is another example of stretching the limits, though certainly a challenging one (Kip Thorne, theoretical physicist).
PE1CCN, Systems Engineering, HP, Philips, TEK, BRYMAN, Fluke, Keithley
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #61 on: August 16, 2021, 08:46:30 pm »
mendip_discovery thank you for this thread. I read it with great interest.

Can you give an example of a complete calculation, for example calibrating the 10 V output of a Fluke 732A (1) using a 3458A and another Fluke 732A (2) for which there is data from a top-level calibration laboratory?

That is, say we have taken 100 measurements that show the difference between the two standards. We know the parameters of the laboratory's Fluke 732A, the temperature in the laboratory, and the 3458A specification data.

Can you work through an example of how we get the data for the calibrated Fluke 732A (1)?


So attached is a crude budget for a 10.0000000 V source. I am not sure Load Regulation is a factor to include, but I am assuming it covers the worst-case scenario when you hook up a load, a bit like when you load a DC power supply. The Imported Unc is the bit that carries the most weight in the budget. Without data to prove that the 10 V source is going to remain in spec, I can only assume it will. But if it drifts, you could keep a similar Unc and just adjust the ppm figures to the 10.0000472 V (rather than 10 V) that you will calibrate to; it's what we do with standard resistors.
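A minimal sketch of the arithmetic behind a budget like that, with hypothetical line items and values rather than the ones in the attachment: each contribution is converted to a standard uncertainty by its divisor, combined by root-sum-of-squares, and expanded at k = 2.

Code:
import math

# (half-width or expanded value in ppm of 10 V, divisor) -- illustrative numbers only
budget = {
    "imported unc from cert (k=2)": (53.69, 2.0),           # 536.9 uV on 10 V
    "drift since last cal":         (3.0,   math.sqrt(3)),  # rectangular limits
    "noise":                        (0.5,   math.sqrt(3)),  # rectangular limits
    "readout resolution":           (0.05,  math.sqrt(3)),  # rectangular limits
}

u_std = [value / divisor for value, divisor in budget.values()]
u_combined = math.sqrt(sum(u * u for u in u_std))  # root-sum-of-squares
U_expanded = 2.0 * u_combined                      # expand at k = 2 (~95%)

print(f"combined {u_combined:.2f} ppm, expanded (k=2) {U_expanded:.2f} ppm")
print(f"expanded at 10 V: {U_expanded * 10:.0f} uV")  # 1 ppm of 10 V = 10 uV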




« Last Edit: August 16, 2021, 08:49:29 pm by mendip_discovery »
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 
The following users thanked this post: bck, MegaVolt

Offline MegaVolt

  • Frequent Contributor
  • **
  • Posts: 917
  • Country: by
Re: Uncertainty Budgets and Decision Rules
« Reply #62 on: August 17, 2021, 08:08:41 am »
Many thanks for the expanded answer. If you do not mind, I will ask more :)

The Imported Unc is the bit that carries the most weight in the budget.
1. Does the number 536.9 μV describe the measuring laboratory? That is, does (lab standard source + null meter) have a 536.9 μV uncertainty?

Quote
2. But if it drifts, you could keep a similar Unc and just adjust the ppm figures to the 10.0000472 V (rather than 10 V) that you will calibrate to; it's what we do with standard resistors.
2. Where did the number 10.0000472 V come from? I do not see this number in the calculations.

3. Why do we consider noise and drift as having a rectangular distribution?
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #63 on: August 17, 2021, 11:05:18 am »
Many thanks for the expanded answer. If you do not mind, I will ask more :)

The Imported Unc is the bit that carries the most weight in the budget.
1. Does the number 536.9 μV describe the measuring laboratory? That is, does (lab standard source + null meter) have a 536.9 μV uncertainty?

That is from the lab that tested the source. It is their Uncertainty that is very large, and therefore it makes a big impact on the readings. It makes their measurement a little bit of a joke, as they can't say whether it is in or out of spec.

Quote
2. But if it drifts, you could keep a similar Unc and just adjust the ppm figures to the 10.0000472 V (rather than 10 V) that you will calibrate to; it's what we do with standard resistors.
2. Where did the number 10.0000472 V come from? I do not see this number in the calculations.

That is the measured value from the certificate. If you have it adjusted to be within the specification and it keeps within the spec, then it's easier to use 10 V and apply the Uncertainty above. But let us say it drifts to 10.0000472 V and it can't be adjusted back to 10.0 V: you can still use it, at 10.0000472 V, but you would need to recalculate the ppm values against 10.0000472 V. It has 50 μV of adjustment range, but it might over time drift so far that you can no longer adjust it.

3. Why do we consider noise and drift as having a rectangular distribution?

It is a fairly safe assumption that most things are rectangular ("square") unless known otherwise; M3003 does give some examples. Off the top of my head, it's because it is a limit of + and - about the centre point, and with noise there is an equal chance of it being +1 ppm or -1 ppm anywhere in that band. The same goes for drift: it could drift up as well as down, from day to day or year to year. Noise could arguably be triangular, as it is most likely to be near the right reading and only occasionally not, but it's hardly a number to worry about.

I would be chasing another lab to get the Imported Uncertainty down.
If you can prove your line voltage swings by less than 10% then you could even drill down that ppm; if it's 1% then you can take the 0.5 ppm and take a stab at saying it's 0.05 ppm, or if it's battery-operated then maybe run it off the battery rather than the mains and remove the issue altogether.
If you have an 8.5-digit meter with a really good Uncertainty then you could use that and just compare the reading from your source to the Unit Under Test, and that would remove the 6 ppm load regulation term straight off the bat. The 10 V source then just becomes a transfer standard and in theory need not be calibrated. But I still would calibrate it, just so that if you ever use it without another meter, you can.
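A minimal sketch of the usual divisors, along the lines M3003 describes (my illustration, not part of the post): rectangular limits divide by √3, triangular by √6, and a normal expanded uncertainty by its quoted k.

Code:
import math

DIVISORS = {
    "rectangular": math.sqrt(3),  # equal chance anywhere between the +/- limits
    "triangular":  math.sqrt(6),  # values near the centre more likely
    "normal_k2":   2.0,           # expanded uncertainty quoted at k = 2
}

def standard_uncertainty(half_width, distribution):
    """Convert a +/- half-width (or expanded uncertainty) to a standard uncertainty."""
    return half_width / DIVISORS[distribution]

print(standard_uncertainty(1.0, "rectangular"))  # 1 ppm limits -> ~0.577 ppm
print(standard_uncertainty(1.0, "triangular"))   # 1 ppm limits -> ~0.408 ppm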

Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 
The following users thanked this post: MegaVolt

Offline MegaVolt

  • Frequent Contributor
  • **
  • Posts: 917
  • Country: by
Re: Uncertainty Budgets and Decision Rules
« Reply #64 on: August 17, 2021, 11:46:20 am »
That is the measured value from the certificate. If you have it adjusted to be within the specification and it keeps within the spec, then it's easier to use 10 V and apply the Uncertainty above. But let us say it drifts to 10.0000472 V and it can't be adjusted back to 10.0 V: you can still use it, at 10.0000472 V, but you would need to recalculate the ppm values against 10.0000472 V. It has 50 μV of adjustment range, but it might over time drift so far that you can no longer adjust it.
I'm a little confused. We have a standard owned by the laboratory; let's call it lab_standard. And there is a user standard; let's call it Custom_standard.
Lab_standard = 10.0000472 V with uncertainty 536.9 μV (k = 2). This is data from the calibration certificate.

If I use this standard for a year, I must use the uncertainty calculated in the table, equal to 545.86 μV (k = 2)?

And all the calculations made so far relate only to lab_standard? Have I understood correctly?
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #65 on: August 17, 2021, 12:23:42 pm »
That is the measured value from the certificate. If you have it adjusted to be within the specification and it keeps within the spec, then it's easier to use 10 V and apply the Uncertainty above. But let us say it drifts to 10.0000472 V and it can't be adjusted back to 10.0 V: you can still use it, at 10.0000472 V, but you would need to recalculate the ppm values against 10.0000472 V. It has 50 μV of adjustment range, but it might over time drift so far that you can no longer adjust it.
I'm a little confused. We have a standard owned by the laboratory; let's call it lab_standard. And there is a user standard; let's call it Custom_standard.
Lab_standard = 10.0000472 V with uncertainty 536.9 μV (k = 2). This is data from the calibration certificate.

If I use this standard for a year, I must use the uncertainty calculated in the table, equal to 545.86 μV (k = 2)?

And all the calculations made so far relate only to lab_standard? Have I understood correctly?

Yes, this Uncertainty relates to that lab standard. You can then use it to calibrate measurement devices. So you would say the customer's multimeter measured this 10 V source with an Uncertainty of 545.86 μV (k = 2). You could also use it to calibrate a customer source by nulling/zeroing the lab's multimeter against the lab standard and then measuring the customer's unit, but that wouldn't be my preferred option; it can be done, but you would possibly have to add some extra in for the lab multimeter.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 
The following users thanked this post: MegaVolt

Offline MegaVolt

  • Frequent Contributor
  • **
  • Posts: 917
  • Country: by
Re: Uncertainty Budgets and Decision Rules
« Reply #66 on: August 17, 2021, 12:27:46 pm »
Thanks for the answers. All clear. :)

You could also use it to calibrate a customer source by nulling/zeroing the lab's multimeter against the lab standard and then measuring the customer's unit, but that wouldn't be my preferred option; it can be done, but you would possibly have to add some extra in for the lab multimeter.
Which calibration method do you choose?
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #67 on: August 17, 2021, 12:44:16 pm »
Thanks for the answers. All clear. :)

You could also use it to calibrate a customer source by nulling/zeroing the lab's multimeter against the lab standard and then measuring the customer's unit, but that wouldn't be my preferred option; it can be done, but you would possibly have to add some extra in for the lab multimeter.
Which calibration method do you choose?

At work I have a multiproduct calibrator as a source, which I use to calibrate customer meters. I also have a 6.5-digit multimeter for the measurement of sources. On the 17025 schedule, the best capability is quoted even if you have other kit available.

It gets fun, as I have calibrated a few shunts and the budget for that is in the file, but I know I need to give it a damned good stare as I have a feeling it's not quite right; the only shunts I calibrate are our own, so I am not too worried yet. For the budget I use the 10 A out of the calibrator with the multimeter measuring the voltage drop, but I need to rethink it, as I also need to take into account the calibrator getting warm and the 10 A dropping over the calibration period; to do that I would need to use another 6.5-digit meter and a suitable shunt. Then there is an issue around 10 A being used when you are not testing the shunt at a suitable current; i.e. 10 A to test a 4000 A shunt is a bit of a joke, as you are not warming it up, etc.
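A rough sketch of the arithmetic for that kind of shunt measurement, with made-up values rather than the actual budget: R = V/I, so the relative uncertainties of the voltage reading and the sourced current combine in quadrature.

Code:
from math import sqrt

def shunt_resistance(v_drop, i_source, u_v_rel, u_i_rel):
    """R = V / I, with relative standard uncertainties combined in quadrature
    (first-order propagation for a quotient)."""
    r = v_drop / i_source
    u_r_rel = sqrt(u_v_rel**2 + u_i_rel**2)
    return r, u_r_rel

# Hypothetical: 10 A through a nominal 10 mOhm shunt, reading ~100 mV
r, u_rel = shunt_resistance(v_drop=0.100, i_source=10.0,
                            u_v_rel=30e-6, u_i_rel=80e-6)
print(f"R = {r*1e3:.3f} mOhm, u(R) = {u_rel*1e6:.0f} ppm")  # ~85 ppm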
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 
The following users thanked this post: MegaVolt

Offline mzzj

  • Super Contributor
  • ***
  • Posts: 1245
  • Country: fi
Re: Uncertainty Budgets and Decision Rules
« Reply #68 on: August 17, 2021, 04:42:18 pm »

The new 17025 documents are putting more pressure on labs to have a decision rule, and that is a good thing, as there are lots of labs stating stuff has passed when, given the barn door of an uncertainty they have, it could well be undetermined. It is quite funny looking at mechanical calibration certs where the Unc is wider than the spec, but even funnier is that labs had an Unc for length but never had one for the flatness or parallelism of the gauge; now they have to, and it is causing lots of kerfuffle. The alternative is that labs just don't say it conforms to a spec, but I get a feeling that won't slide for much longer, as customers assume compliance of a gauge that has been calibrated.

Around here it seems quite common to skip the specifications and pass/fail statements altogether and only provide the measurement results (and uncertainty) for the customer.

It is certainly that way in our lab: out of the few thousand temperature calibration certificates that I have signed, maybe five (!) had any sort of comparison to specifications.

For anyone interested the ILAC G8 is the current "bible" on decision rules:
https://ilac.org/latest_ilac_news/revised-ilac-g8-published/

Requirements for decision rules vary quite a lot by field: DMM calibration with an uncertainty quoted at 95% confidence is good enough for most uses, but you might want to consider a much larger guard band if you work in a drug lab or calibrate police radar guns.
« Last Edit: August 17, 2021, 04:51:47 pm by mzzj »
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #69 on: August 17, 2021, 05:23:34 pm »
Around here it seems quite common to skip the specifications and pass/fail statements altogether and only provide the measurement results (and uncertainty) for the customer.

It is certainly that way in our lab: out of the few thousand temperature calibration certificates that I have signed, maybe five (!) had any sort of comparison to specifications.

For anyone interested the ILAC G8 is the current "bible" on decision rules:
https://ilac.org/latest_ilac_news/revised-ilac-g8-published/

Requirements for decision rules vary quite a lot by field: DMM calibration with an uncertainty quoted at 95% confidence is good enough for most uses, but you might want to consider a much larger guard band if you work in a drug lab or calibrate police radar guns.

Yes, it's pretty much the same here, but I just get a feeling it is not the best option and that in the future they will push labs to have it in place. I have a simple one set up for the mechanical side: if the measurement plus or minus the uncertainty is outside the spec then fail, if it's inside then pass, and if the result is in or out but the uncertainty interval straddles the limit then class it as undetermined. I need to get my head around the maths and maybe work out a probability of it being a pass or fail, and report that (see the sketch below). I am currently going through getting mechanical accreditation, and let's just say there are parts of it that I am really not enjoying.
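A minimal sketch of that kind of probability-of-conformance calculation, broadly in the spirit of ILAC G8 rather than any lab's actual procedure: treat the measured error as normally distributed with standard uncertainty U/k and ask how much of that distribution lies inside the tolerance.

Code:
from math import erf, sqrt

def prob_of_conformance(error, U, lower, upper, k=2.0):
    """Probability the true error lies within [lower, upper], assuming a normal
    distribution centred on the measured error with standard uncertainty u = U / k."""
    u = U / k
    cdf = lambda x: 0.5 * (1.0 + erf((x - error) / (u * sqrt(2))))
    return cdf(upper) - cdf(lower)

# Example: measured error +8 ppm, expanded uncertainty 6 ppm (k=2),
# tolerance +/-10 ppm -> nominally a "pass", but not a certain one
print(f"{prob_of_conformance(8.0, 6.0, -10.0, 10.0):.2%}")  # ~74.7%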
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 
The following users thanked this post: bck

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 7825
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #70 on: August 17, 2021, 07:50:05 pm »
So attached is a crude budget for a 10.0000000 V source. I am not sure Load Regulation is a factor to include, but I am assuming it covers the worst-case scenario when you hook up a load, a bit like when you load a DC power supply. The Imported Unc is the bit that carries the most weight in the budget. Without data to prove that the 10 V source is going to remain in spec, I can only assume it will. But if it drifts, you could keep a similar Unc and just adjust the ppm figures to the 10.0000472 V (rather than 10 V) that you will calibrate to; it's what we do with standard resistors.

Is this based on the certificate that I had posted or on your own lab?

One of the things that would concern me is the ad-hoc assembly of your uncertainty budget without an explicit, sanctioned guide to doing so.  As a result I think you've made some errors, please comment.  In any case, unless all labs follow similar procedures, they won't get consistent results--and if calibration labs don't give consistent results, then what's the point??

1.  There should be no need to analyze the various details of the stated 6ppm uncertainty from Fluke if that is what you are going with.  That is an all-inclusive k=2.58 (99% CI) number (not rectangular distribution) and includes such things as 10C worth of tempco, line regulation and so on.  You shouldn't need to add anything else in there that is not related to your lab or ability to do repeatable measurements.

2.  If you really want to give an uncertainty at cal temp and use only the drift spec, that spec is 3.0ppm/year, k=2.58 (AFAIK--CI isn't stated explicitly in the document I have but is for later and other models)

3.  Line regulation is already 0.05ppm.

4.  Load regulation is a predictable characteristic, not an uncertainty.  That has nothing to do with calibration, which for a voltage standard should always be done at high impedance.

Now those are nits that have to do with your process, which is what is being discussed here.  However, the real issue for me is that neither your lab nor the one in the certificate I posted has any business letting a voltage standard like this in the door--unless you are buying it for your own use.  You can be 'honest' and list your actual UNC numbers all you want, but what is happening is that somebody is getting a calibration certificate that allows them to say 'calibration certificate' but is only slightly more useful than a 9 volt battery.  However, when the customer sees "10.0000472 v with UNCERTAINTY 536.9 μV (k = 2)" read straight off an old HP3458A, they likely only care about the first number and consider the rest to be boilerplate CYA stuff that they can ignore.  I would like to see a policy--or even regulation or law--that requires the sticker, not just the calibration certificate, to note any instrument that is calibrated to less than the manufacturer's specifications.

I'm guessing you mostly agree with all that, so I hope your lab isn't among the ones doing this sort of nonsense.  So what would it take to actually calibrate a 732A for real?  And who can do that?



A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 
The following users thanked this post: MegaVolt

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #71 on: August 18, 2021, 07:58:56 am »
Quote from: bdunham7
Is this based on the certificate that I had posted or on your own lab?

The cert you posted along with the spec sheet.

Quote from: bdunham7
One of the things that would concern me is the ad-hoc assembly of your uncertainty budget without an explicit, sanctioned guide to doing so.  As a result, I think you've made some errors, please comment.  In any case, unless all labs follow similar procedures, they won't get consistent results--and if calibration labs don't give consistent results, then what's the point??

There are no hard and fast rules on what you put in or leave out of the budget; it is audited, and the auditors will pull you up if they believe you have made an error. The main thing they are concerned with is that you are erring on the side of caution rather than overestimating your ability. The aim is that labs get consistent results within their claimed Uncertainty: if our interlab En ratio is >1 there is a serious issue with the budget, because you have claimed an Unc lower than you are actually achieving; if it is <0.2 then there is an argument that the budget needs looking at, because your Unc is higher than it needs to be.
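A minimal sketch of the En number used for interlaboratory comparisons (the standard formula, illustrated with hypothetical values): the difference between the lab's result and the reference value, divided by the root-sum-of-squares of the two expanded uncertainties.

Code:
from math import sqrt

def en_number(x_lab, U_lab, x_ref, U_ref):
    """En = (x_lab - x_ref) / sqrt(U_lab^2 + U_ref^2),
    with U the expanded (k=2) uncertainties. |En| <= 1 is satisfactory."""
    return (x_lab - x_ref) / sqrt(U_lab**2 + U_ref**2)

# Hypothetical interlab comparison on a 10 V point (errors and uncertainties in uV)
print(en_number(x_lab=25.0, U_lab=60.0, x_ref=5.0, U_ref=12.0))  # ~0.33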


Quote from: bdunham7
1.  There should be no need to analyze the various details of the stated 6ppm uncertainty from Fluke if that is what you are going with.  That is an all-inclusive k=2.58 (99% CI) number (not rectangular distribution) and includes such things as 10C worth of tempco, line regulation and so on.  You shouldn't need to add anything else in there that is not related to your lab or ability to do repeatable measurements.

Not sure where you get the k=2.58 from, as I pulled up a spec sheet for the Fluke and there is no mention of the k figure. I will admit I hadn't looked up the full spec sheet, so I missed some of the nuances; I did wonder about the gap between 18 and 28 °C but didn't get it in my head at the time. In that case it would be treated as Normal, with a dividing factor of 2.58.


Quote from: bdunham7
2.  If you really want to give an uncertainty at cal temp and use only the drift spec, that spec is 3.0ppm/year, k=2.58 (AFAIK--CI isn't stated explicitly in the document I have but is for later and other models)

I missed that its normal operating temperature range covers the expected lab conditions; I would only look at it more if it was outside of that.

Quote from: bdunham7
3.  Line regulation is already 0.05ppm.

Yup, another mistake by me, I did say it was a crude one.

Quote from: bdunham7
4.  Load regulation is a predictable characteristic, not an uncertainty.  That has nothing to do with calibration, which for a voltage standard should always be done at high impedance.

I was just looking at sources of potential error. If, for example, the meter's 10 MΩ or 1 GΩ input impedance drops or raises the voltage, then that will add to the uncertainty.

Quote from: bdunham7
Now those are nits that have to do with your process, which is what is being discussed here.  However, the real issue for me is that neither your lab nor the one in the certificate I posted has any business letting a voltage standard like this in the door--unless you are buying it for your own use.

You can be 'honest' and list your actual UNC numbers all you want, but what is happening is that somebody is getting a calibration certificate that allows them to say 'calibration certificate' but is only slightly more useful than a 9 volt battery.  However, when the customer sees "10.0000472 v with UNCERTAINTY 536.9 μV (k = 2)" read straight off an old HP3458A, they likely only care about the first number and consider the rest to be boilerplate CYA stuff that they can ignore.  I would like to see a policy--or even regulation or law--that requires the sticker, not just the calibration certificate, to note any instrument that is calibrated to less than the manufacturer's specifications.

It's the customer's responsibility to check the lab has a suitable Unc and the capability to fit their needs. If someone sent that unit to me I would do it; I would quote my Unc and that would be that. It's a bit like saying I bought a dollar-store multimeter and it is not as good as an HP 8.5-digit meter: who is at fault, the store for selling a cheap meter to the customer, or the customer for not doing research on what they need? Manufacturers don't always explain how they managed to get the results they get; they tend to keep some of that data to themselves. Some items come with good supporting documentation, but that isn't the norm. Often it's for the customer to decide what they want: they might only want to know its value so they can use it for correction, and some want to know it's within 2% even if the spec is 0.5%. Just because you need the manufacturer's spec, don't assume it's the same for others. With regard to an Uncertainty that is larger than the spec, I feel that is going to keep happening, and it is where the emphasis on decision rules is coming in, because labs won't be able to say pass or fail if the customer asks them and their uncertainty is larger than the spec; they would lose their accreditation.

My lab would quote an Unc of 60 ppm + 1 mV, as I only have a 6.5-digit meter at the moment.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline binary01

  • Contributor
  • Posts: 37
  • Country: au
Re: Uncertainty Budgets and Decision Rules
« Reply #72 on: August 18, 2021, 11:36:52 am »
The new 17025 documents are putting more pressure on labs to have a decision rule, and that is a good thing, as there are lots of labs stating stuff has passed when, given the barn door of an uncertainty they have, it could well be undetermined.

Personally, I am in favour of this push for labs to consider compliance for most customers.  In the pressure calibration world (at least in Australia) we have been required to assess instruments for compliance to a specification for many years, where the specification is provided by the customer (based on their usage, risk, etc) or a default specification is adopted based on the manufacturer's 12 month accuracy.
In my region, most instruments are sent for routine calibration to see if the instrument still performs within the original specification, and if not, to ask that it be adjusted to do so - in this case, assessment of the instrument for compliance to a specification is the most important aspect for the general industrial customer.  It can be disappointing for us metrologists, who are usually focussed on minimising uncertainty, but this is a reality for many customers who may not understand uncertainty.  Pass/fail is easily understood and metrologists are well placed to assess this.  For digital pressure instruments, the arithmetic addition of uncertainty and error/correction at each point must be equal to or less than the specification.  We have carried over the same compliance rule in our electrical lab.
Unlike pressure labs, electrical labs in Australia have often historically not considered any compliance/decision rule and simply reported a measurement and uncertainty, which in some cases was disturbingly large and inadequate for the test instrument. Poor labs were able to do this with little recourse, relying on customer ignorance.  With the emphasis on decision/compliance rules in 17025, these labs are now obliged to assess whether their uncertainty is adequate during contract negotiation.  This is an excellent outcome.  Of course, there are still labs that ignore this with the "barn door uncertainty" and give a 100 ppm uncertainty to assess a 50 ppm accuracy specification, and say that it "passes", but I hope these are picked up by our accreditation body over time.
Although it is ultimately the user's responsibility to contract a laboratory with a capability that suits their needs, it can be very hard to navigate or understand a scope of accreditation, and I think we also have a duty of care to make sure our capability is likely to be suitable.
Of course, metrologists seeking calibration of their own reference equipment understand uncertainty, errors, corrections, drift, etc. very well, and can easily ask their reference lab to assess the instrument to the lowest possible uncertainties, without adjustment or consideration of compliance to an accuracy specification, and this is permitted based on a specific customer request within 17025.
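A minimal sketch of the arithmetic-addition compliance rule described above, with hypothetical numbers:

Code:
def complies(error, uncertainty, spec_limit):
    """Simple arithmetic-addition rule: the absolute error plus the
    expanded uncertainty must not exceed the specification limit."""
    return abs(error) + uncertainty <= spec_limit

# Hypothetical pressure point: error 0.12 kPa, U = 0.05 kPa, spec +/-0.20 kPa
print(complies(0.12, 0.05, 0.20))  # True  (0.17 <= 0.20)
print(complies(0.18, 0.05, 0.20))  # False (0.23 >  0.20)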
« Last Edit: August 18, 2021, 11:39:23 am by binary01 »
 
The following users thanked this post: mendip_discovery

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #73 on: August 18, 2021, 12:47:38 pm »
Personally, I am in favour of this push for labs to consider compliance for most customers.  In the pressure calibration world (at least in Australia) we have been required to assess instruments for compliance to a specification for many years, where the specification is provided by the customer (based on their usage, risk, etc) or a default specification is adopted based on the manufacturer's 12 month accuracy.
In my region, most instruments are sent for routine calibration to see if the instrument still performs within the original specification, and if not, to ask that it be adjusted to do so - in this case, assessment of the instrument for compliance to a specification is the most important aspect for the general industrial customer.  It can be disappointing for us metrologists, who are usually focussed on minimising uncertainty, but this is a reality for many customers who may not understand uncertainty.  Pass/fail is easily understood and metrologists are well placed to assess this.  For digital pressure instruments, the arithmetic addition of uncertainty and error/correction at each point must be equal to or less than the specification.  We have carried over the same compliance rule in our electrical lab.
Unlike pressure labs, electrical labs in Australia have often historically not considered any compliance/decision rule and simply reported a measurement and uncertainty, which in some cases was disturbingly large and inadequate for the test instrument. Poor labs were able to do this with little recourse, relying on customer ignorance.  With the emphasis on decision/compliance rules in 17025, these labs are now obliged to assess whether their uncertainty is adequate during contract negotiation.  This is an excellent outcome.  Of course, there are still labs that ignore this with the "barn door uncertainty" and give a 100 ppm uncertainty to assess a 50 ppm accuracy specification, and say that it "passes", but I hope these are picked up by our accreditation body over time.
Although it is ultimately the user's responsibility to contract a laboratory with a capability that suits their needs, it can be very hard to navigate or understand a scope of accreditation, and I think we also have a duty of care to make sure our capability is likely to be suitable.
Of course, metrologists seeking calibration of their own reference equipment understand uncertainty, errors, corrections, drift, etc. very well, and can easily ask their reference lab to assess the instrument to the lowest possible uncertainties, without adjustment or consideration of compliance to an accuracy specification, and this is permitted based on a specific customer request within 17025.

I agree. Electrical has just got away with giving a result, and the mechanical side has just been saying pass on elements they don't even have an Unc for, like the flatness of the anvils on a vernier caliper. I know this because I am just going through the mech stuff and having to jump through the new hoops. Mechanical has also been saying Pass even when the Unc is greater than the specification. It doesn't help that the specifications were written before Uncertainty was even really thought about.

I remember doing pressure gauges to UKAS, having to take into account height above sea level, atmospheric pressure, local gravity, and also how high the gauge sat above the deadweight tester. It was such a nightmare for a gauge you knew was just going to be used to tell them that there is pressure there. The worst pressure I had to do was 4000 bar; that was a day for wearing brown trousers.
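A simplified sketch of the kind of corrections involved (illustrative textbook formulas with made-up values, not the UKAS procedure; air buoyancy, piston distortion, etc. are ignored): the generated pressure scales with local gravity, and the fluid head between the tester's reference level and the gauge shifts the pressure the gauge actually sees.

Code:
G_STANDARD = 9.80665  # m/s^2, conventional standard gravity

def gravity_corrected_pressure(nominal_pa, g_local):
    """Scale a nominal (standard-gravity) deadweight pressure to local gravity."""
    return nominal_pa * g_local / G_STANDARD

def head_correction(rho_fluid, g_local, height_m):
    """Pressure offset when the gauge sits `height_m` above the tester's reference level."""
    return -rho_fluid * g_local * height_m  # a gauge above the reference reads lower

# Hypothetical: 100 kPa nominal, local g = 9.8116 m/s^2, gauge 0.25 m up, oil at 860 kg/m^3
p = gravity_corrected_pressure(100e3, 9.8116) + head_correction(860, 9.8116, 0.25)
print(f"{p:.0f} Pa")  # ~97941 Pa, i.e. about 2 kPa below the naive 100 kPa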

I see no reason why customers and hobbyists shouldn't also look at these things. It's clear that many here have the mentality that because the meter is calibrated it's spot on and everything else is wrong, and even if they accept some error, it looks like they quote the spec and don't take into account the imported errors from the kit used to calibrate it.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline bsdphk

  • Regular Contributor
  • *
  • Posts: 198
  • Country: dk
Re: Uncertainty Budgets and Decision Rules
« Reply #74 on: August 18, 2021, 01:54:20 pm »
[...] having to take into account Height Above Sea Level, Atmospheric Pressure, Gravity [...]

The worst calibration I have ever heard about is the BIPM's Watt Balance.

They told me they would have to calibrate out the gravity gradient, that is, not the direction of the force of gravity, but the curvature of the force of gravity, because BIPM is literally located on the side of a mountain.

How ?  No idea, but they knew they would have to.


 

