Author Topic: How long should you monitor a calibrator output?  (Read 1881 times)

Offline Tony_GTopic starter

  • Frequent Contributor
  • **
  • Posts: 899
  • Country: us
  • Check out my old test gear channel (link in sig)
    • TGSoapbox
How long should you monitor a calibrator output?
« on: February 13, 2017, 05:04:53 am »
Hi All,

I've been playing around with IanJ's Excel macro and I'm wondering whether there is an industry standard for acceptance testing of a calibrator. My 5440B manual says the acceptance criterion for 1 V is 0.999960 to 1.000040, but it doesn't say over what time period or at what temperature that applies.

This really isn't my area of expertise, so I just don't know if there is a standard, e.g. measure at 24.5 °C for 24 h, that sort of thing.
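
For reference, that acceptance band works out to 1 V ± 40 µV, i.e. ±40 ppm. A minimal sketch of checking a reading against those limits (Python, with made-up readings just to show the arithmetic):

Code:
# Hypothetical acceptance check against the 5440B 1 V limits quoted above.
readings = [0.9999823, 0.9999831, 0.9999819]   # made-up DMM readings, in volts

LOW, HIGH = 0.999960, 1.000040                 # acceptance limits from the manual
mean = sum(readings) / len(readings)
ppm_error = (mean - 1.0) * 1e6                 # deviation from nominal, in ppm

print(f"mean = {mean:.7f} V ({ppm_error:+.1f} ppm)",
      "PASS" if LOW <= mean <= HIGH else "FAIL")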

Any commentary appreciated.

TonyG

Offline Dr. Frank

  • Super Contributor
  • ***
  • Posts: 2377
  • Country: de
Re: How long should you monitor a calibrator output?
« Reply #1 on: February 13, 2017, 07:14:39 am »
Hi TonyG,

first you have to separate the specification limits of the instrument, which are defined for different time scales, from the sample acquisition time you use when following "Good Metrological Practice" (GMP).

The 5440B specification already covers several different time scales: the 30 day, 90 day, and 1 year uncertainties, plus the 10 minute, 24 hour, and 30 day stability specifications.
The two sets of specifications describe different aspects and sources of instability of the instrument, and both are defined over fixed temperature ranges, ±5 °C or ±1 °C from the nominal lab temperature.
It's common metrological practice for the lab temperature to be 23 °C, but it may be defined differently in the calibration report.

The sources of instability are temperature, ageing, noise, and EMF, which you may either mitigate or simply include in your uncertainty calculations. The basic uncertainty of the volt transfer from NIST to your lab of course has to be added as well; that's also mentioned in the specification.

The noise of the instrument plus that of your comparison instruments (e.g. a 3458A, or an external 10 V reference) is the only parameter that can be influenced or mitigated by choosing different sampling time scales.

If you measure the 10 V output of the 5440B over 0.1 s time frames, you will see a lot of short-term noise. At 1 s to 60 s averaging times, these noise figures decrease, and the readings stabilize more and more towards consistent values.
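
As a rough illustration of that averaging effect (a sketch with simulated white-noise readings, not real 5440B data), the scatter of the window averages shrinks roughly as 1/sqrt(N) as long as the noise is white:

Code:
import numpy as np

# Sketch only: one hour of simulated 10 V readings at 1 reading/s with 3 uV RMS white noise.
rng = np.random.default_rng(0)
readings = 10.0 + 3e-6 * rng.standard_normal(3600)

for window in (1, 10, 60, 300):                       # averaging times, in seconds
    n = len(readings) // window
    means = readings[:n * window].reshape(n, window).mean(axis=1)
    print(f"tau = {window:4d} s   scatter of averages = {means.std() * 1e6:.3f} uV")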

At longer time scales, like > 5 minutes, temperature and ageing drifts will already show up and increase the uncertainty / standard deviation.

This whole picture can be summarized by the Allan deviation statistic, which describes the stability over these different time scales. You might explore this idea in the vast thread "stability comparison of DVM", which TiN initiated; there you will find a lot of such Allan deviation diagrams.
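
A minimal sketch of how one Allan deviation point can be computed from such a log of readings (non-overlapping estimator, assuming equally spaced samples; the function and variable names are illustrative, not from that thread):

Code:
import numpy as np

def allan_dev(readings, m):
    """Non-overlapping Allan deviation at averaging time tau = m * sample_interval.
    readings: 1-D array of voltages taken at a fixed sample interval."""
    n = len(readings) // m
    avgs = np.asarray(readings[:n * m], dtype=float).reshape(n, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(avgs) ** 2))

# Sweeping m (from 1 s up to an hour's worth of samples) and plotting allan_dev(readings, m)
# against tau shows where the noise stops averaging down and drift takes over.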

To measure the basic uncertainty, it is Good Metrological Practice to average over about 2 to 4 minutes, and also to determine the standard deviation (2 sigma) over all the samples taken within this time frame.
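
A sketch of that GMP step, assuming readings arrive at 1 reading/s (the block is simulated here; in practice it would come from the DMM log):

Code:
import numpy as np

# Sketch: mean and 2-sigma over a ~3 minute acquisition (180 readings at 1 reading/s).
rng = np.random.default_rng(1)
block = 10.0 + 3e-6 * rng.standard_normal(180)        # simulated 10 V readings, volts

mean = block.mean()
two_sigma = 2.0 * block.std(ddof=1)                   # 2-sigma scatter of the samples

print(f"mean = {mean:.7f} V,  2-sigma = {two_sigma * 1e6:.2f} uV")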

It's evident that the lab temperature should not change during this measurement, at least not by more than the specified limits of the instruments used.

Frank
« Last Edit: February 13, 2017, 08:04:55 am by Dr. Frank »
 
The following users thanked this post: pmcouto, TiN, Assafl, 2N3055, CalMachine

