Author Topic: self calibrating multimeters  (Read 5676 times)


Offline alex77Topic starter

  • Contributor
  • Posts: 10
self calibrating multimeters
« on: August 12, 2013, 12:11:08 pm »

I know that digital oscilloscopes have this feature, but I only found out yesterday that there are multimeters with a self-calibration mode.

I would like to hear your opinions on this: do you think this kind of calibration is reliable, does anyone here already own one, etc.

Thanks in advance
 

Offline Wytnucls

  • Super Contributor
  • ***
  • Posts: 3045
  • Country: be
Re: self calibrating multimeters
« Reply #1 on: August 12, 2013, 12:21:21 pm »
I have seen that feature advertised on a few cheap multimeters. It would need a very accurate voltage reference on board to make sense, which is unlikely to be the case on a basic meter. I suspect it is a sales gimmick and a proper calibration with an external standard would still be required for any kind of confidence.

http://www.digitek.com.hk/en/newcon.php?id=5
« Last Edit: August 12, 2013, 12:24:18 pm by Wytnucls »
 

Offline PA4TIM

  • Super Contributor
  • ***
  • Posts: 1161
  • Country: nl
  • instruments are like rabbits, they multiply fast
    • PA4TIMs shelter for orphan measurement stuff
Re: self calibrating multimeters
« Reply #2 on: August 12, 2013, 12:52:20 pm »
For a scope or spectrum analyser it is a different thing, because the accuracy of those instruments is relatively low. A good scope is 2%, so amplitude calibration only needs a 0.2% reference, and that is not hard to add. The same goes for frequency: a normal crystal oscillator will be a few orders of magnitude better than the required timebase accuracy. In a Tek 271X SA they use an ovenised crystal and a TL431 as amplitude reference. More than enough.

A multimeter needs to have a voltage reference (or, in older types, a frequency reference for the V/F converter).
This Vref can be a simple resistor divider (I have seen two cheap DMMs using this) or something like an LTZ1000, and there is a lot in between. The Vref largely dictates the total performance. You could, in theory, build a meter from very cheap, unstable components and have it measure its own reference, in the hope that the Vref is a bit better than the rest and so compensates for the crappy components. But normally, when you use a good reference you use good parts for the rest, and when you save on parts you do not throw in a more expensive Vref (which also needs a better PCB design, etc.).

And for calibration, the standard needs to be around 10x better than the internal reference. Besides that, the reference needs to be aged enough (many people forget that) and calibrated: you need to know its value. Even an LM399 has poor initial accuracy, but the value it settles at after burn-in stays very stable, and that is what matters. It could be used for some sort of self-check, but in that case you must assume the Vref is the most stable part and that the drift is caused only by things like resistors.
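To put numbers on the rule of thumb above, here is a quick sketch (plain Python, hypothetical specs) of checking the uncertainty ratio between a standard and the thing it calibrates:

```python
def uncertainty_ratio(standard_unc_ppm, dut_spec_ppm):
    """Ratio between the device-under-test spec and the
    standard's uncertainty. >= 10 matches the rule of thumb
    above; >= 4 is a common minimum in calibration practice."""
    return dut_spec_ppm / standard_unc_ppm

# Hypothetical numbers: a 2% scope amplitude spec (20000 ppm)
# against a 0.2% reference (2000 ppm)
print(uncertainty_ratio(2000, 20000))  # 10.0 -> adequate

# A burned-in, well-characterised reference at ~5 ppm
# calibrating a 50 ppm DMM range
print(uncertainty_ratio(5, 50))        # 10.0
```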

I have seen Vref designs where the Vref was used for a sort of self-calibration. I do not remember exactly how, but it was rather clever: a sort of ratio measurement in a bridge-like setup. If both test points had the right ratio, the output should be 10 V. I once built that and it worked, but the Vref used (a 1N821, if I remember well) had a very bad tempco and I could not find the sweet spot, so you kept adjusting it.

A separate Vref as an in-house standard is nice, but never forget the initial accuracy, the output resistance and the load it can handle, and calculate the numbers. An LM399 on its own is not useful, but combined with a good chopper amp, good resistors and a buffer it can make a good reference. The problem is that you need to calibrate it. Even the standards at NIST need calibration.
 
www.pa4tim.nl my collection measurement gear and experiments Also lots of info about network analyse
www.schneiderelectronicsrepair.nl  repair of test and calibration equipment
https://www.youtube.com/user/pa4tim my youtube channel
 

Offline Wytnucls

  • Super Contributor
  • ***
  • Posts: 3045
  • Country: be
Re: self calibrating multimeters
« Reply #3 on: August 12, 2013, 01:06:26 pm »
In the case of Digitek, if I understand the patent correctly, the self-calibration adjusts readings to compensate for ambient temperature and humidity falling outside of the original calibration environment. This is done by internal measurement of the reference resistor network and comparing it to the initial calibration value.
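A rough sketch of what such a correction could look like (my own guess at the scheme from the description above, not Digitek's actual firmware): the meter re-measures its internal reference network and scales readings by the drift it sees relative to the stored factory value.

```python
def compensated_reading(raw, ref_now, ref_at_cal):
    """Scale a raw reading by the drift of the internal
    reference network since factory calibration.
    ref_at_cal: reference value stored at calibration time
    ref_now:    the same reference as measured right now"""
    return raw * (ref_at_cal / ref_now)

# If temperature makes the reference network read 0.5% high,
# every raw reading is scaled down by the same factor.
print(compensated_reading(5.025, 1.005, 1.000))  # ~5.000
```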
 

Offline KedasProbe

  • Frequent Contributor
  • **
  • Posts: 646
  • Country: be
Re: self calibrating multimeters
« Reply #4 on: August 12, 2013, 01:46:06 pm »
If the multimeter has to take very small DC offsets into account, it will have an auto-zero calibration: basically, 0 V should read 0 V, and the difference is subtracted from the measurement.
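As a sketch, the auto-zero idea reduces to one subtraction per reading (simplified; real meters interleave a shorted-input measurement into the ADC conversion cycle):

```python
def auto_zero_read(measure_input, measure_shorted):
    """Take one auto-zeroed reading: measure the input,
    measure the internally shorted (0 V) path, subtract.
    Both arguments are callables returning raw ADC volts."""
    return measure_input() - measure_shorted()

# Hypothetical front end with a 1.2 mV offset error:
offset = 0.0012
reading = auto_zero_read(lambda: 2.5 + offset, lambda: 0.0 + offset)
print(reading)  # ~2.5, the offset cancels
```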
« Last Edit: August 12, 2013, 01:47:45 pm by KedasProbe »
Not everything that counts can be measured. Not everything that can be measured counts.
[W. Bruce Cameron]
 

Offline Dr. Frank

  • Super Contributor
  • ***
  • Posts: 2384
  • Country: de
Re: self calibrating multimeters
« Reply #5 on: August 12, 2013, 03:05:53 pm »

Quote from: alex77 on August 12, 2013, 12:11:08 pm
> I know that digital oscilloscopes have this feature, but I found out only yesterday that there are multimeters with a self calibration mode.
>
> I would like to know your opinion about this, if you think that this kind of calibration is reliable, if any one of you already own one, etc.
>
> thanks in advance

Hello,

There may be a difference between 'self calibration' and 'auto calibration' instruments, both in technology and in performance.
An example of Self Cal is the Datron 1281, but this seems to be a weaker technique than AutoCal.

Please indicate which instrument you have in mind for self cal.


I own two AutoCal instruments, the HP 3458A DMM and the Fluke 5442A DCV calibrator.
The HP 3458A is the best in this class, due to its extremely linear ADC, on which this technique mostly relies.

Definitely, the AutoCal feature works highly reliably, and the principle, also designated as Artifact Calibration, is accepted in metrology.
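As a much simplified sketch of the artifact-calibration idea (my illustration, not the actual 3458A algorithm): because the ADC is highly linear, measuring an internal zero and one known reference is enough to solve for the offset and gain of a range, and those two numbers correct every subsequent reading.

```python
def solve_gain_offset(adc_at_zero, adc_at_ref, ref_value):
    """Two-point calibration: with a linear ADC,
    adc_reading = gain * volts + offset, so two known points
    (0 V and the internal reference) determine both terms."""
    offset = adc_at_zero
    gain = (adc_at_ref - adc_at_zero) / ref_value
    return gain, offset

def corrected(adc_reading, gain, offset):
    """Convert a raw ADC reading back to true volts."""
    return (adc_reading - offset) / gain

# Hypothetical drifted range: gain error 1.0002, 50 uV offset,
# internal reference assumed to be exactly 7.0 V.
gain, offset = solve_gain_offset(0.000050, 1.0002 * 7.0 + 0.000050, 7.0)
print(corrected(1.0002 * 5.0 + 0.000050, gain, offset))  # ~5.0, drift removed
```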

There are several documents from HP/Agilent and from Fluke which describe this technique thoroughly; they should be read to really understand the scope of this feature, i.e. which characteristics of such instruments are auto-calibrated and which parameters cannot be corrected in principle. Then the possible differences from self cal may become evident.

Here they are:

http://assets.fluke.com/appnotes/Calibration/dbmsc98.pdf
http://www.hpl.hp.com/hpjournal/pdfs/IssuePDFs/1989-04.pdf

http://support.fluke.com/calibration-sales/Download/Asset/1260355_6200_ENG_C_W.PDF



Frank
« Last Edit: August 12, 2013, 07:11:13 pm by Dr. Frank »
 

Offline Fraser

  • Super Contributor
  • ***
  • Posts: 13170
  • Country: gb
Re: self calibrating multimeters
« Reply #6 on: August 12, 2013, 03:25:29 pm »
Not quite the same, but I dismantled a large Nixie-tube Solartron lab voltmeter. Inside I found my very first Weston cell voltage reference, and a load of very large resistors rated at 0.01% tolerance. I used the Weston cell and resistors to check digital multimeters when I was a student (10 MOhm input to protect the cell).

I believe the unit could calibrate itself against the Weston cell. I still have the cell, but it has now degraded to the point where its terminal voltage can no longer be predicted in terms of uV of change per year, i.e. it is at end of life, though still functional. Clever chemistry.

That Solartron was an impressive beast but it was 19" wide, 16" deep and around 4" high. It was also very heavy.
If I have helped you please consider a donation : https://gofund.me/c86b0a2c
 

Offline PA4TIM

  • Super Contributor
  • ***
  • Posts: 1161
  • Country: nl
  • instruments are like rabbits, they multiply fast
    • PA4TIMs shelter for orphan measurement stuff
Re: self calibrating multimeters
« Reply #7 on: August 12, 2013, 03:45:19 pm »
http://www.pa4tim.nl/?p=2141
This is my Guildline cabinet with four saturated cells. They need to be kept at a steady temperature and are very hard to transport: they need weeks to recover from movement or temperature changes, but they last for a very long time.
In your meter they used unsaturated cells. Those last about 10-20 years.

Dr. Frank, you are right, I had forgotten that the 3458A (the wet dream of every voltnut) uses auto calibration. Thanks for the links.
www.pa4tim.nl my collection measurement gear and experiments Also lots of info about network analyse
www.schneiderelectronicsrepair.nl  repair of test and calibration equipment
https://www.youtube.com/user/pa4tim my youtube channel
 
