Author Topic: DIY DMM calibration  (Read 9687 times)


Offline rhbTopic starter

  • Super Contributor
  • ***
  • Posts: 3484
  • Country: us
Re: DIY DMM calibration
« Reply #25 on: November 06, 2018, 04:01:56 pm »
Quote
I have a 1500 sq ft shop in back with *almost* everything you could imagine.  Still need a TIG, plasma cutter and a few other bits, but except for a lack of organization, it's a pretty complete complement of tools.
I'd kill to have my own workshop, even a small one, never mind 1500 sq. ft.

The sad part is most of the people who could *really* make good use of such a facility can't afford it until they are old.  The kid who made the IC was *only* able to do that because his parents had invested over a million dollars in a shop.

Quote
If I succeed, I'll have a 3+ GHz scope for 10% of the market price. If not, I've already learned a few things about scope architecture and BGA assembly.  :-+
Very similar to how I started the 5720A build project, from getting just one PCB from it for a "little" toy project...

If the Tek doesn't work out, have a look at a LeCroy DDA-120, *not* DDA-125. The screen is a 640 x 480 VGA monitor.  There's an internal VGA out on any unit with the external VGA output option.  So one should simply be able to remove the entire CRT section, install a $90 7" LCD and go.

That leaves a huge space where the CRT was located, right above the main board. The DDA-120 will sample a single channel at 8 GSa/s.  The 125 will also, but you need a special adaptor that connects to C2 & C3.

I thought the 7% overshoot of an MSOX3104T was bad, but it's nothing compared to the 20% of the DDA-120.  Worse yet, the 125 is spec'd at 1.5 GHz while sampling at 2 GSa/s.  I can only assume that the application made rise time measurement more important than the quality of the step response.  But it really does have a < 250 ps rise time.  FYI, the disk drive analyzers are modified LCxxx DSOs.

The anti-alias filters appear to be a pair of boards that span the gap between the AFE and the ADC.  Unfortunately, the LeCroy documentation is not very good.  I've yet to identify the filter in the schematics, so I'll be flying by the seat of my pants.  But...

There is lots of room where the CRT was to install some Radiall SP6T SMA DC to 18 GHz relays.  So my general plan is to get a DDA-120, replace the CRT, remove the crappy filters and install replacement boards with SMA connectors to connect to the relays and a set of proper filters: four 750 MHz filters for 4-channel mode, two 1.5 GHz filters for 2-channel mode and one 3 GHz filter for single-channel mode.  The beasts are incredibly noisy, but a 3 GHz, 16 Mpt DSO with external time reference for under $1000 is a very attractive project.  Especially after trying out both the MSOX3104T and the RTM3104.

Quote
Could I have the same confidence with unknown-history meters from eBay? I'd just be guessing whether it's my ref that's unstable, or the meter, or whether both have the same drift rate so the result looks stable but isn't. Getting more uncharacterized meters that agree only adds more confusion and false confidence. If I tried some AI math to predict the long-term drift of such, it would be based on wrong assumptions too. Garbage data in = garbage data out.  :popcorn:

But of course, none of this matters if you're happy with +/-100 ppm uncertainty. Then you don't have to spend a lot of $$ and can have literally ANY lab cal your 34401A for the cost of a few cases of beer :).

That's very kind of you, but the sort of data I need is a suite of several  references with a year or two of data from initial startup.  The question is, if I have the first 1000 hours of drift, can I predict the next 1000 hours? But I entirely agree with your assessment of trying to predict aging from your data.  I should have no confidence in it either.  The early behavior is critical.

The aging curves have the general form of abs(a*(1-b*exp(-c*t))).  Most of the information about a, b and c is in the initial part of the curve.  After that it gets very linear.  So if you do not have the first part of the curve, the best you can do is the current practice.

The reason I think I might succeed is that the problem is ill-posed and traditionally considered unsolvable using a least-squares (L2) error criterion.  And 30 years ago, a least-absolute-error (L1) solution was computationally intractable.

Without actually knowing much about it, I started doing L1 solutions to the 1D heat equation, whose solution is an infinite sum of exponentials of the same form.  It worked much better than I ever imagined possible.  Investigation led me on a 3-year binge learning about sparse L1 pursuits, about which I've already written too much.

But I did a numerical experiment that is very relevant here.  I generated a particular example and then tried to recover a, b and c.  What I found was that if I had N samples I could predict the values from N+1 to 2*N to within a percent or two.  The error could be reduced further, but at the cost of significantly more computation, and it is not really worthwhile.  Suppose you measure a reference for 1000 hours under normal operating conditions and the drift is 5 ppm/1000 hours after the initial 1000 hours.  If the aging behaves as in my numerical experiment, then for the next 1000 hours I can predict the aging to better than 0.1 ppm.  That reduces the uncertainty by almost 5 ppm.  It's not all the error, but a large chunk of it.
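In rough Python terms the experiment looked something like the sketch below.  The parameter values, noise level and starting guess here are invented purely for illustration, not my actual reference data; the point is just the shape of the procedure: fit a, b, c to the first N samples by minimizing the summed absolute error (L1), then extrapolate over the next N samples.

import numpy as np
from scipy.optimize import minimize

def drift(t, a, b, c):
    # aging model: abs(a * (1 - b * exp(-c * t)))
    return np.abs(a * (1.0 - b * np.exp(-c * t)))

rng = np.random.default_rng(0)
t = np.arange(1.0, 2001.0)                      # 2000 hourly samples
truth = drift(t, 5.0, 0.8, 3e-3)                # "true" aging in ppm (made-up values)
data = truth + rng.normal(0.0, 0.05, t.size)    # add measurement noise

N = 1000
l1_cost = lambda p: np.sum(np.abs(drift(t[:N], *p) - data[:N]))   # summed |error|, not squared
fit = minimize(l1_cost, x0=[1.0, 0.5, 1e-2], method="Nelder-Mead",
               options={"maxiter": 20000, "maxfev": 20000,
                        "xatol": 1e-10, "fatol": 1e-10})

pred = drift(t[N:], *fit.x)                     # extrapolate hours N+1 .. 2N
print("worst prediction error over the next 1000 h: %.3f ppm"
      % np.max(np.abs(pred - truth[N:])))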

One can, of course, just use a linear approximation from one cal to the next, i.e. a linear fit to the drift from j to j+1 to predict the drift from j+1 to j+2.  That's so simple and obvious that I'd be quite surprised if it's not being done in the FW for most high end DMMs. 

I'd like to get 10 ppm, but like Conrad, I can live with more if I know the magnitude of the uncertainty.

As noted, all this requires multiple references, meters, samplers, etc.

With regard to the BNC terminators, this is just initial screening before putting them on a VNA.  I'm also checking for fit, etc.  I don't want to bother measuring the crappy ones with the VNAs.

That's really all I have to say on this.  I've got one of Doug's 10 V references in the mail.  It's not sufficient, but it's better than I have now and that's a start.  Right now my biggest source of error is the lack of temperature control.
 

Offline Dr. Frank

  • Super Contributor
  • ***
  • Posts: 2393
  • Country: de
Re: DIY DMM calibration
« Reply #26 on: November 06, 2018, 06:12:39 pm »
You seem to be insinuating I think that a DMMCheck Plus is adequate to calibrate a  DMM...
.. But if I measure  5 V on a calibrated DMMCheck and one DMM reads 5.00000 VDC and the other reads 4.99980 VDC, why should I not  adjust the one which reads low?

Well, the problem with your argument is that you compare references of different stability and accuracy, and in the end you indeed imply that the DMMCheck REALLY could make the correct judgement as to which of the two 34401As reads "correct".
That's the classical 'man with three clocks' problem.

To take a more scientific approach: you have the DMMs, which are accurate to 15 ppm in their 10 V range and stable to 35 ppm/year. They disagree by about 40 ppm on a 5 V reading.
The DMMCheck, on the other side, is specified to have 70 ppm accuracy over 1/2 year; that's about 3 times worse than the 34401As.

So it's not directly possible to judge a 40 ppm difference with a 70 ppm accurate source; the service manual calls for a TUR of 5:1, and here the ratio runs the wrong way round.
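Putting rough numbers on it (a quick Python back-of-the-envelope using only the figures quoted above, with the TUR taken simply as DUT tolerance over reference uncertainty):

dut_spec_ppm = 35.0    # 34401A 10 V range, 1-year stability figure from above
ref_spec_ppm = 70.0    # DMMCheck 1/2-year accuracy
required_tur = 5.0     # ratio the service manual asks for

print("actual TUR: %.2f : 1" % (dut_spec_ppm / ref_spec_ppm))          # ~0.5:1
print("reference needed for %.0f:1 TUR: < %.0f ppm"
      % (required_tur, dut_spec_ppm / required_tur))                   # ~7 ppm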

To achieve what you have in mind requires reference(s) of the same or better stability class than the LM399; that's the starting point of the LM399 and LTZ1000 threads here in the forum.
If you have a bank of at least 3, let's say, LTZ1000-based references, you can really start your drift analysis and really calibrate the 10 V range of your 34401A, but you also need one initial calibration baseline of that uncertainty class, so maybe well below 5 ppm or so.

Then you need Transfer Standards, which precisely (<1 ppm uncertainty) transfer your reference voltage to the needed values, like 7.xxx V => 10 V, and 10 V => 1 V, 100 V, 1 kV.
You asked how this was done in the past, BEFORE the 5720A or 3458A.
Well, these Transfer Standards were the 720A KV divider and the 752A reference divider, plus a stable 332A / 335B, maybe.

I recommend the Fluke book 'Calibration: Philosophy in Practice', where all this historic stuff is described.
But you really need such instruments; you can't solve that calibration problem with scientific statistics alone.

Similar to Conrad Hoffman, I also built these devices on my own in the beginning, because I also wanted to calibrate my 34401A from 1990.
That's described here: https://www.eevblog.com/forum/metrology/ultra-precision-reference-ltz1000/msg239666/#msg239666

But I started directly with an appropriate approach, i.e. 2x LTZ1000, a precise 10/7 divider/amplifier (to 10.24...V, <0.2 ppm ratio uncertainty by self-alignment), and a 100:1 / 10:1 Hamon divider, also precise to < 1 ppm ratio uncertainty @ 1 kV. I was also once able to import an initial absolute uncertainty of < 10 ppm from a 3458A at my company. All these self-built devices cost < 1000 € / $.
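The trick that makes a self-built Hamon divider that good is that the series/parallel ratio error is second order in the individual resistor deviations.  A little Python sketch with invented values (not my actual build) shows the idea:

import numpy as np

rng = np.random.default_rng(1)
n = 10
R = 10e3 * (1 + rng.normal(0, 100e-6, n))   # ten 10 kOhm resistors, ~100 ppm scatter

series   = R.sum()                          # all in series
parallel = 1.0 / np.sum(1.0 / R)            # all in parallel
ratio    = series / parallel                # nominally n**2 = 100

print("ratio error: %.4f ppm" % ((ratio / n**2 - 1.0) * 1e6))
# the ratio error is roughly the variance of the deviations, of order 0.01 ppm
# here, even though the individual resistors are ~100 ppm off nominal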
 
(Ok, sooner or later this story escalated, similar to TiN's passion.  :-// )

I propose that you'd better choose the appropriate devices, instead of fiddling with a DMMCheck or similar.

Concerning your approach to improving the relative uncertainty of references, I fully agree with you, but again under the precondition that you use references of the lowest possible instability for your initial ensemble.

That means, if you have 3 or more LTZ1000 references, you can measure the relative instabilities and estimate the absolute stability from that (i.e. decide the 3-clocks problem), so as to also have an uncertainty on the order of about 1 ppm, once you have set a baseline.
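For the relative-instability part, the classic tool is the 'three-cornered hat': from the variances of the pairwise differences you can back out an instability estimate for each individual reference.  A minimal Python sketch with synthetic, uncorrelated noise (real reference data has drift and correlations, so this is only the skeleton of the idea):

import numpy as np

rng = np.random.default_rng(2)
n = 365                            # e.g. one comparison per day for a year
A = rng.normal(0, 0.5, n)          # reference A: 0.5 ppm rms scatter (invented)
B = rng.normal(0, 1.0, n)          # reference B: 1.0 ppm rms
C = rng.normal(0, 2.0, n)          # reference C: 2.0 ppm rms

vAB, vAC, vBC = np.var(A - B), np.var(A - C), np.var(B - C)   # only differences are measurable

est = [(vAB + vAC - vBC) / 2,      # three-cornered-hat estimates of the
       (vAB + vBC - vAC) / 2,      # individual variances
       (vAC + vBC - vAB) / 2]
print("estimated rms instabilities (ppm):",
      np.sqrt([max(v, 0.0) for v in est]))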

But it's not feasible to start with, say, LM399s, as these will diverge much more and will not give the necessary stability for calibration of your 34401A.

Only once you have that stable ensemble can you add less stable references to your statistics, and use these additional baseline points to improve your uncertainty further.

These metrological techniques were state of the art when Weston cells and these resistive dividers were common and no Josephson junction arrays were available.

I'm maintaining such a history, which you requested, of initially 4 and by now 9 voltage references, over about 9 years.

I'd be interested in a statistical model that analyses these relative drifts properly and delivers properties of the ensemble, like probable overall stability, uncertainty, identification of 'stinkers', and handling of 'jumpers' after transport or temperature excursions.

PS: You obviously had some problems measuring low ohms consistently with your 34401As.
That may well be caused by the well-known 2-wire ohms offset problem, due to contaminated front/rear switches, and not by an improper calibration.

Frank
« Last Edit: November 07, 2018, 09:43:00 am by Dr. Frank »
 
The following users thanked this post: msliva, thermistor-guy, Kosmic

Offline Kosmic

  • Super Contributor
  • ***
  • Posts: 2540
  • Country: ca
Re: DIY DMM calibration
« Reply #27 on: November 06, 2018, 07:58:57 pm »
I recommend the Fluke book 'Calibration: Philosophy in Practice', where all this historic stuff is described.

Interesting book. Thank you for sharing!
 

Offline rhbTopic starter

  • Super Contributor
  • ***
  • Posts: 3484
  • Country: us
Re: DIY DMM calibration
« Reply #28 on: November 06, 2018, 08:01:58 pm »
Thanks for taking so much of your time.  After finally finding the correct table in the 34401A service manual it was obvious that tweaking the 34401A 10 V range was not possible using a 5 V reference.  I raised the question precisely because I thought that might be the case.

My reasons for not trusting that particular 34401A are:

The zero was wrong.
It had been sloppily repaired and was not functioning when I received it.
It has a history of cals, at least one only partial.

The 34401A that agrees with the DMMCheck has never been calibrated since it left the factory in 1991.  Also, my 3478A agrees with it.  I am more concerned that the two meters agree with each other, if I'm using both, than I am about absolute accuracy.

I have three of Jason's LM399s, but they are not yet packaged so that I can run and monitor them.  I bought a lot of gear in the last 12 months and am still struggling to get everything set up properly.  At the moment the DMMCheck is my best reference.  I can't use what I don't yet have.

My question about what was done before was rhetorical and directed at how  Kelvin and others did their work.  I've had a general interest in metrology for many years, but books and dimensional instruments consumed my available funds.  I'm not familiar with commercial electronic instruments, but I am familiar with the physics and mathematics.

Do you have data for references from initial power on?  LTZ1000s would be nice, but anything will do if it displays the classic aging curve from the beginning or at least within a hundred hours of startup and I have some idea of the lag between startup and data.

At this point I don't know anything at all about a, b & c.  They might be constants or some might be a polynomial expression.  Until I can do fits and look at residuals I have no way to know.

I spent several years dealing with the properties of hydrocarbon mixtures at work.  In the process I became very familiar with the way that multiparameter equations of state are developed.  If you think that electrical references are difficult, try dealing with the physical properties of substances at temperatures and pressures that push the limits of physical instruments.  Mother Nature has no problem developing a pressure of 15,000 psi at 450 F, but it's really hard to do that in the lab.  Getting a single data point can consume a lot of time.

I knew nothing about the subject when I got involved with it.  But by the end of a year it became clear my client was paying $35K/yr supporting a consortium that was doing very poor quality work and making completely ludicrous claims of accuracy.  I mention that because in the process of sorting out what was going on I learned a great deal about fitting unknown functions for which only a very rough notion of the mathematical form is available.

I am *not* trying to solve the calibration problem with statistics.  As you note, it can't be done.  What I am trying to do is characterize the physical aging process.  What started as checking up on fluid properties eventually led me into the continuum mechanics of solids and porous media.  In the process I learned quite a lot about the  thermo-elasto-plastic behavior of solids and similarly arcane subjects.  I've now forgotten lots of it, but I buy a lot of reference works when I work on a subject and I still have those.  So I merely need to refresh my memory.

I think I have a pretty damn good idea of exactly what the mechanics of the aging and environmental shock process of a reference is.  How that couples to the electrical behavior is still a mystery, but I have not read anything to date with regard to reference aging that gave me any confidence that the EEs involved had taken the time to learn the appropriate mechanics.  It's an exotic subject even for a  mechanical engineer.
 

Offline GigaJoe

  • Frequent Contributor
  • **
  • Posts: 494
  • Country: ca
Re: DIY DMM calibration
« Reply #29 on: November 06, 2018, 09:39:50 pm »
my 2 cents ....

I have a 3457A, 34401A and 3478A.  A nice thing with HP / Agilent is that you may adjust just a single range, and do it manually.  Another huge plus: you don't need a precise 1.000000 V; some variation is OK, because you just enter on the display the exact voltage that you are feeding in (which is super important for a resistance cal), versus some global procedure over GPIB.

As I don't have any reference, I had to cal the 3457A first, then use it as the reference to adjust the others; the same with resistance, and the same with DC current.  Then the problems began with AC.  For voltage I use a sine generator + TDA2030 feeding a couple of transformers on their secondaries, so the primary goes to 700+ V.  It is reasonably stable.

And no luck with AC current, so that is not calibrated at all (no big deal so far), nor is 1000 VDC.

The 3457 and 3478 read 3,000,000 and 300,000 counts, which means that, for me, having the 34401A calibrated in a lab would just add more uncertainty for the 3457.
Second note: the 34401A has both 10 M and 10 G input resistance settings and calibrates both during a one-shot range cal, but the 3457 and 3478 don't.  That means if you feed the 34401A from a resistive divider, there's a chance the cal fails, because as the DMM switches its internal input resistance at different input voltages it loads, and therefore shifts, the divider; that doesn't happen with the 3457 or 3478.
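A rough Python illustration of that loading effect, with made-up numbers (a divider with a few kilohms of output resistance read by a 10 Mohm vs a 10 Gohm input):

r_source = 5e3                      # divider output (Thevenin) resistance, hypothetical
for r_in in (10e6, 10e9):           # the two input resistance settings mentioned above
    error_ppm = r_source / (r_source + r_in) * 1e6
    print("input %.0e ohm -> reads low by about %.2f ppm" % (r_in, error_ppm))
# ~500 ppm at 10 Mohm vs ~0.5 ppm at 10 Gohm: if the meter changes input
# impedance between ranges, the divider output shifts with it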

On a side note: it's pretty easy to deal with 5.5 digits, a magnitude harder with 6.5, and impossible for me at 7.5.  That's the same reason why I bought 3 HP 3456As :) Now I'm a man with 3 clocks :) It seems the clocks drift pretty little during my occasional use... around 2 ppm/year... (one recent device shows a 5 ppm disagreement compared to my 3 clocks, so I'm thinking of doing a second calibration cycle).
 

Offline rhbTopic starter

  • Super Contributor
  • ***
  • Posts: 3484
  • Country: us
Re: DIY DMM calibration
« Reply #30 on: November 06, 2018, 10:16:22 pm »
A man after my own heart.  Thanks.  It may not get you to the level that TiN and Andreas like to play at, but it strikes me as a reasonable solution to a practical problem.

I don't have space for 3 x 3456A, so I'll have to settle for 3 or more voltage refs.
 

Offline GigaJoe

  • Frequent Contributor
  • **
  • Posts: 494
  • Country: ca
Re: DIY DMM calibration
« Reply #31 on: November 06, 2018, 11:35:16 pm »
It may work, if you construct 1:10 dividers + buffer followers.  I did 2x LM399 with 10 / 1 / 0.1 V outputs, all buffered with op-amp + transistor followers, and a cheap 25 ppm MELF Hamon divider where the resistors were sorted for very closely matched tempco and each resistor was then altered with a file to precisely the same nominal value; soldered together, then many layers of acrylic coating.

Now I'm thinking of the same MELF divider (as the resistors are very small) plus acrylic, a glass bottle and mineral oil... I think it will work very well even long term... Will do a single reference resistor first...

 
 

Offline alm

  • Super Contributor
  • ***
  • Posts: 2903
  • Country: 00
Re: DIY DMM calibration
« Reply #32 on: November 06, 2018, 11:55:48 pm »
Sounds to me like you have three problems:
  • standards
  • sources for all points you care about (e.g. cardinal points from 100 mV to 100 V, or 100 Ohm, 1 kOhm etc resistors)
  • a way of transferring the value of the standards to compare them to the sources
The $$$ calibrators contain all of these in one instrument. For example, they might contain a voltage reference, a variable power supply, and a divider used to compare the divided output of the power supply against the reference. If the divider is 1:10 and the reference is 10 V, then when the power supply is adjusted so that its divided output equals the reference, its output will be 100 V, plus the uncertainty of the reference and the transfer uncertainty (divider, null detector).
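As a rough worked example of that stack-up (the component uncertainties here are invented for illustration), the pieces combine something like this:

import math

ref_ppm     = 2.0    # 10 V reference (hypothetical)
divider_ppm = 1.0    # 10:1 ratio (hypothetical)
null_ppm    = 0.2    # null detector resolution/offset, referred to the output

total_ppm = math.sqrt(ref_ppm**2 + divider_ppm**2 + null_ppm**2)   # RSS combination
print("100 V output uncertain by ~%.1f ppm (~%.0f uV)" % (total_ppm, total_ppm * 100.0))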

The old Fluke 7105 system for DCV calibration, discussed in the first edition of the Fluke book described above, used a Weston cell as reference, a Kelvin-Varley divider to divide arbitrary values down to the value of the standard cell, a null meter to minimize the difference between two voltages and a calibrator. For best performance, the calibrator was merely used as a stable power supply, and adjusted until its value divided by the correct division ratio equalled the reference.

If you care about absolute values, then you will have to periodically compare your standards to the outside world. If your accuracy requirements are modest, a DMMCheck could serve as a standard. Sources you can improvise, depending on requirements. A precision power supply, a stack of 9V batteries, optionally combined with some sort of voltage divider might serve as one for DCV. For transferring, you can buy / build a Kelvin-Varley divider and null meter (see Conrad's mini metrology lab articles) or use a DMM for which you verified linearity. For resistance, you may be able to do something similar to the ESI SR1010 series.

You're not likely to achieve sufficient accuracy to calibrate a 34401. Even the Fluke 7105 system would probably have been borderline. But it sounds to me like that may not be the goal here.

Offline GigaJoe

  • Frequent Contributor
  • **
  • Posts: 494
  • Country: ca
Re: DIY DMM calibration
« Reply #33 on: November 07, 2018, 12:20:08 am »
it will be 100 V plus the uncertainty of the reference and the transfer uncertainty (divider, null detector).

Kinda sorta... In my childish practice, dealing with 1 mV in the end is much easier than 1 µV.  So 100.000 V is not so hard at all, but 1.000000 V is a disaster, where all the different effects add up to skew the value.
 

Offline rhbTopic starter

  • Super Contributor
  • ***
  • Posts: 3484
  • Country: us
Re: DIY DMM calibration
« Reply #34 on: November 07, 2018, 01:03:03 am »
Does anyone know of a *complete* DC error analysis of a 1:10:100 or similar divider?  By "complete" I mean at the level of what a mid-career  PhD experimental physicist would consider complete when designing an instrument.  Every known factor affecting the result included in the equations: thermal masses, currents, device dissipation, thermal resistance, non-linear effects, etc.
 

Offline GigaJoe

  • Frequent Contributor
  • **
  • Posts: 494
  • Country: ca
Re: DIY DMM calibration
« Reply #35 on: November 07, 2018, 01:22:21 am »
Interesting approach... In such a case, IMO, you design a schematic, then identify the most critical elements affected by the environment or by element parameter fluctuations, then the less critical ones, then somehow equalize them, then add the statistical uncertainties... and all of this approach is perfect for mass production, but not really for a single device that you are going to build.

That's because you can carefully select every element, starting with the voltage reference and ending with resistors that you can select from a batch for the lowest tempco (or connect two in parallel/series with opposite signs to compensate), and so on.  In the end nothing stops you from telling the temperature "screw you" and tossing the whole board into a thermal box at constant T.
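For instance, the series-connection trick works because the combined tempco is just the resistance-weighted average of the two tempcos; a tiny Python check with made-up values:

R1, tc1 = 5.0e3, +12e-6    # 5 kOhm at +12 ppm/K (hypothetical)
R2, tc2 = 5.1e3, -11e-6    # 5.1 kOhm at -11 ppm/K

tc_series = (R1 * tc1 + R2 * tc2) / (R1 + R2)
print("combined tempco: %+.2f ppm/K" % (tc_series * 1e6))   # ~ +0.4 ppm/K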

Outsmarting God, it seems, is the reason volt-nuts are still alive; how the hell can I assume that on a specific trace I lost 3 microvolts, not 5 or 6, and that it needs to be compensated somehow to reach "0000" in the end...
« Last Edit: November 07, 2018, 01:27:15 am by GigaJoe »
 

Offline Conrad Hoffman

  • Super Contributor
  • ***
  • Posts: 1940
  • Country: us
    • The Messy Basement
Re: DIY DMM calibration
« Reply #36 on: November 07, 2018, 05:31:13 am »
Does anyone know of a *complete* DC error analysis of a 1:10:100 or similar divider?  By "complete" I mean at the level of what a mid-career  PhD experimental physicist would consider complete when designing an instrument.  Every known factor affecting the result included in the equations: thermal masses, currents, device dissipation, thermal resistance, non-linear effects, etc.

My guess is no. If there is an analysis, it's proprietary to the company. IMO, they probably cheated and did the analysis after the wily old techs got the thing working properly. There are just too many physical unknowns. No problem putting variables in equations, but where to get all the data?

edit- Ha, wrong again. I own a 752a but only read the operating and adjustment sections.
« Last Edit: November 07, 2018, 07:18:16 pm by Conrad Hoffman »
 

Offline Dr. Frank

  • Super Contributor
  • ***
  • Posts: 2393
  • Country: de
Re: DIY DMM calibration
« Reply #37 on: November 07, 2018, 06:46:37 am »
Does anyone know of a *complete* DC error analysis of a 1:10:100 or similar divider?  By "complete" I mean at the level of what a mid-career  PhD experimental physicist would consider complete when designing an instrument.  Every known factor affecting the result included in the equations: thermal masses, currents, device dissipation, thermal resistance, non-linear effects, etc.

Sure. Just download the 752A manual from the FLUKE site: https://us.flukecal.com/category/literature-type/product-manuals
It contains an elaborate, physics-grade error analysis. All relevant effects are taken into consideration; not everything is explicitly calculated, but it is sufficient to reproduce the calculations, as far as I can tell as an experimental PhD physicist.
In summary, the 100 V to 10 V / 10:1 OUTPUT is uncertain to 0.2 ppm, and the 100:1 OUTPUT to 0.5 ppm, including power dissipation when dividing 1 kV to 10 V.
Division of 10 V to 1 V and to 100 mV works the same way.


Frank
« Last Edit: November 07, 2018, 07:02:04 am by Dr. Frank »
 

Offline rhbTopic starter

  • Super Contributor
  • ***
  • Posts: 3484
  • Country: us
Re: DIY DMM calibration
« Reply #38 on: November 07, 2018, 02:13:27 pm »
Thank you.  Not quite as detailed as I'd like, but it makes clear what the important factors are and is a marvelous example of good instrument documentation.  I shall be spending a good bit of time studying  it.
 

