Author Topic: Fellow Norwegians: Got access to precision gear? Begging to calibrate HP 3456A :)  (Read 30216 times)


Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
Re: Safe to repair HP Agilent 3456A 6½ d DMM? Any ideas?
« Reply #25 on: February 10, 2012, 04:11:50 pm »
OK, so a shout-out to my fellow Norwegians:

Is there anyone in Oslo who has access to highly accurate equipment and who is willing to help me put this meter into spec? :)

I checked with a local company. They are asking NOK 4500 to do a cal, which is roughly USD 750. I cannot afford that.

Kind regards,
Marius

PS: saturation, the meter has been powered on for 2 days now. Still going strong :)
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Hello Marius:

You may want to join and post to the volt-nuts mailing list as well, to find someone near you who may be able to help with a DIY or formal calibration at low cost:

http://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts

The other helpful board is the HP/Agilent equipment group:

http://tech.groups.yahoo.com/group/hp_agilent_equipment/

Both have been helpful sources of tips for keeping my gear up and running.

Best Wishes,

 Saturation
 

Offline Neganur

  • Supporter
  • ****
  • Posts: 1138
  • Country: fi
Re: Safe to repair HP Agilent 3456A 6½ d DMM? Any ideas?
« Reply #27 on: February 10, 2012, 09:46:15 pm »

I'm not Norwegian (Danish, actually), but I have access to two Fluke calibrators here at the university in Helsinki. The 1 GΩ range may be a problem; I seem to remember 100 MΩ being the maximum when I cal'ed my two HP 3478s. I would have to check.

You could also try contacting your local university and asking whether they have anything suitable. Sometimes people are nice enough to offer help (it's more a time issue than unwillingness).
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
Re: Safe to repair HP Agilent 3456A 6½ d DMM? Any ideas?
« Reply #28 on: February 10, 2012, 10:03:53 pm »

Neganur, will do.

I think the University of Trondheim has better equipment. As far as I know, the University of Oslo doesn't do any engineering, but I will phone and ask tomorrow anyway.

Do you have access to equipment in Denmark? My girlfriend is Danish, so we travel there all the time.

Kind regards,
Marius
« Last Edit: February 10, 2012, 10:23:24 pm by king.oslo »
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
OK, I have made some progress.

Tell me what you think.

An engineer with access to standards replied to my email. He told me I could bring the meter to his institution next week, and that he could compare my meter to the standard there.

Am I mistaken in thinking that if I record how far my meter is off from the standard, I can then correct for the difference once I get home (given that the temperature is the same, and that I have access to suitable DC and AC voltage sources at home)?

Thanks.

Kind regards,
Marius
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
You are making quick progress, Marius. A delayed adjustment is not recommended.

There are other phenomena that cause instability: humidity, barometric pressure, the cables used, the type of metal on the connector ends in contact with both devices, touching the cables, or simply adjusting the pots, to name a few.

If you've never done this before and aren't fully aware of the other phenomena that affect sensitive measurements, the only way you'll notice your approach has a problem is that, after adjustment, the readings of the standard and your DUT don't agree down to the last digits (say, µV) and the values are unstable.

When the standard and the DUT are side by side, at least some variables are controlled, since both are exposed to the same influences simultaneously.

For example, see Dave's video inside a cal lab, and what the cal guy says to Dave:



You can jump to 42:30 to see how cals are done.

In analog circuits, adjustments to switches and pots need to 'settle', as shown in the video when Dave turns the knobs. Temperature stability is crucial: at the start, note how the last digits bobble about; 20 minutes later the reading has moved from ~9.999920 to ~9.999965, which is within spec for 30 ppm, similar to the 3456A. However, 20 minutes is just a practical guideline; in my experience 4+ hours is typical, and leaving it on overnight is a simple way to get it fully stabilized before adjustment. This is why the 'cal guy' says 'take it home and never turn it off'.

The cal guy mentions a 'test uncertainty ratio' (TUR) of ~3.5:1, which is similar to what to expect with the 3456A measured against a 3458A, the best DMM now available. TUR means the accuracy (tolerance) of the DUT divided by the accuracy (uncertainty) of the standard. The ideal is 4:1.

When you work with smaller ratios, such as calibrating against another 6.5-digit DMM, the TUR is roughly 1:1. To offset this, the value you adjust and the reading of the standard must track each other as closely as possible, to reduce the uncertainty toward something like a TUR of 3.5:1 or better. The only way to ensure that is to have the standard and the DUT together, so you can tweak the adjustments and keep the variation between them as tight as possible.
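
As a rough illustration of that ratio (with placeholder numbers only, not actual instrument specs), the arithmetic is simply:

Code: [Select]
# Test uncertainty ratio: tolerance of the DUT divided by the uncertainty of
# the standard checking it. The numbers below are placeholders chosen to
# illustrate the arithmetic, not real 3456A/3458A/34401A specifications.
def tur(dut_tolerance_ppm, standard_uncertainty_ppm):
    return dut_tolerance_ppm / standard_uncertainty_ppm

print(tur(35.0, 10.0))  # ~3.5:1, the kind of ratio mentioned above
print(tur(35.0, 35.0))  # 1:1, what you get against another DMM of similar class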



Best Wishes,

 Saturation
 

alm

  • Guest
Re: Safe to repair HP Agilent 3456A 6½ d DMM? Any ideas?
« Reply #31 on: February 12, 2012, 04:21:18 pm »
Noise, drift, temperature, proper leads, etc. are all part of the considerations during the measurement. One doesn't connect the two DMMs together, to avoid interactions between them, since one can't be absolutely certain the DUT isn't still faulty. So each measurement is done separately, and multiple times.
For a proper voltage reference with good short-term stability, I would agree. For improvised sources, I'm not so sure. Even after warm-up, the short-term (5-minute) fluctuations of some signal sources/power supplies I tried were 50 ppm and 200 ppm, but the fluctuations were closely correlated between the various multimeters. Averaging can correct for short-term fluctuations, but sometimes there's a long-term drift component, even after hours of warm-up. Averaging samples taken during the same time period works around this problem, but requires parallel connections.

See the attached image for some actual data. Green is a 16-sample moving average of blue, and cyan is the same for black; this was to emulate the transient response of red. There are some minor differences in transient response, but apart from that the values are very close. Note that these three meters came from different parts of the world, and two of them were last calibrated at least ten years ago.
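
For reference, the moving average used for those traces is just a boxcar average over consecutive readings; a minimal sketch (assuming the logged readings are already in a NumPy array) would be:

Code: [Select]
import numpy as np

def moving_average(samples, window=16):
    # Boxcar average over `window` consecutive readings, as used for the
    # green and cyan traces described above (evenly spaced samples assumed).
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")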

What kind of interactions are you talking about? A defective DUT having a much lower impedance and loading the voltage source? A DUT putting out a bias current much larger than specified? Unless you're talking about huge amounts of current or very low voltages, two parallel-connected meters should see the same voltage.

When you work with smaller ratios, such as calibrating against another 6.5-digit DMM, the TUR is roughly 1:1. To offset this, the value you adjust and the reading of the standard must track each other as closely as possible, to reduce the uncertainty toward something like a TUR of 3.5:1 or better. The only way to ensure that is to have the standard and the DUT together, so you can tweak the adjustments and keep the variation between them as tight as possible.
Another 6.5-digit DMM is likely to be worse than 1:1. For example, the 1-year specs for the HP 3456A compared to the HP 34401A are 23 ppm : 35 ppm, or about 1:1.5. A 34401A calibrated within the last 90 days would be a little better than 1:1, and a 24 h calibration interval would give you 1.5:1. Much better instruments are needed if you want to calibrate the 3456A to 90-day or even 24-hour specs (8 ppm for 10 VDC).
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
I've got a Fluke 8846A (0.0026% accurate on DC volts), and you could come to me in Asker and compare/adjust against that if you want to.
But I'm in Africa at the moment, so next week maybe?

Thanks Erik,

That is great.

It'd be great if you could pass me your phone number. Mine is 402 402 00! :) M
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
Hello there,

I was invited to a company, NEMKO. They compared the DMM to their standard. The results were okay, but I had hoped for better. Their calibrator had something like 8 or 9 digits.

Without much of a warm-up, these were the results:

Standard - 3456A

0VDC - 0.00000
1VDC - 1.00012
10VDC - 10.00021
100VDC - 100.004

100VAC (50Hz) - 102.5
400VAC (50Hz) - 403

10R - 10.023
10k - 100.002
1R - 1.091

Calibrating the meter against their standard would cost circa NOK 3000-4000 (roughly USD 500-700), depending on how long they would need to complete the job.

When I accept ErikTheNorwegian's offer, is it possible to adjust only the AC and resistance, or will this upset the DC calibration?

Thank you for your time.

Kind regards,
Marius
« Last Edit: February 13, 2012, 02:03:33 pm by king.oslo »
 

Offline Neganur

  • Supporter
  • ****
  • Posts: 1138
  • Country: fi
If the meter is anything like the other HPs, the most important cal is DCV and DCA, since the ohms measurements are derived from those values.

ACV and ACA usually have a very poor tolerance compared to the DC ranges and are specified for sine waves only.
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no

The meter hasn't got a current-measurement function. M
« Last Edit: February 13, 2012, 05:59:40 pm by king.oslo »
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
I am preparing myself (and the 3456A) for the adjustment. I checked the official calibration procedure for this meter, but it seems to require instruments I don't have (heck, some of them I have never even heard of).

Are there more up-to-date procedures available anywhere? What would you do if you were me?

Thank you for your time.

Kind regards,
Marius
 

alm

  • Guest
I would try to arrange for an hour or so of warm-up (I believe this is specified in the manual) and an ambient temperature close to that in your lab. Accuracy is usually specified for the calibration temperature +/- 5°C, so too much difference in temperature will reduce accuracy. I'm not sure about the magnitude of the warm-up drift; you can easily test this yourself by hooking the meter up to a stable source, turning it on, and watching how the reading changes during the first hour or so.

Regarding your quick performance verification:
Are you sure you didn't miss a zero for the 1 VDC reading? As listed, it is out of spec, while 10 VDC and 100 VDC are comfortably within the 1-year spec (about 24 ppm plus a few counts). It also has one digit less than the other values.

The AC results seem to be out of spec, assuming the filter was enabled (I don't remember if it's enabled by default).

Were the resistance measurements taken in 5.5-digit mode? Resolution should be 100 µΩ in the lowest range. The 10 kΩ value appears to be off by an order of magnitude, which seems unlikely. What was the integration time? For 0.1 PLC, which would match the 5.5-digit resolution, 0.01% + 14 mΩ of offset is allowed in the 100 Ω range, so the 10 Ω reading would still be close to spec. In 1 or 10 PLC mode it would be clearly out of spec, assuming correct procedures and cables were used.
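
For checking readings like the ones above against a spec of the form (% of reading + offset), the arithmetic is straightforward; a small sketch, using the figures quoted in this post (not verified against the manual):

Code: [Select]
def check_spec(reading, nominal, pct_of_reading, offset=0.0):
    # Return (deviation, allowed limit, within spec?) for a "% of reading + offset" spec.
    limit = nominal * pct_of_reading / 100.0 + offset
    deviation = abs(reading - nominal)
    return deviation, limit, deviation <= limit

# 100 ohm range at 0.1 PLC: 0.01 % + 14 mOhm allowed (figures quoted above)
print(check_spec(10.023, 10.0, 0.01, 0.014))    # (0.023, 0.015, False): close, but outside
# 10 VDC against the ~24 ppm 1-year figure mentioned earlier (24 ppm = 0.0024 %)
print(check_spec(10.00021, 10.0, 0.0024))       # 210 uV deviation vs a 240 uV limit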

What sources are you planning to use for the adjustment? Adjusting AC voltage from mains voltage plus a transformer, for example, may not give very good results, since its stability may not be better than the ~2% accuracy you have now.

You should be able to check the stability of a source by measuring its variance. First measure the intrinsic noise by shorting the inputs with a piece of copper around the binding posts and measuring the variance (math -> stat, let it measure for a while, then recall variance for the variance and recall mean/count for the mean and number of samples) to get a baseline. Then do the same with your source. If the standard deviation (the square root of the variance) is not significantly lower than the factory spec, your cal will be less accurate than the factory spec. If it is of the same order of magnitude as the current error, or higher, it's probably not a good source to adjust to.

You don't need accurate values if you have another calibrated DMM; adjusting the trimmers at 0.984165 V works just fine. Just try to stay close to the ~1.2x range points; 1.3 V would not be very optimal, for example.
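
The same comparison can also be done offline if you log the readings; a minimal sketch, where readings_shorted and readings_source are simply lists of logged values (however you acquire them), and the factor-of-3 margin is an arbitrary illustration:

Code: [Select]
import statistics

def source_quiet_enough(readings_shorted, readings_source, spec_ppm, nominal):
    # readings_shorted: values taken with the inputs shorted (meter's own noise floor)
    # readings_source:  values taken on the candidate calibration source
    # spec_ppm:         accuracy you are trying to calibrate to, in ppm of nominal
    noise_floor = statistics.stdev(readings_shorted)
    source_sd = statistics.stdev(readings_source)
    spec_limit = nominal * spec_ppm * 1e-6
    print(f"noise floor {noise_floor:.2e}, source std dev {source_sd:.2e}, spec {spec_limit:.2e}")
    # The source should be clearly quieter than the spec you are adjusting to.
    return source_sd < spec_limit / 3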

You apparently need a stable 1 VAC / 1 kHz source and a bunch of accurate 1 kΩ - 10 MΩ resistors. My experience is that 1/4 W metal film resistors are horrible for this purpose; the value changes noticeably due to heating by the current from the DMM. Standard resistors or precision wirewound resistors appear to work much better, if only because of their much larger thermal mass. I wouldn't worry too much about the required equipment. Even a real cal lab might just use a DMM calibrator and a reference DMM.

I don't expect any change to the DC calibration after adjusting AC or resistance. If you check the adjustment procedure, they first adjust DCV, then R, then ACV. If DCV were influenced by the ACV adjustment, that order would be wrong. Resistance calibration could in principle influence ACV calibration, since ACV is adjusted after resistance. That seems extremely unlikely to me, though, since these circuits usually have very little in common apart from the circuitry also used for DCV.
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
Alm, thank you. I am filled with humility by your evident experience.

The recordings I did at the lab were done in a hurry, so perhaps I got a reading wrong. It sure looks like that is the case.

I thought I would make my 1 VAC / 1 kHz source using a PIC MCU and a transistor.

Roughly what current should I expect the meter to draw from the circuit during AC and DC measurements?

Thank you for your time.

Kind regards,
Marius
 

alm

  • Guest
Almost no current, especially at low frequencies. Input impedance at DC is > 10 GΩ on the ranges up to 10 V and 10 MΩ on the higher ranges. AC input impedance is something like 1 MΩ with probably < 100 pF of capacitance in parallel, or an impedance of more than 500 kΩ at 1 kHz, so the peak current would be something like 2 µA.
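
A quick back-of-the-envelope check of those numbers (a sketch using the figures quoted above: 1 MΩ input resistance in parallel with roughly 100 pF):

Code: [Select]
import math

R = 1e6        # assumed AC input resistance, ohms
C = 100e-12    # assumed parallel input capacitance, farads
f = 1e3        # test frequency, Hz
v_rms = 1.0

xc = 1 / (2 * math.pi * f * C)         # capacitive reactance, ~1.6 Mohm
z = 1 / math.hypot(1 / R, 1 / xc)      # |Z| of R parallel C, ~850 kohm
i_peak = v_rms * math.sqrt(2) / z      # ~1.7 uA peak for a 1 Vrms sine

print(f"|Z| at 1 kHz: {z / 1e3:.0f} kOhm, peak current: {i_peak * 1e6:.1f} uA")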

No idea whether you will achieve the required stability with a basic oscillator, but I don't see any harm in trying. The aim is good short-term (minutes, not hours) stability. I would try to get a sinusoidal signal; a square wave will have all kinds of nasty harmonics. Note that you need a sine with an amplitude of sqrt(2) volts to get 1 Vrms. Since the frequency is constant, just a sharp band-pass filter tuned to your frequency may be enough, provided your frequency is stable. This would also remove the DC component. At the very least, use a $0.20 or so crystal as the clock source for the PIC.
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
I have a rubidium oscillator.

Is its short-term stability suitable? M
 

alm

  • Guest
Frequency isn't critical at all (I don't think HP bothered to specify a frequency tolerance for the 1 kHz source), but varying by a few percent, as would be possible with the internal RC oscillator, might be an issue, and a crystal only costs a few pennies. It's the amplitude accuracy that counts.
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
OK, which method would you use to build a suitable AC source? M
 

alm

  • Guest
I got decent results with an HP DDS function generator after allowing it to warm up; the standard deviation of a 1 Vrms / 1 kHz sinusoidal signal was about 10 µV over a span of five minutes (almost 400 samples). The main limitation was that it did not go beyond 7 Vrms.
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
Which one? M
 

Offline amspire

  • Super Contributor
  • ***
  • Posts: 3802
  • Country: au
One method I have used in the past to make a stable AC voltage reference is to use a stepped sine wave, generated from CMOS switches connected to, say, a 10 V DC reference IC, and then pass it through an op-amp low-pass filter to turn it into a sine wave with about 0.2% distortion.

The trick was that the stepped waveform I used had no harmonics below the 9th (from memory), so I was able to set the filter's 3 dB point well above the sine-wave frequency, and the filter's stability therefore didn't affect the output voltage much.

You could use a micro to generate the clocks for the waveform. Then you need the DC reference, four matched and stable 10 kΩ resistors, and a couple of op-amps to make the filter.

To calibrate it, I have an old Fluke 540A thermal transfer calibrator that is accurate to about 0.01%, but the output has pretty good accuracy based on the DC reference voltage alone.

I did this design for 50/60 Hz, but it is not hard to scale it up to a fixed 1 kHz. I can dig up the circuit if you are interested.
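
As a numerical illustration of the idea (a sketch only, assuming ten uniform steps per cycle, not necessarily the step weighting the circuit described above used): a uniformly stepped sine has its first significant harmonics at N-1 and N+1, which is easy to confirm with an FFT:

Code: [Select]
import numpy as np

N = 10      # steps per cycle (assumed; puts the first harmonics at 9 and 11)
M = 1000    # samples per cycle used for the FFT (an integer multiple of N)

k = np.arange(M)
step = (k * N) // M                              # which of the N steps each sample is in
stepped = np.sin(2 * np.pi * (step + 0.5) / N)   # hold one value per step

spectrum = np.abs(np.fft.rfft(stepped)) / (M / 2)
fundamental = spectrum[1]
for h in range(2, 13):
    level = 20 * np.log10(spectrum[h] / fundamental + 1e-20)
    print(f"harmonic {h:2d}: {level:7.1f} dBc")   # 2..8 vanish, 9 and 11 at ~-19/-21 dBc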

Richard.
« Last Edit: February 18, 2012, 09:42:03 pm by amspire »
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
More challenges ahead, much has happened since the last posts!

Some thoughts on various issues raised on this thread to this point:

IIRC, the 3456A AC range is not stellar; the LSD variation is so high that the final accuracy isn't much better than a Fluke 87V's. I checked the high AC range against line voltage and compared it with the Fluke 87V, and most other ranges using a DDS generator as well. Note that with line voltage I'm limited to 60 Hz, whereas with the DDS I could check the output at the various frequencies required. I made no adjustments, just checks.

There is no 'zero volt' range, so I presume you mean the 0.1 VDC range. If so, it reads out with an exponent, to a maximum of 7 digits:

+100.0000 -3

Note the last digit is 100 nV. If you zero out the input with a shorting bar to get the lowest noise, I typically see < 500 nV at equilibrium. The uncertainty of the LSD on this range is 20-30 counts at 6 digits, so the accuracy is limited to 2-3 µV, but you can get resolution down to the 100 nV level.

'Zero volts', however, is tested per range on the 3456A by shorting the inputs; what you see is the offset voltage.


OK, which method would you use to build a suitable AC source? M
« Last Edit: February 19, 2012, 02:56:44 am by saturation »
Best Wishes,

 Saturation
 

Offline king.oslo (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 432
  • Country: no
I did this design for 50/60 Hz, but it is not hard to scale it up to a fixed 1 kHz. I can dig up the circuit if you are interested.

Richard, please let us see it! :)

Saturation, in this context, what is the LSD? If I google the term, I only find information about psychedelic drugs.

Thank you for your time.

Kind regards,
Marius
« Last Edit: February 19, 2012, 01:03:18 am by king.oslo »
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Good graph and comments.  In reply:

1. Yes, these are all short-term sources, with stability on the order of single-digit minutes; thus it's important to have the 'standard' DMM side by side with the DUT to check for drift. The variance function can help confirm that drift is occurring, but since it doesn't read out in real time, you'll need to rely on LSD variations.

Yes, electronic sources are noisy for this purpose. 

I used stacks of good-quality NiMH batteries (Eneloops), e.g. 8 cells ≈ 10.0 VDC, to compare the 1 and 10 VDC ranges. There is a slow ~10 µV downward drift the more batteries are stacked (each cell isn't at exactly the same potential as the others in the stack, so the stronger cells charge the weaker ones), but they hold long enough to compare the DUT and the standard down to the µV. To reduce the voltage further I use a Kelvin-Varley voltage divider.

For the 100 VDC scale, I used as many batteries as I had, then reconfirmed the reading at full scale with an electronic reference at 100 VDC and then at 1 kVDC, comparing both the DUT and the standard meter continuously. It isn't as stable as the batteries, but the point is to keep the DUT and standard readings as tight as possible to reduce error; see #3.

2.

I'd ensure all DMMs connected together are first working properly individually, particularly after a repair. The repaired meter may not be accurate without calibration, but it should at least be precise. Once you are satisfied all is well, connecting them together should not be a problem unless you are concerned at the µV level.

At low voltages, once the other low-level issues (temperature, Seebeck effects, etc.) are stable, you're still left with the leads picking up EMI, so the fewer and shorter the leads involved, the better. To reduce pickup, avoid any unneeded additional connections.

3. There are problems even if the calibrating meter were identical to the 3456A in accuracy, giving a TUR of truly 1:1. The way this uncertainty is minimized is that each measurement comparing the DUT and the standard must show as little difference as practical, tighter than the manual allows. This is done by reducing the limits within which the meter is allowed to vary: the idea of a guard band.

http://www.agilent.com/metrology/pdf/guardband_beginners.pdf

For example, at 24 hours and 10 V, the acceptable range is 9.99990 to 10.00010; this is 8 ppm + 2 digits. To adjust with a TUR of 1:1, the acceptable range must be reduced by at least 4:1, giving a range spanning about 2 ppm. So the acceptable range after adjustment is now roughly 10.00000 to 10.00004 relative to the standard used. The Agilent paper details how to calculate the new limits more accurately, but it's easier to keep the difference between the DUT and the standard nearly nil, making as many subtle adjustments as needed until the two are measuring as tightly together as possible.
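
In rough code form (a simplified sketch of the arithmetic using the 24-hour 10 V figures above; the linear shrink factor is only illustrative, and the Agilent paper gives the rigorous treatment):

Code: [Select]
def guard_banded_limits(nominal, spec_ppm, counts, count_value, tur, target_tur=4.0):
    # Shrink the acceptance band when the test uncertainty ratio is poor.
    full_tol = nominal * spec_ppm * 1e-6 + counts * count_value
    guard = min(1.0, tur / target_tur)       # narrow the band when TUR < target
    tol = full_tol * guard
    return nominal - tol, nominal + tol

# 24 h spec at 10 V: 8 ppm + 2 counts (10 uV per count on the 10 V range)
print(guard_banded_limits(10.0, 8, 2, 10e-6, tur=4.0))  # (9.99990, 10.00010), the full band
print(guard_banded_limits(10.0, 8, 2, 10e-6, tur=1.0))  # band shrunk 4:1, roughly +/-25 uV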

A simple test after adjustment is to measure the same transfer reference simultaneously, say at 10 V. After some time, both meters should show roughly the same mean and variance.

Given that the 3456A must already be out of cal to warrant adjusting it, which would be better: leaving it out of cal, or approximating the 34401A? The 3456A's precision is unaffected either way. Regardless of the TUR, the best that can be achieved is the accuracy of the lesser unit: if the standard is better than the DUT, the best you can achieve is the spec of the DUT; if the standard is worse than the DUT, then at best the DUT will only be as good as the standard.




1. Even after warm-up, short-term (5 minute) fluctuations of some signal sources/power supplies I tried were 50 ppm and 200 ppm, .....

2. What kind of interactions are you talking about? A defective DUT having a much lower impedance and loading the voltage source? A DUT putting out a bias current much larger than specified? Unless you're talking about huge amounts of current or very low voltages, two parallel-connected meters should see the same voltage.

3. Another 6.5-digit DMM is likely to be worse than 1:1. For example, the 1-year specs for the HP 3456A compared to the HP 34401A are 23 ppm : 35 ppm, or about 1:1.5. A 34401A calibrated within the last 90 days would be a little better than 1:1, and a 24 h calibration interval would give you 1.5:1. Much better instruments are needed if you want to calibrate the 3456A to 90-day or even 24-hour specs (8 ppm for 10 VDC).
Best Wishes,

 Saturation
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Sorry Marius: least significant digit, the rightmost digit.

Best Wishes,

 Saturation
 

