Author Topic: Measurement mishaps and calibration cautions.  (Read 1877 times)


Online bdunham7 (Topic starter)

  • Super Contributor
  • ***
  • Posts: 7860
  • Country: us
Measurement mishaps and calibration cautions.
« on: December 20, 2020, 12:47:31 am »
So I repair stuff.  I don't have any need for any type of certification, calibration or the like--no paperwork.  Nonetheless, I still need very accurate measurements at times, such as recently when I was repairing...calibrators!  Specifically, a Fluke 5100B and 5101B.  The 5100B has been repaired and shipped; the 5101B languishes in storage waiting for me to luck into some parts, most importantly a CPU board.  Working with these units has opened my eyes to a number of issues regarding calibration procedures in general and the current measurement implementation in my Fluke 8846A (their TOTL service bench meter, 6.5 digits, 24 ppm basic accuracy).  In both cases it is possible to get much less than you might reasonably hope for, even though what you are getting is in fact printed in black and white.  I've attached the relevant spec pages from the 8846A manual as well as the calibration (or verification) data from a Fluke 5100B being sold on eBay (nothing to do with me or the unit I repaired).  If you want to play along, you can look at those and the manual for the 5100B.  Do that first before reading further and see if you can spot any problems, traps for noobs, or whatever.

http://www.nousnexus.com/Manuals/fluke-5100b-03_manual.pdf

What I'm attempting to do with these calibrators is repair them.  They're broken.  In one case an orange tantalum shorted, burned the traces off the HV supply and set the main power supply on fire.  On the other one there were at least a dozen issues, never mind the dirty, noisy trimpots.  Once they are alive, I want to adjust them as best I can with what I have, so that I'm reasonably sure they can be calibrated.  I'm well aware that the equipment I have is not specified well enough to reliably adjust all of the ranges to spec, but I can hope to get close.  I can actually perform more than half of the calibration process myself.  Also, from other experience I know that my meter is quite a bit more accurate than spec on those ranges I've been able to test, so my stretch goal is to get the unit adjusted well enough that it can go through the calibration with just a verification.  Well, I can hope.  I'm making no representations about that--just that it works, it's close, and the adjustments aren't maxed out one way or the other to get it there.

This worked fairly well until I got to the current measurements, where I had an issue with the accuracy of my meter.  Before I write more, have a look at the photos and think about:

1) What are the problems with the calibration certificate on the eBay unit?  It is only partial and no lab names show, but I've seen similar work from labs claiming NIST traceability and issuing certs.

2) My 8846A is pretty consistently more accurate than spec by about a factor of 5X.  That's actually not unusual given the 99% confidence interval and the fact that I'm pretty close to calibration temp.  However, it gets much worse on a few ranges.  Which ones and why?



« Last Edit: December 20, 2020, 12:56:38 am by bdunham7 »
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline alm

  • Super Contributor
  • ***
  • Posts: 2881
  • Country: 00
Re: Measurement mishaps and calibration cautions.
« Reply #1 on: December 20, 2020, 09:02:09 am »
The performance verification procedure is a bit confusing. It looks like it was written for older instruments like the Fluke 332D/335D DCV calibrators, and then these were later replaced by more modern equipment like the Fluke 5700A when the old equipment was discontinued. But they didn't update the diagrams, like figure 4-6, which uses the 5700A as a differential voltmeter. Also, in figure 4-4, I'm not convinced that the Fluke 720 K-V divider will improve on the linearity of the Fluke 5700A.

As for your questions, it seems like you want people to think about them before you give the answer, so I won't comment on them.

Online bdunham7 (Topic starter)

  • Super Contributor
  • ***
  • Posts: 7860
  • Country: us
Re: Measurement mishaps and calibration cautions.
« Reply #2 on: December 20, 2020, 05:24:48 pm »
Quote from: alm on December 20, 2020, 09:02:09 am
The performance verification procedure is a bit confusing. It looks like it was written for older instruments like the Fluke 332D/335D DCV calibrators, and then these were later replaced by more modern equipment like the Fluke 5700A when the old equipment was discontinued. But they didn't update the diagrams, like figure 4-6, which uses the 5700A as a differential voltmeter. Also, in figure 4-4, I'm not convinced that the Fluke 720 K-V divider will improve on the linearity of the Fluke 5700A.

As for your questions, it seems like you want people to think about them before you give the answer, so I won't comment on them.

Yes, the procedures are confusing and I'm pretty sure nobody pays much attention to them anymore.  A good portion of that equipment list can be replaced by an appropriate reference DMM like the 8508A or 3458A.  In fact, a properly calibrated 8505A or 8506A, which they've listed, could also be used anywhere it achieves a TUR of 4:1 or better over the 5100B.  The 5100B was already an old product when this 'revised' manual was written, and it's almost disturbing that these units are still fairly widely in service.
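For anyone unfamiliar with TUR (test uncertainty ratio), it's just the tolerance of the unit under test divided by the uncertainty of the standard at the same point.  A quick sketch with made-up numbers (nothing below comes from the 5100B or 8506A specs):

    # Test Uncertainty Ratio = UUT tolerance / reference-standard uncertainty,
    # both taken at the same test point.  All numbers here are illustrative only.
    def tur(uut_tolerance, std_uncertainty):
        """Both arguments in the same absolute units (e.g. volts or amps)."""
        return uut_tolerance / std_uncertainty

    # Hypothetical 10 V point: calibrator allowed +/-50 ppm, reference DMM
    # good for +/-10 ppm at that point.
    uut_tol = 10 * 50e-6   # 500 uV
    std_unc = 10 * 10e-6   # 100 uV
    print(f"TUR = {tur(uut_tol, std_unc):.0f}:1")   # 5:1, so better than 4:1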

You're right, the way they use the equipment doesn't make sense to me, although I don't have most of it anyway.  It's just a mishmash of old and new, and I think it mostly reflects the equipment that would have been available when the calibrator first came out.  The 720 probably made sense then.

But go ahead and comment.  I do have one 'answer' but there are lots of things to unpack and I surely haven't thought of all of them.
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline alm

  • Super Contributor
  • ***
  • Posts: 2881
  • Country: 00
Re: Measurement mishaps and calibration cautions.
« Reply #3 on: December 21, 2020, 11:26:32 am »
Before 8.5 digit DMMs with high linearity, and calibrators like the Fluke 5440 and 57xx series, the Fluke 720 KVD was probably the most linear device in many cal labs. Certainly better than the contemporary 335D and similar calibrators for which the manual appears to have been originally written.

Okay, here's what I came up with. I'm sure I missed something, so I would love to know what you came up with.

1) For starters, the list of equipment with serial numbers and calibration certificate references is missing. How do you trace a calibration to anything without it? The tolerance for the DC current ranges on the certificate is a bit wider than the Fluke 6-month spec: 0.03% vs. 0.019% for 1 A DC. And they don't test the points that Fluke specifies. Fluke tests both negative and positive polarity, and two points of each polarity on the 1 mA range: 1.9 mA and 2.1 mA. I'm sure they have good reasons for this. The 1.9/2.1 mA test might be a linearity / clipping test. Fluke doesn't test the current ranges other than 1 mA and 1 A, presumably because these are sufficiently covered by the DCV and resistance tests. I would be comfortable with substituting equipment and e.g. directly measuring the current with a sufficiently specced DMM, but not with removing any points that the original procedure specifies.
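To put those certificate limits in absolute terms at the 1 A point (just the obvious arithmetic):

    nominal = 1.0                        # 1 A DC test point
    cert_limit  = 0.030e-2 * nominal     # limit used on the certificate
    fluke_limit = 0.019e-2 * nominal     # Fluke 6-month spec
    print(f"certificate: +/-{cert_limit*1e6:.0f} uA, Fluke: +/-{fluke_limit*1e6:.0f} uA")
    # -> certificate: +/-300 uA, Fluke: +/-190 uA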

2) For the 8846A, the specs get a lot worse above 1 A. 3 A is a lot worse than 1 A, even though it uses the same shunt. So most likely this error is from self-heating of the internal shunt, which means the error will increase (and eventually stabilize) the longer you measure. In the Fluke procedure with an external shunt, they tell you not to measure for more than 1 minute, and to wait at least 5 minutes between measurements with that shunt. This is for the Fluke 742A-1 shunt that they use, which has a very low temperature coefficient--much lower than the 8846A's internal shunt, I'm sure.
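The self-heating argument is easy to see with a back-of-envelope I²R calculation (assuming a 10 mΩ shunt, which is the value quoted further down in this thread for the high-current ranges):

    # Self-heating scales with I**2 * R.  Assuming a 10 mohm shunt shared by
    # the 1 A, 3 A and 10 A ranges (value quoted below in this thread):
    r_shunt = 0.010                      # ohms
    for i in (1.0, 3.0, 10.0):           # amps
        p = i**2 * r_shunt
        print(f"{i:4.0f} A -> {p*1000:5.0f} mW dissipated in the shunt")
    # 3 A dissipates nine times the power of 1 A, so the shunt runs warmer,
    # its resistance shifts more, and the reading drifts further during a
    # long measurement.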
« Last Edit: December 21, 2020, 07:14:26 pm by alm »
 

Online bdunham7 (Topic starter)

  • Super Contributor
  • ***
  • Posts: 7860
  • Country: us
Re: Measurement mishaps and calibration cautions.
« Reply #4 on: December 21, 2020, 07:50:18 pm »
The certificate or data sheet was an excerpt; I'm assuming there is more, but they didn't show it.  If they had, I'm pretty sure that would be a topic for discussion as well, although as you say, a sufficiently accurate reference DMM can substitute for a lot of that equipment.  I didn't catch that the tolerances were different, but as you figured out, the real issue is that they used an ad-hoc set of test parameters that don't match what the OEM specifies.  The most egregious omission is the lack of negative voltage/current testing.  The 5100B uses a separate negative reference circuit and so on--getting the positive side correct guarantees you nothing on the negative side!  The calibrator could be completely broken and still pass this calibration test.

I can tell you from direct experience that meeting the OEM-specified test points is a lot harder than meeting the ones they have listed.  As you surmise, the reason is that the OEM test points catch the unit at its weakest spots, testing how well the various ranges match up where they 'shift gears' and so on.  And a lot of the test measurements are interdependent--there are a lot of iterative adjustments that force you to compromise between two measurements to minimize the error in both.  Even the OEM-specced points don't fully address the linearity of the DAC, which is another issue altogether.  Fluke has published a paper, which is really an ad for their cal services so I won't link it, but in it they point out that there is no explicit law or rule that says a cal lab must use the OEM procedures.  As long as they document what they did, it's kosher.  So whenever someone offers to sell you something with a 'fresh calibration certificate', the correct response is not "Ooh, great!  Calibrated!" but rather "OK, let me see that".  See this thread:

https://www.eevblog.com/forum/metrology/can-we-believe-a-calibration-certificate/msg3141840/#msg3141840

Now for the 8846A, its specs and its mess of a current measurement implementation.  The answer can be found in the specs, but I didn't see it until I had this issue and looked.  First, the meter is generally more accurate than the 1-year spec.  If you look at the 24-hour, 90-day and 1-year specs, you'll notice that the much tighter 24-hour specs also specify +/- 1°C, while the others allow +/- 5°C.  When these meters are calibrated, they should be calibrated in laboratory conditions after an extended soak at 23°C +/- 1°C.  A calibration certificate that uses the 1-year specs to determine the pass/fail limits is....um...not so good?  So a good portion of the additional error in the 90-day and 1-year specs can be attributed to tempco.  If you take the temperature coefficient listed for temps outside the 18-28°C range and apply it to the 24-hour specs, you'll see that you get pretty close to the 90-day spec.  These meters don't drift much, especially if they are powered down a lot.  So, unofficially at least, if I stay near 23°C, I can count on this meter to be near its 24-hour specifications.
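Just to show the arithmetic I mean--the numbers below are placeholders, not the actual 8846A spec-table entries, so plug in the real values from the manual before drawing any conclusions:

    # Rough model: wider spec ~= 24-hour spec + tempco * extra temperature window.
    # Placeholder numbers for illustration only -- use the real spec-table values.
    spec_24h_ppm     = 15    # 24-hour spec (ppm of reading), stated for +/-1 C
    tempco_ppm_per_c = 5     # tempco (ppm of reading per degree C) from the spec table
    extra_window_c   = 4     # 90-day/1-year specs allow +/-5 C instead of +/-1 C

    estimate_ppm = spec_24h_ppm + tempco_ppm_per_c * extra_window_c
    print(f"24-hour spec plus tempco over the wider window: ~{estimate_ppm} ppm")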

However, if you look at the 24-hour specs, you will see there are two components as usual (usually listed as % of reading + counts, or % of reading + % of range), and the second one, the offset or residual component, is larger than the percent-of-reading error on some, but not all, of the ranges.  Typically the lower ranges have larger offset or residual counts, and often the best accuracy is specified using the REL or ZERO function to zero them out.  However, when attempting to read a current of 1.9 mA (the specified 5100B test point), I found it impossible to get stable readings, and the ZERO function only worked for a second or two.  At 1.0 mA the meter was golden--nice, steady and presumably accurate to 6.5 digits.  So I went back to the specs and calculated the total specified uncertainty for a reading of 1.9 mA and found that it was 0.00295 mA, or 0.16%--over 3X worse than what it would be at 1 mA.  And my readings, though noisy and unstable, did fall just within that range, assuming my other equipment was functioning properly.  That's terrible!  Why?  Because the 8846A has three current shunts when it really ought to have six or seven, and they prioritized low burden voltage (and perhaps shunt heating) over uniform precision in current measurement.
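If you want to reproduce that 0.00295 mA number, the arithmetic looks like this.  The percentage coefficients here are my assumption of the usual spec form (they happen to reproduce the figure above--check them against the actual 8846A spec table):

    def dmm_uncertainty(reading, rng, pct_reading, pct_range):
        # Standard "+/-(% of reading + % of range)" style DMM spec.
        return reading * pct_reading / 100 + rng * pct_range / 100

    # 1.9 mA read on the 10 mA range; coefficients are assumptions chosen to
    # reproduce the 0.00295 mA figure quoted above.
    u = dmm_uncertainty(reading=1.9, rng=10.0, pct_reading=0.05, pct_range=0.020)
    print(f"+/-{u:.5f} mA  ({100*u/1.9:.2f} % of a 1.9 mA reading)")
    # The range term (0.002 mA) dominates because 1.9 mA sits at only about a
    # fifth of the 10 mA range -- that's the offset/residual component at work.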

If you look at the current ranges and the shunt values, you'll see that they only match up with the meter's lowest voltage range of 100 mV on three ranges--1 mA with the 100R shunt, 100 mA with the 1R shunt and 10 A with the 10 mR shunt.  For the oddball 400 mA and 3 A ranges, which were presumably put there in response to some real or perceived customer demand, the meter actually displays one less significant digit, which reflects the realities of matching up the shunts with a voltage range without an intermediate gain amplifier.  But for the 100 uA, 10 mA and 1 A ranges--the ones with the large residual or offset component in the uncertainty--they've done something different.  Just truncating a digit would make it obvious that they aren't real ranges but rather just the bottom tenth of the next range up.  Adding a gain amplifier, like the 8808A has, would be an additional expense, and looking inside the meter I didn't see one.  So instead, as far as I can determine, they just exposed an additional digit from the ADC--a sort of digital gain.  The 8846A is nominally a 1.2 million count meter, but on certain statistical displays it shows 10 million counts, and using the PC software I believe you can get 8 digits without any missing codes.  Like many meters, the internal resolution is much higher than what they display, and for good reason--those hidden digits are generally just noise.  So they just shift all the digits one to the left, hide the leading zero (since the reading on these ranges is always less than 0.1 of the next range up) and expose an additional digit...of noise.  If they did this on voltage by creating a new 10 mV range, people would quickly figure out "hey--this is just noise".  But precision current measurement on the 10 mA range will lurk in the background for much longer, until it finally snags someone like me.
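To make the shunt/range mismatch concrete, here is the full-scale voltage each current range develops across the shunt I believe it shares (the shunt values are the ones above; pairing the in-between ranges with the next shunt up is my inference from the behavior, not something Fluke documents):

    # Full-scale voltage at the ADC for each current range.  Shunt values are
    # the ones quoted above; pairing the 100 uA / 10 mA / 1 A ranges with the
    # next shunt up is my inference, not a documented fact.
    ranges = [
        (100e-6, 100.0),   # 100 uA range on the 100 ohm shunt
        (1e-3,   100.0),   # 1 mA   range on the 100 ohm shunt
        (10e-3,  1.0),     # 10 mA  range on the 1 ohm shunt
        (100e-3, 1.0),     # 100 mA range on the 1 ohm shunt
        (1.0,    0.010),   # 1 A    range on the 10 mohm shunt
        (10.0,   0.010),   # 10 A   range on the 10 mohm shunt
    ]
    for full_scale_a, shunt_ohm in ranges:
        mv = full_scale_a * shunt_ohm * 1000
        print(f"{full_scale_a:8.4f} A full scale -> {mv:5.1f} mV across the shunt")
    # The 1 mA / 100 mA / 10 A ranges use the full 100 mV input range; the ones
    # in between only ever see 10 mV, i.e. the bottom tenth of it, which is
    # where the extra displayed digit of noise comes from.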

I know, RTFM.  But more than that, know your test instruments!  They all have characteristics, some of which could be called flaws, others limitations.  They all interact with your circuit to some extent.  You can't ignore that with any sort of precision measurement.  The notion that "hey, it's calibrated and that's all I need to know" only works when the instrument is much more accurate than you need and its interaction with your circuit is well below the precision you need.

« Last Edit: December 21, 2020, 08:54:47 pm by bdunham7 »
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

