Author Topic: Calibration adjustments based on specific "As Found" Z540 cal lab measurements  (Read 2100 times)


Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
Trying to define a methodology for myself as I've started working with a local cal lab that is reasonably priced, that I like, and that has all of the regular capabilities of a run-of-the-mill facility (Z540, 17025, etc.). It's been around for decades and is the main servicer for some local government entities.

Just got back my first unit, which passed a full Z540 on all scales - yay for me, but much more importantly for the member of this community who did the adjustment work and pulled off this uncanny feat. You know who you are! The bronze is on order for your statue!!...

Now, having, for instance, a set of certified 10V, 100V, and 100mA readings, and having had my DP 8200 reported slightly out of spec (by the same lab... this was just out of kindness), would it be advisable to perform the field cal procedure and adjust the 10V, 100V, and 100mA pots to the exact values indicated by the "as found" measurements, which were performed on my meter when it was exposed to the standard voltages and current?

I'm pretty sure the octade adjustments should not be touched, but the field cal adjustments are essentially there for exactly this reason - fine, slight adjustments if the unit has drifted a bit.

Thank you in advance for sharing your thoughts.

For reference, I enclose the respective page from the SM going through the field cal adjustments. I grabbed the full manual from the internet for free, but if anyone wants to see the schematic excerpts with the adjustment pots, I can post those too.
« Last Edit: February 18, 2023, 04:50:14 pm by Rax »
 

Offline Conrad Hoffman

  • Super Contributor
  • ***
  • Posts: 1931
  • Country: us
    • The Messy Basement
Depends on how far out it was reported to be and how confident you are of meeting the requirements they give for field calibration, plus having a stable environment. FWIW, I did the entire cal on mine, having a couple KVDs and, at the time, a working null meter. It wasn't that difficult, though I remember using some tricks on the grounding (like a pickoff point at the center of a balanced ground wire) to get that last ppm. As for the field cal, if you're not happy you can always do it again!
 
The following users thanked this post: Rax

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
Depends on how far out it was reported to be and how confident you are of meeting the requirements they give for field calibration, plus having a stable environment. FWIW, I did the entire cal on mine, having a couple KVDs and, at the time, a working null meter. It wasn't that difficult, though I remember using some tricks on the grounding (like a pickoff point at the center of a balanced ground wire) to get that last ppm. As for the field cal, if you're not happy you can always do it again!

Conrad - thank you very much for the very sensible, down-to-earth points. They said it's "very close" but that it would fail cert. Which to me sounds like the exact application for the field cal.

I should also mention I've touched the offset control (R60 - technically part of the metrology calibration), as it had a bit too much offset for comfort. Doing that didn't seem to harm or shift anything else.
 

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
They said it's "very close" but it'd fail cert.

To add to that point, if I measure the DP8200's 10V output, the newly calibrated meter shows it being about 4-5ppm from 10.00000V. Based on just that, I don't think it should technically fail calibration verification, but they did say that, so...
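For what it's worth, the ppm arithmetic behind a statement like "4-5ppm from 10.00000V" is simple enough to sanity-check in a few lines. A minimal sketch - the 9.999955V reading is a made-up example, not the actual lab value:

```python
# Deviation of a measured value from nominal, expressed in parts per million.
nominal = 10.00000    # volts, the target output
measured = 9.999955   # volts, a hypothetical "as found" reading

ppm = (measured - nominal) / nominal * 1e6
print(f"deviation: {ppm:+.1f} ppm")  # about -4.5 ppm for this example
```

Running the same arithmetic against the lab's certified readings gives a direct number to compare to the 8200's accuracy spec.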

My plan is essentially to make sure there's nothing wrong with it (such as too large an offset to pass), bring it as close to spec as I can, and then have them measure it. If it fails, this is one of the places that both repairs and actually adjusts instrumentation to meet calibration spec.
 

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
having a couple KVDs and, at the time, a working null meter.

One of my issues is the lack of such instrumentation... I don't see it as likely that I'll run into an opportunity to acquire a KVD. Which is why I'm choosing this path of transferring standards from newly calibrated units.

I did the entire cal on mine, having a couple KVDs and, at the time, a working null meter. It wasn't that difficult, though I remember using some tricks on the grounding (like a pickoff point at the center of a balanced ground wire) to get that last ppm. As for the field cal, if you're not happy you can always do it again!

Interesting point, thank you for that. One thing that's not clear to me is the requirement to do the 100V field cal adjustment at 70V. Is this because the unit would top out at 100V and therefore not be able to adjust at the top of the adjustment range? The reason I'm asking is that my "characterization point" (where I have an "as found" measurement) is obviously not at 70V, but at 100V. I'd adjust with respect to that.

But if the 70V point is an absolute need, I can probably keep tweaking at 70V and verifying at 100V until I get it as close as possible. It'd take much more time, but I'm OK with that.
« Last Edit: February 18, 2023, 08:17:07 pm by Rax »
 

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7860
  • Country: us
One thing that's not clear to me is the requirement to do the 100V field cal adjustment at 70V. Is this because the unit would top out at 100V and therefore not be able to adjust at the top of the adjustment range? The reason I'm asking is that my "characterization point" (where I have an "as found" measurement) is obviously not at 70V, but at 100V. I'd adjust with respect to that.

It is possible that there is some slight, known non-linearity due to a voltage coefficient of the divider or some other issue that produces a small response curve, and they found it is best handled (minimized over the whole range) with calibration at 70V.  Your dilemma is that you believe you have an available calibration point at ~100V that is more accurate than what you would have at 70V, since your ~100V point does not depend on the linearity of the DMM.  It's a tricky deal, but I'd start by setting it at 70V as in the manual and then checking at 100V and possibly a whole bunch of other points, certainly 10V.  Then you'll have a good idea of how much general uncertainty you have in the whole process.  If, for example, you can get your 0V, 10V, and 100V points all spot-on by your measures (all on the 100V range) but your 70V point is off by 11ppm, then you really won't know for sure where that small error comes from without a lot more work.
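The gain-versus-linearity distinction above can be illustrated numerically. A sketch with made-up readings (none of these are real measurements): a deviation that is the same in ppm at every level behaves like a gain error, which a single field-cal pot can remove, while a level-dependent deviation points at linearity, which a single-point adjustment can only split the difference on.

```python
# Hypothetical "as found" readings on the 100 V range: nominal -> measured.
points = {
    10.0: 10.000050,    # +5 ppm
    70.0: 70.000770,    # +11 ppm (level-dependent: suggests linearity)
    100.0: 100.000500,  # +5 ppm
}

for nominal, measured in points.items():
    ppm = (measured - nominal) / nominal * 1e6
    print(f"{nominal:6.1f} V: {ppm:+6.1f} ppm")
```

With numbers like these, a gain tweak could zero the 10V and 100V points, but the 70V point would still sit ~6ppm away - the kind of residual that needs more investigation, as described above.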
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 
The following users thanked this post: Rax

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
As I'm doing this - and using a Fluke 8502A, which is the freshly calibrated unit - I'm starting to explore its finer modes of use in more detail. For instance, filtering - I'm not at all sure when this is appropriate and/or needed.

I am getting slightly different behavior from the readings with the filter on vs. off (not necessarily a different value, but I'd say the slight fluctuations behave a bit differently). Should I have it on or off? To be clear, I am performing the measurements at the output of the DP8200.
 

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
I'd start by setting it at 70V as in the manual and then checking at 100V and possibly a whole bunch of other points, certainly 10V.

Sounds like a good plan. I think my initial step here will be to do 70V, then make slight successive adjustments to get the 100V point as close to the characterization point/value as I can. Then look up and down the range.
 

Offline Testtech

  • Regular Contributor
  • *
  • Posts: 118
  • Country: us
The Data Precision 8200 is very temperature sensitive. You also need to make sure that the relays are clean; if not, the measured value will be erratic. Set it for 10V and lightly tap the relays - the reading may change a bit, but it should return almost exactly. Same at 100V. If not, you can remove the relay covers and clean the contacts.
The 68K resistor in the output amp is undersized and often bad; I always replace them.
Sometimes the 100mV divider has drifted out of tolerance and must be trimmed very slightly to get into spec.
You should not use nickel-plated leads for the measurement; use copper, due to thermal EMFs.

 I made an adjustment cover that has holes to access the pots, leaving the cover on. You need at least a 7-1/2 digit DMM to make the adjustments; I have always used an 8-1/2, i.e., 1281, 7081, 3458A.
Make an adjustment, let it sit awhile; check, adjust, repeat. You may need a few passes of the entire adjustment procedure.
Sometimes the resistors have drifted enough that an adjustment won't have enough range. Then you need to very slightly trim something.

These are a decent compact DC calibrator within their limitations.
 
The following users thanked this post: Rax, srb1954, JK21

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
The Data Precision 8200 is very temperature sensitive.

Interesting. Maybe its internal Vref needs to be blanketed thermally - kind of how I've seen it done by @TiN (I think) and others.
 

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
I went ahead and adjusted all three ranges; it went pretty well and stuck to value (except maybe the mA adjustment, which is not nearly as stable - which I think is to be expected). I took my time yesterday, over many hours of fine, successive tweaks and resettling, rechecking at both 10V and 100V. It seems there's just a slight dependency between the two, but I feel it's on value now.

BTW - I simply adjusted at 100V, as I couldn't quite figure out a practical way to perform the adjustment at 70V and check at 100V. I didn't touch any of the octades and such, as, again, I think the intention is for those to last a very long time without needing adjustment. But we'll see about that; with some luck I may take this to the cal lab to be looked at as soon as Monday.

 

Offline Testtech

  • Regular Contributor
  • *
  • Posts: 118
  • Country: us
It is not primarily the reference, the entire analog system is temp sensitive, from the op amps to the resistors and everything else.
 
The following users thanked this post: Rax

Offline RaxTopic starter

  • Frequent Contributor
  • **
  • Posts: 906
  • Country: us
It is not primarily the reference, the entire analog system is temp sensitive, from the op amps to the resistors and everything else.

Testtech - I continued this track in the 8200's own thread ("repair and other fun stuff"). I hope this doesn't spread the conversation around too much, but rather helps keep it focused and topic-centered.
 

