Author Topic: Influence of switch resistance in Hamon Dividers  (Read 12176 times)


Offline Dr. Frank

  • Super Contributor
  • ***
  • Posts: 2382
  • Country: de
Application of the 752A and further bias data
« Reply #75 on: June 07, 2022, 11:00:35 am »
Motivated by Dr. Philipp to determine the HV linearity of my 5442A and of the 3458A, I used the 752A as a high-precision 10:1 and 100:1 divider for the 3458A, used solely in its 10V range, to measure the exact output voltages of my 5442A up to 1kV, relative to its calibrated 10.00000V reference value.



At first, I measured the bias currents at various input voltages. These values have changed compared to the first table above.
I calculated the expected ratio error for each data point. The highest error is 0.08ppm, so that might already affect high-precision measurements.
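For context, this kind of bias-induced ratio error can be estimated with a one-liner. The output resistance below is an assumed illustrative value, not the actual 752A figure; the numbers are merely chosen so the result lands near the 0.08ppm worst case mentioned above:

```python
def bias_ratio_error_ppm(i_bias_a, r_out_ohm, v_out_v):
    """Error (in ppm of the output voltage) caused by a DMM bias current
    flowing through the divider's Thevenin output resistance."""
    return i_bias_a * r_out_ohm / v_out_v * 1e6

# Assumed example: 20 pA of bias into a 40 kOhm output resistance at 10 V
print(f"{bias_ratio_error_ppm(20e-12, 40e3, 10.0):.3f} ppm")  # prints "0.080 ppm"
```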

I calculated each 'exact' output voltage of the 5442A by combining the precise divider ratios (0.2 and 0.5ppm uncertainty) with the precise readings from the 3458A, and normalizing to the 10V reference value, as the 3458A is not calibrated exactly to the 10V baseline of my reference group.
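That normalization can be written out explicitly; the readings below are hypothetical placeholders, not my actual data:

```python
def exact_output(reading_v, divider_ratio, ref_reading_v, ref_nominal_v=10.0):
    """Scale a 3458A reading up by the divider ratio and normalize to the
    10 V reference, removing the DMM's absolute gain error."""
    return (reading_v / divider_ratio) * (ref_nominal_v / ref_reading_v)

# Hypothetical 100 V point through the 10:1 divider, with the 3458A
# reading the 10 V reference as 10.0000050 V:
v = exact_output(reading_v=10.0000123, divider_ratio=0.1,
                 ref_reading_v=10.0000050)
```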

Afterwards, I applied these 'exact' output voltages directly to my 3458A, to estimate the ACAL (ratio) uncertainty of its 100V and 1kV ranges, and to determine its high-voltage linearity.



As repetitions of this process show, the ACAL ratio transfers are mostly precise to 0.2..0.3ppm for the 100V range, and to about 0.2..0.5ppm for the 1kV range. That may change over successive ACALs.

The HV linearity, determined by comparing the 100V and 1000V readings in the 1kV range, is about -1.5..-2ppm, much better than the specified 12ppm.
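The linearity figure quoted here can be computed as follows; the readings are hypothetical, picked only to give roughly -2ppm:

```python
def hv_linearity_ppm(r_100, r_1000):
    """Deviation of the 1000 V reading from ten times the 100 V reading,
    both taken in the same 1 kV range, expressed in ppm of 1000 V."""
    return (r_1000 - 10.0 * r_100) / 1000.0 * 1e6

print(f"{hv_linearity_ppm(100.00005, 999.9985):+.2f} ppm")  # prints "-2.00 ppm"
```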

I also propose these measurements for TiN's CAL FEST.

Now I had a closer look at the linearity of the 20V, 250V and 1000V ranges of my 5442A. Note the change in the bias measurements, and that for several low readings one already gets excessive errors of -0.2ppm when using a DMM this way.




One can directly see some quirks of the 5442A.
The lowest value of each range deviates strongly.
The linearity specification of 0.5ppm is violated in the 20V and 250V ranges, and in the 1kV range the error should not be as high as -1ppm at 1000V.

This indicates a subtle error in my instrument. Subtle in the sense that it only just exceeds the specification and was hardly detectable; only the low uncertainty of the 752A made it visible.

I already knew that my instrument was not OK for negative voltages: it has a slightly smaller gain factor there, and negative voltages do not stabilize quickly but take too long to reach their nominal value.
None of that restricted its use, as it was all within the main specification.

Now I suspected the sense current compensation.
These currents are internally up to 500µA for + Sense and up to 200µA for - Sense. As in the 732B reference, they are compensated internally to increase the accuracy of the output voltage.

I set the 5442A to external sense and measured both sense currents with two handheld DMMs, a BM869 and a 121GW, each resolving 1..10nA.

It turned out that the + Sense works correctly, i.e. it quickly regulates to < 200nA. The - Sense shows exactly the anticipated error: it changes very slowly and stabilizes only after a long time, at several µA, depending on the output voltage and its sign.
So I will investigate this suspected error mode around OpAmp U1 further.
Frank
« Last Edit: June 07, 2022, 11:48:59 am by Dr. Frank »
 
The following users thanked this post: TiN, e61_phil, alm, eplpwr

Offline e61_philTopic starter

  • Frequent Contributor
  • **
  • Posts: 962
  • Country: de
Re: Fluke 752A Reference Divider - Ratio Consistency Check
« Reply #76 on: June 12, 2022, 10:06:46 pm »
Is it the bias current only, or are the input impedance, isolation resistance, and so on also relevant?
Additionally, Fluke only hopes that the bias of their 8588 will in most cases be around 5pA. Fluke also does not measure the bias over different voltages, or am I missing something? I guess that each DMM will show varying bias currents over its input voltage range.

Hi all,

Frank and I have already started discussing that point. In my opinion the absolute value of the bias current is less important; what is very important is that this current stays stable during the self-cal procedure.

Fluke describes the procedure with the 8508A in the 5730A calibration manual. They nulled the meter with 0V applied to the input (0V, not open!).
The input current is a current (surprise, surprise), which means a constant current develops a constant voltage across a constant impedance. In my opinion the Fluke procedure makes absolute sense. Applying 0V (or a short) presents the same source impedance to the null detector as the self-cal with the low-impedance 20V does. And you know that the bridge voltage must be 0V as long as you don't apply a voltage to the input. Therefore, that is the correct mode to zero your meter.
This cancels the bias current correctly only at 0V, which means that if your bridge isn't perfectly balanced, the null meter's reading is influenced by the difference in current between 0V and the actual bridge voltage. But that doesn't matter as long as your readings are still monotonic: you don't need to measure an unbalanced voltage accurately, you just need to find 0V.
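To put rough numbers on why the source impedance during zeroing matters (all values below are assumptions for illustration, not measurements from this thread):

```python
# A null meter's bias current develops an offset across the source
# impedance it sees, so the zero must be taken under the same impedance
# as the actual bridge measurement. Illustrative, assumed values:
i_bias = 50e-12       # 50 pA null-meter bias current (assumed)
r_src_open = 1e9      # open input: very high source impedance (assumed)
r_src_0v = 40e3       # calibrator at 0 V in OPERATE: low impedance (assumed)

offset_open = i_bias * r_src_open   # 50 mV: a meaningless zero
offset_0v = i_bias * r_src_0v       # 2 uV: representative of the real bridge
print(f"{offset_open*1e3:.0f} mV vs {offset_0v*1e6:.0f} uV")  # "50 mV vs 2 uV"
```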

I measured the input current of my Keithley 182 null meter several times over the day, and the current was very reproducible (K182_input_current.png). One can also see that the slope is not very steep, which means the current is almost constant across the +/-1µV range of interest.

To evaluate whether the K182 is suitable for self-calibrating a Fluke 752A (big thanks to High Voltage!!), I ran the following self-cal procedure and verified the zero afterwards with my Fluke 845AR.

1. Set the calibrator (connected to the input of the 752A) to 0V (in OPERATE!!).
2. Zero the K182.
3. Set the calibrator to 20V.
4. Adjust the 752A until the K182 reads zero (within a few 100nV).

That worked perfectly for the 1:10 calibration, but I got ~8µV of imbalance (measured with the 845) every time I tried to calibrate the 1:100 range. That would lead to almost 2ppm of ratio error.

To figure out where the problem lay, I measured the common-mode current that flows into the K182 as soon as it no longer sits at calibrator LO. To test this I used my Keithley 617 and connected both inputs of the K182 to the HI input of the K617. The internal voltage source of the K617 was placed between the GUARD/shield of the K182 and the inputs.
That measurement revealed more than 100pA at 10V (K182_CM_current.png), which explains the 8µV across the balance arm. The curve looks very much like a simple resistance of ~70GΩ. I also tested the Fluke 845AR this way, and its current was well below 200fA at 10V. In my opinion this isolation resistance is far more important than the input bias current.
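The numbers are consistent with a simple leakage model; the balance-arm resistance below is an assumed value, chosen only to reproduce the observed ~8µV:

```python
v_cm = 10.0      # common-mode voltage seen by the K182 in 1:100 mode
r_iso = 70e9     # ~70 GOhm isolation resistance, as the measured curve suggests
r_arm = 56e3     # balance-arm resistance the leakage flows through (assumed)

i_leak = v_cm / r_iso     # ~143 pA of common-mode leakage current
v_err = i_leak * r_arm    # ~8 uV offset across the balance arm
print(f"{i_leak*1e12:.0f} pA -> {v_err*1e6:.1f} uV")  # prints "143 pA -> 8.0 uV"
```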

To overcome this problem, I changed the self-cal procedure slightly:

1. Short the input of the 752A.
2. Apply 10V to the shorted input (LO of the calibrator is internally connected to GUARD on the calibrator).
3. Null the K182 (the bridge voltage is again 0, but the meter sees the same CM voltage as during the self-cal).
4. Remove the short and apply 20V to the input of the 752A.
5. Adjust the 752A.

This time the 845AR also agreed with the zero bridge voltage in 1:100 mode.

Conclusion: you need not only a meter with low input bias, but also one with very good insulation.

As Frank already mentioned, the input current is also very important during measurements with the 752A. I also measured the input current of my 3458A (likewise very stable over many hours) and calculated the resulting error if one measures the 752A output in the 10V range.
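As a sketch of that error calculation (the input currents and output resistance below are assumed placeholder values, not my measured data):

```python
r_out = 40e3  # assumed Thevenin output resistance of the 752A output

# (output voltage V, DMM input current A): hypothetical data points
points = [(1.0, 5e-12), (10.0, 20e-12)]

for v_out, i_in in points:
    err_ppm = i_in * r_out / v_out * 1e6
    print(f"{v_out:5.1f} V: {err_ppm:+.3f} ppm")
```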
 
The following users thanked this post: Dr. Frank, alm, MegaVolt, eplpwr

