Author Topic: ELI5 - How do calibrators take 10V and get it up to 1000V?  (Read 982 times)


Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
ELI5 - How do calibrators take 10V and get it up to 1000V?
« on: October 26, 2021, 08:26:43 pm »
I would ask in the metrology section, but I am just wondering how people take, let's say, the output of an LTZ1000 and ramp it up to 100 or 1000 V. I know the Voltnuts might get into a fight over this.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7821
  • Country: us
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #1 on: October 26, 2021, 08:43:06 pm »
As it happens, I'm finishing up work on a calibrator right now, a Fluke 5101B, so I can tell you in great detail how this one works--but that would be a long post.  I'll just give you a quick run-through and you can ask about any part that isn't detailed enough.

To start, for smaller outputs, the reference is supplied to a DAC that then outputs a specific fraction of that reference.  This is supplied as a reference to a control board which compares two signals, the DAC output and the output of the calibrator, and then produces a control signal, which is sent to a power amplifier that responds and produces the calibrator output.  The initial state, of course, is zero output, so the control board ramps up the control signal until the power amplifier output matches the DAC signal.  That works for the range that corresponds exactly to the DAC output range, which on the 5101B is 0.2 to 2.0 volts.

For higher or lower signals, the output is scaled through the ranging board, which is a complex series of shunts, voltage dividers and relays.  So for the 2.0 to 20 volt range, it would divide the actual output by 10 and send that back to the control board.  For voltages over 20 volts, the system switches to the HV mode, which changes a bunch of relays so that the power amplifier now produces a modified 2 kHz square wave.  That is fed to a large transformer that steps it up by a factor of approximately 10 or 50, depending on the range; the output is then rectified and filtered by an especially tricky little circuit, and that voltage--as high as 1100 VDC--is sent to the ranging board, where it is directed to the output terminals and also divided down and sent back to the control board.

So the short version is that the reference is used as just that: a reference, controlling the output of an entirely separate voltage source by comparison through scaling.

Edit:  The way I've described it is actually not correct, I've mishmashed the AC and DC functions.  I won't bother correcting it here because the point is the same, even if the numbers are wrong.
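For readers who think better in code, here is a minimal sketch of the feedback idea described above. All of the gains, ratios and range limits are invented for illustration; they are not the 5101B's actual values.

```python
# Minimal sketch (not the 5101B's actual firmware or values): the DAC sets the
# target, the ranging divider scales the real output back down, and the control
# stage ramps the power amplifier until the two agree.

DAC_FULL_SCALE = 2.0        # hypothetical DAC reference span, volts
RANGE_DIVIDER  = 1.0 / 10   # hypothetical ranging-board ratio for the 20 V range
AMP_GAIN       = 50.0       # hypothetical power-amplifier gain, volts out per volt of control

def settle_output(target_volts, steps=10_000, step_size=1e-3):
    """Ramp the control signal until the divided-down output nulls against the DAC."""
    dac_setpoint = target_volts * RANGE_DIVIDER      # what the DAC asks for, 0.2-2.0 V
    assert 0.2 <= dac_setpoint <= DAC_FULL_SCALE, "outside the direct DAC range"
    control = 0.0                                    # initial state: zero output
    output = 0.0
    for _ in range(steps):
        output = AMP_GAIN * control                  # power-amplifier output
        feedback = output * RANGE_DIVIDER            # ranging board scales it back down
        error = dac_setpoint - feedback
        if abs(error) < 1e-6:
            break
        control += step_size * error                 # crude integrating control action
    return output

print(settle_output(15.0))   # settles near 15 V
```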
« Last Edit: October 26, 2021, 09:40:51 pm by bdunham7 »
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7942
  • Country: us
  • Retired, now restoring antique test equipment
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #2 on: October 26, 2021, 08:45:05 pm »
One traditional method:
Start with an adjustable or settable power supply that is known to be stable over the duration of the test.  I remember lead-acid batteries used with old-style potentiometers back in the 1960s.
Connect a fixed accurate ratio 10:1 or 100:1 voltage divider to that supply and a variable Kelvin-Varley divider to the 10 V reference.  Adjust the K-V divider to obtain a null in the difference between the two divider outputs.  At null, both dividers are unloaded, since there is no voltage across the floating null meter.
If the high voltage supply is adjustable, you can adjust it to 100 or 1000 V, or 90 or 900 V, with the voltmeter to be calibrated connected;  this totally avoids loading corrections.
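A short worked example of that null condition, with an assumed fixed-divider ratio and an assumed Kelvin-Varley reading:

```python
# Worked example of the null method described above, with assumed values: the
# fixed divider brings the high voltage down to ~10 V, and the Kelvin-Varley
# divider scales the 10 V reference until the difference between the two is zero.

V_REF       = 10.0       # trusted reference, volts
FIXED_RATIO = 100.0      # fixed 100:1 divider on the high-voltage supply (taken as exact here)
kv_setting  = 0.999872   # Kelvin-Varley dial reading at null (made-up value)

# At null: V_hv / FIXED_RATIO == V_REF * kv_setting, and neither divider is loaded.
V_hv = FIXED_RATIO * kv_setting * V_REF
print(f"High-voltage supply is at {V_hv:.3f} V")   # -> 999.872 V
```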

Edit: fix auto-correct induced typo
« Last Edit: October 26, 2021, 08:56:55 pm by TimFox »
 

Offline magic

  • Super Contributor
  • ***
  • Posts: 6758
  • Country: pl
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #3 on: October 26, 2021, 08:49:26 pm »
This one weird trick takes your 100V/1000V and outputs a very accurate 0.1 or 0.01 fraction thereof for comparison with a trusted 10V reference. See posts by Dr Frank in particular.
https://www.eevblog.com/forum/metrology/anyone-else-built-a-hamon-divider/
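For a rough feel of why the Hamon trick gives such an accurate ratio, here is a small numerical sketch (not Dr Frank's design): N nominally equal resistors give a series-to-parallel ratio of N² with an error that is only second order in the individual mismatches.

```python
# Rough illustration of the Hamon principle: take N nominally equal resistors with
# random errors; series resistance divided by parallel resistance is N**2 to within
# errors of second order in the mismatch, even with only ~100 ppm matching.

import random

N = 10
nominal = 10_000.0                                                      # ohms, arbitrary
rs = [nominal * (1 + random.uniform(-1e-4, 1e-4)) for _ in range(N)]    # +/-100 ppm spread

series   = sum(rs)
parallel = 1.0 / sum(1.0 / r for r in rs)
ratio    = series / parallel                                            # ideally exactly N**2 = 100

print(f"ratio = {ratio:.9f}, error = {(ratio / N**2 - 1) * 1e6:.4f} ppm")
```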
 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14172
  • Country: de
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #4 on: October 26, 2021, 09:12:14 pm »
The higher voltages are produced with some kind of amplifier that scales the voltage. There are several methods for checking and correcting the amplifier's gain. One is the Hamon-type divider, used to create a precise 1:10 or 1:100 ratio. Another way is to use a very accurate DAC / ADC to measure the amplifier / divider gain at a low voltage and then, relying on stable parts, apply the same ratio at a higher voltage.

A third method for the step-up, though AFAIK not used in normal calibrators, is to stack a chain of voltage references and then measure / set the individual voltages with enough accuracy. The references in the stack are usually less accurate than an LTZ1000; they only need to be stable for the time of the experiment.
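A sketch of the second approach (measure the gain at a low voltage, then trust it at a high voltage), with invented numbers for the stimulus and readings:

```python
# Sketch of the "measure the gain at low voltage, trust it at high voltage" idea,
# with invented numbers: an accurate ADC characterises a nominally x100 amplifier
# at a level it can read directly; the same gain is then assumed to hold (stable,
# linear parts) when the amplifier is driven from the 10 V reference.

NOMINAL_GAIN = 100.0

# Step 1: calibrate the gain at a low level the ADC can measure well (assumed values).
v_in_low  = 0.100000                 # volts, applied from a DAC
v_out_low = 10.00123                 # volts, read back by the accurate ADC
gain      = v_out_low / v_in_low     # measured gain, 100.0123

# Step 2: apply the 10 V reference and predict the high-voltage output from that gain.
v_ref  = 10.000000
v_high = gain * v_ref
print(f"measured gain {gain:.5f}, predicted output {v_high:.3f} V")   # ~1000.12 V
```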
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 844
  • Country: gb
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #5 on: October 26, 2021, 09:31:07 pm »
OK, it's not an easy thing to do. I see the Hamon divider mentioned, but googling it didn't help me when I was looking the other week, as I didn't quite understand the point of dividing the voltage down even further.

I have a calibrator at work that goes up to 1000V, and it's just a box that I type numbers into and magic happens, so part of me wanted to know how they do it.

I am also interested because I would personally like to build up the new Analog Devices version of the LTZ1000 and see if I could turn it into a basic voltage source, with a few outputs if possible, which I could lend out and learn with at home.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14172
  • Country: de
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #6 on: October 27, 2021, 11:25:42 am »
There are different solutions for the voltage divider / amplifier setting.
The easy one is to have just a fixed set of resistors for the division, which then needs an extra calibration from time to time. Here the Hamon divider can be used to get an accurate 1:10 or 1:100 ratio for the calibration. When done right it can be very accurate, but it is a bit of a slow process with trimming and switching.

For understanding and learning how the voltage scaling works, and to build such a circuit at home, there is no need for a super-low-noise reference. Just a stable 5 V chip, or maybe an LM399, would be good enough for a start.
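As a tiny illustration of the "fixed divider plus occasional calibration" idea, with made-up numbers:

```python
# Sketch of a fixed divider whose true ratio is measured now and then (e.g. against
# a Hamon divider); the stored correction is then applied to every reading.

NOMINAL_RATIO  = 100.0          # the divider is built as 100:1
measured_ratio = 100.0037       # ratio found at the last calibration (invented value)

def hv_from_tap(v_tap):
    """Convert the divider tap voltage back to the high-voltage value."""
    return v_tap * measured_ratio

print(hv_from_tap(10.0002))     # ~1000.057 V instead of the naive 1000.02 V
```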
 

Offline perdrix

  • Frequent Contributor
  • **
  • Posts: 640
  • Country: gb
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #7 on: October 27, 2021, 04:25:57 pm »
You don't divide the reference voltage; in simple terms, you divide the voltage to be checked (e.g. 1000V) with the switch-adjustable divider until you have a null against the 10V reference.

At that point you read off the switches on the Kelvin-Varley divider or Hamon divider and say "That reads a factor of 99.98635".

So your 1000V supply is actually outputting 999.8635V.
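Spelled out as arithmetic (the factor here is just the example value above):

```python
# At null, the unknown output equals the 10 V reference times the factor read
# off the divider switches.

V_REF  = 10.0
factor = 99.98635                # combined ratio read from the divider dials
print(V_REF * factor)            # -> 999.8635 V, about 137 ppm below nominal
```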

David

 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7942
  • Country: us
  • Retired, now restoring antique test equipment
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #8 on: October 27, 2021, 04:39:56 pm »
Many K-V dividers cannot tolerate 1000 V at their input: a typical 100 kΩ input resistance would dissipate 10 W.  Therefore, often a separate fixed-ratio divider is used to divide the high voltage down to roughly 10 V, to be compared with the output of a switchable divider set to a reasonably high ratio, say, > 0.5.
« Last Edit: October 27, 2021, 04:50:51 pm by TimFox »
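Spelling out the power figure quoted above:

```python
# 1000 V across a 100 kΩ Kelvin-Varley input resistance dissipates P = V**2 / R = 10 W,
# far too much for the divider, hence the separate fixed-ratio divider.

V = 1000.0       # volts
R = 100e3        # ohms
print(f"P = {V**2 / R:.1f} W")   # -> 10.0 W
```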
 

Offline Terry Bites

  • Super Contributor
  • ***
  • Posts: 2389
  • Country: gb
  • Recovering Electrical Engineer
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #9 on: October 27, 2021, 04:47:33 pm »
Jim Williams created an excellent series of designs; see LT Application Note 118, page 2.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16607
  • Country: us
  • DavidH
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #10 on: October 27, 2021, 08:28:30 pm »
Jim Williams also shows the design for a 0 to 100 volt reference on page 6 of Linear Technology application note 6.

It comes down to a low-voltage reference divided by a Kelvin-Varley divider, which drives the high-impedance input of a high-voltage non-inverting amplifier with a precision fixed gain.
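A quick sketch of that arrangement, with invented resistor values rather than anything taken from AN 6:

```python
# Sketch of the arrangement described above (values invented): the Kelvin-Varley
# divider scales the low-voltage reference, and a non-inverting high-voltage
# amplifier with a precision fixed gain of (1 + Rf/Rg) does the step-up.

V_REF      = 10.0        # volts from the low-voltage reference
kv_setting = 0.750       # Kelvin-Varley dial setting
RF, RG     = 99e3, 1e3   # precision feedback network -> gain of 100

v_in  = V_REF * kv_setting          # high-impedance amplifier input, 7.5 V
gain  = 1 + RF / RG                 # non-inverting gain, exactly 100 with these values
v_out = gain * v_in
print(f"output = {v_out:.1f} V")    # -> 750.0 V
```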
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7942
  • Country: us
  • Retired, now restoring antique test equipment
Re: ELI5 - How do calibrators take 10V and get it up to 1000V?
« Reply #11 on: October 27, 2021, 08:31:29 pm »
At the op-amp, the comparison between the non-inverting input (the reference through the KV divider) and the inverting input (the output fed back through the high-voltage and low-voltage resistors) is the equivalent of the manual null-meter operation in the traditional method.
 

