Author Topic: Voltage calibrator question  (Read 1384 times)


Offline nnillsTopic starter

  • Contributor
  • Posts: 36
  • Country: nl
Voltage calibrator question
« on: October 08, 2019, 06:07:19 pm »
Hi,

I read a document by Fluke about artifact calibration and thought about how I would do something like this (for voltage only). The document mentions PWM DACs. Now the question is: if the 10 V reference is, for example, 9.9999000 V, how does one make a 10 V output? I thought of a boost converter whose output is compared to the reference with a sensitive voltmeter. That would require using in-amps and such. Is this a reasonable approach? And what do the real pros use?
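
For concreteness, here is a rough numerical sketch (Python) of what I mean by "compared to the reference with a sensitive voltmeter" - the divider ratio and voltages are just assumed numbers, not anything from the Fluke document:

Code: [Select]
# Nulling a boosted output against a known-but-not-round reference.
# All values are assumed for illustration only.
V_REF_ACTUAL = 9.9999000   # calibrated value of the "10 V" reference
V_TARGET     = 10.0000000  # desired calibrator output

# To compare the 10 V output against the 9.9999000 V reference with a null
# meter, the output first has to be scaled by V_REF_ACTUAL / V_TARGET.
DIVIDER_RATIO = V_REF_ACTUAL / V_TARGET   # ~0.99999000

def null_error(v_out):
    """Voltage the null meter would see: divided output minus reference."""
    return v_out * DIVIDER_RATIO - V_REF_ACTUAL

# Servo idea: adjust the boost converter until the null meter reads ~0.
print(null_error(10.0000000))  # ~0 V when the output is on target
print(null_error(10.0001000))  # ~+100 uV when the output is 100 uV high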

Small sub-question:
Would the following be a good calibration procedure (roughly as in the sketch after this list)?
1. measure with a short
2. measure 10V @ 10V range
3. measure the transfer ratio (divider): 1V @ 10V range compared to the 10V input divided by the transfer divider
4. measure 1V @ 1V range with the 10V input divided by the transfer divider
5. measure 100mV @ 100mV range with the 1V @ 1V range divided by the transfer divider
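
And a rough sketch of how I imagine the corrections would chain together (Python; all readings are made up, and this is just one reading of the procedure above):

Code: [Select]
# Step-down calibration sketch: each step only uses quantities
# established in earlier steps. All numbers are invented.
V_SOURCE = 10.0000000            # assumed true value of the 10 V source

offset   = 0.0000012             # step 1: reading with the input shorted
read_10v = 10.0000150            # step 2: 10 V source on the 10 V range
gain_10v = (read_10v - offset) / V_SOURCE          # 10 V range gain

# Step 3: the same source through the (nominally 10:1) transfer divider,
# read at the 1 V level on the now-characterised 10 V range.
read_div      = 1.0000030
v_divided     = (read_div - offset) / gain_10v
divider_ratio = v_divided / V_SOURCE               # actual divider ratio (~0.1)

# Step 4: 1 V range gain from the same divided source.
read_1v = 1.0000050
gain_1v = (read_1v - offset) / (V_SOURCE * divider_ratio)

# Step 5: divide once more (~100 mV) for the 100 mV range gain.
read_100mv = 0.1000004
gain_100mv = (read_100mv - offset) / (V_SOURCE * divider_ratio**2)

print(gain_10v, divider_ratio, gain_1v, gain_100mv)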

 

Online Echo88

  • Frequent Contributor
  • **
  • Posts: 826
  • Country: de
Re: Voltage calibrator question
« Reply #1 on: October 08, 2019, 08:49:53 pm »
I can't follow the idea; normally it's done like this:

Calibrators like the Fluke 5440 or 57XX series use a PWM DAC and two zener references in series (2 × ~7 V = ~14 V), which gives them the ability to go from 0 to ±11 V.
Using a trimmed 10 V reference for the DAC isn't necessary, since all calibration constants are applied in software, and the internal zener-to-10 V converter that would otherwise be needed would worsen the overall specs, as it contributes additional long-term drift, TC and noise.
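
As a rough sketch of the "constants in software" idea (Python; the stack value and PWM resolution below are assumed example numbers, not the actual Fluke implementation):

Code: [Select]
# PWM DAC referenced to a characterised (not trimmed) zener stack.
V_REF_STACK = 13.998642      # assumed measured value of the 2 x ~7 V stack
PWM_BITS    = 26             # assumed PWM/DAC resolution
FULL_SCALE  = 1 << PWM_BITS

def pwm_code_for(v_out):
    """PWM code that produces v_out, using the stored reference value."""
    duty = v_out / V_REF_STACK           # correction applied purely in software
    return round(duty * FULL_SCALE)

code     = pwm_code_for(10.0000000)
v_actual = code / FULL_SCALE * V_REF_STACK
print(code, v_actual)   # quantisation error well below 1 uV at this resolution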

Your calibration procedure looks very similar to the Fluke 752A, which is usually used for this type of calibration (when the left-side voltage reference + DAC is seen as a calibrator).
 

Offline nnillsTopic starter

  • Contributor
  • Posts: 36
  • Country: nl
Re: Voltage calibrator question
« Reply #2 on: October 09, 2019, 08:49:59 am »
And would the 1 V and 100 mV ranges be divided before or after the DAC? And is the 100 V range amplified ×10, or is 110 V regulated down, divided by 10 and compared to the 10 V DAC output? Or can I look this up in some kind of service manual?
In any case, thank you for the first response.

Edit: I initially thought that the divider and the null meter would be in the instrument.
« Last Edit: October 09, 2019, 08:58:09 am by nnills »
 

Online Kleinstein

  • Super Contributor
  • ***
  • Posts: 14203
  • Country: de
Re: Voltage calibrator question
« Reply #3 on: October 09, 2019, 04:14:23 pm »
Some calibrators include a null meter, or even a good-resolution low-level meter used as a null meter.

The DAC usually only works with a fixed input voltage, so the higher ranges are amplified from the DAC output in some way.
The amplification could be checked against an external Hamon-type divider or internally against the PWM DAC.
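
For the Hamon idea, a small numerical sketch (Python, made-up resistor errors) of why the series/parallel ratio of nominally equal resistors is so well defined even when the individual resistors are off:

Code: [Select]
# Hamon-divider principle: with n nominally equal resistors, the ratio of
# the series value to the parallel value is n^2, with first-order resistor
# errors cancelling. The errors below are invented.
n      = 10
R_NOM  = 10_000.0
errors = [120e-6, -80e-6, 45e-6, -150e-6, 60e-6,
          -30e-6, 95e-6, -10e-6, 70e-6, -110e-6]   # fractional errors
R = [R_NOM * (1 + e) for e in errors]

r_series   = sum(R)
r_parallel = 1 / sum(1 / r for r in R)

ratio = r_series / r_parallel
print(ratio / n**2 - 1)   # deviation from ideal n^2 is only second order (~1e-8 here)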

For 1 V they may just use the PWM DAC. For even lower voltages a divider (can be inside the instrument) may be used after the DAC.
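
As a sketch, assuming a stored calibration constant for such a divider (the ratio below is invented):

Code: [Select]
# Producing 100 mV from the ~1 V DAC output via a characterised
# (not exactly 10:1) internal divider.
DIVIDER_RATIO = 0.0999987        # measured ratio, stored as a constant
v_dac = 0.1 / DIVIDER_RATIO      # voltage the DAC is set to (~1.000013 V)
v_out = v_dac * DIVIDER_RATIO    # exactly 100 mV at the output
print(v_dac, v_out)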

There are a few manuals around on how the calibrators work.
 

