Author Topic: Voltmeter scaling resistor front-end and ADC calibration coefficients  (Read 2346 times)


Offline diyaudio (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 683
  • Country: za
I'm designing a voltmeter and current measurement system that forms part of an electronic load. Let's tackle the voltmeter first.

I'm using a 16-bit ADC (AD7708) and want to measure the voltage ranges listed below. The ADC is pretty neat for DC applications and is used in the BK8500 instrument.

Here is what I want to achieve:

0.1 V to 25 V (1 mV resolution, 0.05% accuracy)
0.1 V to 120 V max (10 mV resolution, 0.05% accuracy)


I've written the painful driver code (based on the datasheet) that handles all the initialization and start-up routines: zero calibration, internal DSP filtering, ADC gain, and single-ended operation mode. So far so good; it works, and I can measure with good accuracy against my Fluke 87V from 1 mV to 2.5 V (ADC full scale).
Note: the ADC uses an external precision 2.5 V reference (ADR421) and is powered from a low-noise supply (DP832).
http://www.analog.com/media/en/technical-documentation/data-sheets/ADR420_421_423_425.pdf

Question 1

I'm confused about how to design a scaling resistor front-end with a suitable gain amplifier for the 25 V and 120 V ranges.
So far my attempts are failing. For the 25 V range I used a 1/10 divider (100 µV out per 1 mV in) with R1 = 18 k and R2 = 2 k (where R1 and R2 form a simple resistor divider).
Are there any good examples I can use as a guide for this sort of design?
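As a starting point, here is a quick sanity-check sketch of the divider arithmetic, assuming the 2.5 V ADC full scale set by the ADR421 reference. The 120 V range values (98 k + 2 k for a 1/50 divider) are an illustrative assumption, not a recommendation:

```python
def divider(vin, r1, r2):
    """Return the divider output voltage and its Thevenin (source) impedance."""
    vout = vin * r2 / (r1 + r2)
    z_th = r1 * r2 / (r1 + r2)   # impedance seen by the ADC input or buffer
    return vout, z_th

# 25 V range: 1/10 divider -> 25 V maps exactly to the 2.5 V ADC full scale
vout, z = divider(25.0, 18e3, 2e3)
print(f"25 V range:  {vout:.3f} V out, {z/1e3:.2f} kohm source impedance")

# 120 V range: hypothetical 1/50 divider (98 k + 2 k) -> 120 V maps to 2.4 V
vout, z = divider(120.0, 98e3, 2e3)
print(f"120 V range: {vout:.3f} V out, {z/1e3:.2f} kohm source impedance")
```

The source impedance matters because an unbuffered high-impedance divider driving the ADC input directly will introduce gain error from the ADC's input current.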
   
Question 2

How does a design incorporate calibration coefficients? Is there a standard approach for this?
How many calibration coefficients are needed to correct a measurement? Do they apply across the entire measurement range, or only to low-resolution readings?

« Last Edit: July 13, 2015, 02:27:32 pm by diyaudio »
 

Offline macboy

  • Super Contributor
  • ***
  • Posts: 2254
  • Country: ca
Question 1

I'm confused about how to design a scaling resistor front-end with a suitable gain amplifier for the 25 V and 120 V ranges.
So far my attempts are failing. For the 25 V range I used a 1/10 divider (100 µV out per 1 mV in) with R1 = 18 k and R2 = 2 k (where R1 and R2 form a simple resistor divider).
Are there any good examples I can use as a guide for this sort of design?

This is how to do it. But there are a few things to keep in mind. As mojo-chan already said, consider the impedance of that resistive divider (it is 1.8 kohm) and the minimum input impedance of the ADC. You can and should put a capacitor across R2 to lower the AC impedance and reduce noise. Size it for fast enough response for your needs. You will likely need to buffer the signal with a precision op-amp as mojo-chan suggested.
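To put numbers on the capacitor suggestion: the cap across R2 sees the divider's Thevenin impedance (R1 || R2), so the response time follows directly. The 100 nF value here is a hypothetical example, not from the thread:

```python
import math

r1, r2 = 18e3, 2e3
c = 100e-9                      # hypothetical 100 nF across R2
z_th = r1 * r2 / (r1 + r2)      # 1.8 kohm for the 18 k / 2 k divider
tau = z_th * c                  # time constant of the front-end RC
f_c = 1 / (2 * math.pi * tau)   # -3 dB corner frequency

# Settling to 16-bit accuracy takes roughly ln(2**16) ~= 11 time constants
print(f"tau = {tau*1e6:.0f} us, 16-bit settling in about {11*tau*1e3:.1f} ms")
print(f"corner frequency f_c = {f_c:.0f} Hz")
```

Size the capacitor so settling is fast enough for your measurement update rate; a larger cap lowers noise but slows step response.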

You must also consider self-heating in the resistors. At 25 V input, your 18 k R1 will have 22.5 V across it, which is 28 mW of power. A "typical" 1/4 W through-hole resistor may have a thermal resistance of 125 degC/W, so at 28 mW this gives a 3.5 degC rise in temperature. The 2 k R2 will heat much less. If you use plain metal film resistors, they may have a 100 ppm/degC temperature coefficient, resulting in around 350 ppm of error in the value of R1 between hot and cold; but you can only tolerate 500 ppm of total error from all sources (op-amp buffer thermal drift, divider drift, reference drift and absolute error, ADC offset/drift/non-linearity, etc.). On the 120 V range you will likely have even more self-heating error, exceeding the error budget.

To control this error, you could reduce self-heating by increasing the resistance, or use low temperature coefficient resistors. Increasing resistance can introduce other errors (noise, offset voltage due to bias currents, etc.). Look at something like Dale RN55 resistors; they are not too exotic or expensive and should perform well given your specs. High-end multimeters use special thin-film resistor networks that put all divider resistors on one substrate to keep their temperatures and temperature coefficients matched as closely as possible. You could use something like that, but they are expensive.
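The self-heating estimate above can be reproduced for both ranges. The 125 degC/W thermal resistance and 100 ppm/degC tempco are the "typical" figures from the discussion, and the 98 k resistor on the 120 V range is an assumed example value:

```python
def self_heating_error(v_across, r_ohms, r_th=125.0, tc_ppm=100.0):
    """Return (power in W, temp rise in degC, resistance drift in ppm)."""
    p = v_across**2 / r_ohms     # dissipation in the resistor
    dt = p * r_th                # temperature rise above ambient
    return p, dt, dt * tc_ppm    # drift = rise * tempco

# 25 V range: R1 = 18 k with 22.5 V across it
p, dt, err = self_heating_error(22.5, 18e3)
print(f"25 V range R1:  {p*1e3:.1f} mW, {dt:.1f} degC rise, {err:.0f} ppm drift")

# Hypothetical 120 V range: R1 = 98 k with 117.6 V across it
p, dt, err = self_heating_error(117.6, 98e3)
print(f"120 V range R1: {p*1e3:.0f} mW, {dt:.1f} degC rise, {err:.0f} ppm drift")
```

The 120 V case lands well beyond the 500 ppm budget, which is why a higher total divider resistance or a low-tempco part is needed on that range.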

Don't worry about getting the ratio to be exactly 1/10 or 1/50 since you can and will calibrate digitally.
   
Question 2

How does a design incorporate calibration coefficients? Is there a standard approach for this?
How many calibration coefficients are needed to correct a measurement? Do they apply across the entire measurement range, or only to low-resolution readings?
You will likely calibrate at zero volts and at a known voltage near full scale, applying a simple linear adjustment of ax+b, where x is the ADC value, a is the gain adjustment, and b is the offset (zero) adjustment. The ADC linearity should be a tiny portion of your 500 ppm (0.05%) error budget so instead of trying to adjust that out, just accept the error. You will need to apply a gain to the ADC reading in any case, since you have 65535 ADC counts full scale at ~25000 mV full scale input.
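A minimal sketch of that two-point calibration: record the raw ADC code at 0 V and at a known near-full-scale voltage, solve for the gain a and offset b, then map every subsequent reading through v = a*x + b. The cal-point numbers below (a 12-count zero offset, a 24.000 V reference point) are hypothetical:

```python
def solve_calibration(code_zero, code_span, v_span):
    """Return (a, b) so that v = a*code + b passes through both cal points."""
    a = v_span / (code_span - code_zero)   # volts per ADC count (gain)
    b = -a * code_zero                     # offset so that code_zero -> 0 V
    return a, b

# Hypothetical calibration run: 12 counts at 0 V, 62929 counts at 24.000 V
a, b = solve_calibration(12, 62929, 24.000)

def adc_to_volts(code):
    return a * code + b

print(f"a = {a:.6e} V/count, b = {b:.6f} V")
print(f"half-scale code 31470 -> {adc_to_volts(31470):.4f} V")
```

Store a and b per range (25 V and 120 V each get their own pair, since each divider has its own ratio error) in non-volatile memory at calibration time.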
 

