Electronics > Beginners
Improving ADC measurement precision
nuclearcat:
I want to measure lead-acid battery voltage on an ESP8266. The approximate range is 12-14V.
In fact, I think even 0.1V resolution is enough for me, so this is more a theoretical question.
The ADC is 10-bit, 0-1V. As usual that is probably more of a marketing number, but I guess I can get a real 10 bits with a little oversampling.
Even if I use 5% tolerance resistors, 14k/1k (I can always calibrate the divider), the corner-case maximum voltage (14V across a 13.3k/1.05k divider) is 1.024V, still OK.
In the opposite worst case the 14V maximum maps to only 0.85V (~9.7 effective bits), which means ~0.016V resolution referred to the battery.
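The corner cases above can be checked numerically. A minimal sketch, using the post's 14k/1k divider, 5% tolerance, and 10-bit 0-1V ADC:

```python
# Worst-case check of a 14k/1k divider with 5% resistors feeding a
# 10-bit, 0-1V ADC. Numbers follow the post; battery max is 14V.
R_TOP, R_BOT, TOL = 14_000.0, 1_000.0, 0.05
V_MAX, ADC_FS, ADC_COUNTS = 14.0, 1.0, 1024

def divider_out(v_in, r_top, r_bot):
    """Output of a resistive divider, taken across r_bot."""
    return v_in * r_bot / (r_top + r_bot)

# Corner 1: top resistor 5% low, bottom 5% high -> highest ADC voltage
v_hi = divider_out(V_MAX, R_TOP * (1 - TOL), R_BOT * (1 + TOL))
# Corner 2: top 5% high, bottom 5% low -> lowest ADC voltage
v_lo = divider_out(V_MAX, R_TOP * (1 + TOL), R_BOT * (1 - TOL))

# In the low corner only v_lo of the 1V range is used at 14V,
# so one LSB referred back to the battery is:
lsb_at_battery = (V_MAX / v_lo) * (ADC_FS / ADC_COUNTS)

print(f"high corner: {v_hi:.4f} V")                     # ~1.024 V
print(f"low corner:  {v_lo:.4f} V")                     # ~0.850 V
print(f"LSB at battery: {lsb_at_battery*1000:.1f} mV")  # ~16 mV
```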

What if I add some inexpensive R2R opamp (MCP6041?) in an inverting configuration, shifting the 12-14V range down with a 12V virtual ground and applying a gain of 0.5?
That way my measurement would span a 2V window, which might give ~0.002V resolution, but I add the opamp's own imperfections into the equation.
Is it worth doing it this way if I want to improve the resolution of the measurement?
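The resolution claim in the question can be verified with a quick sketch. This assumes an ideal front-end (the opamp's offset, noise, and gain error are deliberately ignored, which is exactly the caveat raised above):

```python
# Resolution math for the offset-and-gain idea: map the 12-14V window
# onto the 0-1V ADC by subtracting a 12V "virtual ground" and applying
# a gain of 0.5. Opamp imperfections are ignored in this ideal sketch.
V_LO, V_HI = 12.0, 14.0      # battery window of interest
OFFSET, GAIN = 12.0, 0.5     # subtract, then scale
ADC_FS, ADC_COUNTS = 1.0, 1024

def to_adc(v_batt):
    """Ideal front-end: (v_batt - OFFSET) * GAIN."""
    return (v_batt - OFFSET) * GAIN

# The 12-14V window fits the ADC range exactly:
assert abs(to_adc(V_LO) - 0.0) < 1e-9
assert abs(to_adc(V_HI) - ADC_FS) < 1e-9

# One ADC LSB referred back to the battery:
lsb_at_battery = (ADC_FS / ADC_COUNTS) / GAIN
print(f"LSB at battery: {lsb_at_battery*1000:.2f} mV")  # ~1.95 mV
```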
Kleinstein:
If one is only interested in changes of the voltage, it can help to subtract the bulk of the voltage before sending the remainder to the ADC. However, this requires the subtracted part to be really stable, so it takes quite stable resistors and a good reference (even at the 10-bit level).

Oversampling mainly improves on noise and quantization errors, but it does not help much with precision. µC-internal ADCs can have additional drift that limits precision. With such a slowly changing voltage one can usually oversample enough that noise and quantization are not the limit; the ADC's precision, linearity, and reference are. So the important point is not whether the ADC is 8, 10, or 12 bits, but how much offset drift and gain drift it has. Input leakage can also be an issue with a high-impedance source like a divider.
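The point that averaging removes noise but not offset can be shown with a toy simulation. This is a sketch under assumed error figures (a hypothetical 4-LSB offset error and 2-LSB noise), not data for any real ADC:

```python
import random

# Toy simulation: oversampling averages out noise and quantization,
# but a fixed ADC offset error survives unchanged in the result.
random.seed(0)
LSB = 1.0 / 1024           # 10-bit, 1V full scale
OFFSET_ERR = 4 * LSB       # hypothetical fixed offset error (4 LSB)
NOISE = 2 * LSB            # noise sigma; some noise is needed for
                           # oversampling to gain resolution at all
V_TRUE = 0.9000

def sample(v):
    """One ADC conversion: add offset and noise, quantize to 1 LSB."""
    noisy = v + OFFSET_ERR + random.gauss(0, NOISE)
    return round(noisy / LSB) * LSB

avg = sum(sample(V_TRUE) for _ in range(4096)) / 4096
print(f"averaged reading: {avg:.5f} V")
# The noise is averaged down, but the residual error is still
# ~OFFSET_ERR (~3.9 mV): no amount of averaging removes it.
```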
iMo:

--- Quote ---In fact, I think even 0.1V resolution is enough for me, so this is more a theoretical question.
--- End quote ---
You may divide the 14V down to, say, 1.4V, subtract 1.2V (with a difference opamp against a 1.2V reference), and amplify the difference so it fits inside the 0-1V ADC range.
It is a big complication, however. If I were you I would stay with the 0.1V resolution.
PS: mind that resolution, precision, and accuracy are three different things.
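The numbers in this scheme can be worked through explicitly. The 1/10 divide ratio and the gain below are assumptions chosen so the 12-14V window maps exactly onto 0-1V:

```python
# Worked numbers for the divide / subtract / amplify scheme:
# divide by 10, subtract a 1.2V reference with a difference amplifier,
# then amplify the remainder to fill the 0-1V ADC range.
DIV, V_REF = 1 / 10, 1.2
ADC_FS, ADC_COUNTS = 1.0, 1024
GAIN = ADC_FS / (14.0 * DIV - V_REF)     # 1.0 / 0.2 = 5

def to_adc(v_batt):
    """Ideal chain: divide, subtract reference, amplify."""
    return (v_batt * DIV - V_REF) * GAIN

assert abs(to_adc(12.0) - 0.0) < 1e-9     # 12V -> bottom of range
assert abs(to_adc(14.0) - ADC_FS) < 1e-9  # 14V -> top of range

# One ADC LSB referred back to the battery:
lsb_at_battery = (ADC_FS / ADC_COUNTS) / (DIV * GAIN)
print(f"LSB at battery: {lsb_at_battery*1000:.2f} mV")  # ~1.95 mV
```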
rstofer:
Read Chapter 4 of "Op Amps For Everyone", Section 4.3.2.  What you can do with an op amp and four resistors is subtract off, say, 12V, keep whatever is left over, and scale it to fit the ADC.  13.8V is the usual charge voltage for a lead-acid battery, so if you designed for 12-15V over the ADC range, you would be in pretty good shape.

You will be solving the case y = mx - b, where b is the offset (say 12V) and m is the ADC range divided by the applied voltage range (what is left over after the 12V is subtracted).  If the MCU used a 5V ADC input and you designed for 12V..17V (a 5V range), then m = 1.

If you wanted the same 5V battery range over a 3.3V ADC, m = 3.3/5.0 = 0.66

You will probably have to scale Vin to fit within 0V..Vcc of the op amp.  Make sure to use a rail-to-rail op amp so you can get the full range of output voltages.

https://web.mit.edu/6.101/www/reference/op_amps_everyone.pdf

LTspice is a good place to start.  Input a ramp voltage of 0..15V and see what the output looks like.  You can probably derive Vref from Vcc.

Some math required...
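Some of that math can be sketched out directly. A minimal solve for m and b, using the post's 12-17V window into a 3.3V ADC, plus the back-conversion firmware would need:

```python
# Solving "y = m*x - b" for the front-end: given the battery window
# [x_lo, x_hi] and an ADC input range of [0, y_fs], pick m and b so
# that x_lo maps to 0 and x_hi maps to y_fs. Numbers follow the
# post's example of a 12-17V window into a 3.3V ADC.
def solve_front_end(x_lo, x_hi, y_fs):
    m = y_fs / (x_hi - x_lo)   # gain: ADC range over battery range
    b = m * x_lo               # offset term in y = m*x - b
    return m, b

m, b = solve_front_end(12.0, 17.0, 3.3)
print(f"m = {m:.3f}, b = {b:.3f}")   # m = 0.660, b = 7.920

# Back-conversion: recover the battery voltage from the ADC reading.
def battery_volts(y_adc, m, b):
    return (y_adc + b) / m

assert abs(battery_volts(0.0, m, b) - 12.0) < 1e-9
assert abs(battery_volts(3.3, m, b) - 17.0) < 1e-9
```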

floobydust:
I mentioned in your other thread:
The ESP8266 has a crappy A/D, not good enough here: noisy, non-linear, 10-bit 0-1V input range, and poor accuracy (you need to calibrate it, and you have to shut off the RF circuit to get a reasonable reading). I believe the internal 1V reference also does poorly over temperature.

Just add an external A/D converter.