Hi,
Thanks, but I don't really care about the reference, I just care about the ADC itself. When I apply a known reference, how well does it actually read the voltages?
I had found that the 1.1 V reference can be quite a bit off, but I assume that once it is measured it holds for the same temperature. I can't find a temperature plot for the reference, so all I can do is assume that it varies like any other bandgap reference around 1.1 V, which isn't too much. But I don't want to concentrate on the reference itself, just the measurement accuracy given a constant known reference of 5.000 volts.
In other words, if I measure 4.100 volts, is it really 4.100, or 4.101, 4.102, 4.103, 4.110, etc.? And this would be after a single-point calibration (I know multiple points can do better, but I am trying to characterize the ADC without extensive calibration).
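For a sense of scale, here is a tiny sketch of what one count is worth, assuming a 10-bit converter (that width is my assumption, adjust for the actual part) and the constant 5.000 V reference:

```c
#include <stdio.h>

/* Rough scale check: the size of one ADC count, assuming a 10-bit ADC
   (my assumption) and a 5.000 V full-scale reference. */
int main(void)
{
    const double vref   = 5.000;          /* known reference, volts    */
    const int    counts = 1 << 10;        /* 10-bit ADC -> 1024 counts */
    const double lsb    = vref / counts;  /* volts per count           */

    printf("1 LSB = %.4f V (%.1f mV)\n", lsb, lsb * 1000.0);
    return 0;
}
```

With that assumption one count is roughly 4.9 mV, so a 4.100 vs 4.103 difference is within about one LSB, while 4.110 would be a couple of counts off.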
Here is how it would go...
Measure a constant DC voltage with a high-quality meter.
Measure the same voltage with the ADC, using an ADC adjust factor of (float) 5.000 volts.
Using the ADC reading and the quality meter reading, compute the true adjust factor. Say it comes out to 4.935 volts.
Change the 5.000 to 4.935, then test again to verify, then test over the whole range, say at 0.100 V, 0.500 V, 1.000 V, 2.000 V, 3.000 V, and 4.000 V.
Compare the readings from the ADC with those 6 quality meter readings.
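Here is a minimal sketch of that procedure in C; the 10-bit count, the function names, and the specific numbers are mine, purely for illustration, not measured values:

```c
#include <stdio.h>

#define ADC_MAX_COUNTS 1024.0   /* assuming a 10-bit ADC (placeholder) */

/* Convert a raw ADC count to volts using the current adjust factor. */
static double counts_to_volts(unsigned raw, double adjust)
{
    return (double)raw * adjust / ADC_MAX_COUNTS;
}

/* Single-point calibration: given one raw reading and the voltage the
   quality meter reports for the same input, compute the true adjust factor. */
static double compute_adjust(unsigned raw, double meter_volts)
{
    return meter_volts * ADC_MAX_COUNTS / (double)raw;
}

int main(void)
{
    /* Hypothetical numbers just to show the flow. */
    unsigned raw   = 811;    /* ADC count at the calibration point   */
    double   meter = 3.908;  /* what the quality meter read, volts   */

    /* Step 3: replace the 5.000 guess with the true adjust factor. */
    double adjust = compute_adjust(raw, meter);
    printf("true adjust factor = %.3f V\n", adjust);

    /* Verify: the calibration point should now read back the meter value. */
    printf("ADC now reads %.3f V at the calibration point\n",
           counts_to_volts(raw, adjust));

    /* Then re-read the test points (0.100, 0.500, ... 4.000 V) and compare
       counts_to_volts(raw_i, adjust) against the meter at each point.     */
    return 0;
}
```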