Precision RTD measurement - correct for tempco of ref resistor?
| Kleinstein:
--- Quote from: max_torque on October 12, 2019, 09:39:00 pm ---wow, that ^^^ is certainly going to town in the name of accuracy! I don't need to be anything like as good as that!

Question for ADC experts: What is the difference between adding gain in front of the ADC, vs simply lowering the reference voltage? I guess gain can be added with better components (lower noise), whereas the noise of the ADC is more likely fixed by the choice of ADC used. For example, if I use an MCP3550 ( http://ww1.microchip.com/downloads/en/devicedoc/21950c.pdf ), which can accept a Vref down to 100 mV, what is the actual effect of using such a "low" reference voltage? If the noise of the ADC is fixed (2.5 uV RMS according to the datasheet), then with a 100 mV reference that noise swamps everything beyond about 15 bits?
--- End quote ---

How far one can go with lowering the reference voltage depends on the ADC. The main disadvantage is the noise. There may not be full specs for a rather low reference voltage: things like bias, drift etc. may be tested only at a more normal reference. It can still be OK, but may need extra testing if it is for more than simple home use.

The MCP3550 allows a very simple circuit, but is not very good on noise. Other sigma-delta ADCs often offer an additional internal gain option, which often improves the noise somewhat. The internal gain is also usually very stable, as it is done by multiple sampling and not a real voltage gain through resistors. External amplification can be lower noise, but it also needs to be stable and thus may need relatively expensive resistors (or networks) to set the gain. The uncertainty in that gain adds to the errors. The idea of directly using a high-resolution ADC is to avoid those gain-setting resistors. In modern times a 24-bit ADC may be used as a lower-cost replacement for 2 precision resistors :-DD.

For the simple circuit, using the reference resistor directly for the ADC reference, it can be a little tricky to turn off the current. Reversing the current could work with CMOS switches that swap the current leads. However, this will add some leakage currents and cause transients on switching. The reference input may need some extra capacitance for the ADC to work really linearly. This can extend the transients in time, so switching would need to be rather slow.
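A quick numerical illustration of that trade-off (my own sketch, not from the datasheet beyond the quoted 2.5 uV figure): assuming the ADC's input-referred noise stays roughly constant as Vref is lowered, the noise-limited resolution shrinks with the reference:

--- Code: ---
# Sketch: noise-limited resolution vs. reference voltage, assuming the
# input-referred ADC noise (the 2.5 uV RMS quoted from the MCP3550 datasheet)
# stays roughly constant as Vref is lowered.
import math

ADC_NOISE_RMS = 2.5e-6   # V RMS (quoted figure, assumed independent of Vref)

for vref in (5.0, 2.5, 1.0, 0.4, 0.2, 0.1):
    # bits of the input range that sit above the RMS noise floor
    noise_limited_bits = math.log2(vref / ADC_NOISE_RMS)
    print(f"Vref = {vref*1000:6.0f} mV -> noise-limited resolution ~ {noise_limited_bits:4.1f} bits")
--- End code ---

With a 100 mV reference this comes out around 15 bits, matching the estimate in the quote: lowering Vref does not reduce the absolute noise, it just makes each code smaller.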
| max_torque:
I'm looking mainly for repeatability and stability of measurement, rather than absolute accuracy. I need to determine a temperature between around -10 and 110 degC to an absolute accuracy of 0.05 degC (0.1 degC would be acceptable). I expect the device to be regularly calibrated (multipoint cal), probably checked every 3 months and recalibrated if it falls outside the spec. Cal points will probably be 0 degC and 100 degC, with perhaps a 50 degC check in the middle.

The MCP3553 can sample at up to 60 samples per second at 20.6 bits, and my process has a very long time constant, on the order of tens of seconds per degree (because it's a large mass of water), so I will be able to oversample and remove AC noise in software. As such, the DC offsets are probably more of a concern, and the variability of those offsets with time is, I guess, more important than their basic magnitude (because the calibration will remove fixed offsets).

Using a reference resistor to generate the voltage reference for the ADC directly is good for a low parts count, and doesn't introduce any additional significant offsets into that task; however, that resistor will need to be carefully sized. My RTD is going to change between roughly 90 and 140 ohms, so to use the biggest "span" of the ADC possible, the reference resistor needs to be reasonably close to the upper value. However, as the ref resistor gets smaller, the Vref of the ADC falls and the internal ADC noise effectively increases, i.e. more bits are "lost" to each noise quantum:

200 ohm ref resistor at 1 mA = 200 mV Vref; at 20.6 bits each bit is 0.126 uV, so the claimed 2.5 uV of noise is 4.3 bits, and my ADC range used is 75% of the available range.

400 ohm ref resistor at 1 mA = 400 mV Vref; at 20.6 bits each bit is 0.252 uV, so the claimed 2.5 uV of noise is 3.3 bits, and my ADC range used is 37.5% of the available range.

In reality, I suspect the answer will be the answer, so to speak, and the effects that dominate due to layout etc. won't be changed much by minor tuning of the component values?
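As a quick sanity check on those numbers, here is a minimal sketch (my own, assuming the simple ratiometric hookup: a single 1 mA excitation current through both the RTD and the reference resistor, and the 20.6-bit / 2.5 uV figures quoted above):

--- Code: ---
# Sketch of the ref-resistor trade-off: Vref, effective LSB size, bits lost
# to noise, and the fraction of the ADC span actually used by a 90..140 ohm RTD.
# Assumes a single 1 mA excitation current and the figures quoted above.
import math

I_EXC = 1e-3                 # excitation current, A
NOISE_RMS = 2.5e-6           # ADC input-referred noise, V RMS (datasheet claim)
EFF_BITS = 20.6              # effective resolution at 60 SPS (datasheet claim)
R_RTD_MAX = 140.0            # upper end of the expected RTD range, ohms

for r_ref in (200.0, 400.0):
    vref = I_EXC * r_ref                      # reference voltage from the ref resistor
    lsb = vref / 2**EFF_BITS                  # size of one effective code
    noise_bits = math.log2(NOISE_RMS / lsb)   # codes swallowed by the noise, in bits
    span_used = I_EXC * R_RTD_MAX / vref      # fraction of the range the RTD reaches
    print(f"Rref = {r_ref:3.0f} ohm: Vref = {vref*1e3:3.0f} mV, "
          f"LSB = {lsb*1e6:.3f} uV, noise ~ {noise_bits:.1f} bits, "
          f"span used = {span_used*100:.0f} %")
--- End code ---

This reproduces the 0.126 / 0.252 uV LSB and 4.3 / 3.3 bit figures; with a 140 ohm upper limit the used span works out nearer 70% and 35%, but the conclusion is the same either way.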
| Kleinstein:
From the datasheet it looks like it does not make a large difference whether a 200 or 400 ohm resistor is used as the reference. The main limiting point is ADC noise and offsets. It looks like the noise does not get better with a very low reference (it may make a difference going from 5 V down to 3 V). The points to observe may be the resistance and tempco of the copper traces and the self-heating effect. Another point could be capacitance at the reference input and maybe stray RF.
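On the self-heating point, a rough estimate (my own sketch; the 0.05 K/mW self-heating coefficient is an assumed, typical value for a thin-film Pt100 in stirred water, so check the actual sensor datasheet):

--- Code: ---
# Rough self-heating estimate at 1 mA excitation. The self-heating
# coefficient is an assumed typical value for a thin-film Pt100 in stirred
# water; free-air values are roughly ten times worse.
I_EXC = 1e-3                    # excitation current, A
SELF_HEATING_K_PER_MW = 0.05    # assumed sensor self-heating coefficient

p_rtd_mw = I_EXC**2 * 140.0 * 1e3                 # dissipation in the RTD at 140 ohm, mW
dt_mk = p_rtd_mw * SELF_HEATING_K_PER_MW * 1e3    # temperature rise, mK

print(f"RTD dissipation ~ {p_rtd_mw:.2f} mW -> self-heating ~ {dt_mk:.1f} mK")
# The reference resistor dissipates a similar 0.2-0.4 mW; there the concern is
# the value shift through its tempco rather than a measured-temperature error.
--- End code ---

A few mK in water is small compared to the 50 mK target, but the same current in still air could already matter.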
| max_torque:
I'm going to knock together a quick test PCB and see how it performs, I think! Too many unknowns to work out at the moment!

Has anyone got any recommendations for precision resistors I can use as reference resistors for system checking & calibration? 100 ohms (0 degC) and around 140 ohms (~100 degC).
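For picking those check/cal values, a small sketch (assuming the sensor is a standard IEC 60751 Pt100, alpha = 0.00385) of the Callendar-Van Dusen equation above 0 degC:

--- Code: ---
# Standard Pt100 resistance at the proposed cal/check points, assuming an
# IEC 60751 sensor (alpha = 0.00385). Above 0 degC: R(t) = R0*(1 + A*t + B*t^2).
R0 = 100.0        # ohms at 0 degC
A = 3.9083e-3     # IEC 60751 coefficients
B = -5.775e-7     # (the C term only applies below 0 degC, ignored here)

def pt100_resistance(t_degc):
    """Resistance of a standard Pt100 at t_degc (valid for t >= 0 degC)."""
    return R0 * (1 + A * t_degc + B * t_degc**2)

for t in (0.0, 50.0, 100.0, 110.0):
    print(f"{t:5.1f} degC -> {pt100_resistance(t):6.2f} ohm")
--- End code ---

So if it is a standard Pt100, the "100 degC" check resistor wants to be nearer 138.5 ohm than 140 ohm.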