I'm looking mainly for repeatability and stability of measurement, rather than absolute accuracy.
I need to determine a temperature between around -10 and 110 degC to an absolute accuracy of 0.05 degC (0.1 degC would be acceptable). I expect the device to be regularly calibrated (multi-point cal): probably checked every 3 months and recalibrated if it falls outside spec. Cal points will probably be 0 degC and 100 degC, with perhaps a 50 degC check in the middle.
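To make the cal scheme concrete, here is a minimal sketch of the sort of thing I have in mind (the raw counts are entirely made up): derive gain and offset from the 0 degC and 100 degC readings, then use the 50 degC reading as a pass/fail check against the 0.05 degC spec.

```python
# Hypothetical two-point calibration with a mid-scale check.
# Raw counts below are invented for illustration only.

def two_point_cal(raw_at_0, raw_at_100):
    """Return (gain, offset) mapping a raw ADC reading to degC."""
    gain = 100.0 / (raw_at_100 - raw_at_0)
    offset = -gain * raw_at_0
    return gain, offset

def apply_cal(raw, gain, offset):
    """Convert a raw reading to a temperature in degC."""
    return gain * raw + offset

# Made-up readings from the 0 and 100 degC baths:
gain, offset = two_point_cal(raw_at_0=524288, raw_at_100=727000)

# Reading taken in the 50 degC bath; recalibrate if the check fails.
t50 = apply_cal(625600, gain, offset)
in_spec = abs(t50 - 50.0) <= 0.05
```

Note that the 50 degC residual also captures RTD nonlinearity, which a straight two-point cal can't remove; if it routinely fails, that would argue for a genuine three-point (quadratic) fit instead.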
The MCP3553 can sample at up to 60 samples per second at 20.6 effective bits, and my process has a very long time constant, of the order of tens of seconds per degree (because it's a large mass of water), so I will be able to oversample and remove AC noise in software. As such, the DC offsets are probably more of a concern, and the variability of those offsets over time is, I guess, more important than their basic magnitude (because the calibration will remove fixed offsets).
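The averaging I'm planning is nothing more than a boxcar over a window much shorter than the process time constant; for uncorrelated noise the RMS falls as sqrt(N), so 10 s of 60 sps readings buys roughly a 24x reduction. A sketch (the sample source and noise figure are simulated, not measured):

```python
# Boxcar averaging of a window of ADC readings; samples here are
# simulated as DC plus Gaussian noise purely for illustration.
import random

SAMPLE_RATE_HZ = 60   # MCP3553 maximum conversion rate
WINDOW_S = 10         # well inside the process time constant

def boxcar_average(samples):
    """Plain boxcar (decimating) average of one window of readings."""
    return sum(samples) / len(samples)

random.seed(0)
dc = 0.100  # volts, a made-up steady-state signal level
window = [dc + random.gauss(0, 2.5e-6)
          for _ in range(SAMPLE_RATE_HZ * WINDOW_S)]

print(boxcar_average(window))  # close to dc; noise down ~sqrt(600)
```

This assumes the noise really is uncorrelated sample to sample; any mains pickup that survives the MCP3553's internal notch would want the window length chosen as a whole number of mains cycles.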
Using a reference resistor to generate the ADC's voltage reference directly is good for a low parts count, and doesn't introduce any significant additional offsets, but that resistor needs to be sized carefully. My RTD will swing between roughly 90 and 140 ohms, so to use the biggest possible "span" of the ADC the reference resistor needs to be reasonably close to the upper value. However, as the reference resistor gets smaller, the Vref of the ADC falls while the ADC's noise voltage stays fixed, so that noise spans more LSBs, i.e. more bits are "lost" to noise. For example:
200 ohm ref resistor at 1 mA gives a 200 mV Vref; at 20.6 bits each LSB is 0.126 uV, so the claimed 2.5 uV of noise spans 4.3 bits, and my RTD signal uses 70% of the available range (140 mV of 200 mV).
400 ohm ref resistor at 1 mA gives a 400 mV Vref; at 20.6 bits each LSB is 0.252 uV, so the claimed 2.5 uV of noise spans only 3.3 bits, but my RTD signal uses just 35% of the available range (140 mV of 400 mV).
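The two cases above can be put in one small calculation, taking 140 ohms as the top of the RTD swing and the 20.6-bit / 2.5 uV figures from the MCP3553 datasheet:

```python
# LSB size, noise expressed in bits, and fraction of ADC range used,
# for candidate reference resistors in the ratiometric RTD circuit.
import math

ENOB = 20.6        # effective resolution claimed for the MCP3553
NOISE_V = 2.5e-6   # claimed output noise, V RMS
I_EXC = 1e-3       # excitation current, A
R_RTD_MAX = 140.0  # upper end of the RTD swing, ohms

def tradeoff(r_ref):
    vref = I_EXC * r_ref                    # ratiometric Vref
    lsb = vref / 2 ** ENOB                  # volts per count
    noise_bits = math.log2(NOISE_V / lsb)   # bits lost to noise
    span_used = (I_EXC * R_RTD_MAX) / vref  # fraction of range used
    return lsb, noise_bits, span_used

for r in (200.0, 400.0):
    lsb, nb, span = tradeoff(r)
    print(f"{r:.0f} ohm: LSB {lsb * 1e6:.3f} uV, "
          f"noise {nb:.1f} bits, span {span:.0%}")
```

Note that halving Vref costs exactly one extra noise bit but doubles the fraction of range used, so the noise floor in *temperature* terms is unchanged; the real argument between the two values will come from other effects (drift, layout, self-heating).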
In reality, I suspect it will all come out in the wash, so to speak, and that the effects that dominate (layout, drift, etc.) won't be changed much by minor tuning of the component values?