Hello,
I would like to ask you guys about one thing... I've been reading up all day on sensors, resolution, calibration, precision, etc., so my brain is a bit foggy now.
Let's suppose I have a PT100 temperature probe, chosen for its good accuracy (compared to type-K thermocouples) and its extended usable range (compared to digital I2C temperature sensors).
Now the signal has to reach a microcontroller several metres away, so naturally I thought of using a PT100-to-4-20 mA transmitter, which seems to be pretty much the industry standard, at least in France.
On the receiving side I see two main solutions: measuring the voltage across a precision 100R or 250R burden resistor directly with an ADC, or putting a small op-amp-based converter, such as
this, or even this single-chip
receiver, in front of the ADC.
(By the way, if you know of another good solution, or another chip like the RCV420, please let me know :-) )
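
For context, here is roughly how I would convert the reading in firmware with the plain-resistor option. This is only a sketch; the 250R burden, 12-bit ADC, 5 V reference and 0-200 °C transmitter scaling are assumptions I picked for the example:

/* Sketch of the burden-resistor option: 4-20 mA through 250R gives 1-5 V,
 * read by a 12-bit ADC with a 5 V reference, then mapped to temperature
 * assuming the transmitter is scaled for 0-200 °C. */
#include <stdint.h>

#define VREF        5.0f     /* ADC reference voltage (assumed) */
#define ADC_COUNTS  4096.0f  /* 12-bit ADC (assumed) */
#define R_BURDEN    250.0f   /* burden resistor in ohms (assumed) */
#define T_MIN       0.0f     /* temperature at 4 mA (assumed transmitter scaling) */
#define T_MAX       200.0f   /* temperature at 20 mA (assumed transmitter scaling) */

float loop_current_mA(uint16_t adc_raw)
{
    float v = (adc_raw / ADC_COUNTS) * VREF;  /* volts across the burden resistor */
    return (v / R_BURDEN) * 1000.0f;          /* I = V/R, converted to mA */
}

float loop_temperature(uint16_t adc_raw)
{
    float i = loop_current_mA(adc_raw);
    /* linear map of the 4-20 mA span onto the transmitter's temperature range */
    return T_MIN + (i - 4.0f) * (T_MAX - T_MIN) / 16.0f;
}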
Which of the two (or more) architectures would be the most precise? Or the best with regard to other criteria of your choice? The RCV420 claims 0.1% accuracy; is that better than using a 0.1% resistor directly? Maybe the op-amp-based circuit is better when it comes to input protection?
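To put rough numbers on it (again assuming a 0-200 °C span over 4-20 mA): a 0.1% tolerance on a 250R burden resistor gives a 0.1%-of-reading error on the measured current, i.e. up to 0.02 mA at 20 mA, which is about 0.125% of the 16 mA span, or roughly 0.25 °C over 200 °C. If that reasoning is right, I'm not sure the RCV420's 0.1% figure gains much over a good resistor, which is partly why I'm asking.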
So I'd like to hear your opinions!
Thanks
JF