EEVblog Electronics Community Forum
Electronics => Beginners => Topic started by: seyedsaeed on April 11, 2020, 10:41:31 am
-
Hello
I want to measure the temperature with 0.01 C
In the range -50 degrees Celsius to 170 degrees Celsius
What is the proper circuit for PT100?
-
Hello seyedsaeed,
This might be a harder question to answer than it seems at first.
PT100 sensors are 100Ω @ 0ºC and about 138.5Ω @ 100ºC, so that only gives a 38.5Ω change over the 100ºC range. To measure down to 0.01ºC you are looking at building a circuit which can resolve changes of roughly 0.004Ω, something I'm sure you'll see is not an easy task. A circuit that sensitive is subject to noise and to changes in cable resistance, which makes it hard to do with a simple circuit. You'd probably be looking at a 4-wire measurement, where two wires feed a constant current through the temp sensor and the other two measure the voltage across it. And even then errors can creep in, as the current through the sensor adds heating, which knocks out the accuracy.
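To put rough numbers on that (just a back-of-the-envelope sketch; the 1 mA excitation and ~0.385 Ω/°C slope are assumed typical values, not a recommendation):

// Resolution needed for 0.01 degC with a PT100 (illustrative figures).
#include <cstdio>

int main() {
    const double sens_ohm_per_C = 0.385;  // approx. PT100 slope near 0 degC
    const double i_excite_A     = 1e-3;   // assumed excitation current
    const double target_C       = 0.01;   // desired temperature resolution

    double dR = sens_ohm_per_C * target_C;   // resistance step to resolve
    double dV = dR * i_excite_A;             // corresponding voltage step

    std::printf("delta R = %.5f ohm, delta V = %.2f uV\n", dR, dV * 1e6);
    // Prints roughly: delta R = 0.00385 ohm, delta V = 3.85 uV
    return 0;
}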
There is a Microchip app note (linked below) which shows the basics, but for the accuracy you are looking for, the current source, reference voltage, and ADC (if you use one) would have to be very accurate (read: expensive parts), and probably shielded from noise. There is also the issue of calibrating it once you have built a circuit, as you would need known-good, higher-accuracy sensors to compare your output against.
http://ww1.microchip.com/downloads/en/appnotes/00687c.pdf
There are industrial interfaces at around this accuracy, but they are expensive.
https://www.picotech.com/data-logger/pt-104/high-accuracy-temperature-daq
-
Do you need absolute accuracy, or only a fine indication of small temperature changes?
About ten years ago I made a smart thermostat using an Arduino Uno controlling a 2400W oil column heater. I used a simple thermistor read with the standard Arduino analogRead(). I took 100 readings back to back, averaged them, and got a pretty nice 0.01 C precision. I calibrated it at a couple of points (0 C using ice water and 100 C using boiling water) and got nice, smooth, regular 0.01 C continuous changes as the room heated up or cooled down. It worked great as an input to a PID controller that proactively adjusted the heater duty cycle in response to the first signs of a temperature change.
The linearity and absolute accuracy probably weren't that great, but it worked really well for the purpose. Of course I didn't have to handle the temperature range you're talking about -- the whole point was to keep the temperature constant :-)
(there was definitely self-heating involved, but that came out in the calibration)
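For anyone curious, the averaging part boils down to something like this sketch (illustrative only: the pin, the 10k series resistor to 5 V, and the divider orientation are assumptions, and the resistance-to-temperature conversion from the two calibration points is left out):

// Oversample a thermistor divider on an Arduino to gain effective resolution.
const int THERM_PIN = A0;          // assumed analog input
const float R_SERIES = 10000.0f;   // assumed 10k resistor from 5V down to the thermistor

void setup() {
  Serial.begin(9600);
}

void loop() {
  long sum = 0;
  for (int i = 0; i < 100; i++) {  // 100 back-to-back readings, then average
    sum += analogRead(THERM_PIN);
  }
  float counts = sum / 100.0f;                              // averaged code, 0..1023
  float r_therm = R_SERIES * counts / (1023.0f - counts);   // thermistor on the low side
  Serial.println(r_therm);   // convert to degC via the two calibration points
  delay(200);
}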
-
With the usual size sensors one does not want more than about 0.1 mW of heat, to limit the self-heating to a reasonable amount. That would be some 1 mA of sensor current and thus roughly 4 µV per 0.01 K to detect. It is likely not a good idea to make a precision current source; instead, put a second resistor in series with the RTD and use the voltage drop over that resistor as the reference voltage for a high resolution ADC. This works relatively easily as there are high resolution ADCs with differential inputs. Some ADCs have inputs suitable for low voltages (some 100 mV).
The LTC2410 datasheet has such an example circuit.
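The arithmetic behind that ratiometric trick is tiny: the same current flows through the reference resistor and the RTD, so the ratio of the two voltage drops gives the RTD resistance directly and the current value (and its drift) cancels. A minimal sketch, where read_ratio() is a made-up stand-in for the real ADC driver and the 1 kΩ reference value is just an example:

// Ratiometric RTD readout: R_rtd = R_ref * (V_rtd / V_ref).
#include <cstdio>

double read_ratio() {
    // Placeholder for the ADC driver; this fixed value corresponds to ~25 degC.
    return 0.10971;
}

int main() {
    const double R_REF = 1000.0;          // assumed precision series resistor
    double r_rtd = R_REF * read_ratio();  // excitation current drops out of the math

    // Linear approximation near 0 degC; use Callendar-Van Dusen for real work.
    double temp_C = (r_rtd - 100.0) / 0.385;
    std::printf("R = %.3f ohm  ->  T = %.2f degC\n", r_rtd, temp_C);
    return 0;
}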
-
https://www.silabs.com/documents/public/data-sheets/Si7050-1-3-4-5-A20.pdf
0.01° resolution, but 0.1° accuracy, and "only" -40°C to +125°C ...
https://www.youtube.com/watch?v=pj1HOM35vH4
-
I use an Analog Devices CN0359 to achieve 0.01 C temperature resolution.
Not long after the CN0359 became available I mounted one in a black case and used an AC power supply. It measures both temperature, with your needed resolution, and AC conductivity. Both can be measured with 2-, 3- or 4-wire connections to the sensors. The attached picture shows it measuring the conductivity of a 1,000 ohm 0.01% accuracy resistor along with the temperature. While I was using a PT1000 RTD for temperature readings, the CN0359 automatically senses when a PT100 RTD is being used and will work with either.
Earlier this year, being happy with my first CN0359, I purchased a second one to be battery powered and mounted it in a blue case. It has AD's updated software.
While the best RTDs I have connected are 0.1 C accuracy, it is nice to be able to see fine changes in temperature. Plus, if a higher accuracy temperature chamber is available, the CN0359 can be calibrated to that standard. I have also used precision resistors as another calibration method.
I also have a Renesas SDAH02. It measures both temperature and humidity with 0.01 resolution, but 0.2 C temperature accuracy. A picture of its display is attached. Like the CN0359 it is a bare circuit board without a case. The SDAH02 comes with both sensors while the CN0359 does not come with any. The SDAH02 costs $43.75 while the CN0359 is $131.25. I have found the CN0359 more accurate, with better control over sensor accuracy and positioning.
-
The SMT172 is good for me
But there are some problems
1- It is non-linear
2- I'm not sure if it's accurate
I need a sensor to control the temperature
My available chip is the ADS1232. In your opinion, can I reach 0.01 accuracy with this chip?
-
Check this out https://www.maximintegrated.com/en/products/sensors/MAX31856.html
-
The ADS1232 should be good enough to get 0.01 K resolution. However, 0.01 K accuracy is way out of reach for normal affordable PT100 sensors. The ADS1232 alone should not add that much error, but the sensors are usually far less accurate. The ADC is not ideal, as with a gain of 64 the voltage range (±39 mV) is a little on the small side, and the next setting down is a gain of 2, which is more like ±0.5...1 V.
One still has to check for EMI.
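A quick sanity check of those numbers (assuming a 2.5 V reference and 1 mA excitation; the noise-free resolution you actually get has to come from the datasheet noise tables, this only shows the ideal code size):

// How many ideal ADC codes correspond to 0.01 K for a PT100 on a 24-bit ADC at gain 64?
#include <cstdio>
#include <cmath>

int main() {
    const double vref  = 2.5;     // assumed reference voltage
    const double gain  = 64.0;
    const double bits  = 24.0;
    const double i_exc = 1e-3;    // assumed excitation current

    double dv_per_centi_K = 0.385 * 0.01 * i_exc;       // ~3.85 uV per 0.01 K
    double full_scale     = 2.0 * vref / gain;          // ~78 mV differential span
    double lsb            = full_scale / std::pow(2.0, bits);

    std::printf("LSB = %.2f nV, codes per 0.01 K = %.0f\n",
                lsb * 1e9, dv_per_centi_K / lsb);
    // Roughly 800 ideal codes per 0.01 K; the real limit is the converter's
    // noise floor, the excitation stability, and the sensor itself.
    return 0;
}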
-
I wish I could find it now, but no...... an app note which discusses all the pitfalls of measuring such high resolution temperature.
In addition to what others have mentioned, you may have to also compensate for thermocouple effects.
-
Thermocouple effects are a possible error source. For this reason the really high resolution systems prefer low frequency AC or measure with both polarities.
A point to watch is cable / input capacitance, which can affect some sigma-delta ADCs.
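The both-polarities idea in code form, with the hardware hooks simulated so the cancellation is visible (set_current_polarity() and read_voltage() are placeholders, not a real driver):

// Cancel thermoelectric EMF by measuring with both current polarities.
#include <cstdio>

static bool g_positive = true;
static const double I_EXC = 1e-3;    // simulated excitation current, A
static const double R_RTD = 108.0;   // simulated sensor resistance, ohm
static const double V_EMF = 5e-6;    // simulated parasitic thermocouple EMF, V

void set_current_polarity(bool positive) { g_positive = positive; }
double read_voltage() { return (g_positive ? 1.0 : -1.0) * I_EXC * R_RTD + V_EMF; }

int main() {
    set_current_polarity(true);
    double v_pos = read_voltage();        // +I*R + V_emf
    set_current_polarity(false);
    double v_neg = read_voltage();        // -I*R + V_emf
    double v_ir = (v_pos - v_neg) / 2.0;  // V_emf cancels, leaving the pure I*R term
    std::printf("R = %.4f ohm\n", v_ir / I_EXC);
    return 0;
}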
-
The ten-page Circuit Note on the CN0359 does a good job of explaining its operation. See: https://www.analog.com/media/en/reference-design-documentation/reference-designs/CN0359.pdf
The RTD Measurement paragraphs starting on page 4 explain the temperature measurement and its limitations. The CN0359 has an ADUCM360 microcontroller which has dual 24-bit ADCs and a 32-bit processor. Computer interface is via RS-485.
-
I've tried measuring higher resolution with thermistors (in open air) and had problems due to self-heating.
This is a small bead type with 0.25mA (~1mW, 5V, 10k). It took me a while to figure out why convection currents affected accuracy.
In firmware, my solution was to turn on excitation, measure the A/D, then shut off excitation to the thermistor. That made it work great.
OP's RTD size would be a factor; I usually use ~1mA for RTD excitation, so small ones would self-heat.
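Roughly what that pulsed-excitation firmware looks like (a sketch only; the pin assignments and the 10 ms settling delay are assumptions, not what I actually used):

// Pulse the thermistor excitation only around the measurement to limit self-heating.
const int EXCITE_PIN = 7;    // assumed digital pin switching the divider supply
const int SENSE_PIN  = A0;   // assumed analog input

void setup() {
  pinMode(EXCITE_PIN, OUTPUT);
  digitalWrite(EXCITE_PIN, LOW);   // excitation off between readings
  Serial.begin(9600);
}

void loop() {
  digitalWrite(EXCITE_PIN, HIGH);  // turn excitation on
  delay(10);                       // let the divider settle (assumed value)
  int code = analogRead(SENSE_PIN);
  digitalWrite(EXCITE_PIN, LOW);   // turn it straight back off
  Serial.println(code);
  delay(1000);                     // ~1% duty cycle, so average dissipation stays tiny
}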
-
How do I measure temperature with 0.02 or 0.025 resolution?
What do you think?
-
What are you measuring? I find it hard to think of cases that would require such resolution or be worth the effort to attempt it.
This is like building a scale to measure the weight of dust on a person. 0.01C is like dust. Breathe near it and it changes. Even 1C is 'small'. For example, the boiling point of water changes by about 1C with just a 1000ft elevation change. If you use boiling water as your calibration standard and assume it is 100C, you can easily add much more than 0.01C of error.
You can introduce error in many ways, even if your electronic hardware is great.
Where are you placing the probe? Does your medium have the same temperature throughout its volume? How fast does temperature change in your medium? When temperature is changing, how will the temperature of your probed location compare to the average temperature of the medium?
Do you need absolute accuracy, or only a fine indication of small temperature changes?
This is an important question. Technically you shouldn't bother having resolution that is much better than accuracy, but if you care more about seeing small changes than you care about absolute accuracy, then it would be helpful to declare that so we have a better idea what your goal is.
-
For a regulator, resolution much better than accuracy can absolutely make sense. There are cases where stability and low noise are needed but the absolute temperature does not matter. Take the LTZ1000 reference: the stability / temperature reading resolution should be around 1 mK, but the uncertainty of the temperature set point is more like 10 K.
It is not so uncommon that one wants a reproducible / stable temperature, but does not care so much about the absolute value.
So if you have a solution that gets 0.01 K resolution, reading to 0.02 K resolution is simple :-DD.
-
As someone who yesterday continued my regular set of nine temperature readings, taken with 0.01 degree C resolution, I can say that there can be an advantage to that resolution even when the accuracy might not seem to justify it. I have seen one reading change as follows during a 20 day period:
9.15 9.11 9.05 9.03 9.01 8.99 9.02
I would be evaluating the readings differently if I had seen them all logged as just 9, or even with just one digit of resolution. Before starting I would have felt fine with just one digit of resolution, but now I am glad that I have two.
Yes, the measured stability justifies the resolution. The nine RTDs agreed in an initial crosscheck to easily within 0.1 C. One was not used because it read 0.15 degrees high compared to the others. I plan to add two more RTDs to my project.
It feels nice to grow into using a meter's precision rather than growing out of its capabilities.
-
Linear Technology published several platinum RTD signal conditioning circuits which can achieve better than 0.01C resolution. Check out the LT1236 datasheet for an example which meets your requirements almost exactly. Absolute accuracy depends on a three point calibration.
-
I tested with an ADS1230
and it looks great.
What do you think?
(with a 2-wire PT100)
https://www.youtube.com/watch?v=iUZJSmH-l1s
-
It is a little hard to judge from the fast changing number, but it looks reasonably low noise. One could see it better from a plotted curve, or just a CSV file with a few hundred data points. Chances are one does not need such a high speed and could average a few values.
A 2-wire connection is not really suitable for high resolution with a PT100; one usually needs 4 wires. The ADS1230 should support 4 wires with no extra noise. 2 wires are kind of OK with a PT1000 if the cables are short and fixed.
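To see why 2-wire is so much worse for a PT100 than a PT1000: both lead resistances add directly to the reading. A quick illustration, assuming 0.1 Ω per conductor (a made-up but plausible cable figure):

// Error from lead resistance in a 2-wire RTD connection.
#include <cstdio>

int main() {
    const double r_lead = 0.1;   // assumed resistance of one cable conductor, ohms
    // Both leads are in series with the sensor in a 2-wire hookup.
    double err_pt100  = 2.0 * r_lead / 0.385;   // degC error for PT100  (~0.385 ohm/degC)
    double err_pt1000 = 2.0 * r_lead / 3.85;    // degC error for PT1000 (~3.85 ohm/degC)
    std::printf("PT100: %.2f degC   PT1000: %.3f degC\n", err_pt100, err_pt1000);
    return 0;
}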
-
It is a little hard to judge from the fast changing number
On my CN0359, with a steady temperature, the 0.01 degree C digit is steady when taking four-wire measurements with a 3.6 m long cable between the meter and the RTD sensor.
Since the CN0359 can also measure conductivity (i.e. resistance), I have used that capability to compare 0.1 C accuracy RTDs with collocated 0.1 C thermistors. The temperature measurements typically agree to within 0.05 C; sometimes they match exactly. With the CN0359's 24-bit ADCs and five-digit conductivity measurements, I could be translating those into 0.001 C resolution readings from the thermistors, though I have not gone that far yet. The final digit of those measurements typically moves by one to five units -- nothing like what is seen in the video.
Since my temperature measurements are over a narrow temperature range, I make two point calibrations.
-
In my experience, precision temperature measurement isn't something simple.
I am using high resolution temperature measurements for an LTFLU voltage reference and have succeeded in getting temperature variations down to about 0.1 mK. Measurement noise is another factor of 10 below that (HP 3456A). This is with an Arroyo TecSource 5235 plus some fine tuning based on temperature sensing of the LTFLU's built-in transistor. But how can I know my sensor isn't drifting? I have to record ambient temperature, TEC heating current etc. and be very patient to learn how much it is drifting. It's not finished.
Recently I have been recording one of those cheap NH-2 incubators with a Sensirion SHT31 inside, and after 10 days the temperature measurement has drifted by about -0.3 °C. It's set to 19 °C and has drifted from 19.9 °C down to 19.6 °C so far. The drift is clearly visible and I can't tell whether it's spring coming or whether one of the two temperature sensors (the NH-2 internal sensor or the SHT31) is drifting. I don't know how long the NH-2 had been unused before.
Regards, Dieter
-
It is a little hard to judge from the fast changing number, but it looks reasonably low noise. One could see it better from a plotted curve, or just a CSV file with a few hundred data points. Chances are one does not need such a high speed and could average a few values.
My favorite way now to simply quantify noise in instrumentation applications is to display the standard deviation (RMS) and peak-to-peak over the past 10 seconds.
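For anyone wanting to do the same, a minimal version (assuming a fixed sample rate, so the 10 second window is just the last N samples):

// Standard deviation (RMS) and peak-to-peak over the most recent N samples.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <deque>
#include <numeric>

struct NoiseStats {
    std::deque<double> win;
    size_t n;                       // window length in samples (e.g. 10 s * sample rate)
    explicit NoiseStats(size_t n_) : n(n_) {}

    void add(double x) {
        win.push_back(x);
        if (win.size() > n) win.pop_front();   // keep only the last n samples
    }
    double stddev() const {
        double mean = std::accumulate(win.begin(), win.end(), 0.0) / win.size();
        double ss = 0.0;
        for (double x : win) ss += (x - mean) * (x - mean);
        return std::sqrt(ss / win.size());
    }
    double peak_to_peak() const {
        auto [lo, hi] = std::minmax_element(win.begin(), win.end());
        return *hi - *lo;
    }
};

int main() {
    NoiseStats stats(100);                     // e.g. 10 s at 10 samples/s
    for (int i = 0; i < 100; ++i) stats.add(25.0 + 0.005 * std::sin(i));  // fake data
    std::printf("sd = %.4f   pp = %.4f\n", stats.stddev(), stats.peak_to_peak());
    return 0;
}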
-
What are you measuring? I find it hard to think of cases that would require such resolution or be worth the effort to attempt it.
This is like building a scale to measure the weight of dust on a person. 0.01C is like dust. Breathe near it and it changes. Even 1C is 'small'. For example, the boiling point of water changes by about 1C with just a 1000ft elevation change. If you use boiling water as your calibration standard and assume it is 100C, you can easily add much more than 0.01C of error.
You can introduce error in many ways, even if your electronic hardware is great.
Where are you placing the probe? Does your medium have the same temperature throughout its volume? How fast does temperature change in your medium? When temperature is changing, how will the temperature of your probed location compare to the average temperature of the medium?
Do you need absolute accuracy, or only a fine indication of small temperature changes?
This is an important question. Technically you shouldn't bother having resolution that is much better than accuracy, but if you care more about seeing small changes than you care about absolute accuracy, then it would be helpful to declare that so we have a better idea what your goal is.
I think Kasper has very good points here.
From -50C to 170C is a range of 220C, and ±0.01C on that range is 0.004545%, or about 45 parts per million.
I doubt components of that precision (±0.004545%) are even on the market; they would likely need to be custom made. Even if you succeed in making that, and even assuming your device is that accurate, I doubt it will function well unless it is in a very controlled environment. Air movement, light, vibration, etc. will all impact your reading.
Temperature is an inherently low-precision measurement. Far too many external factors exist to disrupt the accuracy: everything from barometric pressure to air movement around your device will add error to the measurement, and those are but two of many factors.
-
The SMT172 is good for me
But there are some problems
1- It is non-linear
2- I'm not sure if it's accurate
I need a sensor to control the temperature
My available chip is the ADS1232. In your opinion, can I reach 0.01 accuracy with this chip?
Are you looking for resolution or accuracy? Getting real accuracy to 0.01 degree C is not easy. You could consider a multimeter like this: https://download.tek.com/datasheet/1KW-61315-0_DMM6500_Datasheet_052119.pdf or a precision temperature display like this: https://www.transcat.com/media/pdf/mensor-ctr3000-datasheet.pdf
The problem with wanting 0.01 degree C accuracy (true accuracy, not resolution) is that it is really laboratory-grade accuracy. Moreover, environmental conditions will throw your accuracy out the window; it is going to be a significant trick to get that accuracy out in the field. I guess the important question here is whether this is for research in a lab or something else. The concern is: do you really need the accuracy?
-
A few days ago I measured temperature drift with a resolution of around 0.00002°C. It is very simple; you need only 3 things:
- 6.5 digit DMMs
- NTC 10k
- 4 wire
If we have one DMM and many thermistors, are there multiplexers that are stable enough (contact potentials, unintentional thermocouples etc.)?
Any suggestions for a mux product or chip/relay?
-
Sounds like a 4-wire measurement with a switched source: 4 wires mean the resistance of the wires is removed, and inverting the polarity of the current source on each measurement lets you subtract any EMF voltages.
There should be relay cards for 4-wire measurements available, and inverting the polarity instead of switching on and off should keep the thermals fairly consistent. There will just be a slight settling curve due to capacitance after each flip, which means you may need to delay the measurement.
-
A few days ago I measured temperature drift with a resolution of around 0.00002°C. It is very simple; you need only 3 things:
If we have one DMM and many thermistors, are there multiplexers that are stable enough (contact potentials, unintentional thermocouples etc.)?
Any suggestions for a mux product or chip/relay?
The Agilent 34970A or 34972A DAQ with a 34901A multiplexer is good for this. It has an offset-compensated ohms function to remove the effects of thermoelectric potentials, not just at the multiplexer but anywhere in the thermistor circuit. We have been using one of these to calibrate a batch of thermistors for an experiment measuring temperature profiles in domestic compost bins!
The new Keysight replacement - the DAQ970, I think - should do the same job, and probably better. But there are plenty of the older units appearing on eBay at reasonable prices these days.
-
For very high resolution one may need to keep an eye on self-heating of the sensor, even with a 10 K thermistor. One may have to use a test current smaller than normal, e.g. the 100 K resistance range rather than the 20 K range.
-
I know I am taking a risk here, and some of you will want to squash me like a bug, but I feel I should sound an alarm anyhow: measuring the temperature of a bin down to 0.01C or 0.00002C with contact-based (heat transfer) measurement is really nutty.
This is like measuring the height of high/low tide at the mouth of a major river delta down to the millimeter. The waves, the wind, and the passing ship traffic will all have an impact far bigger than your millimeter resolution. Even the swelling of the water as it flows around the pole holding your measuring stick will have a greater-than-millimeter effect.
Heat, like waves in water, needs time to get to your measuring device. Heat, like waves in water, diffuses. And as with water hitting a shore, the environment around your measurement makes a difference: the shape and level of the shore affect the backflow of water and thus the wave height.
Be it a thermistor, thermocouple, or whatever contact-based measuring device you use: due to environmental effects, it will likely have a temperature differential greater than 0.01C from one part of the device to another, let alone 0.00002C. The entire path of heat conduction from the heat source to the measuring device will also have a temperature differential caused by the environment the bin is in, and that too is likely to be greater than 0.01C. For variations as low as 0.00002C, you have to first define what exactly you are measuring, and your target volume likely has to be a sub-millimeter cube so that heat transfer within the target does not impact your desired accuracy.
The resolution is pointless. Resolution can be made arbitrarily fine with averaging, but resolution is not accuracy - heck, you can average the tide gauge down to nanometers. One ship passing nearby will throw your measurement way off, and by the time you average that "noise" away, three more ships and six more gusts of wind have already passed. So what is the height of the tide now? Measuring the tide down to a millimeter is an interesting exercise, but it is not an effort to achieve real millimeter accuracy.
Okay, now the objection to my opinion can begin...
-
The resolution is pointless.
Resolution and stability of the measurement tool are very important if you need to detect a drift.
The statement was in reference to a system in an inadequately controlled environment.
In your case, and seeing your containment: "Box with 60mm thickness of Polyisocyanurate thermal shield with aluminium coating" -- yeah, hats off to you. You clearly have things under control there.
(Edited - added "hats off to you"; I am impressed by the measures you took, and you aren't using an Arduino ADC...)
-
A few days ago I measured temperature drift with a resolution of around 0.00002°C. It is very simple; you need only 3 things:
- 6.5 digit DMMs
- NTC 10k
- 4 wire
I don't think so... because the ohms-range current through the thermistor creates self-heating.
In stirred oil or water the error is not noticeable, but in air with convection currents the reading will mysteriously move around and always have an offset.
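A rough way to estimate that self-heating error is dissipated power divided by the thermistor's dissipation constant. The figures below (a DMM-style ohms current and a small bead in still air) are illustrative assumptions; use the datasheet value for the actual part:

// Thermistor self-heating estimate: delta_T = P / dissipation_constant.
#include <cstdio>

int main() {
    const double i_meas = 0.25e-3;   // assumed ohms-range measurement current, A
    const double r_ntc  = 10e3;      // 10 k thermistor
    const double k_diss = 1.0e-3;    // assumed dissipation constant, W/K (small bead, still air)

    double p = i_meas * i_meas * r_ntc;   // ~0.6 mW dissipated in the bead
    std::printf("P = %.2f mW, self-heating ~ %.2f K\n", p * 1e3, p / k_diss);
    // Orders of magnitude above 0.01 K, hence the offsets and the sensitivity
    // to air movement around an exposed bead.
    return 0;
}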
-
0.01° accuracy on temperature can't be reached without the use of a very, very expensive sensor. The best PT100 available will only give you 0.1 C.
-
0.01° accuracy on temperature can't be reached without the use of a very, very expensive sensor. The best PT100 available will only give you 0.1 C.
Or a time-consuming and difficult three-point calibration of an RTD.
-
"Or a time consuming and difficult three-point calibration of an RTD."
Without a Million dollar LAb, you will never do that on the right way, 0.01 C traceable Standards, are extremely expensive, and A PT100 system can't hold that calibration without adrift in a few weeks.
0.01C isn't for anyone, very few people on even companies haves access to 0.01 C temperature accuracy.
-
"Or a time consuming and difficult three-point calibration of an RTD."
Without a Million dollar LAb, you will never do that on the right way, 0.01 C traceable Standards, are extremely expensive, and A PT100 system can't hold that calibration without adrift in a few weeks.
0.01C isn't for anyone, very few people on even companies haves access to 0.01 C temperature accuracy.
But but but ... we're told we know the average temperature of the entire surface of the Earth to 0.1 C accuracy. Not just some single point but everywhere. And not only today but 100 to 150 years ago.
I won't bring up the fact that averaging temperatures from different places makes no physical sense. Oops.
-
Going from 0.1 to 0.01 takes a lot of money in this case.
-
0.01° accuracy on temperature can't be reached without the use of a very, very expensive sensor. The best PT100 available will only give you 0.1 C.
Here you go. (https://www.cannoninstrument.com/en/Image/GetDocument/7)
0.015°C accuracy. Not quite 0.01, but certainly better than 0.1°C. Using a Pt100.
Not terribly expensive either.
-
The original poster said what he wants to do, and that is easy. He does not want to measure the temperature of the planet, and he does not want accuracy but resolution.
Using an Arroyo TecSource like the one shown in the images is one way to do it. Self-heating won't be a problem, since this controller works with very low, constant currents, e.g. 100 uA for a 10 kOhm NTC. The self-heating power will then be 100 uW near room temperature, and lower at higher temperatures. If the NTC is glued into a hole in a metal part, the self-heating error will be less than 100 uK.
Looking at the image of the thermal chamber, though, I am wondering about the cabling. If the cables come out into ambient after 6 cm of thermal insulation, that will be a source of disturbance, unless the whole room is a lab with controlled temperature. One could use a thermal clamp to keep the cable end near the DUT at the same temperature, too.
Regards, Dieter