EEVblog Electronics Community Forum
Products => Thermal Imaging => Topic started by: Fraser on April 09, 2020, 11:18:06 am
-
So here is a question for all you Thermal imaging fans :)
How important is measurement accuracy and uncertainty to you in your use of a radiometric thermal imaging camera ?
As many will already know, a thermal imaging camera can be a useful tool for detecting abnormal temperatures in machinery, electrical and electronic systems. It can be a very powerful preventative maintenance tool in industry and a great diagnostic aid to techs when fault finding. But how much temperature accuracy is truly required for such tasks ?
A pretty standard industrial-grade radiometric thermal imaging camera will have a measurement accuracy specification of “+/- 2C or 2% (whichever is greater)”. Keeping the maths simple, that means a measurement of a target at 100C surface temperature, of known emissivity, will fall within the range 98C to 102C in a perfect world. The 2% tolerance gives the same range of reading. However, if the target is at 200C, the 2% tolerance specification comes into play and the measurement will fall in the range 196C to 204C. As this exceeds +/-2C, it overrides the Celsius tolerance specification. At 1000C, the percentage tolerance means the measured target can provide a reading of 980C to 1020C, so an error of +/-20C is still within tolerance. Note that many professional thermal cameras provide accuracy specifications for each operating range they offer, and these can be very different between measuring a target at 100C and one at 1000C. Low temperature targets add their own complications, as the emissions from targets approaching 0C are small and errors easily creep into measurements. The camera system has less ‘signal’ to work with, so low temperature measurements can reveal weaknesses in a camera’s design.
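To make the “whichever is greater” arithmetic concrete, here is a minimal Python sketch; the function name and defaults are mine for illustration, not from any manufacturer’s datasheet:

[code]
def tolerance_band(reading_c, abs_tol_c=2.0, pct_tol=0.02):
    """Bounds implied by a '+/-2C or 2% (whichever is greater)' spec."""
    err = max(abs_tol_c, abs(reading_c) * pct_tol)
    return reading_c - err, reading_c + err

for t in (100.0, 200.0, 1000.0):
    lo, hi = tolerance_band(t)
    print(f"{t:6.0f}C -> in-spec reading anywhere from {lo:.0f}C to {hi:.0f}C")
[/code]

Running it reproduces the three examples above: 98C to 102C, 196C to 204C and 980C to 1020C.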
Life is never so simple, however. In any measurement using a radiometric thermal camera there are equipment tolerances and uncertainties at play. To make an accurate measurement of a surface’s temperature, its emissivity needs to be known. When working with calibrated Blackbodies this is not such an issue, as the Blackbody emission plate or cavity will have a stated, known emissivity over a specified wavelength. Surface emissivity can change with wavelength ! In a real world practical scenario, a target is unlikely to be coated with a known emissivity paint like a Blackbody. Thermal camera manufacturers often provide a table of emissivity in their user manuals, detailing different materials and their approximate emissivity. I say approximate as it is a generic table, and there are infinite variations in a real world surface that could affect emissivity. The thermographer works from these tables to decide what emissivity is involved in the measurement. There are techniques to enhance emissivity, and to add a material of known emissivity to a surface that cannot easily be classified in terms of its emissivity; PVC tape is a common material for this. There will normally be a small error in true emissivity versus that used in thermographic measurements, but it can often be minimised by testing the target’s surface using other temperature measurement techniques, where possible.
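To illustrate why the emissivity setting matters so much, here is a hedged Python sketch of single-band emissivity compensation. It stands in the Stefan-Boltzmann law for the camera’s radiance-vs-temperature response purely for simplicity (real cameras use band-limited calibration curves), so treat the numbers as illustrative only:

[code]
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiance(t_c):
    """Total power emitted by a blackbody at t_c degrees Celsius (simplified model)."""
    return SIGMA * (t_c + 273.15) ** 4

def object_temp(s_total, emissivity, t_reflected_c):
    """Subtract the reflected (1 - e) component, divide out emissivity, invert."""
    s_obj = (s_total - (1.0 - emissivity) * radiance(t_reflected_c)) / emissivity
    return (s_obj / SIGMA) ** 0.25 - 273.15

# A surface really at 100C with emissivity 0.85, reflecting 25C surroundings:
s = 0.85 * radiance(100.0) + 0.15 * radiance(25.0)
print(object_temp(s, 0.85, 25.0))  # ~100C with the correct emissivity setting
print(object_temp(s, 0.95, 25.0))  # reads roughly 6C low with the wrong setting
[/code]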
So we have one uncertainty; what others exist ? Well, there is the atmosphere and the distance from the camera to the target to consider. Our atmosphere is a complex mix of gases and pollutants. As such, it does have some effect on the accuracy of a radiometric measurement where significant distance separates the target from the camera system. This is why many industrial cameras have a measurement correction algorithm that needs to be told the distance to the target. The correction system uses calculations based on normal clear air. If the air has high humidity, smog, dense smoke or mist present, there will be an error introduced into the measurement. A Blackbody of known temperature and emissivity may be used to better understand the ‘path losses’ that will affect important measurements.
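As a hedged sketch of what such a correction is up against, the model below replaces the camera’s real atmospheric algorithm with a single Beer-Lambert transmittance factor; the extinction coefficient is invented for illustration, not a real-world value:

[code]
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def blackbody(t_c):
    return SIGMA * (t_c + 273.15) ** 4

def apparent_signal(s_source, distance_m, t_air_c, alpha_per_m=1e-3):
    """What reaches the camera: attenuated source plus emission from the air path."""
    tau = math.exp(-alpha_per_m * distance_m)  # simple Beer-Lambert transmittance
    return tau * s_source + (1.0 - tau) * blackbody(t_air_c)

# A 150C source viewed through 50 m of 20C air loses a few percent of signal:
print(apparent_signal(blackbody(150.0), 50.0, 20.0) / blackbody(150.0))  # ~0.96
[/code]

Humidity, smog or mist effectively raises alpha, and the camera cannot know that unless it is told, which is exactly the error path described above.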
There are other uncertainties that can affect measurements, such as a camera’s last calibration check and its true accuracy at operating temperatures that fall outside those detailed in the accuracy specification. Do not think that a thermal camera specified at +/-2C or 2% at an ambient of 25C will necessarily maintain that accuracy in an ambient temperature of 40C or 10C ! You may be disappointed unless the camera is a high grade industrial model. For example, the FLIR E4 accuracy specification applies at a set ambient temperature; the further away from that specified temperature you go, the greater the measurement error will be. These budget thermal cameras do not have temperature stabilised microbolometers. They use generic ambient temperature offset tables to correct for changes in ambient temperature. High grade industrial cameras normally use a Peltier element based temperature stabilisation system to maintain the microbolometer at the temperature originally used during factory calibration. There is good reason why industrial thermal cameras are more expensive than consumer grade units: they have to remain accurate in many different working environments, some of which are very harsh indeed.
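For a feel of what a generic offset-table correction looks like, here is a sketch; the table entries and structure below are entirely invented for illustration and not taken from any real camera firmware:

[code]
# Hypothetical ambient-vs-offset table: degrees C of correction to add to a
# raw reading at each ambient temperature (invented numbers, illustration only).
AMBIENT_OFFSET_C = [(10.0, 1.8), (15.0, 1.0), (20.0, 0.4), (25.0, 0.0),
                    (30.0, -0.5), (35.0, -1.1), (40.0, -1.9)]

def corrected(reading_c, ambient_c):
    """Linearly interpolate the offset table and apply it to the raw reading."""
    for (a0, o0), (a1, o1) in zip(AMBIENT_OFFSET_C, AMBIENT_OFFSET_C[1:]):
        if a0 <= ambient_c <= a1:
            frac = (ambient_c - a0) / (a1 - a0)
            return reading_c + o0 + frac * (o1 - o0)
    return reading_c  # outside the table: uncorrected, one way drift creeps in

print(corrected(100.0, 32.0))  # 100C raw reading at 32C ambient -> 99.26C
[/code]

A Peltier-stabilised core avoids the interpolation entirely by holding the detector at its calibration temperature.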
So where am I going with all this ?
Well, have you ever considered just how much error may exist in a radiometric thermal camera’s measurements, and YOUR tolerance for such in your application ? Is it really that important that the measurement of an integrated circuit or passive component is accurate to within 2C or 2% ? Would you be able to tolerate a +/-5C or 5% tolerance, or even greater ? If a component is so close to suffering thermal overload and harm that a +/-5C or 5% tolerance matters, maybe the design requires a revisit ?
Thinking about your general use of a thermal imaging camera, how large an error would you be able to tolerate and still gain useful information from the thermal camera?
To be balanced, I should state that many radiometric thermal imaging cameras from respected stables are far more accurate than their specifications might suggest. This is the manufacturer being cautious due to the uncertainties detailed above. I have tested some of my industrial and prosumer cameras, and the error at 50C and 100C has been tenths of a degree.
Food for thought and I welcome your comments on this matter :-+
Fraser
-
There are two more causes of uncertainty, from the camera viewpoint, if we assume the camera compensates for everything it can know about in and of itself.
While the camera can measure ambient temperature at its location, even if accurate this is not necessarily the temperature of the air path to the target, nor the average temperature of what is being reflected back by the target (ie the 1-e part).
So we now have a path loss where the lost signal is replaced by emission at an unknown temperature, and a target emitting the emissivity fraction e of its own signature but also reflecting 1-e of an unknown temperature.
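Written out as the usual textbook single-band measurement equation, with S(.) standing for the camera's radiance-vs-temperature response, those two unknowns are T_refl and T_path (a simplification, but it shows where the guesses enter):

S_cam = tau * [ e * S(T_obj) + (1 - e) * S(T_refl) ] + (1 - tau) * S(T_path)

The camera has to solve this for T_obj, so any wrong guess at tau, T_refl or T_path lands directly in the reading.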
Bill
-
Why would you need the air temperature?
I can only think it's related to how much water vapour can be in that air... provided you also know the relative vapour concentration, and that along the whole path |O (and the camera knows how sensitive it is to it)
-
Yes, the energy from the source being absorbed is replaced by energy re-radiated by the opaque constituents in the air (mainly water & CO2). As you say, assuming 'average' air.
The camera will also have to use its measurement of 'local ambient' as the fill-in for emissivity compensation.
As an example, we had a simple test chart that sat at a 5°C differential, and we used the 5°C as the basis for an MTF measurement. Come humid August (and no A/C), no camera passed the test in the afternoon. Not because the focus was out, but because the assumed 5°C differential now only looked like 4°C !
Bill
-
For my 2 pence on the topic, I'd like to have some accuracy "just because".
Accuracy may not be all that necessary but if it's ridiculously loose it seems like a bad thing even if it doesn't matter. Anything more than a few degrees (F or C) off between freezing & room temperature would seem crazy. Imagine showing a friend your fancy new toy & it says an ice cube is 10C. Will the friend be impressed? It would probably be better not to display a temperature.
But for my main intended purpose of detecting warm bodies on the premises I don't think it matters. What matters is the ability to detect differences in temperature. The same would be true for detecting cold air leaks or missing insulation and wet spots.
I don't need to know if the raccoon defecating in my back yard has a fever or not; I just need to be able to pick it out from the surroundings. :)
-
This is well understood by our colleagues using infrared thermal sights : no need for temperature measurement, contrast is enough.
-
Who cares about measurement errors: https://www.reddit.com/r/Wellthatsucks/comments/fzi9x6/fake_thermoscan_from_china_that_will_never_exceed/
;D
-
For me, measurement accuracy is an optional extra that's nice to have but mostly irrelevant - I am normally more interested in making pretty pictures rather than knowing a precise temperature.
I'm fortunate to have a range of uncooled thermal cameras ranging from an early vidicon-based Argus 1 up to a couple of good 640x480 types, one of which is science-grade and seems to be fairly accurate.
Most of the time I am more concerned with resolution (and focus) than precision. Controllability is important, too - I need to be able to set a camera up so it stays stable, rather than constantly re-scaling its gain and palette.
A high resolution, low noise camera that's switchable between manual and auto span & level, has adjustable focus and maybe interchangeable lenses is my ideal. My FLIR SC-660 is pretty good for that, and has accurate measurements too, but that's something I could definitely live without.
-
Regarding whether accuracy is important -- it varies a lot depending on what you're doing. If you are searching for dog turds in the grass, you don't really care about temperature at all. Detection is all that's important!
If you are looking for cold spots in the house, maybe you are only interested in temperature differences. By the way, I would think temperature differences would be fairly accurate, even if the absolute temperature measurements are not very accurate. I am assuming that temperature errors are basically systematic errors, rather than random errors. Maybe someone can correct me if I'm wrong on this.
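A quick toy example of that intuition (numbers invented): a purely systematic offset cancels in a difference measurement, while the random per-pixel component does not:

[code]
import random

true_wall, true_cold_spot = 21.0, 18.5  # degrees C
offset = -1.7                           # systematic calibration error, same everywhere
noise = lambda: random.gauss(0.0, 0.1)  # small random component, NETD-like

read_wall = true_wall + offset + noise()
read_cold = true_cold_spot + offset + noise()

print(read_wall - read_cold)  # close to the true 2.5C difference: offset cancels
[/code]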
The place where absolute accuracy is needed is measuring human body core temperature. But you probably wouldn't use a thermal imaging camera for that.