
IR thermometer for lab study


I'm doing a study on the electrical heating of graphite rods, and I need a way to measure temperature.  Thermocouples are a bit too slow, and their heat capacity will alter the temperature.  An IR thermometer seems ideal, but I don't know where to find one suitable for lab use.

Consumer models simply display the temperature, with no means of recording it; I want to set up a continuous data logger.  I also can't find any information on how the temperature is determined with respect to the field of view.  If half the field of view is at 20 °C and the other half is at 200 °C, would the measured temperature be the maximum, or the average?  Is there a way to find the maximum?
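On the field-of-view question: a spot IR thermometer integrates radiance over the whole spot, so a mixed scene gives a radiance-weighted average, not the maximum and not the simple arithmetic mean. Here's a quick sketch of the idealized case (broadband detector, equal emissivity across the spot — a first-order model, not how any particular instrument is calibrated):

```python
# Sketch: what a single-spot IR thermometer "sees" when the field of view
# spans two temperatures. The detector integrates radiance, which goes
# roughly as T^4 (Stefan-Boltzmann), so hot regions dominate the reading.
# Idealized model: broadband detector, uniform emissivity across the spot.

def blended_reading_c(temps_c, fractions):
    """Radiance-weighted effective temperature for a mixed-temperature spot."""
    temps_k = [t + 273.15 for t in temps_c]
    radiance = sum(f * t ** 4 for f, t in zip(fractions, temps_k))
    return radiance ** 0.25 - 273.15

# Half the spot at 20 degC, half at 200 degC:
reading = blended_reading_c([20.0, 200.0], [0.5, 0.5])
print(f"reading ~ {reading:.1f} degC")  # well above the 110 degC arithmetic mean
```

The hot half dominates because of the fourth-power weighting, which is why a rod that underfills the spot reads closer to the rod temperature than a linear average would suggest — but still low.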

A proper IR camera would help here, but I can't quite justify its cost when I only need to measure a single point.

There is the Fluke VT02, a visual IR thermometer that is cheaper than a full thermal imager. It has software available for data logging and a near/far field-of-view indicator. One possible issue is that the emissivity of the graphite might affect its accuracy, depending on the rod diameter relative to the measurement spot.
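To put a rough number on the emissivity concern: if the instrument is set to one emissivity but the graphite surface actually has another, the displayed temperature is skewed. A first-order estimate (total-radiance model; real instruments work over a limited waveband, so treat this as a sanity check only, and the 0.95/0.80 values below are assumed, not from the VT02 datasheet):

```python
# Sketch of emissivity-mismatch error, idealized total-radiance model:
#   eps_set * T_disp^4 ~ eps_true * T_real^4
# where eps_set is the emissivity the instrument assumes, eps_true is the
# actual surface emissivity, T_disp the displayed and T_real the true
# surface temperature (both in kelvin).

def true_temp_c(displayed_c, eps_set, eps_true):
    """Back out the real surface temperature from the displayed reading."""
    t_disp_k = displayed_c + 273.15
    t_real_k = (eps_set / eps_true) ** 0.25 * t_disp_k
    return t_real_k - 273.15

# Hypothetical numbers: instrument fixed at 0.95, graphite nearer 0.80
# (graphite emissivity varies with surface finish and temperature):
print(f"{true_temp_c(200.0, 0.95, 0.80):.1f} degC")
```

So a ~0.15 emissivity mismatch at a displayed 200 °C is an error of roughly 20 °C, which is worth calibrating out (e.g. against a thermocouple at steady state) before trusting the logged data.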

Omega makes bare fine-wire thermocouples that might have a quicker response time than an IR gun (0.025 mm wire, about 0.05 s in still air). The thermal loading might be too much for you, as you mentioned — I just thought I would bring it up.
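For scale, here's a back-of-envelope check on what that quoted 0.05 s time constant means in practice for a first-order sensor. The 50 °C/s heating rate is a hypothetical figure for illustration, not from the original post:

```python
# Back-of-envelope response of a first-order sensor with tau = 0.05 s
# (the quoted figure for 0.025 mm bare wire in still air).
# - Step response settles to 99% in about -tau * ln(0.01) ~ 4.6 * tau.
# - Tracking a linear ramp leaves a steady-state lag of tau * (dT/dt).
import math

tau = 0.05                          # s, quoted time constant
settle_99 = -tau * math.log(0.01)   # s to reach 99% of a step
ramp_lag = tau * 50.0               # degC lag at an assumed 50 degC/s ramp

print(f"99% settle: {settle_99 * 1000:.0f} ms, ramp lag: {ramp_lag:.1f} degC")
```

In other words, even a "slow" 0.05 s sensor only lags a brisk 50 °C/s ramp by a couple of degrees; whether the conductive loading of the wire matters more is the real question for a thin rod.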


Hope this helps you find a suitable solution.

This sounds like a project where a FLIR Lepton-based sensing setup could make sense. The Lepton 2 outputs 9 fps, so I'm not sure whether that's sufficient for your needs.
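One nice side effect of a small thermal array like the Lepton: the "how do I find the maximum?" question answers itself, since you can just take the hottest pixel per frame. A sketch, assuming frames arrive as 2D arrays already converted to °C (the actual capture and radiometric conversion depend on the breakout board and driver used):

```python
import numpy as np

# Sketch: per-frame hot-spot extraction from a thermal array. Assumes the
# capture layer hands us a 2D numpy array of temperatures in degC; getting
# to that point is driver-specific and not shown here.

def hottest_point(frame_c):
    """Return ((row, col), temp_c) of the hottest pixel in a frame."""
    idx = np.unravel_index(np.argmax(frame_c), frame_c.shape)
    return (int(idx[0]), int(idx[1])), float(frame_c[idx])

# Simulated 80x60 frame (Lepton 2 resolution) with one hot spot:
frame = np.full((60, 80), 20.0)
frame[30, 40] = 200.0
loc, tmax = hottest_point(frame)
print(loc, tmax)
```

Logging `(timestamp, loc, tmax)` per frame at 9 fps would give you the continuous single-point record the consumer guns can't, plus the spatial context to see whether the hot spot moves along the rod.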

