I know that cooled thermal imagers (such as HgCdTe) are considered absolute-temperature thermal imagers, while the more common uncooled VOx thermal imagers are considered differential-temperature thermal imagers. However, "differential" implies a difference between two things. What temperature difference is actually being measured by such imagers?

Both types of thermal imagers appear to be designed to give actual temperature readouts, not temperature differences. On my FLIR One, for example, a temperature scale is displayed alongside the image, and I can compare the color or brightness of a pixel against that scale to get a rough estimate of its temperature (not a difference between that pixel and some other temperature, such as another pixel on the screen). Likewise, if I turn on spot temperature mode, I get the exact temperature of the center pixel (again, not a difference between that pixel and some other reference temperature).

So can someone here explain exactly what temperature difference is being measured?
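
To make concrete what I mean by "reading a temperature off the scale", here is a rough sketch of how I assume the readout works: a simple linear mapping between a pixel's brightness and the temperature range shown at the ends of the displayed scale. The function name pixel_to_temperature and all of its parameters are my own illustration, not anything taken from FLIR.

```python
# Rough sketch (my assumption): the displayed scale maps pixel values
# linearly onto the temperature range shown at its ends.

def pixel_to_temperature(pixel_value, pixel_min, pixel_max, scale_min_c, scale_max_c):
    """Estimate a pixel's temperature in deg C from its position on the displayed scale."""
    fraction = (pixel_value - pixel_min) / (pixel_max - pixel_min)
    return scale_min_c + fraction * (scale_max_c - scale_min_c)

# Example: a pixel two-thirds of the way up a 20-35 deg C scale reads about 30 deg C.
print(pixel_to_temperature(170, 0, 255, 20.0, 35.0))
```

In other words, as far as I can tell the device hands me an absolute temperature for every pixel, which is why I don't see where a "difference" comes in.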