Hi all!
A question for the gurus!
Some thermal cameras calculate "thermal flux" in the form of two functions: Power and PowerLoss. I had the opportunity to play with the first function, which reports power in BTU per hour per square foot. I converted these units into watts per square meter with a conversion factor found elsewhere, and ran some tests measuring low-temperature areas, room temperature (22 C), and a hot plate. The result of the calculation should be the thermal flux radiated from the surface into "absolute zero" space.
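For reference, this is roughly the sanity check I did, as a minimal Python sketch (the conversion factor is the standard 1 BTU/(hr*ft^2) = 3.1546 W/m^2; emissivity is taken as 1.0 for simplicity, and the function names are mine, not the camera's):

# Minimal sketch of the sanity check described above (not the camera's code).
# Assumes the camera reports flux density in BTU/(hr*ft^2) and emissivity 1.0.

SIGMA = 5.670374419e-8          # Stefan-Boltzmann constant, W/(m^2*K^4)
BTU_HR_FT2_TO_W_M2 = 3.154591   # 1 BTU/(hr*ft^2) expressed in W/m^2

def camera_reading_to_w_m2(reading_btu_hr_ft2: float) -> float:
    """Convert the camera's BTU/(hr*ft^2) reading to W/m^2."""
    return reading_btu_hr_ft2 * BTU_HR_FT2_TO_W_M2

def stefan_boltzmann_flux(temp_c: float, emissivity: float = 1.0) -> float:
    """Radiant exitance into 0 K surroundings: eps * sigma * T^4."""
    t_kelvin = temp_c + 273.15
    return emissivity * SIGMA * t_kelvin ** 4

if __name__ == "__main__":
    # Example: a 22 C (room temperature) blackbody radiates about 430 W/m^2.
    print(f"22 C surface: {stefan_boltzmann_flux(22.0):.1f} W/m^2")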

More or less it fits the Stefan-Boltzmann law and the 4th power of absolute temperature, but I noticed that at low and high temperatures the accuracy is poor, with errors around 20%, while near room temperature the error is negligible, some 2 percent or below.
Note: this has nothing to do with calibration, RFB, etc., because the function merely calculates power from the temperature of each individual pixel and then sums over them.
Question: why is that?
Possible explanations:
1. They do not actually use the Stefan-Boltzmann formula, but instead use some Planck-derived integration over the solid angle and spectral range (see the sketch after this list). Would be kewl!
2. They apply some "corrections" to account for non-uniformity of emissivity over the solid angle.
3. They screwed up...
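To illustrate suggestion 1: if the camera integrated the Planck law only over its own spectral band (say 7-14 um; that band is my assumption, I don't know the actual response) instead of applying the full sigma*T^4 law, the in-band fraction of the total exitance would change with temperature, so a scaling that is right near room temperature would drift at the extremes. A rough Python sketch of that comparison:

import math

# Planck-band vs full Stefan-Boltzmann comparison (illustrative only; the
# 7-14 um band is my assumption, not the camera's documented response).
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K
SIGMA = 5.670374419e-8

def planck_exitance(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiant exitance M(lambda, T), W/m^2 per meter of wavelength."""
    a = 2.0 * math.pi * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0
    return a / b

def band_exitance(temp_k: float, lam_lo=7e-6, lam_hi=14e-6, steps=2000) -> float:
    """Integrate the Planck exitance over a wavelength band (midpoint rule)."""
    dlam = (lam_hi - lam_lo) / steps
    return sum(planck_exitance(lam_lo + (i + 0.5) * dlam, temp_k)
               for i in range(steps)) * dlam

if __name__ == "__main__":
    # The in-band fraction is not constant over temperature, which is the point.
    for temp_c in (-20.0, 22.0, 200.0):
        t = temp_c + 273.15
        print(f"{temp_c:6.1f} C: in-band fraction = {band_exitance(t) / (SIGMA * t ** 4):.3f}")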
What do you, as gurus, know about how Power is calculated in thermal cameras? Power Loss is another story, but still:
if I wanted to calculate power loss, I would calculate the Power of the surface in focus, then rotate the camera 180 degrees and calculate the Power of the scene that my surface is facing. Then I'd take the difference as the result, the radiative power loss. Is this approach valid?
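To make that two-shot procedure concrete, here is a small sketch, assuming both readings have already been converted to W/m^2 as above (the function name and the numbers are mine, purely illustrative):

def radiative_power_loss(power_surface_w_m2: float,
                         power_background_w_m2: float) -> float:
    """Net radiative loss as the difference of the two converted readings:
    the surface's emitted flux minus the flux arriving from the scene it faces.
    Both inputs are assumed to already be in W/m^2."""
    return power_surface_w_m2 - power_background_w_m2

if __name__ == "__main__":
    # Hypothetical numbers: a 22 C wall facing roughly 5 C surroundings.
    print(f"Net loss: {radiative_power_loss(430.0, 339.0):.1f} W/m^2")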
If it is, I am afraid this 20% error on the absolute scale will matter even more when measuring a relative difference, since it is a small difference of two large numbers...

So... your opinions are welcome!
Thank you all!