@Ben321,
> I will not argue with you about noise injection on the F1G2
I'm talking about the F1G3, the one released for Android phones, and I did my testing with an Android phone. The F1G2 was only ever available for the iPhone 5, and I don't own one. My statement that there doesn't appear to be any noise injected, as made in this thread's opening post, is based on my testing of the Android-based F1G3 using the sample app that comes with the FLIR One SDK.
> The E4 had the noise injection wound up high to degrade its NETD compared to other cameras in the series.
Maybe so, but the FLIR One G3 does not do that. In fact, if you look at the configuration file that comes with the FLIR One SDK, you will see that noise injection is disabled:
.caps.config.image.targetNoise entry
.caps.config.image.targetNoise.enabled bool false
.caps.config.image.targetNoise.targetNoiseMk int32 0
I assume this is the same configuration file, and from the same SDK, that would have been used by FLIR to compile their official FLIR One app.
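To make the claim concrete, here is a minimal sketch (my own illustrative code, not anything from the FLIR One SDK) of parsing flat dotted-key config lines like the ones quoted above and checking the target-noise flags. The parsing rules are an assumption based purely on the three lines shown:

```python
def parse_caps_config(text):
    """Parse lines of the form '.dotted.key type value' into a dict."""
    caps = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) < 3:
            continue  # skip blanks and bare 'entry' group markers
        key, type_name, value = parts[0], parts[1], parts[2]
        if type_name == "bool":
            caps[key] = (value == "true")
        elif type_name == "int32":
            caps[key] = int(value)
    return caps

# The three lines quoted from the SDK's config file:
config_excerpt = """\
.caps.config.image.targetNoise entry
.caps.config.image.targetNoise.enabled bool false
.caps.config.image.targetNoise.targetNoiseMk int32 0
"""

caps = parse_caps_config(config_excerpt)
print(caps[".caps.config.image.targetNoise.enabled"])        # False
print(caps[".caps.config.image.targetNoise.targetNoiseMk"])  # 0
```

Read this way, the file says outright that target-noise injection is switched off and its magnitude is zero.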
> Please consider the marketing angle
Actually, if anything, it would hurt marketing. Something like that is nearly inevitable to be found out, and when it is, it makes FLIR look very dishonest: charging thousands of dollars more for a camera with no extra hardware, just firmware that leaves the same hardware less crippled, and which can easily be altered by changing a couple of lines of code. That would reflect badly on the company and turn away potential customers.
> FLIR are past masters of thermal image processing but they wish to market different cameras, using similar hardware, at differing price points. Hence why an E4 is just an E8 with its resolution and NETD deliberately degraded.
But I'm not talking about the E series, or even about previous generations of the FLIR One. I'm talking about the FLIR One gen-3. That's what I was ALWAYS talking about in this thread, right from the opening post. Perhaps I should have been more explicit about that, but I didn't think I needed to be, since I assumed most people are now using the gen-3. So from now on, please base all replies to this thread on the fact that I'm talking about the FLIR One gen-3. The FLIR One gen-2 is irrelevant to the test results I gave describing the performance of the F1G3.
Can you prove that the FLIR One gen-3 does in fact inject additional noise? If it were really done for marketing purposes, and they were thoroughly trying to make sure the end-user believed the official FLIR One app showcased the device's best possible performance, they'd go out of their way to make sure the end-user couldn't simply program around it. Yet that is not what we see. The SDK clearly allows direct access to the 14-bit-per-pixel raw data. I tend to think there may be some technical reason why FLIR chose to use cropping in the official app, and why they might have injected any noise (if they even did).
Or are you arguing that even the raw data itself (as is accessible through the SDK) contains noise injected into it?
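For anyone following along, here is what direct access to the 14-bit raw data buys you. This is a generic sketch of my own (not FLIR SDK code, and the frame values are made up): once you have the raw counts, you can apply your own linear contrast stretch to the 8-bit display range, rather than accepting whatever cropping or processing the official app applies.

```python
def stretch_14bit_to_8bit(raw_frame):
    """Linearly map a frame of 14-bit counts (0..16383) onto 0..255,
    using the frame's own min/max as the stretch limits (auto-gain)."""
    lo, hi = min(raw_frame), max(raw_frame)
    span = max(hi - lo, 1)  # avoid division by zero on a flat frame
    return [(v - lo) * 255 // span for v in raw_frame]

# A made-up 2x2 "frame" of raw counts, flattened to a list:
frame = [8000, 8100, 8050, 8200]
print(stretch_14bit_to_8bit(frame))  # [0, 127, 63, 255]
```

If injected noise existed only in the processed output, a stretch like this applied to the raw buffer would show a cleaner image than the official app; if the raw buffer itself were noisy, it wouldn't, which is exactly the distinction my question above is getting at.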