If you point a cheap phone camera at an IR source, such as a TV remote control, you'll see a bluish or purplish light.
Actually, any camera sensor can pick up IR, but more expensive cameras have lenses with IR-blocking filters.
But why purple? What happens between the imaging sensor and the screen that essentially turns one color into another?
Semiconductor image sensors (CCD and CMOS) are intrinsically extremely sensitive to IR, and progressively less sensitive through to blue and then UV. (The opposite of silver halide film, by the way!) So we put IR filters over the sensors to block the IR, which would otherwise significantly compete with the sensitivity in the visible light spectrum. The fact that residual IR makes it through is simply a property of the imperfect filters we have available. That it looks purplish, as opposed to pure white, has to do with the additional filtering of the RGB filters of the Bayer pattern over each pixel (presumably, the green filter removes IR slightly more than the red and blue filters, giving the purple tinge).
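To make the last point concrete, here is a minimal sketch of why unequal IR leakage through the Bayer color filters reads as purple. The per-channel leakage numbers are pure assumptions chosen to match the explanation above (green blocking slightly more IR than red and blue), not real sensor measurements:

```python
# Hypothetical relative IR transmission of each Bayer color filter.
# Assumed values for illustration: the green filter leaks the least IR.
ir_leakage = {"red": 1.00, "green": 0.70, "blue": 0.90}

# Scale to 8-bit channel values, as a naive camera pipeline might.
peak = max(ir_leakage.values())
rgb = {ch: int(255 * v / peak) for ch, v in ir_leakage.items()}
print(rgb)  # red and blue dominate green, which renders as a purple/magenta tinge
```

Mixing strong red and blue with weaker green is exactly the recipe for magenta/purple in an RGB display, which is why the residual IR does not look pure white.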
It is untrue that only cheap cameras exhibit this effect; it affects all cameras with semiconductor sensors. Without an IR filter over the sensor, the color of the image would be unusably wrong, and potentially the focus as well, since IR focuses at a slightly different distance than visible light. Any differences in IR sensitivity come down to the characteristics of the components used.
The upper left quadrant of this picture is what a DSLR with its IR filter removed produces, in auto white balance. (The other 3 are with various filters and white balance settings.)
FYI, in security cameras and camcorders with an IR-illuminated night vision mode, you'll always hear a clunk when activating that mode, as the IR filter is physically removed from the optical path.
Essentially, the spectral response of each red, green, and blue pixel in a camera doesn't exactly match the spectral response of the receptors in your eyes. What you see through the camera is a mixture of mostly red and blue wavelengths with a slight touch of green.
Huh? Human vision is
most sensitive to green, which is why the typical RGB Bayer pattern* devotes twice as many pixels in a cell to green as to red and blue:
RG
GB
This reduces noise in the green spectrum where we are most sensitive to it.
Semiconductor sensors are more sensitive to longer wavelengths (red/IR) than shorter ones (green/blue/UV), so the color filters and the software have to balance the relative sensitivities to produce a reproduction that is faithful to human eyes.
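The software side of that balancing act can be sketched as simple per-channel white-balance gains: pick gains so that a neutral gray target comes out neutral despite the unequal raw sensitivity. The raw responses below are assumed values for illustration, not measurements:

```python
# Hypothetical raw channel responses of a sensor photographing neutral gray
# (red strongest, matching the sensitivity ordering described above).
raw_gray = {"red": 0.80, "green": 0.55, "blue": 0.45}

# Gain each channel up to the strongest one so gray renders as gray.
reference = max(raw_gray.values())
gains = {ch: reference / v for ch, v in raw_gray.items()}
balanced = {ch: raw_gray[ch] * gains[ch] for ch in raw_gray}
print(balanced)  # all channels now equal, i.e. neutral gray
```

Real pipelines are far more involved (color matrices, per-illuminant tuning), but this is the basic idea of balancing relative sensitivity in software.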
The optical filters on the individual pixels of the camera sensor don't work as expected so far outside the visible band: there is no red, green, or blue output from the IR LED, yet all of the sensor pixels are activated in an 'unspecified' ratio (basically, somewhere near white). Presumably a better camera would have a better IR-stop filter to reduce these out-of-band wavelengths reaching the sensor.
Exactly. The quality of the IR filter and the color filters (not their presence, which is mandatory) is what determines how an IR LED appears.
*Alternative Bayer patterns have been toyed with in the past, mostly by Sony, including CMYG (cyan/magenta/yellow/green) and RGBE (red/green/blue/emerald). The idea was to spread sensor noise across the RGB output pixels.