From memory, the simplest way to approach it is to remove the IR-filtering lens from a webcam or camera and use a special type of exposed camera film to filter out all visible light instead.
Unfortunately that won't result in a thermal image camera. Normal (visible light) image sensors can sense up to perhaps 1um wavelength with the IR filters removed. A true thermal image camera works at much longer wavelengths, on the order of 10um. "Low cost" thermal cameras use a microbolometer, which is basically an array of microscopic thermistors suspended above an ASIC containing readout circuitry, all mounted in a vacuum chamber. The heating due to incident IR light is measured by the thermistors and read out by an ADC.
Higher end cameras use something more along the lines of a conventional CCD or CMOS sensor, but using different materials. The sensor also has to be cooled well below ambient, to perhaps -100C, to avoid being blinded by its own thermal radiation (imagine designing a camera where the image sensor and all the casing, lenses, etc. were glowing white hot).
I was thinking about a practical way to make a thermal imager, and the only thing I can come up with is a "minibolometer" using an array of the smallest SMT thermistors available. Resolution would be low and the response time would be very slow, likely many seconds, but it might just be possible.
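The readout math for such a thermistor array is simple enough. A minimal sketch, assuming a generic 10k NTC part and the standard Beta equation; the `r0`, `t0_c`, and `beta` values here are hypothetical and would need to come from the actual thermistor's datasheet:

```python
import math

def thermistor_temp_c(r_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Convert an NTC thermistor resistance (ohms) to temperature (C)
    using the Beta equation. r0 is the resistance at reference
    temperature t0_c; beta is the part's Beta constant. The defaults
    are placeholder values for a generic 10k NTC, not a real part."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / beta
    return 1.0 / inv_t - 273.15

# At the nominal resistance the equation returns the reference temperature:
print(round(thermistor_temp_c(10_000.0), 1))  # 25.0
```

Each "pixel" would just be this conversion applied to one thermistor's resistance as measured by the ADC.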
You could buy one of those ~$50 non-contact digital infrared thermometers (the kind you point at an object and it reports the temperature).
Then take it apart and somehow mount the sensor and optics on a scanning motor so you can build an image of a 2D area by reading values and position.
I have no idea if it would work but it wouldn't be too expensive to find out.
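The scanning logic itself would be trivial. A minimal sketch, assuming you can somehow tap a reading out of the thermometer and drive a pan/tilt mount; `read_sensor`, `move_to`, the grid size, and the dwell time are all hypothetical placeholders:

```python
import time

GRID_W, GRID_H = 32, 24   # coarse scan grid (guessed resolution)
DWELL_S = 0.1             # settling time per point (guessed)

def read_sensor():
    # Placeholder: read the IR thermometer's output, e.g. by tapping
    # its ADC or serial lines. Returns a temperature in C.
    return 20.0

def move_to(x, y):
    # Placeholder: command the pan/tilt servos to aim the sensor
    # at grid cell (x, y).
    pass

def scan_image(w=GRID_W, h=GRID_H, dwell_s=DWELL_S):
    """Raster-scan the sensor over a w x h grid and return the
    readings as a list of rows -- a crude 2D thermal image."""
    image = []
    for y in range(h):
        row = []
        for x in range(w):
            move_to(x, y)
            time.sleep(dwell_s)   # let the servos and sensor settle
            row.append(read_sensor())
        image.append(row)
    return image
```

At 0.1 s per point, a 32x24 frame would take about 77 seconds, which fits the slow-scan character of the idea.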
That is an interesting idea. I wonder what sort of update rate one of those sensors could handle. It would be like ham radio slow-scan TV (SSTV).
This has actually been done before:
http://www.cheap-thermocam.tk/