What is the OP thinking of doing for heat dissipation? A typical 4K projector lamp will draw 250-300 watts alone, and if you've got a tiny transmissive display (like the OP's suggested LCD) you're going to have to dissipate a good portion of that from the display. I saw something like 33% loss earlier in the thread - I sure wouldn't be comfortable dumping 100W into a cheap consumer phone display for hours on end.
Good point and needs some consideration.
The LED isn't 100% efficient, maybe 15%, or ~100 lm/W [1]. For a 300W LED lamp at 15% efficiency, there is only 45W of visible light as the starting "100% backlight" source. True, there is also IR (heat), but most of that can be dissipated out the back of the lamp through its heatsink. Temperature times emitting area (and emissivity) gives an idea how much IR radiates forward. For an easy estimate, take emissivity = 1 and an emitting face the size of a 5.5" screen (83.39cm²), running like a CPU core on a cooler at 70C under load: total blackbody radiation at that temperature works out to ~6.5W.
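A quick Stefan-Boltzmann sketch to sanity-check that ~6.5W figure (assumes emissivity = 1 and radiation into cold surroundings, so it slightly overestimates):

    # Sketch: total blackbody radiation from a 70C, 83.39 cm^2 face.
    # Assumes emissivity = 1 and ignores back-radiation from the room.
    SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W/(m^2 K^4)
    area_m2 = 83.39 / 1e4      # 5.5" 16:9 panel area, cm^2 -> m^2
    T = 70 + 273.15            # surface temperature, K

    P_rad = SIGMA * area_m2 * T**4
    print(f"{P_rad:.1f} W")    # ~6.6 W, matching the estimate above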
Per http://hyperphysics.phy-astr.gsu.edu/hbase/quantum/radfrac.html at 70C it's basically all IR-C, with IR-A only ~5% at most. IR-A may be all you need to worry about reaching the panel, in fact, since the longer-wave IR can be blocked by glass, say a pane between the LED and LCD. So the starting point is still ~45W of visible light.
So, with the LCD passing only 5-8% of the light, round it to 10%: of the 45W of visible light, only ~4.5W actually gets projected!! That sounds like nothing, but if you take the ideal conversion of 1W = 683 lm, 4.5 x 683 = 3073.5 lm (how does that convert to ANSI lumens??). It cross-checks: the 300W LED @ 100 lm/W = 30,000 lm, and ~10% of that is 3,000 lm.
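The same arithmetic as a sketch (683 lm/W is the theoretical maximum, for 555nm green, so the first number is an upper bound for white light):

    # Sketch: watts of projected light -> lumens, computed two ways.
    # 683 lm/W is the ideal 555 nm conversion, so treat it as a ceiling.
    visible_w = 300 * 0.15             # 45 W of visible light from the lamp
    projected_w = visible_w * 0.10     # ~4.5 W through a ~10%-transmissive LCD

    lm_ceiling = projected_w * 683     # 3073.5 lm upper bound
    lm_cross = 300 * 100 * 0.10        # 300 W @ 100 lm/W, 10% through: 3000 lm
    print(lm_ceiling, lm_cross)        # same ballpark either way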
Anyway, that's still 45W - 4.5W projected = 40.5W that the screen absorbs. Spread over the panel, 40.5W is only ~0.5 W/cm² for the LCD glass to dissipate, or ~0.25 W/cm² per side counting both faces. A good amount could also sink out through the edges to the case, or into a touching piece of glass. The LCD datasheet might show a 60C high-temperature operational test for 240hrs, but that may not be intended as continuous duty, not sure. Add airflow over the LCD and it should be fine.
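As a sketch, reusing the 83.39cm² panel area assumed above:

    # Sketch: heat flux the LCD glass itself has to shed.
    absorbed_w = 45.0 - 4.5        # visible light in, minus light projected out
    area_cm2 = 83.39               # 5.5" 16:9 panel, from above

    flux_total = absorbed_w / area_cm2   # ~0.49 W/cm^2 over one face
    flux_per_side = flux_total / 2       # ~0.24 W/cm^2 if both faces shed heat
    print(f"{flux_total:.2f} W/cm^2 total, {flux_per_side:.2f} W/cm^2 per side")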
Maybe a good idea [2]: put a reflective polarizer in front of the lamp, aligned to the same polarization as the back LCD polarizer. THEN the back LCD polarizer absorbs basically no light, and it [should be] basically like having the reflective polarizer on the LCD itself. A normal absorptive polarizer has 50-60% loss.
http://informationdisplay.org/portals/informationdisplay/issues/2010/09/art6/GIFS/Fig_1b.jpg
With the first polarizer's 45% transmittance in the stack, the total comes to 7.6%; without that loss it would be ~17%, over double the output. If the 'polarization recycling' were itself only 50% efficient, ~12% is reached, which is still over a 50% gain. However, the best I have read of is a 30% gain, which is still substantial and means a ~25% reduction in power... that takes 10W or so off the LCD, down to ~30W, not to mention bringing the 300W LED down to ~225W. Quite an improvement, really.
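The arithmetic as a sketch (the 7.6% / 45% stack numbers come from the Fig_1b diagram above; the 50%-recycling model is just my simple assumption):

    # Sketch: what polarization recycling buys, per the Fig_1b stack numbers.
    stack_total = 0.076      # whole-stack transmittance w/ absorptive rear polarizer
    rear_pol = 0.45          # rear polarizer transmittance on its own
    rest_of_stack = stack_total / rear_pol   # ~16.9%: ceiling if recycling were perfect

    # Assumed model: 50%-efficient recycling returns half the rejected light.
    half_recycled = rest_of_stack * (rear_pol + 0.5 * (1 - rear_pol))
    print(f"{rest_of_stack:.1%} ideal, {half_recycled:.1%} at 50% recycling")

    # Best reported figure: +30% light, i.e. same brightness at 1/1.3 the power.
    # Close to the ~225 W / ~30 W round numbers above.
    print(f"{300 / 1.3:.0f} W lamp, {40.5 / 1.3:.0f} W into the LCD")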
I'm considering 6000K LED car headlights with fans / heatsinks attached for a lamp. (Not sure if any of those heatsinks look capable of even 100W though)
[1] https://en.wikipedia.org/wiki/Luminous_efficacy
[2] http://www.apioptics.com/reflective-polarizer-films.html

Interesting: http://electroiq.com/blog/2008/01/shedding-light-on-alternative-to-lcd-manufacturing/ (a TFT 'MEMS' display with no LCD, filter, or polarizer -- 'vaporware'?)