I asked because I was wondering about the efficiency of the latest available high-power (a few watts) white (blue) LEDs. Efficiency, or at least efficacy, in %.
Nobody seems to give that information in the datasheet. They only give lm, lm/W, and so on. I did once see a 57.4% efficacy stated for some Luxeon high-power white LED, but now when I look at the same datasheet (and it was page 5), that figure is gone. WTF?
So I thought that if I knew at least a rule-of-thumb value of lumens per watt, I could work out the efficiency in %.
Out of curiosity, why are you so interested in watts/watts for your LEDs?
For the most part, nobody really cares about the watt-per-watt efficiency of LEDs, because total radiated power (which is to say, watts) is not a useful way to measure illumination. Note that illumination is distinct from irradiation: in the former we're generally talking about throwing light at something so that we can see it, which means we need to account for the spectral response of the eye, whereas in the latter we're talking about throwing energy in the form of light at something for some other reason. Lumens account for the human eye response, so that's why we measure illumination in lumens/area and irradiation in watts/area.

If you were to look at the datasheet for a UV LED, for example, you'd probably see the output listed in W or mW, because generally with those you're throwing energy at some process (killing bacteria, curing adhesives, or whatever) that responds to energy at that particular wavelength (and you don't care about the human eye response because you shouldn't be looking at the thing anyway).
Furthermore, if you're using LEDs for illumination, the efficiency of the emitter itself is only one part of the total efficacy of the finished system. You also have to account for how much light makes it through your optics (optical efficiency) and then how much useful light makes it to the subject (total efficacy).

Note here the distinction between efficiency and efficacy. Efficiency deals with getting the same type of thing out as you put in, such as putting in watts of electricity and getting out watts of light; efficacy deals with getting a different thing out than you put in, such as putting in watts of electricity and getting out lumens. By definition, efficacy isn't going to be a percentage, because with lumens/watt the units don't cancel out the way they do with watts/watts. To bring this all back to the beginning: if the ultimate receptor of the light is the human eye, then you need to account for its spectral response, and since that's exactly what lumens do, lumens are the useful thing that you get out of the LED, and watts are a completely meaningless unit for light*.
If you still really want to know the radiometric efficiency of your LED, you can use the table mikerj linked to at least get in the ballpark. For a single-color LED, you can simply pick a point on the table (or interpolate between two points; note that a few nm of wavelength makes a big difference, especially at the ends of the visible spectrum). For white LEDs, you'll need to tabulate the emission spectrum from the LED's datasheet, normalize the resulting table as well as the table that mikerj linked, and then multiply the two and sum the result. **
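If it helps, here's a rough sketch of that white-LED procedure in Python. The luminosity-function values and the emission spectrum below are illustrative made-up numbers on a coarse 50 nm grid (a real calculation would use the actual CIE table and the datasheet spectrum at 1-5 nm steps), and the lumens-per-electrical-watt figure is a hypothetical datasheet value:

```python
# Estimate the radiometric (watt-per-watt) efficiency of a white LED by
# weighting its emission spectrum with the photopic luminosity function.
# All numbers below are illustrative placeholders, not real data.

# Photopic luminosity function V(lambda), dimensionless, peaking near 555 nm
# (coarse grid for illustration only)
V = {450: 0.038, 500: 0.323, 550: 0.995, 600: 0.631, 650: 0.107}

# Hypothetical relative emission spectrum tabulated from a datasheet plot
S = {450: 1.00, 500: 0.35, 550: 0.55, 600: 0.60, 650: 0.25}

# Luminous efficacy of the radiation, in lumens per *optical* watt:
# normalize the spectrum, multiply by V(lambda), sum, and scale by
# 683 lm/W (the efficacy of monochromatic 555 nm light).
ler = 683 * sum(S[w] * V[w] for w in S) / sum(S.values())

# Combine with the datasheet's lumens per *electrical* watt to back out
# the watt-per-watt efficiency the original question asked about.
lm_per_electrical_watt = 100  # hypothetical datasheet figure
efficiency = lm_per_electrical_watt / ler

print(f"~{ler:.0f} lm per optical watt, ~{efficiency:.0%} radiometric efficiency")
```

For a single-color LED the same idea collapses to one table lookup: divide the datasheet's lm/W by the luminosity-function value (times 683) at the peak wavelength.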
* Unless you have a LOT of light (or much less, if it's coherent light) and are calculating ocular hazard, in which case watts and/or joules are very important. While the luminosity function is crucial to determining perceived brightness, energy is all it takes to cause retinal injury.
** If you have multiple single-color LEDs, things get tricky in the other direction, as the response of the eye to multiple discontinuous wavelengths does not necessarily add up the way that the luminosity function would suggest. That means that you can't really put the output of RGB LEDs in lumens with much confidence. There's some interesting work out of CIE on this subject, but last I looked the consensus was basically "it's really complicated".