A very simple rule of thumb: they produce the same number of watts in heat as you put in. I.e., for cooling purposes, assume an efficiency of 0%.
Only the very best LEDs approach efficiencies around 50%, in which case half of the power in would end up as heat directly and half would be emitted as light. Most likely, though, your LEDs are in the 20-30% efficiency range, so assuming 0% efficiency is close enough. The error in this assumption works in your favor, giving a bit of extra margin.
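If you want to put a number on that margin, here is a minimal sketch (the 2 W drive power and 30% wall-plug efficiency are just assumed illustrative values) comparing the worst-case 0% assumption with a more realistic heat load:

```python
# Heat dissipated by an LED for a given electrical input power.
# efficiency = wall-plug efficiency, i.e. the fraction of input power emitted as light.
def heat_watts(electrical_watts: float, efficiency: float = 0.0) -> float:
    return electrical_watts * (1.0 - efficiency)

p_in = 2.0  # W, hypothetical drive power

print(heat_watts(p_in, efficiency=0.0))  # 2.0 W -> worst-case figure used for sizing the cooling
print(heat_watts(p_in, efficiency=0.3))  # 1.4 W -> likely real heat load, so roughly 30% margin
```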
Standard formulae apply; see the thermal resistance specification of the heatsinks and LEDs you are going to use.
For example:
Heatsink is rated at 10 degC/W
LED is rated at RthJ-C (thermal resistance, junction to case) = 20 degC/W
Ambient temperature = 50 degC
LED power in = 2 W
Resulting die temperature = 50 degC + 2 W * (10 + 20) degC/W = 110 degC
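The same calculation as a small Python sketch, using only the illustrative figures from the example above (it ignores the case-to-heatsink interface resistance of the thermal paste or pad, which you would add in series in a real design):

```python
def junction_temp(t_ambient_c: float, power_w: float,
                  rth_heatsink_c_per_w: float, rth_jc_c_per_w: float) -> float:
    """Steady-state die temperature: ambient plus the temperature rise across
    the junction-to-case and heatsink thermal resistances in series."""
    return t_ambient_c + power_w * (rth_heatsink_c_per_w + rth_jc_c_per_w)

tj = junction_temp(t_ambient_c=50.0, power_w=2.0,
                   rth_heatsink_c_per_w=10.0, rth_jc_c_per_w=20.0)
print(tj)  # 110.0 degC, matching the worked example
```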
Then note the LED derating curves on the datasheet. While you can run most LEDs at around Tj=120 degC no problem, they last longer, give better efficiency, and allow higher output power if you can keep them cool.
To simplify all this, LEDs can run roughly at the same temperatures as CPUs. So if you have a CPU cooler designed to properly cool a CPU specified to produce 65W of heat flow, it will be able to cool a 65W LED as well. This assumes you can thermally couple the LED to the heat sink as well as you would couple the CPU to it.
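To turn that comparison into a sizing check, you can solve the same series-resistance model for the largest heatsink thermal resistance that still meets a junction temperature target. The numbers below (65 W of heat, RthJ-C = 0.5 degC/W, 40 degC ambient, 100 degC target) are assumptions for illustration, not from any particular datasheet:

```python
def max_heatsink_rth(tj_max_c: float, t_ambient_c: float,
                     power_w: float, rth_jc_c_per_w: float) -> float:
    """Largest heatsink (case-to-ambient) thermal resistance that keeps the
    junction at or below tj_max_c for the given heat load."""
    return (tj_max_c - t_ambient_c) / power_w - rth_jc_c_per_w

# Hypothetical 65 W LED module with RthJ-C = 0.5 degC/W, 40 degC ambient,
# and a 100 degC junction target for better lifetime and efficiency.
print(max_heatsink_rth(tj_max_c=100.0, t_ambient_c=40.0,
                       power_w=65.0, rth_jc_c_per_w=0.5))
# ~0.42 degC/W, i.e. a heatsink in the same class as a decent CPU cooler.
```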