Author Topic: Need help with LED cooling, is there a rule of thumb on heat produced?  (Read 1875 times)


Offline Undweeber (Topic starter)

  • Regular Contributor
  • *
  • Posts: 107
  • Country: us
[Attached image: LED module and candidate heatsinks]
Is there a rule of thumb for how much heat (in watts) a COB LED produces?
I got a 2700K 65W LED and I'm not sure which heatsink would be suitable. As you can see in the picture, I have two CPU heatsinks that I can actively cool, two small passive heatsinks from a motherboard, and a small active GPU heatsink assembly.

I don't have a thermal camera to monitor and experiment, so I would have to use my own fingers to gauge the temperature, and I'm not sure what's safe for it. The LED is super thin, so it would burn instantly if powered on without the heatsink.
 

Offline Gregg

  • Super Contributor
  • ***
  • Posts: 1186
  • Country: us
The rule of thumb is that the laws of thermodynamics apply (unless you are a free energy fanatic).  That means that if the LED is rated for 65 watts, you have to be able to dissipate 65 watts of energy as heat, worst case.  Then you have to consider the thermal conduction from the LED substrate to the heatsink; in other words, have surfaces with lots of area in contact and use heatsink compound.  Then you have to consider the ambient temperature, the surface area of the heatsink, the thermal conductivity of the heatsink material, and whether it is forced air or just convection transfer to the atmosphere.


Edit:  From your picture, it appears that either of the two large aluminum heatsinks would suffice if air can freely flow around them.  The smaller one with the fan may work.  Many multimeters have thermocouple temperature inputs that could measure the temperature.  If it burns your fingers it probably is too hot for the LEDs to last very long.
« Last Edit: June 05, 2020, 08:07:05 pm by Gregg »
 

Online tunk

  • Super Contributor
  • ***
  • Posts: 1325
  • Country: no
The two large heatsinks look like they could have been used with
50-100W CPUs together with fans. I guess you have to test if they
will work without fans. The others are too small.
« Last Edit: June 06, 2020, 10:43:01 am by tunk »
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 10336
  • Country: fi
Very simple rule of thumb: they produce the same number of watts in heat as you put in. I.e., for cooling purposes, assume an efficiency of 0.

Only the very best LEDs approach efficiencies around 50%, in which case half of the power in would end up as heat directly, half would be emitted as light. But, most likely your LEDs are in the 20-30% efficiency range, so assuming 0% efficiency is close enough. The error in this assumption works for you, giving a bit of extra margin.

Standard formulae apply; see the thermal resistance specification of the heatsinks and LEDs you are going to use.

For example:
Heatsink is rated at 10 degC/W
LED is rated at RthJ-C (thermal resistance junction-to-case) = 20 degC/W,
Ambient temperature = 50 degC
LED power in = 2W

Resulting die temperature = 50 degC + 2W * (10+20)degC/W = 110 degC.
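In code form, the same arithmetic is just the thermal-resistance chain (a minimal sketch; the numbers are the example figures above, not from any particular datasheet):

Code:
# Rough junction-temperature estimate from the thermal resistance chain.
# The values are the example figures from this post, not real datasheet numbers.

R_heatsink = 10.0   # degC/W, heatsink to ambient
R_jc       = 20.0   # degC/W, LED junction-to-case
T_ambient  = 50.0   # degC
P_heat     = 2.0    # W, assume all electrical power becomes heat

T_junction = T_ambient + P_heat * (R_heatsink + R_jc)
print(f"Estimated junction temperature: {T_junction:.0f} degC")  # 110 degC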

Then note the LED derating curves on the datasheet. While you can run most LEDs at around Tj=120 degC no problem, they last longer, give better efficiency, and allow higher output power if you can keep them cool.

To simplify all this, LEDs can run roughly at the same temperatures as CPUs. So if you have a CPU cooler designed to properly cool a CPU specified to produce 65W of heat flow, it will be able to cool a 65W LED as well. This assumes you can thermally couple the LED to the heat sink as well as you would couple the CPU to it.
« Last Edit: June 06, 2020, 06:50:59 am by Siwastaja »
 

Offline ahbushnell

  • Frequent Contributor
  • **
  • Posts: 776
  • Country: us
Quote
Very simple rule of thumb: they produce the same number of watts in heat as you put in. I.e., for cooling purposes, assume an efficiency of 0.

They produce the same watts that are put in.  But it's split between light and heat.  So the heat will be less depending on the efficiency. 

 

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 22435
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Yeah, modern LEDs are so efficient, they dissipate palpably less than their electrical input.  There isn't much design consequence as the heatsink isn't, like, magnitudes smaller (not yet anyway..), but it's there.

As for finding out what the difference is, hard to say; you can calculate backwards from the luminance, angle and spectrum, though that's a PITA.
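As a rough illustration of that back-calculation (a sketch only; the ~340 lm per optical watt efficacy for a warm-white spectrum and the 6000 lm flux are assumed ballpark numbers, not datasheet values):

Code:
# Estimate radiated optical power from luminous flux, then heat by subtraction.
# Both the flux and the luminous efficacy of radiation are assumed ballpark values.

P_electrical = 65.0     # W, electrical input
lum_flux     = 6000.0   # lm, hypothetical luminous flux at that drive level
ler          = 340.0    # lm per optical watt, assumed for a warm-white spectrum

P_optical = lum_flux / ler             # watts leaving as light
P_heat    = P_electrical - P_optical   # watts the heatsink must handle

print(f"Optical ~{P_optical:.1f} W, heat ~{P_heat:.1f} W "
      f"({P_heat / P_electrical:.0%} of input)")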

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
It might be easier to just measure it. Take a known volume of aluminum, thermally bond the LED to it, and then measure the temperature rise over a period of time; that should allow you to calculate the heat output reasonably accurately.
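A minimal sketch of that calorimetry estimate, assuming a hypothetical 1 kg aluminium block (mass = volume times density) and ignoring losses to the air, so it only holds for short runs with a modest temperature rise:

Code:
# Crude calorimetry: heat power = mass * specific heat * temperature rise / time.
# Ignores heat lost to the surroundings; keep the run short.

mass_al  = 1.0     # kg, hypothetical aluminium block
c_al     = 900.0   # J/(kg*K), specific heat of aluminium
delta_T  = 10.0    # K, measured temperature rise
duration = 180.0   # s, time over which the rise was measured

P_heat = mass_al * c_al * delta_T / duration
print(f"Estimated heat output: {P_heat:.0f} W")   # 50 W with these numbers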
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 10336
  • Country: fi
If it's an Ebay LED, the efficiency is almost guaranteed to be below 30%; very likely below 20%.

If it really is, say, 50% efficient, that's called "state of the art" and you know if you have it; they tend to be unobtainium even from proper distributors (datasheet numbers are misleading; the front-page spec tends to be for a higher-efficiency bin which you actually can't buy; OSRAM does this all the time, for example).

Therefore, assuming 100% of power in goes to heat is a good first-order approximation when designing for cooling, and the reality is only slightly better (playing in your favor).

If you need Tc (LED mounting case temperature), measuring is obviously easiest because it solves all the unknowns, such as Rc-to-heatsink, Rheatsink, and so on.

If you want to know Tj (LED junction temperature), you can combine measuring the heatsink temperature with the known power in. If you don't want to assume 0% efficiency, by all means assume 25% instead and you'll be very close.
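A sketch of that estimate with placeholder numbers (the 25% efficiency, the measured heatsink temperature, and the thermal resistances are all assumptions, not values for any specific part):

Code:
# Estimate junction temperature from a measured heatsink temperature.
# All numbers are placeholders; substitute your own measurements and datasheet values.

P_in        = 65.0   # W, electrical input
efficiency  = 0.25   # assumed fraction of input leaving as light
T_heatsink  = 60.0   # degC, measured near the LED mounting point
R_jc        = 0.6    # degC/W, junction-to-case (placeholder)
R_interface = 0.2    # degC/W, paste / mounting interface (placeholder)

P_heat = P_in * (1.0 - efficiency)                 # power the junction must shed
T_j    = T_heatsink + P_heat * (R_jc + R_interface)
print(f"Heat: {P_heat:.1f} W, estimated Tj: {T_j:.0f} degC")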
« Last Edit: June 07, 2020, 07:07:54 am by Siwastaja »
 

Offline Undweeber (Topic starter)

  • Regular Contributor
  • *
  • Posts: 107
  • Country: us
Thank you all for the replies. This is the LED I'm dealing with; it's 2700K and 95+ CRI: https://download.luminus.com/datasheets/Luminus_CXM-27_GEN1_Datasheet.pdf

Those large heatsinks in the original post are indeed CPU coolers, and since those normally handle 60-80W they will definitely handle the LED; they're also the perfect size. It does appear I will have to just wing it and test it. I will also test the small active cooler from the GPU by slowly increasing the brightness.

I'm not sure what efficiency the LED is; I would hope above 30%. Its junction temperature is 140°C (I don't know what that means), and the nominal operating temperature it was tested at was 85°C, so that's what I'll be aiming for at full power (65W). In their tests they far exceed 65W, so I don't know what that's all about.

I am still looking for a constant-current power supply with dimming abilities... still not sure what AWG wire to get; is it 10AWG or 14AWG? 10 seems a bit too fat.
 

Offline Gregg

  • Super Contributor
  • ***
  • Posts: 1186
  • Country: us
Quote
still not sure what AWG wire to get; is it 10AWG or 14AWG? 10 seems a bit too fat.

From the data sheet the LED draws 1800mA (1.8A) at the desired 65 watt illumination; the maximum listed is twice that or 3.6A.  20AWG wire would be quite sufficient and easy to use unless the LED is a long distance from the power supply.  The power wires can be considered series resistors with very low resistance which in your case will not affect the LED light output by a noticeable amount. 
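For a sanity check on the wiring, a quick voltage-drop estimate (the 2 m round-trip length is an assumption; 20AWG copper is roughly 0.033 ohm per metre):

Code:
# Quick voltage-drop check for the LED supply wiring.
# Round-trip length and gauge resistance are assumptions; adjust for your setup.

current         = 1.8    # A, LED drive current at ~65 W
length_rt       = 2.0    # m, round-trip wire length (out and back)
ohm_per_m_20awg = 0.033  # ohm/m, approximate for 20AWG copper

v_drop = current * ohm_per_m_20awg * length_rt
p_loss = v_drop * current
print(f"Voltage drop ~{v_drop:.2f} V, wire loss ~{p_loss:.2f} W")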
 

