
Power supply and resistor for CREE LEDs


Paul92:
I got some CREE LEDs from AliExpress, and I am trying to build a nice lamp out of them. I also have an ATX power supply lying around which I hope to use.

According to the seller, the LEDs are 3.4-3.6V at 3W, and my supply can deliver 15A at 3.3V. Yes, 3.3V is lower than the minimum 3.4V, but the LEDs seem to light up anyway.

Since my source matches the forward voltage of the LEDs, do I need a resistor in there? If so, how could I compute it? And is an ATX supply a reasonable choice here?

james_s:
Cree is a company, they make *many* types of LEDs.

Power LEDs like that normally run around 750-1,000 mA, so a resistor is not really a good way to limit the current. Instead they use constant-current drivers in the form of buck or boost converters. The PT4115 is one that I've used several times; it's inexpensive, easy to work with and works well. You can get ready-made drivers in the form of small boards meant for MR16 LED retrofits.
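
For reference on how such a driver is set: the PT4115 regulates a small sense voltage (about 0.1 V per its datasheet) across an external resistor, so the LED current comes out to roughly 0.1 V / R_sense. Here is a minimal Python sketch of that sizing, with the 700 mA target and the function name purely as illustrative assumptions:

# Rough sizing of the current-sense resistor for a PT4115-style buck driver.
# Assumes the ~0.1 V sense threshold from the PT4115 datasheet; check the
# datasheet before trusting the exact numbers.
V_SENSE = 0.1                        # volts regulated across the sense resistor

def sense_resistor(i_led):
    """Return (R_sense in ohms, its dissipation in watts) for a target LED current."""
    r = V_SENSE / i_led
    p = V_SENSE * i_led              # power burned in the sense resistor
    return r, p

r, p = sense_resistor(0.7)           # example target: 700 mA
print(f"R_sense = {r:.3f} ohm, about {p:.2f} W")
# -> roughly 0.14 ohm, well under 0.1 W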

Audioguru:
Your power supply voltage is low, so the LED current is low while the LEDs are cold. But as they warm up they need less voltage, so the current rises, which makes them warm up more, which makes the current rise more, and so on until they burn out from thermal runaway. You MUST use a resistor or a regulated (constant) current to limit the current.

mariush:
Don't power them from 3.3v, because the forward voltage of the leds will vary with temperature.

The easiest and cheapest way would be to use some diodes to drop the 5v rail closer to 3.6v, and then use a resistor to limit the current.

For example, get some 2-3A diodes with a forward voltage of around 1v, e.g. RL255 or 1N5408; these will have around 0.8v-0.9v of voltage drop at 1A.
If you don't have these, you could parallel two 1N400x diodes so that each diode only carries half an amp.

Anyway, with a diode (or two paralleled) in series with the 5v output, you'll have approx. 4.1-4.2v.
Let's say you have a 3.6v 3W led (3W / 3.6v ≈ 0.83A); at that current it will actually drop closer to 3.7v, so the formula to calculate the resistor becomes:

4.2v - 3.7v = 0.83A x R  =>  R = (4.2 - 3.7) / 0.83 ≈ 0.6 ohm ... so you could use 0.68 ohm, or two 1.2 ohm resistors in parallel.

The resistor will dissipate P = I x I x R = 0.83 x 0.83 x 0.6 ≈ 0.41 watts, so you'll have to use a resistor (or resistors) rated for 1W.

+5v ----[ diode(s) >| ]----[ resistor(s) ]---- led +
GND ------------------------------------------ led -
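
As a quick sanity check of the arithmetic above, here is a minimal Python sketch of the same ballast-resistor calculation (the 0.8 V diode drop, 3.7 V led voltage and 0.83 A current are the rough figures assumed above, and the function name is just illustrative):

# Series ballast resistor: R = (V_supply - V_diode - V_led) / I_led
def ballast(v_supply, v_diode, v_led, i_led):
    r = (v_supply - v_diode - v_led) / i_led     # required resistance, ohms
    p = i_led ** 2 * r                           # heat dissipated in it, watts
    return r, p

# 5 V rail, ~0.8 V across the series diode(s), ~3.7 V led at ~0.83 A
r, p = ballast(5.0, 0.8, 3.7, 0.83)
print(f"R = {r:.2f} ohm, P = {p:.2f} W")
# -> roughly 0.6 ohm and 0.4 W of heat, so a 0.68 ohm / 1 W part is fine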


HOWEVER, what I would suggest is to get THREE of those leds and wire them in series; you'll get 3 x 3.6v ≈ 10.8v and then you can power them from 12v.

12v - 10.8v = 0.83A x R  =>  R = 1.2v / 0.83A ≈ 1.45 ohm ... so you can just use a 1.5 ohm resistor, or 2 x 3.3 ohm in parallel for 1.65 ohm (higher resistance means a bit less current).

The power dissipated will be 0.83 x 0.83 x 1.5 ≈ 1.03 watts, so you'd want to use a resistor rated for 2-3W or so.
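
And the same quick check for the three-in-series option (again using the rough 3.6v / 0.83A figures from above):

# Three leds in series from the 12 V rail, no extra diode needed.
v_supply, v_leds, i_led = 12.0, 3 * 3.6, 0.83
r = (v_supply - v_leds) / i_led      # roughly 1.45 ohm -> use a standard 1.5 ohm
p = i_led ** 2 * r                   # roughly 1 W of heat -> pick a 2-3 W resistor
print(f"R = {r:.2f} ohm, P = {p:.2f} W")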


Another thing I'd suggest is to not expect those leds to actually be 3W... expect more like 1W and limit the current accordingly. Also... you MUST use a heatsink for the leds, or they'll die soon.

james_s:
If they really are Cree, then 3W LEDs can be driven at 3 watts. We're not talking about cheap integrated lamps rated in bogus Chinese watts. When you can get proper drivers for a few dollars, why mess with resistors or stringing diodes together? There are all kinds of drivers like these: https://www.aliexpress.com/item/4000034857143.html?
