EEVblog Electronics Community Forum
Electronics => Beginners => Topic started by: desertgreg on April 28, 2023, 08:21:53 pm
-
Most projects I see power their WS2812 LEDs from a 5 V rail. Internally there are three separate LEDs, and the blue LED only needs (I think) about 3.5 V. My thought is: would there be an advantage if I tuned the buck converter powering the LEDs to output something closer to 3.5 V instead of just using 5 V? It seems like this would let each chip do a smaller step-down to the LED voltage (though I don't know how WS2812s work internally). Obviously if I get too close to the minimum voltage for the blue LED, it would stop working correctly.
TLDR: Would my device gain efficiency by powering its WS2812 LEDs at 4 V instead of 5 V?
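As a rough back-of-envelope check of what's at stake: a WS2812 pixel sinks roughly constant current per channel regardless of supply voltage, so supply power scales roughly linearly with the rail voltage. The numbers below (per-channel current, strip length) are assumed for illustration, not taken from any datasheet:

```python
MA_PER_CHANNEL = 12   # assumed full-brightness current per colour channel, mA
CHANNELS = 3          # red, green, blue
NUM_PIXELS = 60       # assumed strip length

def strip_power(v_supply, brightness=1.0):
    """Approximate supply power in watts for the whole strip,
    treating each channel as a constant-current sink."""
    i_total_a = NUM_PIXELS * CHANNELS * MA_PER_CHANNEL * brightness / 1000.0
    return v_supply * i_total_a

p5 = strip_power(5.0)
p4 = strip_power(4.0)
print(f"5 V: {p5:.1f} W, 4 V: {p4:.1f} W, saving {(1 - p4/p5):.0%}")
# → 5 V: 10.8 W, 4 V: 8.6 W, saving 20%
```

Under this constant-current assumption, dropping from 5 V to 4 V saves about 20% of the power delivered to the strip, regardless of the exact current figures.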
-
Yes, it would be more efficient (assuming your power supply is a switcher whose input draw is proportional to output power, not output current). Reduce the voltage until you see a drop in current draw, then increase it a little. I've run short LED strips at 3.3 V, though we didn't need blue in that job.
Just make sure you allow for voltage drop, especially when using LED tape.
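To see why the voltage-drop warning matters when you lower the rail, here's a simple IR-drop model of a strip fed from one end. The trace resistance per segment and per-pixel current are hypothetical numbers chosen for illustration; real tape varies a lot:

```python
R_SEGMENT = 0.02   # ohms of supply + ground copper between adjacent pixels (assumed)
I_PIXEL = 0.036    # amps drawn per pixel at full white (assumed)
NUM_PIXELS = 60
V_IN = 4.0         # rail voltage injected at the near end

def voltage_at_pixel(n):
    """Supply voltage seen at pixel n (0-based), feeding from one end.
    The current through each copper segment is the total drawn by all
    pixels downstream of that segment."""
    v = V_IN
    for seg in range(n + 1):
        downstream = NUM_PIXELS - seg
        v -= downstream * I_PIXEL * R_SEGMENT
    return v

print(f"Voltage at far end: {voltage_at_pixel(NUM_PIXELS - 1):.2f} V")
# → Voltage at far end: 2.68 V
```

With these assumed numbers, a 4 V feed sags well below the WS2812's minimum by the far end of a 60-pixel strip, while a 5 V feed would leave more margin. Injecting power at both ends roughly quarters the worst-case drop.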
-
None at all.
Read and respect the datasheet. The 5 V is for the internal logic/drive and interface.
-
None at all.
Apart from saving up to 25% power, which may or may not matter
Read and respect the datasheet. The 5 V is for the internal logic/drive and interface.
The datasheet says 3.7-5.3 V supply
-
The datasheet says 3.7-5.3 V supply
Good.
Then go ahead with the OP's plan of running them at 3.5 V.
-
Rather late to the conversation, but heat would be another reason to test at lower voltages. I have a tight 10x10 grid of 2020s, and at 5 V and full brightness it gets too hot to touch.
-
The datasheet says 3.7-5.3 V supply
That should probably be measured at the far end of the LED chain. There is going to be a voltage drop.