Battery powered LED PWM dimmer circuit
doublec4:
Hi All,

I have two LED modules that were designed to run at 12V. They consume ~200mA each. I would like to run my LED lights from 3 18650 batteries wired in series to produce a total of 12.6V peak and probably have a cutoff somewhere around 9V. When I hook up my LEDs to my adjustable power supply and run them down to ~9V there is no appreciable change in brightness.

Therefore, I am looking to run them either full brightness (seems straightforward, run them right off the batteries...) or half brightness at the flip of a switch (think car tail/brake lights). The dimming part is where I am having some difficulty as I would like it to be efficient (PWM rather than constant current?) and simple. I have seen various circuits using 555 timer as the PWM pulse going to a transistor base but I'm not sure if this is efficient? I would like to get the most out of the battery life if possible.

Any guidance on what to look for in terms of achieving efficiency or candidates for circuits would be helpful.

Thank you!
mariush:
How about you use a proper LED DRIVER IC?

See https://www.digikey.com/products/en/integrated-circuits-ics/pmic-led-drivers/745 for examples.

Plain LEDs are not controlled by voltage; the brightness is controlled by varying the amount of current going through the LED. There is a minimum forward voltage required for the LED to turn on, and there's a very narrow voltage region where the LED may not be fully turned on (let's say between 9V and 9.5V), but once you go above that region you must limit the current, otherwise the LED will be damaged.

The most basic way to limit current is by placing a resistor in series with the LED, so that's probably what you have on your "modules".
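To make that concrete, here's a minimal sketch of the series-resistor math, assuming (purely hypothetically) that each 12V module is just a string of three white LEDs at ~3.2V each plus a resistor:

--- Code: ---# Rough series-resistor math for a hypothetical 12 V LED module:
# assume a string of three white LEDs (~3.2 V each) and a 200 mA target.
v_supply = 12.0          # the module's design voltage
v_forward = 3 * 3.2      # assumed total LED string forward voltage (hypothetical)
i_target = 0.200         # desired LED current in amps

r_series = (v_supply - v_forward) / i_target   # leftover voltage / current
p_resistor = i_target ** 2 * r_series          # power burned in the resistor

print(f"series resistor ≈ {r_series:.1f} ohm, dissipating ≈ {p_resistor:.2f} W")
# ≈ 12 ohm, ≈ 0.48 W, all of it lost as heat, which is one reason a switching
# LED driver can be more efficient than a plain resistor.
--- End code ---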

An LED driver IC starts at a low output voltage and ramps it up until the LEDs connected to it light up and current begins to flow. A sense resistor is used to monitor that current, and the driver constantly adjusts the output voltage up and down to hold the LED current at the level you set.

Here's an example of a very efficient LED driver IC: https://www.digikey.com/product-detail/en/diodes-incorporated/AL8860WT-7/AL8860WT-7DICT-ND/6226981

See page 2 of the datasheet where you can see how simple the circuit is: https://www.diodes.com/assets/Datasheets/AL8860.pdf
You can set the maximum brightness by setting the maximum current (200mA) with a resistor, and then you can use the CTRL pin to adjust brightness easily.

For example, for the chip linked above, the maximum current is set with the formula Iout = 0.1 / Rs, so Rs = 0.1 / Iout.
For 200mA you'd need Rs = 0.1 / 0.2A = 0.5 ohm, for 100mA you'd need 1 ohm, and for 50mA you'd need 2 ohm. So you could put 0.5 ohm + 0.5 ohm + 1 ohm in series:

50mA : leave everything as is (0.5 + 0.5 + 1 = 2 ohm)
100mA : use a switch to short out the 1 ohm resistor; you're left with 1 ohm, so you get 100mA
200mA : use a switch to short out the 1 ohm and one 0.5 ohm resistor, leaving only 0.5 ohm

Or, if you only need 2-3 brightness thresholds/levels, you can place multiple resistors in series whose total resistance sets the lowest level (minimum current), and then use a switch to short out particular resistors in the chain, reducing the overall resistance and increasing the current.
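A quick numeric sketch of that switched sense-resistor scheme, using the Iout = 0.1 / Rs formula above and the same 0.5 + 0.5 + 1 ohm chain:

--- Code: ---# AL8860-style sense resistor: Iout = 0.1 V / Rs  =>  Rs = 0.1 V / Iout
V_SENSE = 0.1  # volts the driver maintains across the sense resistor

def current_for_rs(rs):
    """LED current (amps) for a given sense resistance (ohms)."""
    return V_SENSE / rs

# Chain of 0.5 + 0.5 + 1 ohm; shorting parts of it selects the brightness.
chain_total = 0.5 + 0.5 + 1.0
settings = {
    "50 mA  (full chain)":          chain_total,        # 2.0 ohm
    "100 mA (1 ohm shorted)":       chain_total - 1.0,  # 1.0 ohm
    "200 mA (1 + 0.5 ohm shorted)": chain_total - 1.5,  # 0.5 ohm
}

for label, rs in settings.items():
    print(f"{label}: Rs = {rs:.1f} ohm -> Iout = {current_for_rs(rs) * 1000:.0f} mA")
--- End code ---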


The datasheet lists a few ways to drive the CTRL pin:
Drive it to a voltage below 0.2V to turn off the output current.
Drive it with a DC voltage (0.3V < CTRL < 2.5V) to adjust the output current from 0 to 100% of IOUT_NOM.
Drive it with a PWM signal from an open-collector or open-drain transistor to adjust the output current; linear adjustment range is 1% to 100% of IOUT_NOM for f < 500Hz.

So, for example, the simplest option would be to put a voltage between 0.3V and 2.5V on the CTRL pin to adjust brightness.
The cheapest LM317 or 1117 adjustable linear regulator can give you roughly 1.25V to 2.5V, which should translate to about 55% to 100% of the maximum current.
You can use a potentiometer in place of the fixed adjustment resistor to set the voltage anywhere between 1.25V and 2.5V.
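A minimal sketch of that LM317 approach, using the standard Vout ≈ 1.25 × (1 + R2/R1) formula with illustrative resistor values (the small Iadj term is ignored):

--- Code: ---# LM317 / 1117 adjustable regulator: Vout ≈ 1.25 V * (1 + R2 / R1)
V_REF = 1.25  # volts between the OUT and ADJ pins

def lm317_vout(r1, r2):
    """Approximate regulator output voltage for the divider resistors (ohms)."""
    return V_REF * (1 + r2 / r1)

R1 = 240.0  # fixed resistor from OUT to ADJ (illustrative value)
# A 0-240 ohm pot from ADJ to ground sweeps the CTRL voltage:
for r2 in (0.0, 120.0, 240.0):
    print(f"R2 = {r2:5.0f} ohm -> CTRL ≈ {lm317_vout(R1, r2):.2f} V")
# 0 ohm   -> 1.25 V (roughly half brightness, per the CTRL range above)
# 240 ohm -> 2.50 V (full brightness)
--- End code ---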

Buriedcode:
In terms of efficiency, driving the LEDs at 50% duty means roughly 50% power consumption, minus a bit of power for the PWM circuit itself. A 555 timer circuit is cheap and will happily run from 12V, but you're looking at ~10mA for the timer plus whatever is required for the switching element. Most simple 555 circuits have limited adjustment of duty cycle, but since perceived brightness is non-linear, the top 80-100% will look pretty much the same and anything below 20% would probably be quite dim.
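For reference, here's a small sketch of the standard 555 astable equations, which shows why the basic circuit can't get below 50% duty without a diode across one timing resistor (component values are only illustrative):

--- Code: ---# Standard 555 astable: output is high while C charges through Ra + Rb,
# and low while C discharges through Rb only.
def astable_555(ra, rb, c):
    t_high = 0.693 * (ra + rb) * c
    t_low = 0.693 * rb * c
    period = t_high + t_low
    return 1.0 / period, t_high / period   # frequency, duty (always > 50 %)

# Illustrative values: Ra = 1 k, Rb = 10 k, C = 100 nF
freq, duty = astable_555(1e3, 10e3, 100e-9)
print(f"f ≈ {freq:.0f} Hz, duty ≈ {duty * 100:.0f} % high")
# About 690 Hz at ~52 % duty. A diode across Rb (or driving the LED during the
# 555's low phase) is the usual trick for reaching lower effective duty cycles.
--- End code ---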

I would experiment with frequency and duty cycle to see what is acceptable. At low duty cycles you can notice flickering at 1kHz and sometimes higher, especially when the lights are the only light source in a dark area or sit in your peripheral vision. But at higher frequencies, above 10kHz or so, a MOSFET switching the LEDs wastes power just on the switching itself (google MOSFET switching losses and gate charge). So it's a compromise.
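A rough back-of-the-envelope estimator for those MOSFET losses (the edge times and gate charge below are purely illustrative assumptions, not taken from any particular part):

--- Code: ---# Rough MOSFET loss estimate for a hard-switched PWM dimmer.
def mosfet_losses(v_ds, i_load, t_rise, t_fall, q_gate, v_gate, freq):
    # Crossover (switching) loss: voltage and current overlap during each edge.
    p_switch = 0.5 * v_ds * i_load * (t_rise + t_fall) * freq
    # Gate-drive loss: the gate charge is supplied and dumped every cycle.
    p_gate = q_gate * v_gate * freq
    return p_switch, p_gate

# Illustrative numbers: 12 V, 200 mA load, 100 ns edges, 10 nC gate charge.
for freq in (1e3, 10e3, 100e3):
    p_sw, p_g = mosfet_losses(12, 0.2, 100e-9, 100e-9, 10e-9, 12, freq)
    print(f"{freq / 1e3:5.0f} kHz: switching ≈ {p_sw * 1e3:.2f} mW, gate ≈ {p_g * 1e3:.2f} mW")
# Both terms scale linearly with frequency, which is the trade-off against
# visible flicker mentioned above.
--- End code ---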

Once you have an idea of the duty cycle and frequency, then you can work on making it efficient. But even with the above setup, 200mA LEDs run at 50% duty average 100mA; add ~10mA for the circuit and less than 10% of the total draw is the switching circuit. A CMOS Schmitt-trigger RC oscillator driving a MOSFET will be pretty low power, but you're only gaining about 10% extra battery life by optimising that part, whereas if you find you can get away with a duty cycle of, say, 20%, you've saved 80%.
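To put numbers on that, here's a minimal runtime sketch assuming a hypothetical 2.5Ah capacity for the 3S 18650 pack (series cells share the capacity of a single cell) and both 200mA modules running:

--- Code: ---# Rough runtime estimate for a 3S 18650 pack driving both LED modules.
PACK_CAPACITY_AH = 2.5     # hypothetical 18650 capacity; check the actual cells
LED_CURRENT_A = 0.200      # per module, from the original post
N_MODULES = 2
OVERHEAD_A = 0.010         # ~10 mA for a 555-style PWM circuit

def runtime_hours(duty):
    avg_current = N_MODULES * LED_CURRENT_A * duty + OVERHEAD_A
    return PACK_CAPACITY_AH / avg_current

for duty in (1.0, 0.5, 0.2):
    print(f"duty {duty * 100:3.0f} %: ~{runtime_hours(duty):.1f} h")
# 100 % -> ~6.1 h, 50 % -> ~11.9 h, 20 % -> ~27.8 h: lowering the duty cycle
# buys far more runtime than shaving the oscillator's ~10 mA overhead.
--- End code ---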

I don't know if you're a beginner, hobbyist or what, but if you are, I would start off with the 555, experiment, then optimize later.
Zero999:
A simple 555 timer circuit will do the job. You could use a potentiometer with a switch in series with the whole circuit to make sure it turns fully off, drawing no power.
doublec4:

--- Quote from: mariush on May 22, 2019, 06:08:09 pm ---How about you use a proper LED DRIVER IC?
[...]
--- End quote ---

Thanks for this!

I guess one challenge is that my battery is 12.6V fully charged and goes down to 9V before cutoff. Therefore the IC can't step the voltage down to the ideal 12V once the battery level drops below 12V...

I would either have to find a higher-voltage battery, or, since pushing the LEDs harder would stress them, I suppose I could find the current at which they run at ~9V and set the maximum current to that value. Then the IC would maintain that current, dropping the voltage at the LEDs to ~9V, and the circuit would work.
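If you go that route, here's a minimal sketch of turning a measured module current into an AL8860 sense-resistor value with the Iout = 0.1 / Rs relationship quoted earlier (the 150mA figure is just a placeholder, not a real measurement):

--- Code: ---# Convert a measured "safe" module current into an AL8860 sense resistor.
V_SENSE = 0.1  # the driver regulates ~0.1 V across the sense resistor

def sense_resistor(i_measured):
    """Rs (ohms) that makes the driver regulate at the measured current."""
    return V_SENSE / i_measured

i_at_9v = 0.150  # placeholder: whatever the modules actually draw at ~9 V
print(f"Rs ≈ {sense_resistor(i_at_9v):.2f} ohm for {i_at_9v * 1000:.0f} mA")  # ≈ 0.67 ohm
# The driver then holds this current whether the pack is at 12.6 V or 9 V,
# which is the point of a constant-current driver.
--- End code ---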
