Author Topic: Reducing Current on a constant current LED by parallel resistor  (Read 580 times)


Offline drakejest

  • Regular Contributor
  • *
  • Posts: 167
  • Country: 00
The constant current IC I'm using to drive the LED only goes down to 10 mA as its lowest setting. I would like to reduce it to 5 mA; 10 mA seems too bright for my eyes, and no matter what Rext I use I won't be able to push the IC below its lowest setting.

I had a crazy idea: put a resistor in parallel with the LED to shunt some of the current away from it.



Question 1: Will it work?

Question 2: Because of the wide voltage range I need to cover, I assume the resistor value would have to change with the input voltage. If the voltage were fixed at 5 V, 12 V, or 24 V, what resistor values would I need so the LED gets 5 mA?
 

Offline gcewing

  • Regular Contributor
  • *
  • Posts: 175
  • Country: nz
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #1 on: July 16, 2021, 01:47:34 pm »
Yes, I think it will work. The resistor value shouldn't depend on the supply voltage, only the forward voltage of the LED.

Measure the voltage across the LED when it's on, and choose a resistor that will pass 5mA with that voltage across it.
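For example, a minimal sketch of that calculation in Python (the 1.9 V forward voltage is just a placeholder until you can measure it):

```python
# Shunt resistor for a constant-current LED driver:
# the regulator forces I_TOTAL through the LED and resistor in parallel,
# so the resistor must carry the excess current at the LED's forward voltage.
I_TOTAL = 0.010   # regulator setting, A
I_LED   = 0.005   # desired LED current, A
V_F     = 1.9     # LED forward voltage, V (placeholder; use the measured value)

r_shunt = V_F / (I_TOTAL - I_LED)   # Ohm's law on the 5 mA to be shunted
print(f"shunt resistor ~ {r_shunt:.0f} ohm")   # ~380 ohm with these numbers
```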
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 3356
  • Country: us
  • Retired, now restoring antique test equipment
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #2 on: July 16, 2021, 01:58:30 pm »
This will work for applications where current stability is not important (like an indicator), since the voltage across the LED (and therefore the current through the extra resistor) depends on the temperature of the LED chip itself, which heats up with the current through it.
 

Offline drakejest

  • Regular Contributor
  • *
  • Posts: 167
  • Country: 00
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #3 on: July 16, 2021, 02:15:30 pm »
This will work for applications where current stability is not important (like an indicator)

Yes, it's only for an indicator light.

depends on the temperature of the LED chip itself, which heats up with the current through it.

Temperature is really hard to take into account. What is the scale of its effect?


Measure the voltage across the LED when it's on, and choose a resistor that will pass 5mA with that voltage across it.

I don't have a prototype yet; this will be the first one, so I don't really have anything to measure. Given this LED, can you help me work out the formula?
« Last Edit: July 16, 2021, 02:17:29 pm by drakejest »
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 3356
  • Country: us
  • Retired, now restoring antique test equipment
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #4 on: July 16, 2021, 02:36:12 pm »
Fig 2 of the data sheet referenced by your link shows about 1.92 V across the diode with 5 mA through it.  Although the current is a strong function of the voltage, the voltage (at constant temperature) is a weak function of the current.  I wouldn’t worry about the temperature at this low power, unless you have a high ambient temperature.
 

Offline drakejest

  • Regular Contributor
  • *
  • Posts: 167
  • Country: 00
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #5 on: July 16, 2021, 04:17:54 pm »
Fig 2 of the data sheet referenced by your link shows about 1.92 V across the diode with 5 mA through it.

I'm sorry, I could not think of a way to come up with the resistor value.

I did try brute-forcing it by simulating and adjusting the resistor, but I feel this is cheating, and possibly inaccurate.

I set the LED forward voltage to 1.92 V in this simulation.



 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 3356
  • Country: us
  • Retired, now restoring antique test equipment
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #6 on: July 16, 2021, 04:22:41 pm »
Simulation is a good way to "reality check" a circuit like this, but it is better to start from manufacturer's data (e.g., the curve of voltage vs current) when available.  For a diode, it is usually better to consider the current as the independent variable and the voltage as the dependent variable, since the current as a function of voltage is a very strong function, and shifting of the voltage scale has too strong an impact on the current.
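As a rough illustration, a small Python sketch of the Shockley model (the saturation current and n·Vt below are assumed, typical-looking values, not figures from the data sheet):

```python
import math

# Shockley model: I = Is * (exp(V / (n*Vt)) - 1), inverted to
# V = n*Vt * ln(I/Is + 1).
N_VT = 0.050   # ideality factor times thermal voltage, V (assumed)
I_S  = 1e-20   # saturation current, A (arbitrary small value)

for i_led in (0.005, 0.010):
    v = N_VT * math.log(i_led / I_S + 1)
    print(f"{i_led * 1000:.0f} mA -> {v:.3f} V")

# Doubling the current raises the voltage by only n*Vt*ln(2) ~ 35 mV:
# V is the "weak" direction and I the "strong" one.
```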
 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 9801
  • Country: de
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #7 on: July 16, 2021, 06:50:44 pm »
To calculate a suitable resistor, take the voltage of the LED at the desired current (e.g. some 2.5-3 V for a blue one at 5 mA). The resistor should absorb the excess current at that voltage.

A simulation is a good check that the math is right.

Unless the resistor absorbs almost all of the current, the temperature effect should not be a problem. The LED voltage tends to go down a little with temperature (e.g. -2 to -5 mV/K), which gives a slight increase in LED current at higher temperature. Usually this is less than what would be needed to compensate for the dropping efficiency.

Unless the voltage drop needs to be very low, there may be alternative ways to limit the current, e.g. a depletion-mode MOSFET with a resistor at the source to set the current.
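To put a rough number on the temperature effect (a back-of-the-envelope sketch in Python; the -3 mV/K tempco, the 30 K rise, and the 384 ohm shunt are assumptions drawn from the figures in this thread):

```python
# How much does LED heating shift the current split?
TEMPCO  = -0.003   # LED forward-voltage tempco, V/K (assumed -3 mV/K)
R_SHUNT = 384.0    # shunt from the 1.92 V / 5 mA sizing, ohms
DT      = 30.0     # temperature rise, K (assumed)

dv = TEMPCO * DT          # LED voltage drops ~90 mV
di_shunt = dv / R_SHUNT   # shunt current drops with it; the LED picks it up
print(f"LED current rises by ~{-di_shunt * 1000:.2f} mA")   # ~0.23 mA
```

About a 5% shift on 5 mA, which is invisible on an indicator.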
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 3356
  • Country: us
  • Retired, now restoring antique test equipment
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #8 on: July 16, 2021, 07:28:16 pm »
For his simple application, a reasonable resistor suffices, using the voltage from the data sheet.
The constant-current device will deal with the wide range of supply voltage.
One of my professors, from Germany, once posed a simple differential equation to the class for solution, expecting the answer “force an exponential” or “force a sine”.  One theoretically inclined student in the front row answered, “find the Green’s function”, to which the professor replied “you could also pull your trousers up with tongs”.
« Last Edit: July 16, 2021, 07:47:54 pm by TimFox »
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 16111
  • Country: gb
  • 0999
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #9 on: July 16, 2021, 10:04:43 pm »
If you're using a microcontroller, you could PWM it at a low duty cycle.
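For example, a minimal MicroPython sketch (the board, pin number, and wiring to the driver's enable input are assumptions; any microcontroller's PWM peripheral works the same way):

```python
# Halve the average LED current by gating the 10 mA source at 50% duty.
from machine import Pin, PWM   # MicroPython; pin 5 is an assumed choice

led_pwm = PWM(Pin(5))
led_pwm.freq(1000)        # 1 kHz: well above the visible flicker threshold
led_pwm.duty_u16(32768)   # 50% of 65535 -> ~5 mA average from a 10 mA source
```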
 

Offline ConKbot

  • Super Contributor
  • ***
  • Posts: 1269
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #10 on: July 16, 2021, 10:42:05 pm »
Since the voltage across the LED will only vary by a small amount going from 10 mA to 5 mA, compounded by variations in LED voltage from batch to batch, across temperature, etc., you should also add a small resistor in series with the LED, then connect your parallel shunt resistor from the supply to the Vout pin.

It looks like you need 1.4 V on Vout and 1.9 V for your LED, so I'd size the series resistor for 1 V at 10 mA, i.e. 100 ohms. With that, the combined voltage across the series resistor and LED varies from about 3 V at 10 mA down to 2.4 V at 5 mA, rather than 2 V at 10 mA to 1.9 V at 5 mA, making it a lot easier to size the parallel resistor to take half the current without a huge variation in how well the current is balanced.
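A quick Python sketch of how that tames the split (the 480 ohm shunt comes from sizing it for 5 mA at the 2.4 V combined drop; the spread of Vf values is assumed):

```python
# 10 mA constant-current sink, 100 ohm in series with the LED,
# shunt resistor from the supply to the Vout pin across both.
I_TOTAL = 0.010   # regulator current, A
R_SER   = 100.0   # series resistor, ohms
R_SHUNT = 480.0   # sized so the shunt takes 5 mA at 2.4 V

def led_current(vf):
    """LED current for a given forward voltage, treating Vf as fixed."""
    # Node equation at Vout: (v - vf)/R_SER + v/R_SHUNT = I_TOTAL
    v = (I_TOTAL + vf / R_SER) / (1 / R_SER + 1 / R_SHUNT)
    return (v - vf) / R_SER

for vf in (1.8, 1.9, 2.0):   # batch-to-batch spread in Vf (assumed)
    print(f"Vf = {vf:.1f} V -> LED gets {led_current(vf) * 1000:.2f} mA")
# Prints ~5.17 / 5.00 / 4.83 mA: a 100 mV shift in Vf only moves the
# LED current by ~0.17 mA.
```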
 

Offline HackedFridgeMagnet

  • Super Contributor
  • ***
  • Posts: 1994
  • Country: au
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #11 on: July 16, 2021, 11:04:29 pm »
A thin coat of nail polish over the LED lens?
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 3356
  • Country: us
  • Retired, now restoring antique test equipment
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #12 on: July 16, 2021, 11:05:11 pm »
I don’t know the compliance limit for his current regulator, but he does have a wide range of input voltage.  Adding a series resistor will do what you say, but will reduce the headroom for input voltage.  Since the total current is constant and he only needs an indicator, the voltage across the LED at 5 mA will be determined well enough to use a single parallel resistor to “steal” a fixed current from the 10 mA regulated current.
 

Offline viperidae

  • Frequent Contributor
  • **
  • Posts: 284
  • Country: nz
Re: Reducing Current on a constant current LED by parallel resistor
« Reply #13 on: July 17, 2021, 12:07:44 am »
To shunt 5 mA away from the LED with a parallel resistor, you just need the voltage drop across the LED and Ohm's law.
Assuming 1.92 V as mentioned above, the formula is R = V/I = 1.92 / 0.005 = 384 ohms.
A standard value close to that would be 330 or 390.
The lower the resistance, the more current it steals, so the dimmer the LED.
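A quick check of those two standard values, assuming Vf stays at 1.92 V:

```python
# LED current left over after the shunt takes its share of the 10 mA.
VF, I_TOTAL = 1.92, 0.010
for r in (330, 390):
    i_led = I_TOTAL - VF / r
    print(f"{r} ohm shunt -> {i_led * 1000:.2f} mA in the LED")
# 330 ohm -> 4.18 mA, 390 ohm -> 5.08 mA: 390 lands closer to 5 mA.
```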
 

