Author Topic: When current limiting resistors are needed with LED?  (Read 6944 times)


Offline YoukaiTopic starter

  • Regular Contributor
  • *
  • Posts: 227
  • Country: us
When current limiting resistors are needed with LED?
« on: April 26, 2018, 11:47:23 pm »
I'm trying to understand when/how current limiting resistors are needed.

Specifically I'm using a segment of this https://www.superbrightleds.com/moreinfo/flexible-led-strip-lights-color-changing/rgb-led-strip-light-flexible-custom-length-led-tape-light-with-9-smds-per-ft-3-chip-smd-led-5050/3502/

I got a wall plug from the thrift store that outputs 12v. Since the output is 12v and the LED is rated for 12v is a current limiting resistor needed?

If I add a resistor in series with the LED strip that would reduce the voltage that is available to the LED strip correct?

If a current limiting resistor is needed I would want to put it in the DC12+ line so I only have to use one resistor instead of 3 correct?
 

Offline jm_araujo

  • Regular Contributor
  • *
  • Posts: 72
  • Country: pt
Re: When current limiting resistors are needed with LED?
« Reply #1 on: April 26, 2018, 11:53:36 pm »
That strip already has the current limiting resistors. It's those small black SMD components on the strip, so in your case you don't need any additional ones when using a 12V power supply as specified by the manufacturer.
 

Offline ovnr

  • Frequent Contributor
  • **
  • Posts: 658
  • Country: no
  • Lurker
Re: When current limiting resistors are needed with LED?
« Reply #2 on: April 26, 2018, 11:53:57 pm »
The LED strip already has current limiting resistors; you don't need any.

Yes, adding more resistors will reduce the current - and voltage. If you do add one to the V+ line, you will have issues where adjusting the brightness of one color will impact the brightness of the other colors, therefore it's recommended to use separate resistors per color.
 

Offline YoukaiTopic starter

  • Regular Contributor
  • *
  • Posts: 227
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #3 on: April 27, 2018, 12:19:12 am »
Ok good to know about it already having the resistors. Thank you for that.

What if I get a single red LED that is rated for 5v and I power it from a 5v power supply? Is a current limiting resistor needed then?

I'm trying to understand conceptually when/why you need a "current limiting resistor". Wherever I have used resistors in the past I have used them for Voltage reducing. E.g. I have a 5v power supply and the LED is rated for 3v. Are Voltage limiting and Current limiting basically the same due to Ohms law?
 

Offline Jwillis

  • Super Contributor
  • ***
  • Posts: 1689
  • Country: ca
Re: When current limiting resistors are needed with LED?
« Reply #4 on: April 27, 2018, 01:35:47 am »
LEDs will take as much current as they can get until they burn out. Most LEDs need to be limited to under 75mA. So you select your resistor, using Ohm's law, so that at your supply voltage it only allows under 75mA:

R = V / I, with I less than 75mA.

Some LEDs may require more, some may require less, so it's important to know what the LED is rated for.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9886
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #5 on: April 27, 2018, 01:56:56 am »
Get the datasheet and look for IF and VF

Here is the datasheet for some common LEDs.  Look at the red LED: it has a VF of 2V at an IF of 20 mA.

To drive from 5V, you need to waste 3V on the current limiting resistor with 20 mA of current flow.  E = I * R so 3V = 0.020 * R or R = 3 / 0.020 = 150 Ohms.  Any lower value will be overdriving the LED and it will probably work pretty well with a resistor twice as big, say 330 Ohms.  It depends on how bright the LED needs to be.

https://www.sparkfun.com/datasheets/Components/YSL-R596CR3G4B5C-C10.pdf
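The arithmetic above can be wrapped in a couple of lines of Python as a sanity check. The numbers are the ones from this post (5 V supply, 2 V forward drop, 20 mA), not from any particular datasheet:

```python
# Series resistor for an LED: the resistor must drop the difference
# between the supply and the LED's forward voltage at the chosen current.
def led_resistor(v_supply, v_forward, i_forward):
    """Ohm's law: R = (Vsupply - Vf) / If, all in volts and amps."""
    return (v_supply - v_forward) / i_forward

r = led_resistor(5.0, 2.0, 0.020)
print(r)  # 150 ohms, matching the calculation above
```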
 

Offline Nerull

  • Frequent Contributor
  • **
  • Posts: 694
Re: When current limiting resistors are needed with LED?
« Reply #6 on: April 27, 2018, 02:12:27 am »
LEDs do not have nice linear V-I curves like resistors. You must reach the diode's forward voltage to turn it on, but once you do, the current through the diode increases dramatically with very small changes in voltage. This means that trusting an exact voltage to limit the current through a diode is extremely unwise.



As voltage across the LED approaches Vf, the diode starts to look more and more like a short circuit.
 
The following users thanked this post: Mr. Scram

Online Brumby

  • Supporter
  • ****
  • Posts: 12288
  • Country: au
Re: When current limiting resistors are needed with LED?
« Reply #7 on: April 27, 2018, 03:17:53 am »
Are Voltage limiting and Current limiting basically the same due to Ohms law?
I would have to say "Yes" to that - because your thinking is heading in the right direction.
 

Online Brumby

  • Supporter
  • ****
  • Posts: 12288
  • Country: au
Re: When current limiting resistors are needed with LED?
« Reply #8 on: April 27, 2018, 04:13:39 am »
What if I get a single red LED that is rated for 5v and I power it from a 5v power supply? Is a current limiting resistor needed then?
Now that is an interesting question.  Red LEDs typically have a (forward) voltage drop of around half that or less.  If you actually had one rated at 5V, I would suspect there is already some mechanism in place to safeguard the LED ... perhaps a resistor, or even a basic buck converter.

If you were to change your question and say: What if I get a single red LED that is rated for 1.8v and I power it from a 1.8v power supply? Is a current limiting resistor needed then? Then the answer is: It depends.

Quote
I'm trying to understand conceptually when/why you need a "current limiting resistor".
This is a common question that many have asked - and it is a very fair question.

I would point you to the chart above and get you to pick one colour - say the red - and look at the curve very closely.  What you are looking for is the change in current as you increase the voltage.  Start at 0.5V and see what happens to the current as you go up in, say, quarter volt increments.  When you get to 1.75V, you can see a trend beginning ... a trend that goes right off the chart (quite literally!) in another two steps.

While not perfect, I do like to use the water analogy - and my offering for this is as follows...

A resistor is like a pipe with water flowing through it.  A bigger pipe has less resistance to water flow, so for a given pressure a bigger current of water can flow than through a smaller pipe.

An LED is more like a dam (with a slightly uneven top and some other little idiosyncrasies).  As the water level rises, water starts flowing over the lower points.  As the water level rises a little bit more, the water flows over more of the dam wall and the volume increases noticeably.  Once it gets to the point that water is flowing over all the dam wall, then even a tiny increase in water height will result in a massively increased flow of water.  There is, effectively, nothing to limit the volume of water that flows over the top of the dam.

Looking back, you will see that the volume of water flowing changes very dramatically over a relatively small range of water height.  If you chart this, you can see a definite area where this happens - and it will look somewhat similar to the LED chart above.

Back to the LED.  While there is this "dam" characteristic, there is a point where the current through the LED is just going to destroy it, so the challenge is to limit the current that could possibly flow.  Adding a resistor does exactly that.
 

Online IanB

  • Super Contributor
  • ***
  • Posts: 11790
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #9 on: April 27, 2018, 04:27:14 am »
Ok good to know about it already having the resistors. Thank you for that.

What if I get a single red LED that is rated for 5v and I power it from a 5v power supply? Is a current limiting resistor needed then?

I'm trying to understand conceptually when/why you need a "current limiting resistor". Wherever I have used resistors in the past I have used them for Voltage reducing. E.g. I have a 5v power supply and the LED is rated for 3v. Are Voltage limiting and Current limiting basically the same due to Ohms law?

One way to answer this is to say that a bare LED, all by itself, is never rated for a voltage. Not ever. An LED, by its very nature, is always rated for a current. For example, 20 mA maximum, or if it is a big power LED maybe 300 mA or so (with a heat sink).

So if you see an "LED" rated for 5 V, then it is not an LED. It is an LED module that contains other components like a built in resistor, or a current regulator.

Therefore, if your "thing" says it is designed for 5 V, or 12 V, then you don't need a current limiting resistor. You can just feed it 5 V or 12 V and be done.

However, if your "thing" says it is designed to operate at 20 mA maximum, then you do need a current limiting resistor or some other kind of current regulation to stay within the specified current.
 
The following users thanked this post: Mr. Scram

Offline Electro Fan

  • Super Contributor
  • ***
  • Posts: 3163
Re: When current limiting resistors are needed with LED?
« Reply #10 on: April 27, 2018, 05:16:45 am »
Generally resistors are used with LEDs but it is not an absolute requirement.

LEDs can handle some amount of current - they will get brighter as you increase the current, up to the point where too much current (driven by too much voltage) overheats and burns out the LED.  The threshold for burnout varies with the specifications of the LED, but if you keep the current below 20mA the LED will probably be OK (it's good to have some spares available just in case).

If you hook a DC power supply directly to an LED - without a resistor - and you set the supply for a maximum of 20mA, you can then start applying some voltage.  At 1 Volt nothing much will happen.  Gradually turn up the voltage 1/10 Volt at a time.  Somewhere around 1.6-1.8 Volts you will see a (red) LED start to faintly emit light, and it will get brighter as you increase the voltage in 1/10 Volt steps.  On the supply (or with a multimeter in series with the circuit) watch the current increase from a few mA until the voltage gets high enough that the LED draws 20mA.  At this point the supply is limiting the current - you can probably let the circuit run like this indefinitely and the LED will be fine.

If you raise the current limit on the supply, say to 100mA, and keep turning up the voltage, at some point instead of getting brighter the LED will burn out.  With a resistor in the circuit the LED could have survived more voltage.  How much more can be calculated if you know the specs of the LED, including its maximum rated current and its forward voltage - then you can size the resistor using Ohm's Law.  After you see the relationships expressed by V=IR all of this will make more sense, but these circuit-building experiments will also help the math become more tangible.  In the process, the math will help you see the forces at work that cause the shapes of the curves posted above.  After you burn out a few LEDs, be prepared to burn out a few small resistors - at which point P=VI will become more clear.
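The V=IR and P=VI arithmetic can be sketched in a few lines of Python. The numbers here (12 V supply, 2 V forward drop, 20 mA) are hypothetical, not for any specific LED:

```python
# Size the series resistor and then check how much power it burns.
v_supply, v_forward, i_led = 12.0, 2.0, 0.020  # volts, volts, amps

v_resistor = v_supply - v_forward   # voltage the resistor must drop
r = v_resistor / i_led              # Ohm's law: V = I*R -> 500 ohms
p = v_resistor * i_led              # P = V*I -> 0.2 W in the resistor

print(r, p)  # 500 ohms, 0.2 W -- already 80% of a common 1/4 W rating,
             # so a 1/2 W resistor would be the safer choice
```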
« Last Edit: April 27, 2018, 05:19:27 am by Electro Fan »
 

Offline YoukaiTopic starter

  • Regular Contributor
  • *
  • Posts: 227
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #11 on: April 27, 2018, 05:50:48 am »
One way to answer this is to say that a bare LED, all by itself, is never rated for a voltage. Not ever. An LED, by its very nature, is always rated for a current. For example, 20 mA maximum, or if it is a big power LED maybe 300 mA or so (with a heat sink).

Ok so that makes a lot of sense I guess. Looking at this red LED: https://www.superbrightleds.com/moreinfo/through-hole/5mm-red-led-120-degree-viewing-angle-flat-tipped-1200-mcd/279/1206/ the site says it's rated for 30mA. But also for 2.0V. So is it "rated for 2.0V" or is that somehow a misnomer/incorrect way to think about it?

EDIT: Oh, or is the 2.0V its forward voltage and it's just poorly labeled?
« Last Edit: April 27, 2018, 05:59:45 am by Youkai »
 

Offline ebastler

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: de
Re: When current limiting resistors are needed with LED?
« Reply #12 on: April 27, 2018, 06:17:26 am »
Ok so that makes a lot of sense I guess. Looking at this red LED: https://www.superbrightleds.com/moreinfo/through-hole/5mm-red-led-120-degree-viewing-angle-flat-tipped-1200-mcd/279/1206/ the site says it's rated for 30mA. But also for 2.0V. So is it "rated for 2.0V" or is that somehow a misnomer/incorrect way to think about it?

EDIT: Oh, or is the 2.0V its forward voltage and it's just poorly labeled?

You got it, it is 2V forward voltage. The "rated for 2V" seems to be your own misinterpretation -- I can't find the term "rated" anywhere on the page?
 

Offline ogden

  • Super Contributor
  • ***
  • Posts: 3731
  • Country: lv
Re: When current limiting resistors are needed with LED?
« Reply #13 on: April 27, 2018, 07:53:28 am »
Ok so that makes a lot of sense I guess. Looking at this red LED: https://www.superbrightleds.com/moreinfo/through-hole/5mm-red-led-120-degree-viewing-angle-flat-tipped-1200-mcd/279/1206/ the site says it's rated for 30mA. But also for 2.0V. So is it "rated for 2.0V" or is that somehow a misnomer/incorrect way to think about it?

EDIT: Oh, or is the 2.0V its forward voltage and it's just poorly labeled?

You got it, it is 2V forward voltage. The "rated for 2V" seems to be your own misinterpretation -- I can't find the term "rated" anywhere on the page?

Indeed - what you want to know is the forward voltage at the rated current.

Identical LEDs, both starting at roughly the same current of around 20mA. The voltage of both supplies increases by 10%, yet the resulting currents differ: the LED powered directly from the voltage source is destroyed, while the one with a resistor survives:



[edit] The answer to the original question "When current limiting resistors are needed with LED?" is: whenever you power an LED from a voltage source.
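That comparison can be approximated numerically with a hypothetical exponential LED model. The constants below are tuned to give about 20 mA at 2.0 V and are purely illustrative, not taken from any datasheet:

```python
import math

# Hypothetical exponential LED model: roughly 20 mA at 2.0 V.
I_S, N_VT = 2e-11, 0.0965   # saturation current (A), slope factor (V)

def led_i(v):
    return I_S * math.exp(v / N_VT)

def i_with_resistor(v_supply, r):
    # Crude fixed-point solve of v_supply = i*r + v_led.
    v = 2.0
    for _ in range(50):
        i = (v_supply - v) / r
        v = N_VT * math.log(i / I_S)
    return i

# Direct voltage drive: a 10% supply rise (2.0 V -> 2.2 V)
print(led_i(2.2) / led_i(2.0))   # roughly 8x the current -> dead LED

# With a 150 ohm resistor from 5 V, the same 10% rise (5.0 V -> 5.5 V)
# only raises the current by about 16%.
print(i_with_resistor(5.5, 150) / i_with_resistor(5.0, 150))
```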
« Last Edit: April 27, 2018, 08:00:24 am by ogden »
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19345
  • Country: gb
  • 0999
Re: When current limiting resistors are needed with LED?
« Reply #14 on: April 27, 2018, 08:03:00 am »
The comments about LEDs being rated for a certain current and their voltage being non-linear are correct.

To complicate matters further, the forward voltage can vary significantly between LEDs of the same type from the same manufacturer, and it decreases as the temperature increases. For example, you might have a blue LED with a typical forward voltage of 3.3V at a current of 10mA, but when connected to a 3.3V constant voltage supply it's brighter than expected (draws more than 10mA) and problems are experienced with overheating. When the LED starts to pass current, it heats up, dropping its forward voltage; but the regulated 3.3V supply keeps its output voltage constant, so the current increases until the LED overheats. What's worse, this might not happen when bench tested at room temperature, but is more likely in the field, where the operating temperature is higher.

The solution to the above problem is to add a small series resistor. The blue LED will be more than bright enough, at 10mA. A forward current of 1mA would probably suffice. If high brightness is desired, then a more elaborate solution is required, such as a constant current supply.
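To put a number on how steep the curve is, here is an idealized exponential diode model in Python. The constants are made up so that the "blue LED" draws roughly 10 mA at 3.3 V; they are illustrative only, not from any datasheet:

```python
import math

# Idealized exponential LED model (constants are hypothetical).
I_S = 5e-17   # saturation current, A
N_VT = 0.1    # emission coefficient times thermal voltage, V

def led_current(v):
    return I_S * (math.exp(v / N_VT) - 1.0)

print(led_current(3.3))   # about 0.011 A -- the intended ~10 mA
print(led_current(3.6))   # about 0.22 A -- ~20x the current for
                          # only 0.3 V more across the LED
```

The same exponential steepness is why a small drop in Vf with temperature, at a fixed supply voltage, produces a large jump in current.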
 
The following users thanked this post: ogden

Offline Electro Fan

  • Super Contributor
  • ***
  • Posts: 3163
Re: When current limiting resistors are needed with LED?
« Reply #15 on: April 27, 2018, 09:24:40 am »
One way to answer this is to say that a bare LED, all by itself, is never rated for a voltage. Not ever. An LED, by its very nature, is always rated for a current. For example, 20 mA maximum, or if it is a big power LED maybe 300 mA or so (with a heat sink).

Ok so that makes a lot of sense I guess. Looking at this red LED: https://www.superbrightleds.com/moreinfo/through-hole/5mm-red-led-120-degree-viewing-angle-flat-tipped-1200-mcd/279/1206/ the site says it's rated for 30mA. But also for 2.0V. So is it "rated for 2.0V" or is that somehow a misnomer/incorrect way to think about it?

EDIT: Oh, or is the 2.0V its forward voltage and it's just poorly labeled?

In addition to being labeled with the Forward Voltage of 2.0 the spec in the link also gives a Maximum Forward Voltage of 2.4, a Peak Forward Current of 100mA, a Power Dissipation of 85mW, and a Total Power Consumption of 0.12 Watts.  It’s all under the Specification section.

Edit - also from the spec sheet:  Continuous Forward Current 30mA
« Last Edit: April 27, 2018, 11:33:37 pm by Electro Fan »
 

Offline YoukaiTopic starter

  • Regular Contributor
  • *
  • Posts: 227
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #16 on: April 27, 2018, 05:21:20 pm »
Ok so I think knowing that "LEDs are rated for current, not voltage" and that I need to readjust my thinking from "rated for 2V" to "forward voltage of 2V" will both help me greatly. Thanks everyone.
 

Online IanB

  • Super Contributor
  • ***
  • Posts: 11790
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #17 on: April 27, 2018, 05:41:28 pm »
Ok so I think knowing that "LEDs are rated for current, not voltage" and that I need to readjust my thinking from "rated for 2V" to "forward voltage of 2V" will both help me greatly. Thanks everyone.

In particular, look back at the data sheet you linked: https://www.superbrightleds.com/moreinfo/through-hole/5mm-red-led-120-degree-viewing-angle-flat-tipped-1200-mcd/279/1206/

Scroll down to the "absolute maximum ratings" section. You must at all times stay within the limits given there.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9886
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #18 on: April 27, 2018, 07:04:47 pm »
Another thing:  There are tolerances in real devices.  Just because VF is 2.0V doesn't mean you should just connect it to a 2V power supply with, essentially, infinite current.  You want to control current and VF falls out where it does.

When you design for, say, 20 mA, you will probably know that you are safe up to, say, 30 mA.  As a result, the resistor value isn't super critical and its value can be calculated as I gave above.  The current may vary a little bit with tolerances but it won't get to some magic limit.

Just remember, it's the current you design for.  Many times you will set the design point a good deal lower than the nominal IF because you simply don't need the brightness.

And, yes, Ohm's Law is the way the calcs are done as I showed above.  You know the current and you get the required voltage (drop) from VSource - VF.  Given this voltage drop and current, Ohm's Law gives the resistor value.  It will almost never be a standard value so pick the next larger value.

One last thing:  Ohm's Law is a LAW, not a suggestion!  You can't treat it like a speed limit!
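The "pick the next larger value" step can be automated. This sketch assumes the common E12 (10% tolerance) series; E24 or E96 parts would give finer steps:

```python
import math

# E12 (10%) standard resistor values per decade.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12(r):
    """Smallest E12 value >= r (ohms): the next standard value up."""
    decade = 10 ** math.floor(math.log10(r))
    for mult in E12 + [10.0]:        # 10.0 spills into the next decade
        if mult * decade >= r:
            return mult * decade

# 5 V supply, Vf = 2 V at 20 mA -> 150 ohms, already a standard value.
# 12 V supply, same LED -> 500 ohms -> next standard value up is 560.
print(next_e12((5.0 - 2.0) / 0.020))    # -> 150 ohms
print(next_e12((12.0 - 2.0) / 0.020))   # -> 560 ohms
```

Because the larger resistor only reduces the current slightly below the design point, rounding up stays on the safe side of the LED's rating.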

 

Offline Vtile

  • Super Contributor
  • ***
  • Posts: 1144
  • Country: fi
  • Ingineer
Re: When current limiting resistors are needed with LED?
« Reply #19 on: April 27, 2018, 07:06:52 pm »
Within the limits of the maximum values, which are typically given in the form of Vf and If or Imax, there is no difference whether you feed a bare LED with current or with voltage. What will happen is that the LED will dynamically adjust itself so that it pulls or pushes the amount of the missing component (V or I), following the I/V curve (which is derived this way).

Edit. Removed the misleading part.

Whether that is good design practice is another topic.
« Last Edit: April 27, 2018, 07:26:48 pm by Vtile »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9886
  • Country: us
Re: When current limiting resistors are needed with LED?
« Reply #20 on: April 27, 2018, 11:48:55 pm »
Within the limits of the maximum values, which are typically given in the form of Vf and If or Imax, there is no difference whether you feed a bare LED with current or with voltage. What will happen is that the LED will dynamically adjust itself so that it pulls or pushes the amount of the missing component (V or I), following the I/V curve (which is derived this way).

Edit. Removed the misleading part.

Whether that is good design practice is another topic.

Yes, the current and voltage will ride the curves.  If the voltage goes up, the current goes up and in some areas of the curve it goes up a LOT for just a tiny increase in voltage.  That's why it is described as non-linear.
 

