So where is the rest of the resistance coming from? Does the LED make up the rest? I thought LEDs were supposed to have virtually no resistance, hence needing a series resistor in the first place.
This is the best response so far:
1. The thing is, a diode does not act like a resistor.
2. Below its forward voltage it passes virtually no current.
3. Above its forward voltage it conducts almost like a short circuit, but it always has that forward voltage drop.
So you don’t use Ohm's law with a diode (or LED).
For an approximation, you take the LED's "typical" forward voltage and just say that is the voltage across it. It's almost constant.
It can get a little more complicated if you start getting into Vf versus current - but for the sake of basic calculations, this is quite good enough.
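For anyone curious what "more complicated" looks like, the standard model is the Shockley diode equation. Here's a rough Python sketch of it - the saturation current `i_s`, ideality factor `n`, and thermal voltage `v_t` below are illustrative values picked to put the knee near 1.8V, not numbers from any particular LED datasheet:

```python
import math

def diode_current(v, i_s=1e-18, n=2.0, v_t=0.025):
    """Shockley diode equation: I = I_s * (exp(V / (n * V_t)) - 1).
    The parameter values here are illustrative only."""
    return i_s * (math.exp(v / (n * v_t)) - 1)

# Current climbs very steeply once the voltage nears the forward voltage:
for v in (1.6, 1.7, 1.8, 1.9):
    print(f"{v:.1f} V -> {diode_current(v) * 1000:8.3f} mA")
```

With these numbers, a change of just 0.1V multiplies the current roughly sevenfold - that steepness is exactly why treating the forward voltage as constant works so well in practice.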
This is a textbook example of the basic calculation for a current-limiting resistor:
Example of an LED with a current-limiting resistor in series: If you have a 5V supply and the LED's forward voltage is 1.8V, that leaves 3.2V across the resistor. If you want 20mA of current, Ohm's law says the resistor should be 160 ohms (3.2V / 20mA = 160 ohms).
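If it helps, here's that same arithmetic as a minimal Python sketch (the function name and values are just illustrative):

```python
# The resistor takes up whatever voltage the LED's forward drop leaves over,
# and Ohm's law applies to the resistor alone.
def led_series_resistor(v_supply, v_forward, i_led):
    """Return the series resistance (ohms) that sets the LED current (amps)."""
    return (v_supply - v_forward) / i_led

r = led_series_resistor(v_supply=5.0, v_forward=1.8, i_led=0.020)
print(f"Series resistor: {r:.0f} ohms")  # -> Series resistor: 160 ohms
```

In practice you'd then round to the nearest standard value (160 ohms happens to be one already).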
You can also calculate the power dissipated in the LED, using the voltage drop across the LED and the current flowing through it. While you might think "AHA! This is looking like a resistor!" - don't be fooled. Yes, the LED's power dissipation will be the same as that of a resistor with the same voltage and current, but only for those specific values. If you change something, they each respond differently: the resistor does so in a linear fashion - the LED does not.
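Sticking with the same numbers, here's a small sketch of the power figures, plus what a supply change does to each part (all values come from the example above):

```python
# Power in each part of the 5V / 1.8V / 20mA example: P = V * I for both,
# but only the resistor keeps behaving linearly when conditions change.
i = 0.020              # LED current in amps
v_led = 1.8            # LED forward drop, treated as constant
v_res = 5.0 - v_led    # voltage left across the resistor

print(f"LED:      {v_led * i * 1000:.0f} mW")   # 36 mW
print(f"Resistor: {v_res * i * 1000:.0f} mW")   # 64 mW

# If the supply rose to 6V, the LED would still drop about 1.8V, so the
# extra volt lands on the resistor and the current rises linearly:
i_new = (6.0 - v_led) / 160.0
print(f"New current: {i_new * 1000:.1f} mA")    # about 26.3 mA
```

Note how the extra supply voltage ends up almost entirely across the resistor - the LED's share barely moves, which is the nonlinearity the answer is warning about.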