OK, one of the simplest Ohm's law calculations is sizing the current-limiting resistor for an LED. I ran across this web site describing how to do it:
http://www.seattlerobotics.org/guide/electronics.html He has a 6VDC power source, a 330R resistor, and a red LED all in series. He calculates the theoretical current by ignoring the LED and applying Ohm's law to the resistor alone using the full supply voltage. I have seen this before and it makes no sense to me. Once the circuit is on, there is a voltage drop across the LED, so there will *not* be a full 6V across the resistor, and the current in the circuit will be less than his calculation predicts.
So by my calculation, the red LED drops about 1.9V, leaving 6V - 1.9V = 4.1V across the 330R resistor, giving 4.1V / 330R = 12.4mA. I hooked up the circuit, and the experiment agrees.
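To make the comparison concrete, here's a small Python sketch of both calculations; the 1.9V red-LED forward drop is the measured value from my experiment, not something from the linked article:

```python
def series_led_current(v_supply, v_led, r_ohms):
    """Current through a series LED + resistor, accounting for the LED's forward drop."""
    return (v_supply - v_led) / r_ohms

# The article's method: ignore the LED, treat the full 6V as across the resistor.
naive_ma = 6.0 / 330 * 1000                              # ~18.2 mA
# Accounting for a red LED's ~1.9V forward drop.
actual_ma = series_led_current(6.0, 1.9, 330) * 1000     # ~12.4 mA

print(f"ignoring LED drop: {naive_ma:.1f} mA")
print(f"with LED drop:     {actual_ma:.1f} mA")
```

The two methods disagree by almost 6mA here, and the measured 12.4mA matches the version that subtracts the LED drop first.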
So... am I correct in that his method for calculating the current limiting resistor is flawed?