So how does it know when it outputs enough current? Is it based on the rating on the transformer? E.g., if the transformer says it will output one amp, does it just jiggle around with the voltage until that one-amp target is matched?
Also, why don't LEDs take up as much current as they need at a constant voltage, like every other circuit?
Thank you.
A resistor will obey Ohm's Law, where the voltage and current are related by V/I = R.
If you apply a voltage (e.g., from a battery, where the terminal voltage is almost independent of the current), the current through the resistor will be given by that equation.
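As a quick illustration (the component values here are made up, not from the question), the current through a resistor across a stiff source follows directly from that equation:

```python
# Minimal sketch with assumed values: current through a resistor
# driven by a "stiff" voltage source, using Ohm's law I = V / R.
V = 5.0      # source voltage in volts (assumed)
R = 220.0    # resistor value in ohms (assumed)
I = V / R    # Ohm's law
print(f"Current through the resistor: {I * 1000:.1f} mA")  # about 22.7 mA
```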
Semiconductor diodes (including LEDs) are not linear: i.e., the voltage is a monotonic function of the current (and vice versa), and the current is almost an exponential function of the voltage.
Therefore, it is dangerous to apply a voltage from a "stiff" (low-impedance, fixed-voltage) source directly to a diode or LED: a small excess voltage can produce a very large current and destroy the device.
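To see how steep that exponential is, here is a rough sketch using the ideal Shockley diode equation. The parameter values are illustrative assumptions, not data for any real LED; the point is only that a small voltage change produces a huge current change:

```python
import math

# Ideal Shockley diode equation: I = I_s * (exp(V / (n * V_T)) - 1)
I_s = 1e-20      # saturation current in amps (assumed, illustrative)
n   = 2.0        # ideality factor (assumed)
V_T = 0.02585    # thermal voltage kT/q at ~300 K, in volts

def diode_current(v):
    return I_s * (math.exp(v / (n * V_T)) - 1.0)

for v in (2.0, 2.1, 2.2, 2.3):
    print(f"V = {v:.1f} V -> I = {diode_current(v) * 1000:.2f} mA")
# Each 0.1 V step multiplies the current by roughly exp(0.1 / (n * V_T)) ~ 7x,
# which is why a "stiff" voltage source can easily overdrive an LED.
```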
That is why LED drivers use the feedback described above: a current-sensing resistor measures the current through the load, and the driver automatically adjusts its output voltage to hold that current at the desired value.
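Here is a toy simulation of what such a feedback loop does: measure the current through a sense resistor, compare it to the target, and nudge the output voltage until they match. The diode parameters, sense-resistor value, and the simple proportional control law are all assumptions for illustration, not a real driver design:

```python
import math

I_S, N, V_T = 1e-20, 2.0, 0.02585   # assumed LED (Shockley) parameters, as above
R_SENSE = 1.0                        # current-sense resistor in ohms (assumed)
I_TARGET = 0.020                     # desired LED current: 20 mA

def led_voltage(i):
    """Forward voltage of the assumed LED at current i (inverse Shockley equation)."""
    return N * V_T * math.log(i / I_S + 1.0)

def sensed_current(v_out):
    """Solve for the current in the series LED + sense-resistor loop by bisection."""
    lo, hi = 0.0, v_out / R_SENSE
    for _ in range(60):
        mid = (lo + hi) / 2
        if led_voltage(mid) + mid * R_SENSE > v_out:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Feedback loop: raise or lower the output voltage in proportion to the current error.
v_out = 1.5                           # initial guess for the driver output voltage
for _ in range(200):
    error = I_TARGET - sensed_current(v_out)
    v_out += 2.0 * error              # simple proportional correction (assumed gain)

print(f"Settled output: {v_out:.3f} V, "
      f"LED current: {sensed_current(v_out) * 1000:.2f} mA")
```

In a real driver the "nudging" is done by the regulator's analog or switching control loop rather than by software, but the principle is the same as in the question: the driver adjusts its output voltage until the current sensed through the sense resistor equals the setpoint.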