So, I was preparing to do a video demonstrating the importance of putting a current-limiting resistor in series with an LED.
So I set up a simple project: switch, 22 kΩ resistor, then a 220 µF capacitor in parallel with an LED (with a second 22 kΩ resistor in series with the LED). Push the switch, LED fades on. Release the switch, LED fades out. This part of the demonstration worked great, and let me demonstrate how a capacitor starts out looking like a short circuit, then its apparent resistance climbs until it looks like an open circuit.
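For anyone who wants the numbers, the fade timing falls straight out of the RC time constant. Here's a quick sketch (assuming a 5 V supply and ignoring the loading of the LED branch, both of which are guesses on my part):

[code]
# Rough numbers on the fade timing: RC time constant of the charge path.
# Assumes a 5 V supply and ignores the loading of the LED branch;
# all values here are illustrative, not measured.
import math

V_supply = 5.0   # assumed supply voltage (V)
R = 22e3         # inrush resistor (ohms)
C = 220e-6       # capacitor (farads)

tau = R * C      # time constant, ~4.8 s
print(f"tau = {tau:.2f} s")

for t in [0, tau, 2 * tau, 3 * tau, 5 * tau]:
    v = V_supply * (1 - math.exp(-t / tau))  # standard RC charging curve
    print(f"t = {t:5.1f} s -> Vcap = {v:.2f} V")
[/code]

With those values tau is about 4.8 s, which lines up with a visible multi-second fade.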
Then I wanted to dump the capacitor into the LED without current regulation, so I pull the resistor that's in series with the LED, leaving just the inrush resistor, then the LED in parallel with the capacitor.
My theory was that as long as you hold the switch, the capacitor is fine (because of the inrush resistor), but as soon as you interrupt the power, the capacitor dumps an unregulated pulse of current into the LED, and kills it.
So I push the switch, wait for the voltage to peak, then release it. The LED fades out, and nothing interesting happens. So I figure the 220 µF cap is too small to dump enough current, and I upgrade to a 470 µF. Same result. Next I try a 6 F supercap. Still nothing interesting (side note: it's almost 24 hours later and the LED is still lit).
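A rough back-of-envelope suggests the 24-hour glow is plausible (the voltage droop and average current here are guesses, not measurements):

[code]
# Back-of-envelope: how long can a 6 F supercap keep an LED faintly lit?
# The droop and average current below are guesses, not measurements.
C = 6.0        # supercap (farads)
dV = 0.2       # assumed sag from ~2.0 V down to ~1.8 V, still faintly lit
I_avg = 20e-6  # assumed average LED current at a faint glow (amps)

t = C * dV / I_avg               # Q = C*dV, so t = Q/I
print(f"~{t / 3600:.0f} hours")  # ~17 hours with these guesses
[/code]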
So, based on these observations, I'm drawing the conclusion that you can't get runaway current through an LED unless the source voltage is higher than the LED's forward voltage, and this circuit charges the capacitor to almost exactly the LED's forward voltage.
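Modeling it crudely seems to back that up: since the LED sits directly across the cap, the cap can't charge past whatever voltage the LED clamps it to at the trickle current the inrush resistor allows, so releasing the switch produces no voltage step and no surge. Here's a rough simulation sketch (LED modeled with the Shockley diode equation; the saturation current and ideality factor are guessed values, not from a real part):

[code]
# Crude simulation: a charged cap discharging straight into an LED.
# The LED is modeled with the Shockley diode equation; Is and n are
# guessed values, not measured from a real part.
import math

C = 220e-6   # capacitor (farads)
Is = 1e-18   # assumed LED saturation current (A)
n = 2.0      # assumed ideality factor
Vt = 0.0259  # thermal voltage at room temperature (V)

def led_current(v):
    # Shockley diode equation: I = Is * (exp(V / (n*Vt)) - 1)
    return Is * (math.exp(v / (n * Vt)) - 1)

v = 1.9      # assumed starting cap voltage: the LED's Vf at the trickle current
dt = 1e-4    # time step (s)
for k in range(20001):    # simulate 2 seconds
    i = led_current(v)
    if k % 2500 == 0:     # print a sample every 0.25 s
        print(f"t = {k * dt:.2f} s  V = {v:.3f} V  I = {i * 1e6:.1f} uA")
    v -= i * dt / C       # dV = -I*dt/C: the cap drains through the LED
[/code]

The current starts at whatever the trickle level was before the switch opened and only ever decays from there, so there's no mechanism for a spike.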
Am I on the right track here?
(I'm currently on my couch buried in cats, but I'll upload diagrams a little later if you guys want)