EEVblog Electronics Community Forum
Electronics => Projects, Designs, and Technical Stuff => Topic started by: PbFoot on September 27, 2012, 12:02:49 am
-
Hi,
I am working on a device that pulses an array of high-output LEDs. I am using a TIP120 to drive 4 parallel groups of 4 series LEDs. Each LED is rated for 3V at 1A maximum pulse current. I am using a 1% duty cycle, so each pulse is in the microseconds range.
My question is this: I have set my bench power supply to 12V with a 4A current limit. When I run the LED array, the ammeter on the supply reads only about 25mA. I know this can't be right, and I suspect it's because of the very short pulses. Am I right?
Part 2: How can I verify that my LEDs are using the correct current? Do I need a current probe for my scope?
-PbFoot
-
Add a small sense resistor in series with the LEDs, and measure the voltage drop across it with your scope. Ohm's law does the rest.
Choose a very low value so the circuit is not affected too much; perhaps 100mOhm, so 1A peak gives you a 100mV reading.
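To make the arithmetic concrete, here's a minimal sketch of the sense-resistor conversion. The values are assumptions matching the numbers in this thread (100mOhm shunt, 100mV scope reading), not measurements:

```python
# Convert a scope reading across a series sense resistor to LED current.
# Values are illustrative, taken from the suggestion above.
R_SENSE = 0.1    # ohms: 100 mOhm shunt in series with the LED string(s)
v_scope = 0.100  # volts: peak voltage measured across the shunt

i_peak = v_scope / R_SENSE  # Ohm's law: I = V / R
print(f"Peak current through shunt: {i_peak:.2f} A")  # -> 1.00 A
```

Note that what this current means depends on where the shunt sits: in series with one string it reads that string's current; in series with the whole array it reads the sum of all four strings.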
-
Preferably a non-inductive type, so the fast pulse edges aren't distorted.
-
The current meter will be reading the average, so 25mA is in the right range: 4 parallel strings at up to 1A each is 4A peak, and at 1% duty that averages to a few tens of mA.
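A quick sanity check of that averaging, using the circuit as described in the original post (4 strings at 1A each, 1% duty; these are the thread's stated worst-case numbers, not measured values):

```python
# Average current a bench-supply ammeter would read for a pulsed load.
i_peak_total = 4 * 1.0  # A: 4 parallel strings, 1 A max pulse each
duty = 0.01             # 1 % duty cycle

i_avg = i_peak_total * duty  # ammeter reads the time average
print(f"Expected average: {i_avg * 1000:.0f} mA")  # -> 40 mA
```

The measured 25mA being a bit below the 40mA worst case just suggests the strings are pulsing somewhat under the full 1A rating.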
-
A uCurrent-style current shunt amplifier for a scope would be a very useful tool for a lot of stuff. An upper bandwidth of 50kHz or so should be fairly easy, maybe up to 1MHz with careful design. For isolation, TI makes isolated delta-sigma modulators.