I would like to verify whether this is the proper way to use an adjustable "current limiting" power supply (I'm thinking of the cheap units; the expensive ones might work differently, but this probably applies to most).
The idea is:
First set a reasonable current limit for the component under test, then slowly ramp up the voltage until the operating voltage is reached. For example, if testing a simple LED, set the current limit to the desired value (say 10-20 mA), set the voltage to 0, connect the output, and ramp up the voltage until the forward voltage is reached (or until you see the current hit the set value). If by accident you don't stop at, say, 3 V but go as far as 4 V, the current limiting will have had time to kick in and do its thing.
On the other hand, if you relied exclusively on the current limiting, left the "maximum voltage" at 10 V or 20 V or whatever the case might be, and then connected the LED, it might get damaged before the current regulation kicks in and the voltage drops. A more expensive power supply might do better than a cheap one in such a case.
The possible issue with this idea is that I'm not sure it is always applicable. I assume some circuits may require the proper voltage and not work properly if started at a lower voltage that is then ramped up. In that case, perhaps setting the absolute minimum operating voltage plus a reasonable current limit is the best way to test. It's best to start safe with minimum values so as to take advantage of the "protective features" of an adjustable power supply.
I would just like to confirm the logic. I should know better, having previously read and participated in some power supply discussions, but it never hurts to check, confirm, and learn something new.
There is still a voltage ramp time, and this time is longer than the current sampling time. With decent power supplies it is safe to just turn on the full voltage.
The problem is that, depending on the power supply you have, the output capacitance can be quite high, and the higher its value, the bigger the problem you describe becomes. Linear power supplies are usually good and fast and have fairly low output capacitance, but most SMPS units can damage sensitive electronics, so always check your output voltage before connecting anything.
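To put rough numbers on that: the energy parked in the output capacitor scales with the square of the set voltage, which is why leaving the supply at 20 V is so much worse than ramping to ~3 V. A minimal sketch, with the capacitance value assumed purely for illustration:

```python
# Rough sketch (assumed values): the energy stored in a supply's output
# capacitor can be dumped into the load before the current loop reacts.
def cap_energy_joules(capacitance_f, voltage_v):
    """E = 1/2 * C * V^2 -- energy stored in the output capacitor."""
    return 0.5 * capacitance_f * voltage_v ** 2

# A cheap SMPS bench supply might have ~1000 uF on its output (assumed).
e_at_20v = cap_energy_joules(1000e-6, 20.0)  # supply left at full 20 V
e_at_3v = cap_energy_joules(1000e-6, 3.0)    # ramped up near LED Vf

print(f"{e_at_20v * 1000:.0f} mJ at 20 V vs {e_at_3v * 1000:.1f} mJ at 3 V")
# -> 200 mJ at 20 V vs 4.5 mJ at 3 V
```

Roughly a 40x difference in stored energy, delivered with no regard for the current limit setting.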
On the other hand, if you relied exclusively on the current limiting, left the "maximum voltage" at 10 V or 20 V or whatever the case might be, and then connected the LED, it might get damaged before the current regulation kicks in and the voltage drops. A more expensive power supply might do better than a cheap one in such a case.
If you must rely on the current limit to limit the current, you can do it the following way:
1. Set the power supply to the desired voltage.
2. Place a short circuit across its output and adjust the current limit until the desired current flows.
3. Place the device under test (in this case the LED) in parallel with the short circuit.
4. Remove the short circuit.
This will ensure the current through the device will be that set by the current limit or less. There will not be any overvoltage issues.
More complex devices may malfunction if run at a lower voltage than specified. For example, MOSFETs are usually intended to be either fully on or fully off; driving the gate with too low a voltage may turn the MOSFET only partially on, greatly increasing power dissipation and causing it to burn up. Motors also draw more current and run hotter at low voltages.
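As a back-of-the-envelope illustration of the partially-on MOSFET problem (all numbers assumed for the sketch, not taken from any datasheet):

```python
# Assumed figures: a MOSFET driven with too little gate voltage sits in
# its linear region with a much higher channel resistance than Rds(on).
def mosfet_dissipation_w(current_a, rds_ohm):
    """P = I^2 * R -- power dissipated in the MOSFET channel."""
    return current_a ** 2 * rds_ohm

fully_on = mosfet_dissipation_w(5.0, 0.01)   # Rds(on) ~ 10 mOhm (assumed)
partly_on = mosfet_dissipation_w(5.0, 1.0)   # linear region, ~1 Ohm (assumed)

print(f"fully on: {fully_on:.2f} W, partially on: {partly_on:.0f} W")
# -> fully on: 0.25 W, partially on: 25 W
```

A hundredfold increase in dissipation at the same load current, which is why an undervolted gate can cook a transistor that would run cool when driven properly.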
If you find yourself often testing LEDs and other diodes you could make a proper constant current source.
On the other hand, if you relied exclusively on the current limiting, left the "maximum voltage" at 10 V or 20 V or whatever the case might be, and then connected the LED, it might get damaged before the current regulation kicks in and the voltage drops. A more expensive power supply might do better than a cheap one in such a case.
Some really good power supplies have a tiny amount of output capacitance and they are more suitable for constant current applications. Most have too much output capacitance and some have way too much.
A better way is to include either a ballast resistor or high impedance current source in series with the device being tested. If you have money, then a source meter would be the way to go.
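For the ballast resistor approach, the value is just Ohm's law applied to the resistor's share of the supply voltage. A quick sketch with assumed LED figures:

```python
# Series ballast resistor sizing (all example values assumed):
# the resistor drops whatever voltage the LED doesn't.
def ballast_resistor_ohm(v_supply, v_forward, i_target):
    """R = (Vs - Vf) / I -- series resistor to set the LED current."""
    return (v_supply - v_forward) / i_target

# 5 V supply, ~3 V forward drop, 15 mA target current (assumed values)
r = ballast_resistor_ohm(5.0, 3.0, 0.015)
print(f"{r:.0f} ohm")  # -> 133 ohm; round to the nearest standard value
```

Because the resistor's drop dominates small errors in Vf, the current stays close to the target even if the supply voltage is a little off, which is exactly the protection the bare current limit fails to provide quickly.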
The possible issue with this idea is that I'm not sure it is always applicable. I assume some circuits may require the proper voltage and not work properly if started at a lower voltage that is then ramped up. In that case, perhaps setting the absolute minimum operating voltage plus a reasonable current limit is the best way to test. It's best to start safe with minimum values so as to take advantage of the "protective features" of an adjustable power supply.
Some circuits will misbehave, possibly even destructively, if operated at low voltage. They might for instance latch into a non-functional state or not even start. New IC designs occasionally have problems starting if their supply voltage ramps up too slowly because their startup circuit was poorly designed.
Some circuits will misbehave, possibly even destructively, if operated at low voltage.
I am playing around with some Chinese buck converters (12 V, 3 A) powered by a 19 V Dell laptop power supply. My biggest surprise is how long it takes for the output voltage to drop to 0 when turned off (the load is a power resistor). I wondered if this slow voltage drop (maybe 5 seconds) could damage something. The worst thing is that I am not used to auto-ranging meters, so I watched the voltage drop to 500 volts (it was actually 500 mV). I had to have a stiff drink after that one. FYI, I do not work in the electronics field (thank god for those who do).
Should I be concerned about this slow voltage drop? The PSU will only be used for Arduino type of stuff.
thanks
Should I be concerned about this slow voltage drop? The PSU will only be used for Arduino type of stuff.
A slow voltage drop on power-down is usually less of a problem, because the final state is simply off.