I would like to verify whether this is the proper way to use an adjustable current-limiting power supply (I have the cheap units in mind; the expensive ones might behave differently, but this probably applies to any of them).
The idea is:
First set a reasonable current limit for the component under test, then slowly ramp up the voltage until the operating voltage is reached. For example, when testing a simple LED: set the current limit to the desired value (say 10-20 mA), set the voltage to 0, connect the output, and ramp the voltage up until you reach the forward voltage (or until you see the current hit the set limit). If you accidentally don't stop at, say, 3 V but go as far as 4 V, the current limiting has time to kick in and do its thing.
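To pin down the exact sequence I mean, here it is as a rough sketch, assuming a SCPI-programmable supply driven over pyvisa. The VISA address is a placeholder, the SCPI command names vary between supplies, and of course a cheap manual unit has no remote interface at all, in which case this is just the order of front-panel steps:

```python
# Sketch of "set current limit first, then ramp voltage" for an LED test.
# Assumptions: SCPI-style commands (CURR/VOLT/OUTP/MEAS:CURR?), a made-up
# VISA address, and the 20 mA / ~3 V numbers from the LED example above.
import time
import pyvisa

CURRENT_LIMIT_A = 0.020   # desired LED current (10-20 mA range)
VOLTAGE_STOP_V  = 3.0     # don't ramp past the expected forward-voltage region
STEP_V          = 0.1     # small voltage steps
STEP_DELAY_S    = 0.2     # let the supply (and the reading) settle

rm = pyvisa.ResourceManager()
psu = rm.open_resource("USB0::0x1234::0x5678::INSTR")  # placeholder address

psu.write(f"CURR {CURRENT_LIMIT_A}")  # current limit goes in first
psu.write("VOLT 0")                   # start from 0 V
psu.write("OUTP ON")

v = 0.0
while v < VOLTAGE_STOP_V:
    v = min(v + STEP_V, VOLTAGE_STOP_V)
    psu.write(f"VOLT {v}")
    time.sleep(STEP_DELAY_S)
    i = float(psu.query("MEAS:CURR?"))
    if i >= CURRENT_LIMIT_A * 0.95:   # current limit reached: supply is in CC mode
        print(f"CC mode at ~{v:.1f} V, {i * 1000:.1f} mA; stop ramping")
        break

psu.write("OUTP OFF")
```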
On the other hand, if you relied exclusively on the current limiting, left the voltage setting at 10 V or 20 V or whatever the case might be, and then connected the LED, it might get damaged before the current regulation kicks in and the voltage drops. A more expensive power supply might handle this better than a cheap one.
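As I understand it, the main reason for this is that the supply's output capacitor is already charged to the preset voltage, and that charge is dumped into the LED the instant you connect it, faster than the current-limit loop can respond. A rough order-of-magnitude sketch, with values I am making up purely for illustration (output capacitance of 100 µF, supply preset to 10 V):

```latex
E = \tfrac{1}{2} C V^{2}
  = \tfrac{1}{2}\,(100\ \mu\mathrm{F})\,(10\ \mathrm{V})^{2}
  = 5\ \mathrm{mJ}
```

That stored energy arrives as a current spike limited mostly by wiring resistance, regardless of how fast the current regulation itself is.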
The possible issue with this idea is that I'm not sure it is always applicable. I assume some circuits require the proper voltage and won't behave correctly if started at a lower voltage that is then ramped up. In that case, perhaps setting the minimum operating voltage and a reasonable current limit is the best way to test. Either way, it seems best to start with safe, minimal values so as to take advantage of the "protective features" of an adjustable power supply.
I would just like to confirm the logic. I should know better, having previously read and participated in some power supply discussions, but it never hurts to check, confirm, and learn something new.