I would like to know how much time it takes a certain power supply to rise from zero to its rated output voltage (15 VDC) once voltage starts appearing on the output. I want to find out whether the power supply will meet some critical parameters for the intended load, which has a built-in processor that can corrupt its own firmware (apparently due to a manufacturer design glitch): somehow it can get into an incorrect state if the ramp-up of the supply voltage is too "slow", or unstable. I'm not sure how or why exactly this problem happens, as I'm not intimately familiar with the design of the device, but I was advised that it can occur with a slow rise-time power supply. I had the device connected to a variable LM338K-based power supply; all seemed to be going well, and then about an hour in the device suddenly stopped working. I sent it back and, sure enough, the firmware was corrupted and had to be reloaded.
So I just got this small stand-alone 35W switching power supply, which seems to be very stable, and now, more out of curiosity, I wanted to see if there is a way to actually measure its output voltage rise time. With a DVM set on a fixed range the change from zero to 15V of course appears instantaneous, but I wonder how many milliseconds it actually takes to fully reach 15V.
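For reference, the number I have in mind is the usual 10%-90% rise time (that is just my assumption of what counts as "slow" here; the device maker might actually care about monotonicity or some minimum dV/dt instead). For a 15V rail that works out to:

    t_r = t(13.5 V) - t(1.5 V)    (time from 10% to 90% of the 15V rated output)

So on the scope I would be looking for the time between the trace crossing 1.5V and crossing 13.5V.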
I am assuming the obvious way to go would be with a scope; I have a Tektronix 2247A, a 2213A and a 222. Since the parameter to be measured is not a repetitive event in the usual waveform sense, I am guessing I will need a storage scope to capture this one-time event, and I am hoping my 222 will do the trick. But I wanted to ask for opinions on the best setup to capture it. Also, it's been a while since I last used the 222 (my main scope is the 2247A), so if anyone could refresh my memory on how to set it up, that would also be helpful and probably save me some fiddling time.
Thanks in advance.