I have a power supply from an old telephone system whose control circuitry has burned out. Only the regulator chip is bad; everything else is fine. I would like to convert it into a laboratory power supply with adjustable voltage and current limiting.
The original power supply is marked as delivering 22-26 volts DC regulated at 5 amps, and it also provides two 11 VAC outputs, each isolated from the other and from the DC winding. All three outputs are marked as capable of supplying 5 amps, one DC and the other two AC. I measured the output voltage of all three secondary windings: two of them read 11 VAC as expected, and the DC winding reads 30 VAC RMS. I also measured 41 volts DC on the main filter capacitor. That makes sense, as it is the peak voltage minus two diode voltage drops in the bridge rectifier.
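For anyone checking the arithmetic, here is a quick sketch of the rectifier math (assuming roughly 0.7 V forward drop per silicon diode, which is my guess, not a measured value):

```python
import math

v_rms = 30.0      # measured RMS voltage of the higher-voltage secondary
diode_drop = 0.7  # assumed forward drop per diode (silicon)

v_peak = v_rms * math.sqrt(2)   # peak of the AC waveform, about 42.4 V
v_dc = v_peak - 2 * diode_drop  # two diodes conduct per half-cycle in a bridge

print(round(v_peak, 1))  # ~42.4
print(round(v_dc, 1))    # ~41.0, matching the measured filter-cap voltage
```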
Is it reasonable for me to assume that the higher-voltage winding is rated at 212 VA (peak voltage times 5 A) and that the other two windings are rated at 55 VA each (11 VAC RMS times 5 A)?
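To make the two figures in my question explicit, here is how I arrived at them (the peak-based number is the assumption I'm asking about; the RMS-based number is what the same winding works out to if VA is conventionally RMS voltage times current):

```python
import math

i_rated = 5.0      # marked current rating of each winding
v_rms_main = 30.0  # measured RMS of the higher-voltage winding
v_rms_aux = 11.0   # measured RMS of each auxiliary winding

va_peak_based = v_rms_main * math.sqrt(2) * i_rated  # ~212 VA (peak x I)
va_rms_based = v_rms_main * i_rated                  # 150 VA (RMS x I)
va_aux_each = v_rms_aux * i_rated                    # 55 VA per 11 VAC winding

print(round(va_peak_based))  # ~212
print(va_rms_based)          # 150.0
print(va_aux_each)           # 55.0
```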
FYI, the regulated supply used three 2N3055s, which I will scavenge.
I'd like to build the adjustable supply to provide 0-30 VDC regulated at 5 A from the higher-voltage winding, and to put the two 11 VAC windings in series to feed a second regulated supply at 0-20 VDC and 3 A.
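A rough headroom check for that plan (same assumed 0.7 V diode drops as above; this is unloaded peak voltage, so the real figures will sag under load):

```python
import math

DIODE_DROP = 0.7  # assumed forward drop per diode in a bridge rectifier

def dc_bus(v_rms):
    """Approximate unloaded filter-capacitor voltage after a bridge rectifier."""
    return v_rms * math.sqrt(2) - 2 * DIODE_DROP

v_main = dc_bus(30.0)    # bus feeding the 0-30 V / 5 A output, ~41.0 V
v_series = dc_bus(22.0)  # two 11 VAC windings in series, ~29.7 V

print(round(v_main - 30.0, 1))    # headroom above 30 V out: ~11.0 V
print(round(v_series - 20.0, 1))  # headroom above 20 V out: ~9.7 V
```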
Does that seem reasonable, or do I need to derate it more? Any problem with putting two transformer windings in series, as long as I get the polarity correct?
Thanks for any comments.