All:
I restore old test gear, primarily HP, as a hobby, and I have more and more been wondering how to deal with failed - or about to fail any day now - power supplies. Typically, anything I work with is at least 35 to 50 years old, so I just replace any and all electrolytics; if they haven't failed yet, they soon will. Unless something is rare or there's an aesthetic reason to keep it original, I do "functional" restorations: use the best modern components I can, with the goal of the equipment lasting another 30 or 40 years, rather than trying to match the exact component appearance - I restore to **use**, not to sit on a shelf. Which brings me to where 90 percent of the repairs are: the power supplies.
The one(s) in question at the moment are in a couple of 3335As that I just acquired. One has a bad rail; the other works (for now). IMHO, the 3335A is a truly amazing beast with performance that is very rarely matched today, and it was obviously designed by very, very talented EEs... who got high and decided to play "what if." And while they were busy eating chips and designing wondrous things, the new kid got stuck with doing the thermal management for the thing. When he pointed out that the senior engineers had created a mighty nice oven, he was evidently ignored. Because this otherwise impressive feat of engineering was sent out the door with a temperature disconnect on the power supply. Not on a pass transistor or such, oh no. It's there for high ambient temperatures.
High. Ambient. Temperatures.
Sigh.
All snark aside, it really struck me that it would just make more sense to do a completely new supply using modern components - keeping the transformer and staying linear, of course. Which made me think of how many other pieces I've worked on where it would have been faster and cheaper to do the same. It seems kinda *wrong* somehow, but is it really any different than using a better-spec'd modern radial in place of an old screw-in can? Or using a radial in place of an axial?
So, what are your thoughts and opinions?
TIA,
Hal