I have no idea. I could probe the voltage across the base resistor R13, but honestly I'm kind of sick of seeing the inside of this heap of junk. Replaced the dead transistor, added a temperature LED, it works, I'm happy. (I also rebuilt half the board, since whoever assembled it apparently couldn't push a component all the way through the holes in the PCB. Some of the leads were within a millimeter of touching a nearby component and shorting out, with the body of the damn component sitting 2cm off the board...)
Out of curiosity, is it actually common for a power supply to have no temperature sense whatsoever, or is that a "cheap piece of crap" thing? It seems like really poor engineering to make a power supply that can be physically damaged by running it within its specified output range.
(The thing was loaded with other weird and plain stupid design choices, too. The front panel pots were 6.8k! I replaced the voltage pot because it was terribly scratchy, swapping in a 50k and changing the other resistors in the voltage divider to match, and all is well. They could have used any old value as long as the rest of the divider matched, and they chose 6.8k. They used a separate op amp chip for the constant-voltage loop, even though there were two unused op amps on the main chip, and the separate chip is no better in any way. Tons of unnecessary power dissipation too, especially in the various voltage regulators, which all drop huge voltages from the unnecessarily high input voltage. Some capacitors run really close to their rated voltage, even though they used expensive brands; it's an input filter cap on a 7812, for Christ's sake! Just use a cheaper brand than Nippon Chemi-Con and get the proper voltage rating! Et cetera...)
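For anyone doing the same pot swap: the point is that only the *ratios* in the divider matter, so every resistor gets multiplied by the same factor (here 50k/6.8k). A minimal sketch of that arithmetic, where the fixed-resistor names and values are hypothetical examples (only the 6.8k-to-50k pot swap comes from my actual supply):

```python
def scale_divider(resistors, old_pot=6800.0, new_pot=50000.0):
    """Scale every fixed resistor in a divider by the same factor as the
    pot, so all the ratios (and thus the output voltage range) are
    unchanged."""
    k = new_pot / old_pot  # ~7.35 for a 6.8k -> 50k swap
    return {name: value * k for name, value in resistors.items()}

# Hypothetical fixed resistors above and below the pot:
original = {"R_top": 1000.0, "R_bot": 2200.0}
scaled = scale_divider(original)
for name, value in scaled.items():
    print(f"{name}: {value:.0f} ohms")
```

In practice you'd then round each result to the nearest standard E24/E96 value, which shifts the ratios slightly; pick the rounding direction that keeps the top-of-range voltage where you want it.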