Most audio amplifiers (except truly professional ones) won't stand that power for longer than a minute or so; the heat sink and the power supply (transformer) usually aren't designed for it. I once read a rule of thumb that an audio power amplifier's supply and heatsink are sized for about 70% of its nominal sinewave power.
Actually a myth, and easily busted on the test bench. Under sinewave conditions the maximum heating of the output transistors occurs at some point below maximum output.
They are effectively series dropper elements, and when there is no voltage across a transistor, no heat is produced in it. Of course that condition never arises with a sinewave output. The voltage across each transistor is the supply rail minus the output voltage, while the current through it is the output voltage divided by the load resistance. The instantaneous heating is therefore (Vsupply - Vout) x Vout / R, which peaks when the output sits at half the supply rail, well short of full swing.
Averaged over a cycle, the heat in the output pair works out to 2*Vs*Vp/(pi*R) - Vp^2/(2*R), which is largest when the peak output voltage is 2/pi (about 64%) of the supply rail. That corresponds to roughly 40% of full sinewave power, so reducing the output moderately will cause the output stage to run considerably hotter than at full sinewave power.
Only when you reduce the output to below about 40% of full power does the total heat produced start to fall.
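A quick numerical sketch of this argument, assuming an idealized class-B push-pull stage with fixed rails and a purely resistive load (the 40 V rail and 8-ohm load are illustrative values, not from any particular amplifier; quiescent current and crossover effects are ignored):

```python
# Idealized class-B push-pull stage: numerical check that transistor
# dissipation peaks below full sinewave output. Rail voltage and load
# resistance are assumed, illustrative values.
import math

VS = 40.0   # supply rail, volts (assumed)
R = 8.0     # load resistance, ohms (assumed)

def avg_dissipation(vp, vs=VS, r=R, steps=10000):
    """Average heat in the output pair over one half cycle.
    Instantaneous dissipation is (vs - vout) * vout / r."""
    total = 0.0
    for i in range(steps):
        theta = math.pi * (i + 0.5) / steps
        vout = vp * math.sin(theta)
        total += (vs - vout) * vout / r
    return total / steps

def output_power(vp, r=R):
    # Mean sinewave power delivered to the load.
    return vp ** 2 / (2 * r)

full = avg_dissipation(VS)        # dissipation at full swing
worst_vp = 2 * VS / math.pi       # theory: worst case at Vp = 2*Vs/pi
worst = avg_dissipation(worst_vp)

print(f"full-output dissipation: {full:.1f} W")
print(f"worst-case dissipation : {worst:.1f} W, reached at "
      f"{100 * output_power(worst_vp) / output_power(VS):.0f}% of max power")
```

With these numbers the stage delivers 100 W flat out while dissipating about 27 W, but dissipates about 40 W when driven to roughly 40% of full power, which is the "runs hotter below full output" effect described above.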
It's possible that the mains transformer could overheat under continuous full output, but I've never seen one burn out.
The thing that will kill a domestic-grade amp on stage is hooking too many paralleled speakers to it. A 2-ohm load is a definite no-no, 4 ohms is stretching things, and 8 ohms is OK. Into a very low impedance load at loud but not maximum volume, the thing will get roasting hot.
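To put rough numbers on the load-impedance point, here is a minimal sketch using the same idealized class-B model (fixed rail, resistive load, sinewave drive; the 40 V rail and 25 V peak drive are assumed, illustrative values). At a fixed drive voltage, the current and hence the heat scale inversely with the load impedance:

```python
# Idealized class-B stage: at a fixed drive voltage, output-stage heat
# scales inversely with load impedance. Rail and drive voltages are
# assumed, illustrative values, not from any particular amplifier.
import math

VS = 40.0   # supply rail, volts (assumed)
VP = 25.0   # peak output voltage: loud, but short of clipping (assumed)

def pair_dissipation(vp, r, vs=VS):
    # Closed-form average over a half cycle of (vs - vout) * vout / r
    # for a sinewave output of peak vp.
    return 2 * vs * vp / (math.pi * r) - vp ** 2 / (2 * r)

for load in (8.0, 4.0, 2.0):
    print(f"{load:3.0f} ohm load: {pair_dissipation(VP, load):6.1f} W of heat")
```

Halving the load doubles the heat, so the 2-ohm case makes the output stage dissipate four times what it would into 8 ohms at the same volume setting.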