Yeah, get one that fails shorted.
Could you provide any hint of how long these things are supposed to survive?
Momentary overload isn't an unusual thing, but surviving more than ~milliseconds of it depends on having a lot more thermal mass than a typical "1W" component offers.
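To put rough numbers on that "~ms" figure: a pulse much shorter than the part's thermal time constant is effectively adiabatic, so the energy just heats whatever volume absorbs it. Everything below (dimensions, material constants, pulse values) is an assumption for a generic chip resistor, not any real part's data:

```python
# Adiabatic estimate: for short pulses, heat has no time to spread, so the
# temperature rise is set by the heat capacity of the volume that actually
# absorbs the pulse:  dT = (P * t) / (rho * c * V)

def pulse_temp_rise(power_w, time_s, volume_m3, rho=2500.0, c=800.0):
    """Temperature rise (K); rho, c assumed for an alumina-like ceramic."""
    return (power_w * time_s) / (rho * c * volume_m3)

# Assumed heated volume: the resistive film plus a thin slice of substrate
# on a "1W" chip resistor, say 2mm x 1mm x 0.1mm:
v = 2e-3 * 1e-3 * 0.1e-3
print(f"50 W for 1 ms:  dT ~ {pulse_temp_rise(50.0, 1e-3, v):.0f} K")   # ~125 K: survivable
print(f"50 W for 10 ms: dT ~ {pulse_temp_rise(50.0, 10e-3, v):.0f} K")  # ~1250 K: dead
```

Same power, 10x the duration, and the estimate goes from a tolerable temperature excursion to certain destruction; beyond the ms scale you need bulk, not ratings.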
The easiest watt to dissipate is the watt you never dissipate in the first place: are you sure it can't be avoided through better design (e.g., instead of shunting a surge, disconnect from the source), or handled with greater complexity (e.g., instead of dissipating the power, use a switching circuit to return it to the source or to a capacitor, or transform it into some other form of energy, such as EM radiation)?
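A crude comparison of the first two options, using an entirely assumed surge (an 8/20us-style pulse into a TVS-like clamp), just to show the scale of energy involved:

```python
# Shunting: the clamp absorbs roughly V_clamp * I_peak * t_eff per surge.
# Disconnecting: a series switch leaves that energy in the source and only
# eats its own (much smaller) switching loss.
i_peak  = 100.0   # A, assumed surge peak
t_eff   = 20e-6   # s, assumed effective pulse width
v_clamp = 30.0    # V, assumed clamping voltage

e_shunt = v_clamp * i_peak * t_eff
print(f"Clamp absorbs ~{e_shunt * 1e3:.0f} mJ per surge")  # ~60 mJ
```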
A somewhat relevant, and quite pedestrian, example of that last case: lamps used to be so inefficient that you could ignore their light output for heat-dissipation purposes. A 100W incandescent might give off 1W of useful output; everything else is useless IR (radiation) or heat (convection). LEDs have always been an improvement over incandescent, of course, but even just a decade ago they weren't much better from a thermal standpoint: they might've dissipated 90-95% of their electrical input as heat. Now that 100+ lm/W white LEDs are standard, their optical output is a significant fraction (10-30%) of the electrical input -- which allows chip sizes to shrink while keeping the same, or even higher, power ratings!
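As a sanity check on those fractions: a white phosphor spectrum carries very roughly 300 lm per optical watt (an assumed, spectrum-dependent figure), so dividing a lamp's lm per electrical watt by that gives the fraction of input leaving as light rather than heat:

```python
LM_PER_OPTICAL_WATT = 300.0  # assumed; ~250-350 depending on spectrum

def heat_watts(p_in_w, lm_per_w):
    """Heat load of an LED given input power and luminous efficacy."""
    optical_fraction = lm_per_w / LM_PER_OPTICAL_WATT
    return p_in_w * (1.0 - optical_fraction)

for lm_per_w in (30, 100, 160):
    print(f"{lm_per_w:>3} lm/W: a 3 W LED dissipates "
          f"~{heat_watts(3.0, lm_per_w):.1f} W as heat")
```

That gives ~2.7 W of heat at 30 lm/W versus ~2.0 W at 100 lm/W for the same "3W" package, which is exactly the headroom that lets the chip shrink.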
My direct example is a light fixture I built about that long ago, using cheap Chinese "3W" LEDs (the ~10mm dia. thermal-pad-with-leads style); over time, they aged to a dim purplish output -- not that I could tell how dim they'd gotten by eye. Last year I replaced half of them with new, brand-name, 5050-size LEDs. The new LEDs have somewhat lower voltage drop, much lower heat dissipation, and far more light output. The improvement is immense!
Tim