I am working on a semi-industrial device (a terminal with an LCD and some functions) and I am having a hard time with a dummy load circuit. This device is (together with other similar devices) powered from a 12 V - 18 V supply.
In the past, there was a problem with technicians who often overloaded the supply by using weaker supplies than recommended. Everything worked fine in stand-by mode, but when several devices switched to a mode where they draw more power, the whole network could crash.
In reaction to that, our former designer used a simple circuit: basically a comparator and a 22R resistor. From 0 to 6 V, the resistor would be engaged (peak current at about 275 mA), and after passing the 6 V threshold, the resistor would be disengaged and the input regulator for the rest of the electronics would be enabled. That way, all our devices load the supply right at power-up, which works surprisingly well.
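For reference, the numbers behind that peak-current figure, assuming the resistor sees the full 6 V just before the switch-over:

```latex
I_{peak} = \frac{V_{th}}{R} = \frac{6\,\mathrm{V}}{22\,\Omega} \approx 273\,\mathrm{mA},
\qquad
P_{peak} = \frac{V_{th}^2}{R} = \frac{(6\,\mathrm{V})^2}{22\,\Omega} \approx 1.6\,\mathrm{W}
```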
On the other hand, this circuit is "dumb": there is no logging and no error signalling. If the supply is unable to provide the required power, the device just doesn't work (and the resistor heats up significantly).
Because I was adding some other features (like input voltage sensing), I decided to redesign this circuit. I came up with this (IC1 is actually a switch-mode regulator):

T1 works as a current sink; when the input voltage reaches the regulator's threshold (about 5 V), it is disengaged by the 3V3 net rising above 0.5 V. The LOAD_ON signal is for switching this load on from the controller, which is useful for checking the connection to the supply.
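For context, the connection check I have in mind on the controller side looks roughly like this (a minimal sketch; the helper names `gpio_write`/`adc_read_input_mv`, the pin number, and the threshold are placeholders, not my actual firmware):

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical HAL helpers -- names and signatures are placeholders. */
extern void     gpio_write(int pin, bool level);
extern uint16_t adc_read_input_mv(void);   /* input voltage sense, in mV */
extern void     delay_ms(uint32_t ms);

#define PIN_LOAD_ON   5       /* drives the LOAD_ON net                  */
#define MIN_LOADED_MV 11000   /* sagging below this means a weak supply  */
#define SETTLE_MS     10

/* Engage the dummy load briefly and check that the supply holds up.
 * Returns true if the input stays above the minimum while loaded.   */
bool supply_check(void)
{
    gpio_write(PIN_LOAD_ON, true);      /* engage the T1 current sink */
    delay_ms(SETTLE_MS);
    uint16_t loaded_mv = adc_read_input_mv();
    gpio_write(PIN_LOAD_ON, false);     /* release the load           */

    /* A healthy supply sags only slightly; a weak one collapses. */
    return loaded_mv >= MIN_LOADED_MV;
}
```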
But I have doubts about the failure modes of this solution. The biggest one is that when IC1 is damaged, this load stays engaged. As far as I have tested it, T1 can survive continuous operation at 18 V while heating up to 120 °C, which could melt the plastic body of the device. There is some advantage in the temperature characteristics of the voltage drop across D1 and T2, but it's not enough.
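To put a rough number on that worst case (assuming the sink current is set near the old circuit's ~275 mA; the actual value depends on schematic details not shown here):

```latex
P_{T1} \approx V_{in} \cdot I_{sink} = 18\,\mathrm{V} \times 0.275\,\mathrm{A} \approx 5\,\mathrm{W}
```

which T1 would have to dissipate continuously for as long as the fault lasts.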
So I am not sure whether I should add some fail-safe mechanism or abandon this solution completely. Any ideas?