What happens when a 3.3V device/IC is run at 4-5V?
I had two cases like that.
One was a camera designed for an external 3.3V supply that was being run from a 4.2V supply.
It worked fine for a while, but eventually it stopped working on batteries and now won't start from anything much below 4V.
In another case, I ran a 3.3V IC at 4V (thinking it was a 3-5V part).
I noticed after a few days and brought it back down to 3.3V.
It kept working, but it no longer accepts I2C input signals at 3.3V.
I2C at 4V still works, though, even with the 3.3V supply.
It seems that ICs get addicted to high voltage once they've had a taste of it.
The question is: how does this happen?
I would have expected a gate breakdown somewhere, internal resistors overheating, or some other failure followed by smoke.