Hard to write a proper subject for this thread. I am designing a 1-cell li-ion powered device that should charge directly from the 12V rail of a running (and charging) car. The rail is typically ~13.8V while the alternator is charging, but on many models (especially older ones) it can be higher (15-16V). Automotive supplies also carry a lot of spikes, load dumps, etc., but I have already implemented input protection for that.
I only need a 3.3V power rail, supplied either by a 12V->4.2V buck converter or by the li-ion battery. The selection between battery and buck is handled by a simple MOSFET/Schottky-diode power path, whose output feeds a linear 3.3V regulator.
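For context, the headroom on that path looks roughly like this (just a quick sketch; the Schottky forward drop and LDO dropout below are assumed figures, not measured values):

```python
# Quick headroom check for the battery -> Schottky -> 3.3V LDO path.
# The Schottky forward drop and LDO dropout are assumed figures, not measurements.
V_SCHOTTKY = 0.35   # assumed forward drop of the power-path diode at ~400 mA
V_DROPOUT = 0.20    # assumed LDO dropout at ~400 mA
V_OUT = 3.3

for v_bat in (4.2, 3.7, 3.3, 3.0):            # typical li-ion cell voltages over discharge
    v_ldo_in = v_bat - V_SCHOTTKY             # what the linear regulator actually sees
    headroom = v_ldo_in - (V_OUT + V_DROPOUT)
    print(f"Vbat {v_bat:.1f} V -> LDO in {v_ldo_in:.2f} V, headroom {headroom:+.2f} V")
```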
Previously I used a 5V buck feeding a linear TP4056 charger, but then I found the BQ25606. It has several benefits (besides being cheap, from a reliable/good manufacturer, and needing little external circuitry): it's a switch-mode charger, so more efficient, and it has a built-in power path, so I can eliminate both the 4.2V buck regulator and the external power path. It seems I can tap the SYS pin to power my 3.3V regulator, and the charger takes care of everything else.
Would it be correct (i.e. as intended) to use the SYS pin like this?
However, it shuts off above 13.5V input (the absolute maximum rating is 22V). I had 10 assembled evaluation boards made before I discovered the 13.5V limit (the empirical way).
Is it bad practice to drop the input voltage a bit instead of finding a more expensive chip?
I expect to set the battery charge current somewhere between 500 and 1000mA, and the 3.3V regulator will supply 400mA at most.
Which methods are most appropriate/effective/safe for dropping it?
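For scale, here is a rough worst-case estimate for the simplest option, a series diode string (a sketch only; the efficiency, diode drop, and SYS-voltage figures are assumptions, not datasheet or measured values):

```python
import math

# Rough worst-case numbers for dropping the input below the BQ25606's ~13.5V cutoff
# with a string of series Si diodes. Assumed for estimation only: ~90% charger
# efficiency, SYS sitting at roughly the battery voltage, ~0.7V per diode.
V_ALT_MAX = 16.0    # worst-case alternator voltage mentioned above
V_ALT_TYP = 13.8    # typical charging voltage mentioned above
V_CUTOFF = 13.5     # input level where the charger shuts off
V_F = 0.7           # assumed forward drop per series Si diode
V_BAT = 4.2         # top-of-charge cell voltage
I_CHARGE = 1.0      # upper end of the planned charge current (A)
I_SYS = 0.4         # max 3.3V load; the linear reg passes roughly this current from SYS
EFF = 0.9           # assumed buck-charger efficiency

drop_needed = V_ALT_MAX - V_CUTOFF            # voltage to shed at the 16V worst case
n_diodes = math.ceil(drop_needed / V_F)       # series diodes needed to stay under the cutoff
v_charger_in = V_ALT_TYP - n_diodes * V_F     # charger input at a typical 13.8V alternator

p_out = V_BAT * (I_CHARGE + I_SYS)            # power into battery + SYS load
i_in = p_out / EFF / v_charger_in             # estimated charger input current
p_diodes = n_diodes * V_F * i_in              # dissipation in the diode string while charging

print(f"{n_diodes} series diodes, charger sees ~{v_charger_in:.1f} V at 13.8 V in")
print(f"Charger input current ~{i_in:.2f} A, diode string dissipates ~{p_diodes:.1f} W")
```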