My personal stance:
Most li-ion management ICs do a very poor job of making the designer's life easy. They have their own faults (and some are outright broken by design and dangerous in unobvious ways), so you need to understand them and babysit them.
I tend to prefer doing everything from scratch; at least then I know what's going on.
I have had to make one product recall (luckily only <5 units) due to a broken-by-design TI BQ li-ion charger IC with internally faulty overvoltage protection. An inaccurate application note I followed contributed to the problem; that part was my own fault, but I want to point out that even as an experienced designer I fell into the trap of copying parts of the appnote; it happens to a beginner even more easily.
As a result, I'm never putting a BQ anything in any li-ion product I design, ever. The trust is completely gone. They tend not to have any safety certifications either, so there is no bonus for paperwork - you are still fully responsible to the same degree as with a completely home-brew system.
An integrated product would make everything easier, but then it would need to be a really proper integrated solution, not some TI-acquired BQ shit IC which fails and requires as much control code as a full-blown BMS control algorithm.
My personal learning path has been:
* Looked at overly complex redistributive balancing schemes, built prototypes
* Tore down failed commercial BMS units, recognized the usual issue: too much quiescent current, possibly from being stuck in the wrong state, killing the cells
* Designed an overly simplified distributed BMS system. Still a valid design
* Tried two TI BQ ICs in another product, trying to "reduce development time" since I had much more to do than just battery management. An utter failure. Removed all TI BQ ICs from the production PCBs and redid it from scratch.
Additional points:
-Overcharge Protection
...
-Overvoltage
-Overdischarge Protection
...
-Undervoltage
What's the difference? Overcharge protection vs. overvoltage, and overdischarge protection vs. undervoltage, read like the same thing listed twice.
-Thermal Runaway
I assume you mean temperature monitoring, and specifically the upper limit.
Do not forget to add a lower temperature limit for charging. Derate the charging current linearly below 25 degC, reaching zero current at about -5 degC. (Traditionally a step function is used: no charging below 0 degC, full charging above 0 degC. I don't recommend it, as full charging current at +1 degC can be damaging to cell lifetime.)
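A minimal sketch of that linear derating, using the knee points above; the maximum current value and the function name are made-up illustrations, not figures from any particular cell:

```c
/* Hypothetical example values - substitute figures from your own cell datasheet. */
#define I_CHARGE_MAX_MA  2000.0f    /* full charge current allowed when warm enough */
#define T_FULL_DEGC        25.0f    /* full current at or above this cell temperature */
#define T_ZERO_DEGC       (-5.0f)   /* no charging at or below this cell temperature */

/* Linear cold-side derating of the charge current limit vs. cell temperature. */
float charge_current_limit_ma(float cell_temp_degc)
{
    if (cell_temp_degc <= T_ZERO_DEGC)
        return 0.0f;                /* too cold: no charging at all */
    if (cell_temp_degc >= T_FULL_DEGC)
        return I_CHARGE_MAX_MA;     /* warm enough: full current */

    /* Interpolate linearly between the two knee points. */
    float frac = (cell_temp_degc - T_ZERO_DEGC) / (T_FULL_DEGC - T_ZERO_DEGC);
    return I_CHARGE_MAX_MA * frac;
}
```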
You can't stop thermal runaway once it happens, and it almost never happens due to anything that BMS temperature monitoring could prevent. The only such scenarios I can think of are totally failing the thermal design of the pack, or completely miscalculating the fuse rating. Don't forget the traditional fuse!
So thermal runaways happen due to internal cell failures, physical damage (puncture/deformation), or severe overcharge (which you have already covered with voltage monitoring).
-Cell Balancing
Remember you need very little balancing, and not much muscle to do it. I have seen systems designed to shunt currents in excess of 1 A, and then fail at thermal design when a balancer sticks on. Don't do that. My system balanced at 40 mA, and was still fully able to keep a repurposed 300 Ah pack, with damaged cells of increased self-discharge, in balance.
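A minimal sketch of the kind of low-current passive (shunt) balancing decision I mean; the cell count, thresholds and names are illustrative assumptions, not values from any particular product:

```c
#include <stdbool.h>
#include <stdint.h>

#define N_CELLS             16      /* cells in series (illustrative) */
#define BALANCE_THRESH_MV   10      /* bleed a cell this many mV above the lowest cell */
#define BALANCE_MIN_MV    3400      /* don't bother balancing near-empty cells */

/* Decide which cells get their ~40 mA bleed resistor switched on,
 * based on per-cell voltages in millivolts. */
void update_balancing(const uint16_t cell_mv[N_CELLS], bool bleed_on[N_CELLS])
{
    uint16_t lowest = cell_mv[0];
    for (int i = 1; i < N_CELLS; i++)
        if (cell_mv[i] < lowest)
            lowest = cell_mv[i];

    for (int i = 0; i < N_CELLS; i++)
        bleed_on[i] = (cell_mv[i] > BALANCE_MIN_MV) &&
                      (cell_mv[i] > lowest + BALANCE_THRESH_MV);
}
```

At 40 mA a stuck-on bleed resistor dissipates only roughly 0.04 A * 3.7 V ≈ 0.15 W per cell, so the thermal design stays trivial even in the failure case.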
-SOH
I haven't seen any formal, scientifically valid definition for this. I have a feeling it was a trendy buzzword in battery academic papers a decade ago.
-Maximum charge current as a charge current limit (CCL)
Remember to add cell temperature dependency to this (the same kind of derating as sketched above for cold temperatures). Charging does less damage at higher temperatures; somewhere around 30-40 degC is optimum. I have measured greatly decreased cycle life when cycling in a refrigerator (at +6 degC) at the rated current.
-Total number of cycles
Cycle counting can be difficult, as the cycles have different depths and different starting points.
A cycle of 60%->30%->60% is almost meaningless for battery life. A cycle of 100%->70%->100% is a lot worse. A cycle of 100%->10%->100% is not that much worse anymore.
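As a crude starting point, one common approach is to count "equivalent full cycles" from charge throughput alone. A minimal sketch, assuming a simple periodic current measurement; note that it deliberately ignores where in the SOC window the cycling happens, which is exactly the effect described above:

```c
/* Crude "equivalent full cycle" counter: accumulate discharge throughput
 * and divide by pack capacity. Deliberately ignores WHERE in the SOC
 * window the cycling happens, which matters a lot for actual wear. */
typedef struct {
    double pack_capacity_ah;   /* nominal pack capacity in Ah */
    double discharged_ah;      /* total charge taken out so far, in Ah */
} cycle_counter_t;

/* Call periodically with the measured pack current (positive = discharging)
 * and the elapsed time since the previous call. */
void cycle_counter_update(cycle_counter_t *cc, double current_a, double dt_s)
{
    if (current_a > 0.0)
        cc->discharged_ah += current_a * dt_s / 3600.0;
}

double equivalent_full_cycles(const cycle_counter_t *cc)
{
    return cc->discharged_ah / cc->pack_capacity_ah;
}
```

If you want the count to reflect that shallow mid-SOC cycles barely matter, you need something like rainflow counting of the SOC swings combined with a depth- and window-dependent weighting, which is considerably more work.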