So I'm missing something here. Charging should terminate when the current reaches 0.1C, but how would the charger know what 0.1C is if we don't define the capacity first?
The requirements for charging are:
1) You must not exceed Imax. The value of Imax depends on the battery, and is usually proportional to the battery's capacity. So if you use a smaller battery than the charger was intended for, Imax may be exceeded. This is not good, as it could lead to overheating and battery damage.
2) You must not exceed Vmax. The value of Vmax depends on the chemistry and is usually 4.2 V for older lithium-ion designs, though it could be 4.3 V for some newer products on the market.
3) Charging should stop completely when the battery is full. The definition of "full" is quite flexible, and is most readily detected when the charging current falls to a "small" value during the CV phase. Setting the stop point at 0.1C is just a rule of thumb: 0.1C simply means one tenth of the rated capacity expressed as a current, so for a 2000 mAh cell it is about 200 mA. Whether you stop charging at 100 mA, 10 mA, or 1 mA doesn't really affect the battery at all, but it might take a really long time for a large battery to get down to 1 mA. So chargers tend to stop charging at a relatively high current to avoid wasting time. It's a law of diminishing returns anyway: there's no big difference in available charge between a battery that is 99% full and one that is 99.9% full. All three rules come together in the sketch below.
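To make the three rules concrete, here is a minimal sketch in C of the decision logic a CC/CV charger follows. The 2000 mAh capacity, the 1C charge rate, the 4.2 V limit, and the toy battery model in main() are all assumed example values for illustration, not taken from any particular charger:

```c
#include <stdbool.h>
#include <stdio.h>

/* Assumed example values for a hypothetical 2000 mAh (2 Ah) cell. */
#define CAPACITY_AH 2.0f
#define I_MAX  (1.0f * CAPACITY_AH)  /* 1C constant-current limit          */
#define V_MAX  4.2f                  /* constant-voltage setpoint          */
#define I_TERM (0.1f * CAPACITY_AH)  /* 0.1C termination threshold         */

/* One decision step of a CC/CV charger: given the measured cell voltage
 * and charge current, return the current to command next, or 0 to stop. */
float charge_step(float v_meas, float i_meas)
{
    if (v_meas < V_MAX)
        return I_MAX;        /* CC phase: full current, capped at Imax     */
    if (i_meas <= I_TERM)
        return 0.0f;         /* CV current has fallen to 0.1C: terminate  */
    return i_meas * 0.9f;    /* CV phase: crude back-off to hold Vmax     */
}

/* Toy simulation just to exercise the logic; not a real battery model. */
int main(void)
{
    float v = 3.0f, i = I_MAX;
    int step = 0;
    while ((i = charge_step(v, i)) > 0.0f) {
        if (v < V_MAX)
            v += 0.02f;      /* pretend the voltage rises during CC       */
        step++;
        printf("step %3d  V=%.2f  I=%.3f A\n", step, v, i);
    }
    printf("Charge terminated at %.0f mA (0.1C)\n", I_TERM * 1000.0f);
    return 0;
}
```

Note that I_TERM is derived directly from CAPACITY_AH, which is the point raised in the question: the charger has to be designed around a known capacity, for the 0.1C stop point just as much as for Imax.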