Many thanks for the reply. I had thought I was on the right track with running cells in parallel, but it's good to have a second opinion.
Make it a 3rd opinion. We build 5-15 kWh Li packs for solar car racing (Aurora Solar), and usually put 5-7 cells in parallel. Our next-generation
pack will have ~12 in parallel. The benefits are that neighbours can "condition" weaker cells, and IF a cell goes high(er) impedance, you have
a LOT of redundancy!! IOW, you don't SUDDENLY lose all power and find yourself pretty much screwed :-)
What I wasn't clear on before is how the current is gradually reduced when it comes to the constant voltage phase: does the charging IC apply a gradually reducing current limit, or does the battery gradually draw less current while the charging IC just supplies as much as is needed?
This is where it gets VERY MESSY, unless you design your own Li charger system. Manufacturer specs usually state that if you want the longest life / best
performance, you MUST limit the CV current to C/10 or C/20 (i.e. 1/20th of the C rate). You can definitely go higher, but cell life will diminish.
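For concreteness, "C/10" and "C/20" just scale off the capacity. A quick sketch with a hypothetical pack (the 5 kWh figure and 3.6 V nominal voltage below are illustrative assumptions, not from any specific datasheet):

```python
# Rough C-rate arithmetic for a hypothetical pack.
# Illustrative numbers only -- not from any specific datasheet.
pack_energy_wh = 5000.0   # assumed 5 kWh pack
nominal_v = 3.6           # assumed nominal cell voltage
capacity_ah = pack_energy_wh / nominal_v  # Ah equivalent at cell voltage

c_rate_1c = capacity_ah           # "1C" current in amps
cv_limit_c10 = capacity_ah / 10   # gentler CV-phase current limit
cv_limit_c20 = capacity_ah / 20   # gentlest, for maximum cycle life

print(f"1C   = {c_rate_1c:.0f} A")
print(f"C/10 = {cv_limit_c10:.0f} A")
print(f"C/20 = {cv_limit_c20:.1f} A")
```

Same arithmetic applies per parallel group: a group of 7 cells has 7x the single-cell capacity, so its C/10 current is 7x higher too.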
In racing we don't give a rat's, and if we only get 10-20 cycles of life, FINE! But back to you -
Once the IC has reached the CV changeover, and as LONG AS your power supply can deliver FULL load AND "trickle" currents, the ACTUAL current going into
the cells will be "around" the correct level. In fact it will likely be a bit higher. You need to examine the manufacturer's charge curves carefully, find
the CC/CV changeover point (let's assume it's 4.10V), then work out how far over their recommended CV current (and for how long) you end up.
Basically, it's NOT ideal, but unless you want 100% of cycle life, it won't be too bad.
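One rough way to estimate how far over you land: model the cell as its open-circuit voltage behind an effective series resistance, so the current just after changeover is (CV setpoint - OCV) / R. All numbers below are invented for illustration; the real values come off the manufacturer's charge curves:

```python
# Sketch: first-order estimate of charge current just after CC/CV changeover,
# modelling the cell as OCV + effective series resistance.
# All values are illustrative assumptions, not datasheet figures.
cv_setpoint = 4.20        # charger's CV setpoint (V)
ocv_at_changeover = 4.10  # open-circuit voltage when CV starts (V)
r_eff = 0.050             # assumed effective series resistance (ohms)
capacity_ah = 2.5         # assumed single-cell capacity (Ah)

i_cv_start = (cv_setpoint - ocv_at_changeover) / r_eff  # amps
print(f"Current at CV start ~ {i_cv_start:.1f} A")

# Compare against the recommended C/10 limit:
c10_limit = capacity_ah / 10
print(f"C/10 limit = {c10_limit:.2f} A -> over by {i_cv_start / c10_limit:.0f}x")
```

With these made-up numbers the initial CV current lands well above C/10, which is exactly the "a bit higher than ideal" situation described above; it then tapers as the OCV rises.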
I have JUST finished a design (3rd gen) where I split the DC input into 2 paths - one SOLELY for the battery charger, so it's mostly ONLY on standby,
and the 2nd path goes to the DC output. A P-channel FET or PNP BJT then switches the battery in instantly in the event of a power fail. (Some people even use a tiny relay.)
If you go that way, you can also include a battery load-test function that puts on, say, a 1/10 load for 5 mins, once a day.
A little more work, but it's the "ideal" method.
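The load-test function could be as simple as firmware that switches in a resistive load once a day and checks the voltage sag. A minimal sketch of the decision logic, where all names, thresholds and durations are invented for illustration:

```python
# Sketch of daily battery load-test scheduling and pass/fail logic.
# All thresholds and durations below are invented assumptions.
SECONDS_PER_DAY = 86_400
TEST_DURATION_S = 5 * 60   # apply the ~1/10 load for 5 minutes
MAX_SAG_V = 0.15           # fail if the pack sags more than this under load

def should_start_test(now_s: int, last_test_s: int) -> bool:
    """Start a load test if at least a day has elapsed since the last one."""
    return now_s - last_test_s >= SECONDS_PER_DAY

def evaluate_test(v_before: float, v_under_load: float) -> bool:
    """Pass if the voltage sag under the test load stays within limits."""
    return (v_before - v_under_load) <= MAX_SAG_V

# Example: 4.05 V resting, 3.95 V under load -> 0.10 V sag, within limits.
print(should_start_test(now_s=200_000, last_test_s=100_000))
print(evaluate_test(4.05, 3.95))
```

A rising sag trend over successive daily tests is the early warning that a cell group is going high impedance, which is the whole point of the exercise.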
During the constant current stage, the charging IC just lets the battery draw as much current as it needs - up to the programmed charging current - whilst the battery voltage gradually rises. Once the constant voltage phase kicks in, the battery gradually draws less current and the charging IC just supplies as much as is needed (which gradually reduces over time as the resting voltage of the battery approaches 4.2V).
Therefore, if that is the case, there should not be a problem - provided I can find a charging IC that can supply a current greater than the maximum current draw of the load. The more the load draws, the less is available to charge the battery - the net result being that the charging time is simply extended.
Yup, pretty close. The charger IC "should" ALSO limit the current to C/10 or C/20 in CV mode, IF you want full cycle life.
That's where you need to make a decision !!
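The CC/CV behaviour discussed above (constant current until the voltage clamp, then a naturally tapering current with a termination threshold) can be sketched with a toy first-order model. Every parameter here is invented; a real design needs the datasheet charge curves:

```python
# Toy CC/CV charge simulation: constant current until the terminal voltage
# hits the CV setpoint, then the current tapers as the cell fills.
# First-order model with invented parameters -- illustrative only.
CV_SETPOINT = 4.20   # volts
I_CC = 1.0           # programmed constant current (A)
R_EFF = 0.10         # assumed effective series resistance (ohms)
CAPACITY_AH = 2.5    # assumed cell capacity
DT_H = 0.01          # timestep in hours

def ocv(soc: float) -> float:
    """Crude linear open-circuit-voltage curve: 3.0 V empty, 4.2 V full."""
    return 3.0 + 1.2 * soc

soc, log = 0.5, []
for _ in range(20_000):
    i = I_CC
    if ocv(soc) + i * R_EFF > CV_SETPOINT:       # CV phase: clamp terminal V
        i = max((CV_SETPOINT - ocv(soc)) / R_EFF, 0.0)
    soc = min(soc + i * DT_H / CAPACITY_AH, 1.0)
    log.append(i)
    if i < I_CC / 20:                            # C/20-style termination
        break

print(f"final current {log[-1]:.3f} A after {len(log)} steps; SoC {soc:.3f}")
```

The taper falls out of the model with no explicit current ramp: the IC only holds the voltage, and the current reduces by itself as the cell's resting voltage rises towards the setpoint - which is the answer to the original question.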