Author Topic: How does charging lithium (LiFePO4) batteries work in the face of changing current?  (Read 5383 times)


Offline NandGate (Topic starter)

  • Newbie
  • Posts: 6
  • Country: us
Lithium batteries are generally charged with constant current and then constant voltage.  But how do you charge lithium batteries when your source of power is unstable, often dropping below the power needed to source the desired current-voltage combination?  Do you prioritize voltage and regulate current based on how much input power you have available?


Context & Details:

I'm planning to do a solar installation at some point in the future and wanted to understand the mechanics/possibly design some of the circuitry myself.  It'll start as a simple 100W installation with 500Wh of battery to power lighting in my garden, but I'll grow it slowly to take the house off the grid (~7000W of panels, 20kWh of battery).  LiFePO4 (and its minor variants) seems to be the best chemistry for the job today, so I'll use that, but it seems lithium batteries all behave more-or-less the same.

I understand how to charge lithium batteries: CC-CV.  But I can't wrap my brain around what happens when your input power suddenly drops, nor can I find any good reference material to cover this topic.  So, for example, let's say you're charging at 1 amp max, 4V max, for simplicity.  You eventually get the battery to 1A@4V (4W), switch to constant voltage, and current begins its descent.  But suddenly the solar panel becomes shaded, and now you can only supply 2W.  What current/voltage do you switch to at that moment?

I suppose my issue is that I don't know how to model the battery.  Does it behave like a simple resistive load?  If that's the case then it seems it would start at a very low resistance and monotonically increase resistance as charging progresses.  At 1A@4V it would be a 4 Ω load, so if suddenly all I had was 2W then I'd probably want to supply 0.5A@4V in constant voltage mode.  Trying to supply 1A@2V would mean having to slowly drop voltage as the battery's resistance continues to increase, and that doesn't seem right.  But suddenly dropping the charging current doesn't seem right either...


Thank you!
 

Offline amyk

  • Super Contributor
  • ***
  • Posts: 8240
It's better to think of charging as current-limited, voltage-limited. The current and voltage don't have to be constant; holding them at the limits is just the fastest way to charge. For lithium phosphate, the maximum charge voltage is 3.6V/cell.

A lithium cell is not like a resistor. It's more like a capacitor which must be kept within a certain voltage range and cannot be discharged or charged faster than a certain current. Initially, when the cell voltage is low, the current is kept within limits, and once the cell voltage rises to the limit, the voltage limit is observed.

Quote
But suddenly the solar panel becomes shaded, and now you can only supply 2W.  What current/voltage do you switch to at that moment?
Just focus on limiting the current and voltage to below the cell's ratings, while keeping the voltage output high enough to drive current through the cell. If the voltage has already reached the maximum, then the current will naturally decrease.
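A minimal sketch of that logic in Python, assuming a programmable converter with hypothetical set_voltage_limit()/set_current_limit() hooks and some way to estimate how much power the panel can deliver right now (none of these names refer to a real part, they are placeholders):

Code:
I_MAX = 1.0   # cell's maximum charge current, A (from the datasheet)
V_MAX = 3.6   # LiFePO4 maximum charge voltage per cell, V

def charge_step(read_cell_voltage, read_available_power,
                set_voltage_limit, set_current_limit):
    """One pass of the control loop; call it periodically."""
    v_cell = read_cell_voltage()      # V, at the battery terminals
    p_avail = read_available_power()  # W the panel can deliver right now

    # Never exceed the cell's rating, and never ask for more current
    # than the available input power can support at the present voltage.
    i_limit = min(I_MAX, p_avail / max(v_cell, 0.1))

    set_voltage_limit(V_MAX)    # the converter clamps here -> CV behaviour
    set_current_limit(i_limit)  # the converter clamps here -> CC behaviour
    # The cell decides which limit is active: below V_MAX the current
    # limit dominates (CC); once V_MAX is reached the current tapers (CV).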
 

Offline CraigHB

  • Regular Contributor
  • *
  • Posts: 227
  • Country: us
You have to be careful with solar cells and reverse currents.  A panel behaves much like a diode: above a voltage threshold, current flows limited only by the device's small internal resistance.  If you want to charge a battery from a solar panel you need an inline diode, or a device that performs the same task, so the battery can't push current back through the panel when the panel's output voltage drops well below the battery voltage.

It's not a problem if supply voltages are unstable, but you do need to observe voltage and current limits for your battery.  A maximum of 3.6V per cell was already mentioned.  LiFe batteries are typically limited to a 2C charge rate for those with higher drain limits, or 1C for those with a 2C drain limit.  That means battery charge current must be at or below one to two times the cell's amp-hour capacity.  You're going to need a charger controller in there to provide those limits.  Charger controllers can usually deal with varying supply voltages so long as you don't exceed the controller's input voltage tolerance.  A charger controller can also prevent reverse currents.
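Putting numbers on that C-rate arithmetic (the values below are illustrative only; use your own cell's datasheet figures):

Code:
capacity_ah = 100   # e.g. a 100 Ah LiFePO4 cell (illustrative)
max_charge_c = 1    # datasheet charge-rate limit, in "C"

max_charge_current = max_charge_c * capacity_ah  # = 100 A
print(f"Keep charge current at or below {max_charge_current} A")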
 

Offline NandGate (Topic starter)

  • Newbie
  • Posts: 6
  • Country: us
Thank you so much for the replies!

I played with charging and discharging a lithium cell I had lying around, to get an intuitive feel for how it works.  After doing that, what amyk said made a lot more sense.  For whatever reason it didn't click in my head that the battery has a voltage of its own, and that the current flow is dictated by the difference between the power supply voltage and the battery voltage.  Everything makes a lot more sense now.  I suppose my confusion came from seeing all the charge graphs, which actually show the charge voltage, not the battery voltage.  In fact, I don't think I ever found a charge graph that graphs the battery voltage...  Trap for young players I suppose.

Quote
Lithium based rechargeable batteries can be damaged (over time) by slow charging currents.

Out of curiosity, do you have a reference that goes into more detail?  I tried googling for "trickle charge dendrite" and variations but couldn't find anything that said that dendrite growth was related to slow charge.  One website actually discusses slow charging and doesn't mention any ill effects (http://www.powerstream.com/li.htm), but that talks about 0.18C.  I imagine charging under 0.05C or 0.01C could be bad, because it makes cut-off more difficult, but that's really just about overcharging and causing plating.


New questions:

*  With respect to battery balancing, I found this post which argues for bottom balancing instead of top balancing: http://www.myelifenow.com/2012/10/lifepo4-charging-method-dont-ruin-your.html  He suggests bottom balancing each cell to 2.75V, and then setting the charge/discharge profile to 3.0V min, 3.5V max.  No active balancing circuit needed.  He also argues that top balancing is bad (at least for LiFePO4s).  Any thoughts on that?  I haven't found any technical documentation that suggests one way or the other.

*  Are there advantages/disadvantages to an MCU controlled buck type lithium charger, versus something more analog?

*  Most systems I've seen put the cells into packs, and then hook all the packs up, and charge the whole kit using one charge controller.  Is there any reason why we can't have one tiny charge controller per cell instead?  I vaguely imagine a system where you use an MPPT buck and/or boost to generate a constant voltage rail from the solar panels (say 48V).  Then each cell has an MCU controlled buck that drops that rail to charge its cell, combined with a boost to supply voltage back to the rail when in discharge mode.  Then the rail can be hooked up to the mains inverter.  Possibly a master controller to switch the cells between charge and discharge based on system load and limit their charge based on system supply.  To me that makes it so each cell is individually balanced, and cells can be added and removed from the system very easily (for replacement, or just growing the system slowly over time).  Also seems like you could take better advantage of economies of scale by buying lots of small components, rather than a few very large components.
 

Offline tatus1969

  • Super Contributor
  • ***
  • Posts: 1273
  • Country: de
  • Resistance is futile - We Are The Watt.
    • keenlab
You may want to dig into MPPT (maximum power point tracking); that is the strategy for getting the most out of your solar panel while taking into account its momentary power delivery capacity and the battery's charging restrictions.
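For reference, the simplest MPPT strategy is perturb-and-observe: nudge the converter's operating point, see whether panel power went up or down, and keep or reverse the direction accordingly. A rough Python sketch (the read/set functions are hypothetical hooks into whatever converter you build; the battery-side current and voltage limits still have to be enforced on top of this):

Code:
def make_po_tracker(set_duty, read_panel_voltage, read_panel_current,
                    step=0.01):
    """Perturb-and-observe MPPT; returns a function to call periodically."""
    state = {"duty": 0.5, "last_power": 0.0, "direction": +1}

    def track_once():
        power = read_panel_voltage() * read_panel_current()
        # If the last perturbation lost power, reverse direction.
        if power < state["last_power"]:
            state["direction"] *= -1
        state["duty"] = min(max(state["duty"] + state["direction"] * step,
                                0.0), 1.0)
        state["last_power"] = power
        set_duty(state["duty"])

    return track_once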
We Are The Watt - Resistance Is Futile!
 

Offline CraigHB

  • Regular Contributor
  • *
  • Posts: 227
  • Country: us
Quote
In fact, I don't think I ever found a charge graph that graphs the battery voltage...

A lot of times you can find discharge curves for a particular battery at various current levels like 1/2C, 1C, 2C, etc.  You don't usually see plots of no-load voltage versus state of charge, but you'll typically see a 1/5C curve, which is about as close to open circuit as you'll find plotted.  That can give you some idea of state of charge versus open-circuit voltage.  One thing about LiFe batteries, though, is that they have a flatter curve than other types of Li-ion, which makes things more obscure since the voltage changes less over state of charge.
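If you ever want a rough state-of-charge figure from resting voltage, the usual trick is a small lookup table with interpolation. A sketch with ballpark LiFePO4 numbers (not taken from any particular datasheet); the flat plateau is exactly why the estimate is coarse:

Code:
# Rough open-circuit-voltage -> state-of-charge lookup for a LiFePO4 cell.
# Table values are approximate/illustrative; the flat plateau means small
# voltage errors translate into large SoC errors.

OCV_TABLE = [  # (open-circuit volts, state of charge %)
    (2.80,   0), (3.00,   5), (3.20,  20), (3.25,  40),
    (3.30,  60), (3.33,  80), (3.35,  90), (3.45, 100),
]

def soc_from_ocv(v):
    if v <= OCV_TABLE[0][0]:
        return 0
    if v >= OCV_TABLE[-1][0]:
        return 100
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)

print(soc_from_ocv(3.31))  # lands somewhere around 60-80%; note how coarse it is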

Quote
He suggests bottom balancing each cell to 2.75V, and then setting the charge/discharge profile to 3.0V min, 3.5V max.  No active balancing circuit needed.  He also argues that top balancing is bad (at least for LiFePO4s).  Any thoughts on that?  I haven't found any technical documentation that suggests one way or the other.

I'm not sure exactly what you mean by top down or bottom up balancing.  In any case there are a few different methods used in chargers for balancing cells.  The best one is to use a virtual ground at every cell junction with a charger controller dedicated to each cell in the pack.  That allows each cell to charge individually, independent of the other cells.  Other methods are employed, such as a two-wire pack charge through the constant-current (CC) phase with a charge/discharge balance algorithm operating in the final constant-voltage (CV) phase of the charge cycle.
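For the two-wire pack approach, the balancing that runs in the CV phase usually amounts to switching a bleed resistor across whichever cells have run ahead of the others. A simplified sketch (the cell-reading and bleed-switch functions are hypothetical BMS hooks, and the threshold is an arbitrary example):

Code:
BALANCE_THRESHOLD = 0.02  # start bleeding cells more than 20 mV above the lowest

def balance_step(read_cell_voltages, set_bleed):
    """One pass of a passive (bleed-resistor) balance decision."""
    voltages = read_cell_voltages()   # one reading per series cell
    lowest = min(voltages)
    for idx, v in enumerate(voltages):
        # Bleed charge off any cell that has run ahead of the weakest one.
        set_bleed(idx, v - lowest > BALANCE_THRESHOLD)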

Quote
Are there advantages/disadvantages to an MCU controlled buck type lithium charger, versus something more analog?

It's not uncommon to see an MCU based charger solution for Li-Ion batteries.  Hobby chargers for high drain LiPos are like that.  Otherwise you can get charger controller chips for single cells which perform that function at the hardware level.  In the case of multiple series cells, you can dedicate one to each cell as mentioned already.  That requires the smallest footprint and provides the simplest solution.

Quote
Most systems I've seen put the cells into packs, and then hook all the packs up, and charge the whole kit using one charge controller.  Is there any reason why we can't have one tiny charge controller per cell instead?

As already discussed there are chargers that employ this method and there's no reason you can't design one yourself.  Other chargers can use different methods to balance packs.  Typically those require an MCU based solution.

Chargers often use a buck converter instead of a linear regulation method.  It improves efficiency and greatly reduces heating, allowing much higher charge rates.  Hobby chargers always use a converter like that to regulate voltages.  For lower charge rates, typically under an amp, a linearly regulated chip is a small and simple solution.  The power it has to dissipate is (input voltage minus battery voltage) times the charge current.  Once a linear charger controller gets above a rate of about an amp it usually hits thermal limiting and starts dropping the rate to keep die temps within tolerance.  So if you want higher rates you'll need a charger that uses a converter.
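Worked example of that dissipation figure (numbers picked only for illustration):

Code:
v_in = 5.0      # V, supply into the linear charger
v_batt = 3.3    # V, cell voltage mid-charge
i_charge = 1.0  # A, charge current

p_dissipated = (v_in - v_batt) * i_charge
print(f"{p_dissipated:.1f} W burned in the pass element")  # 1.7 W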
 

Offline NandGate (Topic starter)

  • Newbie
  • Posts: 6
  • Country: us
Thank you for the input, Craig!

Quote
A lot of times you can find discharge curves for a particular battery under various current levels like 1/2C, 1C, 2C, etc.

Yeah, I've seen discharge curves; it just didn't translate in my head that the battery's voltage would be doing something similar during charging.  Basically my point was that if I had seen a charging curve showing not only the power supply's voltage but also the battery's voltage, it would have made it a lot more intuitive that the battery behaves like a capacitor.


Quote
I'm not sure exactly what you mean by top down or bottom up balancing.
The video buried in the post that I linked to had a really good visual demonstration.  I can't say whether everything he describes is correct, of course, but here's the video link:  http://media.ev-tv.me/news111309-1280%20-%20iPhone.m4v  (at ~38:30 in the video).

Regardless, I'm still not sure if I buy his argument for bottom balancing.  I guess the issue that guy was running into was the cells would be top balanced, he'd drive his car until it almost died, and then discover a cell or two had been discharged too far.  That makes sense, given the disparate capacity of cells in the pack and the nature of top balancing.  So he decided to bottom balance and cut power at 3.0V during discharge.  Sure, fine, that would presumably avoid any chance of over discharging, but ... shouldn't he be using a BMS on each cell to cut power when any go undervolt?  Maybe the argument is just that bottom balancing and locking voltage to [3.0, 3.5] avoids the cost and complexity of BMS and active balancing, but I wouldn't really feel safe without a BMS anyway.


Quote
Otherwise you can get charger controller chips for single cells which perform that function at the hardware level.
Are there chips that can handle a variable amount of input power?  I need some way to prioritize the Load (the inverter powering the garden/house/whatever) so that those demands are met first, and any remaining solar power gets dumped into the batteries.  So that means the charge controllers will receive a variable amount of power.
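To be clearer about the priority scheme I have in mind, a bare sketch (all of the read/set functions below are made-up placeholders, not real chip features):

Code:
def allocate_power(read_solar_power, read_load_power, set_charge_power):
    """Load first, surplus to the battery; call periodically."""
    p_solar = read_solar_power()   # W currently available from the panels
    p_load = read_load_power()     # W the inverter/load is drawing
    surplus = max(p_solar - p_load, 0.0)
    # Whatever the load doesn't need becomes the charger's input budget;
    # the charger must still respect the battery's own current/voltage limits.
    set_charge_power(surplus)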
 

Offline amyk

  • Super Contributor
  • ***
  • Posts: 8240
Dave made a recent video about charging Li-ion cells here, where you can see the voltage and current graph:



He didn't use a phosphate cell, but the charge/discharge requirements are essentially the same, with different voltages.

Quote
Out of curiosity, do you have a reference that goes into more detail?  I tried googling for "trickle charge dendrite" and variations but couldn't find anything that said that dendrite growth was related to slow charge.  One website actually discusses slow charging and doesn't mention any ill effects (http://www.powerstream.com/li.htm), but that talks about 0.18C.  I imagine charging under 0.05C or 0.01C could be bad, because it makes cut-off more difficult, but that's really just about overcharging and causing plating.
Li-ion cells are perfectly fine with being charged (or discharged) at anywhere from 0 to their rated current; I suspect a lot of the worry about trickle-charging comes from confusion with other chemistries like NiMH, which are "current-charged" and where the cell voltage during charging can be significantly higher than the resting voltage. On the other hand, a Li-ion cell is fine sitting at its termination voltage (3.6V for phosphate, 4.2V for cobalt/manganese) indefinitely, although there is some degradation in the long term. There's plenty of misinformation out there. Here's a good thread to read:

https://www.eevblog.com/forum/projects/diagnosing-lithium-cells/
 

Offline mtdoc

  • Super Contributor
  • ***
  • Posts: 3575
  • Country: us

Quote
*  With respect to battery balancing, I found this post which argues for bottom balancing instead of top balancing: http://www.myelifenow.com/2012/10/lifepo4-charging-method-dont-ruin-your.html  He suggests bottom balancing each cell to 2.75V, and then setting the charge/discharge profile to 3.0V min, 3.5V max.  No active balancing circuit needed.  He also argues that top balancing is bad (at least for LiFePO4s).  Any thoughts on that?  I haven't found any technical documentation that suggests one way or the other.

Interesting link. In fact there are several experienced solar PV system designers/operators who are now advocating bottom balancing of LiFePO4 battery banks with no BMS.

Have a look around the Midnite Solar forum or the NAWS forum. Many knowledgeable users there - some EEs. PNjunction and Zoneblue are two that come to mind who are bottom balancing LiFePO4 banks, I believe.

I'm going to be designing and installing a second large solar PV system over the next 2 years and plan to use a LiFePO4 bank.
 

