Author Topic: Linear lithium-ion charger  (Read 1764 times)

Offline derGoldstein (Topic starter)

  • Regular Contributor
  • *
  • Posts: 149
  • Country: il
    • RapidFlux
Linear lithium-ion charger
« on: June 26, 2016, 09:46:35 am »
I've been working on the current-limiting part of a linear lithium-ion charger, which is very similar to the EEVblog constant-current load, for which there are countless threads on this forum already. I read through many of them and came up with the attached circuit.
I quickly wired it up on a breadboard, without the noise-limiting components (R5, R6, C1) or the protection diodes (to keep it a simple test), and it works as expected. I used an LM7812 for voltage regulation because the MOSFET I'm using isn't logic-level, and I'm going to be charging 4S packs and above anyway. D2 is there in case the charging source is itself a battery. I used the LM358 because it's the op-amp I'm most familiar with (which is to say, just a little more than completely unfamiliar with).

I have the following questions about the circuit:
1) Is there a reason not to use a very low-value sense resistor (0.1 Ω or lower) to keep the heat loss through it to a minimum?
2) Since the load is a "component" whose voltage changes, the heat output of the MOSFET will be greatest when charging begins and will decrease as it progresses. If I'm charging a 5S lithium-ion pack from ~15 V to 21 V, the MOSFET will start the charge dissipating (Vin - 15)*A watts of heat, and by the time it reaches the constant-voltage stage it will be dissipating (Vin - 21)*A watts. Is this correct? (Rough arithmetic is sketched after this list.)
3) The voltage drops from input to output so far are ~1 V at 200 mA, ~1.8 V at 500 mA, and ~3 V at 1 A. Is this more down to the characteristics of the circuit, the characteristics of the MOSFET, the relatively high value of the sense resistor, or just the fact that it's on a breadboard and the connections are crappy?
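
Here is the rough arithmetic behind question 2 as a quick sketch; the 22 V supply and 1 A current limit are placeholder numbers, not values from the schematic:

# Rough dissipation in the pass elements (MOSFET plus sense resistor) during the
# constant-current phase. Placeholder values: adjust V_IN and I_CHG to match the real setup.
V_IN = 22.0    # regulated supply feeding the charger, volts (assumed)
I_CHG = 1.0    # constant-current setpoint, amps (assumed)

for v_batt in (15.0, 18.0, 21.0):          # 5S pack voltage as it charges
    p_pass = (V_IN - v_batt) * I_CHG       # total heat shared by MOSFET and sense resistor
    print(f"V_batt = {v_batt:4.1f} V -> pass elements dissipate {p_pass:.1f} W")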

Any other suggestions are welcome. Thanks.
 

Offline Signal32

  • Frequent Contributor
  • **
  • Posts: 251
  • Country: us
Re: Linear lithium-ion charger
« Reply #1 on: June 26, 2016, 10:19:38 am »
1. The only reason I can think of is the op-amp's input offset voltage: at 200 mA you'll have 20 mV across a 0.1 Ω resistor, and the LM358's offset voltage is about 2 mV, so you can get a 10% error when setting the current. With a 0.5 Ω resistor the error is about 2%. (Worked numbers are sketched below.)
2. Well, it will share the load with the current-limiting resistor, so the MOSFET plus resistor will dissipate (Vin - 15)*A at the start of charging. Yes, at the constant-voltage stage it will dissipate (Vin - 21)*A.
3. "From input to output"? Do you mean from V+ to what the battery is getting? What's the input voltage, and how many cells?
 

Offline Audioguru

  • Super Contributor
  • ***
  • Posts: 1507
  • Country: ca
Re: Linear lithium-ion charger
« Reply #2 on: June 26, 2016, 03:58:50 pm »
Your circuit doesn't specify a value for V+.
It also has no constant-voltage mode, so the battery will keep overcharging until it explodes or catches fire. It also has no low-charging-current detection to disconnect the battery when the charge is complete.
A good lithium battery charger circuit measures the battery voltage first and, if it is low, tries charging with a very low current. If the voltage does not rise after a certain amount of time, it disconnects the damaged battery to prevent an explosion or fire. Your circuit will apply full current continuously to a damaged battery and cause a spectacular event (post its video).
A multi-cell lithium battery needs a "balanced charger" to prevent one or more of the series cells from overcharging: the voltage across each cell is monitored, and if any cell exceeds 4.20 V it is shunted until its voltage drops back to 4.20 V. A rough sketch of that kind of charge sequence is below.
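
A minimal sketch of that charge sequence, written as a single-cell control loop. Every threshold, current, and the timeout below is an assumption for illustration, and read_voltage()/read_current()/set_current_limit()/disconnect() are hypothetical stand-ins for real hardware:

import time

PRECHARGE_I = 0.05            # A, gentle current for a deeply discharged cell (assumed)
FAST_I      = 1.0             # A, constant-current setpoint (assumed)
V_PRECHARGE = 3.0             # V, below this the cell must be pre-qualified (assumed)
V_MAX       = 4.20            # V, constant-voltage limit per cell
TERM_I      = 0.05            # A, end-of-charge current threshold (assumed)
PRECHARGE_TIMEOUT = 30 * 60   # s, give up if the voltage never recovers (assumed)

def charge(read_voltage, read_current, set_current_limit, disconnect):
    """Pre-qualify a low cell, run constant current, let the voltage-limited
    supply take over for the CV taper, then terminate at low current."""
    if read_voltage() < V_PRECHARGE:
        set_current_limit(PRECHARGE_I)
        start = time.time()
        while read_voltage() < V_PRECHARGE:
            if time.time() - start > PRECHARGE_TIMEOUT:
                disconnect()              # damaged cell: never apply full current
                return "fault"
            time.sleep(1)
    set_current_limit(FAST_I)             # constant-current phase
    while not (read_voltage() >= V_MAX and read_current() < TERM_I):
        time.sleep(1)                     # CV taper happens in the voltage-limited supply
    disconnect()                          # charge complete
    return "done"

Per-cell balancing (shunting any cell that rises above 4.20 V) would be a separate loop running alongside this one; it is not shown here.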
 

Offline derGoldstein (Topic starter)

  • Regular Contributor
  • *
  • Posts: 149
  • Country: il
    • RapidFlux
Re: Linear lithium-ion charger
« Reply #3 on: June 26, 2016, 06:19:30 pm »
I omitted some details for the sake of brevity, maybe a bit too much.
I start with a voltage-regulated power supply feeding into V+. If my battery pack is a 5S (21 V max), I set the supply to 21 V plus the minimum voltage drop of this circuit, which is 0.7 V, so 21.7 V. As long as the battery pack is drawing more current than this circuit will allow, it's in constant-current mode (I usually limit it to 1 A, but larger packs can handle 3 A or more). When the battery voltage gets close to its maximum of 21 V, its current draw drops below the current limit of this circuit, and the charge finishes in constant-voltage mode.
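
A rough sketch of how that CC-to-CV handover falls out of the numbers; the 0.7 Ω series resistance is an assumed lump for the MOSFET, sense resistor, and wiring, not a measured value:

# Charge current versus pack voltage with a voltage-limited supply in front of the limiter.
V_SUPPLY = 21.7    # regulated supply = pack max (21 V) + assumed 0.7 V minimum drop
I_LIMIT  = 1.0     # constant-current setpoint, amps
R_SERIES = 0.7     # assumed total series resistance: MOSFET + sense resistor + wiring, ohms

for v_pack in (15.0, 18.0, 20.0, 21.0, 21.5):
    i_available = max(0.0, (V_SUPPLY - v_pack) / R_SERIES)  # what the headroom allows
    i_charge = min(I_LIMIT, i_available)                    # limiter clamps it at I_LIMIT
    mode = "CC" if i_available > I_LIMIT else "CV taper"
    print(f"pack {v_pack:4.1f} V: charge current ~{i_charge:.2f} A ({mode})")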

All of the battery packs I'll be charging with this circuit have PCMs with cell balancing, so I don't have to worry about individual cell voltages, just the overall maximum voltage and current.

When I talk about voltage drop I mean between pin 1 and pin 2 of JP1; that's my output. I use a constant-current load to test the circuit.

Maybe I should have left out the whole lithium-ion battery-pack detail and just said that I'm trying to build a constant-current power supply whose overall voltage drop I'd like to keep to a minimum.
I think my problem is the resistance of the load... I mean, if I were just to short the output (JP1), the "voltage drop" would be Vin minus the drop across the MOSFET and the sense resistor. If, instead, I hooked up a 10 Ω power resistor, the voltage drop across the different components would change.
Should I instead be measuring the voltage from the input to the output of the MOSFET to see whether it's fully on?
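
One way to sanity-check where the measured drop is going is to subtract out the expected drops; the sense-resistor value and Rds(on) below are pure assumptions (neither is given in the thread), so plug in the real numbers:

# Break the measured input-to-output drop into its likely contributors.
R_SENSE = 1.0      # ohms, assumed sense resistor value
RDS_ON  = 0.05     # ohms, assumed fully-enhanced MOSFET on-resistance

measured = {0.2: 1.0, 0.5: 1.8, 1.0: 3.0}   # charge current (A) -> measured total drop (V)

for i, v_total in measured.items():
    v_sense = i * R_SENSE                   # expected drop across the sense resistor
    v_fet   = i * RDS_ON                    # expected drop if the MOSFET is fully on
    v_rest  = v_total - v_sense - v_fet     # leftover: breadboard contacts, wiring,
                                            # or a MOSFET that isn't fully enhanced
    print(f"{i:.1f} A: sense {v_sense:.2f} V, FET ~{v_fet:.2f} V, unexplained {v_rest:.2f} V")

Measuring directly across the MOSFET (drain to source) under load answers the "fully on" question: a fully enhanced FET should only drop I times Rds(on), i.e. tens of millivolts at these currents, and anything much larger points at the gate drive or the breadboard contacts.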
 

