EEVblog Electronics Community Forum
Electronics => Projects, Designs, and Technical Stuff => Topic started by: delphes123 on December 08, 2023, 02:22:03 pm
-
Hello
I have a 4S LiPo battery, 14.8 V with 5.2 Ah capacity. I need to produce regulated 12 V at the maximum power I can. My idea was to use a synchronous buck-boost controller like the LT3790 (https://www.analog.com/en/products/lt3790.html); I wonder if there isn't a simpler solution?
Thanks in advance
-
What's the lower limit of the battery voltage?
-
I wonder if a buck converter wouldn't work as well. By the time the batteries are down to 3.5V, they are already mostly discharged. And that would still give you 2V headroom over the 12V.
Another possibility would be to rearrange the batteries into 2S2P, and use a boost converter. But that would complicate charging.
Would even a linear regulator be a possible option? How much current will you need?
Edit: This thread is relevant:
https://www.eevblog.com/forum/beginners/3-or-4-18650-batteries-to-get-precise-12v/
-
I wonder if a buck converter wouldn't work as well. By the time the batteries are down to 3.5V, they are already mostly discharged.
Not even close; this myth lives strong on forums. But the OP specifically mentions "maximum power I can"; whatever that actually means, it suggests significant voltage sag on the battery cells. If the OP truly means what they write, i.e. a battery current close to the maximum rating, then to extract the full capacity one needs to go down to the specified discharge cutoff voltage, typically 2.0 to 2.5 V on high-discharge-current cells. And 4 × 2.5 V is just 10 V; a buck is not going to make it.
Only at very low discharge currents (< C/10 or so) can you assume 0% = 3.3 V, and maybe 30% = 3.5 V.
My personal choice would be to go to a 5S or 6S battery and then use a synchronous buck that can get as close to 100% duty cycle as possible. But if 4S is a fixed requirement and "maximum power" means something like 1C out of the cells, if not more, then a buck-boost, SEPIC or similar is pretty much mandatory.
-
Hello
I have a 4S LiPo battery, 14.8 V with 5.2 Ah capacity. I need to produce regulated 12 V at the maximum power I can. My idea was to use a synchronous buck-boost controller like the LT3790 (https://www.analog.com/en/products/lt3790.html); I wonder if there isn't a simpler solution?
Thanks in advance
Hello,
If you have a 4-cell battery then the top voltage is 16.8 V and the lower limit is about 12.8 V, which gives you 0.8 V of overhead when it gets low.
What this means is you need a regulator that can work with a 0.8 volt difference between input and output.
A run-of-the-mill linear regulator like the LM317 or similar would not be good, because its overhead requirement is about 1.2 volts or so. You would need a low-dropout linear regulator, or a buck regulator that can work with that 0.8 volts of overhead.
The buck circuit will usually be more efficient than a linear, but it takes some calculation to decide whether you can get away with a linear without losing too much energy. A buck circuit can typically get you a longer run time, but that's not guaranteed; which one actually comes out ahead depends on the input and output voltage ranges.
That's of course if you plan on using it until the battery gets as low as possible. If you can put up with a higher final terminal voltage, the options get less rigid.
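To make that trade-off concrete, here is a quick back-of-the-envelope comparison. The 90% buck efficiency is an assumed typical value, not from any datasheet; the linear efficiency is simply Vout/Vin, since the pass element drops the difference at the full output current:

```python
# Napkin comparison of linear vs. buck efficiency for a 4S pack -> 12 V out.
# The 90% buck efficiency is an assumed typical value; linear efficiency is
# exactly Vout/Vin because the pass element burns the voltage difference
# at the full output current.
V_OUT = 12.0
BUCK_EFF = 0.90  # assumed, not from any datasheet

for v_in in (16.8, 14.8, 12.8):  # full, nominal, near-empty 4S pack
    linear_eff = V_OUT / v_in
    print(f"Vin = {v_in:4.1f} V   linear: {linear_eff:5.1%}   buck: {BUCK_EFF:.0%}")
```

At a full pack the linear burns almost 30% of the input energy as heat, while near the 12.8 V cutoff it actually edges ahead of the assumed buck efficiency, which is exactly why it "takes some calculations" to pick a winner.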
-
I need to give you more details on the expression "maximum power I can": if the battery can handle 5 A of current, the 12 V output needs to handle something like 4 A. Concerning the minimum and maximum voltage, it can go from 12.8 V at low battery to 16.8 V at full charge.
-
IMHO, a linear regulator is almost always a bad idea with batteries because you waste so much energy/capacity.
-
I would just use a buck converter.
At 3 V/cell you are pretty much tapped out in capacity anyway; you would only gain maybe 5-10% capacity going lower than that, and the reduced efficiency of a SEPIC/buck-boost would probably eat that up in added losses.
A good buck converter should be able to get close to input = output; that just means it's running the FET at 100% duty.
-
Thanks for the answers. I finally found a very good candidate for my application: https://www.ti.com/product/TPS65265
-
OK so we can calculate instead of "it's OK, it's OK" handwaving.
Assuming TPS65265
* High-side switch resistance: 39 mOhm typical, no maximum given; let's assume 50 mOhm.
* The datasheet does not specify a maximum duty cycle; it appears to be limited only by the bootstrap capacitor charging time. Since the datasheet is not helpful here, let's assume a maximum duty cycle of 98%.
At V_out=12V, 98% duty cycle, 4A I_out, Vin must be at least (12V+4A*0.050Ohm) / 0.98 = 12.45V.
12.45V/4 cells = 3.11V/cell
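The napkin calculation above can be scripted so the assumptions are easy to change; the 50 mOhm and 98% figures are carried over from the assumptions above, not datasheet maxima:

```python
# Reproduce the minimum-Vin napkin calculation for a buck at maximum duty.
# The 50 mOhm Rds(on) and 98% duty cycle are assumed values, not datasheet
# maxima.
V_OUT = 12.0     # regulated output, V
I_OUT = 4.0      # output current, A
R_DS_ON = 0.050  # assumed worst-case high-side switch resistance, Ohm
D_MAX = 0.98     # assumed maximum duty cycle

v_in_min = (V_OUT + I_OUT * R_DS_ON) / D_MAX
print(f"Vin(min) = {v_in_min:.2f} V  ->  {v_in_min / 4:.2f} V/cell")
# -> Vin(min) = 12.45 V  ->  3.11 V/cell
```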
"Battery can handle 5 A" vs. an actual input current of about 4 A: the battery would run at 80% of its rated maximum current. Now, without knowing the exact battery, let's look at some random li-ion cell data. Let's pick the good old Samsung 29E from lygte-info; this cell represents quite typical COTS NCA chemistry from a decade ago.
https://lygte-info.dk/review/batteries2012/Samsung%20INR18650-29E%202900mAh%20(Blue)%20UK.html
The maximum rated current is 8.25 A, of which 80% would be 6.6 A, so let's look at the closest discharge curve: the 7 A curve (blue) of the worse of the two specimens.
(https://lygte-info.dk/pic/Batteries2012/Samsung%20INR18650-29E%202900mAh%20(Blue)/Samsung%20INR18650-29E%202900mAh%20(Blue)-Energy.png)
At 3.11 V/cell, the delivered energy is 7 Wh. Alternatively, at the manufacturer-specified cutoff voltage of 2.8 V, the energy is 8.8 Wh. The available energy at a 3.1 V cutoff is therefore 7/8.8 = 79.5%.
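The same ratio as a one-liner; the Wh figures are read off the lygte-info 7 A discharge curve, as quoted above:

```python
# Fraction of the rated-cutoff energy still available with a 3.11 V/cell
# cutoff. Both Wh figures are read off the lygte-info 7 A discharge curve
# for the Samsung 29E, as quoted in the post above.
E_AT_BUCK_CUTOFF = 7.0   # Wh delivered down to 3.11 V/cell
E_AT_RATED_CUTOFF = 8.8  # Wh delivered down to the rated 2.8 V cutoff

fraction = E_AT_BUCK_CUTOFF / E_AT_RATED_CUTOFF
print(f"available energy: {fraction:.1%}")  # -> available energy: 79.5%
```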
So once we do an actual calculation, even a napkin one, with fair assumptions, we instantly see how "no capacity loss" or "just 5-10%" falls apart. I admit there are quite a few assumptions here; maybe the OP does not mean the absolute maximum cell current but some more conservative rating.
On the other hand, if the OP has any intention of running this at cold ambient temperatures (it doesn't need to be Siberia-tier weather; something like +5°C does the trick already), the voltage sag of the cells explodes. The same happens as the cells age: end-of-life is usually defined as an ESR rise of +100%, meaning the voltage sag doubles. In some totally realistic conditions, a 3.1 V cutoff therefore means losing more than half of the available energy. You don't want to make such a compromise willy-nilly; make an educated choice instead.
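The ESR effect is easy to illustrate numerically. The ESR values below are made-up round numbers for illustration, not measurements of any specific cell:

```python
# Illustration of how ESR growth (aging or cold) moves a fixed terminal-
# voltage cutoff up the state-of-charge curve. The ESR values are made-up
# round numbers, not measurements of any specific cell.
V_CUTOFF = 3.11        # terminal-voltage cutoff implied by the buck, V/cell
I_LOAD = 4.0           # discharge current through the series string, A
ESR_NEW = 0.050        # assumed fresh-cell ESR, Ohm
ESR_EOL = 2 * ESR_NEW  # end-of-life: ESR roughly doubled

for label, esr in (("new cell", ESR_NEW), ("end-of-life", ESR_EOL)):
    ocv = V_CUTOFF + I_LOAD * esr  # open-circuit voltage when cutoff trips
    print(f"{label:12s}: cutoff reached at OCV ~ {ocv:.2f} V")
```

With these assumed numbers, a cutoff that corresponds to ~3.31 V open-circuit on a fresh cell effectively becomes ~3.51 V on an aged one, stranding far more of the capacity.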
Really, I'll say it again: all assumptions about li-ion cells being "fully discharged" at cutoffs above 3 V (some say 3 V, some say 3.3 V, the most ridiculous claims say 3.5 V) are valid only in low-discharge-current applications. When the designer specifically says "maximum power" (and here it doesn't matter whether it's really the absolute maximum or some kind of recommended long-term maximum), such assumptions should fly out of the window that very second. More careful analysis is needed.
This is not limited to li-ion design; alkaline cells share the same tendency of increasing ESR loss near the end, or in the cold, so that the open-circuit voltage is meaningless. Gadgets that die too early and leave unused battery capacity exist, and people use stuff like the Batterizer or build their own Joule thief circuits to make them work. I suggest not making that mistake to begin with: design your thing (commercial or hobby) to run down to 0%, or maybe 10% if that is acceptable, over all intended operating conditions. Something you feel is 5% but could in reality be 50% is not acceptable.
-
A good buck converter should be able to get close to input = output; that just means it's running the FET at 100% duty.
"Goodness" is a poor metric here. A converter that uses an NFET instead of a PFET, because NFETs inherently have a better figure of merit (smaller Rds(on) for a given die size and parasitic capacitance), usually suffers from a less-than-100% maximum duty cycle even with the added complexity of bootstrap circuitry, while still being definitely "good". If one needs 100% duty, then a PFET-based buck IC can be used, but then again it may have a higher Rds(on). Some controllers have an even lower maximum duty cycle, for reasons other than just charging the bootstrap capacitor; read the datasheet and choose wisely.
-
(https://lygte-info.dk/pic/Batteries2012/Samsung%20INR18650-29E%202900mAh%20(Blue)/Samsung%20INR18650-29E%202900mAh%20(Blue)-Energy.png)
Strange discharge curve. Not wanting to start a fight, but it seems to me like you picked the strangest one you could find on Google Images to prove your argument.
The majority of discharge curves are nowhere close, or maybe the scaling simply makes it look weird.
Still, there's little left at 3.5-3.4 V. Also check the papers showing cell damage the deeper the discharge gets.
So for an extra 5-10%, the cell lifespan shortens a lot.
I wonder if a buck converter wouldn't work as well. By the time the batteries are down to 3.5V, they are already mostly discharged.
Not even close - this myth lives strong on forums
Maybe other lithium chemistries have good power under 3.5 V, but common li-ion discharge curves tell a different story:
(https://siliconlightworks.com/image/data/Info_Pages/Li-ion%20Discharge%20Voltage%20Curve%20Typical.jpg)
(https://www.lipolbattery.com/image/3.8V%20LiPo%20Battery%20Discharge%20Curve.jpeg)
-
I have a similar design I put together very recently; I just used a buck converter. As mentioned, the voltage is going to fall off quickly before it gets to 3 V per cell, so there is little to gain with boost.
BTW, I had the MOSFET on the low side, which is possible with a floating supply. That makes gate driving much, much easier to deal with all the way to 100% duty cycle. The charger (from the 12 V bus) needed a P-channel on the high side for the boost converter, but that's only about 2-3 A compared to up to 15 A peak for discharge.
-
Strange discharge curve.
It isn't; it's really typical of the most common NCA, LCO, NMC and LMO chemistries, and the first two are what most people probably mean by "li-ion". It was chosen exactly because of that typicalness. LFP would be significantly different. Most of the tens of different cell types I have measured are similar-ish. Differences exist, and some are flatter, but the "Silicon Lightworks" curve you posted is totally made up, which is pretty obvious if you look at the voltage where it starts... The second curve set is clearly real but doesn't even specify the discharge current, so it's pretty useless for this discussion. To someone who has actually measured cells, it is instantly obvious that it is a relatively low-current discharge, probably C/5, C/2 tops, and it's probably a 4.35 V (3.8 V nominal) LCO variant, which is a pretty rare beast, used in some laptop packs a decade ago but not very relevant.
99% of li-ion cells we would discuss here would be 3.6 or 3.7V nominal types, so it really appears you picked a weird chemistry "to make a point".
And if you look at the Samsung 29E curve I posted, my point holds: at 0.2 A discharge current, the suggested 3.1 V cutoff works perfectly fine. It is at higher discharge currents that the problem arises.
If you look at high-power cells, manufacturers specify even lower cutoff voltages, down to 2.0 V.
All you need to do is learn, and stop giving advice about stuff you have zero understanding of or knowledge about; I don't do that either. But I happen to have a pretty good idea of how typical li-ion discharge curves look, because I have tested and measured hundreds and hundreds of them and written reports on the results, so I instantly recognize bogus stuff. This is why I replied to this thread. I did not reply in hundreds of other threads where I don't know the subject.
Don't assume. Test, measure, calculate. Don't trust stuff which doesn't mention any details. Trust those that specify part numbers, testing conditions etc. - e.g. lygte-info. Basic engineering stuff.
You can't just go and assume that OP's case happens to be a lucky best case, or that OP exaggerated the discharge current. Either we need more information including part number of the battery, or we are going to assume worse-than-average case. Or at ****ing least the average case.
Finally, don't project your own ego issues onto others. While your intention is clearly something else, mine is only to provide correct information and bust harmful myths, not to win a case. Now can you please just add me to your ignore list, as you promised before, and stop derailing threads where I happen to correct or refine incorrect or misleading advice, so that others can still learn even if you are unwilling to?
-
99% of li-ion cells we would discuss here would be 3.6 or 3.7V nominal types, so it really appears you picked a weird chemistry "to make a point".
Well, you know perfectly well that 3.7 V li-ion is 4.2 V fully charged, so no weird chemistry. Lots of phones have 3.85 V batteries these days (4.35 V charged), and that plot showed up, so I thought it was interesting to compare them.
As you see, it's pretty much the same, just storing a bit more energy due to the higher cell voltage, but almost gone below 3.5 V.
I don't understand the overdramatic answer. If you look for li-ion discharge curves, most of them have that flat area around 3.7 V.
My main point was on the "3.5V forum myth".