Electronics > Projects, Designs, and Technical Stuff
Panasonic NKY467B2 36V 15AH 540Wh. Ebike Li ion upgrade, burning my father's ass?
nctnico:

--- Quote from: Siwastaja on October 09, 2018, 05:45:49 pm ---
--- Quote from: nctnico on October 09, 2018, 09:26:21 am ---Unlike other chemistries you can not use the voltage for the state of charge in Li-ion batteries.

--- End quote ---
Completely the opposite.

On li-ion (except maybe LFP), the voltage is way better estimate for SoC than on almost any other chemistry. On NiCd, NiMH, lead acid, SoC estimate based on voltage is almost useless, because the discharge curve is almost a flat line: the voltage difference between 80% and 20% may be just 5%, or even less.

--- End quote ---
I think you got this completely the other way around. The charge/discharge curve on most Li-ion cells is flat for a big part of the cycle, and temperature has a much bigger influence on the cell voltage than the charge level does. IOW: the cell voltage is completely useless for determining the state of charge (SoC). This is why quality BMSes use a charge gauge calibrated for the specific chemistry of the cells, reaching accuracies of 1%. One of my customers makes Li-ion batteries, so I know a thing or two about Li-ion cells and BMSes.
Siwastaja:
Please check the link I gave you with the actual curves - they have a lot! You can easily see they are not flat, and my measurements give similar curves. This is again one of those funny discussions; it's always a strange feeling when someone comes around the corner and tells you that something you have been doing just fine on a daily basis for years is "impossible" or "does not work"  |O. But let's be more specific here, I hope this info helps someone:

I have designed and built battery measurement & cycle-testing equipment and performed probably millions of test cycles on at least 30 different cells of LFP, NMC, NCA, LCO and LMO chemistries. I have designed and small-scale commercialized a few BMS systems as well, and built two concept EV conversion battery systems (20 kWh and 40 kWh) to test said systems on the road. One of the cars was in daily driving for years, I think, giving a lot of data to look at under varying real-life conditions. I did some of this research for a time at a university (including curve testing, internal resistance testing, self-discharge testing, capacity fade testing, and DCR rise testing); I had some very good chances to communicate with the chief chemist of a commercial Li-ion manufacturer there. These kinds of contacts are really valuable when you need to deal with the massive flow of true and false information on the 'net - even in academic research.

Then I went freelance to design another piece of cycle-testing equipment. We did quite a few measurements for companies designing large Li-ion equipment. The problem with Li-ion cells is: even if you are a customer buying in the tens of millions, you are not big enough to get real data on the products. So you need to test. And to do that, you outsource it to a test lab. A test lab, then, has a $1000000 setup of MACCORs. Or, you can use some HobbyKing-style battery testers. We tried a PowerLab - they are utter jokes and toys (as expected). So I designed some sanely working, redistributive (energy-saving), "accurate enough" (to 0.5% typically) equipment which can automatically cycle, perform different pulsing for DCR measurements, inject specific AC current waveforms, measure AC resistance, log capacity, voltage, temperature... With configurable channels, we could run 20 different cells at +/-20A each, or run fewer tests with paralleled channels. We had 100A discharge tests going on some 6Ah cells for over a thousand cycles. We used fridges and heatbeds to prove things the customers didn't want tested, but we knew how the chemistry works and needed to prove the point. And so on...

Anyways, to the results, and back to the question of do you need coulomb counting or is voltage measurement enough:

With LFP (lithium iron phosphate), I find coulomb counting almost absolutely necessary, since the discharge curve is indeed flat. You are probably thinking about this specific chemistry, which has become quite a niche IMHO. To be fair, some other chemistries are somewhat flat as well - I have seen some fairly flat NMC (nickel - manganese - cobalt oxide) cells. Not as flat as typical LFP, but I'd expect these to be difficult too. They are in the minority, though.

In expensive, large-pack EVs, I use coulomb counting as well, because 1) the relative cost isn't prohibitive; 2) users often find they need an exact, steadily dropping "km/miles of range left" indication and panic if the range jumps near the end; and, last but not least, 3) these systems tend to be fully charged after almost every cycle, or almost every day, giving a reliable reset point for the integrator. This is really important! No one builds an everyday BMS with 0.1% or 0.01% precision current sensing. That exists in said $1M MACCORs. +/-2% is the reality, and this means the count drifts below the accuracy of a simple voltage-based system in just a few cycles if not reset. And resetting is always based on voltage; the trick is that it happens under well-known conditions, such as 4.20V at a C/20 charge rate.
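The reset-at-full-charge trick can be sketched roughly like this. All names and thresholds here are my own illustrative choices, not from any real BMS:

```python
# Hypothetical sketch of a coulomb counter that resets its integrator at a
# known full-charge condition (cell held at 4.20 V with the current tapered
# to <= C/20), bounding the drift of an imprecise current sensor.

class CoulombCounter:
    def __init__(self, capacity_ah, full_v=4.20, tail_a=None):
        self.capacity_ah = capacity_ah
        self.full_v = full_v
        # Default tail-current threshold: C/20
        self.tail_a = tail_a if tail_a is not None else capacity_ah / 20.0
        self.charge_ah = capacity_ah  # assume full at power-up

    def update(self, current_a, voltage_v, dt_s):
        """current_a > 0 means charging, < 0 means discharging."""
        self.charge_ah += current_a * dt_s / 3600.0
        # Reset point: full-charge voltage reached with a small tail current.
        if voltage_v >= self.full_v and 0.0 <= current_a <= self.tail_a:
            self.charge_ah = self.capacity_ah
        # Clamp so sensor drift can't push the count out of range
        self.charge_ah = min(max(self.charge_ah, 0.0), self.capacity_ah)

    @property
    def soc(self):
        return self.charge_ah / self.capacity_ah
```

Without that reset branch, a +/-2% current-sense error accumulates every cycle; with it, the error is wiped out each time the pack is charged full.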

But in simple, small gadgets, I always tend to lean towards voltage-based estimation; sometimes I'm so lazy I just do it as a linear approximation - and not a single customer has complained about the nonlinearity of even these solutions! It runs faster as the battery starts to get empty, and wiggles around with the load a bit. How usable it is depends on the exact specifics, but your blanket statement that it's not usable is clearly untrue. I find it usable more often than not.

Agreed, using a chemistry-specific lookup table gives much better results, and it's utterly trivial to implement - download a curve for your cell of choice from lygte-info.dk, who have probably already tested it, and you are done! Yes, the readout wiggles with changing load - but that's intuitive for the customer. It can even be a good feature: what good is a steady 10% SoC value on screen if the product then suddenly dies on you when you press the pedal or push a button, with the load peak hitting the LVC limit behind the scenes? With the voltage-based estimate, you see it wiggle alarmingly towards 0% momentarily, and you know it'll die soon on the next peak; it doesn't take a Li-ion chemist to understand this UI behavior. Very intuitive, especially on a graphical bar scale. It's "good enough" out of the box, without complex algorithms or UI design.
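A minimal sketch of such a lookup-table estimate, with linear interpolation between points. The curve values below are made-up placeholders; in practice you would digitize a measured discharge curve (e.g. from lygte-info.dk) for your exact cell at a representative load:

```python
# Voltage-based SoC estimate from a chemistry-specific lookup table.
# (voltage, SoC) pairs sorted by voltage, for one assumed load current.
# These points are illustrative only, not from a real cell datasheet.
CURVE = [
    (3.00, 0.00), (3.30, 0.05), (3.50, 0.15), (3.60, 0.30),
    (3.70, 0.50), (3.85, 0.70), (4.00, 0.85), (4.20, 1.00),
]

def soc_from_voltage(v):
    """Piecewise-linear interpolation of SoC from the loaded cell voltage."""
    if v <= CURVE[0][0]:
        return 0.0
    if v >= CURVE[-1][0]:
        return 1.0
    for (v0, s0), (v1, s1) in zip(CURVE, CURVE[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)
```

Eight or ten points are plenty; the table fits in a few dozen bytes of flash, which is why this is so cheap to do on any MCU with an ADC.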

Voltage-based approximation has one very good trait: it's stateless - and it automatically tracks the actual remaining capacity with no algorithm at all! The accuracy is what it is (I'd say typically around +/-20% when people want a simple single number), but it works very robustly in cases where natural reset points for the integrator never occur. It's an absolute classic to see a completely invalid battery-level integration on a coulomb-counting system after only about ten cycles; this has always happened with laptop battery management, for example, and people have complained about it for ages. What's the value of "+/-1% best-case accuracy" if the worst case is +/-50% and the algorithm cannot tell you whether to trust it or not?

This being said, really well-designed coulomb counters can, of course, fall back to voltage mode, apply a Kalman-filter-like correction to the integrator value all the time, and apply temperature and current compensation to get, say, +/-10% accuracy in voltage mode and +/-1% in integration mode. While I have seen discussions of such algorithms, and given quite some "brain masturbation" to this idea while walking in the forest hunting (edible) mushrooms, I have never seen such a system implemented in practice. Maybe some magical IC is already getting there? I don't know - I don't trust them after seeing so many total failures.
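For what it's worth, the core of that fusion idea can be toy-sketched as a crude complementary filter (not a real Kalman filter; the gain and the numbers are arbitrary assumptions of mine):

```python
# Toy sketch: keep a coulomb-counting integrator, but continuously pull it
# toward an independent voltage-based estimate. A small gain means the
# integrator is trusted short-term and the voltage long-term, so current-
# sense drift stays bounded even without a hard full-charge reset point.

def fuse_soc(soc_integrated, soc_voltage, gain=0.01):
    """Blend the integrator SoC with the voltage-based SoC each step."""
    return soc_integrated + gain * (soc_voltage - soc_integrated)

# A drifted integrator (80%) slowly converges toward what the voltage
# estimate says (60%) over many update steps.
soc = 0.80
for _ in range(500):
    soc = fuse_soc(soc, 0.60)
```

A real implementation would vary the gain with confidence in each source (e.g. trust voltage more at rest, less under heavy pulsed load), which is roughly what a Kalman gain would do for you.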

It's worth mentioning that coulomb counting the exact SoC, even if accurate, serves only one purpose: user experience. This info is typically not used for actual battery-management purposes.

I did quite some work as a BMS failure analyst. Totally unplanned - it just happened that failing systems were everywhere. Typically very complex and overengineered. Although the most typical failure mode is that they imbalance the pack and overdischarge a cell (or several cells), it's also very typical that they implement broken-by-design coulomb-counting algorithms.

Use whatever tools you need to use - but if you are unsure, and not really up to the task, choose the simplest and most naturally robust way, if at all possible! A four- or five-LED indication of the approximate voltage is really useful in real life. Something like that has always been used in approximate battery gauges; it works for alkaline cells - and Li-ion, surprisingly!
SiliconWizard:
In my experience, estimating the charge level of a Li-ion/LiPo battery based on battery voltage alone is often pretty inaccurate and has many pitfalls. For one, it depends so much on the actual current-draw profile that it's hard to use in applications where the current draw is pulsed and not at all constant, with periods of low load (maybe in the µA range) and peaks of tens or hundreds of mA, for instance. A constant-current load is already much more predictable, but a lot of applications are not a constant-current load.

I now tend to use ICs that do a good job at this task ("battery fuel gauge") while using nicely developed algorithms based on voltage, current and accumulated energy, such as the MAX1704x series.
Siwastaja:

--- Quote from: SiliconWizard on October 10, 2018, 05:17:55 pm ---I now tend to use ICs that do a good job at this task ("battery fuel gauge") while using nicely developed algorithms based on voltage, current and accumulated energy, such as the MAX1704x series.

--- End quote ---

Thanks for pointing this out - if you read the datasheet, you'll see they point out the problems with coulomb counting.

Yes. This IS a voltage-based device! It doesn't measure the current. This is exactly what I have been talking about.

I wonder how you didn't notice that you never connected any current-sense leads anywhere while using this device? How do you think it measures current?

They show an example error analysis in one particular test case with -7%/+3% error over the SoC range. This is fairly typical for a curve-compensated, temperature-compensated algorithm. It's probably way more accurate in worst-case conditions (probably around +/-15%) than the classical ones that use current sensing and coulomb counting (with underdeveloped algorithms), which can show any random value.

The best thing? I'm 99% sure their touted magical "sophisticated battery model" which "simulates the internal dynamics of a Li+ battery" is something simple and trivial  ;). The 1k | 1u RC filter they suggest rules out most "AC trickery" as well. Since they don't measure the current, they can't compensate for it (directly).

Just filter the shit out of the signal (for example, with a cumulative moving average on the MCU, with a time constant in the range of half a minute), and you have a nice smooth number like in coulomb-counted systems. So you'd never know it's "just" voltage-based!
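That smoothing might look something like this - an exponential moving average with an illustrative ~30 s time constant; the class and the numbers are my own sketch, not from any particular gauge:

```python
# Smooth a noisy voltage-derived SoC reading with an exponential moving
# average, so the displayed number is as steady as a coulomb-counted one
# while still tracking slow changes. Time constant and sample period are
# illustrative assumptions.

class SmoothedGauge:
    def __init__(self, tau_s=30.0, dt_s=0.1):
        self.alpha = dt_s / (tau_s + dt_s)  # EMA coefficient per sample
        self.value = None

    def update(self, raw_soc):
        if self.value is None:
            self.value = raw_soc  # seed with the first reading
        else:
            self.value += self.alpha * (raw_soc - self.value)
        return self.value
```

Load transients that swing the raw voltage reading for a second or two barely move the displayed value, while a genuine downward trend comes through within a minute.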

OTOH, I would assume that to reach that 7% error, you would need to follow this note from the datasheet: "To achieve optimum performance, the MAX17043/MAX17044 must be programmed with configuration data custom to the application. Contact the factory for details."

This would include the cell-curve lookup table, I guess.

It might be easier to just upload the curve from lygte-info.dk to your MCU and do it yourself; depends on how keen you are on rolling your own, I guess. The obvious upside would be the BOM and complexity savings: this IC doesn't provide any "analog" feature you actually need, so if your MCU has an ADC, this is just some secret code running on a separate device. A way to license the algorithm would make so much more sense.
SiliconWizard:
I mentioned the MAX series as an example, and those are the ICs I currently use for this task (I like the elegance of MAXIM's approach and find them reasonably accurate). But I've also used other similar ICs that were based on coulomb counting only, or on a mix of current measurement and coulomb counting, and those were actually not that bad. Certainly much better than just predicting the SOC from the loaded cell voltage on a proportional scale, as I've seen done a few times. Using a cell-voltage measurement in addition to the estimated consumed charge helped get more accurate predictions AFAIR, but that was pretty much calibrated for a given battery model.

If you think about it, even though MAXIM doesn't disclose their algorithm, current is still used indirectly IMO. For a given battery model, short-term variations of the cell voltage will give you a good indication of the drawn current if you have enough resolution. They may not *directly* rely on current estimation, but I think its influence still enters their estimation indirectly.

As they state, and as I remember from someone working at SAFT, the open-circuit voltage of a Li-ion battery is actually a good indicator of its SOC. But in a real, always-on application the OCV usually can't be measured directly, so this is an indirect estimate based on the loaded cell voltage, as far as I understand it; I guess they most likely use "short"-term variations as well as the longer-term average value.