Anyhow, I think that's a rather contrived interpretation, since the datasheet expressly warns never to discharge below 2.5V.
Yes, but if you always follow the standard discharge-end condition, there will never be a need for the precharge!
The precharge is a conditioning step for overdischarged, possibly damaged cells. The problem with the Samsung instruction is that it lacks an initial acceptance voltage. I have seen values like 1.0V or 1.5V elsewhere.
What is true is that they do not define what should be done with a cell that has nonetheless fallen below 2.5V. But there is no ambiguity that <2.5V is NOT an allowed condition. Since -99.9V < 2.5V, and 0V < 2.5V, they clearly aren't "allowing" charging in that state.
A sensible interpretation of the datasheet is this: precharge current of up to 150mA from 2.5V to 3V, then regular CC charging up to 4.20V, then constant voltage until the current tapers to C/10.
... so you think they are implicitly saying (as follows from the other criteria) that the initial acceptance voltage is 2.50V. This is an interesting interpretation, and a good one which I would like to agree with. But I'm almost sure they don't mean that (see the contradiction later in my message); it's nearly the same as not doing precharge at all. If you do use an initial 2.5V acceptance threshold, this 150mA procedure is much safer and I have nothing against it.
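To make my reading concrete, here is roughly how I would encode that 2.50V-acceptance interpretation as a charger state machine. This is only a sketch: the measurement/actuation functions are hypothetical placeholders, and the 0.5C fast-charge current is my own stand-in for "regular CC charging"; the 4.20V, 150mA and C/10 figures are the ones from the interpretation above.

/* Hypothetical measurement/actuation layer - not a real API. */
float cell_voltage_V(void);          /* measured at the cell terminals   */
float charge_current_A(void);
void  set_charge_current_A(float a); /* 0.0 disables the charger         */
void  set_cv_setpoint_V(float v);

#define V_ACCEPT_MIN  2.50f   /* below this: refuse, cell may be damaged */
#define V_PRECHG_END  3.00f   /* precharge -> CC transition              */
#define V_CV          4.20f
#define I_PRECHG      0.150f  /* datasheet maximum below 3.0V            */
#define I_FAST        1.375f  /* 0.5C of 2750mAh-min; my choice          */
#define I_TERM        0.275f  /* C/10 termination                        */

typedef enum { REFUSE, PRECHARGE, CC, CV, DONE } chg_state_t;

chg_state_t charge_step(chg_state_t s)
{
    float v = cell_voltage_V();

    switch (s) {
    case REFUSE:                        /* <2.50V: not an allowed state  */
        set_charge_current_A(0.0f);
        return REFUSE;
    case PRECHARGE:
        set_charge_current_A(I_PRECHG);
        return (v >= V_PRECHG_END) ? CC : PRECHARGE;
    case CC:
        set_charge_current_A(I_FAST);
        return (v >= V_CV) ? CV : CC;
    case CV:
        set_cv_setpoint_V(V_CV);        /* charger regulates voltage now */
        return (charge_current_A() <= I_TERM) ? DONE : CV;
    case DONE:
    default:
        set_charge_current_A(0.0f);
        return DONE;
    }
}

chg_state_t charge_begin(void)
{
    float v = cell_voltage_V();
    if (v < V_ACCEPT_MIN) return REFUSE; /* charging not "allowed" here  */
    return (v < V_PRECHG_END) ? PRECHARGE : CC;
}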
But your assertions only exist by ignoring the maximum values given elsewhere in the datasheet.
Yes, but the need for precharging usually arises from ignoring the given values in the first place. It's a well-known "rescue tactic" whose acceptance varies from forbidden, to "dodgy but acceptable with a careful algorithm", to being considered part of "normal operation".
Of those three opinions, I like the middle one.
There is one interesting corner case though: very low-current discharge. When a manufacturer comes up with the discharge end voltage, they do that at some certain discharge current, very often C/5. The Samsung 29E datasheet, unsurprisingly, forgot to document this vital piece of information* as well.

Now if you use a significantly lower current than they did, the I*ESR drop over the cell is lower, you can discharge more capacity out of it before reaching the 2.50V under-load voltage, and you end up at a lower SoC% (with the manufacturer's 0% defined under their given conditions, this is actually a negative percentage!) and a lower open-circuit voltage. This reduces the voltage bounce-back after the discharge. While a cell discharged at C/5 until 2.50V bounces back to somewhere around 3.3V, one discharged at C/50 until 2.50V is slightly overdischarged and only bounces back to, say, 2.90V.
*) Again, I'm sure you can get the actual full datasheet from Samsung if you commit to buying ten million cells.
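To put crude numbers on the mechanism (the resistance figure is my assumption for illustration, not a datasheet value): near-empty, the effective discharge-end resistance, including the slow relaxation, is far higher than the milliohm-range AC impedance; call it 1 ohm. At C/5 (~570mA for a ~2850mAh cell) you hit the 2.50V cutoff while the relaxed voltage is still around 2.50 + 0.57*1 ≈ 3.1V; at C/50 (~57mA) the relaxed voltage at cutoff is only about 2.56V. Same cutoff voltage, very different true depth of discharge, hence the very different bounce-backs.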
Now if you use the same discharge cutoff value (2.5V) as the acceptance limit to enable charging, there will be no problems. But if you use a higher acceptance limit like 3.0V, there might be: your gadget discharged the cell to the end, and now it refuses to charge. This effect could be seen in some laptop batteries of the early 2000s; they refused to start charging, and you would need to inject a bit of charge into the cells using a lab supply before charging would start. It was a design issue because it was the laptop itself which discharged the cells too far to begin with; OTOH, the cells were fine, so maybe it was the charging side which had too high an acceptance limit. Maybe it was a "discharge to 2.5V, require 3.0V" issue with a misestimated bounce-back.
The correct way to solve this problem at the root, IMO, is to make the low-voltage cutoff limit dynamic and based on current. This is often done for high-discharge applications, but you should think about it for very low-current applications as well: for example, only discharge down to 3.2V, but temporarily drop that limit to 2.5V during short load peaks, such as when a radio sends a packet at high power.
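A minimal sketch of that selection logic; the 0.5A peak threshold and both cutoff values are illustrative numbers of mine, not from any datasheet:

#include <stdbool.h>

/* Dynamic low-voltage cutoff: tolerate deeper sag only while the sag is
 * explained by a known, deliberate high load. In a real design you would
 * also time-limit how long the relaxed limit may stay in effect.        */
#define I_PEAK_A    0.500f   /* above this we call it a load peak        */
#define V_CUT_IDLE  3.20f    /* cutoff limit at background current       */
#define V_CUT_PEAK  2.50f    /* limit allowed during short load peaks    */

bool should_cut_off(float v_cell_V, float i_load_A)
{
    float limit = (i_load_A > I_PEAK_A) ? V_CUT_PEAK : V_CUT_IDLE;
    return v_cell_V < limit;
}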
The datasheet says to use 3V as the 0% SOC
A certain open-circuit voltage corresponds to a certain SoC%; that's defined by the actual cell chemistry and characterized by the manufacturer, or by the customer when manufacturer data is not available. I have characterized the 29E, and the OCV at 0% SoC for this cell is roughly around 3.35V to 3.45V; I don't remember exactly. Edit: looked it up; a 29E discharged to 2.50V with a C/10 cutoff bounced back to 3.304V OCV. With a C/5 cutoff that would be a tad higher.
I guess you are referring to this: "Cell voltage in SOC 0% of Pack should be Min. 3.0V."
My interpretation is that after discharge is stopped, i.e. the load is removed, each cell in the pack should always read at least 3.0V. Which is a very sensible requirement. It doesn't mean 0% SoC is at 3.0V.
But that aside, don't you see the contradiction with what you said? If this condition is satisfied, there is never a need for the precharge, as normal charging can commence already at 3.0V according to the very same datasheet! This IMHO confirms my interpretation that the precharge as described is a rescue procedure for when other datasheet conditions have already been violated (in this case, voltage < 3.0V). So if they assume this 3.0V requirement is already violated (it's not a recommendation; it says "Requirement"), what makes you think the 2.50V limit is not violated, too?
That's also why I don't like the fact that this rescue protocol is so poorly documented and simplified; it needs to be read as giving some absolute maximums, i.e., never exceed 150mA below 3.0V. In my opinion, more limits (initial acceptance voltage, a time limit before giving up) are needed, but these might be available in the full documentation.
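Sketching the kind of guards I mean (the 1.5V acceptance floor echoes the values I mentioned earlier, and the 30-minute time limit is the style of safeguard common in charger ICs; neither is Samsung's number):

#include <stdbool.h>
#include <stdint.h>

/* Rescue precharge with the guards the datasheet omits. */
#define V_RESCUE_MIN    1.50f      /* below this: scrap the cell           */
#define V_PRECHG_END    3.00f      /* precharge done, normal CC may start  */
#define I_RESCUE_MAX    0.150f     /* datasheet absolute max below 3.0V    */
#define T_RESCUE_MAX_S  (30u * 60u)

typedef enum { RESCUE_REJECT, RESCUE_RUNNING, RESCUE_TIMEOUT, RESCUE_OK } rescue_t;

rescue_t rescue_step(float v_cell, uint32_t elapsed_s, bool *enable_150mA)
{
    *enable_150mA = false;
    if (v_cell < V_RESCUE_MIN)      return RESCUE_REJECT;  /* don't even try */
    if (v_cell >= V_PRECHG_END)     return RESCUE_OK;      /* hand over to CC */
    if (elapsed_s > T_RESCUE_MAX_S) return RESCUE_TIMEOUT; /* stuck: give up */
    *enable_150mA = true;                                  /* <=150mA only   */
    return RESCUE_RUNNING;
}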
The datasheet also specifies maximum discharge currents for <3V and <2.5V, which is presumably to limit the chances of a cell going below 2.5V to begin with.
No, you got this backwards. To prevent a cell from going below 2.5V (OCV, i.e. no-load voltage, or <0% SoC), you need a MINIMUM, not maximum, discharge current rating! You need enough current flowing, generating a high enough I*ESR voltage drop, to ensure you end up above 0% SoC given the 2.50V cutoff voltage.
Samsung does not specify this parameter, or any data that would help determine it, in this datasheet; I'm almost sure I have seen other similar-looking Samsung datasheets where the current used to obtain the discharge cutoff voltage is specified. But in this case, your own testing is needed. Choosing a discharge cutoff at 3.0V is obviously the easiest and safest choice; it Will Just Work and satisfy all the conditions given in the datasheet. It just greatly limits the available capacity in high-discharge-current applications.
I guess you refer to this:
"Shut Down Mode : Under 10uA / Under 3.0V.
Under 1uA / Under 2.5V."
This refers to their opinion of a good quiescent current drawn from the cell by the BMS in "shut down mode", not to maximum discharge currents when you are discharging into loads on purpose. The idea here is to ensure the cell is not overdischarged over some (unspecified) calendar time by the BMS's own leakage. My suggestion for a BMS designer is to always calculate this yourself, leaving the necessary paper trail of the reasoning.
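As a back-of-the-envelope example of that calculation (all numbers here are my assumptions, not Samsung's): say the pack enters shut-down mode at 0% SoC, and assume roughly 2% of the ~2850mAh capacity, i.e. about 57mAh, remains between the 3.0V point and the 2.5V damage threshold. A 10uA BMS drain consumes that in 57mAh / 10uA ≈ 5700 hours, roughly 8 months on the shelf; at 1uA you get on the order of 6 years. Whether 8 months of storage tolerance is acceptable is exactly the judgment call that should end up in the paper trail (cell self-discharge, ignored here, shortens it further).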