Yes, I agree with that. I think it's the "generalisation" problem: people parrot some small, contextualised bit of information and then apply it in far broader terms than it deserves.
For instance, you mentioned "open-circuit voltage". That is a term most of these people don't understand. They will quote the voltage they see on their Harbor Freight meter to two decimal places, taken with a load applied, and then confidently state something like, "Look, as I said, the rest voltage for the 4S pack is exactly 13.20V."
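Just to put rough numbers on why that bothers me (illustrative figures I'm picking, not anything measured): with, say, 20 mΩ of total pack resistance, a 10 A load pulls the terminals about 0.2 V below rest (V = I × R), and on a chemistry with a discharge curve as flat as LiFePO4's, 0.2 V spans a big chunk of the state-of-charge range. A reading taken under load simply is not a rest voltage.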
So it's been frustrating me. I stopped and asked myself: what exactly am I trying to answer that keeps leading me through this frustrating content?
I am working on and testing the top-end (and bottom-end) behaviour of my own solar/lithium system, so I am dealing with some of the finer details such as balancing and absorption (or tapering, depending on what you call it). I'm trying to get the battery and its BMS in harmony with the MPPT such that (1) I get the full capacity, (2) I never invoke the BMS disconnect, and (3) I am not over-stressing things unnecessarily.
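For what it's worth, the shape of the logic I'm trying to land on looks roughly like this. It's only a sketch of my own intent; every threshold is a placeholder I've made up for illustration, not a datasheet value or a recommendation:

```python
# Sketch of the charge logic I'm trying to get the MPPT and BMS to agree on.
# Every number here is a hypothetical placeholder, not a recommendation.

BMS_CELL_DISCONNECT_V = 3.65   # where the BMS would cut charging (assumed)
ABSORB_CELL_TARGET_V  = 3.55   # MPPT absorption target, kept below the BMS trip
TAPER_CUTOFF_A        = 5.0    # treat the pack as full once current falls to this

def charge_phase(cell_voltages, charge_current_a):
    """Return which phase the charger should be in for this sample."""
    v_max = max(cell_voltages)
    if v_max >= BMS_CELL_DISCONNECT_V:
        return "backed_off"    # too close to a BMS disconnect
    if v_max < ABSORB_CELL_TARGET_V:
        return "bulk"          # full current until the target is reached
    if charge_current_a > TAPER_CUTOFF_A:
        return "absorb"        # hold the target voltage and let current taper
    return "done"              # taper finished

print(charge_phase([3.41, 3.42, 3.40, 3.43], 18.0))  # -> "bulk"
```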
Obviously any voltage measured while a load is applied is either invalid or incomplete. I am not making things easy on myself, because I am conducting these tests in situ, not under "lab conditions". I stubbornly refuse to cheat by wasting power or using the mains to charge the pack, so I wait on the sun to work with me.
What is making that easier is persistent, high-resolution logging across multiple measurements, so if the sun plays ball while I am not on the ball, I still get to review what happened afterwards.
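As an example of what that review looks like (a rough sketch; the file name and column names are just assumptions about my own log, and it assumes pandas is available), I can pull out only the samples where the pack was genuinely at rest:

```python
# Rough sketch: recover "resting" voltage from the logged data after the fact.
# File name, column names and thresholds are assumptions about my own log format.
import pandas as pd

log = pd.read_csv("pack_log.csv", parse_dates=["timestamp"])

REST_CURRENT_A = 0.2   # treat anything under this as "no meaningful load"

resting = log[log["current_a"].abs() < REST_CURRENT_A]
print(resting[["timestamp", "pack_voltage_v"]].tail(10))
```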
Not a lot of what people are claiming adds up. People here usually make sense, which is why I came back.
On the datasheets: they are very, very hard to come by. What you usually get is a single page with the bog-standard stuff on it: do not exceed 4.20V, max charge voltage 3.65V, discharge cut-off 2.5V, etc. The best you can hope for is the capacity test conditions and results; that is about as close as you'll get to "charge parameters". They vary between tests, manufacturers, batteries and the various studies I have read. They really are just "parameters", and without defining all of them, the rest of your results are invalid. What I'm saying is, a datasheet might state its capacity test as 3.65V until the current drops to 0.05C. I feel that is a statement of necessity, a line that had to be drawn somewhere, rather than a limit.
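To put a number on that: on a 280Ah cell, just as an example, 0.05C is 14A, so the "test" stops counting the moment charge current tapers to 14A at 3.65V. Draw the line at 0.02C instead and you would wait for the current to fall to 5.6A and book a little more capacity. Same cell, different "capacity", purely because of where the line was drawn.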
I am rapidly approaching the point of realising that the slim benefit that exists anywhere in that top few % of capacity is just not worth the diligence, or the chance of a mistake.
Coming back around to this for about the dozenth time now, I am starting to realise that some of those completely "made up" figures people throw around, the overzealous babying and the artificially lowered limits, are not just safety margins or "lying to the children". No, they are not valid and correct to the letter of any documented process. What they are, however, is what people have decided to target: somewhere between babying the pack so stupidly that you probably do more harm than good, and running it to within an "inch" of its limits.
It was when I pondered buying a much bigger battery, looking at £2500, that I started to consider whether I would still be so willing to do these experiments and argue/haggle/debate over the "peaks", or whether I would rather just set the operational limits well inside the datasheet maximums, drop most of the concern, monitoring and risk of damage, and "chill" and let it be. If I did that, how close would I end up to those "made up" figures which annoy me?