Lead-acid charging.
Codebird:
I've been through a couple of cheap solar PWM chargers off of Amazon, and I've noticed some of them have 'unconventional' ideas about battery management. I'm pretty sure that under no circumstances should it go over 15V.
So, I built my own controller. Cobbled together. And it was good. Now I'm taking it one step further: a fully-functional charge controller, an open design which I shall release once I've worked the bugs out. Cheap to make, over-current protection, logging, monitoring of input and output current, all the good stuff. The prototype is in my garage right now. But there is one area that I cannot figure out: optimal charging of a 12V lead-acid.
It's not that there's a lack of information. It's an excess, most of it contradictory. Some sites specify an exact optimal float voltage, some give a range, some say it depends on the type of battery, some say temperature compensation is essential, and a few say to see the manufacturer's datasheet. Some say saturation charging is essential, but don't agree on under what circumstances it should be performed, or how, or when to stop. So, I'm seeking expert advice here.
My charger has the ability to monitor battery voltage to a precision of 0.04V. It can also monitor battery current, though, due to my design for a super-low-cost current measuring circuit, only to an accuracy of about 20%. This should be good enough. It currently has no temperature monitoring ability, but this can be added if need be. Easy enough to put one of those nifty little one-wire sensors in.
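For illustration, 0.04V per count is about what you'd get from a bare ATMega's 10-bit ADC behind an 8:1 divider - the divider values below are just an example, not necessarily the actual front end:

// Illustrative only: an 8:1 divider (e.g. 70k over 10k) maps 0-40V onto
// the ATMega's 0-5V ADC range, so one count = 5.0/1024 * 8 = ~0.039V.
float readBatteryVolts() {
  const float DIVIDER_RATIO = 8.0f;
  int raw = analogRead(A0);    // 10-bit result, 0..1023
  return raw * (5.0f / 1024.0f) * DIVIDER_RATIO;
}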
My current approach uses a simple two-state machine (a rough code sketch follows the list):
- In the 'float' state, regulate a constant voltage of 13.4V. After 240 hours have elapsed in this state, or if the battery voltage falls below 11.5V, transition to 'saturation' state.
- In the 'saturation' state, regulate a constant voltage of 14.2V. When the current required to do this falls below 3A (with a peak-follower to eliminate the impact of clouds), transition to 'float' state.
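In pseudo-Arduino terms it amounts to something like this. readBatteryVolts(), readBatteryAmps() and setTargetVolts() stand in for the real measurement and PWM regulation code, and the one-hour peak window is just one guess at how to implement the peak-follower:

// Two-state float/saturation machine using the thresholds listed above.
enum ChargeState { FLOAT_STATE, SATURATION_STATE };
ChargeState state = SATURATION_STATE;

const float FLOAT_V    = 13.4f;
const float SAT_V      = 14.2f;
const float LOW_V      = 11.5f;   // float -> saturation trigger
const float SAT_EXIT_A = 3.0f;    // leave saturation below this current
const unsigned long FLOAT_TIMEOUT_MS = 240UL * 3600UL * 1000UL;  // 240 hours
const unsigned long PEAK_WINDOW_MS   = 3600UL * 1000UL;          // assumed 1 hour

unsigned long floatEnteredAt = 0, windowStart = 0;
float windowPeakA = 0;

void loop() {
  float v = readBatteryVolts();
  float a = readBatteryAmps();

  if (state == FLOAT_STATE) {
    setTargetVolts(FLOAT_V);
    if (millis() - floatEnteredAt > FLOAT_TIMEOUT_MS || v < LOW_V) {
      state = SATURATION_STATE;
      windowStart = millis();
      windowPeakA = 0;
    }
  } else {
    setTargetVolts(SAT_V);
    if (a > windowPeakA) windowPeakA = a;
    if (millis() - windowStart > PEAK_WINDOW_MS) {
      // A cloud only dents part of the window; if even the PEAK current of
      // the whole window stayed under 3A, the battery has stopped accepting.
      if (windowPeakA < SAT_EXIT_A) {
        state = FLOAT_STATE;
        floatEnteredAt = millis();
      }
      windowStart = millis();
      windowPeakA = 0;
    }
  }
}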
This is for a flooded battery - it's a big one designed to power an RV. But the voltages are adjustable. For that matter, the whole charging process is governed by an arduino or a bare ATMega, so it could be adapted for other chemistries entirely if you wanted to.
Now, feedback, please. Am I doing it right, or is my approach going to shorten battery life? This charge controller is designed for low-cost, small-scale solar power installations up to 300W, so I really want to get the longest practical life from the battery - no point using this charge controller if it'll ruin your battery in a year. I'm quite hopeful about this design - it should cost about the same to build as buying a low-cost PWM controller off the shelf, but it'll be superior to most of them.
GoneTomorrow:
Temperature compensation is generally considered important, especially if the batteries are exposed to wide environmental temperature swings. The voltages required to properly charge lead-acid batteries actually vary substantially with cell temperature. Compensation for FLA is about -4mV per °C per cell, i.e. -24mV/°C for a 12V six-cell battery. So if your baseline absorption voltage is specified at 25°C, decrease it by 4mV per cell for every degree above 25°C, and increase it by 4mV per cell for every degree below 25°C. An example of the effect is that at -15°C battery temp, your absorption voltage is going to be a whole volt higher than at 25°C. The temp sensing should ideally be remote, right on the batteries.
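As a concrete helper (the function name is just for illustration):

// Temperature-compensated setpoint for a 6-cell (12V) flooded battery:
// -4mV/°C per cell = -24mV/°C for the whole string, baseline 25°C.
float compensateSetpoint(float setpointAt25C, float battTempC) {
  const float COEFF = -0.004f * 6.0f;   // volts per °C, whole battery
  return setpointAt25C + COEFF * (battTempC - 25.0f);
}
// e.g. compensateSetpoint(14.2, -15.0) = 14.2 + (-0.024 * -40) = 15.16V,
// which is the "whole volt higher at -15°C" example above.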
Maybe also add an equalisation mode for flooded batteries that sit unmoved for a few weeks, since the lack of movement lets the electrolyte stratify. Usually entails charging up to 14.8-15V for a few hours every month or so.
Probably not much of an issue for an RV that's gonna be moving around (stirring the electrolyte up), but definitely useful for static installations using FLA batteries.
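Bolted onto your state machine it could look like this - the interval and duration here are only indicative, check the battery datasheet:

// Indicative equalisation scheduler: hold ~14.9V (within the 14.8-15V
// range above) for a few hours, roughly monthly. Timings are examples.
const float EQ_V = 14.9f;
const unsigned long EQ_INTERVAL_MS = 30UL * 24UL * 3600UL * 1000UL;  // ~monthly
const unsigned long EQ_DURATION_MS = 3UL * 3600UL * 1000UL;          // a few hours

unsigned long lastEqEnd = 0, eqStart = 0;
bool equalising = false;

void serviceEqualisation() {
  if (!equalising && millis() - lastEqEnd > EQ_INTERVAL_MS) {
    equalising = true;
    eqStart = millis();
  }
  if (equalising) {
    setTargetVolts(EQ_V);   // overrides the float/saturation target
    if (millis() - eqStart > EQ_DURATION_MS) {
      equalising = false;
      lastEqEnd = millis();
    }
  }
}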
Codebird:
I was undecided about the need for a temperature compensation sensor, but if you say it's important I can look into adapting the design - it should just need a single component, though I don't know what kind of range I'm getting on a one-wire bus with only a 3.3V driver. Right now my prototype is very slowly charging batteries, because December. I get about half an hour of direct sunlight a day on the panels.
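If I go that route, the obvious candidate is a DS18B20 read through the usual OneWire + DallasTemperature Arduino libraries, something like:

#include <OneWire.h>
#include <DallasTemperature.h>

OneWire oneWire(2);                   // DS18B20 data on pin 2, 4k7 pull-up
DallasTemperature sensors(&oneWire);

void setup() {
  sensors.begin();
}

float readBatteryTempC() {
  sensors.requestTemperatures();        // blocking conversion, ~750ms
  return sensors.getTempCByIndex(0);    // first (only) sensor on the bus
}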
I didn't use one initially because I noticed the cheap-and-nasty controllers seldom have temperature compensation. This made me wonder just how essential it is. I'm going to leave it in for testing for a week or so (once I manage to get the telemetry out - it's right on the edge of range for the 1200bps radio link I'm using), then decide what revisions the design needs.
GoneTomorrow:
If your batteries are in a fairly controlled environment, and you charge at low current (little battery heating), then you can probably get away without temp compensation. But otherwise you run the risk of over/undercharging at extremes of temperature, which will shorten battery life.
Yes, a lot of the cheapo chargers don't have it, because it costs money, and isn't essential for charging a battery as long as you disregard lifetime. You'll find the best solar chargers (Morningstar, Midnite, etc.) are all temperature compensated, even at the low end.
Yansi:
Even the simplest LM317-based Pb charger can be taught to compensate for temperature: just stick two Si diodes inside the feedback loop. :)
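(For the curious, and assuming the diodes sit in series with the ADJ-to-ground resistor of the usual divider: Vout = 1.25V × (1 + R2/R1) + 2 × Vd. Each silicon junction drifts about -2mV/°C, so the output picks up roughly -4mV/°C - one cell's worth. A 12V six-cell battery wants about -24mV/°C, so you'd have to scale the trick, or use an NTC in the divider, to match.)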