Hi EEVblog experts,
As a newbie in electronics, I am interested in building a battery charger, specifically for a lead-acid car battery. From what I've learned, a battery charger may be similar to an adjustable DC power supply, but with a preset voltage (around 13.8V-14.4V) and a current limit. I have studied the behavior of charging a battery. Basically, there are 3 phases: in phase 1 (bulk), the battery pulls all the current the supply will allow (limited by a constant-current circuit) until it reaches the topping voltage; in phase 2 (absorption), the charger holds that voltage and the current drawn by the battery tapers off; and in phase 3 (float), the battery draws only a very small current to counter its self-discharge.
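To make sure I understand the sequence, here is a rough sketch of the control loop I have in mind, written in plain C with the hardware calls stubbed out. read_battery_volts(), read_charge_amps(), set_voltage_limit(), and set_current_limit() are hypothetical placeholders for whatever sensing and supply-control hardware ends up being used, and the thresholds are just the common lead-acid numbers I mentioned above, so please correct me if the logic is wrong:

[code]
#include <stdio.h>

/* Hypothetical hardware interface -- these stubs just return fixed
   dummy values so the sketch compiles; real firmware would read an
   ADC and drive the supply's control circuitry. */
static double read_battery_volts(void)    { return 12.6; }
static double read_charge_amps(void)      { return 1.5;  }
static void   set_voltage_limit(double v) { (void)v; }
static void   set_current_limit(double a) { (void)a; }

/* Typical lead-acid setpoints (my assumptions -- check the datasheet). */
#define ABSORPTION_V 14.4  /* topping/absorption voltage              */
#define FLOAT_V      13.8  /* float voltage                           */
#define BULK_A        2.0  /* bulk current limit (what I used)        */
#define TAIL_A        0.14 /* ~C/50 for a 7Ah battery: "done" cue     */

typedef enum { BULK, ABSORPTION, FLOAT } stage_t;

int main(void) {
    stage_t stage = BULK;
    set_voltage_limit(ABSORPTION_V); /* supply sits in CC until V rises */
    set_current_limit(BULK_A);

    for (;;) { /* firmware-style endless control loop */
        double v = read_battery_volts();
        double i = read_charge_amps();

        switch (stage) {
        case BULK:
            /* Phase 1: constant current; battery voltage climbs. */
            if (v >= ABSORPTION_V) stage = ABSORPTION;
            break;
        case ABSORPTION:
            /* Phase 2: constant voltage; the battery itself tapers
               the current as it fills up. */
            if (i <= TAIL_A) {
                set_voltage_limit(FLOAT_V);
                stage = FLOAT;
            }
            break;
        case FLOAT:
            /* Phase 3: lower voltage; a tiny current offsets
               self-discharge indefinitely. */
            break;
        }
    }
    return 0;
}
[/code]

The tail-current threshold (~C/50) is just a figure I've seen quoted for deciding when absorption is finished; I don't know if that's the right way to sense state of charge, which is part of my question below.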
I've experimented to check whether this is true by charging a small car battery (rated 7Ah) with my DC power supply, set to 2A and 13.8V. So far it behaves just as the theory suggests. However, when I searched the internet, I found that there are many car battery chargers on the market with very high current ratings, such as 12A, 15A, 20A, and so on. What bugs me is how this current rating affects the charging of a battery with far less capacity than the charger is rated for. Will the battery pull all the current the charger can supply and potentially be damaged by overcharging? I need to understand this behavior to design my own battery charger, and also how to sense the battery's state of charge. Any explanation is very much appreciated.
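In case it helps show where my confusion is, here is the simple back-of-the-envelope model I've been reasoning with: treat the battery as an EMF in series with a small internal resistance, so in constant-voltage mode the current is roughly (V_set - EMF) / R_internal, clipped by whatever the charger can supply. All the numbers below are made-up illustrative values, not measurements:

[code]
#include <stdio.h>

int main(void) {
    /* Illustrative values only (not measured): a lead-acid battery
       modelled as an EMF behind an internal resistance. */
    double v_set       = 14.4;  /* charger CV setpoint, volts        */
    double r_int       = 0.025; /* guessed internal resistance, ohms */
    double charger_max = 20.0;  /* a "big" charger's current rating  */

    /* EMF rises as the battery charges; watch what the current does. */
    for (double emf = 11.5; emf <= 14.4; emf += 0.5) {
        double demand = (v_set - emf) / r_int; /* what the battery "asks" for */
        double i = (demand > charger_max) ? charger_max : demand;
        printf("EMF %.1f V -> battery wants %6.1f A, charger delivers %5.1f A\n",
               emf, demand, i);
    }
    return 0;
}
[/code]

If this model is right, then a deeply discharged battery really would pull everything a 20A charger can give (which is presumably why the current limit matters), and the current only tapers on its own once the battery voltage gets close to the setpoint. Is that the correct way to think about it?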