Thanks for the article.
After reading the article, I wonder: would connecting the headset directly to a linear power supply set to 5 V and 1 A have worked, or would the headset's charging circuit have expected a more "intelligent," advertised handshaking arrangement? Perhaps the headset would have expected an initially lower current that then ramped up based on communications between the headset's charge-management circuit and the power supply (which, of course, would have had no communications path to the headset)? So maybe it was good/lucky that I stuck with the USB power adapter?
No, you're overthinking this. Absolutely no part of the charging process control happens on the charger side. All the charger has to do is provide a stable 5 V and be capable of reliably supplying current up to its stated 1 A. Plus, as others have already mentioned, there's often either a chip or just some simple pull-up/pull-down resistor trickery on the USB data lines to tell the charged device what kind of supply it's dealing with. You won't be able to damage a modern, properly designed USB-charged piece of consumer electronics with "too much current," end of story. It's the device itself that governs the Li-Po charging process. For the overwhelming majority of consumer gear, a "charger" really is just a constant-voltage DC power supply.

The worst that could possibly happen after connecting the headset to your lab supply with the USB data lines floating is that the headset would assume a basic/non-compliant charger and fail safe, either by refusing to start charging or (most probably) by charging slowly at a safe current of 500 mA or less. Just think how silly and irresponsible it would be on the device engineer's part to rely on the charger to control the Li-ion charging chemistry. It would be terrible, terrible engineering. There are literally thousands of cheap chargers, power banks, car adapters, you name it, used willy-nilly with thousands of different USB-charged devices by millions of people around the world every day, and it's only the isolated incidents of someone unlucky enough to actually get hurt that you read about in the press.

Another example: your car battery is easily able to provide the 100-300 A of current you need to start the engine. Have you ever been afraid that's "too much current" for your car radio or car laptop adapter to handle? Or that the utility power in your house provides "too much current" for the new LCD TV you just got? Of course you haven't: devices just take what they need. They present an equivalent resistance (in the sense of Thévenin's or Norton's theorem) to the source, and it doesn't matter how much current the source is theoretically able to provide. You really should stop thinking in terms of a supply "giving too much current"; it's the job of the circuitry inside the powered device to deal with a supply that provides the stated voltage and at least the maximum current the device needs. Of course, as with almost everything in this world, there are exceptions to this rule (some super-cheap power tools, toys, novelty items, etc.). That's why I kept reiterating "properly designed."
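To put the "the device decides" point in concrete terms, here's a minimal, purely illustrative Python sketch. The function names and numbers are mine, and the detection is a simplification in the spirit of USB BC 1.2 (where a dedicated charging port shorts D+ to D-), not any particular chip's real logic:

```python
# Purely illustrative: how a USB-charged device might pick its own input
# current limit and cap what it draws. Detection is simplified in the
# spirit of USB BC 1.2, where a dedicated charging port shorts D+ to D-.

def negotiated_limit_ma(dplus_dminus_shorted: bool) -> int:
    """Current limit the device allows itself, based on the data lines."""
    if dplus_dminus_shorted:
        return 1000  # looks like a dedicated charger: allow up to 1 A
    return 500       # unknown/floating data lines: fail safe at <= 500 mA

def current_drawn_ma(device_demand_ma: int, limit_ma: int) -> int:
    """The device, not the supply, caps the current it sinks."""
    return min(device_demand_ma, limit_ma)

# Lab supply with floating data lines: the device falls back to 500 mA.
print(current_drawn_ma(800, negotiated_limit_ma(False)))  # -> 500

# Charger with D+/D- shorted: the device takes what it needs, and no more,
# even if the supply could deliver far more than that.
print(current_drawn_ma(800, negotiated_limit_ma(True)))   # -> 800
```

Note that in neither case does the supply's maximum capability appear anywhere in the device's decision; only the device's own demand and its self-imposed limit matter.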
Zbig, thanks for your reply. It is very helpful.
Here is a quick recap to make sure I have it (at the risk of over-thinking) and a follow-up question.
For any properly engineered device that runs on DC, the device can be powered by a lab power supply as long as the supply provides the stable voltage the device requires and its current limit is set equal to or greater than (including any amount greater than) what the device requires. In this arrangement the device will draw whatever current it needs, and no more. I'm pretty sure this is an accurate summary, but I will rely heavily on it and want to make sure nothing was lost in the communication (and based on your very good post above I'm confident it is correct, I think / I hope - thanks).
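Put as a toy sketch of my own (all values invented, so please correct me if I've got the model wrong), I understand the supply's two modes like this:

```python
# Toy model of a bench supply's CV/CC behavior into a resistive load.
# All numbers are made up; the point is only that the load sets the
# current until the supply's current limit is reached.

def supply_output(v_set: float, i_limit: float, r_load: float):
    """Return (terminal voltage, current) for a resistive load."""
    i_wanted = v_set / r_load         # current the load would draw at v_set
    if i_wanted <= i_limit:
        return v_set, i_wanted        # CV mode: device takes what it needs
    return i_limit * r_load, i_limit  # CC mode: voltage sags to cap current

print(supply_output(5.0, 1.0, 10.0))  # (5.0, 0.5): draws 0.5 A, limit untouched
print(supply_output(5.0, 1.0, 2.5))   # (2.5, 1.0): limit hit, voltage droops
```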
Moving on...
When a battery-operated device is fully charged, what happens to the battery during the charging process? Does its resistance rise to the point (when the battery is fully charged) that current can no longer flow? Does this mean that a battery with no charge has very little resistance, and that as the battery charges the resistance either gradually increases, or perhaps remains steady at some level from empty to nearly full and then increases significantly? (And presumably the supply just keeps happily providing the set voltage and little (trickle) or no current?)
On a related note, if the battery's resistance changes during charging, what is the role of the charge-management circuit in the device?
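To make the question concrete, here is a toy Python model of what I imagine is happening (constant current, then constant voltage, which I gather is typical for Li-ion; every number in it is invented, and this is exactly the mental model I'm asking you to confirm or correct):

```python
# My tentative mental model (please correct it!): a Li-ion CC/CV charge,
# where the controller in the DEVICE first holds the current constant, then
# holds the cell voltage at 4.2 V while the current tapers off as the
# cell's own voltage catches up. All component values here are invented.

V_CV = 4.20     # constant-voltage setpoint (typical for Li-ion)
I_CC = 0.50     # current limit during the constant-current phase, amps
R_INT = 0.15    # assumed internal resistance of the cell, ohms
I_TERM = 0.05   # controller terminates the charge below this current

v_cell = 3.50   # cell voltage, rises as charge accumulates
step = 0
while True:
    i = (V_CV - v_cell) / R_INT  # current the CV loop would produce
    i = min(i, I_CC)             # CC phase: controller caps the current
    if i < I_TERM:
        break                    # "full": controller stops charging
    v_cell += 0.0004 * i         # crude stand-in for charge accumulating
    step += 1
print(f"terminated after {step} steps at {i * 1000:.0f} mA")
```

If that model is roughly right, I suppose the current tapers because the cell's own voltage approaches the setpoint rather than because its resistance changes, but that's part of what I'm asking.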
Finally, what is the risk, if any, of overcharging a battery (assuming an ongoing supply of the proper voltage and at least the minimum required current)? To what extent does this risk vary with battery type? And to what extent does it depend on the charge-management system, if one is present? Are some types of batteries happy to charge in a "dumb" environment (with no, or only very limited, charge management), while other types rely on relatively smarter (more active or IC-enabled) charge-management circuits?
Thanks again for the very good teaching. EF