EEVblog Electronics Community Forum
Electronics => Beginners => Topic started by: DW1961 on September 07, 2020, 05:21:08 pm
-
In order for a battery bank to discharge all of its stored energy into another device, is it the voltage of the device that needs to be lower than the bank's voltage?
I'm trying to figure out how many cell phones (given a known battery voltage) a charger can charge before it starts slowing down, or stops charging altogether.
For instance, if you have a 120V power strip with a USB-A port rated at 12W (5V, 2.4A), what is going to be the limiting factor for charge speed, and how many phones can you charge at the same time?
I do realize that all phone batteries can have different voltages etc. I'm just trying to get a general rule here so I can figure things out on my own.
I'm also assuming that cell phones have smart charging circuits (like caps) that could allow them to charge from a lower-voltage device (although more slowly)? However, that's a different question.
To be clear, I'm not asking how fast X will charge Y. I'm asking what the limiting factor is in charging a cell phone (or any battery, for that matter) given known voltages and amps, and I assume that factor is the same whether you are charging from a 120V source or a battery bank?
Thanks so much again!
-
For instance, if you have a 120V power strip with a USB-A port rated at 12W (5V, 2.4A), what is going to be the limiting factor for charge speed, and how many phones can you charge at the same time?
It's complicated. A cell phone has circuitry which regulates the amount of current it draws from a charger. It could draw 1A or possibly 2A if it is capable of handling that amount of current. When the battery nears full charge it will start drawing less current.
The charger signals to the phone how much current it can deliver -- usually via resistors on the D+ and D- lines of the USB port -- so it is unlikely you can share the same USB port between multiple phones.
So the charging speed is a combination of what the charger can deliver and what the cell phone will draw.
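In other words, as a trivial sketch (nothing either device literally computes, just the arithmetic):

```python
# Minimal sketch: the effective charge rate is roughly the lesser of what
# the charger can deliver and what the phone is willing to draw, at ~5V.
def charge_power_w(charger_limit_a, phone_limit_a, bus_v=5.0):
    return bus_v * min(charger_limit_a, phone_limit_a)

print(charge_power_w(2.4, 1.0))  # 2.4A port, 1A phone -> 5.0 W
print(charge_power_w(2.4, 2.4))  # 2.4A port, 2.4A phone -> 12.0 W
```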
In order for a battery bank to discharge all of its stored energy into another device, is it the voltage of the device that needs to be lower than the bank's voltage?
A battery bank usually converts the current battery voltage (whatever it is) to a fixed output voltage -- like 5V. For instance, typical cell phone power banks have Lithium-ion batteries in them which are at a nominal 3.7V. A step-up converter is used to produce 5V at the output. When fully charged the lithium-ion batteries will be at 4.2V, but as current is drawn from the bank they will discharge and their voltage will decline. When the batteries reach around 3.0V the bank will simply stop working until the batteries are recharged. But while above that voltage the power bank will attempt to put out 5V at its output. On the flip side, the cell phone will always expect 5V from the USB port.
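As a rough way to put numbers on how far a bank's stored energy goes (all figures below are illustrative assumptions, not anything from this thread): compare watt-hours, since the bank's mAh rating is at the ~3.7V cell voltage, not at the 5V output, and the boost converter and charging both waste some energy.

```python
# Assumed, illustrative numbers -- substitute your own.
bank_mah = 10000      # power bank rating, at 3.7V nominal cell voltage
phone_mah = 3000      # phone battery, at ~3.85V nominal
efficiency = 0.85     # assumed combined converter + charging efficiency

bank_wh = bank_mah / 1000 * 3.7      # 37 Wh stored in the bank
phone_wh = phone_mah / 1000 * 3.85   # ~11.6 Wh per full phone charge
print(f"~{bank_wh * efficiency / phone_wh:.1f} full charges")  # ~2.7
```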
You might want to get a "USB voltage tester". It will show you how much current your phone is drawing from a charger (or power bank), as well as what voltage the charger is putting out.
-
So the charging device's voltage must be higher than that of the device it is charging?
-
So the charging device's voltage must be higher than that of the device it is charging?
For a raw battery - yes.
But a cell phone has a lot of circuitry in front of the battery. In the case of USB charging, the protocol is that chargers supply 5V and cell phones accept 5V from the USB port. What they do with that 5V is up to them. For instance, they could boost it to 14V to charge a lead-acid battery.
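To put a number on that boost example (a sketch using the ideal converter relation V_out = V_in / (1 - D); real converters run a somewhat higher duty cycle once losses are included):

```python
v_in, v_out = 5.0, 14.0
duty = 1 - v_in / v_out   # ideal boost: V_out = V_in / (1 - D)
print(f"D = {duty:.2f}")  # about 0.64
```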
-
If you want to play around with charging raw batteries, I'd suggest getting some TP4056 modules and some 18650 cells, which you can find in old laptop battery packs:
https://youtu.be/wfrm6lbt8Pc
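If it helps to see what a charger module like that actually does: it charges at constant current until the cell reaches about 4.2V, then holds 4.2V while the current tapers off, terminating around 1/10 of the programmed current. Here's a back-of-envelope sketch of how long each phase might take (the 80% CC split and the taper time constant are pure assumptions for illustration):

```python
import math

capacity_ah = 2.2     # assumed 18650 capacity
cc_a = 1.0            # typical TP4056 module default charge current
term_a = cc_a / 10    # termination threshold, ~1/10 of programmed current
tau_h = 0.5           # assumed exponential taper time constant (hours)

cc_hours = 0.8 * capacity_ah / cc_a          # assume ~80% of charge delivered in CC phase
cv_hours = tau_h * math.log(cc_a / term_a)   # time for current to decay from cc_a to term_a
print(f"CC ~{cc_hours:.1f} h, CV ~{cv_hours:.1f} h")  # ~1.8 h + ~1.2 h
```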
-
So how many watts are they (the cell phones) going to take?
Now I'm trying to figure out how many cell phones (generally) this could charge: a 120V power strip with a USB-A port rated at 12W (5V, 2.4A).
-
So how many watts are they (the cell phones) going to take?
Now I'm trying to figure out how many cell phones (generally) this could charge: a 120V power strip with a USB-A port rated at 12W (5V, 2.4A).
If you only have one USB port, you can basically charge only one phone at a time.
Most phones will attempt to charge at 1A minimum -- so 5W.
Some phones will attempt to charge at 2.4A -- so 12W.
To charge more phones, I would plug more chargers into the power strip to get more USB ports.
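For scale, the 120V side is essentially never the bottleneck. A hedged sketch, assuming a common 15A branch circuit (check your strip's actual rating) and roughly 85% charger efficiency:

```python
strip_w = 120 * 15        # ~1800 W available at the outlet (assumed 15A circuit)
charger_w = 12 / 0.85     # ~14 W drawn from the wall per 12 W USB charger
print(int(strip_w // charger_w))  # ~127 chargers -- the wall outlet is not the limit
```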
-
So how many watts are they (the cell phones) going to take?
Now I'm trying to figure out how many cell phones (generally) this could charge: a 120V power strip with a USB-A port rated at 12W (5V, 2.4A).
There is no general rule for that. It depends on the battery and the charging mechanism.
I have a few phones. One has a 3500mAh battery, but the charger board inside the phone limits the current to 1A. Another phone (same brand, different model) with a 2500mAh battery has a charger board that takes 1.1A max. Meanwhile, in the reply just above, ledtester wrote that most phones [he has encountered] will attempt at least 1A -- a case in point of how much phones differ. So a general rule is hard to come by when individual devices vary so much.
My phones, while on, consume 50mA-350mA, with occasional bursts of 500+mA and even 1A+ (probably internal garbage collection, connection checks, communicating with the tower, or something like that). That means that typically, if you can supply each phone with >350mA, it will charge. But when the higher-consumption bursts are frequent, what you put in between the bursts may not cover the consumption, so you have to charge for a while and see whether the battery level goes up or down.
The 2500mAh phone's charger is very finicky about charge voltage. If the supply cannot hold the voltage above 5V ±0.1V, even for a sub-second dip that then climbs back up, it will stop charging. The 3500mAh phone, by contrast, is more forgiving of brief voltage drops but is very temperature sensitive: if the phone gets warm to the touch (not yet too hot to hold), it will cut charging.
Those reactions are from my phones -- four in total. They all behaved differently except the two that are exactly the same brand and model. Other brands and models would likely differ as well, so you really can't make a general rule out of that.
For smartphones running off Li-ion, the charge circuitry is typically designed for 5V/USB, and it drops that down to a proper charge voltage of around 4.2V. So the best general rule you can assume (but not be assured of) is that once the supply voltage drops to around 4.6V-ish, you no longer have the voltage headroom needed to charge. Since they are typically designed for USB, another general rule you can assume (but not be assured of) is that if the source can supply >=500mA to each connected device, it may have enough current to work with. The actual dropout point may be a lot more than 500mA each.
The way to really find out is to try it and see whether every connected device's battery level goes up. A USB tester (aka USB charge doctor) is a great device to have. Get a few of different brands/makes, because some have a lower internal shunt resistance than others. My better-featured one has a 0.1 ohm shunt (0.1V drop at 1A) -- nice features, but the shunt is too high. I have a couple of cheaper ones with 0.05 ohm shunts: less accurate and no nice features, but less voltage drop. I ended up using the cheap ones most often because of the lower voltage drop (less chance of my phone dropping out of charging).
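To see why the shunt resistance matters, here's a quick sketch using the shunt value above (the cable/connector resistance is my own assumption):

```python
i_a = 1.0          # charge current
r_shunt = 0.1      # the "better" tester's shunt, from above
r_cable = 0.2      # assumed round-trip cable + connector resistance

v_at_phone = 5.0 - i_a * (r_shunt + r_cable)
print(f"{v_at_phone:.2f} V")  # 4.70 V -- already close to the ~4.6V dropout mentioned above
```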
EDITED:
- First paragraph: added the contrast with the reply from ledtester just above, which came in after I wrote my initial reply.
- Second-to-last paragraph: the 500mA line was missing a 'not', so I reworded it to work without the 'not'.
-
There is no general rule for that. It depends on the battery and the charging mechanism.
I get that for sure. If it charges, great; if not, well, there you go.
Thanks much.