I bought a USB multimeter and a USB load tester, along with loads of adapters, and I'm running a bunch of tests on all my cables and chargers to see which ones are good and which ones are bad.
There are great cables, like Anker's 3ft ones, which will transmit like 98% of the power with almost no voltage loss, and then there are terrible cables which will do like 15% (yes, really, and that was at the absolute max current I could get before my multimeter stopped receiving the ~3V it needs to stay powered on).
That's not even within USB spec!
BUT! My tests are not actually relevant unless we know what voltage devices will charge at.
Sure, a 6ft Belkin USB extension will pull 2A at 3.97V, which works out to almost 8W. Not too bad, right? Well, what devices charge at 4V? Probably none.
Let's assume a device needs at least 4.8V to charge. Now you can draw a MAX of 400mA through that same extension... that's less than 2W, and your device will take a million years to charge. (The cable is basically a fixed resistor, roughly 0.5Ω here if the charger really puts out a flat 5V, so the more voltage the device insists on keeping, the less current you can pull before it sags below that.)
4.7V maybe? Still only 2.7W. 4.6V? Still a pathetic 3.5W.
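Here's that math as a quick Python sketch, treating the extension as nothing but a series resistor and assuming the charger holds a flat 5V (both of those are assumptions on my part):

```python
# Treat the cable as a plain series resistor, fed by an assumed flat 5.0V.
SOURCE_V = 5.0

# Resistance inferred from my load test: 1.03V dropped across the cable at 2A
CABLE_R = (SOURCE_V - 3.97) / 2.0   # ~0.52 ohms

for min_v in (4.8, 4.7, 4.6, 4.4, 4.2):
    # Ohm's law: max current before the device-side voltage sags below min_v
    max_a = (SOURCE_V - min_v) / CABLE_R
    print(f"device needs {min_v}V -> max {max_a:.2f}A = {min_v * max_a:.1f}W")
```

Same ballpark as the numbers above, give or take rounding.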
I plugged an iPad into my multimeter in place of the load tester, and through the USB extension it drew almost 1A, the voltage dipped to 4.4V, and it was still charging. This leads me to believe the limit is somewhere around 4.2V (lithium cell charge voltage). It may actually have been 4.2V at the iPad, since there was an extra cable going from my meter into the iPad.
But then, to make this even more confusing, I plugged in a crappy Micro USB cable, which is now drawing 1.5A at 4.64V. It could pull the full 2A before even getting close to 4.4V, let alone 4.2V...
How on earth is it choosing how fast to charge? If I had some kind of variable-voltage power supply I would test it out, but sadly I do not... it's either 5V or no voltage for me!
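For what it's worth, here's the naive model I had in my head, and why the Micro USB reading breaks it. The cutoff, the ceiling, and the cable resistances (inferred from my readings, again assuming a flat 5V source) are all speculation:

```python
# Pure speculation: model a device that ramps its draw until either it
# hits its own current ceiling or the bus voltage sags to a cutoff.
SOURCE_V = 5.0     # assumed flat charger output
CUTOFF_V = 4.2     # hypothesized floor, per my iPad guess above
CEILING_A = 2.0    # hypothesized max current the device will ever draw

def settled_current(cable_r: float) -> float:
    """Current the device settles at through a cable of resistance cable_r."""
    ma = 0
    # Ramp up in 10mA steps until the next step would break a limit
    while (ma + 10 <= CEILING_A * 1000
           and SOURCE_V - (ma + 10) / 1000 * cable_r >= CUTOFF_V):
        ma += 10
    return ma / 1000

# Cable resistances inferred from the readings above
for name, r in [("6ft Belkin extension", 0.52), ("crappy Micro USB", 0.24)]:
    amps = settled_current(r)
    print(f"{name}: settles at {amps:.2f}A, {SOURCE_V - amps * r:.2f}V at the device")
```

By this model the crappy Micro USB cable should pull the full 2A at around 4.5V, but it actually settled at 1.5A and 4.64V. So whatever logic the device is using, it isn't a simple voltage cutoff.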