Interesting "car normal use" is not enough to keep your power bank on and needs a whole array or LED to keep it awake. My ASUS power-bank would stay on with a meager 20mA.
Yes, it is strange, but the whole array of LEDs with a 220 ohm resistor on one end (and an SMD resistor of some value on the other) is not enough to keep my power bank awake. I could measure the current draw and see what it is, but we could just calculate it. I am a bit confused about my calculations, though, as I seem to be missing something and the results are coming out strange. I will explain...
It looks like my parasitic circuit has 24 LEDs, standard regular-size white, so I'll assume 3.2 V at 25 mA each. If each of the 24 LEDs draws 25 mA, then 24 in parallel x 25 mA = 600 mA total across all of them, correct? I'm supplying 5 V, and the drop across the parallel LEDs is still 3.2 V (for each LED and for the array as a whole), so 5 - 3.2 = 1.8 V is what is left to "drive" the LEDs, and that has to happen at 600 mA. To size the resistor, using V = IR, R = V/I = 1.8 V / 0.6 A = 3 ohms. So if I were powering these LEDs from a 5 V supply, I would need a 3 ohm resistor? That seems too low... It also means I would expect the SMD resistor on the board (I can't read its markings) to be somewhere around 3 ohms, since it has to limit the total to 600 mA (25 mA each) shared across those 24 LEDs from a 5 V supply. I hope my calculations are OK so far?
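To make the arithmetic explicit, here is my back-of-the-envelope version as a quick Python sketch (idealized model, my assumptions rather than measurements: every LED is a constant 3.2 V drop at 25 mA, one shared series resistor, no internal resistance anywhere):

```python
# Idealized sizing of a single shared resistor for 24 parallel white LEDs on 5 V.
# Assumed values (not measured): Vf = 3.2 V, 25 mA per LED, constant-Vf model.
V_SUPPLY = 5.0      # V
V_FORWARD = 3.2     # V, assumed forward drop of each white LED
I_LED = 0.025       # A, target current per LED
N_LEDS = 24

i_total = N_LEDS * I_LED              # 24 x 25 mA = 0.6 A total
v_resistor = V_SUPPLY - V_FORWARD     # 1.8 V left across the shared resistor
r_shared = v_resistor / i_total       # R = V / I = 3.0 ohms

print(f"Total current  : {i_total * 1000:.0f} mA")
print(f"Resistor drop  : {v_resistor:.1f} V")
print(f"Shared resistor: {r_shared:.2f} ohm")
```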
I think I am screwing up somewhere in my thinking... If I add more and more LEDs, the total current drawn goes up with each LED added, so I need a smaller and smaller resistor. Intuitively that seems to make sense, but if I carry out the calculations for 1, 2, 5, 10, 20, 50, or 100 LEDs in parallel, I keep getting smaller and smaller resistors so as not to limit the current, as long as the supply can deliver it. I must be missing the internal resistance of the battery, or some other factor that limits the current, since I am not using an ideal power supply? Is this where I am going wrong? Let me continue with my messed-up calculations...
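This is the pattern I mean, sketched the same idealized way for different LED counts (again assuming a constant 3.2 V drop and one shared resistor, which is exactly the simplification I suspect is misleading me):

```python
# How the single shared resistor shrinks as more LEDs are paralleled,
# under the same idealized assumptions (3.2 V constant drop, 25 mA each, 5 V supply).
V_SUPPLY, V_FORWARD, I_LED = 5.0, 3.2, 0.025

for n in (1, 2, 5, 10, 20, 50, 100):
    i_total = n * I_LED
    r_shared = (V_SUPPLY - V_FORWARD) / i_total
    print(f"{n:3d} LEDs -> {i_total * 1000:5.0f} mA total, shared R = {r_shared:6.2f} ohm")
```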
However, when I powered the LED array from the 5 V power bank, I noticed it was very bright and some "magic smoke" was being released from one of the components (maybe from the insulation near one of the supply wires soldered to the board, as they were thin and could have been heating up). That leads me to think this LED array was *NOT* originally designed to be powered from 5 V, but could have been in a flashlight using, I would guess, 3 AA batteries (2 AA would be under the 3.2 V forward voltage, but 4.5 V would be enough). In that case, if that SMD resistor was chosen for a 4.5 V supply, it would need to be about 2.17 ohms to set 25 mA per LED (600 mA total), and running it at 5 V would have pushed roughly 830 mA total (about 34.5 mA per LED). But I don't think the LEDs themselves were the issue, as that would have cooked a few of them and the smoke would have come from them. Instead, the smoke came from part of the board: either a solder joint, the SMD resistor (which I guess would burn open, not short), or the insulation on my thin wires. Either way, it was running HOT and too much current was being drawn... yet it survived even after 10 seconds of light smoke (a "wisp"), and I continued to use it once I added the 220 ohm resistor.
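Here is that 4.5 V scenario worked through the same way (still the idealized constant 3.2 V forward drop, so the real numbers would differ, but it reproduces my 2.17 ohm and roughly 830 mA figures):

```python
# If the on-board SMD resistor was sized for a 4.5 V (3x AA) supply,
# what happens when the same board is fed 5 V instead?
# Same idealized assumptions: constant 3.2 V forward drop, 25 mA per LED, 24 LEDs.
V_FORWARD, I_LED, N_LEDS = 3.2, 0.025, 24

r_design = (4.5 - V_FORWARD) / (N_LEDS * I_LED)   # ~2.17 ohm for 600 mA at 4.5 V
i_at_5v = (5.0 - V_FORWARD) / r_design            # ~0.83 A total when fed 5 V
print(f"Design resistor for 4.5 V: {r_design:.2f} ohm")
print(f"Total current at 5 V     : {i_at_5v * 1000:.0f} mA "
      f"(~{i_at_5v / N_LEDS * 1000:.1f} mA per LED)")
```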
Having said the above... I can't understand how the LEDs still turn on after adding the 220 ohm resistor. Calculating a single resistor for 24 parallel LEDs, that would give 5 - 3.2 = 1.8 V across 220 ohms, and V = IR, so I = V/R = 1.8 V / 220 ohm, which is about 8.2 mA?

And that is for the entire parallel array? Then that current gets divided across 24 LEDs, so each one only gets a fraction of a milliamp? How is that possible? I know my math is wrong here somewhere, I just can't figure out where!
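These are the numbers behind that last question, done the same idealized way, which is where things stop making sense to me:

```python
# The part that confuses me: the same constant-Vf arithmetic with the added 220 ohm resistor.
V_SUPPLY, V_FORWARD, N_LEDS, R_ADDED = 5.0, 3.2, 24, 220.0

i_total = (V_SUPPLY - V_FORWARD) / R_ADDED   # ~8.2 mA for the whole array (if Vf really stayed at 3.2 V)
i_per_led = i_total / N_LEDS                 # ~0.34 mA each, yet they visibly light up
print(f"Total current: {i_total * 1000:.2f} mA")
print(f"Per LED      : {i_per_led * 1000:.3f} mA")
```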
