I don't see how this question makes any sense.
You have plenty of 15 A outlets in your house, but if you plug a USB phone charger into one of them it doesn't draw 15 A; it draws just what it needs (a small fraction of an amp). This is how electricity works: the outlet sets the voltage, the appliance sets the current.
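To put rough numbers on that, here is a minimal sketch; the 10 W charger and 120 V mains figures are illustrative assumptions, not from the thread:

```python
# Current drawn from the wall by a small load: I = P / V.
# Assumes a 120 V outlet and a ~10 W phone charger (illustrative numbers).
outlet_voltage = 120.0  # volts, fixed by the outlet
charger_power = 10.0    # watts, set by the appliance

current_draw = charger_power / outlet_voltage
print(f"Charger draws about {current_draw:.3f} A")  # ~0.083 A, far below the 15 A rating
```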
I kind of recognized that was happening, but I don't understand why. I'll put some research into Ohm's law.
This is a common question that crops up, and I don't know why. It's always seemed obvious to me, because the voltage and the resistance of the load are what determine the current. I've considered it similar to asking, "I have a jack capable of lifting 2000 kg. Can I use it to lift a 200 kg engine?" To be clear: I'm not attacking the original poster and others who ask it, but the quality of education they've received.
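For anyone chasing the Ohm's law point concretely, a quick sketch; the 48-ohm load resistance is an assumed value for illustration:

```python
# Ohm's law: the load's resistance, together with the outlet voltage,
# fixes the current. The outlet's 15 A rating never enters the equation.
outlet_voltage = 120.0  # volts
load_resistance = 48.0  # ohms, e.g. a small resistive heater (assumed value)

current = outlet_voltage / load_resistance
print(f"Load draws {current:.1f} A")  # 2.5 A, regardless of the breaker size
```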
My high school education was kind of similar to a minimum-security prison sentence.
I had physical science classes in grades 9-10, but it was taught by the most boring, monotone teacher on earth, with zero passion for his job. I mostly just ignored everything and still passed. I regret that now, of course.
You should be able to run a 20 A tool on a 30 A outlet without any problem. If not, how could you use a 1 A device on a 15 A outlet?
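That rule boils down to a single comparison; a hedged sketch (the function name and values are mine, not from the thread):

```python
def tool_fits_outlet(tool_rated_amps: float, circuit_rated_amps: float) -> bool:
    """A tool is fine as long as it draws no more than the circuit can supply.
    The reverse (tool rating above the circuit rating) trips the breaker under load."""
    return tool_rated_amps <= circuit_rated_amps

print(tool_fits_outlet(20.0, 30.0))  # True: 20 A tool on a 30 A circuit is fine
print(tool_fits_outlet(1.0, 15.0))   # True: same reasoning as a 1 A device on 15 A
print(tool_fits_outlet(30.0, 20.0))  # False: overload; the breaker should trip
```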
Yeah, just needed to make sure I wasn't overlooking anything. Don't want to destroy expensive equipment.
In the US, I was taught Ohm's Law formally in my 8th-grade electricity shop class, when I was 13 years old.
They probably just teach coding now.
Don't underestimate how easy it is to get through high school. I never failed a test and never studied for any exam except the finals, plus I skipped school all the time and was pretty faded for any classes after lunch break lol.
Appreciate the advice, fellas. I'm confident to move forward now.