Author Topic: Why do electronics only consume the Amps they need?  (Read 7018 times)


Offline Ace FrahmTopic starter

  • Newbie
  • Posts: 5
Why do electronics only consume the Amps they need?
« on: December 04, 2012, 01:49:19 am »
Simple, basic questions that NONE of the beginner books ever explain:

Why do electronics only consume the Amps they need?

How do you determine how powerful a power source can be before it will "blow up" a circuit?

If a resistor protects a circuit by reducing current, isn't it wasting lots of power as heat?

Is there a miserly power supply design (with a good solid state regulator?) that gives a circuit only the power it requires while wasting little to nothing as heat?
 

Offline GK

  • Super Contributor
  • ***
  • Posts: 2607
  • Country: au
Re: Why do electronics only consume the Amps they need?
« Reply #1 on: December 04, 2012, 02:11:04 am »
Simple, basic questions that NONE of the beginner books ever explain:


Ummm.... for a start you should try to find one that covers Ohm's law.
Bzzzzt. No longer care, over this forum shit.........ZZzzzzzzzzzzzzzzz
 

Offline Ace FrahmTopic starter

  • Newbie
  • Posts: 5
Re: Why do electronics only consume the Amps they need?
« Reply #2 on: December 04, 2012, 02:17:56 am »
Ohm's law is not sufficient to answer these questions.
 

Offline GK

  • Super Contributor
  • ***
  • Posts: 2607
  • Country: au
Re: Why do electronics only consume the Amps they need?
« Reply #3 on: December 04, 2012, 02:22:50 am »
LOL, I said "for a start". Since you are clearly up to speed on Ohm's law, you should be able to understand why a 20W lamp doesn't blow up when plugged into a wall socket also capable of powering a 2400W heater.
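To put numbers on that: a quick Python sketch, assuming 240V mains and treating both loads as ideal resistances (real lamps only approximate this):

Code:
# Why a 20W lamp and a 2400W heater can share the same 240V socket.
V = 240.0  # assumed mains voltage (volts)

for name, power in [("20W lamp", 20.0), ("2400W heater", 2400.0)]:
    R = V**2 / power  # resistance implied by the rated power: R = V^2 / P
    I = V / R         # current actually drawn: I = V / R
    print(f"{name}: R = {R:.0f} ohm, draws {I:.3f} A")

# The socket can SUPPLY 10A or so, but each load's own resistance
# decides how much of that capability actually gets used.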
 
Bzzzzt. No longer care, over this forum shit.........ZZzzzzzzzzzzzzzzz
 

Offline IanB

  • Super Contributor
  • ***
  • Posts: 11882
  • Country: us
Re: Why do electronics only consume the Amps they need?
« Reply #4 on: December 04, 2012, 02:33:20 am »
Simple, basic questions that NONE of the beginner books ever explain:

Why do electronics only consume the Amps they need?

For the same reason your car only consumes the amount of gasoline it needs.

Quote
How do you determine how powerful a power source can be before it will "blow up" a circuit?

Circuits are designed not to blow up when provided with the correct power source. If you give a circuit the wrong power source you cannot rely on it working correctly.

Quote
If a resistor protects a circuit by reducing current, isn't it wasting lots of power as heat?

This is not a normal use for a resistor, but yes, if current is passing through a resistor it is converting electrical energy to heat energy. Sometimes this is desired and is not "waste"--for example, an electric oven or heater. Even if a resistor is being used as a voltage dropper, it may not be "lots" of power; it may only be "some" or "just a little".

Quote
Is there a miserly power supply design (with a good solid state regulator?) that gives a circuit only the power it requires while wasting little to nothing as heat?

All well designed power supplies aspire to this. With "green energy" demands becoming common, power supplies are getting better and better at doing so.


Basic electronics books do explain this. It starts with understanding the difference between voltage and current. You should not blame the books for your lack of understanding.
 

Offline EEVblog

  • Administrator
  • *****
  • Posts: 37738
  • Country: au
    • EEVblog
Re: Why do electronics only consume the Amps they need?
« Reply #5 on: December 04, 2012, 02:56:55 am »
Simple, basic questions that NONE of the beginner books ever explain:
Why do electronics only consume the Amps they need?

Because most things are powered from a voltage source, and most loads are (effectively) a constant resistance (or constant power).
Given that, Ohm's law gives you the answer to why a circuit draws a certain current and nothing more.

Quote
How do you determine how powerful a power source can be before it will "blow up" a circuit?

Depends entirely on the circuit and its maximum design ratings (usually a maximum voltage).
How "powerful" a power supply is is determined by its current capability. Note the word capability there.
A power supply can be infinitely "powerful" and still not damage a delicate circuit that requires only a milliamp, because that power is only a capability to provide energy.
12V across a 12K resistor will still push only 1mA of current, regardless of whether the 12V comes from a tiny remote-control battery or a huge truck battery capable of thousands of amps.
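A minimal Python sketch of that example; the resistor, not the supply's current capability, sets the current:

Code:
# Ohm's law: the 12K load draws the same 1mA from any 12V source.
V = 12.0       # both supplies are 12V sources
R = 12_000.0   # 12K resistor

I = V / R      # I = V / R
print(f"Current drawn: {I * 1000:.1f} mA")  # -> 1.0 mA, truck battery or not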

Quote
If a resistor protects a circuit by reducing current, isn't it wasting lots of power as heat?

It can be, yes; an LED dropper resistor, for example.
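For a sense of scale, here is a rough Python sketch of a typical dropper calculation. The 5V supply, ~2V forward drop, and 10mA target are assumed example values, not from any particular part:

Code:
# How much power an LED dropper resistor "wastes" as heat.
V_supply = 5.0    # assumed supply voltage (volts)
V_f      = 2.0    # assumed LED forward voltage (volts)
I_led    = 0.010  # assumed target LED current (10mA)

R = (V_supply - V_f) / I_led           # dropper value: R = (Vs - Vf) / I
P_resistor = (V_supply - V_f) * I_led  # heat in the resistor
P_led = V_f * I_led                    # power delivered to the LED

print(f"R = {R:.0f} ohm")                                 # 300 ohm
print(f"Resistor dissipates {P_resistor * 1000:.0f} mW")  # 30 mW
print(f"LED gets {P_led * 1000:.0f} mW")                  # 20 mW
# 30mW of "waste" heat is real, but hardly "lots" at these currents.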

Quote
Is there a miserly power supply design (with a good solid state regulator?) that gives a circuit only the power it requires while wasting little to nothing as heat?

Most circuits will only take the power they require, and nothing more.
Some power sources are close to 100% efficient and lose almost nothing as heat, like the 12V truck battery example above.
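To put rough numbers on the "miserly supply" idea: a Python sketch comparing a linear regulator with a switching one. The 12V in / 5V out / 1A load figures and the 90% switcher efficiency are assumed ballpark values, not from any particular part:

Code:
# Where the "wasted" power goes in two common regulator types.
V_in, V_out, I_load = 12.0, 5.0, 1.0  # assumed volts in/out and load amps

# Linear regulator: drops (V_in - V_out) across itself at the full load current.
P_linear_loss = (V_in - V_out) * I_load  # 7W of heat
eff_linear = V_out / V_in                # ~42% efficient at best

# Switching regulator: assume 90% efficiency (a typical ballpark).
P_out = V_out * I_load                   # 5W delivered to the load
P_switch_loss = P_out / 0.90 - P_out     # ~0.56W of heat

print(f"Linear:   {P_linear_loss:.2f} W lost ({eff_linear:.0%} efficient)")
print(f"Switcher: {P_switch_loss:.2f} W lost (90% assumed)")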

Dave.
 

Offline EEVblog

  • Administrator
  • *****
  • Posts: 37738
  • Country: au
    • EEVblog
Re: Why do electronics only consume the Amps they need?
« Reply #6 on: December 04, 2012, 02:58:34 am »
Ohm's law is not sufficient to answer these questions.

Yes, it is. You just have to understand the difference between voltage and current.

Dave.
 

Offline MikeK

  • Super Contributor
  • ***
  • Posts: 1314
  • Country: us
Re: Why do electronics only consume the Amps they need?
« Reply #7 on: December 04, 2012, 03:25:32 am »
Why do electronics only consume the Amps they need?

Connect an LED to a battery.  It will consume far more than it "needs", to the point that the LED burns out.  I think the "need" view is a confusing way to look at it.
 

Offline Ace FrahmTopic starter

  • Newbie
  • Posts: 5
Re: Why do electronics only consume the Amps they need?
« Reply #8 on: December 04, 2012, 03:52:57 am »
   OOOO      OOOO
  OO  OO    OO   OO
   OOQQ      OOQQ

 Q             \       
QQ          __|           Q
                              QQ
 
        _mMMMMMm
                 --

 
Q
Q                                Q
                                   QQ





 Q                         
 QQ




                                Q
                               QQ

 

Offline Shuggsy

  • Regular Contributor
  • *
  • Posts: 56
  • Country: us
Re: Why do electronics only consume the Amps they need?
« Reply #9 on: December 04, 2012, 03:55:10 am »
Don't get too discouraged with the answers you see here... it's all about getting the right perspective on how electronics operate. They aren't magic (despite the aptly named "magic electronics smoke" that causes all electronics to work ::)), so there are laws we've discovered over time to describe how circuits work. Or, really, how electrons/charge flow through certain materials. Ohm's law describes this, but it takes some insight and perspective to understand how the law applies. Moving from the world of formulas and theory to real-world meanings and implementations can be challenging. It was for me.

Why do electronics only consume the Amps they need?

Connect an LED to a battery.  It will consume far more than it "needs", to the point that the LED burns out.  I think the "need" view is a confusing way to look at it.

Indeed. Circuits will consume exactly the power defined by how they are built. Physics doesn't change. In the example with the LED here, it's built as a diode that dissipates some energy as light energy (photons), but it is still a diode. When the current going through the LED exceeds what the package can physically take, it'll blow -- LEDs aren't generally built with any current limiting or protection. That "extra" functionality is up to you as a circuit designer... an easy or straightforward way to limit the current is with a resistor as also previously mentioned. The energy has to go somewhere though, so in that simple LED + resistor circuit there is some energy dissipated in the resistor as heat.

Seeing as physics doesn't change, that's also why many circuits are designed with protection such as fuses, MOVs, reverse-protection diodes, current sensing and limiting, temperature sensing and overheating protection, and so on. All of this is so that when nature tosses your circuit something you don't necessarily intend but MIGHT happen, you can protect it as well as possible. Again, the circuit will draw whatever current/power/energy the components demand as built/implemented. You can build in all the protection you could ever think of, have what you feel is the best schematic ever... but accidentally putting a small wire directly between your input power point and ground when you actually build the circuit will be a race against time between that small wire exploding due to massive overheating from the current it's drawing (assuming your power source can supply that massive current!) and whatever protection (if any) is built into your power source. This is why homes are wired through circuit-breakers. ;)
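To see why that race is so short, here is a rough Python sketch of the accidental-wire scenario. The 10cm of 24 AWG copper across a stiff 12V source is an assumed example:

Code:
# Even a "good conductor" has SOME resistance, so I = V / R_wire gets huge.
rho_copper = 1.68e-8  # resistivity of copper (ohm*metre)
length     = 0.10     # assumed 10cm of stray wire (metres)
area       = 2.05e-7  # 24 AWG cross-section, ~0.205 mm^2 (metres^2)

R_wire  = rho_copper * length / area  # R = rho * L / A  (~8 milliohm)
I_short = 12.0 / R_wire               # if the source could actually supply it
P_wire  = I_short**2 * R_wire         # power dumped into the wire

print(f"R_wire = {R_wire * 1000:.1f} milliohm")
print(f"I = {I_short:.0f} A, P = {P_wire / 1000:.1f} kW in a bit of wire!")
# In practice the source's own limits (or a fuse) give out first.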

There are many, many other examples, but the best answer I can give you is that electronics consume the amps they draw by definition of how they are built. Circuit building blocks, from simple wires to complex semiconductors to massive ICs, are all about manipulating charge/electrons as nature has defined. It's up to the circuit designer to do interesting things with the flow of electrons. :)
« Last Edit: December 04, 2012, 04:02:12 am by Shuggsy »
 

Offline Ace FrahmTopic starter

  • Newbie
  • Posts: 5
Re: Why do electronics only consume the Amps they need?
« Reply #10 on: December 04, 2012, 07:32:47 am »
I see that a lone LED would offer no resistance and thus allow all current to flow, get hot, and then fail.
I figure that a carbon resistor works because the element carbon is NOT an excellent conductor, so it can limit current.

How in the world does a metal wire-wound resistor resist, if the metal inside is a good conductor?
Why doesn't it get hot and fail like a diode?

How in the world am I supposed to know from Ohm's Law: V=IR that a lone resistor across a mains line will or will not blow up?  Let's say it's 120V AC, 60Hz, and a 1 ohm resistor.  That implies 120 amps are flowing through the resistor, right?  But how is that sufficient to know whether or not the part will burn up?
« Last Edit: December 04, 2012, 07:35:16 am by Ace Frahm »
 

Offline Skimask

  • Super Contributor
  • ***
  • Posts: 1433
  • Country: us
Re: Why do electronics only consume the Amps they need?
« Reply #11 on: December 04, 2012, 07:43:58 am »
How in the world does a metal wire-wound resistor resist, if the metal inside is a good conductor?
Why doesn't it get hot and fail like a diode?
The wire may or may not melt, depending on its resistance to the flow of electricity, which, if you're thinking about a chunk of copper wire, is largely determined by its gauge.  Small wire...high-ish resistance...lots of work done pushing the current through it....burns up.  Large wire....low-ish resistance....not a lot of work done pushing the current through it....doesn't burn up.
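A quick Python sketch of the gauge effect; the two wire sizes are picked arbitrarily for illustration:

Code:
# Same current through thin vs thick copper: the thin wire heats far more.
rho = 1.68e-8  # copper resistivity (ohm*metre)
I   = 10.0     # assumed current forced through each wire (amps)

wires = {"30 AWG (thin)":  5.07e-8,   # cross-section ~0.0507 mm^2, in m^2
         "10 AWG (thick)": 5.26e-6}   # cross-section ~5.26 mm^2, in m^2

for name, area in wires.items():
    R_per_m = rho / area      # resistance per metre: R = rho / A
    P_per_m = I**2 * R_per_m  # heat per metre: P = I^2 * R
    print(f"{name}: {R_per_m * 1000:.1f} milliohm/m -> {P_per_m:.2f} W/m")
# ~33 W/m in the thin wire (burns up) vs ~0.3 W/m in the thick one.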

Quote
How in the world am I supposed to know from Ohm's Law: V=IR that a lone resistor across a mains line will or will not blow up?  Let's say it's 120V AC, 60Hz, and a 1 ohm resistor.  That implies 120 amps are flowing through the resistor, right?  But how is that sufficient to know whether or not the part will burn up?
Knowing the voltage and current going into the part alone is not sufficient to know whether or not your part will burn up.
You also have to know the voltage, current, and power ratings of the device you're dealing with.
In this instance, if you've got a resistor valued at 1 ohm, rated to handle 120V (likely more), AND 120 amps (likely more), AND, most importantly, 14,400 watts (volts times amps = watts, more Ohm's law)....only then, if all of those conditions are met, will the part survive.  (OK, it may survive with lower ratings, but I wouldn't bet on it.)  So, in your example, you need a resistor capable of handling almost 15,000 watts.

Similarly, if you take a 1 megohm (1,000,000 ohm) resistor across that same 120 volts, current flow will be limited to about 0.00012 amps, or 0.12 milliamps.
Power (watts) = volts times amps.
120 * 0.00012 = 0.0144 watts is dissipated in that resistor.  So, IF the resistor was rated to handle 120 volts, you could put a small 1/8 watt resistor across the 120 volt source and NOT burn it up.
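Both cases, computed the same way in a few lines of Python:

Code:
# P = V * I with I from Ohm's law, for both resistors above.
V = 120.0
for R in (1.0, 1_000_000.0):
    I = V / R  # Ohm's law
    P = V * I  # power dissipated in the resistor
    print(f"R = {R:>9.0f} ohm: I = {I:.5f} A, P = {P:.4f} W")
# 1 ohm  -> 120 A and 14,400 W: needs a monster resistor.
# 1 Mohm -> 0.00012 A and 0.0144 W: a 1/8 watt part loafs along.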

http://lmgtfy.com/?q=ohms+law+wiki
http://lmgtfy.com/?q=Watt+wiki
« Last Edit: December 04, 2012, 07:46:23 am by Skimask »
I didn't take it apart.
I turned it on.

The only stupid question is, well, most of them...

Save a fuse...Blow an electrician.
 

Offline amyk

  • Super Contributor
  • ***
  • Posts: 8270
Re: Why do electronics only consume the Amps they need?
« Reply #12 on: December 04, 2012, 08:47:56 am »
I see that a lone LED would offer no resistance and thus allow all current to flow, get hot, and then fail.
I figure that a carbon resistor works because the element carbon is NOT an excellent conductor, so it can limit current.

How in the world does a metal wire-wound resistor resist, if the metal inside is a good conductor?
Why doesn't it get hot and fail like a diode?

How in the world am I supposed to know from Ohm's Law: V=IR that a lone resistor across a mains line will or will not blow up?  Let's say it's 120V AC, 60Hz, and a 1 ohm resistor.  That implies 120 amps are flowing through the resistor, right?  But how is that sufficient to know whether or not the part will burn up?
The short answer is, all conductors (except superconductors, but you probably won't be working with those) have resistance, and that, along with the voltage, determines how much current will flow. Because work is done in pushing the electrons through a non-perfect conductor, energy is dissipated, almost all of it in the form of heat. The energy dissipated in the resistor raises its temperature, and the resistor also conducts heat to its surroundings; an equilibrium forms between the energy entering and leaving it at some steady-state temperature. Resistors and other parts are rated for a specific power dissipation, which is the power they can dissipate continuously without reaching self-destructive temperatures.
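That equilibrium can be sketched in Python with a simple thermal-resistance model. The 200 degC/W figure is an assumed example; real parts quote their own (often rough) number in the datasheet:

Code:
# Steady-state temperature: dissipated power flows out through a
# "thermal resistance" to ambient, so delta-T = P * theta.
P        = 0.125  # watts dissipated in the part
theta_ja = 200.0  # assumed thermal resistance to ambient (degC per watt)
T_amb    = 25.0   # ambient temperature (degC)

T_part = T_amb + P * theta_ja
print(f"Body settles at about {T_part:.0f} degC")  # ~50 degC
# The power *rating* is the dissipation at which this equilibrium
# still sits safely below the part's self-destructive temperature.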
 

Offline EEVblog

  • Administrator
  • *****
  • Posts: 37738
  • Country: au
    • EEVblog
Re: Why do electronics only consume the Amps they need?
« Reply #13 on: December 04, 2012, 09:36:07 am »
I see that a lone LED would offer no resistance and thus allow all current to flow, get hot, and then fail.

The LED example is confusing because LEDs are NOT voltage-operated like the majority of other devices (e.g. chips); they are current-operated.
You cannot just connect a voltage directly across an LED; that is incorrect usage of the part, and as a result it will not operate as intended and/or will die.
Forget the LED example until you understand how voltage-operated devices work and "take only the current they require".

Quote
I figure that a carbon resistor works because the element carbon is NOT an excellent conductor, so it can limit current.

LEDs are not excellent conductors either; hence the name "semiconductor".

Quote
How in the world does a metal wire-wound resistor resist, if the metal inside is a good conductor?
Why doesn't it get hot and fail like a diode?

Wirewound resistors are not wound with normal (good conductor) copper wire; they usually use nichrome, which has a far higher resistivity.
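A rough Python comparison of the two materials; the wire dimensions are assumed, the resistivity figures are textbook values:

Code:
# Same piece of wire, two materials: why wirewound resistors use nichrome.
length, area = 1.0, 1.0e-7  # assumed 1 m of 0.1 mm^2 wire (SI units)

resistivity = {"copper": 1.68e-8, "nichrome": 1.10e-6}  # ohm*metre

for metal, rho in resistivity.items():
    R = rho * length / area  # R = rho * L / A
    print(f"{metal}: {R:.2f} ohm for this metre of wire")
# copper:   ~0.17 ohm -- nearly a short circuit
# nichrome: ~11 ohm   -- a usable resistance in a small package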

Quote
How in the world am I supposed to know from Ohm's Law: V=IR that a lone resistor across a mains line will or will not blow up?  Let's say it's 120V AC, 60Hz, and a 1 ohm resistor.  That implies 120 amps are flowing through the resistor, right?  But how is that sufficient to know whether or not the part will burn up?

This is where "power dissipation" and the datasheet power ratings for resistors come in.
P=V*I
P=V^2 / R
P=I^2 * R
(although these are technically forms of Joule's law, they are grouped with and taught as a basic part of Ohm's law)

So a typical 1/4W resistor will only tolerate 0.25W of power before it gets too hot and fails.
So (120V)^2 / 1 ohm = 14,400W !!
That poor 1/4W 1ohm resistor will explode with much fanfare  :-+
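A few lines of Python showing that all three forms give the same answer for this example:

Code:
# The three power expressions are one law, rearranged via Ohm's law.
V, R = 120.0, 1.0
I = V / R        # 120 A through the 1 ohm resistor

print(V * I)     # P = V * I    -> 14400.0
print(V**2 / R)  # P = V^2 / R  -> 14400.0
print(I**2 * R)  # P = I^2 * R  -> 14400.0
# 14,400 W: some 57,600 times what a 1/4W part can shed as heat.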

Maybe this will help a bit, starting at 5:40:
[embedded video]
Dave.
 


Offline T4P

  • Super Contributor
  • ***
  • Posts: 3697
  • Country: sg
    • T4P
Re: Why do electronics only consume the Amps they need?
« Reply #15 on: December 04, 2012, 09:46:23 am »
The laws of physics, more specifically the conservation of energy.
The total amount of energy remains constant; hence you are not able to consume 2200W when you are only burning up 20W.
Simply? Ohm's law.
 

Offline Ace FrahmTopic starter

  • Newbie
  • Posts: 5
Re: Why do electronics only consume the Amps they need?
« Reply #16 on: December 04, 2012, 10:34:52 am »
Wow, thank you!
 

