Author Topic: Question about voltage vs current  (Read 3054 times)


Offline gkmaiaTopic starter

  • Frequent Contributor
  • **
  • Posts: 528
  • Country: nz
  • Electronics Hobbyist
Question about voltage vs current
« on: January 22, 2019, 11:05:58 pm »
I have a piece of equipment that takes 25 W of power to run.

If I run it from 220 V AC, it is expected to draw 0.11 A.
If I run it from 12 V DC, it is expected to draw 2.08 A.

I based the current consumption on I = P/V.

Is that correct? The higher the voltage, the less current it needs to run?
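Here is the quick Python sketch I used for those numbers (assuming the 25 W nameplate figure is all real power, which may not hold in practice):

Code: [Select]
# Current implied by a fixed power rating: I = P / V
def current_draw(power_w, volts):
    return power_w / volts

print(current_draw(25, 220))  # ~0.114 A on 220 V AC
print(current_draw(25, 12))   # ~2.083 A on 12 V DC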
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: us
Re: Question about voltage vs current
« Reply #1 on: January 22, 2019, 11:16:13 pm »
Yes, your solution is correct.
 

Offline MrAl

  • Super Contributor
  • ***
  • Posts: 1429
Re: Question about voltage vs current
« Reply #2 on: January 22, 2019, 11:56:22 pm »
Quote from: gkmaia on January 22, 2019, 11:05:58 pm
I have a piece of equipment that takes 25 W of power to run.

If I run it from 220 V AC, it is expected to draw 0.11 A.
If I run it from 12 V DC, it is expected to draw 2.08 A.

I based the current consumption on I = P/V.

Is that correct? The higher the voltage, the less current it needs to run?

Hi,

As long as the load is purely resistive, yes. If it is reactive or partly reactive, then you have to take into account the phase angle between the current and the voltage, or know the power factor:
P = V * I * cos(θ)
where θ (theta) is the phase angle between current and voltage.
The apparent power is:
S = V * I (in volt-amperes)

and that may be of concern as well.
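A small numeric sketch of that distinction in Python (the 0.7 power factor is purely an assumed example, not a figure for any particular equipment):

Code: [Select]
import math

V = 220.0              # RMS supply voltage
I = 0.162              # RMS current, chosen so real power lands near 25 W
pf = 0.7               # assumed power factor, cos(theta)
theta = math.acos(pf)  # phase angle between voltage and current

real_power = V * I * math.cos(theta)  # watts the load actually dissipates
apparent_power = V * I                # volt-amperes the supply must deliver

print(f"real power:     {real_power:.1f} W")       # ~24.9 W
print(f"apparent power: {apparent_power:.1f} VA")  # ~35.6 VA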
 

Offline helius

  • Super Contributor
  • ***
  • Posts: 3639
  • Country: us
Re: Question about voltage vs current
« Reply #3 on: January 22, 2019, 11:58:33 pm »
There is no equipment that simply uses less current when supplied with a higher voltage. The usual result is that it burns out when the supply voltage is too high (https://www.eevblog.com/forum/blog/eevblog-1160-weller-responds/), or cannot function if it is too low.
Note that Ohm's Law (V = IR) makes V and I directly proportional. That is the opposite of the relationship in the definition of power (I = P/V), where, at constant power, V and I are inversely proportional. In these two equations, the quantities that are usually fixed are V (by the supply wiring) and R (by the equipment); I and P are the unknowns. As V increases, I increases proportionally and P increases quadratically.
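To make the fixed-resistance case concrete, here is a quick sketch using the 2000 ohms the OP's 220 V figures imply (220 V / 0.11 A; the 0.11 A was rounded, hence 24.2 W rather than exactly 25 W):

Code: [Select]
R = 2000.0  # ohms, fixed by the equipment
for V in (12.0, 110.0, 220.0):
    I = V / R  # current grows linearly with voltage
    P = V * I  # power grows with the square of voltage
    print(f"{V:6.1f} V -> {I * 1000:6.2f} mA, {P:6.2f} W")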

There do exist pieces of equipment that automatically sense the supply voltage and adjust themselves within a limited range. So-called "universal switching power supplies" have a label indicating compatibility with 90 to 260 VAC at 48 to 62 Hz. Other machines are capable of running from either AC or DC as long as the RMS voltage is roughly the same (universal motors, where the meaning of "universal" is quite different). But this adaptability is not the norm and should not be assumed.
 

Offline Doctorandus_P

  • Super Contributor
  • ***
  • Posts: 3342
  • Country: nl
Re: Question about voltage vs current
« Reply #4 on: January 23, 2019, 12:11:23 am »
There are plenty of cheap small SMPS modules on AliExpress / eBay / etc., based on, for example, the LM2596.
If you use one to generate a 5 V output and vary the input voltage between 7 V and 30 V, you will see that the input current gets lower at higher input voltages, while the input power stays about the same. You can expect efficiencies between about 80% and 90%, so the input power is only a bit more than the output power.
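That behaviour follows from the converter holding output power roughly constant, so I_in ≈ P_out / (efficiency × V_in). A rough sketch of the trend (the 1 A load and 85% efficiency are assumed example figures):

Code: [Select]
P_out = 5.0 * 1.0  # 5 V output at an assumed 1 A load
eta = 0.85         # assumed efficiency; real modules run roughly 80-90%
for V_in in (7.0, 12.0, 24.0, 30.0):
    I_in = P_out / (eta * V_in)  # input current falls as input voltage rises
    print(f"{V_in:5.1f} V in -> {I_in:.3f} A in, {P_out / eta:.2f} W drawn")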

Just buy a few of these and experiment with them.

These are useful to have in a lot of projects.
I find the versions with an adjustable current limit the most useful.
You can even use one as a simple adjustable lab power supply if you replace the trim pots with regular pots and add panel meters for voltage and current.

 

Offline jeroen79

  • Frequent Contributor
  • **
  • Posts: 529
Re: Question about voltage vs current
« Reply #5 on: January 23, 2019, 01:39:33 am »
Quote from: gkmaia on January 22, 2019, 11:05:58 pm
I have a piece of equipment that takes 25 W of power to run.

If I run it from 220 V AC, it is expected to draw 0.11 A.
If I run it from 12 V DC, it is expected to draw 2.08 A.

I based the current consumption on I = P/V.

Is that correct? The higher the voltage, the less current it needs to run?
It depends on what kind of equipment it is.
Not many things will accept just any kind of power supply, so something made for 220 V AC may not run at all on 12 V DC.

Quote from: helius on January 22, 2019, 11:58:33 pm
There is no equipment that simply uses less current when supplied with a higher voltage.

That depends on the kind of equipment.

A simple resistor will indeed have a linear I/V curve, but equipment like a switching power supply will have an inverse I/V curve, drawing less input current as the input voltage rises.
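A side-by-side sketch of those two behaviours, with illustrative values only:

Code: [Select]
P = 25.0         # watts, for an idealized constant-power (SMPS) load
R = 12.0 / 2.08  # ohms, implied by a 25 W resistive load at 12 V
for V in (10.0, 12.0, 14.0):
    print(f"{V:4.1f} V: resistor draws {V / R:.2f} A, "
          f"constant-power load draws {P / V:.2f} A")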
 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 7585
  • Country: au
Re: Question about voltage vs current
« Reply #6 on: January 23, 2019, 04:47:23 am »
Quote from: gkmaia on January 22, 2019, 11:05:58 pm
I have a piece of equipment that takes 25 W of power to run.

If I run it from 220 V AC, it is expected to draw 0.11 A.
If I run it from 12 V DC, it is expected to draw 2.08 A.

I based the current consumption on I = P/V.

Is that correct? The higher the voltage, the less current it needs to run?

If you have, say, a domestic 220 V incandescent lamp like we used before LED ones became available, and try to operate it from 12 V DC, it won't work at all.
If, on the other hand, you have a 12 V automotive tail lamp and connect it across 220 V AC, it will explode!

Looking at your original figures, 220 V and 0.11 A, and using Ohm's Law in the form R = V/I, we have

R = 220/0.11, so R = 2000 ohms.

Now for the case where V = 12 V: applying this value of R to Ohm's Law in its more usually quoted form, I = V/R, we have I = 12/2000 = 0.006 A, or 6 mA, a far cry from 2.08 A!
By the way, the power will be only 0.072 watts.
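The same working as a couple of lines of Python, if you want to check it:

Code: [Select]
R = 220 / 0.11    # 2000 ohms, implied by the 220 V figures
I_12 = 12 / R     # 0.006 A, i.e. 6 mA at 12 V
P_12 = 12 * I_12  # 0.072 W, nowhere near 25 W
print(f"R = {R:.0f} ohm, I = {I_12 * 1000:.1f} mA, P = {P_12:.3f} W")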

I won't work out the tremendous current that your 12 V device would draw from the 220 V supply, but the lesson is that the same device cannot be used at such widely differing voltages.

Yes, a 25 W device designed for 220 V would draw 0.11 A, and a 25 W device designed for 12 V would draw 2.08 A.

The sting in the tail is that if we want to use the 12 V device on a 220 V supply, or the 220 V device on 12 V, we will need some sort of voltage conversion device, which will not be 100% efficient. So, for instance, your 12 V device plus a conversion device will draw more than 0.11 A from the 220 V supply.

In the other case, the conversion from 12 V to 220 V involves not only a voltage step-up but also a conversion from DC to AC, which is likely to be fairly inefficient, particularly if, as is likely, it switches at 50 Hz rather than the higher frequencies used in most switch-mode supplies. So your total current from the 12 V supply will be somewhat more than 2.08 A.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 12297
  • Country: au
Re: Question about voltage vs current
« Reply #7 on: January 23, 2019, 03:01:23 pm »

Quote from: vk6zgo on January 23, 2019, 04:47:23 am
Yes, a 25 W device designed for 220 V would draw 0.11 A, and a 25 W device designed for 12 V would draw 2.08 A.


To the OP: This sums up the answer to your question in a properly qualified manner.

If we were to take the simplest example of a 25 W resistive heater: for a 220 V supply, the "device" is a resistor with a value of 1936 ohms, whereas for a 12 V supply, it is a resistor with a value of 5.76 ohms.
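Those values drop out of R = V²/P; a quick check:

Code: [Select]
def heater_resistance(volts, watts):
    # R = V^2 / P for a purely resistive load
    return volts ** 2 / watts

print(heater_resistance(220, 25))  # 1936.0 ohms
print(heater_resistance(12, 25))   # 5.76 ohms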

If some of the other answers above were a bit confusing, it is probably because they were talking around the idea of having a single device consume 25 W at either voltage.

I (and others) have taken your question as something more fundamental - perhaps one of the most basic concepts that you need to understand for a productive journey into things electrical.

You are encouraged to ask more questions - and we'll try to be helpful and hopefully not confuse you too much.
« Last Edit: January 23, 2019, 03:03:03 pm by Brumby »
 
The following users thanked this post: tooki

Offline spec

  • Frequent Contributor
  • **
  • Posts: 833
  • Country: england
  • MALE
Re: Question about voltage vs current
« Reply #8 on: January 23, 2019, 05:26:06 pm »
The OP may have a piece of equipment that runs off either the mains or 12 V, which is quite common, so his question and calculations are valid.
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19494
  • Country: gb
  • 0999
Re: Question about voltage vs current
« Reply #9 on: January 23, 2019, 07:28:32 pm »
Quote from: spec on January 23, 2019, 05:26:06 pm
The OP may have a piece of equipment that runs off either the mains or 12 V, which is quite common, so his question and calculations are valid.

Mathematically, yes, but whether it holds practically is another matter. If the device natively uses 12 V but runs off the mains via a switched-mode power supply, expect it to use more power when run on AC. The nameplate might also just give the maximum possible power consumption, or it could be overrated, so take it with a pinch of salt.
 

Offline spec

  • Frequent Contributor
  • **
  • Posts: 833
  • Country: england
  • MALE
Re: Question about voltage vs current
« Reply #10 on: January 23, 2019, 08:38:58 pm »
Quote from: spec on January 23, 2019, 05:26:06 pm
The OP may have a piece of equipment that runs off either the mains or 12 V, which is quite common, so his question and calculations are valid.

Quote from: Zero999 on January 23, 2019, 07:28:32 pm
Mathematically, yes, but whether it holds practically is another matter. If the device natively uses 12 V but runs off the mains via a switched-mode power supply, expect it to use more power when run on AC. The nameplate might also just give the maximum possible power consumption, or it could be overrated, so take it with a pinch of salt.

It varies, but a common scenario is that the equipment runs from 12 V, supplied either from the mains via a transformer/rectifier or an off-line switch-mode PSU, or directly from a 12 V source, typically a battery or a car's 12 V line. I have a radio just like that.

The crux of the matter is that, give or take a watt here or there, the power consumption is the same, which is the point the OP was asking about.
« Last Edit: January 23, 2019, 08:43:22 pm by spec »
 
The following users thanked this post: tooki

