Author Topic: Consequences of changing Voltage and Amps question  (Read 1398 times)


Offline figure1a (Topic starter)

  • Newbie
  • Posts: 4
  • Country: us
Consequences of changing Voltage and Amps question
« on: April 18, 2019, 01:08:55 am »
I have limited electronics knowledge; I'm mostly a tinkerer. I have always understood amps to be the measurement of current, and I know that volts x amps = watts. So, for example, if I have a DC project that runs at 20V and is capable of delivering 5A, it can output 100W of power. Here is my question: if we were to alter the power supply so that it still delivers 100W (let's take an extreme, like 5V and 20A), what are the consequences? I am asking because I recently read somewhere that someone described changing a circuit to a slightly lower voltage with more amps, and they warned that the higher current could be bad. It was also a fairly low-voltage DC setup.
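
A quick sanity check of that arithmetic as a tiny Python sketch (just my example numbers from above):

Code: [Select]
# power (W) = voltage (V) x current (A)
def power_watts(volts, amps):
    return volts * amps

print(power_watts(20, 5))   # 100 W
print(power_watts(5, 20))   # also 100 W -- same power, very different current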
 

Offline xavier60

  • Super Contributor
  • ***
  • Posts: 2843
  • Country: au
Re: Consequences of changing Voltage and Amps question
« Reply #1 on: April 18, 2019, 01:27:31 am »
Designing power supplies for higher current is more challenging. What type of design is it?
HP 54645A dso, Fluke 87V dmm,  Agilent U8002A psu,  FY6600 function gen,  Brymen BM857S, HAKKO FM-204, New! HAKKO FX-971.
 

Offline Nerull

  • Frequent Contributor
  • **
  • Posts: 694
Re: Consequences of changing Voltage and Amps question
« Reply #2 on: April 18, 2019, 01:31:04 am »
It's not as simple as the output power of the supply; there is also power dissipated in the components of the power supply itself, and that is unlikely to be the same between the two designs.
 
The following users thanked this post: figure1a

Offline figure1a (Topic starter)

  • Newbie
  • Posts: 4
  • Country: us
Re: Consequences of changing Voltage and Amps question
« Reply #3 on: April 18, 2019, 01:33:09 am »
Quote from: xavier60 on April 18, 2019, 01:27:31 am
Designing power supplies for higher current is more challenging. What type of design is it?

It's mostly a hypothetical question, but it was dealing with a low-voltage DC circuit. Aside from a component in the design needing a certain voltage to operate efficiently, is there a difference between 20V/5A and 5V/20A? Safety? Would one run hotter than the other?
 

Offline figure1a (Topic starter)

  • Newbie
  • Posts: 4
  • Country: us
Re: Consequences of changing Voltage and Amps question
« Reply #4 on: April 18, 2019, 01:35:18 am »
Quote from: Nerull on April 18, 2019, 01:31:04 am
It's not as simple as the output power of the supply; there is also power dissipated in the components of the power supply itself, and that is unlikely to be the same between the two designs.
Gotcha. Thanks!
 

Offline radioFlash

  • Regular Contributor
  • *
  • Posts: 163
  • Country: us
Re: Consequences of changing Voltage and Amps question
« Reply #5 on: April 18, 2019, 01:38:28 am »
If you deliver the same amount of power at a lower voltage and higher current (more amps), you will have more losses in the resistance of the wires, since the dissipation (I^2 x R) grows with the square of the current, or you will need thicker wires to compensate for the increased current. This is why power lines run at very high voltages. Of course, whatever is consuming the power usually has its own voltage requirements, so you can't arbitrarily choose the power supply voltage.
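
As a rough illustration, a minimal Python sketch (the 0.05 ohm round-trip wire resistance is just an assumed figure for the example):

Code: [Select]
# Compare I^2 x R wire loss for the same delivered power at two voltages.
def wire_loss(power_w, volts, wire_r=0.05):  # wire_r: assumed round-trip ohms
    amps = power_w / volts        # current needed to deliver this power
    return amps ** 2 * wire_r     # power dissipated in the wire, in watts

print(wire_loss(100, 20))  # 5 A  -> 1.25 W lost in the wire
print(wire_loss(100, 5))   # 20 A -> 20.0 W lost: 16x more at 1/4 the voltage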
 
The following users thanked this post: figure1a

Offline mariush

  • Super Contributor
  • ***
  • Posts: 5055
  • Country: ro
Re: Consequences of changing Voltage and Amps question
« Reply #6 on: April 18, 2019, 01:44:46 am »
If it's a switching power supply, you most likely won't be able to make such a big change. Most switching power supplies have components like the transformer that are custom designed to work best at some output voltage and up to some current... the design may allow for, let's say, 20-30% up or down around the default voltage, so you could perhaps adjust your 20V power supply between 16V and 24V, and maybe it would be able to supply up to 6A.
Any bigger change would require changing the number of windings on the primary and secondary sides of the transformer, changing other parts, and so on.

With classic transformers, it's just easier to design a transformer to output a higher voltage. A 20V AC 5A transformer may output 20V +/- 5%, while a 5V AC 20A transformer may output 5V +/- 10%, and most likely its output voltage at very light loads, say 0.1A, will be much higher, perhaps 7V AC.


Higher voltages are preferred because of losses in transmission: the higher the current, the greater the losses.
You have the standard Ohm's law, V = I x R, or voltage equals current times resistance.

So let's say you have a power supply that can output either 20V 5A or 5V 20A, and you have a 1-meter pair of AWG18 wires between the power supply and your device. AWG18 is the standard thickness of wire used in computer power supplies.

AWG18 wire has a resistance of approximately 21 mOhm (0.021 ohm) per meter, according to Wikipedia: https://en.wikipedia.org/wiki/American_wire_gauge#Tables_of_AWG_wire_sizes
Between the power supply and the device there are two meters of wire (out and back), so the total resistance will be about 42 mOhm.

So, using that formula:

at 20V and 5A, you lose V = 5A x 0.042 ohm = 0.21V in the wires between the PSU and the device, so your device actually "sees" 19.79V
at 5V and 20A, you lose V = 20A x 0.042 ohm = 0.84V in the wires, so your device actually "sees" only 4.16V
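
The same arithmetic as a short Python sketch (using the ~42 mOhm round-trip resistance from above):

Code: [Select]
# Voltage drop over 2 m of AWG18: ~21 mOhm/m, so ~42 mOhm round trip.
R_WIRE = 0.042  # ohms, from the Wikipedia AWG table linked above

for volts, amps in [(20, 5), (5, 20)]:
    drop = amps * R_WIRE  # V = I x R
    print(f"{volts}V/{amps}A: drop {drop:.2f}V, device sees {volts - drop:.2f}V")
# 20V/5A: drop 0.21V, device sees 19.79V
# 5V/20A: drop 0.84V, device sees 4.16V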

So you can see that at high currents there are big losses in the wires. Computer power supplies counteract that by using multiple pairs of wires: for example, a 6-pin PCI-E connector, standardized for 75W of power, uses 3 pairs of wires to transfer 12V at up to 6.25A to a video card (12V x 6.25A = 75W), so each pair of wires only has to carry around 2A of current, making the losses in the wires very small.

Higher current also means you need beefier connectors. For example, you can't use a USB connector for 5V and 20A; you wouldn't even be able to use a basic barrel jack connector, as most can probably only handle around 6-10A max.
You also have to be aware of how connectors handle insertion and removal: when you plug something in at 20A, you can get sparks, and the metal of the contacts can pit from the sparking.
And you have to watch the current rating of connectors. For example, the Mini-Fit Jr. series from Molex (the connectors used for PCI-E on video cards) is rated at 9A per pin; it would not be smart to put 20A through a 2x2 connector of that series, as 20A exceeds the recommended maximum of 2 x 9A = 18A.
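
A quick sanity check of a connector choice against its per-pin rating, as a Python sketch (the 9A figure is the Mini-Fit Jr. rating quoted above; the pin counts are assumptions for the example):

Code: [Select]
# Does the total current fit within the pins that actually carry it?
def connector_ok(total_amps, current_pins, amps_per_pin):
    return total_amps <= current_pins * amps_per_pin

# 2x2 Mini-Fit Jr., assuming 2 of the 4 pins carry the supply current:
print(connector_ok(20, 2, 9))    # False: 20A exceeds the 18A budget
# 6-pin PCI-E: 75W / 12V = 6.25A spread over 3 current-carrying pins:
print(connector_ok(6.25, 3, 9))  # True, with plenty of margin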
 
The following users thanked this post: figure1a

Offline figure1a (Topic starter)

  • Newbie
  • Posts: 4
  • Country: us
Re: Consequences of changing Voltage and Amps question
« Reply #7 on: April 18, 2019, 01:56:26 am »
Quote from: mariush on April 18, 2019, 01:44:46 am
If it's a switching power supply, you most likely won't be able to make such a big change. [...] it would not be smart to put 20A through a 2x2 connector of that series, as 20A exceeds the recommended maximum of 2 x 9A = 18A.

Thanks for all this info. A couple of questions: in a couple of places you mention how many amps certain connectors can handle, but does the total wattage matter? How do they rate a connector at 10A? Wouldn't it matter whether it was 1V/10A vs 1000V/10A?
 

Offline gcewing

  • Regular Contributor
  • *
  • Posts: 197
  • Country: nz
Re: Consequences of changing Voltage and Amps question
« Reply #8 on: April 18, 2019, 02:07:16 am »
If a connector is rated at 10A, it will be because someone has worked out that this is how much current it can pass before it heats up too much, due to the resistance of the contacts.

A connector also has a voltage rating, which is the maximum voltage it can withstand between contacts before some kind of insulation breakdown occurs.

The voltage and current ratings are completely independent. A connector rated for 1V is extremely unlikely to withstand 1000V, even if the current rating is the same.
 
The following users thanked this post: figure1a

Offline Shock

  • Super Contributor
  • ***
  • Posts: 4232
  • Country: au
Re: Consequences of changing Voltage and Amps question
« Reply #9 on: April 18, 2019, 11:15:41 am »
Quote from: figure1a on April 18, 2019, 01:56:26 am
Thanks for all this info. A couple of questions: in a couple of places you mention how many amps certain connectors can handle, but does the total wattage matter? How do they rate a connector at 10A? Wouldn't it matter whether it was 1V/10A vs 1000V/10A?

Components that heat up are likely to specify a maximum power rating, as that lets you easily identify and choose a higher-rated part.

With, say, a piece of insulated wire, you may have a voltage rating, a current rating, and a temperature rating (among others). The current rating is closely linked to the composition and diameter of the conductor, and as you approach the maximum current, heat becomes a factor: you would not want to exceed the wire's maximum operating temperature and break down the insulation. The voltage rating, for the same reason, marks the threshold beyond which the insulation is compromised.

So while you might not have a power rating, you have other ratings that signify important limits, and these should be documented in the component's datasheet.

Another important factor in choosing components is derating: deliberately selecting a component that exceeds the design specification. This is done to prevent early failures and to give the product additional overhead, robustness, and safety.
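
As a sketch of the idea (the 80% factor here is just a common rule of thumb I'm assuming, not something from a particular datasheet):

Code: [Select]
# Derate a datasheet maximum to leave design margin.
def derated(max_rating, factor=0.8):  # 0.8 is an assumed rule-of-thumb factor
    return max_rating * factor

print(derated(10))  # treat a "10A" connector as good for 8A in the design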
Soldering/Rework: Pace ADS200, Pace MBT350
Multimeters: Fluke 189, 87V, 117, 112   >>> WANTED STUFF <<<
Oszilloskopen: Lecroy 9314, Phillips PM3065, Tektronix 2215a, 314
 
The following users thanked this post: figure1a

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: Consequences of changing Voltage and Amps question
« Reply #10 on: April 19, 2019, 07:06:54 pm »
As long as you stay within the allowable maximums for both voltage and current, increasing the voltage will increase the amount of power a connector can handle. If it's rated at, say, 5A and 50V, then you will be limited to 25W at 5V, but the same connector could carry 250W at 50V. Exceeding the current rating will cause it to overheat; exceeding the voltage rating may cause it to flash over or arc when disconnecting, or it may simply be unsafe. For example, when dealing with voltages higher than 48V or so, one has to make sure that live contacts cannot be easily touched, and when you get into hundreds or thousands of volts, arcing and corona start to become a concern.
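
In sketch form (Python, using the example 5A / 50V ratings above):

Code: [Select]
# Max power through a connector, staying within BOTH independent ratings.
def max_power(volts_applied, amp_rating, volt_rating):
    if volts_applied > volt_rating:
        raise ValueError("exceeds the connector's voltage rating")
    return volts_applied * amp_rating  # limited by the current rating

print(max_power(5, 5, 50))   # 25 W at 5 V
print(max_power(50, 5, 50))  # 250 W at 50 V through the same connector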
 
The following users thanked this post: figure1a

