Author Topic: What is the difference in a 24V application vs 12V where power is constant?  (Read 7923 times)


Offline DW1961Topic starter

  • Frequent Contributor
  • **
  • Posts: 753
  • Country: us

I'm trying to understand how the wiring differs between a 12V application and a 24V one, such as a 24V light vs a 12V light. Are they "made" differently, and how?

If a light only uses 12V at 2A of power max, then why would a 24V 1A power supply damage the 12V light?
 

Offline mgwalker95

  • Contributor
  • Posts: 28
  • Country: us
The light might have components that are only rated for 12V. For example, an LED driver might not be able to withstand 24V.
 

Offline DW1961Topic starter

  • Frequent Contributor
  • **
  • Posts: 753
  • Country: us
Quote from: mgwalker95
The light might have components that are only rated for 12V. For example, an LED driver might not be able to withstand 24V.

See, that's what I am not understanding. If the 24V power supply can only supply 1 Amp and the 12V device calls for a maximum of 12V at 2 Amps, why would that harm it when it is getting the same current? The 24V power supply can only push 1A of current. It's the same amount of current for both 12V and 24V. The power is the same.

I understand, I think, why a constant-current 24V supply and a 12V device would be incompatible: at the same amps, the 24V supply overheats the 12V device, giving it twice the power.
 

Offline TimNJ

  • Super Contributor
  • ***
  • Posts: 1720
  • Country: us
Really depends on what kind of device you're talking about. If you're talking about an incandescent lamp, then the simplest answer is that incandescent lights are not constant-power devices by themselves. An incandescent light is more or less just a resistor, albeit a non-linear one. So, a 12V-rated light attached to a 24V power supply wants to draw about twice the current it is rated for, almost certainly destroying it. This assumes the 24V power supply can supply the additional current.

You suggested powering a 24W light bulb rated for 2A @ 12V. This means the filament is about 6 ohms (once hot). What happens when you attach a 24V supply across a 6-ohm "resistor"? For an ideal voltage source, the current should go to 24V / 6 ohms = 4A. But your power supply is only rated for 24V @ 1A.

Two issues:

1. Your power supply is overloaded because the load demands 4A, but it can only supply 1A, so it will probably go into a shut-down protection mode, or it will be damaged.
2. If the power supply could supply 4A, then you'd be running the lamp at 4x the rated power, since 24V * 4A = 96W! That will certainly destroy the lamp.

Two ways to drive a 12V lamp with a "24V" power source:

1. Use a constant voltage/constant current (CV/CC) power supply. These power supplies operate in constant voltage mode up until the load draws a predetermined amount of current. As the output current attempts to go beyond this threshold, the power supply counteracts by reducing its output voltage in order to maintain the set-point current. In this case, you'd need a CV/CC power supply rated for 24V @ 2A. With a 2A set point, the output voltage will be at 12V. Might as well just use a regular 12V power supply.

2. Use PWM to set the average power @ 24W. Since running a 12V lamp @ 24V results in 4x the rated power dissipation, you can run the lamp at 25% (1/4) duty cycle to get the same effective output power. Usually there should be no problem running the filament at a higher voltage (within reason), but I would not advise trying to run a 12V lamp at 240V with PWM, for other reasons.
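To put numbers on this, here is a minimal sketch of the Ohm's-law arithmetic above, assuming the lamp behaves as a fixed 6-ohm resistor once hot (a simplification; real filaments are non-linear):

Code:
# A 24 W lamp rated 2 A @ 12 V, modeled as a fixed 6-ohm hot filament.
R_HOT = 12 / 2                      # ohms, from the 12 V @ 2 A rating

for v_supply in (12, 24):
    i = v_supply / R_HOT            # Ohm's law: I = V / R
    p = v_supply * i                # power: P = V * I
    print(f"{v_supply} V supply: {i:.0f} A, {p:.0f} W")
# 12 V supply: 2 A, 24 W (as rated)
# 24 V supply: 4 A, 96 W (4x rated power)

# PWM workaround: average power = duty cycle * instantaneous power,
# so the duty cycle for a 24 W average at 24 V is 24 / 96 = 25%.
duty = 24 / (24 * (24 / R_HOT))
print(f"Duty cycle for rated average power at 24 V: {duty:.0%}")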
 
The following users thanked this post: DW1961

Offline TimNJ

  • Super Contributor
  • ***
  • Posts: 1720
  • Country: us
Quote from: DW1961
Quote from: mgwalker95
The light might have components that are only rated for 12V. For example, an LED driver might not be able to withstand 24V.

See, that's what I am not understanding. If the 24V power supply can only supply 1 Amp and the 12V device calls for a maximum of 12V at 2 Amps, why would that harm it when it is getting the same current? The 24V power supply can only push 1A of current. It's the same amount of current for both 12V and 24V. The power is the same.

I understand, I think, why a constant-current 24V supply and a 12V device would be incompatible: at the same amps, the 24V supply overheats the 12V device, giving it twice the power.

A typical 24V power supply cannot magically maintain 1A output if there's a load attached to it that wants to draw more than that. So, it will go into a protection mode, not regulate at 1A.
 
The following users thanked this post: DW1961

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
A light doesn't "use" 12V. It will draw the amount of current it needs when fed the voltage it is rated for. A higher voltage will cause it to draw more current than it is designed to, and it will burn out. A device is designed around a specific voltage, and you must give it a voltage within the range it is designed to accept in order for it to work properly. A device designed to accept 24V will consume half the current of one designed to accept 12V, so you can carry more power over the same size wires, switches, etc.


Things get a little more complex in the case of devices that use switchmode converters, as a lot of LED lighting products do. As you increase the input voltage to one of these, the current it draws will drop as it maintains the same power; however, if the voltage goes higher than the components can tolerate, something will fry.
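To illustrate, a minimal sketch of that constant-power behavior, assuming an idealized, lossless converter and a made-up 24 W load:

Code:
# An idealized constant-power load, e.g. an LED light with a
# switchmode driver, drawing a fixed 24 W (illustrative figure).
P_LOAD = 24.0   # watts

for v_in in (12, 24, 48):
    i_in = P_LOAD / v_in            # input current falls as voltage rises
    print(f"{v_in} V in -> {i_in:.2f} A drawn")
# Valid only up to whatever voltage the converter's parts can tolerate.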
« Last Edit: July 22, 2020, 04:11:18 am by james_s »
 
The following users thanked this post: DW1961

Offline Tom45

  • Frequent Contributor
  • **
  • Posts: 556
  • Country: us
Quote from: TimNJ
A typical 24V power supply cannot magically maintain 1A output if there's a load attached to it that wants to draw more than that. So, it will go into a protection mode, not regulate at 1A.

And the converse is true too.

If the load is drawing 1 amp from a 24 volt supply, then that load is 24 ohms.

If the load was 240 ohms, the current from the 1 amp rated 24 volt supply would only be 0.1 amps.

The 1 amp rating is the maximum the power supply can provide to a load without getting into trouble. But that doesn't mean that it is always going to be providing 1 amp no matter what. In fact, the only time it is providing 1 amp is with a 24 ohm load. It might be that the original poster doesn't understand that.
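As a quick sketch of that point, using the same numbers (the supply holds 24 V; the load's resistance decides the current):

Code:
# The supply regulates voltage; Ohm's law decides the current drawn.
V_SUPPLY = 24.0     # volts, held constant by the supply
I_RATED = 1.0       # amps: a ceiling, not a promise

for r_load in (24.0, 240.0):
    i = V_SUPPLY / r_load
    note = "right at the limit" if i >= I_RATED else "well under the limit"
    print(f"{r_load:5.0f} ohm load -> {i:.2f} A ({note})")
# 24 ohm load -> 1.00 A; 240 ohm load -> 0.10 A.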
 

Offline DW1961Topic starter

  • Frequent Contributor
  • **
  • Posts: 753
  • Country: us
Quote from: TimNJ
Really depends on what kind of device you're talking about. If you're talking about an incandescent lamp, then the simplest answer is that incandescent lights are not constant-power devices by themselves. An incandescent light is more or less just a resistor, albeit a non-linear one. So, a 12V-rated light attached to a 24V power supply wants to draw about twice the current it is rated for, almost certainly destroying it. This assumes the 24V power supply can supply the additional current.

You suggested powering a 24W light bulb rated for 2A @ 12V. This means the filament is about 6 ohms (once hot). What happens when you attach a 24V supply across a 6-ohm "resistor"? For an ideal voltage source, the current should go to 24V / 6 ohms = 4A. But your power supply is only rated for 24V @ 1A.

Two issues:

1. Your power supply is overloaded because the load demands 4A, but it can only supply 1A, so it will probably go into a shut-down protection mode, or it will be damaged.
2. If the power supply could supply 4A, then you'd be running the lamp at 4x the rated power, since 24V * 4A = 96W! That will certainly destroy the lamp.

Two ways to drive a 12V lamp with a "24V" power source:

1. Use a constant voltage/constant current (CV/CC) power supply. These power supplies operate in constant voltage mode up until the load draws a predetermined amount of current. As the output current attempts to go beyond this threshold, the power supply counteracts by reducing its output voltage in order to maintain the set-point current. In this case, you'd need a CV/CC power supply rated for 24V @ 2A. With a 2A set point, the output voltage will be at 12V. Might as well just use a regular 12V power supply.

2. Use PWM to set the average power @ 24W. Since running a 12V lamp @ 24V results in 4x the rated power dissipation, you can run the lamp at 25% (1/4) duty cycle to get the same effective output power. Usually there should be no problem running the filament at a higher voltage (within reason), but I would not advise trying to run a 12V lamp at 240V with PWM, for other reasons.

OK, so I think I understand. I was talking about apples to apples, that is, a DC light for both 24 and 12V. I didn't mean to mix constant- and non-constant-power devices. (At least not yet, and probably never! :) )

I'm starting to understand now, given your explanation: regardless of whether they're "24V amps" or "12V amps", amps are amps, and so the 1A 24V supply would still be called upon to deliver 2A, exceeding the supply's rating, with the above consequences.

"Usually there should be no problem running the filament at a higher voltage (within reason), but would not advise trying to run a 12V lamp at 240V with PWM, for other reasons."

I appreciate you going into detail about how to limit power draw from the device (light), because you are explaining that you can use a 24V supply on a 12V device if you can limit the device's power needs. Question: Could you put a resistor between the 24V supply and the device to limit power to the device, while still running 24V instead of 12V? I'm assuming that would be inefficient because of heat loss?

So, it's not about volts, but as I assumed, it's about getting the correct power to the device. It's about heat destroying things, not anything inherently different about 12 vs 24V? In other words, it's about how one delivers power vs the other, and power = heat?
 

Offline DW1961Topic starter

  • Frequent Contributor
  • **
  • Posts: 753
  • Country: us
Quote from: TimNJ
Quote from: DW1961
Quote from: mgwalker95
The light might have components that are only rated for 12V. For example, an LED driver might not be able to withstand 24V.

See, that's what I am not understanding. If the 24V power supply can only supply 1 Amp and the 12V device calls for a maximum of 12V at 2 Amps, why would that harm it when it is getting the same current? The 24V power supply can only push 1A of current. It's the same amount of current for both 12V and 24V. The power is the same.

I understand, I think, why a constant-current 24V supply and a 12V device would be incompatible: at the same amps, the 24V supply overheats the 12V device, giving it twice the power.

A typical 24V power supply cannot magically maintain 1A output if there's a load attached to it that wants to draw more than that. So, it will go into a protection mode, not regulate at 1A.

Thanks for stating that outright. I had gotten it from your other post as well!
 

Offline DW1961Topic starter

  • Frequent Contributor
  • **
  • Posts: 753
  • Country: us
Quote from: Tom45
Quote from: TimNJ
A typical 24V power supply cannot magically maintain 1A output if there's a load attached to it that wants to draw more than that. So, it will go into a protection mode, not regulate at 1A.

And the converse is true too.

If the load is drawing 1 amp from a 24 volt supply, then that load is 24 ohms.

If the load was 240 ohms, the current from the 1 amp rated 24 volt supply would only be 0.1 amps.

The 1 amp rating is the maximum the power supply can provide to a load without getting into trouble. But that doesn't mean that it is always going to be providing 1 amp no matter what. In fact, the only time it is providing 1 amp is with a 24 ohm load. It might be that the original poster doesn't understand that.

Yes, I understood that, if you mean that the device has to want 1 Amp before the supply gives it. It could just as well be giving 0.5 Amps or whatever. Is that what you meant?
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Quote from: DW1961
OK, so I think I understand. I was talking about apples to apples, that is, a DC light for both 24 and 12V. I didn't mean to mix constant- and non-constant-power devices. (At least not yet, and probably never! :) )

I'm starting to understand now, given your explanation: regardless of whether they're "24V amps" or "12V amps", amps are amps, and so the 1A 24V supply would still be called upon to deliver 2A, exceeding the supply's rating, with the above consequences.

"Usually there should be no problem running the filament at a higher voltage (within reason), but I would not advise trying to run a 12V lamp at 240V with PWM, for other reasons."

I appreciate you going into detail about how to limit power draw from the device (light), because you are explaining that you can use a 24V supply on a 12V device if you can limit the device's power needs. Question: Could you put a resistor between the 24V supply and the device to limit power to the device, while still running 24V instead of 12V? I'm assuming that would be inefficient because of heat loss?

So, it's not about volts, but as I assumed, it's about getting the correct power to the device. It's about heat destroying things, not anything inherently different about 12 vs 24V? In other words, it's about how one delivers power vs the other, and power = heat?


Yes, you can absolutely use a resistor in series with a resistive load like an incandescent light bulb. The problem is that if you have a bulb that draws 1A at 12V (so it consumes 12 watts) and you want to run it from a 24V supply, you need a resistor that drops 12V across it, leaving 12V remaining for the bulb. This resistor will be carrying the same 1A, as described by Kirchhoff's laws, meaning that the resistor will be burning up another 12 watts, consuming the same amount of power as the bulb but turning it into useless heat. There are active devices that do this too: they're called voltage regulators. A linear regulator is essentially an automatic resistor; it adjusts to whatever load is connected to keep the voltage constant, but it still burns up the excess voltage as heat. A switchmode regulator is more complex but can trade voltage for current or vice versa.
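As a minimal sketch, the series-resistor arithmetic for that 1A @ 12V bulb:

Code:
# Sizing a series resistor to run a 12 V / 1 A bulb from 24 V.
V_SUPPLY, V_BULB, I_BULB = 24.0, 12.0, 1.0

v_drop = V_SUPPLY - V_BULB          # the resistor must drop 12 V
r_series = v_drop / I_BULB          # R = V / I -> 12 ohms
p_wasted = v_drop * I_BULB          # 12 W burned off as heat
p_bulb = V_BULB * I_BULB            # 12 W of useful output

print(f"Series resistor: {r_series:.0f} ohms, wasting {p_wasted:.0f} W "
      f"to deliver {p_bulb:.0f} W (50% efficient)")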

Current causes heat. Power is the product of voltage and current, so if you double the voltage you can carry the same amount of power with half the current. Half the current means far less heat in the wires (resistive loss scales with the square of the current), so wires can be smaller for the same amount of power. This is why power is transmitted long distances at very high voltages, 80,000V or more, and then stepped down by substations closer to the point of use.
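A toy comparison of that square-law effect (the 1,200 W load and 1-ohm line resistance are made-up figures):

Code:
# Delivering the same power over the same wire at two voltages.
P_DELIVERED = 1200.0    # watts to deliver (illustrative)
R_WIRE = 1.0            # ohms of wire resistance (illustrative)

for v in (120.0, 1200.0):
    i = P_DELIVERED / v             # current needed at this voltage
    p_loss = i**2 * R_WIRE          # heat in the wire: I^2 * R
    print(f"{v:6.0f} V: {i:5.1f} A, {p_loss:6.1f} W lost in the wire")
# 120 V: 10 A -> 100 W lost; 1200 V: 1 A -> only 1 W lost.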

It may be clearer to think in terms of mechanical systems. Power is the product of torque and speed. The strength required of a shaft depends on the torque, so if you want to transmit a lot of power at a low RPM you need a big, strong shaft that can take the torque without twisting into a pretzel. If you spin the shaft at a much higher speed, you can use a gearbox at the other end to exchange some of that speed for torque: the same amount of power, but now you can use a much smaller and lighter shaft, smaller joints, and less support structure. Everything can be smaller, lighter, and cheaper, which becomes more significant as the distance you want to transmit the power increases. Voltage is a lot like speed; current is a lot like torque. A clutch that slips to regulate the output speed is like a resistor or linear regulator; a gearbox is like a switchmode regulator.
« Last Edit: July 22, 2020, 05:09:31 am by james_s »
 
The following users thanked this post: DW1961

Offline DW1961Topic starter

  • Frequent Contributor
  • **
  • Posts: 753
  • Country: us
Quote from: james_s
Quote from: DW1961
OK, so I think I understand. I was talking about apples to apples, that is, a DC light for both 24 and 12V. I didn't mean to mix constant- and non-constant-power devices. (At least not yet, and probably never! :) )

I'm starting to understand now, given your explanation: regardless of whether they're "24V amps" or "12V amps", amps are amps, and so the 1A 24V supply would still be called upon to deliver 2A, exceeding the supply's rating, with the above consequences.

"Usually there should be no problem running the filament at a higher voltage (within reason), but I would not advise trying to run a 12V lamp at 240V with PWM, for other reasons."

I appreciate you going into detail about how to limit power draw from the device (light), because you are explaining that you can use a 24V supply on a 12V device if you can limit the device's power needs. Question: Could you put a resistor between the 24V supply and the device to limit power to the device, while still running 24V instead of 12V? I'm assuming that would be inefficient because of heat loss?

So, it's not about volts, but as I assumed, it's about getting the correct power to the device. It's about heat destroying things, not anything inherently different about 12 vs 24V? In other words, it's about how one delivers power vs the other, and power = heat?

Yes, you can absolutely use a resistor in series with a resistive load like an incandescent light bulb. The problem is that if you have a bulb that draws 1A at 12V (so it consumes 12 watts) and you want to run it from a 24V supply, you need a resistor that drops 12V across it, leaving 12V remaining for the bulb. This resistor will be carrying the same 1A, as described by Kirchhoff's laws, meaning that the resistor will be burning up another 12 watts, consuming the same amount of power as the bulb but turning it into useless heat. There are active devices that do this too: they're called voltage regulators. A linear regulator is essentially an automatic resistor; it adjusts to whatever load is connected to keep the voltage constant, but it still burns up the excess voltage as heat. A switchmode regulator is more complex but can trade voltage for current or vice versa.

Current causes heat. Power is the product of voltage and current, so if you double the voltage you can carry the same amount of power with half the current. Half the current means far less heat in the wires (resistive loss scales with the square of the current), so wires can be smaller for the same amount of power. This is why power is transmitted long distances at very high voltages, 80,000V or more, and then stepped down by substations closer to the point of use.

It may be clearer to think in terms of mechanical systems. Power is the product of torque and speed. The strength required of a shaft depends on the torque, so if you want to transmit a lot of power at a low RPM you need a big, strong shaft that can take the torque without twisting into a pretzel. If you spin the shaft at a much higher speed, you can use a gearbox at the other end to exchange some of that speed for torque: the same amount of power, but now you can use a much smaller and lighter shaft, smaller joints, and less support structure. Everything can be smaller, lighter, and cheaper, which becomes more significant as the distance you want to transmit the power increases. Voltage is a lot like speed; current is a lot like torque. A clutch that slips to regulate the output speed is like a resistor or linear regulator; a gearbox is like a switchmode regulator.

Very good explanation. I understand.
 

