Author Topic: AC voltage drop - Transformer vs Capacitor - which is best?  (Read 7065 times)


Offline espresso4jbTopic starter

  • Contributor
  • Posts: 10
  • Country: us
AC voltage drop - Transformer vs Capacitor - which is best?
« on: February 04, 2018, 02:16:45 pm »
Underlying question:  Why not use a capacitor to drop AC voltage by a set amount, as opposed to a transformer?

When I asked about dropping AC voltage without noise in a prior topic https://www.eevblog.com/forum/beginners/capacitor-to-buckdrop-ac-power-from-120v-to-~80v/ the vast majority of responses directed me to a transformer in either a buck or autotransformer configuration, rather than the capacitor the original question was about.  :-//

So, I ended up building both alternatives and putting them into a test rig to see which would work better.  Both met the required goal and were within a couple of volts AC RMS of each other - attached is a picture of the basic test rig.  There was one exception to the attached diagram: the "115 VAC : 36 VAC" transformer put out in excess of 40 VAC from a 117 VAC supply, so per the earlier advice I put it in an autotransformer configuration instead, and the output voltage was right on the money for comparison.

As you can see from the attached scope snapshots, the key difference between the two methods is a phase shift with the capacitor and none with the transformer.  Otherwise the outputs are comparable and the results appear near identical.

Although I did not measure it over any meaningful amount of time, I would expect some small loss in the transformer because of the winding resistance and the related heat dissipation.  I could not detect any heat dissipation in the capacitor, so I would tentatively theorize (and await correction) that it is at least marginally the more efficient solution.

One thought that came to mind was operating life.  Anecdotal evidence says a motor run capacitor should last anywhere from 5 to 20 years - that is at full load in varying temperature environments.  This vendor site - http://www.illinoiscapacitor.com/tech-center/life-calculators.aspx - suggests that at the stated load and a controlled temperature, the capacitor would last at least 40 years and potentially hundreds or thousands.  There is also the downside of the capacitor staying charged even when disconnected, which is easily addressed with a switch between the leads; that switch also allows for a "Hi/Lo" setting when connected.

Finally, cost and weight favor the capacitor - it is easily half the price of a transformer ($3-$7 vs $10-$20+), comes in a smaller or more convenient package, and is much lighter, so it can be attached more easily or integrated into the power supply cord.

So, some of my questions:
  • Why not use a capacitor instead of a transformer?  What are the downsides?
  • In what applications would one be a significantly better choice than the other?
  • Any real-life experiences with capacitors and why they are not a good idea?
 

Offline Ice-Tea

  • Super Contributor
  • ***
  • Posts: 3083
  • Country: be
    • Freelance Hardware Engineer
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #1 on: February 04, 2018, 02:26:57 pm »
A capacitor will offer a certain impedance at a certain frequency, and that impedance will not change. So if your load is known and does not vary, this is an interesting option for sure. However, if your load changes, your voltage changes.
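This load sensitivity is easy to see numerically. A minimal Python sketch, with an assumed 26 µF series capacitor and made-up load resistances (illustrative values, not the OP's actual parts):

```python
import math

F = 60.0      # mains frequency, Hz
C = 26e-6     # series capacitor, F (hypothetical value)
VIN = 117.0   # supply, V RMS

def vout(r_load, c=C, f=F, vin=VIN):
    """RMS voltage across a resistive load fed through a series capacitor."""
    xc = 1.0 / (2 * math.pi * f * c)        # capacitive reactance, ohms
    return vin * r_load / math.hypot(r_load, xc)

for r in (50, 100, 200):
    print(f"R = {r:3d} ohm -> Vout = {vout(r):5.1f} V RMS")
```

With these numbers the 100 Ω load lands near 82 V, but halving or doubling the load resistance swings the output by tens of volts - exactly the behaviour a transformer does not have.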


Offline espresso4jbTopic starter

  • Contributor
  • Posts: 10
  • Country: us
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #2 on: February 04, 2018, 02:55:13 pm »
Great point. For this particular application the load is a known, constant resistive load - I did not try putting different loads on and measuring the response.
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19612
  • Country: gb
  • 0999
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #3 on: February 04, 2018, 03:10:15 pm »
If the load resistance and supply frequency are fixed, the capacitor will be better than the transformer. Another option is to replace the capacitor with an inductor, which will be more efficient than a transformer, but still less so than a capacitor.

If the load impedance or supply frequency are variable, the transformer is the only sane way to do it. Another issue with the capacitor is that if the load is inductive, the voltage can actually be boosted, causing excessive current flow, arcing over and overheating.
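The boost with an inductive load comes from series resonance between the dropper capacitor and the load inductance. A rough Python sketch with assumed component values (26 µF cap, hypothetical loads - not anyone's measured parts) shows both the intended drop and the resonant boost:

```python
import math

F, VIN = 60.0, 117.0
W = 2 * math.pi * F   # angular frequency, rad/s

def load_voltage(c, r, l):
    """RMS voltage across a series R-L load fed through a series capacitor."""
    z_cap = 1 / (1j * W * c)     # capacitor impedance (complex)
    z_load = r + 1j * W * l      # inductive load impedance
    return VIN * abs(z_load) / abs(z_cap + z_load)

# Resistive load (L = 0): voltage is dropped as intended.
print(load_voltage(26e-6, 100, 0))
# Inductive load near series resonance: voltage is boosted instead.
print(load_voltage(26e-6, 10, 0.27))
```

In the second case the capacitive and inductive reactances nearly cancel, leaving only the small resistance to limit current, so the load sees far more than the supply voltage.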
 

Offline Vtile

  • Super Contributor
  • ***
  • Posts: 1144
  • Country: fi
  • Ingineer
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #4 on: February 04, 2018, 03:13:57 pm »
A transformer is typically built close to an "ideal voltage source" arrangement (because we typically work in a voltage-is-master, current-is-slave world), while a capacitor dropper is merely a voltage divider, with all the related issues. That might or might not conflict with the properties you need.
« Last Edit: February 04, 2018, 03:15:43 pm by Vtile »
 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 9614
  • Country: gb
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #5 on: February 04, 2018, 03:18:25 pm »
I suppose you could say that using a large capacitive dropper is slightly antisocial, power-factor wise. You'd hardly be alone in that these days though!  ::)
« Last Edit: February 04, 2018, 03:21:29 pm by Gyro »
Best Regards, Chris
 

Offline fourtytwo42

  • Super Contributor
  • ***
  • Posts: 1189
  • Country: gb
  • Interested in all things green/ECO NOT political
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #6 on: February 04, 2018, 05:17:32 pm »
So, some of my questions:
  • Why not use a capacitor instead of a transformer?  What are the downsides?
  • In what applications would one be a significantly better choice than the other?
  • Any real-life experiences with capacitors and why they are not a good idea?
I would have thought the key issue that is not even on your list is input to output isolation. That is a major reason why transformers are used.
 

Offline SeanB

  • Super Contributor
  • ***
  • Posts: 16302
  • Country: za
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #7 on: February 04, 2018, 05:50:09 pm »
A drawback of the capacitive dropper is that it is sensitive to harmonics on the supply line, passing them through to the load by acting as a high-pass filter, so it will apply a higher voltage to your resistive load than a pure sine wave supply would. You will have slightly more heating in your load, as the capacitor passes the mains harmonics into the resistor, where they are dissipated - cleaning up your mains supply slightly.
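That high-pass behaviour is easy to quantify. A short Python sketch with assumed values (26 µF series cap, 100 Ω resistive load - illustrative only) shows what fraction of each harmonic reaches the load:

```python
import math

C, R = 26e-6, 100.0   # hypothetical series cap and resistive load

def gain(f):
    """Fraction of the source voltage at frequency f that reaches the load."""
    xc = 1 / (2 * math.pi * f * C)   # capacitive reactance falls with frequency
    return R / math.hypot(R, xc)

for n in (1, 3, 5, 7):   # fundamental and odd harmonics of 60 Hz
    print(f"harmonic {n} ({60*n:3d} Hz): gain = {gain(60*n):.2f}")
```

The fundamental is attenuated to roughly 70% here while the fifth harmonic and above pass almost unattenuated, which is why a harmonic-rich supply heats the load more than a clean sine wave of the same RMS fundamental.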

A downside is that switch-on spikes will be passed through almost unaltered, along with any mains noise, so if the load is sensitive to peaks (like a low voltage light bulb) it will experience overvoltage events whenever something is turned on or off on the circuits fed by the same power transformer. Most motor run capacitors do not include a parallel resistor to discharge them, while those used across the mains for power factor correction in discharge lamps do; that means a motor run capacitor can hold a charge for a long time after power is disconnected. In normal service a motor run capacitor is shunted by a lowish-resistance motor winding, either directly or via a starting switch in the case of motor start capacitors. Motor start capacitors, on the other hand, often do have the discharge resistor, because of the duty cycle and permitted power-application time limits imposed to get such a high capacitance into such a small package at the rated voltage.

Lifetime-wise, motor run capacitors are most certainly not recommended for use across the mains; they need an inductive current limit to keep self-heating within limits. Mains-rated capacitors are designed both for safe disconnection in case of failure (they blow the terminals off relatively safely by breaking the contact connections inside the case end, which is why a minimum end clearance is specified) and to fail open circuit when they self-heal or go short circuit internally, by blowing a weak link or simply burning off the end of the winding. Plastic-cased ones will often disintegrate from UV exposure, so they need to be in a separate enclosure, not accessible from the outside without tools or turning off the power.
 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 9614
  • Country: gb
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #8 on: February 04, 2018, 06:33:47 pm »
So, some of my questions:
  • Why not use a capacitor instead of a transformer?  What are the downsides?
  • In what applications would one be a significantly better choice than the other?
  • Any real-life experiences with capacitors and why they are not a good idea?
I would have thought the key issue that is not even on your list is input to output isolation. That is a major reason why transformers are used.

The OP should have qualified that the subject of his previous thread was a mains rated heater. Actually, there's no reason why he couldn't simply have continued that thread!  :-\
« Last Edit: February 04, 2018, 06:35:46 pm by Gyro »
Best Regards, Chris
 
The following users thanked this post: fourtytwo42

Offline espresso4jbTopic starter

  • Contributor
  • Posts: 10
  • Country: us
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #9 on: February 04, 2018, 10:27:53 pm »
Thanks for the insightful answers.  The only reason I did not continue the original thread is that it led me to a new question, and clearly I missed the importance of the load type - in this case one of the key factors is a known, constant load (which was directly related to that original thread).  When I get a chance I will try the capacitor configuration under some different loads and see how the output changes; the frequency would still be constant, which as noted can make a considerable difference.

One other thing I would like to point out: in order to get the required voltage (around 80-85 VAC), it was necessary to use the transformer in either buck or autotransformer mode based on what is generally available, losing the isolation advantage.  So in this rather narrow example, spikes and ripple would be passed on in either case, and the capacitor may actually damp/smooth some of the noise.
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19612
  • Country: gb
  • 0999
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #10 on: February 04, 2018, 10:47:29 pm »
So in this rather narrow example, spikes and ripple would be passed on in either case and the capacitor may actually dampen/smooth some of the noise.
No, a series capacitor would do the reverse. It forms a high-pass filter, so any higher-frequency noise is passed to the heater more readily than the mains frequency. This won't be a problem on a standard mains supply, but if you ran the same circuit on a modified sine wave inverter, the heater would run hotter than it would on a sine wave.

A transformer or plain inductor would do a better job of filtering out high-frequency noise, because core losses and skin-effect losses increase with frequency.
 

Offline MrAl

  • Super Contributor
  • ***
  • Posts: 1476
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #11 on: February 05, 2018, 05:49:50 pm »
Hello there,

One of the drawbacks of a series-capacitor solution versus a transformer is that there is no galvanic isolation with a single capacitor.  Even two capacitors will not achieve this.  Some electronics forums even go so far as to ban discussion of these circuits, which are collectively called "transformerless power supplies" or "offline power supplies".

With a regular two winding transformer that passes the UL test we get two forms of isolation:
1.  The two windings are not electrically in contact with each other.
2.  The two windings are not physically in contact with each other.

#1 just means that there is no current path between windings, so the chance of shock from the line itself is reduced.
#2 means that the two windings are not wound one on top of the other as in industrial circuits, but are actually separated by a distance along the core.  That means they can't touch even if the insulation breaks down, and the arc-over distance is very large.

So these two combined make for a very safe power supply, and that is required in almost all wall warts made today.

Now with a single capacitor you don't have this kind of isolation, because one side is always connected to the line.  If that one wire happens to be the neutral wire, you might be ok, but if the plug is pulled out and reinserted in reverse, you end up with the full line voltage on that wire, which can cause a HOT-to-GROUND shock to the user.

The other wire isn't that much better anyway, because the capacitor can pass a significant current for a short time, which is also dangerous.  Even more to the point, the cap can short out - a typical failure mode for caps - so you can get shocked that way too.

The other view of the cap power supply is that it is used in a lot of professionally designed equipment - so how can that be?  It just so happens that when this technique is used, the equipment must provide other means to protect the user.  For example, plastic shafts on potentiometers that prevent any direct contact with the internal circuit.

The usual recommendation is that beginners in electronics should not use these cap circuits.  That is probably because there is a certain expectation of safety associated with any voltage level.  For example, if we hear "it has a 12 V output", we assume it is as safe to use as any other power supply.  In fact, a cap power supply that puts out 12 VDC may also be able to put out 120 VAC with respect to earth ground, so it becomes a safety hazard.

So in the end, if you don't realize the danger, don't use it.  If you do, be careful.
The rule of thumb is to keep one hand in your pocket while making voltage measurements.  In other words, never touch the live circuit with both hands at the same time.  Also make sure your feet are not grounded :-)

Good luck to you.

« Last Edit: February 05, 2018, 05:53:23 pm by MrAl »
 

Offline espresso4jbTopic starter

  • Contributor
  • Posts: 10
  • Country: us
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #12 on: February 05, 2018, 06:42:03 pm »
MrAl - That all makes good sense.  My understanding of the general direction of this thread is that a transformer-based power supply is generally more predictable, likely more reliable, well understood, and avoids needing fine control over the load side of the circuit.  I also expect that using a capacitor for a large voltage drop (for example, 120 VAC to 12 VAC) is asking for trouble: besides the charge held in the capacitor, a failure could send the full voltage down the circuit - which is unlikely with a transformer-based circuit because of the isolation, as you noted.

For small voltage changes, I found it difficult to source transformers (e.g., 120 VAC : 85 VAC), and if you use one in a buck/boost/autotransformer configuration, you lose at least some of the isolation.  Perhaps such transformers could be sourced by focusing on the ratio/windings rather than the stated rating, but still my observation was that there are not many small-drop transformers available - either the need is not there or other mechanisms address it.  I also expect that many of today's wall warts are in fact switching power supplies or other hybrid designs, not necessarily isolation-transformer based (I would expect cost to be the driving factor, with isolation achieved through insulation rather than the electrical design or components).  Again, I have not seen many appliances or pieces of equipment that would need, say, a 90 VAC supply; they align with what is commonly available (i.e., 5 V, 12 V, 24 V, 36 V, 48 V, etc.).

My point is that a small voltage change is likely more of a fringe use case, and the general rule of thumb of matching your need to a suitable transformer-based solution fits the broader use cases.  Use a transformer if you can; if you cannot, build the necessary safeguards into the design to account for not having one.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16729
  • Country: us
  • DavidH
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #13 on: February 05, 2018, 07:32:03 pm »
Drawback of the capacitive dropper is that it will be sensitive to harmonics on the supply line, passing them through to the load by acting as a high pass filter, and thus it will apply a higher voltage to your resistive load than if the supply is a pure sine wave.

A good example of this is if you used the output of a modified sine inverter which produces a sort of square wave.  A sine wave inverter would have no problem though.

I have often used a high voltage capacitor to control the speed of a shaded pole motor and never had a problem, but there the power levels are very low and the capacitance is low enough that film capacitors are practical.  A capacitor would also be a good choice for lowering the brightness and extending the operating life of an incandescent lamp, if a rectifier diode were not so effective and inexpensive for the same purpose.

For a heavy load like a heater, I would always expect a transformer to be used.
 

Offline MrAl

  • Super Contributor
  • ***
  • Posts: 1476
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #14 on: February 06, 2018, 06:16:08 am »
MrAl - That all makes good sense.  My understanding of the general direction of this thread is that a transformer-based power supply is generally more predictable, likely more reliable, well understood, and avoids needing fine control over the load side of the circuit.  I also expect that using a capacitor for a large voltage drop (for example, 120 VAC to 12 VAC) is asking for trouble: besides the charge held in the capacitor, a failure could send the full voltage down the circuit - which is unlikely with a transformer-based circuit because of the isolation, as you noted.

For small voltage changes, I found it difficult to source transformers (e.g., 120 VAC : 85 VAC), and if you use one in a buck/boost/autotransformer configuration, you lose at least some of the isolation.  Perhaps such transformers could be sourced by focusing on the ratio/windings rather than the stated rating, but still my observation was that there are not many small-drop transformers available - either the need is not there or other mechanisms address it.  I also expect that many of today's wall warts are in fact switching power supplies or other hybrid designs, not necessarily isolation-transformer based (I would expect cost to be the driving factor, with isolation achieved through insulation rather than the electrical design or components).  Again, I have not seen many appliances or pieces of equipment that would need, say, a 90 VAC supply; they align with what is commonly available (i.e., 5 V, 12 V, 24 V, 36 V, 48 V, etc.).

My point is that a small voltage change is likely more of a fringe use case, and the general rule of thumb of matching your need to a suitable transformer-based solution fits the broader use cases.  Use a transformer if you can; if you cannot, build the necessary safeguards into the design to account for not having one.

Hi,

So it appears that you are looking for a way to build a power supply that can put out any voltage you choose, not just a standard one like 12 VAC or 24 VAC.  That's understandable.

In that case you could use a variac and an isolation transformer.
If this is for production, then you might consider winding your own transformers too.

David Hess brought up a good point too: the current capability of a cap supply is usually quite low, so if you need high power you need to move to an AC converter or just build your own transformer.

Many people build their own transformer starting with an old microwave oven: they remove the transformer, strip off the secondary, and wind their own secondary.  That's another idea.


 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 9614
  • Country: gb
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #15 on: February 06, 2018, 10:02:57 am »
@MrAl:

If you look back at the original thread, you will see that the OP is trying to achieve a drop in output of a mains rated (and insulated) heating element. He is comparing capacitor drop and autotransformer methods of achieving this.
Best Regards, Chris
 

Offline MrAl

  • Super Contributor
  • ***
  • Posts: 1476
Re: AC voltage drop - Transformer vs Capacitor - which is best?
« Reply #16 on: February 06, 2018, 07:41:22 pm »
Hi,

Ok thanks.

It will take a pretty big cap to drop a voltage at a higher load current - 100 µF, for example.
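For a rough sense of scale: with a resistive load, the capacitor and load voltages add in quadrature, so the required capacitance follows from the load current and the voltage to be dropped. A Python sketch with hypothetical numbers (117 V in, 82 V out, 5 A load - not the OP's measured values):

```python
import math

def cap_needed(v_in, v_load, i_load, f=60.0):
    """Series capacitance (F) needed to drop v_in to v_load (both RMS)
    across a resistive load drawing i_load amps RMS.
    Cap and load voltages add in quadrature for a resistive load."""
    v_cap = math.sqrt(v_in**2 - v_load**2)   # voltage across the capacitor
    xc = v_cap / i_load                      # required capacitive reactance
    return 1 / (2 * math.pi * f * xc)

# Hypothetical heater: 117 V supply dropped to 82 V at 5 A (~410 W)
print(cap_needed(117, 82, 5.0) * 1e6, "uF")
```

With these assumed numbers the answer comes out around 160 µF - consistent with the "100 µF and up" ballpark for amp-level loads.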

 

