How is the current rating of a transformer determined?
ledtester:

Suppose I have a transformer described as "15 V @ 400 mA".

I know the "15 V" is an RMS value, so is the "400 mA" also an RMS value?

What is the test circuit used to determine the current rating? I'm envisioning a variable resistor is placed across the output of the transformer and adjusted until the output voltage reads 15 V RMS. Then the current through the resistor is measured in which case the current would be an RMS value as well.
Cerebus:
Yes, the current rating will be RMS.

There isn't really a test circuit. Transformers are designed to meet a defined steady state temperature rise over ambient with their rated load. The time constant associated with the heating can be very long so testing this with a dummy load would be a potentially very slow process.
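
To illustrate why such a test is slow, here is a minimal Python sketch of a first-order thermal model. The 50 °C final rise and 45-minute time constant are illustrative assumptions, not figures from this thread or any datasheet.

--- Code: ---
import math

# First-order thermal model: the rise over ambient approaches its
# steady-state value exponentially, rise(t) = rise_final * (1 - exp(-t/tau)).
# Both constants below are assumptions for illustration only.
RISE_FINAL_C = 50.0   # assumed steady-state rise over ambient at rated load
TAU_MIN = 45.0        # assumed thermal time constant, in minutes

def rise_after(minutes: float) -> float:
    """Temperature rise over ambient after running at rated load."""
    return RISE_FINAL_C * (1.0 - math.exp(-minutes / TAU_MIN))

def time_to_reach(fraction: float) -> float:
    """Minutes needed to reach a given fraction of the final rise."""
    return -TAU_MIN * math.log(1.0 - fraction)

for t in (15, 45, 90, 180):
    print(f"after {t:3d} min: rise ~ {rise_after(t):.1f} C")
print(f"95% of the final rise takes ~ {time_to_reach(0.95):.0f} min")
--- End code ---

With these assumed numbers the dummy load would have to run for over two hours before the reading settles, which is why the rating is normally designed-in rather than measured.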
Jwillis:
I like to use a little program called Xformer Designer. For the program you only need to measure the core area of the transformer and the voltage you apply at 50 or 60 Hz. The flux is usually 1 for laminates and 1.5 to 4 for ferrite. The flux determines the windings but not the VA. The current can be determined by the power (VA) divided by the voltage.
 For example, with a core area of 4 square centimetres and 15 volts applied at 60 Hz with a flux of 4 (ferrite), you get around 12 VA.
With I = VA / V, that is I = 12 VA / 15 V = 800 mA maximum. Your current will change with the voltage applied: more volts, less current.
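
A minimal Python sketch of that arithmetic, taking the 12 VA and 15 V figures straight from the example above (the VA estimate itself comes from Xformer Designer and is not recomputed here):

--- Code: ---
def max_current_a(va: float, volts_rms: float) -> float:
    """Maximum RMS secondary current for a given VA rating and RMS voltage."""
    return va / volts_rms

va = 12.0      # VA estimate from the Xformer Designer example
v_out = 15.0   # applied RMS voltage
i_max = max_current_a(va, v_out)
print(f"I = {va} VA / {v_out} V = {i_max * 1000:.0f} mA")  # -> 800 mA
--- End code ---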
So the current a transformer can deliver before saturation is determined by the core area. But not just the core area, because the gauge of the wire also sets the maximum current: smaller wire has higher resistance, hence less current before the wire heats up and burns.
Handy little program, and it's fairly accurate. Mine doesn't seem to work with imperial units but does with metric.
Benta:

--- Quote from: ledtester on April 19, 2018, 05:28:02 pm ---What is the test circuit used to determine the current rating? I'm envisioning a variable resistor is placed across the output of the transformer and adjusted until the output voltage reads 15 V RMS. Then the current through the resistor is measured in which case the current would be an RMS value as well.

--- End quote ---

A power transformer is a voltage source whose output voltage is defined by the primary/secondary turns ratio. You do not "adjust" the voltage by varying the load; the voltage can be regarded as fixed (although you should monitor it).
For an empirical test, you place a resistive load across the secondary and observe the temperature rise inside the transformer. When this reaches a certain level, e.g. 50 °C, you have found the maximum RMS output current.
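
For a rough idea of sizing that dummy load, here is a small Python sketch using the 15 V / 400 mA figures from the original question; the resistor values follow from Ohm's law and are not taken from any standard or datasheet.

--- Code: ---
# Sizing a resistive dummy load for the empirical test described above.
V_RMS = 15.0     # nominal secondary voltage, RMS
I_TEST = 0.400   # target RMS test current (the rating being checked)

r_load = V_RMS / I_TEST   # Ohm's law: R = V / I
p_load = V_RMS * I_TEST   # power the load resistor must dissipate

print(f"load resistance  ~ {r_load:.1f} ohm")   # about 37.5 ohm
print(f"load dissipation ~ {p_load:.1f} W")     # about 6 W
--- End code ---

Pick a resistor rated well above the computed dissipation so the load itself stays cool and stable over the long test.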
woodchips:
Possibly.

Transformers are very often rated by the DC voltage and current out. The snag is that both of these depend strongly on the type of rectifier (half wave, full wave, bridge, etc.) and on the smoothing (capacitor or inductor).
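
As a rough illustration of how much the topology matters, here is a Python sketch using common rule-of-thumb estimates for a 15 V RMS secondary; the 0.7 V diode drop and the "large capacitor, light load" condition are idealizations, not measurements.

--- Code: ---
import math

# Rule-of-thumb DC output estimates from a 15 V RMS secondary.
V_RMS = 15.0
V_DIODE = 0.7                  # assumed drop per conducting silicon diode
V_PK = V_RMS * math.sqrt(2)    # about 21.2 V peak

# Capacitor-input filters (light load, ripple ignored): output sits near the peak.
v_half_wave_cap = V_PK - V_DIODE      # one diode in the conduction path
v_bridge_cap = V_PK - 2 * V_DIODE     # two bridge diodes conduct at a time

# No smoothing capacitor: the load sees the average of the rectified sine.
v_bridge_avg = (2 * V_PK / math.pi) - 2 * V_DIODE

print(f"half-wave + capacitor: ~ {v_half_wave_cap:.1f} V DC")
print(f"bridge + capacitor:    ~ {v_bridge_cap:.1f} V DC")
print(f"bridge, no capacitor:  ~ {v_bridge_avg:.1f} V average")
--- End code ---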

The RS Components catalogue used to have a nice little design guide for their transformers; does that still exist?

