Author Topic: How is the current rating of a transformer determined?  (Read 3066 times)


Offline ledtesterTopic starter

  • Super Contributor
  • ***
  • Posts: 3036
  • Country: us
How is the current rating of a transformer determined?
« on: April 19, 2018, 05:28:02 pm »

Suppose I have a transformer described as "15 V @ 400 mA".

I know the "15 V" is an RMS value, so is the "400 mA" also an RMS value?

What is the test circuit used to determine the current rating? I'm envisioning a variable resistor placed across the output of the transformer and adjusted until the output voltage reads 15 V RMS. The current through the resistor is then measured, in which case the current would be an RMS value as well.
 

Offline Cerebus

  • Super Contributor
  • ***
  • Posts: 10576
  • Country: gb
Re: How is the current rating of a transformer determined?
« Reply #1 on: April 19, 2018, 05:42:57 pm »
Yes, the current rating will be RMS.

There isn't really a test circuit. Transformers are designed to meet a defined steady-state temperature rise over ambient at their rated load. The time constant associated with the heating can be very long, so testing this with a dummy load would be a potentially very slow process.
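
Just to illustrate why that is slow: a minimal first-order thermal sketch, with made-up numbers for the time constant and the final rise (real parts will differ):

Code:
# First-order thermal model: rise(t) = rise_ss * (1 - exp(-t / tau)).
# Both numbers below are assumed for illustration, not data for a real part.
import math

TAU_MIN = 45.0     # assumed thermal time constant, minutes
RISE_SS = 50.0     # assumed steady-state rise over ambient, deg C

for t in (5, 15, 30, 60, 120, 240):    # minutes under rated load
    rise = RISE_SS * (1.0 - math.exp(-t / TAU_MIN))
    print(f"{t:4d} min: rise ~ {rise:5.1f} degC")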
Anybody got a syringe I can use to squeeze the magic smoke back into this?
 

Offline Jwillis

  • Super Contributor
  • ***
  • Posts: 1710
  • Country: ca
Re: How is the current rating of a transformer determined?
« Reply #2 on: April 19, 2018, 08:47:08 pm »
I like to use a little program called Xformer Designer. For the program you only need the core area of the transformer and the voltage you apply at 50 or 60 Hz. The flux is usually 1 for laminates and 1.5 to 4 for ferrite. The flux determines the windings but not the VA. The current can be determined from the power (VA) divided by the voltage.
For example, say you have a core area of 4 cm² and you apply 15 volts at 60 Hz with a flux of 4 (ferrite): you get around 12 VA.
With I = VA / V, that is I = 12 VA / 15 V = 800 mA maximum. Your current will change with the voltage applied: more volts, less current.
So the current a transformer can deliver before saturation is determined by the core area. But not just the core area, because the gauge of the wire also sets the maximum current: smaller wire has higher resistance, hence less current before the wire heats up and burns.
Handy little program, and it's fairly accurate. Mine doesn't seem to work with standard measure but does with metric.
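
A quick sketch of that last bit of arithmetic, plugging in the example numbers from the post above (the 12 VA figure is assumed from the program's output, not derived here):

Code:
# I = VA / V, using the example figures from the post above.
va = 12.0        # apparent power from the Xformer Designer example above
volts = 15.0     # applied secondary voltage
amps = va / volts
print(f"Maximum secondary current ~ {amps * 1000:.0f} mA")   # about 800 mA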
 

Offline Benta

  • Super Contributor
  • ***
  • Posts: 5875
  • Country: de
Re: How is the current rating of a transformer determined?
« Reply #3 on: April 20, 2018, 09:59:36 am »
What is the test circuit used to determine the current rating? I'm envisioning a variable resistor placed across the output of the transformer and adjusted until the output voltage reads 15 V RMS. The current through the resistor is then measured, in which case the current would be an RMS value as well.

A power transformer is a voltage source whose output voltage is defined by the primary/secondary turns ratio. You do not "adjust" the voltage by varying the load; the voltage can be regarded as fixed (although you should monitor it).
For an empirical test, you place a resistive load across the secondary and observe the temperature rise inside the transformer. When this reaches a certain level, e.g. 50 °C, you have found the maximum RMS output current.
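
A rough sketch of that procedure as a logging loop. The temperature readout here is a hypothetical stand-in (it fakes a first-order climb with made-up numbers so the sketch runs on its own), and the 50 °C limit is just the example figure above:

Code:
import math

MAX_RISE_C = 50.0    # allowed rise over ambient (the example figure above)
AMBIENT_C = 22.0

def read_temperature_c(minutes):
    """Hypothetical stand-in for a real temperature probe. It fakes a
    first-order climb (tau and final rise are made up) so the sketch runs."""
    return AMBIENT_C + 55.0 * (1.0 - math.exp(-minutes / 40.0))

minutes = 0
while True:
    rise = read_temperature_c(minutes) - AMBIENT_C
    print(f"t = {minutes:3d} min, rise = {rise:5.1f} degC")
    if rise >= MAX_RISE_C:
        print("Allowed rise reached; the load current at this point is the rating.")
        break
    minutes += 10    # thermal time constants are long, so sample slowly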
« Last Edit: April 20, 2018, 10:03:01 am by Benta »
 

Offline woodchips

  • Frequent Contributor
  • **
  • Posts: 594
  • Country: gb
Re: How is the current rating of a transformer determined?
« Reply #4 on: April 20, 2018, 05:36:04 pm »
Possibly.

Transformers are very often rated by the DC voltage and current out. The snag is that both of these depend heavily on the type of rectifier (half wave, full wave, bridge, etc.) and on the smoothing (capacitor or inductor).

The RS Components catalogue used to have a nice little design guide for their transformers; do they still exist?


 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14199
  • Country: de
Re: How is the current rating of a transformer determined?
« Reply #5 on: April 20, 2018, 06:13:20 pm »
The current rating is RMS. The limiting effect is usually temperature rise, not the drop in voltage. There will be some drop in voltage at the rated current (especially with smaller transformers), so the turns number is usually calculated so that the nominal voltage is reached at the nominal (sine) current.

The current rating is not a hard limit; it is more like the current at which a given temperature rise, set by the temperature class, is reached. So the same transformer can have different current ratings for different temperature class specs.
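
For example, a small sketch of that turns/voltage compensation idea, with an assumed 8 % regulation figure (just an illustration, not a rule):

Code:
# Sketch: choose the open-circuit voltage so the nominal voltage appears
# at the nominal load current. The 8 % regulation figure is assumed.
v_nominal = 15.0      # volts RMS wanted at full load
regulation = 0.08     # assumed no-load to full-load drop for a small transformer
v_open_circuit = v_nominal * (1.0 + regulation)
print(f"Wind for about {v_open_circuit:.1f} V RMS open circuit "
      f"to get {v_nominal:.1f} V RMS at the rated current")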
 
The following users thanked this post: ledtester

Offline Benta

  • Super Contributor
  • ***
  • Posts: 5875
  • Country: de
Re: How is the current rating of a transformer determined?
« Reply #6 on: April 20, 2018, 06:27:26 pm »
Transformers are very often rated by the DC voltage and current out.

Transformers are rated by AC RMS voltage out and AC RMS current out. It seems you're thinking about power supplies.  :palm:

 

Offline ledtesterTopic starter

  • Super Contributor
  • ***
  • Posts: 3036
  • Country: us
Re: How is the current rating of a transformer determined?
« Reply #7 on: April 21, 2018, 03:13:52 am »
Thanks everyone for your replies.

I found this guide for computing DC current based on your rectifier configuration:

http://www.hammondmfg.com/pdf/5c007.pdf

I'm wondering what they mean by "I D.C." - is this a max value? an RMS value?

The guide says that "V A.C." is an RMS value, and I presume that "I A.C." is also an RMS value.

For instance, with the half-wave resistive load I know what the output voltage waveform looks like, so in what sense is "I D.C. = 0.64 X Sec. I A.C."?
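
For what it's worth, here's a quick numerical check of where a factor like 0.64 could come from, assuming "I A.C." there means the RMS value of the half-wave rectified secondary current into a resistive load:

Code:
# For a half-wave rectified sine into a resistive load:
#   average (DC) value = Ipk / pi  ~ 0.318 Ipk
#   RMS value          = Ipk / 2   = 0.5 Ipk
#   DC / RMS           = 2 / pi    ~ 0.64
import math

n = 100_000
samples = [max(0.0, math.sin(2 * math.pi * k / n)) for k in range(n)]
i_dc = sum(samples) / n
i_rms = math.sqrt(sum(s * s for s in samples) / n)
print(f"Idc / Irms = {i_dc / i_rms:.3f}   (2/pi = {2 / math.pi:.3f})")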

Oh - and I also found this commentary associated with a transformer being sold on ebay:

Quote
...
(1) 30 VCT TRANSFORMER 115V Prim. 30VCT @ 3A SEC. NOS. 12 Inch Long Leads, 2-7/8 H x 3-3/8 W x 2-7/8 D. Weighs 4 Pounds. Bolt pattern 2-13/16 x 1-1/2 for #10 Screws. These transformers are "House Numbered" so no specs have been found. Here is what I was able to get with some experimenting.
No Load Voltage 33 VAC
25 Ohm Load 31.8 VAC @ 1.27 A
12.5 Ohm Load 31 VAC @ 2.48A
8.3 Ohm Load 30 VAC @ 3.61 A

I'm basing my specs on two things, the 20 watts per pound rule ( 4lbs x 20 W = 80 watts) and the 10% voltage drop rule which occurs at the 3.6 Amp level. Backing off that level a little gives you 90 Watts from 30 V x 3 Amps. The transformer was operated at the 12.5 Ohm Load Level for an hour and the temperature never exceeded the 105 F level.
...
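
Plugging the seller's three load points into that 10% drop rule is just arithmetic on the numbers quoted above:

Code:
# Percent drop from the quoted 33 V no-load figure at each measured load point.
v_no_load = 33.0
points = [(31.8, 1.27), (31.0, 2.48), (30.0, 3.61)]   # (V RMS, A RMS) from the listing
for v, i in points:
    drop = 100.0 * (v_no_load - v) / v_no_load
    print(f"{i:.2f} A: {v:.1f} V, drop {drop:.1f} %, about {v * i:.0f} VA")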

 

Offline schmitt trigger

  • Super Contributor
  • ***
  • Posts: 2222
  • Country: mx
Re: How is the current rating of a transformer determined?
« Reply #8 on: April 21, 2018, 03:18:50 am »
BTW, ledtester, I also have a LED test box identical to the one that you show in your avatar.

Can't remember how much I paid for it, perhaps $5 US, and it is money extremely well spent! I use it all the time.
 

Offline ledtesterTopic starter

  • Super Contributor
  • ***
  • Posts: 3036
  • Country: us
Re: How is the current rating of a transformer determined?
« Reply #9 on: April 21, 2018, 05:02:43 am »
BTW, ledtester, I also have a LED test box identical to the one that you show in your avatar.

Can't remember how much I paid for it, perhaps $5 US, and it is money extremely well spent! I use it all the time.

Yeah, I got mine years ago when blue and white LEDs first started appearing on eBay. This was before the Chinese distributors were selling directly on eBay - you had to go through an importer. I was a little disappointed to find out that it just consisted of a bunch of different-value resistors instead of precisely controlled currents, but then I guess it would have cost more than $5.

I'm not sure you can get this particular model anymore. All I can find now are the ones with a curved end and a big red button (see attached pic).

A few other similar projects and products which might be of interest:

- http://www.robotroom.com/LED-Tester-Pro-1.html
- https://www.ebay.com/itm/Precision-LEDs-Tester-Light-Emitting-Diode-Test-Box/400158105115




« Last Edit: April 21, 2018, 05:53:55 am by ledtester »
 

Offline schmitt trigger

  • Super Contributor
  • ***
  • Posts: 2222
  • Country: mx
Re: How is the current rating of a transformer determined?
« Reply #10 on: April 23, 2018, 02:08:57 pm »
There are, of course, much better LED testers nowadays.

But for my purpose, identifying the LED color and getting a relative (visual) brightness check, it does its job.
 

Offline IanMacdonald

  • Frequent Contributor
  • **
  • Posts: 943
  • Country: gb
    • IWR Consultancy
Re: How is the current rating of a transformer determined?
« Reply #11 on: April 24, 2018, 07:02:44 am »
The main point to realise is that all rectifier and reservoir capacitor combinations have a less-than-unity power factor. The lower the series resistance of the winding and diodes, and the larger the reservoir cap, the worse this will be. Thus in real-world use with a DC load, the maximum current will be considerably less than with a resistive AC load.

The modern tendency is simply to accept this situation, but in the early days the tendency was to use a 'pi filter' consisting of a small reservoir cap, a series resistor or choke, and a smoothing cap, which reduced the transformer loading for the same smoothing effect.

I recall one model of portable TV that drew about 2 A at 12 V but could easily burn out a 6 A bridge rectifier. The reason was the use of a huge reservoir cap and a transformer with very low winding resistance. The solution was to fit a 25 A bridge.
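
To put rough numbers on the power-factor point, here's a crude time-stepped sketch of a bridge rectifier with a reservoir cap (ideal diodes, made-up component values), comparing the transformer's RMS current with the DC load current:

Code:
# Crude Euler simulation: bridge rectifier + reservoir cap + resistive load.
# All component values are made up for illustration only.
import math

V_PK = 17.0       # secondary peak volts (about 12 V RMS) -- assumed value
R_SRC = 0.5       # winding + diode series resistance, ohms -- assumed value
C = 4700e-6       # reservoir capacitor, farads -- assumed value
R_LOAD = 6.0      # load resistance, ohms (roughly a 2 A load)
F = 50.0          # mains frequency, Hz
DT = 1e-6         # time step, seconds

vc = 0.0
i_sq_sum = v_sum = 0.0
n = 0
t = 0.0
while t < 20.0 / F:                              # run 20 cycles so it settles
    v_rect = abs(V_PK * math.sin(2.0 * math.pi * F * t))
    i_chg = max(0.0, (v_rect - vc) / R_SRC)      # diodes conduct only near the peaks
    vc += (i_chg - vc / R_LOAD) * DT / C
    if t > 15.0 / F:                             # measure over the last five cycles
        i_sq_sum += i_chg * i_chg
        v_sum += vc
        n += 1
    t += DT

i_rms = math.sqrt(i_sq_sum / n)                  # transformer (secondary) RMS current
i_dc = (v_sum / n) / R_LOAD                      # DC current delivered to the load
print(f"DC load current ~ {i_dc:.2f} A")
print(f"Secondary RMS   ~ {i_rms:.2f} A")
print(f"RMS/DC ratio    ~ {i_rms / i_dc:.2f}   (short, tall current pulses)")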
« Last Edit: April 24, 2018, 07:05:38 am by IanMacdonald »
 

