### Author Topic: Confusing Results measuring Wall Wart Power Supply Performance...  (Read 5350 times)

0 Members and 1 Guest are viewing this topic.

#### Jon Chandler

• Frequent Contributor
• Posts: 539
##### Confusing Results measuring Wall Wart Power Supply Performance...
« on: March 18, 2011, 02:25:58 pm »
I'm testing a bunch of power supplies for an article I'm working on.  I made a number of measurements and realized that my instrumentation choices weren't the best, so I switched things around to optimize the results and now I am truly confused.  This is a question deep into EE territory....

My test is to measure input and output parameters vs load in 10% increments from 0 - 130% of rated load.  I'm using a Kill-A-Watt unit to measure AC input (voltage, current, watts, VARs and power factor), a Fluke 45 to measure output voltage and another meter to measure output current.  Additionally, because the little 5 V switchers draw so little current that they sit at the bottom of the Kill-A-Watt's range, I'm also using a meter to measure AC input current.

I did a bunch of readings with my Fluke 79 Series II measuring AC current, but after I completed those readings, I realized that the Fluke is not true RMS, and with the waveform of the switchers, this is probably a requirement.  I switched meters around, using a BK Precision 391A, which is true RMS, to measure the AC input current.

Now I am truly confused by the results.  If I look at input power calculated by multiplying the current read on the Fluke (non-true RMS) by the AC line voltage, the calculated result is close to the "watts" indicated by the Kill-A-Watt and my efficiency for a newish 12 volt switcher is in the mid 80% range.

On the other hand, if I calculate the input power by multiplying the BK Precision current (true RMS) by the line voltage, these readings are consistent with the VAR readings of the Kill-A-Watt, and my efficiencies are around 45%.

It doesn't make sense to me that using a true RMS current reading should give a result in VARs, since it knows nothing of the power factor, etc.  Is there some flaw in my logic?  I don't have the optimal 4 DVMs that Dave recommends, or I'd use both meters to measure input current simultaneously to verify the measurements.
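The difference between the two meters can be sketched numerically. This is a minimal Python sketch with a synthetic, assumed pulse-shaped current waveform (not measured data): an average-responding meter rectifies, averages, and scales by the sine-wave form factor, which is only correct for sine waves.

```python
import math

# Synthetic, assumed current waveform: narrow pulses near the voltage
# peaks, roughly like a capacitor-input rectifier front end. Not real data.
N = 1000
i = []
for n in range(N):
    s = math.sin(2 * math.pi * n / N)
    pulse = max(0.0, abs(s) - 0.95) / 0.05   # conducts only near the peaks
    i.append(pulse * math.copysign(1.0, s))

# A true-RMS meter computes the root of the mean of the squares.
true_rms = math.sqrt(sum(x * x for x in i) / N)

# An average-responding meter rectifies, averages, and scales by the
# sine-wave form factor pi / (2 * sqrt(2)) ~ 1.11 - valid only for sines.
avg_responding = (math.pi / (2 * math.sqrt(2))) * sum(abs(x) for x in i) / N

print(true_rms, avg_responding)
```

For a high-crest-factor waveform like this, the average-responding figure comes out well below the true RMS value, which is consistent with a non-true-RMS meter under-reading switcher input current.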

Thanks for any ideas,

Jon

The data is shown below.

#### Simon

• Global Moderator
• Posts: 17967
• Country:
• Did that just blow up? No? might work after all !!
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #1 on: March 18, 2011, 03:12:41 pm »
That's interesting. It might be worth putting a resistor in series with the AC input and putting a scope across it to see exactly what the input current draw looks like.

#### oPossum

• Super Contributor
• Posts: 1446
• Country:
• Very dangerous - may attack at any time
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #2 on: March 18, 2011, 03:19:19 pm »
Measuring RMS volts and RMS amps will allow VA to be calculated (just multiply).

Measuring AC wattage requires a watt meter because math has to be done on each sample taken, not the RMS result of many samples. The phase relationship is important - the sign of V and A at every point in time.
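The point above can be sketched in a few lines of Python. The waveform and numbers here are synthetic and assumed; the point is only that multiplying the two RMS values gives VA, while real watts need the sample-by-sample product:

```python
import math

N = 1000
# 230 V RMS mains (325 V peak), assumed
v = [325.0 * math.sin(2 * math.pi * n / N) for n in range(N)]
# Hypothetical switcher-style current: narrow pulses at the voltage peaks
i = [0.5 if abs(v[n]) > 310.0 else 0.0 for n in range(N)]
i = [x * math.copysign(1.0, v[n]) for n, x in enumerate(i)]

vrms = math.sqrt(sum(x * x for x in v) / N)
irms = math.sqrt(sum(x * x for x in i) / N)

real_power = sum(a * b for a, b in zip(v, i)) / N   # sample-by-sample, like a watt meter
apparent_power = vrms * irms                        # what two RMS meters give you
power_factor = real_power / apparent_power

print(real_power, apparent_power, power_factor)
```

Even though the current pulses are in phase with the voltage peaks, the distortion alone drags the power factor below 1, so VA exceeds watts.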

#### Jon Chandler

• Frequent Contributor
• Posts: 539
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #3 on: March 20, 2011, 12:35:34 pm »
My EE friend made some comments that woke up some dormant brain cells and I see the errors I've made.

Apparent power = VA = volts x amps.

This is exactly what I am measuring with the RMS meter measuring current, and the numbers are in good agreement with those from the Kill-A-Watt.

True power is the average of the instantaneous volts x amps product and requires a power meter.  The fact that the current readings from the non-true-RMS Fluke 79 Series II happened to agree with the true power reading from the Kill-A-Watt was a coincidence in my sample of 1.

The question still remained: when calculating efficiency (i.e., output power / input power), should I use input watts or VA?  The revelation hit: it depends on the CURRENCY being used to pay for the power.  If you are a residential customer paying the power company only for real power, your currency is cash, and the efficiency calculated from real power is the correct answer.  On the other hand, if you're making power from a generator or a sine inverter (i.e., a UPS), your currency is electrons.  The important consideration is the efficiency based on VA, since this is the power you have to supply.

A linear supply loaded above about 30% may have a power factor of 80% compared to a switcher with a power factor of around 50%.  If you're making your own power and paying for the VA with electrons, the linear supply will be the better alternative.  If your load drops down to near-zero much of the time, the switcher will win hands down.
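The two efficiency figures in this thread can be reproduced with assumed numbers in the same ballpark (illustrative only, not the measured data):

```python
# Illustrative numbers only, not measured data.
p_out = 10.0          # W delivered to the load
p_in_real = 11.8      # W, as a watt meter (e.g. Kill-A-Watt) reports
power_factor = 0.55   # assumed, typical-ish for a small switcher
p_in_va = p_in_real / power_factor  # VA the source must actually supply

eff_from_watts = p_out / p_in_real  # what a utility customer "pays for"
eff_from_va = p_out / p_in_va       # what an inverter or generator must cover

print(round(eff_from_watts, 2), round(eff_from_va, 2))  # ~0.85 vs ~0.47
```

The same supply looks mid-80% efficient in watts and mid-40% in VA, matching the two sets of readings described above.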

Jon

#### tecman

• Frequent Contributor
• Posts: 444
• Country:
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #4 on: March 20, 2011, 01:02:21 pm »
When measuring efficiency, you need to use watts in, not VA.  You pay the electric company for watts, not VA.  Likewise, a generator or other non-utility power source will consume watts, not VA.  VA can become important in some cases, as the VA will be higher than the watts if the power factor is anything less than one.  This implies that the current is higher, although not in phase.  The higher current will result in higher losses and greater I^2*R heating.  This is rarely an issue in any residential scenario; however, for large industrial users, bad power factor means that the utility must supply transformers, switchgear and wiring for currents that are disproportionately large compared to the watts consumed.  Many large industrial users pay a penalty for low power factor, and therefore try to correct the power factor to reduce what they pay the utility.
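The I^2*R point can be made concrete with a toy calculation (all numbers assumed): for a fixed real load, the current scales as 1/PF, so the wiring loss scales as 1/PF^2.

```python
p_load = 10_000.0   # W of real power, hypothetical industrial load
v_rms = 400.0       # V supply, assumed
r_wiring = 0.05     # ohms of distribution resistance, assumed

losses = {}
for pf in (1.0, 0.75, 0.5):
    i_rms = p_load / (v_rms * pf)      # current grows as PF drops
    losses[pf] = i_rms ** 2 * r_wiring # I^2 R heating in the wiring
    print(pf, round(i_rms, 1), round(losses[pf], 1))
```

Halving the power factor quadruples the heating loss in the same wiring, which is exactly why utilities care.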

paul

#### Zero999

• Super Contributor
• Posts: 19836
• Country:
• 0999
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #5 on: March 20, 2011, 04:13:51 pm »
> You pay the electric company for watts, not VA.  Likewise a generator or other non utility power source will consume watts, not VA.  VA can become important in some examples as the VA will be higher than the watts if the power factor is anything less than one.  This implies that the current is higher, although not in phase.  The higher current will result in higher losses and greater I^2*R heating.  This is rarely an issue in any residential scenario, however in large industrial users, bad power factor means that the utility must supply transformers, switchgear and wiring for currents that are disproportionately large compared to the watts consumed.  Many large industrial users pay a penalty for low power factor, and therefore try to correct the power factor to reduce costs.

If it's that inconvenient for the power companies to supply low power factor loads, then why don't they just start charging for VA used instead? People would soon do something about it when they realised they're paying a third more than they should, if their power factor is 0.75.

#### Jon Chandler

• Frequent Contributor
• Posts: 539
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #6 on: March 20, 2011, 04:53:52 pm »

> Likewise a generator or other non utility power source will consume watts, not VA.

In at least one case, I've found this to be untrue.  I have a 1200 VA (hmm...it's actually rated in VA) true-sine-wave UPS inverter, powered from 2 large deep-cycle 12 volt batteries in series, to power network gear, a tankless gas water heater and the furnace blower when the power goes off.

The furnace blower is only 1/8 horsepower or a little less than 100 watts.  However, when it's running, it puts a huge load on the UPS (more than half its rated output) because of its power factor.  Based on watts, it should only load the UPS to about 10% of its capacity.
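A rough back-of-the-envelope check, assuming a power factor of around 0.15 for the lightly loaded blower motor (the PF value is a guess for illustration, not a measurement):

```python
blower_watts = 100.0     # a little under 1/8 hp, from above
ups_rating_va = 1200.0   # the UPS is rated in VA
pf_assumed = 0.15        # assumed PF for a lightly loaded induction motor

blower_va = blower_watts / pf_assumed        # VA the inverter must supply
fraction_of_ups = blower_va / ups_rating_va

print(round(blower_va), round(fraction_of_ups, 2))  # ~667 VA, over half the rating
```

So a 100 W motor with a poor power factor can plausibly eat more than half of a 1200 VA inverter's capacity.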

The power company has to generate enough power to cover the VA load even though residential customers don't pay for it directly...but they do pay in terms of higher rates for increased generating costs and infrastructure.

#### tecman

• Frequent Contributor
• Posts: 444
• Country:
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #7 on: March 20, 2011, 09:01:29 pm »

> Likewise a generator or other non utility power source will consume watts, not VA.
>
> The furnace blower is only 1/8 horsepower or a little less than 100 watts.  However, when it's running, it puts a huge load on the UPS (more than half its rated output) because of its power factor.  Based on watts, it should only load the UPS to about 10% of its capacity.
>
> The power company has to generate enough power to cover the VA load even though residential customers don't pay for it directly...but they do pay in terms of higher rates for increased generating costs and infrastructure.

First, your motor problem is likely due to the fact that induction motors have very bad power factor, which improves and heads towards 1.0 at full load.  Lightly loaded, they can be 0.4 or less.  This means the motor is drawing high current for a light load - likely much higher current than you would think.

Power companies do oversize somewhat, but since the majority of PF issues are related to large industrial users, they make those users either correct the power factor or pay a penalty, rather than raising energy rates.  Additionally, the utilities frequently add capacitors to the grid to improve PF.  This is an additional cost to the utility, but generally not a big one.

paul

#### Kiriakos-GR

• Super Contributor
• !
• Posts: 3525
• Country:
• User is banned.
##### Re: Confusing Results measuring Wall Wart Power Supply Performance...
« Reply #8 on: March 20, 2011, 09:16:37 pm »
The Kill-A-Watt is an unreliable device for home use, not an instrument!!

Its error is too high for it to be used for comparisons.
