Confusing Results measuring Wall Wart Power Supply Performance...
Jon Chandler:
I'm testing a bunch of power supplies for an article I'm working on. I made a number of measurements and realized that my instrumentation choices weren't the best, so I switched things around to optimize the results and now I am truly confused. This is a question deep into EE territory....
My test is to measure input and output parameters vs. load, in 10% increments from 0 to 130% of rated load. I'm using a Kill-A-Watt unit to measure the AC input (voltage, current, watts, VARs and power factor), a Fluke 45 to measure output voltage and another meter to measure output current. Additionally, because the little 5 V switchers draw so little current that they're at the bottom of the Kill-A-Watt's range, I'm also using a meter to measure the AC current.
I did a bunch of readings with my Fluke 79 Series II measuring AC current, but after I completed those readings, I realized that the Fluke is not true RMS, and with the waveform of the switchers, that's probably a requirement. I switched meters around, using a BK Precision 391A, which is true RMS, to measure the AC input current.
Now I am truly confused by the results. If I look at input power calculated by multiplying the current read on the Fluke (non-true RMS) by the AC line voltage, the calculated result is close to the "watts" indicated by the Kill-A-Watt and my efficiency for a newish 12 volt switcher is in the mid 80% range.
On the other hand, if I calculate the input power by multiplying the BK Precision current (true RMS) by the line voltage, these readings are consistent with the VAR readings of the Kill-A-Watt, and my efficiencies are around 45%.
It doesn't make sense to me that using a true RMS current reading should give a result in VARs, since it knows nothing of the power factor, etc. Is there some flaw in my logic? I don't have the optimal 4 DVMs that Dave recommends, or I'd use both meters to measure the input current simultaneously to verify the measurements :)
Thanks for any ideas,
Jon
The data is shown below.
Simon:
That's interesting. It might be worth putting a resistor in series with the AC input and putting a scope across it to see exactly what the input current draw looks like.
oPossum:
Measuring RMS volts and RMS amps will allow VA to be calculated (just multiply)
Measuring AC wattage requires a watt meter, because the math has to be done on each sample taken, not on the RMS result of many samples. The phase relationship is important: the sign of V and A at every point in time.
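A minimal numeric sketch of the difference (the current waveform here is a made-up stand-in for a switcher's peaky input draw, not anyone's measured data):

[code]
import numpy as np

t = np.linspace(0, 1 / 60, 1000, endpoint=False)  # one 60 Hz mains cycle
v = 170 * np.sin(2 * np.pi * 60 * t)              # ~120 V RMS line voltage

# Crude stand-in for a switcher's input current: narrow pulses drawn
# only near the voltage peaks (rectifier feeding a bulk capacitor).
i = np.where(np.abs(v) > 150, np.sign(v) * 0.5, 0.0)

v_rms = np.sqrt(np.mean(v ** 2))
i_rms = np.sqrt(np.mean(i ** 2))

true_power = np.mean(v * i)      # average of instantaneous v*i -> watts
apparent_power = v_rms * i_rms   # VA: just multiply the two RMS values
power_factor = true_power / apparent_power

print(f"true power:     {true_power:6.2f} W")
print(f"apparent power: {apparent_power:6.2f} VA")
print(f"power factor:   {power_factor:6.3f}")
[/code]

Note that the power factor comes out well below 1 even though voltage and current are perfectly in phase here; waveform distortion alone is enough to do it.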
Jon Chandler:
My EE friend made some comments that woke up some dormant brain cells and I see the errors I've made.
Apparent power = VA = volts x amps.
This is exactly what I am measuring with the RMS meter measuring current, and the numbers are in good agreement with those from the Kill-A-Watt.
True power is the average of the instantaneous volts x amps products and requires a power meter. That the readings I got using the non-true-RMS Fluke 79 Series II to measure current agree with the true power reading from the Kill-A-Watt was a coincidence in my sample of 1.
The question still remained: when calculating efficiency (i.e., output power / input power), is the correct input figure watts or VA? The revelation hit. It depends on the CURRENCY being used to pay for the power. If you are a residential customer paying the power company only for real power, your currency is cash, and the efficiency calculated from real power is the correct answer. On the other hand, if you're making power from a generator or a sine inverter (e.g., a UPS), your currency is electrons. The important consideration is the efficiency based on VA, since this is the power you have to supply.
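To make that concrete, here's a tiny sketch with illustrative numbers roughly in line with the figures above (not my actual measurements):

[code]
output_power = 10.0  # W delivered to the DC load
input_watts = 11.8   # true power from a watt meter (Kill-A-Watt "watts")
input_va = 22.0      # apparent power: true-RMS volts x true-RMS amps

eff_real = output_power / input_watts  # ~85%: what a utility customer pays for
eff_va = output_power / input_va       # ~45%: the VA a generator or UPS must source

print(f"efficiency vs. watts: {eff_real:.1%}")
print(f"efficiency vs. VA:    {eff_va:.1%}")
[/code]

Same supply, same load; the only thing that changes is which input figure you divide by.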
A linear supply loaded above about 30% may have a power factor of 80% compared to a switcher with a power factor of around 50%. If you're making your own power and paying for the VA with electrons, the linear supply will be the better alternative. If your load drops down to near-zero much of the time, the switcher will win hands down.
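A rough back-of-the-envelope comparison (the 60% linear efficiency is an assumption for illustration; the power factors are the ballpark figures above):

[code]
p_out = 10.0  # W delivered to the load in both cases

# (efficiency, power factor) at a healthy load -- assumed, for illustration
supplies = {"linear": (0.60, 0.80), "switcher": (0.85, 0.50)}

for name, (eff, pf) in supplies.items():
    va_in = p_out / (eff * pf)  # VA the generator or inverter must supply
    print(f"{name:8s}: {va_in:5.1f} VA in for {p_out:.0f} W out")
[/code]

With numbers like those, the linear supply costs fewer electrons at a steady load, but its standing losses make the switcher the clear winner if the load sits near zero most of the time.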
Jon
tecman:
When measuring efficiency, you need to use watts in, not VA. You pay the electric company for watts, not VA. Likewise, a generator or other non-utility power source will consume watts, not VA.

VA can become important in some cases, as the VA will be higher than the watts if the power factor is anything less than one. This implies that the current is higher, although not in phase. The higher current results in higher losses and greater I^2*R heating. This is rarely an issue in any residential scenario; however, for large industrial users, bad power factor means that the utility must supply transformers, switchgear and wiring for currents that are disproportionately large compared to the watts consumed. Many large industrial users pay a penalty for low power factor, and therefore try to correct the power factor to reduce costs to the utility.
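A quick sketch of the I^2*R point (wiring resistance and load values are assumed for illustration):

[code]
watts = 1000.0  # real power delivered, the same in every case
volts = 120.0   # RMS supply voltage
r_wiring = 0.1  # ohms of feeder/wiring resistance (assumed)

for pf in (1.0, 0.8, 0.5):
    amps = watts / (volts * pf)  # required current grows as PF drops
    loss = amps ** 2 * r_wiring  # I^2*R heating in the wiring
    print(f"PF {pf:.1f}: {amps:5.2f} A, wiring loss {loss:5.2f} W")
[/code]

Halving the power factor doubles the current and quadruples the wiring loss, even though the billed watts are unchanged.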
paul