Measuring power is kind of hard. You really need an instrument for the purpose -- a Kill-A-Watt (or clone thereof) does a surprisingly good job, and is cheap. The problem is multiplying voltage and current together instantaneously, then averaging to get real power. (Averaging or RMS-ing voltage and current separately, then multiplying, gets apparent power.)
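The multiply-then-average distinction is easy to see numerically. A minimal Python sketch, using a made-up 60 Hz waveform pair (the 120 V / 0.5 A / 60° numbers are arbitrary, just to show the two results diverge):

```python
import math

def power_from_samples(v, i):
    """Given simultaneous voltage and current samples spanning whole line
    cycles, return (real_power, apparent_power)."""
    n = len(v)
    # Real power: multiply instantaneously, THEN average.
    real = sum(vk * ik for vk, ik in zip(v, i)) / n
    # Apparent power: RMS each waveform separately, THEN multiply.
    v_rms = math.sqrt(sum(vk * vk for vk in v) / n)
    i_rms = math.sqrt(sum(ik * ik for ik in i) / n)
    return real, v_rms * i_rms

# One cycle of 120 V RMS sine, with current lagging 60 degrees
# (a mostly-magnetizing load, as at a transformer primary with no load):
n = 1000
v = [math.sqrt(2) * 120 * math.sin(2 * math.pi * k / n) for k in range(n)]
i = [math.sqrt(2) * 0.5 * math.sin(2 * math.pi * k / n - math.pi / 3)
     for k in range(n)]
p, s = power_from_samples(v, i)
# p -> 30 W (= 120 * 0.5 * cos 60 deg), s -> 60 VA: same waveforms, 2x apart.
```

This is exactly the math a Kill-A-Watt (or a scope's multiply function) performs; RMS-then-multiply gives 60 VA here, double the 30 W of real power.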
Otherwise, you need a good current probe (low phase shift), and a scope with multiplication math function.
BTW, note that core loss is due to applied voltage, not load current -- you get the most sensitive measurement with zero load, so that the real power is due to only primary DCR (which you can subtract) and core loss.
Another way is to null the reactive power, then assume apparent power = real power. This misses harmonic power, but if the transformer is far from saturation, it will be small. (Saturation means a peaky current waveform, at the primary, with no load on the secondary.) You can reach a null by adding capacitance in parallel with the primary, until the input current is at a minimum. At that point, the current will be real, and V*I will be real power.
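The null capacitor can also be sized from a one-off apparent/real power measurement rather than trial and error. A sketch with assumed no-load numbers for a small mains transformer (120 V, 0.2 A, 6 W are hypothetical):

```python
import math

V = 120.0   # primary volts RMS (assumed)
I = 0.20    # no-load primary current, A RMS (assumed)
P = 6.0     # no-load real power, W (assumed)
f = 60.0    # line frequency, Hz

S = V * I                          # apparent power, VA
Q = math.sqrt(S * S - P * P)       # reactive (magnetizing) power, var
# Capacitor supplying Q locally nulls the reactive line current:
C = Q / (2 * math.pi * f * V * V)  # ~4.3 uF for these numbers
I_min = P / V                      # at the null, line current is purely real
```

With the magnetizing var demand supplied by the capacitor, the line current bottoms out at P/V, and V·I read at that minimum is (ignoring harmonics) the real power.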
Or you can use a completely different method: the transformer as its own calorimeter. Measure the temp rise, unloaded, at rated primary voltage, giving it a few hours' time to stabilize. Then, apply DC to the primary, such that it reaches the same temperature rise. (This can take a while, because you're adjusting the DC, waiting a few hours to see if you got the right temp rise, and repeating.) DC is real power by definition (there is no reactive power at 0 Hz), so by matching the power dissipation, you're using the transformer itself, and a thermometer, as a wattmeter.
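The adjust-and-wait loop amounts to a one-dimensional search on the DC voltage. A sketch of that search, standing in a simple linear thermal model (rise = R_th × P) for the hours-long physical measurement; R_DC, R_TH, and the target rise are all assumed numbers:

```python
R_DC = 8.0          # primary DC resistance, ohms (assumed)
R_TH = 12.0         # thermal resistance, K/W (assumed stand-in model)
target_rise = 30.0  # temp rise observed at rated AC voltage, K (assumed)

def temp_rise_dc(v_dc):
    # Stand-in for "apply DC, wait hours, read the thermometer":
    return R_TH * v_dc ** 2 / R_DC  # DC power is purely real, P = V^2/R

lo, hi = 0.0, 50.0
for _ in range(40):  # bisect on the applied DC voltage
    mid = (lo + hi) / 2
    if temp_rise_dc(mid) < target_rise:
        lo = mid
    else:
        hi = mid
v_dc = (lo + hi) / 2
p_total = v_dc ** 2 / R_DC  # matched dissipation: core + copper loss, W
```

In practice each "evaluation" is a multi-hour settle, which is why the method is slow; but the arithmetic at the end is just V²/R, with no phase measurement anywhere.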
(In all cases, subtract primary DCR loss. This actually underestimates the copper losses, because there will be some eddy current / skin effect / proximity effect loss or ACR increase, even at 50/60Hz. If it's not a huge transformer, it should still be within 10%, though.)
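The subtraction itself is one line; a sketch with the same assumed no-load numbers as above (6 W, 0.2 A, 8 ohms are illustrative, not measured):

```python
P_meas = 6.0   # measured no-load real power, W (assumed)
I_rms = 0.20   # no-load primary current, A RMS (assumed)
R_dc = 8.0     # primary winding resistance from a DMM, ohms (assumed)

P_copper = I_rms ** 2 * R_dc   # I^2*R on the DC resistance, ~0.32 W
P_core = P_meas - P_copper     # remainder attributed to core loss
```

Since ACR exceeds DCR (skin/proximity effects), P_copper here is a slight underestimate, so P_core is a slight overestimate; per the caveat above, within ~10% for a modest transformer at 50/60 Hz.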
Tim