Thanks, but now I'm really only interested in what the correct current measurement from an AC voltage source would be.
Please see above...
Is it the absolute average current?
Or
Is it the RMS current?
If it is the RMS current, where does the wasted energy go?
The only path is through the 1 Ω load, so how does 500 A (RMS) magically transform into 160 A (avg)? That gap represents a difference in the calculated power.
So somehow, all that energy in the load magically vanishes?
Does anyone know the answer to this?
Is this AC voltage source actually putting out only the absolute average current?
This current is about the same as the DC average current, which conserves energy.
If that is true, then why is the power at our houses metered using RMS current?
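For reference, here is a quick numerical sanity check of the two quantities being compared. The numbers are assumptions for illustration (a sinusoidal current with a 500 A peak into a 1 Ω resistive load, not necessarily the exact circuit discussed above): it computes the RMS current, the absolute (rectified) average current, and the true average power from the instantaneous dissipation i²·R.

```python
import math

# Assumed example values (not necessarily the circuit above):
I_peak = 500.0   # peak of the sinusoidal current, amps
R = 1.0          # resistive load, ohms
N = 100_000      # samples over one full cycle

# Sample one full cycle of i(t) = I_peak * sin(wt)
samples = [I_peak * math.sin(2 * math.pi * k / N) for k in range(N)]

# RMS current: sqrt of the mean of i^2  (≈ I_peak / sqrt(2) for a sine)
I_rms = math.sqrt(sum(i * i for i in samples) / N)

# Absolute (rectified) average current  (≈ 2 * I_peak / pi for a sine)
I_avg = sum(abs(i) for i in samples) / N

# True average power: average the instantaneous dissipation p = i^2 * R
P_true = sum(i * i * R for i in samples) / N

print(f"I_rms  = {I_rms:.1f} A")
print(f"I_avg  = {I_avg:.1f} A")
print(f"P_true = {P_true:.0f} W")
print(f"I_rms^2 * R = {I_rms**2 * R:.0f} W")   # matches P_true
print(f"I_avg^2 * R = {I_avg**2 * R:.0f} W")   # does NOT match P_true
```

The check shows why no energy "vanishes": the average power actually dissipated in the resistor equals I_rms² · R, while I_avg² · R understates it. RMS is defined precisely so that it gives the DC-equivalent heating current, which is why utility metering is based on RMS values.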