Just wondering what the best way is to measure the steady-state electrical efficiency of a low-voltage LED driver.
I haven't got a programmable load.
I'm assuming the waveforms (Vin, Iin, Vout, Iout) are not going to be pure DC or low-frequency AC sinusoids.
Vin should be solid in this case at 24 V.
I'm assuming Iin, Vout and Iout have a non-zero AC component on top of the DC component. The switching frequency of the driver is around 100 kHz.
What would be a good way to measure the efficiency?
1% accuracy would be nice, but I doubt I could get that off my CRO anyway.
I have two meters, a Fluke and a B&K Precision, plus an old HP 1742A scope and another old 20 MHz analogue CRO.
I can get access to a couple more cheap meters too, if that would help.
I was thinking of just capturing all the non-pure-DC waveforms and multiplying them out point by point to get both input and output power as functions of time, then averaging those values to get average input and output power.
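The capture-multiply-average idea can be sketched like this, just to check the arithmetic. This is a minimal illustration with made-up synthetic waveforms standing in for scope captures (all the values and names are hypothetical); the one real gotcha it demonstrates is averaging over a whole number of switching periods so the ripple terms cancel.

```python
import numpy as np

# Hypothetical capture: 10 MS/s for 1 ms = exactly 100 periods of a
# 100 kHz switching waveform (a whole number of periods matters).
fs = 10e6
t = np.arange(0, 1e-3, 1 / fs)

# Synthetic stand-ins for the four captured waveforms:
v_in  = np.full_like(t, 24.0)                        # solid 24 V input rail
i_in  = 0.5  + 0.05 * np.sin(2 * np.pi * 100e3 * t)  # DC plus 100 kHz ripple
v_out = 36.0 + 0.20 * np.sin(2 * np.pi * 100e3 * t)  # LED string voltage
i_out = 0.3  + 0.02 * np.sin(2 * np.pi * 100e3 * t)  # LED current

# Multiply point by point, then average: P = mean(v(t) * i(t)).
p_in  = np.mean(v_in * i_in)
p_out = np.mean(v_out * i_out)
eff   = p_out / p_in
print(f"Pin = {p_in:.3f} W, Pout = {p_out:.3f} W, eff = {eff:.1%}")
```

Note that Pout comes out slightly above 36 V x 0.3 A = 10.8 W because the in-phase ripple on Vout and Iout contributes real power; averaging v(t)*i(t) captures that, whereas multiplying meter DC readings would miss it. If the capture window isn't an integer number of switching periods, the ripple terms don't cancel cleanly and bias the average.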
Anyone got a better idea?