I have a step-up transformer (1:10 turns ratio) driving a test load of 46 nF + 16 Ω. The idea is that near resonance (~54 kHz here) it will draw minimal current/power.
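As a sanity check, here's a quick back-of-envelope sketch in Python of what inductance a 54 kHz resonance with 46 nF would imply. It assumes the dip comes from the load capacitance resonating with the transformer inductance as seen from the secondary; that mechanism is my assumption, not something taken from the measurements.

```python
import math

C_load = 46e-9   # load capacitance (from the test load above)
f_res = 54e3     # approximate resonance seen in simulation

# Inductance (referred to the secondary) that resonates with 46 nF at 54 kHz.
L_sec = 1.0 / ((2 * math.pi * f_res) ** 2 * C_load)
# With a 1:10 turns ratio, the same inductance referred to the primary is /100.
L_pri = L_sec / 10 ** 2

print(f"Secondary-referred inductance: {L_sec * 1e6:.0f} uH")
print(f"Primary-referred inductance:   {L_pri * 1e6:.2f} uH")
```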
I've measured the transformer with an LCR meter (in the frequency range of interest) and, including the parasitics (leakage inductance), arrived at the following circuit:

(I measured leakage by shorting one side of the transformer and measuring the inductance of the other, and vice versa; is that right?)
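For what it's worth, this is the relationship I'm assuming between those two measurements and the usual T-model (a Python sketch; the Lm/Ll1/Ll2 values are made up purely to illustrate what the open- and short-circuit LCR readings correspond to):

```python
# T-model of the transformer: primary leakage Ll1, secondary leakage Ll2,
# magnetizing inductance Lm on the primary side. All values are illustrative.
n = 10.0        # turns ratio (1:10)
Lm = 2e-3       # magnetizing inductance, primary side (made up)
Ll1 = 5e-6      # primary leakage (made up)
Ll2 = 500e-6    # secondary leakage (made up)

Ll2_ref = Ll2 / n ** 2   # secondary leakage referred to the primary

# Secondary open, measure the primary: magnetizing plus primary leakage.
L_open = Ll1 + Lm

# Secondary shorted, measure the primary: primary leakage in series with
# (Lm in parallel with the reflected secondary leakage). Because Lm is much
# larger, this is roughly the total leakage referred to the primary.
L_short = Ll1 + (Lm * Ll2_ref) / (Lm + Ll2_ref)

print(f"Primary, secondary open : {L_open * 1e6:.0f} uH")
print(f"Primary, secondary short: {L_short * 1e6:.1f} uH  (~ Ll1 + Ll2/n^2)")
```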
I'm sensing current with a small resistor (0.05 Ω) on the ground side.
This all makes sense: the simulation shows a dip in current at around 54 kHz. I would have expected the output to be flatter, but I guess the leakage inductance has a big effect. The result is here:

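For comparison, this is roughly how I'd reproduce that sweep outside the simulator: a lumped model with everything referred to the primary (source resistance, total leakage, magnetizing inductance, reflected C+R load, plus the sense resistor). Only the load and shunt values below come from the actual setup; the source and transformer numbers are placeholders, with the magnetizing inductance picked just so the model's dip lands near 54 kHz.

```python
import numpy as np

# Lumped model referred to the primary side. Load and shunt values are from
# the setup above; the source and transformer values are placeholders.
n      = 10.0            # turns ratio
R_sns  = 0.05            # current-sense shunt on the ground side
R_src  = 0.1             # assumed amplifier output resistance (placeholder)
L_leak = 0.5e-6          # assumed total leakage referred to primary (placeholder)
L_mag  = 1.9e-6          # assumed magnetizing inductance (placeholder, chosen
                         # so it anti-resonates with the reflected load ~54 kHz)
C_ref  = 46e-9 * n ** 2  # 46 nF load reflected to the primary
R_ref  = 16.0 / n ** 2   # 16 ohm load reflected to the primary

f = np.logspace(np.log10(10e3), np.log10(200e3), 1000)
w = 2 * np.pi * f

Z_load = R_ref + 1 / (1j * w * C_ref)            # reflected series R-C load
Z_mag  = 1j * w * L_mag                          # magnetizing branch
Z_in   = R_src + R_sns + 1j * w * L_leak + (Z_mag * Z_load) / (Z_mag + Z_load)

I_in = 1.0 / Z_in                                # primary current for 1 V drive
print(f"Model's current dip is near {f[np.argmin(np.abs(I_in))] / 1e3:.0f} kHz")
```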
Anyway, when I build and measure it (source is an old power amp, not quite flat but OK), I get the following:


The gain looks reasonable, but why isn't the current dipping near resonance? It's also trending up with frequency rather than down.
Clearly something is wrong with my model or my build; any ideas? I've double-checked everything I could think of. The waveforms are clean and undistorted.