I'm trying to design a HV differential probe as a project to teach me more about op amps (so far I've only worked with simple tens-of-kHz amplifiers). After some background reading, I settled on the following target specs:
±500 V input (AC+DC)
20 MHz bandwidth (−3 dB)
±5 V output (100:1 attenuation)
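As a sanity check on how these specs interact (my own arithmetic, not from any datasheet), the attenuation fixes the output swing, and the bandwidth plus that swing imply a minimum slew rate at the output stage via SR = 2πfA:

```python
import math

V_IN_PEAK = 500.0      # volts
ATTENUATION = 100.0    # 100:1
BANDWIDTH = 20e6       # hertz

v_out_peak = V_IN_PEAK / ATTENUATION  # expected output peak in volts

# Slew rate a full-amplitude sine at the -3 dB corner demands: SR = 2*pi*f*A
sr_required = 2 * math.pi * BANDWIDTH * v_out_peak  # volts per second

print(f"output peak: {v_out_peak:.1f} V")                      # 5.0 V
print(f"required slew rate: {sr_required / 1e6:.0f} V/us")     # ~628 V/us
```

That ~628 V/µs figure is only the large-signal requirement for a full 5 V-peak sine at 20 MHz; smaller output swings demand proportionally less.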
And I've created the following circuit as a starting attempt:

The op amps were chosen for their high input impedance, high bandwidth, and relatively low cost. When I perform an AC analysis, the gain looks very flat, with <1 dB of ripple: the plot below suggests I should expect output amplitudes very close to 5 V up to just over 20 MHz.

However, when I run a transient simulation with a 20 MHz input, the output is much smaller than with a 5 MHz input.
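One thing I considered while comparing the two runs: a transient simulation includes large-signal effects like slew-rate limiting, which an AC analysis does not. To see what a rate-limited stage would do to a sine at these two frequencies, I put together this rough numerical sketch (the 300 V/µs limit here is an arbitrary number for illustration, not a figure from my chosen op amps):

```python
import math

def slew_limited_peak(freq_hz, amp_v, slew_v_per_s,
                      cycles=10, steps_per_cycle=2000):
    """Track a sine input with a rate-limited follower and return the
    steady-state output peak, measured over the final cycle only."""
    dt = 1.0 / (freq_hz * steps_per_cycle)
    out = 0.0
    peak = 0.0
    n = cycles * steps_per_cycle
    for i in range(n):
        target = amp_v * math.sin(2 * math.pi * freq_hz * i * dt)
        # Move toward the target, but never faster than the slew limit.
        max_step = slew_v_per_s * dt
        out += max(-max_step, min(max_step, target - out))
        if i >= n - steps_per_cycle:
            peak = max(peak, abs(out))
    return peak

SR = 300e6  # hypothetical 300 V/us slew rate

print(f"5 MHz peak:  {slew_limited_peak(5e6, 5.0, SR):.2f} V")   # tracks ~5 V
print(f"20 MHz peak: {slew_limited_peak(20e6, 5.0, SR):.2f} V")  # clearly smaller
```

With that limit, a 5 V sine at 5 MHz (which only needs ~157 V/µs) is tracked fine, while at 20 MHz (needing ~628 V/µs) the output collapses toward a smaller triangle wave, which at least resembles what I'm seeing.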
20 MHz:

5 MHz:

Why do the two simulation types in LTspice disagree? Or rather, how should I be interpreting each of these simulations? Presumably I'm misunderstanding what the AC analysis output actually represents.
I've also attached the LTspice schematic file in case anyone would like to try it.