I understood that one accurate way of analyzing a signal coming from a DUT with a 50 ohm output impedance on a scope that has only a 1 Mohm input is to use a 50 ohm feed-through terminator at the scope input, in essence putting 50 ohms in parallel with the scope's 1 Mohm input so that the DUT sees very nearly 50 ohms (about 49.9975 ohms). If the input signal is a pure 10 MHz sine wave, would the scope still give an accurate Vp-p reading? I tried this on my Rigol 11002D using an accurate Tektronix 10x (20 dB) 50 ohm feed-through attenuator, with the scope's input attenuation set to 10x and a direct 50 ohm BNC cable from the generator. With a verified, calibrated signal generator set to precisely 10 MHz at 20.0 mW (+13 dBm), I confirmed the output level on an HP 438A power meter. I then verified that the 10x attenuator, inserted in line, dropped the +13 dBm signal to -7 dBm on the power meter.
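(In case it helps anyone checking my numbers, here is a quick Python sketch of the arithmetic I'm assuming for the termination value and the power levels; the resistor and generator values are just the nominal ones above.)

import math

# Rough sanity check of the numbers above (nominal values assumed)
R_scope = 1e6                                  # scope 1 Mohm input
R_term = 50.0                                  # feed-through termination
R_parallel = 1 / (1/R_term + 1/R_scope)
print("DUT sees %.4f ohms" % R_parallel)       # ~49.9975 ohms

P_gen_dBm = 10 * math.log10(20.0)              # 20.0 mW -> about +13.0 dBm
P_pad_dBm = P_gen_dBm - 20.0                   # 10x pad = 20 dB -> about -7 dBm
print("Generator %.1f dBm, after pad %.1f dBm" % (P_gen_dBm, P_pad_dBm))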
However, on the Rigol I found the -7 dBm signal reading about 2x higher in Vp-p than expected. I thought it was a problem with the Rigol, so I repeated the experiment on my old vintage Tek 7904A, also using a high-impedance vertical amp with the 50 ohm 10x feed-through, and again I found the signal at twice the Vp-p it should have been.
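(For reference, the "expected" Vp-p I was comparing against comes from the -7 dBm level dissipated in 50 ohms; this is a minimal sketch assuming a clean sine wave and all the power developed in the termination.)

import math

P_W = 1e-3 * 10 ** (-7.0 / 10)      # -7 dBm expressed in watts, ~0.2 mW
R = 50.0                            # load the power meter assumes
V_rms = math.sqrt(P_W * R)          # ~0.1 V rms
V_pp = 2 * math.sqrt(2) * V_rms     # ~0.28 Vp-p for a sine wave
print("Expected about %.0f mVp-p; both scopes showed roughly double" % (V_pp * 1e3))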
Is the principle of using a 50 ohm feed-through on a high-impedance scope input just unreliable for measuring signal voltage levels with any degree of accuracy? Is the problem that the scope still sees an impedance mismatch?
tnx
john