What is the best input mode on the GTC, and what is the best way to measure a signal before applying it to the device?
I haven't fully explored all the options, but what is the default input impedance, and in that mode, what is the highest safe voltage level, the 3.5 V listed below? I have power meters, including HP sensors, but I believe they are all 50 ohm. I would like to check the voltage first because, for instance, my 5061B cesium PPS is 10 V, if I remember correctly. If the GTC can be run in 50 ohm mode, is that the preferred, safest mode, and if so, do the 50 ohm numbers below apply? Or would it be better to measure the signal with a scope in high-impedance mode first? I can't remember whether the HP power meters read a PPS accurately. I know some of the sensors measure heating, but probably not the ones I have.
Absolute maximum short term DC input level: between -10 and +10 Volt
Recommended maximum DC input level: between -3.5 and +3.5 Volt
Absolute maximum short term AC input level: 10 Vpp
Recommended maximum AC input level: 3.5 Vpp <--- this one and the one above seem to contradict each other.
Absolute maximum short term power into 50 ohm: +20 dBm
Recommended maximum power into 50 ohm: +10 dBm
Trigger level: between 0 and 3.5 Volt
Input frequency A input: between 0.1 Hz and 7 GHz (for ZC407) or 12 GHz (for ZC412)
Input frequency B input: between 0.1 Hz and 350 MHz (max 50 MHz without prescaler)
Input trigger maximum time interval: No limit when measuring timestamp.
Input impedance: 1 MOhm/2 pF or 50 ohm (above 100 kHz)
Input hysteresis: 5 mV
Minimum input level in 50 Ohm: -20 dBm
Power meter range in 50 ohm input mode: -20 dBm to +20 dBm between 100 kHz and 1 GHz (see frequency dependent power level deviation chart below)
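Part of my worry comes from simple arithmetic: if I treat the top of the PPS pulse as a DC level into a 50 ohm termination, power is V^2/R. By my own back-of-envelope numbers (not from the GTC spec), 10 V into 50 ohm works out to roughly +33 dBm, well above the +20 dBm absolute maximum above, which is why I want to measure before connecting. A quick sketch of that conversion:

```python
import math

# Back-of-envelope check (my own arithmetic, not from the GTC manual):
# power delivered by a DC level V into a 50 ohm termination, in dBm.
def dc_power_dbm(volts, r_ohms=50.0):
    watts = volts ** 2 / r_ohms
    return 10 * math.log10(watts / 1e-3)  # dBm is referenced to 1 mW

# If the 5061B really puts 10 V on its PPS output into 50 ohm:
print(round(dc_power_dbm(10.0), 1))  # ~33.0 dBm, above the +20 dBm absolute max
# Even the recommended 3.5 V DC level dissipates a lot in 50 ohm:
print(round(dc_power_dbm(3.5), 1))   # ~23.9 dBm
```

Of course the duty cycle of a PPS is tiny, so the average power is far lower than this peak figure, but I assume it's the instantaneous level that matters for the input protection.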
Thanks,
Jerry