Just to fill in the "Sanity Check" database, I pulled my 53310A off the shelf and ran some tests as well. First of all, it's good practice to take screenshots (either via GPIB or photo) after pressing the "Status" button, since this displays all relevant configuration info of the MDA in the right-hand menu area.
I found that the "quality" of the input signal can make a big difference to the standard deviation of the measured frequency. Always choose your input and configure it as required. In all tests I used the "find 50% threshold" function to get good trigger quality. I also found that a fast square wave at the input yields a significantly better standard deviation than a sine. Since the internal standard timebase is more than an order of magnitude worse jitter-wise than a high-precision external reference (I used a Rubidium reference equipped with an Efratom LPRO101), it's always a good idea to use a good external one. I cannot comment on the internal ovenized oscillator option of the 53310A, since my specimen came without it.
To sanity-check the MDA stand-alone, it's sufficient to route the "REF Out" at the rear of the instrument to the input(s). Even without an external reference connected, I found the standard deviation in fast histogram mode to be in the ballpark of 10mHz (the timebase is irrelevant in this mode, since the measured signal is derived from the reference itself and timebase jitter is common to both).
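If you log readings over GPIB instead of reading the standard deviation off the screen, the same figure can be reproduced offline. A minimal sketch (the readings below are made-up illustrative numbers around a 10 MHz carrier, not my measured data; the instrument's actual remote commands are in its programming manual and not shown here):

```python
import statistics

# Illustrative loopback readings in Hz, spread a few mHz around 10 MHz.
# These values are invented for the sketch, not taken from the tests above.
readings = [10e6 + d for d in (0.004, -0.011, 0.009, -0.013, 0.002,
                               0.012, -0.006, 0.008, -0.010, 0.005)]

# Sample standard deviation, converted from Hz to mHz.
sigma_mhz = statistics.stdev(readings) * 1e3
print(f"std dev: {sigma_mhz:.1f} mHz")  # prints "std dev: 9.2 mHz"
```

Note the use of the sample standard deviation (`stdev`, n-1 divisor), which is the appropriate estimator for a finite series of readings.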
I did the following tests as shown in the attached photos in sequence:
1. Channel A connected to REF Out at the rear of the MDA. REF In connected to a 7dBm sine output of Rubidium source. Fast histogram mode.
2. Channel B connected to REF Out at the rear of the MDA. REF In connected to a 7dBm sine output of Rubidium source. Fast histogram mode.
3. Channel A connected to 5Vpp square output of Rubidium source. REF In left open (internal standard reference of MDA used). Fast histogram mode.
4. Channel A connected to 5Vpp square output of Rubidium source. REF In connected to a 7dBm sine output of Rubidium source. Fast histogram mode.
Findings: For all practical purposes, there's no difference between the A and B inputs on my MDA. The standard deviation when measuring REF Out differs only slightly between the external and internal reference oscillator (~10mHz vs. ~7.5mHz). If the internal reference is used while a highly accurate external signal is measured, the standard deviation is far worse due to jitter of the reference (~250mHz with the standard internal reference). Signal quality at the input, especially level and rise/fall times, (obviously) makes a difference in measured jitter (7dBm sine ~7.5mHz, 5Vpp square ~5.3mHz).
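The figures above let one estimate how much of the 250mHz is contributed by the internal reference alone. Assuming the signal and reference noise are independent and therefore add in quadrature (an assumption on my part, not something the measurements prove), a quick back-of-the-envelope calculation:

```python
import math

# Standard deviations in mHz, same 5Vpp square signal in both cases:
sigma_ext = 5.3    # external Rubidium reference (test 4)
sigma_int = 250.0  # internal standard reference (test 3)

# Assuming independent noise sources add in quadrature, the internal
# reference's own contribution is approximately:
sigma_ref = math.sqrt(sigma_int**2 - sigma_ext**2)
print(f"{sigma_ref:.1f} mHz")  # prints "249.9 mHz"
```

In other words, the internal timebase completely dominates the result, which is why an external reference pays off so clearly.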
Previous experiments and measurements with this instrument revealed that the MDA is so sensitive that even the BNC cables used can make a difference, despite the relatively high signal levels being measured. For certain highly accurate timing measurements there are few, if any, modern replacements that perform better, especially considering the "street prices" one has to pay for the MDA...
Cheers,
Thomas
P.S. Options 30 and 31 are mutually exclusive, yet Option 31 provides the functionality of Option 30, though with different hardware.