Hi All
I've just bought a Pendulum CNT-90 but I can't seem to get it to work as expected. I'm feeding a 10 MHz square wave at 280 mV pp from CH1 of my Siglent SDG1025, which is clocked by my BG7TBL GPSDO, into the 10 MHz Ext Ref input. The display shows EXT REF, so it has accepted the signal.
Then I connected CH2 of my sig gen to Input A on the CNT-90 and supplied a 10 MHz sine wave at 5 V.
The issue is that whatever gate speed/resolution I choose, the last couple of digits on the CNT-90 dance around by about 20 counts. I had assumed that if the ext ref is essentially the same clock as the input, the display should be spot on, e.g. 10.000 000 000 000, but even on faster gate times like 100 ms the last couple of digits dance around, effectively showing more error for shorter gate times.
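For reference, here's my rough back-of-envelope of what resolution to even expect per gate time. I'm assuming a single-shot time resolution of about 100 ps, which is my reading of the CNT-90 spec sheet (please correct me if that's wrong):

```python
# Back-of-envelope resolution of a reciprocal counter.
# ASSUMPTION: ~100 ps single-shot time resolution (my reading of
# the CNT-90 data sheet -- check your own manual).
def freq_resolution(f_in, gate_s, t_res=100e-12):
    """Approximate single-measurement frequency resolution in Hz."""
    return f_in * t_res / gate_s

for gate in (10e-3, 100e-3, 1.0):
    print(f"gate {gate*1e3:6.0f} ms -> ~{freq_resolution(10e6, gate):.0e} Hz")
```

So at a 100 ms gate I'd only expect roughly 10 mHz resolution anyway, and the shorter the gate, the worse it gets.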
I've played around with different voltage levels for the ext ref and with sine vs. square, and while these make a small difference, the display doesn't zero out.
Am I missing something here?
Thanks
Steve
This question also concerns me.
I use an external 10 MHz source and split the signal, feeding it both into the external reference input and into the measurement input of the Tek FCA 3000 (aka Fluke PM6690 / Spectracom CNT-90).
If I collect measured values over a longer period (in this example: 1 second gate time, n = 1500 samples, trigger set to MANUAL) and take the mean, I expect exactly 10 MHz, possibly ±2 on the LSD.
My assumption is that error sources such as jitter of the 10 MHz source, trigger errors in the counter, and other noise have random properties.
That should rule out the same deviation being measured every time.
But no matter which 10 MHz source I try, the FCA 3000 always measures slightly high in this configuration.
I'm not sure whether I'm making a mental error or the counter has a systematic measurement error. I can't find a good explanation for it.
The question is also, which measurement is correct?
The pictures in the appendix show the statistics with Smart Frequency activated and, below, without Smart Frequency.
With Smart Frequency, i.e. continuous timestamping and linear regression, the error is of course pushed further down, into the range of a few µHz.
But the reading is always slightly above 10 MHz. I have repeated this process many times and have never seen a mean value below 10 MHz.
With purely random errors, however, readings below 10 MHz should occur just as often. Or am I wrong?
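To convince myself that zero-mean timestamp noise should not bias a regression-based ("Smart Frequency" style) estimate, I sketched a small simulation. All numbers here are invented and purely illustrative, not taken from the instrument:

```python
# Sketch: frequency from continuous timestamps via linear regression,
# as a timestamping counter does in "smart frequency" mode.
# ASSUMPTION: white, zero-mean timestamp jitter; parameters invented.
import random

random.seed(1)
f_true = 10e6        # Hz, the signal we "measure"
jitter = 100e-12     # 100 ps rms timestamp noise
n = 1000             # timestamps over a 1 s interval

# Cycle count k at each timestamp t = k/f_true + noise
ks = [i * f_true / n for i in range(n)]
ts = [k / f_true + random.gauss(0, jitter) for k in ks]

# Least-squares slope of cycle count vs. time = frequency estimate
mt = sum(ts) / n
mk = sum(ks) / n
f_est = sum((t - mt) * (k - mk) for t, k in zip(ts, ks)) \
      / sum((t - mt) ** 2 for t in ts)

print(f"estimated {f_est:.6f} Hz, error {f_est - f_true:+.3e} Hz")
```

Running this repeatedly with different seeds, the estimate scatters both above and below 10 MHz, which is exactly what I would expect from random errors, and exactly what I do not see on the real counter.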
Perhaps the experts here in the forum can explain this well.
Jörg