EEVblog Electronics Community Forum
Products => Test Equipment => Topic started by: Stuey on April 02, 2014, 08:13:31 pm
-
My new X2024 has a probe compensation frequency of 1.0012 kHz, and I'm not sure why. I checked it with a DMM, and it agreed with this value, so the issue seems to be with the signal and not the way the scope software measured it.
I just find it strange, since my Rigol has a spot-on test frequency of 1.0001 kHz.
Any ideas as to why this might be? Thanks!
-
My 3054A is 1.0012 kHz too, not that I'd ever bothered to measure it accurately until now.
It's probably just some convenient integer divisor of an internal clock somewhere. Nothing to worry about.
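A toy illustration of that divisor idea (the 25 MHz clock below is a pure guess for illustration, not anything from the scope's schematic): an integer division of a round master clock rarely lands on exactly 1.0000 kHz, but it can land exactly on the observed 1.0012 kHz.

```python
# Hypothetical sketch: a probe-comp output generated by dividing a master
# clock by an integer N runs at f_clock / N. The 25 MHz value is an
# assumption, not the scope's actual design.
def closest_divided_freq(f_clock: float, f_target: float):
    """Integer divider whose output is closest to f_target, and that output."""
    n = round(f_clock / f_target)
    return n, f_clock / n

# A divider that lands at ~1001.2 Hz is perfectly plausible:
n, f = closest_divided_freq(25e6, 1001.2)
print(f"/{n} -> {f:.4f} Hz")
```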
-
My new X2024 has a probe compensation frequency of 1.0012 kHz, and I'm not sure why. I checked it with a DMM, and it agreed with this value, so the issue seems to be with the signal and not the way the scope software measured it.
I just find it strange, since my Rigol has a spot-on test frequency of 1.0001 kHz.
Any ideas as to why this might be? Thanks!
As on all scopes, this signal is NOT intended to be a frequency reference or a voltage reference. It is ONLY designed for compensating the frequency response of passive 10x probes, so its frequency and amplitude are typically specified as *approximate* values.
-
My 3054A is 1.0012 kHz too, not that I'd ever bothered to measure it accurately until now.
It's probably just some convenient integer divisor of an internal clock somewhere. Nothing to worry about.
Thanks! I'll push it from my mind.
As on all scopes, this signal is NOT intended to be a frequency reference or a voltage reference. It is ONLY designed for compensating the frequency response of passive 10x probes, so its frequency and amplitude are typically specified as *approximate* values.
I know, but the test signal is typically more exact. Thanks for the reassurance!
-
Hey Stuey - you got me thinking about the stability of the measurement, so I tried a couple of things, and I think the 1 kHz cal signal is spot on; it's the 'Meas' function that is not very stable.
The first pic (using 'Meas', normal acq mode) shows 3 V p-p instead of 2.5. It looks like it is measuring the noise as well. It shows a frequency of 1.0011 to 1.0012 kHz, where the last digit fluctuates.
The second pic (using 'Meas', averaging acq mode) shows 2.48 V p-p (that's better!). It shows a stable frequency of 1.0015 kHz (hmmm... not as good...).
The third pic (using 'Cursors', normal acq mode) shows 2.50000 V p-p and a stable frequency of 1.0000 kHz.
The fourth shows the same as the third, but using averaging acq mode (much easier to set the peak-to-peak cursors accurately).
The cursors have to be positioned manually, but on a stable signal like the cal output they can be dialed in very precisely, and I think they give a much better answer than the Meas function does.
I am glad you asked this question to get me thinking!
Sooo... Does my explanation hold up to Engineering scrutiny? (as a non-engineer, I often misinterpret findings like this...)
-
In general, the built-in measurements should be more accurate, because they should be operating on samples with a finer timing interval than what's shown on the display. For example, the display might have 1000 points across it, while the acquisition will have many more samples available over the same time period.
Of course, I say "should" because some scopes make measurements on the displayed data points instead of the full set of acquired waveform points. I'm not sure how yours works.
Also, many scopes can give you an average/mean of the measured value, which should lend some stability and noise reduction to your measurement.
Flickering least significant digits in a measurement is absolutely normal due to noise, etc.
Cursor measurements appear more stable because they are often limited to the screen resolution, which is much coarser than the waveform sample timing resolution, so noise won't change the values.
And finally, the probe compensation signal is built with a very simple (inexpensive) oscillator, so its frequency is very often not exact or stable. Thus, the automatic measurement is likely correct.
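A deterministic toy model of that pixel-vs-sample point (every number below is invented for illustration): the same jittery edge-crossing times flicker between readings when taken at the acquisition sample spacing, while the coarser pixel grid a cursor is confined to swallows the jitter entirely.

```python
# All numbers invented for illustration: four noisy edge-crossing times
# around a ~998.8 us period, read at the acquisition sample spacing and
# then snapped to a coarser display-pixel spacing.
SAMPLE_DT = 1e-6   # assumed 1 MSa/s acquisition grid
PIXEL_DT = 2e-6    # assumed 2 us of screen time per pixel

def snap(t: float, grid: float) -> float:
    """Quantise a time to the nearest grid step."""
    return round(t / grid) * grid

noisy_edges = [998.3e-6, 998.9e-6, 998.6e-6, 998.1e-6]

meas_readings = {snap(t, SAMPLE_DT) for t in noisy_edges}  # what 'Meas' works from
curs_readings = {snap(t, PIXEL_DT) for t in noisy_edges}   # what a cursor can show

print(sorted(meas_readings))  # two distinct values -> the flickering last digit
print(sorted(curs_readings))  # one value -> looks perfectly stable
```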
-
The frequency resolution will be good when you are using a hardware-triggered frequency counter. I'm not sure if the 2000X has one.
All other measurements will be calculated in software from the sampled signal on the screen, and will be subject to errors caused by the sampling frequency and the resolution of the analog-to-digital converter. The unchanging 1.0000 kHz reading appears because the cursors are placed exactly 1 ms apart; the same goes for 2.50000 V p-p. If you adjust them so they are one pixel out, a large error is introduced.
You can reduce the error in your cursor measurements by making one cycle of the signal take up more of the screen. Adjust the vertical scale so the top and bottom of the signal are almost at the top and bottom of the screen, and adjust the horizontal scale and offset so you have just one rising and one falling edge on screen.
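A quick back-of-envelope for that one-pixel error, with an assumed ~500-pixel-wide display (the real pixel count will differ): zooming in so one cycle fills the screen instead of two roughly halves the error.

```python
# Assumed display width; real scopes differ.
PIXELS_ACROSS = 500
F_TRUE = 1001.2  # Hz, the value measured earlier in the thread

def one_pixel_error_hz(sweep_s: float) -> float:
    """Frequency error if one period cursor is misplaced by a single pixel."""
    pixel_t = sweep_s / PIXELS_ACROSS
    period = 1.0 / F_TRUE
    return abs(1.0 / (period + pixel_t) - F_TRUE)

# Two cycles across the screen (2 ms sweep) vs. one cycle (1 ms sweep):
print(f"{one_pixel_error_hz(2e-3):.2f} Hz")  # ~4 Hz
print(f"{one_pixel_error_hz(1e-3):.2f} Hz")  # ~2 Hz
```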
-
I once set up a 4-channel scope with the cal input and various other 1 kHz waveforms for a post here on the forum.
The various waveforms were supplied from a (new) AWG.
To get stable waveforms with simple triggering (one channel), the AWG had to be tweaked by ~10-20 Hz.
In some CROs, however, the cal voltage and/or frequency WAS adjustable.
IME the cal frequencies DO vary a bit, but as has been said, it is not a precision frequency reference and should NOT be thought of as one.
But it's handy sometimes if you haven't got a sig gen or AWG.
-
In general, the built-in measurements should be more accurate, because they should be operating on samples with a finer timing interval than what's shown on the display. For example, the display might have 1000 points across it, while the acquisition will have many more samples available over the same time period.
Of course, I say "should" because some scopes make measurements on the displayed data points instead of the full set of acquired waveform points. I'm not sure how yours works.
Also, many scopes can give you an average/mean of the measured value, which should lend some stability and noise reduction to your measurement.
Flickering least significant digits in a measurement is absolutely normal due to noise, etc.
Cursor measurements appear more stable because they are often limited to the screen resolution, which is much coarser than the waveform sample timing resolution, so noise won't change the values.
And finally, the probe compensation signal is built with a very simple (inexpensive) oscillator, so its frequency is very often not exact or stable. Thus, the automatic measurement is likely correct.
Thanks for the insight, Alan. I put a 1 kHz wave gen signal on channel 2 and did the same exercise. With this (presumably) more precise signal, the Meas function gives me 1.0000 kHz with no last-digit flutter, which corroborates what you are saying. Thanks for setting me straight!
-
I always regard built-in measurements as nothing more than a quick and easy convenience. They are NOT guaranteed to be accurate, and they're certainly not as accurate as the measurements you can make with manual cursors.
Bear in mind that not only are they relatively inaccurate, but averaging them over time doesn't guarantee an improvement either. If a measurement is systematically out because of, say, a quantisation error, then averaging won't reduce that error.
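A sketch of why averaging can't rescue a systematic quantisation error (all numbers invented): when the random noise is much smaller than the quantisation step, every reading rounds the same way, so the average inherits the full error. Only when the noise is comparable to the step (dithering) can averaging buy back resolution.

```python
import random
import statistics

random.seed(0)
F_TRUE = 1001.2  # Hz, invented for illustration

def one_hz_reading() -> int:
    """A reading from an instrument that rounds to whole hertz; the random
    noise here is far smaller than the 1 Hz quantisation step."""
    return round(F_TRUE + random.gauss(0.0, 0.05))

avg = statistics.mean(one_hz_reading() for _ in range(10_000))
print(avg)  # stays pinned at 1001, 0.2 Hz low, no matter how many readings
```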
When I measured the frequency of my scope's cal output, I did it as follows:
- Position one cursor on the rising edge, using a fast time base to place it precisely at the point where it crosses a graticule line halfway up. (The best point is usually where the vertical slew rate is highest, because that minimises the uncertainty in the time at which it crosses the line.)
- Switch to the other cursor, which ensures the position of the first cursor remains precisely fixed in time even though it has scrolled off the edge of the display. This is a useful characteristic of the 3000X scope.
- Adjust the time base to make the next rising edge visible, place the second cursor on it, then zoom right in to get it in exactly the same position relative to the waveform as the first.
The frequency is 1.0012 kHz. Not 1.0011 or 1.0013.
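For what it's worth, here is the timing resolution that method needs for the last digit to be trustworthy: the period difference between adjacent readings like 1.0011 and 1.0012 kHz is only about 0.1 µs, which is exactly why the zoom-right-in step matters. Python just doing the arithmetic:

```python
def period_s(f_hz: float) -> float:
    """Period of a signal at the given frequency."""
    return 1.0 / f_hz

# Period spacing between adjacent last-digit readings around 1.0012 kHz:
delta_low = period_s(1001.1) - period_s(1001.2)
delta_high = period_s(1001.2) - period_s(1001.3)
print(f"{delta_low * 1e9:.1f} ns, {delta_high * 1e9:.1f} ns")  # ~100 ns each
```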
-
I hooked up a counter to the Wave Gen output and got 1,000 Hz (it's old, but just calibrated, and reads to the nearest Hz). I then connected it to the Probe Comp output and got 1,001 Hz. The scope has one more digit of precision and shows 1.0000 kHz and 1.0012 kHz respectively, so Alan is right: the Meas function of the scope is more accurate than my manually placed cursors (at least in this instance). This was not intuitive to me, but with Alan's explanation, I get it now.
Either way, the accuracy and precision of the scope are PLENTY good for what I am doing; I was just interested in Stuey's original question and really appreciate all the responses and the transfer of knowledge. This forum is addictive and is forcing me to re-think many things that I thought I knew (I blame you for that, Alan... :D)
-
With three or fewer wavelengths acquired, the automatic and manual measurements agree closely. Zoomed in to a single wavelength, there is simply not enough wiggle room for me to fault the way the scope takes its measurements: there isn't enough delay in the rise, or enough noise, to argue that the signal is closer to 1.0000 kHz than the measurements indicate.
I have adopted the apparently generally accepted understanding that the test signal isn't going to be spot-on, but I appreciate your help in trying to sort things out!