I have decided to get a bit more in depth about the workings of a modern DSO. Along the way I learned about the Nyquist sampling theorem, which requires a minimum of two evenly spaced samples per cycle.
In the absence of aliasing, yes, but see below about unintended sources of aliasing.
From what I can glean, a decent sine waveform without aliasing can be had at a maximum bandwidth of 1/4 to 1/5 of the sample rate, with 1/5 preferred.
It depends on who you ask, and this brings up the question of *why* it requires 4 or 5 samples per cycle when Nyquist theory says it should only require 2. See below.
Am I seeing this right?
The commonly missed issue is that there are other sources of aliasing. The input signal chain is not perfectly linear, producing harmonic and intermodulation distortion. The ADC especially is not perfectly linear, and this is a particular problem with interleaved ADCs. The result is that even with an input signal which is completely below the Nyquist frequency, the ADC's non-linearity mixes the input signal with the sample clock, producing sidebands, and some of those will be above the Nyquist frequency. (1) This is especially a problem as the input signal approaches the Nyquist frequency.
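A minimal sketch of this in Python/NumPy (the numbers are illustrative, and the cubic term is just a stand-in for a real ADC's transfer function) shows an in-band 41 MHz tone producing a distortion product that folds back to 23 MHz:

```python
import numpy as np

fs = 100e6             # assumed sample rate: 100 MS/s, so Nyquist = 50 MHz
f_in = 41e6            # input tone, comfortably below Nyquist
n = np.arange(4096)
x = np.sin(2 * np.pi * f_in / fs * n)

# Stand-in ADC transfer function with a weak cubic non-linearity. Its 3rd
# harmonic sits at 123 MHz, above Nyquist, and cannot be filtered out
# because it is created inside the ADC, after the anti-aliasing filter.
y = x + 0.01 * x**3

spectrum = np.abs(np.fft.rfft(y * np.hanning(len(y))))
freqs = np.fft.rfftfreq(len(y), d=1 / fs)
alias = np.argmin(np.abs(freqs - 23e6))   # 123 MHz folds to |123 - 100| = 23 MHz
print(f"aliased product at 23 MHz: "
      f"{20 * np.log10(spectrum[alias] / spectrum.max()):.0f} dBc")
```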
The result of this aliasing can be seen in the sin(x)/x reconstruction (2) as a wobble (3) in any waveform, whether it is below the Nyquist frequency or not. (5) The solutions for this are better ADC linearity, a better sampling clock, and a higher sample rate. On oscilloscopes which support true ETS (equivalent time sampling), the effective sample rate is so high compared to any input signal that this aliasing of the mixing products is completely absent, and it is significantly reduced even at moderately higher sample rates.
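To see the wobble itself, here is a hedged sketch of ideal sin(x)/x reconstruction with a small folded harmonic riding on the samples (same illustrative numbers as above; the 5% spur level is arbitrary):

```python
import numpy as np

fs, f_in = 100e6, 41e6
n = np.arange(64)
# Samples of the wanted tone plus a small folded harmonic (spur level arbitrary).
samples = (np.sin(2 * np.pi * f_in / fs * n)
           + 0.05 * np.sin(2 * np.pi * 23e6 / fs * n))

def sinc_reconstruct(samples, t):
    """Ideal sin(x)/x interpolation; t is in units of the sample period."""
    k = np.arange(len(samples))
    return np.array([np.sum(samples * np.sinc(ti - k)) for ti in t])

t_fine = np.linspace(0, len(samples) - 1, 10 * len(samples))
y = sinc_reconstruct(samples, t_fine)

# The peak of each reconstructed cycle varies as the 41 MHz tone beats with
# its 23 MHz alias -- that variation is the wobble; a clean sine has none.
cycle = int(round(10 * fs / f_in))          # fine points per input cycle
peaks = [y[i:i + cycle].max() for i in range(0, len(y) - cycle, cycle)]
print(f"cycle peaks vary from {min(peaks):.3f} to {max(peaks):.3f}")
```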
(1) The mixing products produced inside the ADC cannot be removed with an anti-aliasing filter, because they are generated after the filter, at the sampling stage.
(2) This leaves out other sources of error, like Rigol's defective sin(x)/x reconstruction filter. I suspect Rigol simply truncated the results for higher throughput. As has been pointed out several times in the forums, the sin(x)/x reconstructed waveform should pass through all of the original sample points, and Rigol's markedly does not.
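That pass-through property is easy to check numerically. A self-contained sketch, relying only on sinc(0) = 1 and sinc(k) = 0 for every other integer k:

```python
import numpy as np

def sinc_reconstruct(samples, t):
    """Ideal sin(x)/x interpolation; t is in units of the sample period."""
    k = np.arange(len(samples))
    return np.array([np.sum(samples * np.sinc(ti - k)) for ti in t])

samples = np.random.default_rng(0).standard_normal(32)   # any sample set will do
t = np.arange(len(samples), dtype=float)                  # the sample instants
# sinc(0) = 1 and sinc(k) = 0 for any other integer k, so evaluating the
# reconstruction at the sample instants must return the samples exactly.
assert np.allclose(sinc_reconstruct(samples, t), samples)
print("sin(x)/x reconstruction passes through every original sample point")
```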
(3) I like to call it wobbulation. (4) It is especially apparent with older DSOs that use interleaved ADCs with poor linearity when sampling in real time instead of equivalent time. Modern integrated ADCs have better linearity so suffer from this to a lesser extent, but it is still very apparent at an oversampling ratio of 2.5.
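Interleaving adds its own spurs even before non-linearity enters the picture. A minimal sketch of a two-way interleaved ADC with a slight gain and offset mismatch between the sub-ADCs (mismatch values are illustrative):

```python
import numpy as np

fs, f_in = 100e6, 41e6
n = np.arange(4096)
x = np.sin(2 * np.pi * f_in / fs * n)

# Two-way interleaved ADC: even samples from sub-ADC A, odd from sub-ADC B.
y = x.copy()
y[0::2] *= 1.01          # 1 % gain mismatch in sub-ADC A
y[1::2] += 0.005         # small offset mismatch in sub-ADC B

spectrum = np.abs(np.fft.rfft(y * np.hanning(len(y))))
freqs = np.fft.rfftfreq(len(y), d=1 / fs)
# Gain mismatch puts a spur at fs/2 - f_in (9 MHz); offset mismatch at fs/2.
for f_spur in (fs / 2 - f_in, fs / 2):
    b = np.argmin(np.abs(freqs - f_spur))
    print(f"spur at {f_spur / 1e6:.0f} MHz: "
          f"{20 * np.log10(spectrum[b] / spectrum.max()):.0f} dBc")
```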
(4) On old DSOs with low acquisition rates, the displayed waveform wobbles. On modern DSOs with fast acquisition rates and intensity graded displays, the fast edges of the waveform look smeared. The extra smearing is *not* noise; it is the result of aliasing.
(5) A common calibration and adjustment test for ADC linearity is to use a pure sine wave *above* the Nyquist frequency to make the intermodulation worse and easier to see. With a perfect ADC, the displayed waveform is just a clean folded sine, showing no other signs of aliasing.
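A sketch of that test under the same illustrative assumptions: with the tone above Nyquist, the level of the folded third harmonic cleanly separates an ideal sampler from a non-linear one.

```python
import numpy as np

fs, f_test = 100e6, 59e6                  # test tone above the 50 MHz Nyquist
n = np.arange(4096)
x = np.sin(2 * np.pi * f_test / fs * n)   # ideal sampler: clean alias at 41 MHz
y = x + 0.01 * x**3                       # non-linear sampler: extra products

for name, sig in (("ideal ADC", x), ("non-linear ADC", y)):
    s = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))
    f = np.fft.rfftfreq(len(sig), d=1 / fs)
    spur = np.argmin(np.abs(f - 23e6))    # 3 x 59 = 177 MHz folds to 23 MHz
    print(f"{name}: spur at 23 MHz is "
          f"{20 * np.log10(s[spur] / s.max()):.0f} dBc")
```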