Scope reads:
3.440V at 1GSa/s, Sinc
2.080V at 250MSa/s, w/o Sinc
Let that sink in: ~2V vs ~3.5V! And all from twisting knobs that should only affect signal fidelity, not amplitude.
Ummm... let's see if I've got this straight:
What you're saying is that when you sample a signal and then
don't do a proper sinc reconstruction, the answer is wrong? Is that correct?
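To see why that's not surprising, here's a minimal sketch (plain NumPy, not Rigol's actual algorithm): sample a 70 MHz sine at 250 MSa/s (~3.57 samples per cycle) at the worst-case phase, where the samples straddle a crest. The tallest raw "dot" reaches only ~64% of the true peak, while Whittaker–Shannon (sinc) reconstruction of the very same samples recovers essentially the full amplitude.

```python
import numpy as np

fs = 250e6          # sample rate: 250 MSa/s
f0 = 70e6           # input: 70 MHz sine -> fs/f0 ~ 3.57 samples per cycle
step = 2 * np.pi * f0 / fs          # phase advance per sample (~100.8 deg)

# Worst-case phase: two samples straddle a crest symmetrically, each
# landing half a step (~50.4 deg) away from the 90-degree peak.
n = np.arange(64)
phi = np.pi / 2 - step * 32 - step / 2   # put the "missed" crest mid-record
x = np.sin(step * n + phi)

# Dot display near that crest: peak is just the largest raw sample.
dot_peak = x[30:35].max()

# Whittaker-Shannon (sinc) reconstruction on a fine grid around the crest;
# np.sinc is the normalized sinc, sin(pi*t)/(pi*t), as the formula needs.
t = np.linspace(30.0, 35.0, 2001)
recon = np.array([np.dot(x, np.sinc(ti - n)) for ti in t])
sinc_peak = recon.max()

print(f"dot peak : {dot_peak:.3f}")    # ~0.64 of the true amplitude
print(f"sinc peak: {sinc_peak:.3f}")   # ~1.00 of the true amplitude
```

This is only the worst-case single cycle; over many cycles some dot eventually lands near a crest, which is why the effect shows up at fast timebases where only a few cycles (and samples) are on screen.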
Mr. rf-loop seems to agree:
First 3 images:
200 ns/div; Sin(x)/x OFF: scope shows the normal level (as it also does at slower sweep speeds)
100 ns/div; Sin(x)/x OFF: level drops, and then stays at that lower level from this speed down to 5 ns/div
5 ns/div; Sin(x)/x OFF: level drops
Next 3 images:
200 ns/div; Sin(x)/x ON: level OK (as it also is at slower sweep speeds)
100 ns/div; Sin(x)/x ON: level OK
5 ns/div; Sin(x)/x ON: level OK
Also, consider the typical criterion for sin(x)/x interpolation: here the sample rate is 250 MSa/s and the input is a 70 MHz sine wave. Sample rate / input frequency = 3.57, which is OK. Even with a 100 MHz input it is still in the acceptable range (2.5).
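Those ratios also predict how far the no-sinc level can drop. In the worst case the nearest raw sample lands half a phase step from the crest, so the tallest dot is cos(pi/N) of the true peak, where N is samples per cycle. A quick sketch of that bound (my arithmetic, not anything from the scope):

```python
import math

fs_msa = 250.0                           # sample rate, MSa/s
results = {}
for f_in_mhz in (70.0, 100.0):
    n_per_cycle = fs_msa / f_in_mhz      # samples per input cycle
    # Worst case: nearest sample sits half a step (pi/N rad) off the crest.
    worst = math.cos(math.pi / n_per_cycle)
    results[f_in_mhz] = worst
    print(f"{f_in_mhz:5.1f} MHz: {n_per_cycle:.2f} Sa/cycle -> "
          f"worst-case dot peak = {worst:.2f} x true amplitude")
```

At 70 MHz the factor is ~0.64, so a true ~3.44 V peak could read as low as ~2.2 V on an unlucky cycle, which is in the same ballpark as the 2.08 V reading above (the exact figure depends on where the scope's measurement window catches the dots).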
sin(x)/x=ON gives correct answers? Go figure.
Worse: you're doing this "analysis" on an oscilloscope that
only lets you disable sin(x)/x when it knows there aren't many samples per cycle (i.e. when it's down to just ~2.5 samples per wave); the rest of the time sin(x)/x is
forced on. (I wonder why Rigol would do that? Hmmm...)
And (b): you have to use "dot" display mode
to even see the problem, which nobody in their right mind would ever do.
I.e., this has no practical effect on any usage scenario that would happen in real life.
Me? I think the real problem here is in the heads of the people who waste everybody's time with this junk (and on a $350 oscilloscope, too!)