First: I am sorry if this belongs in the beginners' section.
Second: I am sorry for the somewhat clickbaity title.
Now to the point: I recently came across something interesting that I would like more experienced people to explain or clarify for me.
The memory depth is set manually to 120 kpts. Then I apply a 1 MHz signal to the oscilloscope input: first a sine wave, then a rectangular wave, then a triangle wave.
After pressing Auto, the oscilloscope correctly displays the waveform with a timebase of 200 ns/div, and the frequency counter reads 1 MHz. However, when I manually stretch the timebase to 50 ms/div, the scope still shows the same waveform, but the frequency counter now reports roughly 7 Hz!
All of this happens, of course, only with a manually set memory depth of 120 kpts (or less) on this particular oscilloscope.
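To show what I mean by a "small" memory depth, here is the back-of-the-envelope arithmetic I had in mind, as a minimal Python sketch. The 12 horizontal divisions and the 1 GSa/s ADC limit are my own assumptions, not values taken from any manual, and the formula is just my guess at how the scope derives its sample rate from a fixed record length:

```python
# Rough sample-rate estimate for a fixed record length (my own guess at how
# the scope behaves, not taken from any manual).
RECORD_LENGTH = 120_000      # manually fixed mem depth, in points
H_DIVS = 12                  # assumed number of horizontal divisions
MAX_SAMPLE_RATE = 1e9        # assumed ADC limit (1 GSa/s); check your scope's spec

def effective_sample_rate(time_per_div_s: float) -> float:
    """Approximate samples per second for a given timebase and fixed record length."""
    capture_window_s = time_per_div_s * H_DIVS
    return min(MAX_SAMPLE_RATE, RECORD_LENGTH / capture_window_s)

# The two timebases from my test; sampling a 1 MHz signal without aliasing
# would need more than 2 MSa/s (Nyquist).
for label, tb in (("200 ns/div", 200e-9), ("50 ms/div", 50e-3)):
    rate = effective_sample_rate(tb)
    print(f"{label}: window = {tb * H_DIVS:g} s, implied rate = {rate:,.0f} Sa/s")
```

If this reasoning is right, the 50 ms/div setting leaves only about 200 kSa/s for a 1 MHz signal, which is exactly why the unchanged waveform shape and the ~7 Hz reading puzzle me.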
The waveform shape is preserved (sine/rectangle/triangle), including the voltage values.
I tried googling this phenomenon, but I guess I'm asking the wrong question.
I suspect this behaviour shows up on any digital oscilloscope where a small memory depth is deliberately set.
Could someone please explain this to me? Thank you.