So the issue here is that Rigol gave you enough rope to hang yourself, while Agilent assumed the user doesn't know any better and picked the "correct" sample rate for you?
I'm not sure you're completely getting it. In general, there is no difference between, for example, sampling at 2GSa/s and keeping every 10th sample - or reducing the clock to the ADC and sampling at 200MSa/s - they are identical when doing
NORMAL sampling.
But for anti-aliasing purposes - which is a very specific need (I don't give a shit about anti-aliasing much of the time, such as when I'm using faster time base settings or running close to the full sample rate) - there IS an advantage to oversampling and decimating: you can vary which sample you keep within each block, thus achieving random decimation and
STOCHASTIC sampling.
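A minimal numerical sketch of the distinction (the 2GSa/s and 200MSa/s rates come from the example above; the 230 MHz test frequency and block size are my own choices for illustration). Uniform decimation of an over-Nyquist sine produces a clean, convincing alias, while random (stochastic) decimation at the same average rate smears that alias into broadband noise:

```python
import numpy as np

fs = 2e9            # ADC oversampling rate, 2 GSa/s
f_sig = 230e6       # above the 100 MHz Nyquist of the decimated 200 MSa/s stream
n = 20000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_sig * t)

# (a) NORMAL mode: keep every 10th sample -> uniform 200 MSa/s.
uniform = x[::10]

# (b) Anti-alias mode: keep one randomly chosen sample from each block
# of 10 -> stochastic decimation, same average rate of 200 MSa/s.
rng = np.random.default_rng(0)
idx = np.arange(0, n, 10) + rng.integers(0, 10, size=n // 10)
stochastic = x[idx]

# The uniform stream contains a clean alias: 230 MHz sampled at
# 200 MSa/s folds down to a perfect-looking 30 MHz sine.  The
# stochastic stream scatters that energy, so no convincing false
# waveform appears - which is exactly what flags the aliasing.
spec_u = np.abs(np.fft.rfft(uniform))
spec_s = np.abs(np.fft.rfft(stochastic))
peak_ratio_u = spec_u.max() / np.median(spec_u)
peak_ratio_s = spec_s.max() / np.median(spec_s)
print(peak_ratio_u, peak_ratio_s)  # uniform's alias peak dominates its spectrum
```

Running this shows the uniform stream's spectral peak towering over its noise floor (a clean false sine), while the stochastic stream's peak barely rises above the smeared noise.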
I want the Rigol to operate exactly as it does now -
unless I turn on Anti-Aliasing - at which point I want it to do 'real' anti-aliasing with stochastic sampling.
OK, I'm curious what the Agilent will show for a 100 kHz signal sampled at 100 kSa/s at a 10 ms time base - but it sounds like this picture will not be possible to obtain (since the Agilent does not allow you to override the sample rate).
It has nothing to do with overriding - you just reduce the timebase until the Agilent gives you 100kSa/s.
Edit: The problem with the Agilent X2000/3000 series is that you can't turn OFF anti-aliasing - so while it works perfectly for eliminating the aforementioned aliasing, it also makes waveforms look 'lumpy' (due to the random decimation) when you 'zoom' in on frequencies that aren't represented by many samples (as in the attached image). BTW, the 100kHz signal sampled at 100kSa/s on the Agilent
should look exactly like the image (unzoomed upper portion).
So ideally, you want to be able to turn on anti-aliasing when probing unknown signals at slower sample rates - then turn it off when you've settled on the correct time base for the signal.