I don't know if hi-res decimates the capture down to however many pixels wide the screen is or what...
Yes, this is what Rigols do, as already discussed. Being able to explicitly dictate the cutoff frequency is a high-end/rare feature that is unreasonable to expect in a bottom-end Rigol. Even on Dave's Agilent, you only kinda get to indirectly manipulate the "cutoff frequency" by changing the record length.
I mean, to deliberately oversimplify things, the whole point of Hi-Res mode is to get rid of the fuzz. If you're complaining that Hi-Res mode is giving you a non-fuzzy line, maybe that's a PEBKAC and you shouldn't be using Hi-Res mode!
Eh, I've done data acquisition for years; when I know the signal coming in and see a sample rate on the screen, I have expectations. If the scope is only showing 1024 or however many samples over 20 ms, it should show 51.2 kS/s. I don't care what the ADC is running at, I care what is being acquired. I shouldn't have to track down what hidden/non-indicated setting is reducing the sample rate to some unknown quantity. Cutoff frequency isn't needed, but setting the decimation to 2^n or displaying the decimation really shouldn't be difficult.
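To make the arithmetic above concrete, here's a rough sketch of the relationship between the ADC rate, the record length, and the effective sample rate the scope arguably *should* display in Hi-Res mode. The specific numbers (1 GS/s ADC, 20 ms window, 1024-point record) are assumptions for illustration, not specs of any particular scope:

```python
# Hi-Res mode sketch: each displayed point is (roughly) the boxcar
# average of N consecutive ADC samples, so the effective sample rate
# drops by the decimation factor N while the noise shrinks by ~sqrt(N).

adc_rate = 1_000_000_000    # assumed: ADC runs at 1 GS/s regardless of mode
window = 20e-3              # assumed: 20 ms across the screen
record_len = 1024           # assumed: points actually kept for display

raw_samples = int(adc_rate * window)      # samples the ADC produced: 20,000,000
decimation = raw_samples // record_len    # N samples averaged per displayed point
effective_rate = adc_rate / decimation    # the rate actually being acquired

print(f"decimation N = {decimation}")
print(f"effective rate = {effective_rate / 1e3:.1f} kS/s")  # ~51.2 kS/s, as in the post
```

Note that with these numbers N comes out to 19531, not a power of two, which is exactly the kind of hidden, non-round decimation factor being complained about.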
On a spec-an I use regularly, there are 16 letters in a box, and with a glimpse I can tell if the detector is normal, max hold, min hold, or power RMS, which of 4 averaging modes is active, and a few other settings for each of the 4 traces. An 'N', 'P', 'A', or 'H' next to the sample rate wouldn't hurt anyone.