I suspect that with lower resolution you can sample faster, but nah... sacrificing that much vertical resolution just doesn't seem right.
If you look at how an Analogue-to-Digital Converter functions, it should become obvious that resolution takes time!
Broadly (and this is the very broad beginner's ADC explanation!) an ADC is, at heart, a comparator. Its architecture, roughly speaking, has the voltage to be measured on one input; it then switches in a certain proportion of a reference voltage and checks: is REF > IN?
Because each voltage comparison takes a finite time, the more comparisons you do, the longer the conversion takes. Of course, with a higher number of comparison voltages, you can quantise the input signal to a higher resolution.
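To make that time/resolution trade-off concrete, here's a rough sketch (in Python, and not any real scope's converter) of a successive-approximation (SAR) ADC, the textbook architecture that works exactly this way: one comparison per bit, so more bits means more comparisons and a longer conversion.

```python
# Rough sketch of a successive-approximation (SAR) ADC -- illustrative only,
# not any real scope's converter. The key point: one comparison per bit.

def sar_convert(v_in, v_ref, bits):
    """Convert v_in (0..v_ref) to a digital code, counting the comparisons used."""
    code = 0
    comparisons = 0
    for bit in range(bits - 1, -1, -1):
        trial = code | (1 << bit)              # tentatively set the next bit
        v_trial = v_ref * trial / (1 << bits)  # the fraction of REF switched in
        comparisons += 1
        if v_in >= v_trial:                    # comparator: is IN above this level?
            code = trial                       # yes -> keep the bit
    return code, comparisons

for bits in (8, 10, 12):
    code, n = sar_convert(3.3, 10.0, bits)
    step_mv = 10.0 / (1 << bits) * 1000
    print(f"{bits:2d} bits: code={code:5d}, comparisons={n:2d}, step={step_mv:6.2f} mV")
```

(Fast scope ADCs tend to be flash or pipelined rather than SAR so the comparisons happen in parallel, but the underlying point stands: every extra bit costs either time or hardware.)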
In reality, a scope uses a fancy analogue front end with different gain/attenuation settings to help minimise the downsides of having a low resolution ADC.
For example, if you wanted to measure, say, a 10V signal with an 8-bit ADC, the smallest step you can resolve is 10/256 (~39mV). That fixes your resolution, so if you were to put, say, a 100mV signal into the same circuit, you wouldn't be able to see the difference between 98 and 102mV. However, switch in some gain and you also increase your effective resolution: a gain of 10 means you can now resolve ~3.9mV, on signals up to 1V.
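Putting rough numbers on that (idealised back-of-envelope maths only, ignoring noise, offset and the fact that real converters never hit their full nominal resolution):

```python
# Idealised back-of-envelope numbers -- ignores noise, offset, ENOB, etc.

def lsb(full_scale_v, bits, gain=1.0):
    """Smallest input step one ADC count represents, referred to the input."""
    return full_scale_v / (2 ** bits) / gain

print(lsb(10.0, 8))           # ~0.039 V: 98 mV and 102 mV land on the same code
print(lsb(10.0, 8, gain=10))  # ~0.0039 V: now resolvable, but range is limited to ~1 V
```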
And this is what a scope does. As you change the vertical scaling from, say, 5mV/div to 50mV/div, you are not just changing the input range but also the input resolution. This mostly works because, in general, the lower the voltage, the more we care about high resolution. i.e. a circuit that is fed 1000V probably doesn't care if it gets 990 or 1010V (±1%), it still works fine. But a circuit that is processing 10mV signals would certainly care about errors of even a few mV...
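As an illustration, assuming a typical 8-bit ADC with its full range spread across 8 vertical divisions (the exact mapping varies between scopes), the voltage represented by one ADC count at a few common settings works out like this:

```python
# Illustrative only: assumes 8 vertical divisions mapped onto a full 8-bit ADC range.
BITS = 8
DIVISIONS = 8

for v_per_div in (0.005, 0.05, 0.5, 5.0):      # 5 mV/div up to 5 V/div
    full_scale = v_per_div * DIVISIONS          # total on-screen span
    step = full_scale / (2 ** BITS)             # voltage of one ADC count
    print(f"{v_per_div * 1000:6.0f} mV/div -> span {full_scale:6.2f} V, 1 LSB = {step * 1e3:7.3f} mV")
```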
Having said all that, as ADCs improve and get faster and faster, modern DSOs are starting to have simpler and simpler (i.e. cheaper!) front ends, and use the greater resolution of the ADC to get sufficient capability.