Is this something that is used in, let's say, oscilloscopes? I have next to no knowledge of how they work, only that they use fast but low-resolution ADCs.
Now let's talk multimeters.
The high-resolution (and low-res) multimeters I have used have a handful of ranges, usually [max ADC input] x10, x1, x1/10, x1/100 and so on.
The x1 range is the optimum; the others are "the next best thing". Most bit loss happens when your measurement sits far below the top of the range, because the signal only exercises a small fraction of the ADC's codes.
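To make the bit-loss point concrete, here is a rough back-of-the-envelope sketch. The numbers (a 12-bit ADC with a 10 V full-scale range, measuring 0.4 V) are just assumptions for illustration, not any particular meter:

```python
# Rough estimate of effective resolution when the signal only uses a
# fraction of the ADC's full-scale range.
import math

adc_bits = 12          # hypothetical 12-bit ADC
full_scale = 10.0      # hypothetical 10 V range
signal = 0.4           # measuring 0.4 V on that range

total_codes = 2 ** adc_bits
used_codes = total_codes * (signal / full_scale)   # ~164 of 4096 codes
effective_bits = math.log2(used_codes)             # ~7.4 bits left

print(f"codes used: {used_codes:.0f} of {total_codes}")
print(f"effective resolution: {effective_bits:.1f} bits")
```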
Wouldn't it sometimes be nice to have a feature that automatically amplifies the input signal, so that a x0.1 (or whatever) signal gets boosted up to the x1 range, and then divides that gain back out in post?
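In software terms, I imagine the "compensate in post" part would look something like this minimal sketch. Everything here is assumed: the gain steps, the ADC parameters, and the `read_adc_counts` / `set_pga_gain` functions are hypothetical placeholders, not a real API:

```python
ADC_BITS = 12
VREF = 10.0                         # hypothetical full-scale input in volts
PGA_GAINS = [1, 2, 5, 10, 20, 50]   # assumed available front-end gains

def measure(read_adc_counts, set_pga_gain):
    """Pick the largest gain that doesn't clip, then undo it in software."""
    for gain in reversed(PGA_GAINS):          # try the highest gain first
        set_pga_gain(gain)
        counts = read_adc_counts()
        if counts < (2 ** ADC_BITS) - 1:      # not railed, so the reading is usable
            volts_at_adc = counts / (2 ** ADC_BITS) * VREF
            return volts_at_adc / gain        # compensate for the gain "in post"
    raise ValueError("input exceeds even the lowest-gain range")
```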
Does this cause more problems than what is (sorry..) gained?