It is all about signal and noise, and the problem is either real or aesthetic.
If a slow signal comes to a logic gate, there will be digital flicker on the output, which might mess up otherwise correctly working circuits (a real problem). In some cases it might even start oscillating. The standard solution is Schmitt-trigger inputs (with hysteresis), so the rising-edge and falling-edge thresholds are not the same. A logic gate is like a 1-bit ADC, and the principle is the same no matter how many bits there are: at least the LSB will flicker if the signal is randomly crossing that bit's threshold. Some amount of noise is a fundamental part of nature, so this is unavoidable.
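Just to illustrate the two-threshold idea in software (this is my own minimal sketch, not anything from the hardware datasheets; the threshold values HIGH_THRESH and LOW_THRESH are made-up numbers you would pick wider than your noise band):

```c
#include <stdbool.h>

#define HIGH_THRESH  600   /* output may go 1 only above this level  */
#define LOW_THRESH   400   /* output may go 0 only below this level  */

/* Software "Schmitt trigger": the output depends on the previous state
 * whenever the sample sits between the two thresholds, so noise inside
 * that band cannot toggle the output. */
bool schmitt(int sample, bool prev_state)
{
    if (sample > HIGH_THRESH)
        return true;       /* crossed the upper (rising-edge) threshold  */
    if (sample < LOW_THRESH)
        return false;      /* crossed the lower (falling-edge) threshold */
    return prev_state;     /* inside the band: keep the previous state   */
}
```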
If an analog signal is periodically digitized, there will be bit flicker that is not correlated with changes in the true signal. Bit reduction or rounding does not remove it, because the flicker will unavoidably propagate to higher bits at certain threshold points. Slow sampling or long averaging does remove flicker, in the sense that the true signal is expected to change during that time, so it masks the flicker, but it might not be the optimal or preferred solution even if the problem is purely aesthetic, like outputting the data to a screen, an LED bar graph, things like that. Personally I do not like transition flicker. The solution is to measure the noise and apply just enough hysteresis to cancel it. Too much hysteresis of course destroys signal information, so it is important not to overdo it. The same hysteresis idea as in the example algo I posted above can be used to round to a float, not just to an integer.
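For anyone who missed the earlier post, here is one possible way to do the rounding with hysteresis (a sketch of the general idea, not necessarily the exact algo referenced above; HYST and the step parameter are assumptions you would tune to your measured noise). With step = 1.0 it rounds to integers, with any other step it rounds to a float grid:

```c
#include <math.h>

#define HYST 0.1   /* extra margin in units of step; set just above the noise */

/* Round with hysteresis: the output only changes when the input has
 * moved past the normal 0.5 rounding point by an extra HYST margin,
 * measured from the currently displayed value. Small noise around a
 * rounding threshold therefore cannot make the output flicker. */
double round_hyst(double input, double prev_output, double step)
{
    if (fabs(input - prev_output) > (0.5 + HYST) * step)
        return step * round(input / step);   /* snap to the nearest grid point */
    return prev_output;                      /* stay inside the dead band      */
}
```

The price is that every displayed transition lags the true signal by up to HYST * step, which is why the margin should be just big enough to cover the noise and no more.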