hopefully folks won't deride this as a silly question... but what practical usage advantage does a 12-bit scope have over an 8-bit scope?
it seems to me that 12 bits gets you a 16x vertical zoom advantage, but i can't see that being of much value in most use cases. and how much useful information do those extra 4 bits really carry?
First, we should consider that even on an entry-level DSO, the vertical screen height (in pixels) is at least twice the net number of levels of an 8-bit ADC. Thus, every data point is actually drawn as a line two (or even more) pixels high on a full-screen waveform display.
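To put rough numbers on this, here is a quick sketch. The 480-pixel grid height is an assumed typical value (actual display resolution varies by model), and it uses the raw 2^N code count; the net usable number of levels is somewhat lower, which only strengthens the point:

```python
# Rough sketch: display pixels per ADC code on a full-screen waveform.
# Assumed value (varies by model): a 480-pixel-high waveform grid.
screen_px = 480

for bits in (8, 12):
    codes = 2 ** bits                  # raw code count of an N-bit ADC
    px_per_code = screen_px / codes    # how tall one ADC step appears
    print(f"{bits}-bit: {codes} codes -> {px_per_code:.3f} px per code")
```

With 8 bits each code spans nearly two pixels, so every sample is drawn as a short vertical line; with 12 bits there are far more codes than pixels, so the trace can be rendered with single-pixel precision.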
This is especially an issue in XY-mode, where the problem exists not only vertically but also horizontally: each display dot is actually a 2 x 2 pixel square on an 8-bit scope. The sole reason it still looks decent is DPO technology, which mixes the picture from thousands of single acquisitions.
So far, these have only been the visual effects.
Dynamic range is another important consideration, especially when using math, as already mentioned. It makes a difference whether we have a guaranteed genuine dynamic range of 49 dB or 72 dB, and this difference can be demonstrated quite easily. I have shown a few examples in the SDS2000X HD thread…
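As a sanity check on those figures, here is a small sketch of the two common ways to express an ideal N-bit converter's dynamic range (real scopes fall somewhat short of these due to noise and ENOB limits):

```python
import math

def dr_db(bits):
    """Full-scale-to-1-LSB ratio in dB: 20*log10(2^N)."""
    return 20 * math.log10(2 ** bits)

def ideal_snr_db(bits):
    """Ideal quantization SNR of an N-bit ADC: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (8, 12):
    print(f"{bits}-bit: {dr_db(bits):.1f} dB range, "
          f"{ideal_snr_db(bits):.1f} dB ideal SNR")
```

Either way you count it, the step from 8 to 12 bits is worth roughly 24 dB, which matches the 49 dB vs. 72 dB comparison above.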
a distinct disadvantage seems to be the loss of features at a given price point, as manufacturers need to cut costs elsewhere to pay for the 12-bit hardware. for instance: all else being equal, would you rather have a 4-channel scope with a single shared 12-bit ADC, or with 4 independent 8-bit ADCs?
Well, this is not really a valid question. If you need (or even just want) 12 bits, then you simply don't care.
Apart from that: few DSOs have independent ADCs per channel these days. Most of the better ones have two ADCs for four channels. At Siglent, the only exceptions currently are the SDS6000A (8-bit) and the SDS6000 H10/12 Pro (10/12-bit, available in China only), which use four independent 5 GSa/s ADCs.
i also wonder if much of the advantage of a 12-bit ADC could be achieved with front-end improvements that allow precise DC offsetting (even if only on 1 of the 4 channels), giving a view into a narrow vertical band (for example, dialing in a 5 V offset so you can look at the 4.95 V to 5.05 V band without distortion from over-driving the front end).
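To put numbers on that idea, here is a hedged sketch (the volts/div and window values are hypothetical, chosen to match the 5 V example above): without an offset, the vertical range has to cover the whole 5 V signal, but with a precise 5 V offset the full ADC range can be spread across just the 100 mV window of interest.

```python
# Hypothetical example: resolution inside a 100 mV window around 5 V
# on an 8-bit ADC.
adc_codes = 2 ** 8

# Without offset: range must cover the full signal (assumed 1 V/div, 8 div).
full_range_v = 8 * 1.0
lsb_no_offset = full_range_v / adc_codes

# With a precise 5 V offset: range only needs to span the 100 mV window.
window_v = 0.100
lsb_with_offset = window_v / adc_codes

print(f"no offset:   {lsb_no_offset * 1e3:.2f} mV per code")
print(f"with offset: {lsb_with_offset * 1e3:.3f} mV per code")
```

So the offset trick does recover fine resolution within the window, provided the front end can actually apply that offset accurately and without overload distortion, which is exactly the point addressed below.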
A precise offset would be nothing new. I've already demonstrated several times how we can measure, for instance, 205 V with better than 0.08% accuracy even with the entry-level SDS1104X-E (8-bit).
The overload issues, on the other hand, are inherent to the split-path input buffer design universally used in all higher-bandwidth scopes. This won't change anytime soon, because it is a prerequisite for accurate and stable DC offsets.