Do a one-shot capture, export it to CSV, then plot a histogram with 256 bins to see whether you have any missing bins. Missing bins would indicate missing ADC codes.
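The missing-bin check can be sketched like this. This is a minimal illustration, not the exact Rigol CSV workflow: the synthetic ramp stands in for the exported capture data, and in practice you would load the sample column from the CSV (its exact layout depends on the firmware version) instead.

```python
import numpy as np

def missing_bins(samples, nbins=256):
    """Bin the samples into nbins equal-width bins across their full
    span and return the indices of bins that received no samples."""
    counts, _ = np.histogram(samples, bins=nbins,
                             range=(samples.min(), samples.max()))
    return np.flatnonzero(counts == 0)

# Synthetic stand-in for the exported capture: a full-scale ramp
# quantized to only 200 of the 256 possible 8-bit levels.
ramp = np.round(np.linspace(0, 199, 1500)) * (255 / 199)
print(f"{len(missing_bins(ramp))} of 256 bins are empty")
```

An ADC (or post-processing stage) that skips codes shows up directly as empty bins, whereas a healthy 8-bit ramp capture should populate all 256.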
OK, here we go. I connected a ramp signal to channel 1 of my DS1054Z and captured a single shot in normal acquisition mode, sin(x)/x off, dot display. Screenshot and CSV data are attached. The CSV data has 1500 points, as expected from the sample rate (I had all 4 channels on to get a sparse dot display, enabling me to see the individual dots on screen better). But it only has 200 different vertical values, all spaced equidistantly without extra gaps.
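The distinct-level count and equidistant-spacing observation above can be verified with a few lines of NumPy. Again a sketch on synthetic data standing in for the 1500-point CSV export:

```python
import numpy as np

def level_stats(samples):
    """Return the number of distinct sample values and the spacings
    between consecutive distinct values."""
    levels = np.unique(samples)
    return len(levels), np.diff(levels)

# Synthetic stand-in for the capture: a ramp quantized to
# 200 equidistant levels, as observed in the exported data.
ramp = np.round(np.linspace(0, 199, 1500))
n_levels, steps = level_stats(ramp)
print(n_levels, bool(np.allclose(steps, steps[0])))
```

If `n_levels` comes out at 200 and all spacings are equal, the data uses a uniform 200-count sub-range with no extra gaps, exactly as described.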
So it seems that the DS1054Z limits its data acquisition and display to a sub-range of only 200 counts out of the full 256. This of course maps nicely onto the 400-pixel net vertical range of the display. Cheating just a little bit; we've got ourselves a 7.6-bit ADC here...
-- Edit: Something else strange is going on here. I had intentionally set the ramp amplitude to clip at the beginning and end of the displayed range, to establish a baseline. But I just realized that
(a) the CSV data do not show the clipping, but a strange dip followed by a continuation of the linear ramp, and that
(b) I can switch the scope to the next lower sensitivity while it is still showing the captured single-shot trace, and it will then display without the clipping.
So do the 200 levels in the CSV actually correspond to a larger vertical range, which even goes beyond what's on the screen in the screenshot below? But that would not make sense; what about the nice "200 counts = 400 pixels" correspondence?? Confused; need to look more closely...