Well yes, an analog scope, as you know, cannot have quantization noise. But my scope has 8-bit vertical resolution, that's 256 levels, yet the display has a vertical resolution of 480 pixels. Quantization effects are undoubtedly present. My question was to anyone, especially professional engineers: are there ever cases, domains, or situations where an analog scope (perhaps one of the more recent high-end analog scopes) is advantageous?
There may well be no such situations at all, ever, but I don't know, so I wanted to see if any experts here could point out examples.
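To put some rough numbers on the quantization point above, here is a back-of-envelope sketch. The 8-division full-scale range and 1 V/div setting are assumptions for illustration, not figures from any particular scope's datasheet.

```python
# Rough back-of-envelope: 8-bit quantization vs. display resolution.
# Assumptions (illustrative only): 8 vertical divisions full scale, 1 V/div.

full_scale_v = 8 * 1.0          # 8 divisions at 1 V/div
adc_levels = 2 ** 8             # 256 codes from an 8-bit ADC
display_pixels = 480            # vertical pixels on the display

lsb_v = full_scale_v / adc_levels          # one ADC step in volts
pixel_v = full_scale_v / display_pixels    # one display pixel in volts

print(f"ADC step:   {lsb_v * 1000:.1f} mV per code ({adc_levels} levels)")
print(f"Pixel step: {pixel_v * 1000:.1f} mV per pixel ({display_pixels} pixels)")
# -> ADC step:   31.2 mV per code (256 levels)
# -> Pixel step: 16.7 mV per pixel (480 pixels)
# With fewer ADC codes than display pixels, adjacent codes land roughly
# 2 pixels apart, which is why quantization stair-stepping can be visible.
```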
And what, do you think a CRT scope has no noise? How thick is the trace? There is a limit on "analog resolution"; it is called the noise floor.
It is the same as the "megapixel wars" in SLR cameras, where saying "film is analog" doesn't mean it has unlimited resolution. It doesn't; it has grain. And the "analog resolution" (sharpness) of lenses is also a thing...
A decent DSO of today will have less noise (despite being 8-bit) than most CRT scopes, with the exception of a few special ones.
And higher-bit scopes (10-bit and up) will show more detail than any analog CRT scope; see the rough numbers sketched below.
And FFT, math, measurements, analysis, mask testing, single-shot capture, protocol decodes... And huge screens in comparison. Networking, remote control, easy saving of data and screenshots...
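For the bit-depth point, a minimal sketch of the textbook ideal quantization SNR for a full-scale sine, SNR ≈ 6.02·N + 1.76 dB. These are theoretical ceilings; real scope front-ends have an ENOB below the nominal bit depth, and CRT noise figures vary, so take the numbers as an idealized comparison only.

```python
# Ideal quantization SNR for a full-scale sine wave: SNR ≈ 6.02*N + 1.76 dB.
# Theoretical ceiling only; real front-ends achieve a lower ENOB.

def ideal_snr_db(bits: int) -> float:
    """Ideal quantization-limited SNR in dB for an N-bit converter."""
    return 6.02 * bits + 1.76

for bits in (8, 10, 12):
    print(f"{bits:>2}-bit ideal SNR: {ideal_snr_db(bits):.1f} dB")
# ->  8-bit ideal SNR: 49.9 dB
# -> 10-bit ideal SNR: 62.0 dB
# -> 12-bit ideal SNR: 74.0 dB
```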
There is only one thing where a CRT is still better: X-Y mode for scope art. And as someone here mentioned, some of the fancy CRT modular scopes had special modules that have no modern equivalent, but that is not an analog/digital difference.
Those are niche enough that there is no imperative for manufacturers to invest in developing that anymore. If there were a need for it, it would be made.