It has become fashionable to brag about resolution (e.g., R&S claims "up to 18 bits" for their MXO 4 and 5) without giving ENOB numbers. That is deliberately misleading. Keysight and even Rigol are serious about this and have their ENOB values documented. Ironically, R&S even has a paper discussing their pro scopes and why ENOB is important; only their own marketing department did not get the memo for the MXO 4 and 5.
The R&S paper is here:
https://scdn.rohde-schwarz.com/ur/pws/dl_downloads/dl_application/application_notes/1er03/ENOB_Technical_Paper_1ER03_1e.pdf
R&S is "selling" HiRes mode as real bits.
How do you figure, given that the MXO 4 and 5 have 12-bit resolution all the time (the native resolution of the ADCs) and have a "high def" mode you can enable on top of that?
Let's stick to physics:
- There are never more bits than the ADC can provide. Everything beyond that is done by boxcar averaging and oversampling. While this can reduce noise, it only makes the inherent nonlinearities of the ADC more visible.

These nonlinearities show up as spurs in the spectrum, which directly reduce the dynamic range and hence the resulting ENOB.
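To make that concrete, here is a minimal Python sketch (all numbers are assumptions for illustration, not the MXO's actual front end): it digitizes a test tone with a 12-bit ADC model that has about one LSB of noise plus a mild static nonlinearity, then compares the ENOB at the native rate with the ENOB after a 16x boxcar average. The averaging shrinks the noise, but the harmonic spurs stay, so the result ends up nowhere near the "12 + 2 bits" the marketing arithmetic would suggest.

```python
import numpy as np

FS, N, BITS, AVG = 1.0e9, 2**16, 12, 16
K = 997                              # prime FFT bin -> coherent test tone
F_IN = K * FS / N                    # ~15.2 MHz

def enob(x, k):
    """ENOB = (SINAD_dB - 1.76) / 6.02, SINAD taken from an FFT of coherent data."""
    spec = np.abs(np.fft.rfft(x - x.mean()))**2
    signal = spec[k]
    rest = spec[1:].sum() - signal   # all noise + distortion bins (DC excluded)
    return (10 * np.log10(signal / rest) - 1.76) / 6.02

t = np.arange(N) / FS
x = 0.9 * np.sin(2 * np.pi * F_IN * t)            # tone at 90% of full scale
x = x + 0.002 * x**2 + 0.001 * x**3               # assumed static nonlinearity
x += np.random.normal(0.0, 2.0**-(BITS - 1), N)   # ~1 LSB rms of noise
x = np.round(x * 2**(BITS - 1)) / 2**(BITS - 1)   # ideal 12-bit quantizer

print(f"native rate : {enob(x, K):.1f} ENOB")

# 16x non-overlapping boxcar average (decimation); the tone stays in bin K.
hires = x.reshape(-1, AVG).mean(axis=1)
print(f"16x averaged: {enob(hires, K):.1f} ENOB  <- noise is gone, spurs are not")
```

With the nonlinearity terms set to zero, the same script gains roughly the textbook two bits from 16x averaging; with them left in, the ENOB stays stuck around 10 because the harmonic spurs dominate.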
Most serious ADC chips have ENOB specs that depend on sampling rate, signal frequency, amplitude, and so on. The R&S MXO uses a Texas Instruments ADC (not even a bad one), and that part has ENOB values around 10 according to its datasheet. So, sorry, no magic anywhere, except for the marketing guys, who may plead incompetence.
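For reference, the standard conversion behind those datasheet numbers is ENOB = (SINAD - 1.76 dB) / 6.02 dB, so "ENOB around 10" translates to roughly 62 dB of SINAD, versus about 74 dB for a perfect 12-bit converter:

```python
# Standard ENOB <-> SINAD relation (the "around 10" figure is the datasheet
# ballpark quoted above, not a measurement of mine).
enob = 10
sinad_db = 6.02 * enob + 1.76       # ~62 dB
ideal_12bit_db = 6.02 * 12 + 1.76   # ~74 dB for a perfect 12-bit ADC
print(f"ENOB {enob} -> SINAD ~{sinad_db:.0f} dB (ideal 12-bit: ~{ideal_12bit_db:.0f} dB)")
```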
An extreme (low-frequency) example of a huge discrepancy between ADC resolution bits and useful bits is the 24-32 bit ADCs used for digitizing: *several* of the lower bits just drown in noise, yet these ADCs sit in the world's best multimeters.
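A quick back-of-the-envelope sketch of that effect, with hypothetical numbers rather than any particular meter's specs:

```python
# Hypothetical example: a "32-bit" integrating ADC on a 10 V range with
# 1 uVrms of input-referred noise. The word length is 32 bits, but the
# usable resolution is set by the noise floor, not by the ADC's bit count.
import math

full_scale_v = 10.0      # assumed range
noise_rms_v = 1e-6       # assumed input-referred noise
rms_bits = math.log2(full_scale_v / noise_rms_v)                 # ~23 bits (rms)
noise_free_bits = math.log2(full_scale_v / (6.6 * noise_rms_v))  # ~20.5 bits (peak-to-peak noise ~6.6 sigma)
print(f"effective: {rms_bits:.1f} bits, noise-free: {noise_free_bits:.1f} bits of a 32-bit word")
```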
It's the linearity that makes an ADC premium.