It claims 200MHz bandwidth, so expect a similar noise level.
Both instruments have excess noise. For reference, my TDS460 has something like 0.8mV RMS (350MHz BW, 50 ohms).
What's ideal?
Nothing is perfectly clean, there is thermal noise everywhere. Electrons undergo random walks, leading to a noise voltage and current (the ratio being the system impedance; in this case, 50 ohms) proportional to sqrt(BW). This is 0.919 nV/rtHz, or for 350MHz BW, 17uV RMS.
The average op-amp is in the 10nV/rtHz range, give or take. (By the way, noise also goes as sqrt(R), so this would be typical noise for a 5kohm resistor.) So you can see, it's fairly tricky to measure true thermal noise.
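To put numbers on it, here's the sqrt(4kTRB) arithmetic as a quick sketch. I'm assuming T = 300 K, which is why the 50-ohm figure comes out ~0.91 rather than 0.919 nV/rtHz; the exact value shifts slightly with temperature:

```python
# Thermal (Johnson) noise numbers quoted above, assuming T = 300 K.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

def noise_density(R):
    """Thermal noise voltage density in V/sqrt(Hz): sqrt(4*k*T*R)."""
    return math.sqrt(4 * k_B * T * R)

def noise_rms(R, bw):
    """Integrated RMS noise over a bandwidth bw (Hz)."""
    return noise_density(R) * math.sqrt(bw)

print(noise_density(50) * 1e9)     # ~0.91 nV/rtHz for 50 ohms
print(noise_rms(50, 350e6) * 1e6)  # ~17 uV RMS in 350 MHz
print(noise_density(5e3) * 1e9)    # ~9.1 nV/rtHz for 5k (sqrt(R) scaling)
```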
Likely, quantization noise dominates in most instruments. An 8 bit ADC is typical, with a few LSBs of internal noise. ADCs typically take a 1V full-scale signal, so 1 LSB = 1V/256 ≈ 3.9 mV.
But scopes also have preamplifiers, since 2mV/div (~20mV FS) is 50 times too small to read properly in the ADC. Even with the ADC being so noisy, most of the excess noise will be due to the amplifier. (If the input stage is a modest 5nV/rtHz, rolling off with a 200MHz passband, you expect 70uV input referred noise, or 3.5mV at the ADC, comparable to 1 LSB.)
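The LSB and preamp arithmetic above, as a sketch -- the 1V full scale, 5nV/rtHz front end, and gain of 50 are the assumptions from the text, not any particular scope's specs:

```python
# Back-of-envelope for the ADC LSB and preamp noise figures above.
import math

full_scale = 1.0           # assumed ADC full-scale input, V
lsb = full_scale / 2**8    # 8-bit ADC -> ~3.9 mV per LSB

en = 5e-9                  # assumed front-end noise density, V/rtHz
bw = 200e6                 # noise bandwidth, Hz
gain = 50                  # 20 mV FS input scaled up to 1 V FS at the ADC

v_in = en * math.sqrt(bw)  # input-referred RMS noise, ~70 uV
v_adc = v_in * gain        # at the ADC, ~3.5 mV -- comparable to 1 LSB

print(lsb * 1e3, v_in * 1e6, v_adc * 1e3)
```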
Understandably, any teeny fraction of supply ripple or noise on the front end (uV worth!) will contribute large errors on these small scales, so you want to make sure those circuits are well designed -- they dictate the true performance of the system!
By the way, you don't need to sample at the full bandwidth to measure the noise level of the system. If you have an old school DSO, you can measure RMS with the input GND'd, and see that the noise level is essentially constant on all ranges, except for the shortest time/divs (where samples are being interpolated, and you're seeing the roll-off of the system bandwidth). This is because the entire noise bandwidth is aliased into the sample band. Regardless of sample rate (provided it's low enough that samples are uncorrelated), each sample is a fair draw from the noise voltage v_n, and you're just doing statistics on those draws. (If the sample rate is comparable to BW, samples are no longer uncorrelated -- nearby samples change less, because of the limited bandwidth. This is good, because we need the Fourier and statistics domains to agree!)
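You can check the aliasing argument numerically. A sketch, using a crude 4-sample boxcar as a stand-in for the scope's bandwidth limit: decimating far below that "bandwidth" leaves the measured RMS essentially unchanged, because each kept sample still carries the full noise variance.

```python
# Numerical check: decimating (aliasing) band-limited noise preserves RMS.
import random
import statistics

random.seed(1)
n = 200_000
white = [random.gauss(0, 1) for _ in range(n)]

# Crude band-limit: 4-sample boxcar average (RMS drops to ~0.5).
bl = [sum(white[i:i + 4]) / 4 for i in range(n - 4)]

rms_full = statistics.pstdev(bl)
rms_decimated = statistics.pstdev(bl[::100])  # sample far below the "bandwidth"

print(rms_full, rms_decimated)  # essentially equal
```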
On a modern DSO (with large memory and DPO effects and stuff), the sample rate and bandwidth may be quite different from the old school minimums, but the same measurement can still be made.
If "High Res" mode is used, the sample rate is pushed slightly higher than what's visible, and a DSP filter is applied. If the sample rate happens to exceed twice the bandwidth, then no aliasing occurs, and noise is particularly low (having the best tradeoff against bandwidth). Otherwise, the noise level (in terms of nV/rtHz) will not be the noise floor of the system, but roughly sqrt(2*BW/Fs) times greater, because all the noise above Fs/2 gets folded into the pass band (at, say, Fs/8 or whatever the High Res filter is cutting off at). Noise still goes down, because bandwidth goes down. But this is certainly something a modern DSO can have an advantage on.
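A sketch of the High Res tradeoff in the fully-aliased case (sample-to-sample uncorrelated noise): averaging N samples per output point buys you sqrt(N) noise reduction, at the cost of bandwidth. The N = 16 here is an arbitrary illustrative decimation ratio, not any particular scope's setting.

```python
# Decimation-by-averaging on uncorrelated noise: RMS drops by sqrt(N).
import random
import statistics

random.seed(2)
n_out, N = 5000, 16  # N = assumed samples averaged per output point
raw = [random.gauss(0, 1) for _ in range(n_out * N)]

# "High Res" style boxcar decimation: one averaged point per N raw samples.
hires = [sum(raw[i * N:(i + 1) * N]) / N for i in range(n_out)]

print(statistics.pstdev(raw))    # ~1.0
print(statistics.pstdev(hires))  # ~1/sqrt(16) = 0.25
```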
Speaking of bandwidth, pay close attention to the bandwidth setting as you zoom in -- many instruments did not offer full bandwidth on their most sensitive scales! I think a number of current Tektronix scopes still do this. I forget if the DS1054Z does.
(Even back in the Tek 475 days -- mine gets extra-noisy on the 2mV/div range, because that range enables an extra 2.5x preamp in the signal chain. The offset also shifts, for the same reason. The full bandwidth remains, though, so that's nice.)
Tim