There is a reason that has to do with modern DSO design.
Starting in the 1980s, analog oscilloscopes and DSOs changed to using some variation of a two-path high-impedance buffer, doing away with the temperature-compensated dual-JFET design. One consequence of this is that the cutoff frequencies and gains of the two paths must be matched to prevent a mid-band discontinuity in the frequency response. Early implementations had a calibration step to adjust this, but modern DSOs usually have a fixed calibration which cannot be adjusted. In practice the discontinuity is always present to some degree, but I have seen examples of modern inexpensive DSOs where it is particularly bad.
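To illustrate what that matching requirement means, here's a toy model (not any particular scope's topology, and the cutoff values are made up for the example): treat the DC/LF path as a first-order low-pass and the AC/HF path as a first-order high-pass summed at the output. If the cutoffs and gains match, the sum is exactly flat; any mismatch shows up as a bump or dip around the crossover.

```python
# Toy model of a split-path input buffer: first-order low-pass (DC/LF path)
# summed with a first-order high-pass (AC/HF path).
import numpy as np

def combined_response(f, f_lp, f_hp, g_lf=1.0, g_hf=1.0):
    """Complex gain of the summed LF + HF paths at frequency f (Hz)."""
    jf = 1j * f
    lf = g_lf / (1 + jf / f_lp)                # LF path rolls off above f_lp
    hf = g_hf * (jf / f_hp) / (1 + jf / f_hp)  # HF path takes over above f_hp
    return lf + hf

f = np.logspace(1, 6, 500)   # 10 Hz .. 1 MHz

# Matched cutoffs and gains: the two paths sum to exactly 1 (flat).
matched = 20 * np.log10(np.abs(combined_response(f, 1e3, 1e3)))

# Mismatched: HF path gain 3% low and crossover shifted -> mid-band dip.
mismatched = 20 * np.log10(np.abs(combined_response(f, 1e3, 1.5e3, g_hf=0.97)))

print(f"matched   : {matched.min():+.3f} to {matched.max():+.3f} dB")
print(f"mismatched: {mismatched.min():+.3f} to {mismatched.max():+.3f} dB")
```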
I probably shouldn't have said DC, but even there the issue with the DC/LF amp section on modern scopes may not lead to as serious a problem measuring RMS voltage as it might first appear, because the step-response distortion can be caused by phase shift just as easily as by amplitude variation. I gave it a try using an AWG set to 5.000V and my 8506A in parallel with two Siglent DSOs, an SDS1104X-E (with the problematic LF distortion) and an SDS2354X+. I tested at 10 Hz, 100 Hz, 1 kHz, 10 kHz, 100 kHz, 200 kHz, 500 kHz and 1 MHz. Results (readings in volts):
FREQ     10 Hz    100 Hz   1 kHz    10 kHz   100 kHz  200 kHz  500 kHz  1 MHz
8506A    4.9939   4.9958   4.9955   4.9982   4.9957   4.9866   4.9599   4.9089
1104X-E  5.02     4.98     5.03     5.02     5.04     5.03     5.03     4.97
2354X+   5.015    5.008    5.027    5.041    5.050    5.051    5.035    4.967
I think the input capacitance of the 8506A was loading the AWG down at 1 MHz; removing it brought the values on the scopes back up. As you can see, either scope would be adequate for the OP's purpose, and more generally they exceed any other method I have of measuring AC voltage once you get over 500 kHz. Unfortunately I can't push much further because my AWG is not quite as good when it comes to flatness. 1% is only 0.09 dB, and that's a lot to ask of sig-gens or spectrum analyzers. I also did sweep tests on both scopes from 0.1 to 100 Hz. The "problem" scope has just-noticeable uneven amplitude on the low-frequency sweep and the other scope is as flat as you could possibly hope for. Pictures attached.
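For a quick numerical read on flatness, here is a short script that turns the table above into deviation from the 8506A reading, in percent and in dB (treating the 8506A as the reference and ignoring its own small AC-DC difference):

```python
# Deviation of each scope's reading from the 8506A reference, per frequency.
import math

freqs  = ["10Hz", "100Hz", "1kHz", "10kHz", "100kHz", "200kHz", "500kHz", "1MHz"]
ref    = [4.9939, 4.9958, 4.9955, 4.9982, 4.9957, 4.9866, 4.9599, 4.9089]  # 8506A
scopes = {
    "1104X-E": [5.02, 4.98, 5.03, 5.02, 5.04, 5.03, 5.03, 4.97],
    "2354X+":  [5.015, 5.008, 5.027, 5.041, 5.050, 5.051, 5.035, 4.967],
}

for name, readings in scopes.items():
    print(name)
    for f, v, r in zip(freqs, readings, ref):
        ratio = v / r
        print(f"  {f:>7}: {100 * (ratio - 1):+6.2f} %   {20 * math.log10(ratio):+6.3f} dB")

# Caveat: the 1 MHz column is suspect because the 8506A's input capacitance was
# loading the AWG at that frequency (see the note above).
```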
So for instance a Racal Dana 9301 has a flat bandwidth to 500 MHz, where "flat" means that no amplitude correction is required because it is within the 1% measurement error of the instrument. The HP 3406A seems to be a little less than this, but the manual is difficult to read on this point. A digital storage oscilloscope would require a bandwidth of roughly 5 GHz to achieve this with a single-pole response, and nobody has proposed using such an instrument.
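For reference, the single-pole figure can be checked in a couple of lines (a sketch only, assuming an ideal first-order roll-off and the 500 MHz / 1% numbers above):

```python
# Minimum -3 dB bandwidth so that a single-pole roll-off stays within 1%
# of flat (about 0.09 dB) out to 500 MHz.
import math

f_top   = 500e6   # highest frequency that must still read "flat"
max_err = 0.01    # 1% allowed amplitude error

# Single-pole magnitude: |H| = 1 / sqrt(1 + (f/fc)^2); require |H| >= 1 - max_err.
max_ratio = math.sqrt(1.0 / (1.0 - max_err) ** 2 - 1.0)   # largest allowed f/fc
fc_min = f_top / max_ratio
print(f"minimum -3 dB bandwidth: {fc_min / 1e9:.1f} GHz")  # ~3.5 GHz, so a scope
                                                           # in the 5 GHz class has margin
```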
I had a look at the manuals for those instruments because I haven't seen them before. Yes, the "sampling front end" is a specific subset of sampling systems and it does have the advantages you state, but my larger point was that the sampling system itself in a DSO or the 3458A has the same inherent linearity for the same reasons. It's the input and scaling circuitry that limits bandwidth. The thermal converter on my 8506A, for example, would have nearly infinite bandwidth and could be used like an HP thermal RF power meter if there were a way to accurately get a scaled signal to heat the AC resistor heater in the sensor. The sampling front end avoids any input circuit issues because it puts the sampling up front. Of course the trade-off is the limitations on input voltage and input impedance.

Still, even with all that, the HP has a best-case accuracy of 3% and the Racal Dana is 1.5% of reading plus 1% FS, so 2.5% at best. That works out to roughly 0.26 dB for the HP and 0.21 dB for the Racal Dana. Now, given the 500 MHz+ (but not LF) range those specs apply to, I have to concede I don't know anything that can do that to a quarter of a dB. I don't know how flat a 5 GHz scope would be to 500 MHz, but I suspect it won't be under 0.5 dB. But for 0 to 1 MHz, I think the cheap DSO has them beat.
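Since converting these percent specs to dB keeps coming up, here is the arithmetic spelled out (voltage convention, 20*log10; the reading-near-full-scale assumption is mine):

```python
# Worst-case error for a "% of reading + % of full scale" spec, converted to dB.
import math

def spec_error_db(pct_of_reading, pct_of_fs, reading_frac_of_fs=1.0):
    """Total fractional error and its dB equivalent (20*log10, voltage ratio)."""
    err = pct_of_reading / 100 + (pct_of_fs / 100) / reading_frac_of_fs
    return err, 20 * math.log10(1 + err)

specs = {
    "HP 3406A (3% best case)":            (3.0, 0.0),
    "Racal Dana 9301 (1.5% rdg + 1% FS)": (1.5, 1.0),
}
for name, (rdg, fs) in specs.items():
    err, db = spec_error_db(rdg, fs)
    print(f"{name}: {100 * err:.1f} % -> {db:.2f} dB")
```

At readings below full scale the FS term grows relative to the reading, which is why 2.5% is the best case for the Racal Dana.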