... Why don't they do that, especially for such a premium amplifier?
Is the reason cost, or board real estate, or something else? Or is there some performance reason why it can actually be detrimental in certain cases? The filter will have a little insertion loss at the target frequency, but it's not that large: for the SCLF-10+, it's only about 0.5 dB at 10 MHz, which works out to roughly 11% of the power.
The reason they do not include a low-pass filter on the output to reduce noise and distortion is that, to be useful, the filter's cutoff frequency would have to sit close to the signal frequency, and a filter operated near its cutoff introduces a large phase shift. Since the filter components drift with time and temperature, so does that phase shift, and now the output phase wanders.
This is also why a bandpass filter is a terrible idea if phase is to be preserved.
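To put numbers on that, here is a rough sketch (Python with scipy; a hypothetical 3rd-order Butterworth with an 11 MHz cutoff standing in for whatever LC filter would actually be used) of how far the phase at 10 MHz moves when the cutoff drifts by just 1%:

```python
# Sketch: phase sensitivity of a low-pass operated near its cutoff.
# Hypothetical 3rd-order Butterworth, 11 MHz cutoff; nudge the cutoff
# by 1% (component drift) and watch the phase at 10 MHz.
import numpy as np
from scipy import signal

f0 = 10e6  # reference frequency

def phase_deg_at_f0(fc):
    b, a = signal.butter(3, 2 * np.pi * fc, btype="low", analog=True)
    _, h = signal.freqs(b, a, worN=[2 * np.pi * f0])
    return np.degrees(np.angle(h[0]))

p1, p2 = phase_deg_at_f0(11e6), phase_deg_at_f0(11e6 * 1.01)
print(f"phase at 10 MHz: {p1:.2f} deg -> {p2:.2f} deg after 1% cutoff drift")
print(f"shift: {p2 - p1:.2f} deg = {(p2 - p1) / 360 / f0 * 1e12:.0f} ps")
```

A degree or so of phase sounds harmless, but at 10 MHz it corresponds to a few hundred picoseconds of timing shift, which is huge for a precision reference.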
As a compromise, a higher cutoff frequency could be used to minimize the phase distortion, and notch filters added at the low harmonics.
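For a feel of that compromise, here's a sketch (Python/scipy; the digital iirnotch stands in for an analog LC notch, and the 10 MHz fundamental with a 30 MHz third harmonic is assumed for illustration):

```python
# Sketch: a notch at the 3rd harmonic barely touches the fundamental.
# Digital model only; a real implementation would be an analog LC notch.
import numpy as np
from scipy import signal

fs = 200e6                                    # model sample rate (assumed)
b, a = signal.iirnotch(w0=30e6, Q=30, fs=fs)  # notch at the 3rd harmonic

for f in (10e6, 30e6):
    _, h = signal.freqz(b, a, worN=[f], fs=fs)
    mag_db = 20 * np.log10(max(abs(h[0]), 1e-9))  # floor avoids log(0) at the null
    print(f"{f/1e6:4.0f} MHz: {mag_db:7.1f} dB, "
          f"phase {np.degrees(np.angle(h[0])):6.2f} deg")
```

The harmonic gets crushed while the fundamental sees almost no amplitude or phase penalty; the cost is extra parts and tuning.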
For my uses I would rather have a clean square wave with minimal filtering. That way I get minimum phase noise at the clock edges. If the distributed clock is filtered down to a sine wave, it crosses the receiver's threshold with a far lower slew rate, so any coupled noise, or even the thermal noise of the receiving device, converts directly into clock jitter. If you can't stand the harmonics, then filter at the receive end.
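The conversion from noise to jitter is just noise voltage divided by slew rate at the threshold crossing. A quick sketch with made-up but plausible numbers (1 mV rms of noise, a 0.5 V-amplitude sine versus a 1 V swing with a 2 ns edge):

```python
import numpy as np

f = 10e6          # clock frequency
v_noise = 1e-3    # 1 mV rms noise at the receiver threshold (assumed)

# Sine wave: steepest slope, at the zero crossing, is 2*pi*f*A
A_sine = 0.5      # 0.5 V amplitude (assumed)
slew_sine = 2 * np.pi * f * A_sine            # V/s
print(f"sine:   slew {slew_sine/1e6:6.1f} V/us -> "
      f"jitter {v_noise/slew_sine*1e12:5.1f} ps rms")

# Square wave: slope through the threshold is swing / rise time
swing, t_rise = 1.0, 2e-9                     # 1 V swing, 2 ns edge (assumed)
slew_sq = swing / t_rise
print(f"square: slew {slew_sq/1e6:6.1f} V/us -> "
      f"jitter {v_noise/slew_sq*1e12:5.1f} ps rms")
```

Same noise, over an order of magnitude more jitter on the sine.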
Fast enough edges create their own problems. Transmission line loss increases at higher frequencies, and there is a phase distortion component as well, from the change in propagation velocity versus frequency. Combined, these lead to transmission line "dribble-up": with fast edges, the rise/fall time becomes slower and non-linear along the run. Frequency components beyond the cable's TEM propagation mode limit also get mangled, which argues for smaller cable, and that in turn further increases dribble-up.
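A toy model shows the effect (Python sketch; the exp(-k*sqrt(f)) shape is the standard skin-effect loss law, but the constant k is made up and the dispersion term is left out, so this understates the damage):

```python
# Sketch of "dribble-up": build a 10 MHz square wave from its odd harmonics,
# attenuate each by a skin-effect-style exp(-k*sqrt(f)) cable loss, and
# compare edge speeds.
import numpy as np

f0, n_harm = 10e6, 99
t = np.linspace(0, 1 / f0, 20000, endpoint=False)  # one 100 ns period, 5 ps steps
k = 2e-5  # assumed loss constant; kept small so levels stay near +/-1

def square(loss=False):
    """Band-limited square wave with its rising edge at t = 25 ns."""
    y = np.zeros_like(t)
    for n in range(1, n_harm + 1, 2):                # odd harmonics only
        a = 4 / (np.pi * n)                          # ideal Fourier coefficient
        if loss:
            a *= np.exp(-k * np.sqrt(n * f0))        # sqrt(f) skin-effect loss
        y += a * np.sin(2 * np.pi * n * f0 * (t - 25e-9))
    return y

def rise_20_80(y):
    """Time from -0.6 to +0.6 on the rising edge (20-80% of the +/-1 swing)."""
    edge = y[: len(y) // 2]              # first half-period holds the rising edge
    dt = t[1] - t[0]
    return (np.argmax(edge > 0.6) - np.argmax(edge > -0.6)) * dt

print(f"clean edge:  {rise_20_80(square()) * 1e9:.2f} ns 20-80%")
print(f"after cable: {rise_20_80(square(loss=True)) * 1e9:.2f} ns 20-80%")
```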
Using the fastest possible comparator on the receive side also increases jitter, because its wider input bandwidth admits more high-frequency noise.
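To first order the penalty scales as the square root of the comparator's input bandwidth: the integrated noise grows as sqrt(BW) while the edge's slew rate at the threshold stays fixed. A sketch, with the noise density, slew rate, and bandwidths all assumed:

```python
import numpy as np

e_n = 6e-9     # assumed input noise density, 6 nV/sqrt(Hz)
slew = 0.5e9   # edge slew rate at the threshold, 0.5 V/ns (assumed)

for bw in (100e6, 1e9, 8e9):           # candidate comparator input bandwidths
    v_rms = e_n * np.sqrt(1.57 * bw)   # 1.57*bw ~ noise bandwidth of a single pole
    jitter = v_rms / slew
    print(f"BW {bw/1e9:4.1f} GHz: noise {v_rms*1e6:6.1f} uV rms -> "
          f"{jitter*1e15:5.0f} fs rms")
```

The fastest device wins only if its extra bandwidth is actually needed to resolve the edge; otherwise it just integrates more noise.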