It's somewhat about the technology that would have been available at the lab at the time. What they're really doing with the distortion analyzer is measuring the purity of the sine wave output. In 1974 when that manual was written, spectrum analyzers were not readily available.
Notice that they specify distortion is "less than 3%; i.e. greater than 30 dB below the fundamental."
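(As a sanity check on those two figures: $20\log_{10}(0.03) \approx -30.5\ \text{dB}$, so 3% distortion is indeed just over 30 dB below the fundamental.)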
If you have an oscilloscope with an FFT function, you can measure that directly now.
At those levels, if the sine wave on an oscilloscope looks good, then it's probably within that specification.
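If you want an actual number rather than an eyeball judgement, a rough THD estimate from the scope's captured samples is only a few lines of code. This is just a sketch (Python/NumPy, assuming the capture is long enough that the fundamental is the tallest spectral peak and the harmonics land close to FFT bins):

```python
import numpy as np

def estimate_thd(samples, sample_rate, n_harmonics=5):
    """Rough THD estimate: ratio of summed harmonic amplitude to the fundamental."""
    windowed = samples * np.hanning(len(samples))    # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    fund_bin = np.argmax(spectrum[1:]) + 1           # skip the DC bin
    fund_amp = spectrum[fund_bin]

    # Sum the power in the 2nd..Nth harmonics (nearest FFT bin to each).
    harmonic_power = 0.0
    for k in range(2, n_harmonics + 1):
        h_bin = np.argmin(np.abs(freqs - k * freqs[fund_bin]))
        harmonic_power += spectrum[h_bin] ** 2

    return np.sqrt(harmonic_power) / fund_amp        # THD as a fraction

# Example: a 1 kHz sine with 2% second harmonic added
fs = 48000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 1000 * t) + 0.02 * np.sin(2 * np.pi * 2000 * t)
print(f"THD = {100 * estimate_thd(signal, fs):.2f} %")
```

At the 3% level the result of something like this will comfortably resolve whether you are inside the spec.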
Actually, there were plenty of spectrum analysers around, but not many that targeted the lower frequencies.
Standard audio generators used in broadcast service produced distortion figures around 0.01% "standing on their heads"; many could do much better!
The normal companion to generators like those was the "Noise & Distortion Meter" or "N & D set".
These were made by many companies, so you are not stuck with HP.
In operation, the reference level was set with the meter in "wide band" mode, then the fundamental was "nulled out" using a very sharp tuneable rejection filter.
Any remaining reading was the THD (plus noise) of the DUT.
It was easy enough to measure the DUT noise separately, then subtract that from the result.
(Some industries measured both parameters in dB to simplify this step.)
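As a sketch of that arithmetic: the subtraction has to happen in power terms, not directly in dB or %. The readings below are made up purely for illustration, not taken from any particular meter:

```python
import math

def thd_from_thd_n(thd_n_db, noise_db):
    """Subtract a separately measured noise floor (in power) from a THD+N reading.

    Both inputs are in dB relative to the fundamental (negative numbers).
    """
    thd_n_power = 10 ** (thd_n_db / 10)
    noise_power = 10 ** (noise_db / 10)
    thd_power = max(thd_n_power - noise_power, 0.0)
    return 10 * math.log10(thd_power) if thd_power > 0 else float("-inf")

# Example: the meter reads -78 dB with the fundamental nulled,
# and the DUT's residual noise alone measures -85 dB.
print(f"THD = {thd_from_thd_n(-78, -85):.1f} dB")   # about -79 dB
```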
Noise was usually so low that it could be neglected, especially with sources.
The hearing industry used a different approach when calibrating audiometers: a selective millivoltmeter was tuned to each of the harmonics in turn, which gave the user a good idea of the frequency distribution of the harmonic distortion.
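For what it's worth, the same harmonic-by-harmonic breakdown falls straight out of an FFT today; a small sketch along the same lines as above (same bin-picking assumptions):

```python
import numpy as np

def harmonic_levels(samples, sample_rate, n_harmonics=5):
    """Level of each harmonic relative to the fundamental, in dB."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    fund_bin = np.argmax(spectrum[1:]) + 1
    levels = {}
    for k in range(2, n_harmonics + 1):
        h_bin = np.argmin(np.abs(freqs - k * freqs[fund_bin]))
        levels[k] = 20 * np.log10(spectrum[h_bin] / spectrum[fund_bin])
    return levels

# Example: 1 kHz sine with 2% second harmonic
fs = 48000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 1000 * t) + 0.02 * np.sin(2 * np.pi * 2000 * t)
print(harmonic_levels(sig, fs))   # 2nd harmonic about -34 dB, the rest far below
```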