The service manual has a good performance verification procedure. Honestly, though, I'd just measure the ripple on the supplies, and if it's fine, not mess with the caps etc.
Run a loopback test with a known-good BNC cable (many are garbage, and a sensitive instrument like this suffers with them). Verify frequency and AC/DC levels, then measure distortion. At 1 kHz and 3 V output, you should measure around 0.0020% distortion (THD+N) with the 80 kHz low-pass filter enabled, and maybe a little lower with the 30 kHz low-pass. At full range, maybe 0.01% or higher. Mine doesn't have the 400 Hz high-pass filter (does yours?), but that can shave a couple more ppm off the measurement by reducing AC mains hum. The loopback test exercises essentially the entire instrument, so if it gives good results, you can have confidence the unit works well.
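If it helps to relate those percentage figures to ppm and dB (the text above mixes both), here's a quick conversion sketch; the numbers plugged in are just the ballpark values mentioned above:

```python
import math

# THD+N conversions: 1% = 10,000 ppm; dB is 20*log10 of the ratio.

def pct_to_ppm(pct):
    return pct * 1e4

def pct_to_db(pct):
    return 20 * math.log10(pct / 100)

for pct in (0.0020, 0.01, 0.0005):
    print(f"{pct}% = {pct_to_ppm(pct):g} ppm = {pct_to_db(pct):.1f} dB")
# 0.0020% = 20 ppm = -94.0 dB
# 0.01%   = 100 ppm = -80.0 dB
# 0.0005% = 5 ppm  = -106.0 dB
```

So "a couple more ppm" from the hum filter is a real but small improvement when your residual is already down around 20 ppm.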
The SM has a procedure for tuning the notch filter by adjusting a single trimmer. This is best done with an oscilloscope on the monitor output: trigger the scope from the oscillator signal, but display only the monitor signal. The optimum set point is easy to see on the scope.
The weak point of this machine is its oscillator. With a cleaner oscillator, the measurement floor can be as good as 0.0005 to 0.0007% THD+N (80 kHz LP).