My basic tests would be:
- With the knobs set to max, does it output close to its maximum rated voltage (or somewhat higher)? Check the output for ripple.
- If you turn down the voltage knob, does the voltage go down? Go all the way down to zero.
- Load it with something that will draw close to the maximum current. Ideally something that can draw 20 V / 0.6 A (≈ a fairly beefy 33 Ohm resistor, or an electronic load) and 40 V / 0.3 A (≈ a 133 Ohm resistor). If that is not available, testing the maximum current at a lower voltage should also give you a decent indication. Measure the ripple again under load (it will likely be higher than with no load).
- Test the current limit. Do not simply short the power supply output, because a faulty current limit could blow the pass transistor. For a 0.6 A current limit I would use something like a 5 Ohm power resistor and set the supply to 2.5 V; 0.5 A should now flow. Increase the output voltage, and the current limit should kick in sometime before you reach 5 V (ideally right around 0.6 A × 5 Ω = 3 V). For 0.3 A you would use a 2.2 Ohm resistor.
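The resistor choices in the load and current-limit tests above are just Ohm's law; a quick sketch of the arithmetic (values taken from the text, the helper name is mine):

```python
# Load-test arithmetic from the steps above: R = V / I picks the resistor,
# P = V * I tells you how much power it has to dissipate.
def load_test(v_out, i_max):
    """Return (resistance in ohms, dissipation in watts) for a full-load test."""
    return v_out / i_max, v_out * i_max

for v, i in [(20.0, 0.6), (40.0, 0.3)]:
    r, p = load_test(v, i)
    print(f"{v:g} V at {i:g} A -> {r:.0f} ohm, {p:.0f} W")

# Current-limit check: with a 5 ohm resistor and a 0.6 A limit, the limit
# should engage once the output tries to exceed I_limit * R = 3 V.
v_trip = 0.6 * 5.0
print(f"limit should engage around {v_trip:g} V")
```

Note that both ranges work out to the same 12 W dissipation, so one suitably rated load resistor per range covers the full-load test.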
Obviously for a dual power supply you would perform those tests twice.
I am sure the manual contains a much more complete performance verification that checks every spec in the datasheet; this would be my off-the-cuff test. I would use an electronic load instead of messing with different power resistors, but resistors will work just as well.
Paralleling multiple resistors is fine, but leave some air between them. Running them close to their maximum rating is also fine, but make sure you do not touch them or set them on anything that might melt: they can get quite hot, well in excess of 100 °C.
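If you do parallel resistors, the equivalent resistance and the per-part dissipation are easy to check beforehand; a sketch assuming equal-valued parts (the 100 Ohm example is mine, not from the text):

```python
# Equivalent resistance of resistors in parallel: 1/R_eq = sum of 1/R_i.
def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

# Three 100 ohm resistors approximate the 33 ohm full-load resistor.
r_eq = parallel(100.0, 100.0, 100.0)
print(f"{r_eq:.1f} ohm")  # ~33.3 ohm

# Each resistor sees the full output voltage, so at 20 V each 100 ohm
# part dissipates V^2 / R = 4 W -- size (and space) them accordingly.
p_each = 20.0 ** 2 / 100.0
print(f"{p_each:g} W per resistor")
```

The per-resistor figure is the useful one: the total load power divides evenly across equal resistors, which is exactly why paralleling smaller parts is a practical way to build a beefy load.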
Can you quantify 'quite a bit'? I turned on a fairly similar HP 6200B, and it drifted from 15.123 V to 15.067 V in thirty minutes (most of it in the first ten). Fifteen minutes later it was down another 6 mV (the ambient temperature also dropped). Drift like that is normal during warm-up. And even then, these are not precision voltage sources: 0.1 % + 5 mV of drift at 20 V is 25 mV. Add to that changes in load (if any), ambient temperature, or line voltage. Is your DMM stable enough to resolve a 10 mV change on a 20 V signal? What happens if you warm the supply up, measure it, and then measure it again after a couple of hours with ambient temperature reasonably stable: is it then more than 25 mV off?
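The tolerance arithmetic in that last paragraph can be sketched as follows (the 0.1 % + 5 mV figure is taken from the text above; treat it as that supply's spec, not a universal one):

```python
# Allowed output error for a "0.1 % + 5 mV" stability spec, in millivolts.
def allowed_error_mv(v_set):
    return v_set * 0.001 * 1000 + 5  # 0.1 % of the setting, plus 5 mV

print(allowed_error_mv(20.0))  # 25 mV at a 20 V setting

# Observed warm-up drift on the HP 6200B mentioned above:
drift_mv = (15.123 - 15.067) * 1000
print(f"{drift_mv:.0f} mV over the first thirty minutes")
```

So the 56 mV seen during warm-up is well outside the 25 mV band, which is why it only makes sense to compare against the spec after the supply has fully warmed up.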