Yes, I suspected from the beginning (my earlier post) that it might have been an issue with mismatched impedances, although I am still a bit confused by your experiments. So here's what I think you should be seeing.....
Most signal and function generators have a source impedance of 50 ohms. The voltage that the user sets on them (and thus what is displayed on the unit)
usually assumes a 50 ohm load. If the load is anything but 50 ohms, a voltmeter will show a value that does not match what is shown on the unit.
For instance, if you dialed 0.223Vrms on your signal generator, the generator's internal (open-circuit) voltage is actually twice that, because the displayed value assumes the 50 ohm source impedance and a 50 ohm load form a divider that halves it. So on an AC voltmeter (with megohms of input impedance) you would see the following: 0.223V
if you use a 50 ohm termination, 0.446V if the line is left unterminated, and 0.446*(600/650)=0.412V with a 600 ohm termination. There is nothing wrong
with either the generator or the voltmeter.
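To make that arithmetic concrete, here's a little Python sketch of the source-impedance voltage divider (same numbers as above; the 2x open-circuit assumption is the standard 50 ohm generator convention):

```python
R_SOURCE = 50.0     # generator source impedance, ohms
V_SET = 0.223       # voltage dialed in, Vrms (display assumes a 50 ohm load)

v_open = 2 * V_SET  # open-circuit EMF of the generator

def loaded_voltage(r_load):
    """Voltage a high-impedance meter sees across a given termination."""
    return v_open * r_load / (R_SOURCE + r_load)

print(loaded_voltage(50))   # 0.223 V: matches the generator display
print(loaded_voltage(600))  # ~0.412 V
print(v_open)               # 0.446 V: what an unterminated line shows
```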
The confusion becomes apparent when you have a piece of test equipment that was specifically designed for audio applications. For historical reasons that
I can't remember, 600 ohms is the usual standard, not 50 ohms as it is in the RF world. This confusion manifests itself when one tries to use dBm. Whereas
a 50 ohm system equates 0.223Vrms with 0 dBm, a 600 ohm system equates sqrt(600*0.001)=0.775Vrms with 0 dBm. A lot of my meters actually
have "0 dBm = 1 mW at 600 ohms" printed right on the meter face.
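The 0 dBm reference voltages for the two conventions fall straight out of P = V^2/R. A quick sketch:

```python
import math

def v_for_0dbm(r_ohms):
    """Vrms that dissipates exactly 1 mW in the given resistance (0 dBm)."""
    return math.sqrt(r_ohms * 0.001)

print(v_for_0dbm(50))   # ~0.223 V: the RF convention
print(v_for_0dbm(600))  # ~0.775 V: the audio convention
```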
So, returning now to the 288 set to 50 ohms unbalanced. If you were to select 0 dBm and feed an AC voltmeter that uses the 1 mW at 600 ohms dBm
definition, you would get the following: 1) If you terminated the line to the meter in 50 ohms, 20log(0.223/0.775) = -10.8 dBm; 2) if you terminated
the line in 600 ohms, 20log(0.412/0.775) = -5.5 dBm; and 3) if you left the line unterminated, 20log(0.446/0.775) = -4.8 dBm. Note: all I did was
use the voltages I computed in the previous paragraph referenced to the 0.775V.
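The three readings above can be reproduced in a few lines, plugging the earlier voltages into the 600 ohm meter's reference:

```python
import math

V_REF_600 = math.sqrt(600 * 0.001)  # 0.775 Vrms, the meter's 0 dBm point

def dbm_reading(v_rms):
    """What a '1 mW at 600 ohms' meter displays for a given Vrms."""
    return 20 * math.log10(v_rms / V_REF_600)

print(round(dbm_reading(0.223), 1))  # -10.8 dBm: 50 ohm termination
print(round(dbm_reading(0.412), 1))  # -5.5 dBm: 600 ohm termination
print(round(dbm_reading(0.446), 1))  # -4.8 dBm: unterminated
```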
It can, of course, now get even more confusing if you are hooking the generator up to something other than a scope or meter with high input impedance.
For instance, if you were hooking up to a standard spectrum analyzer with 50 ohm input impedance, you would not use a 50 ohm termination. All that
would do is parallel in another 50 ohm resistor, making the effective input impedance 25 ohms, and the spectrum analyzer would read incorrectly.
The moral of the story: volts are volts. As long as the AC voltmeter has very high input impedance, the voltage shown on the meter will always be correct. It
will only match what is shown on the 288 if you have the line terminated in 50 ohms. When you decide you want a result in dBm, you always have to ask
yourself what the meter's reference value is. Otherwise you are guaranteed to find yourself confused.
As far as hooking up to audio equipment, if you don't know the actual input impedance of the equipment, I would tee-in a scope or voltmeter (with no
termination) to monitor the desired voltage you want. Ideally you would want to match the source equipment to the load equipment (as it reduces things
like ringing or other distortions), but this may not be possible in every instance. I would specifically stay away from dBm (or dBu, or dBV, etc.) unless you
compute the actual voltage it is supposed to be and then dial it in while watching the scope/meter. Hopefully your tape decks tell you what voltage they
want, not dB values.
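If the deck's manual only gives a dB figure, here's a hypothetical little converter for turning it back into the Vrms you'd dial in while watching the scope/meter (the references are the standard ones: dBu is relative to 0.775 Vrms, dBV to 1.0 Vrms):

```python
import math

def db_to_vrms(level_db, reference="dBu"):
    """Convert a dBu or dBV level to the Vrms it corresponds to."""
    v_ref = {"dBu": math.sqrt(0.6), "dBV": 1.0}[reference]
    return v_ref * 10 ** (level_db / 20)

print(db_to_vrms(-10, "dBV"))  # ~0.316 V: typical consumer line level
print(db_to_vrms(4, "dBu"))    # ~1.228 V: typical pro line level
```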
But I'm glad the 288 seems to be working!