...you will see that I point out that there is no correlation between dBFS and dBu and that it depends on the equipment under test.
And I inferred that his particular setup is specified to produce the +4dBu signal from a -20dBFS test input, or in any case that is his goal. So he's calibrating his DAC/DAW/whatever to that level.
I hope the attached image is more clear for you.
As an experiment I changed the -20dBFS to -19.77dBFS and got a 1.277 V RMS reading.
Yes, that helps--typos can be a problem! Your readings are all OK and consistent enough to conclude that your setup meets your goal as accurately as you could hope for, better than 0.5dB.
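To make the arithmetic behind that check explicit, here is a small sketch that maps a digital test level to the analog voltage you'd expect on the meter, assuming the calibration point discussed above (-20 dBFS producing +4 dBu) and a linear output chain. The function names are just illustrative.

```python
def dbu_to_vrms(dbu):
    """0 dBu is defined as 0.7746 V RMS (1 mW into 600 ohms)."""
    return 0.7746 * 10 ** (dbu / 20)

def expected_vrms(dbfs, cal_dbfs=-20.0, cal_dbu=4.0):
    """Expected analog output for a digital test tone, assuming a
    linear chain calibrated so cal_dbfs comes out at cal_dbu."""
    return dbu_to_vrms(cal_dbu + (dbfs - cal_dbfs))

print(round(expected_vrms(-20.0), 3))   # 1.228
print(round(expected_vrms(-19.77), 3))  # 1.261
```

With a -19.77 dBFS tone the math predicts about 1.261 V, so a 1.277 V reading is roughly 0.1 dB high - comfortably inside the 0.5 dB tolerance mentioned above.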
I have not seen a modern AC voltmeter that measured peak voltage, or was not calibrated in V RMS.
Some are "average responding" and the calibration converts the mean absolute value measured by the circuit to RMS voltage, assuming a sine wave.
Others are "true RMS" and measure the RMS voltage directly, within limits for crest factor, etc.
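For a clean sine wave the two meter types agree exactly, which is why the distinction doesn't matter here. A quick numerical check of the form-factor scaling an average-responding meter applies:

```python
import math

# For a sine of peak Vp: mean |v| = (2/pi)*Vp, true RMS = Vp/sqrt(2).
# An average-responding meter measures mean |v| and multiplies by the
# sine-wave form factor pi/(2*sqrt(2)) ~= 1.1107 to display "RMS".
vp = 1.0
mean_abs = (2 / math.pi) * vp
true_rms = vp / math.sqrt(2)
form_factor = math.pi / (2 * math.sqrt(2))
displayed = mean_abs * form_factor

print(round(displayed, 6), round(true_rms, 6))  # identical for a sine
```

On non-sinusoidal signals (or high crest factors) the two readings diverge, which is the limit the post above alludes to.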
Since the generator produces a clean sine wave, it doesn't matter which unit it uses for the voltage setting. You can easily convert RMS to peak or peak to RMS, because you know the waveform is a clean sine.
For RMS to Peak Voltage conversion just multiply RMS value by sqrt(2)
For Peak to RMS Voltage conversion just divide Peak value by sqrt(2)
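The two conversions above, as a minimal sketch (valid only for a clean sine, as stated):

```python
import math

def rms_to_peak(vrms):
    """Sine wave only: peak = RMS * sqrt(2)."""
    return vrms * math.sqrt(2)

def peak_to_rms(vpeak):
    """Sine wave only: RMS = peak / sqrt(2)."""
    return vpeak / math.sqrt(2)

# e.g. the +4 dBu reference level of 1.228 V RMS
print(round(rms_to_peak(1.228), 3))  # 1.737
```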
I want to thank you all for your input, patience and help. I know everyone was trying to assist - some with different perspectives.
For me this is resolved - there was a slight discrepancy in the readings, but I put that down to my TRS cable and the multimeter itself.
I was able to get the values I wanted, namely, spot on 1.228 and 0.775 - but rather than use -20dBFS I adjusted the sine wave level to -19.77dBFS, and the meter read 1.228. For 0.775 I again had to change -16dBFS to -15.8dBFS. Not optimal - but the world isn't perfect, and the point of the tests was to establish whether my outputs were calibrated to -20 or -18...and I can safely say they were calibrated for -20dBFS.
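The adjustments described above can be read directly as dB errors, which is a quick way to see why the -20dBFS conclusion is safe: both offsets are a fraction of a dB, far smaller than the 2 dB gap that would separate a -20dBFS calibration from a -18dBFS one.

```python
# dB the generator level had to be nudged to hit each target voltage
offsets = {
    "1.228 V target": -19.77 - (-20.0),  # 0.23 dB
    "0.775 V target": -15.8 - (-16.0),   # 0.20 dB
}
for target, db in offsets.items():
    print(f"{target}: {db:+.2f} dB off nominal")
```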