I'm attempting to use an HP 33120A AWG as a calibration source to verify the frequency response of an ADC in a machine vibration measurement device. The idea is to use the calibrated AWG in place of a physical vibration standard (shaker table) to quickly verify the measurement electronics without dismantling the transducer (IEPE, piezo with built-in buffer amp).

Anyway, I just noticed that the amplitude setting on the HP assumes a 50R load impedance and appears to be open loop, meaning I can't believe its displayed amplitude when terminating with a resistance other than 50R. In High-Z mode, my ADC loads the signal noticeably. We're dealing with frequencies only up to 5kHz over a couple of metres of RG56, so I'm ignoring transmission line effects. A 100Hz 1V RMS sine wave gives significantly different readings (±5%) when measured simultaneously with a Rigol DS1054Z and a Fluke 87 IV. So I have three numbers that don't match despite all instruments being in cal.
ADC input impedance is 100k.
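For reference, here's the quick divider calculation I'm working from (a sketch, assuming the generator behaves as an ideal source behind a 50R output resistance; the load values are just my examples):

```python
# Voltage divider formed by the generator's 50R output resistance and the load.
# The 33120A's displayed amplitude assumes a 50R load, so the open-circuit
# (unloaded) amplitude is twice the displayed value.

def loaded_amplitude(v_displayed, r_load, r_source=50.0):
    """Actual amplitude at the load for a given displayed amplitude."""
    v_open_circuit = 2.0 * v_displayed
    return v_open_circuit * r_load / (r_load + r_source)

print(loaded_amplitude(1.0, 50.0))   # matched 50R load: reads as displayed, 1.0
print(loaded_amplitude(1.0, 100e3))  # 100k ADC input: ~1.999, i.e. ~0.05% shy of double
```

So with my 100k input the "take double the display" rule should only be wrong by about 0.05%, which is nowhere near the 5% I'm seeing.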
Now, my questions:
-For the sig gen driving my ADC, am I better off terminating the AWG with 50R, or just taking double the indicated amplitude as the real value?
-If I terminate, I assume I need a precise 50R terminator; what's the common practice?
-The Fluke and Rigol disagree by 5%. Any ideas what's going on? Perhaps I'm expecting too much of the TRMS calculation in the scope?
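On that last point, one effect I can at least quantify: an RMS measurement computed over a record that doesn't contain a whole number of cycles picks up an error that depends on how many cycles are captured. A quick numerical sketch (pure Python; the sample rate and record lengths are made up, not the Rigol's actual figures):

```python
import math

FS = 100_000        # assumed sample rate, Hz
F = 100             # test-tone frequency, Hz
AMP = math.sqrt(2)  # peak amplitude of a 1 V RMS sine

def measured_rms(n_samples):
    """RMS of a sampled 1 V RMS sine, computed over n_samples points."""
    total = sum((AMP * math.sin(2 * math.pi * F * i / FS)) ** 2
                for i in range(n_samples))
    return math.sqrt(total / n_samples)

print(measured_rms(10_000))  # exactly 10 cycles: ~1.000
print(measured_rms(1_100))   # 1.1 cycles: a few percent low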