I've been watching a few videos about how to use the decibel system for evaluating signal levels.
Notably EEVBlog #49 and several from Alan Wolke's great channel (
https://www.youtube.com/user/w2aew/videos).
These tutorials made me want to try this for real with some actual signals.
I have access to some test gear:
- DSO with built-in signal generator - Keysight EDUX1052G
- Attenuator box - Marconi TF2163S (July 1969 - Apollo 11 vintage! In beautiful condition too)
- Spectrum Analyser - TinySA (tinysa.org)
Here's what I'd like to do (in fact I've already tried it, but I didn't understand my results)...
- Generate a sinusoidal signal: 10 MHz, 1 mV p-p, no DC offset, output load set to "50 Ohm"
- Pass that through the attenuator box set to 30 dB or similar
- Measure the resulting signal using the spectrum analyser, set to have no internal input attenuation, display units set to dBmV
My thinking was this: if I set the spectrum analyser to dBmV units (1 mV reference) and feed it a 1 mV signal from the sig-gen, the source should register as 0 dBmV, so the spike on the display should sit at 0 dBmV minus whatever attenuation the Marconi is set to.
But that's not what I see. I have the Marconi attenuator set to 30 dB, but the spectrum analyser tells me that the peak is at -40 dBmV.
So to my mind there is a discrepancy of 10 dB between the sig-gen and the spectrum analyser. I must be missing something, of course.
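To make sure my dB arithmetic itself isn't the problem, here is the calculation I'm doing, as a quick Python sketch. It assumes (as my reasoning above does) that the 1 mV signal corresponds to 0 dBmV:

```python
import math

def v_to_dbmv(v_mv):
    """Voltage in mV expressed in dBmV (reference: 1 mV)."""
    return 20 * math.log10(v_mv)

source_level = v_to_dbmv(1.0)        # 1 mV -> 0 dBmV
attenuation_db = 30                  # Marconi setting
expected = source_level - attenuation_db
measured = -40                       # what the TinySA actually shows

print(expected)                      # -30.0 dBmV expected
print(measured - expected)           # -10.0 dB discrepancy
```

So on paper I expect -30 dBmV and I see -40 dBmV.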
When I change the attenuation on the Marconi, the spectrum analyser's reading tracks it exactly, dB for dB.
In other words the measured level is always 10 dB below what the attenuation setting alone would predict...
- 10 dB attenuation measures as -20 dBmV
- 20 dB attenuation measures as -30 dBmV
- 30 dB attenuation measures as -40 dBmV
- 47 dB attenuation measures as -57 dBmV ... etc
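Tabulating those readings against the naive expectation (0 dBmV source minus the attenuation) confirms the offset is a constant -10 dB. A small Python check:

```python
# attenuation (dB) -> measured level (dBmV) read off the TinySA
readings = {10: -20, 20: -30, 30: -40, 47: -57}

for atten_db, measured_dbmv in readings.items():
    expected_dbmv = 0 - atten_db          # assumes the source is 0 dBmV
    offset = measured_dbmv - expected_dbmv
    print(f"{atten_db:2d} dB attenuation: measured {measured_dbmv} dBmV, "
          f"offset {offset} dB")
# every row comes out with an offset of -10 dB
```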
So where is this 10 dB difference coming from? I'm using relatively short coax cables and proper coax connectors (N-Type, BNC, SMA with 50 Ohm adapters throughout). None of this is hacked together.
What am I doing wrong or misunderstanding?
Thank you for reading, and for your time.