I have an RTL-SDR dongle hooked up to HDSDR. The spectrum display shows signal power in decibels. I understand the readings aren't calibrated, but is there any information on what they're relative to? It doesn't seem to be dBm.
I ask because I'm trying to look at a signal that, on the scope, shows as 1.8 Vpp (not centered on ground, but swinging 0-1.8 V), which should be about 9 dBm. On HDSDR it's coming in around -50 dB. (The signal is from a 50-ohm source and is fed to the dongle via a 50-to-75-ohm adapter, since the dongle is 75 ohm.) Admittedly, it's an ugly square wave, so a lot of the power could be in the various harmonics it's throwing (and it's throwing a lot of them), but that much?
I don't have any other way to measure power that low at those frequencies; my power meter can only resolve down to about 0.1 W (+20 dBm).
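For reference, here's a quick sanity check of that 9 dBm figure, assuming a sine wave into a resistive load (the usual RF convention; the helper name is mine):

```python
import math

def vpp_to_dbm(vpp: float, z_ohms: float = 50.0) -> float:
    """Convert peak-to-peak voltage to dBm, assuming a sine wave
    into a resistive load."""
    v_rms = vpp / (2 * math.sqrt(2))         # sine: Vrms = Vpp / (2*sqrt(2))
    p_watts = v_rms ** 2 / z_ohms            # P = Vrms^2 / R
    return 10 * math.log10(p_watts / 1e-3)   # dBm is power relative to 1 mW

print(f"{vpp_to_dbm(1.8):.1f} dBm")  # 1.8 Vpp into 50 ohms -> about 9.1 dBm
```

A square wave doesn't follow the sine Vrms formula exactly, so this is only a ballpark, but it confirms the order of magnitude.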
It can't be relative to any fixed reference, because the receiver has such a large adjustable gain range.
You should be using another means of measuring your signal generator, such as a detector and meter. Your scope is much better than a TV receiver: the way the front end of these little $5 dongles is designed, it's just not linear.
Also, you would need an attenuator.
Not so much measuring as trying to get a feel for how it all works. Certainly the RTL-SDR isn't a measurement tool; it's not even close to being calibrated to much of anything. But since I don't have a real spectrum analyzer at this point, it at least lets me see the signal quality somewhat (and holy crap, the signal quality is terrible, but that's expected at this point).
You should fool around a bit with the rtl_power utility, part of the basic rtl-sdr install. However, you'll soon find that it can really only give you a relative measurement on many signals. The algorithm it uses for gain adjustment is not fully documented, and the way it adjusts itself is not favorable to that kind of measurement. Someone else could probably explain how it works better than I can. There are also longish time constants involved at a level that's not accessible to the receiver's user.
Also, 1.8 V peak-to-peak, if it were a radio signal (which would be more of an approximation of a sine wave), would be several orders of magnitude stronger than most of the signals a TV receiver sees. That said, I have been able to get signals that strong out of the HF bands using a high-Q magnetic loop (a tuned circuit resonant on one frequency, which also acts as a preselector, only letting that frequency through), so they are not unheard of. Unfortunately so, because signals that strong overload receivers.
Few receivers (certainly not the one in the RTL2832 chip in direct-sampling mode, which is not capable of that much dynamic range) can handle that much voltage without being completely overloaded.
So you absolutely have to attenuate them, a lot. You can make an L-pad with a variable potentiometer; it doesn't have to be fancy. If you do that, you can handle both strong and weak signals when you need to by riding the gain. That can work pretty well.
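If you'd rather build a fixed pad instead of riding a pot, one way to size it, assuming you only need the input side matched to 50 ohms (the output side of a simple L-pad is deliberately left mismatched; function name is mine):

```python
def l_pad(db: float, z0: float = 50.0):
    """Series/shunt resistor values for an L-pad matched at its
    input to z0. The output side is not matched."""
    k = 10 ** (db / 20)           # voltage attenuation ratio
    r_series = z0 * (k - 1) / k   # resistor in series with the signal
    r_shunt = z0 / (k - 1)        # resistor across the load
    return r_series, r_shunt

for db in (10, 20):
    rs, rp = l_pad(db)
    print(f"{db} dB pad: series {rs:.1f} ohms, shunt {rp:.2f} ohms")
```

A 20 dB pad comes out to about 45 ohms series and 5.6 ohms shunt; a potentiometer wired as the shunt leg gives you the variable version.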
My understanding is that 1.8 Vpp into 50 ohms is about 9 dBm. As far as I can tell, the input of the RTL-SDR is rated for right about that same 9 dBm (this is a coincidence; I didn't design for it or anything).
At that, it's coming in at -50 dB on the scale, way lower than what I usually get with even a very crappy antenna.
Now, I have some attenuating couplers. Before plugging the signal straight in, I first tried a -20 dB coupler, with the source connected to the coupler's "in" port, a 50-ohm terminator on the "out" port, and the 50-to-75 adapter and then the dongle on the coupled output. I couldn't pick the signal out of the noise floor. I then tried a -10 dB coupler, with the same result: I couldn't see the signal.
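A rough level budget for those two setups, assuming the source really is around +9 dBm and the nominal coupling figures hold (losses just add in dB):

```python
# Level arriving at the dongle through each coupler's coupled port,
# ignoring adapter and mismatch losses.
source_dbm = 9.0
for coupling_db in (20, 10):
    at_dongle = source_dbm - coupling_db
    print(f"-{coupling_db} dB coupled port: about {at_dongle:+.0f} dBm at the dongle")
```

That's roughly -11 dBm and -1 dBm, both of which should be far above a typical receiver noise floor, so losing the signal entirely suggests something beyond simple attenuation (coupler frequency range, mismatch, etc.) is going on.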
Now, connecting straight in, with just the 75-to-50 adapter, I'm at least seeing some sort of signal...
Note that I have no idea how the DC offset of the signal factors into all this.
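For what it's worth, the DC offset matters a lot here: for an ideal 50% duty 0-1.8 V square wave, half the total power sits in the DC term, which any AC-coupled input will simply block. A rough Fourier breakdown (the numbers below are my own back-of-envelope figures, not measurements):

```python
import math

def dbm(p_watts: float) -> float:
    return 10 * math.log10(p_watts / 1e-3)

Z = 50.0            # ohms
HI, LO = 1.8, 0.0   # square wave swings between these levels, 50% duty

dc = (HI + LO) / 2                       # 0.9 V DC component
p_total = ((HI**2 + LO**2) / 2) / Z      # mean-square voltage over R
p_dc = dc**2 / Z                         # power in the DC term alone
# Fundamental of a +/-0.9 V square wave: peak = (4/pi) * amplitude
fund_rms = (4 * (HI - dc) / math.pi) / math.sqrt(2)
p_fund = fund_rms**2 / Z

print(f"total:       {dbm(p_total):.1f} dBm")  # ~15.1 dBm
print(f"DC term:     {dbm(p_dc):.1f} dBm")     # ~12.1 dBm, blocked if AC-coupled
print(f"fundamental: {dbm(p_fund):.1f} dBm")   # ~11.2 dBm
```

So the fundamental alone carries around +11 dBm (a square wave's fundamental is actually stronger than a sine of the same Vpp), with the rest spread across the odd harmonics. None of that explains a -50 dB reading by itself, but it does mean the 9 dBm sine estimate isn't quite the right number for any single spectral line.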