I just built an RF power detector using an Analog Devices ADL5906 evaluation board (I knew I would screw it up if I tried to build the RF circuit part myself). The detector itself is spec'd for 10 MHz to 10 GHz, and is good down to about -65 dBm at the lowest frequencies and -30 dBm at 10 GHz.

It seems to be working fine, but I'm puzzled by the readings I'm getting. There seems to be a general background level in my area of around -40 to -45 dBm, which applies whether I'm in my house with all devices turned off, around my neighborhood, or a couple of miles up in the hills behind us with only widely scattered houses. I would expect a background below -100 dBm (effectively zero on my meter) when not anywhere near a transmitter. Is my expectation totally wrong? The nearest cell tower (that I know of) is on top of a hill but quite far away (a couple of miles at least). I've checked the circuit carefully, and if I disconnect the antenna, the reading does go to zero. (The antenna is from WA5VJB and is spec'd for 700 MHz to 26 GHz.)

Is there something about summing the signals from multiple low-level sources across a wide bandwidth that adds up in a non-intuitive way? Or maybe it's black helicopters (there is an AFB about 20 miles away).
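To make that summing question concrete, here's the arithmetic I have in mind: powers in dBm don't add directly, so you have to convert to milliwatts, sum, and convert back. A quick sketch (the source count and level are made-up numbers, purely for illustration):

```python
import math

# Powers in dBm must be converted to linear units (mW) before summing
# uncorrelated sources, then converted back to dBm.

def dbm_to_mw(p_dbm):
    return 10 ** (p_dbm / 10)

def mw_to_dbm(p_mw):
    return 10 * math.log10(p_mw)

# Hypothetical environment: 300 uncorrelated sources at -70 dBm each,
# spread across the detector's 10 MHz to 10 GHz passband.
signals_dbm = [-70] * 300

total_mw = sum(dbm_to_mw(p) for p in signals_dbm)
print(f"Total: {mw_to_dbm(total_mw):.1f} dBm")  # prints about -45.2 dBm
```

So a few hundred sources that are each individually well below the detector's sensitivity at the high end would land right in the -45 dBm range I'm seeing, which is what makes me wonder about wideband summing rather than any single transmitter.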