Think of a 25-watt CW signal... That's 25 watts of power at "one" frequency.
Then think of a 25-watt FM-modulated signal where the baseband signal is a 0.1 Hz sine wave that shifts the carrier slowly between 100 MHz and 100.1 MHz (100 kHz bandwidth).
Basically you have a CW signal that drifts back and forth between 100 MHz and 100.1 MHz every 10 seconds. Now do it faster.
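To put numbers on that picture, here is a minimal sketch (a complex-baseband simulation with assumed values: carrier centred at 100.05 MHz, 50 kHz peak deviation, 200 kHz sample rate) showing that the slowly swept signal occupies roughly 100 kHz even though at any instant it is a single tone:

```python
import numpy as np

fs = 200e3    # complex-baseband sample rate (Hz), an assumed value
fm = 0.1      # modulating sine frequency (Hz) -> one sweep every 10 s
fdev = 50e3   # peak deviation (Hz): carrier swings +/-50 kHz about 100.05 MHz
t = np.arange(0, 10, 1 / fs)                       # one full 10-second sweep

inst_freq = fdev * np.sin(2 * np.pi * fm * t)      # instantaneous frequency offset
phase = 2 * np.pi * np.cumsum(inst_freq) / fs      # integrate frequency to get phase
baseband = np.exp(1j * phase)                      # constant-envelope FM signal

spectrum = np.abs(np.fft.fftshift(np.fft.fft(baseband))) ** 2
freqs = np.fft.fftshift(np.fft.fftfreq(len(t), 1 / fs))

# 99% occupied bandwidth: span between 0.5% and 99.5% of cumulative power
cdf = np.cumsum(spectrum) / spectrum.sum()
lo = freqs[np.searchsorted(cdf, 0.005)]
hi = freqs[np.searchsorted(cdf, 0.995)]
print(f"99% occupied bandwidth = {(hi - lo) / 1e3:.0f} kHz")  # should be close to 100 kHz
```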
To get better receiver sensitivity do I need to narrow my receiver bandwidth?
The problem is: If you make the receiver bandwidth narrower than the occupied bandwidth of the useful signal, then you will also lose parts of the useful signal (which you probably don't want). So the only thing that really helps is a narrower bandwidth of the signal (then the receiver bandwidth can be narrower, too), but this means that it can carry less information per unit of time.
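As a rough illustration of why the receiver bandwidth matters at all: the thermal noise a receiver collects is kTB, so it grows with bandwidth. A minimal sketch, assuming a 290 K noise temperature and ignoring the receiver's own noise figure:

```python
import math

k = 1.380649e-23   # Boltzmann constant (J/K)
T = 290            # assumed noise temperature (K)

def noise_floor_dbm(bandwidth_hz):
    """Thermal noise power kTB, expressed in dBm."""
    return 10 * math.log10(k * T * bandwidth_hz * 1000)   # * 1000 converts W to mW

for bw in (100, 10e3, 1e6):
    print(f"{bw:>10.0f} Hz bandwidth -> noise floor {noise_floor_dbm(bw):7.1f} dBm")
# 100 Hz  -> about -154 dBm
# 10 kHz  -> about -134 dBm
# 1 MHz   -> about -114 dBm
```

So narrowing the receiver bandwidth lowers the noise floor and improves sensitivity, but only until the filter starts cutting into the signal itself.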
Note, however, that in information-theoretic terms (i.e. Shannon's law) the signal bandwidth is not the same as the bandwidth occupied by the transmitter/receiver.
The classic example of that is DSSS, where "The direct-sequence modulation makes the transmitted signal wider in bandwidth than the information bandwidth." https://en.wikipedia.org/wiki/Direct-sequence_spread_spectrum
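A small sketch of that distinction, with an assumed received signal power of 1 pW and kTB thermal noise at 290 K: spreading the same total power over a wider RF bandwidth lowers the SNR measured in that bandwidth, yet the Shannon capacity does not collapse, because capacity grows with bandwidth for a fixed signal power:

```python
import math

k, T = 1.380649e-23, 290   # Boltzmann constant, assumed 290 K noise temperature
signal_w = 1e-12           # assumed received signal power: 1 pW (-90 dBm)

def capacity_bps(bandwidth_hz):
    """Shannon capacity C = B * log2(1 + S/N), with thermal noise N = kTB."""
    snr = signal_w / (k * T * bandwidth_hz)
    return bandwidth_hz * math.log2(1 + snr)

for bw in (100, 1e6):
    snr_db = 10 * math.log10(signal_w / (k * T * bw))
    print(f"{bw:>9.0f} Hz RF bandwidth: SNR {snr_db:5.1f} dB, "
          f"capacity about {capacity_bps(bw) / 1e3:.1f} kbit/s")
```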
Yes, of course. I did not want to go that far, as even the "simple case" was obviously not completely understood.
You have a 100 W transmitter. You transmit a 100 Hz-wide signal. I feel like the power density is 1 W/Hz and your signal strength is strong and should go further than say...
You have a 100 W transmitter. You transmit a 1,000,000 Hz-wide signal. I feel like the power density is 0.0001 W/Hz and your signal strength is weaker overall for the same distance.
Calculating RF path loss doesn't seem to take signal bandwidth into account. It says that for a given distance/frequency/power both of those signals will attenuate the same (or thereabouts, given the low/high frequency edges).
To get better receiver sensitivity do I need to narrow my receiver bandwidth?
Even given the receiver sensitivity/bandwidth relationship, I just can't see how the signal attenuates the same whether the power is spread over 100 Hz or 1,000,000 Hz. Help me out here.
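The numbers described in the question can be put side by side in a small sketch (assumed values: 100 MHz carrier, 10 km path, 290 K noise temperature, ideal isotropic antennas): the Friis path-loss term contains no bandwidth at all, so both signals arrive equally attenuated, while the noise the receiver integrates grows with bandwidth, which is where the wider signal ends up with the lower SNR:

```python
import math

k, T = 1.380649e-23, 290                  # Boltzmann constant, assumed 290 K
tx_power_dbm = 10 * math.log10(100e3)     # 100 W expressed in dBm (+50 dBm)
freq_hz, dist_m = 100e6, 10e3             # assumed 100 MHz carrier, 10 km path

# Friis free-space path loss: note there is no bandwidth term in the formula
fspl_db = (20 * math.log10(dist_m) + 20 * math.log10(freq_hz)
           + 20 * math.log10(4 * math.pi / 3e8))

rx_power_dbm = tx_power_dbm - fspl_db     # identical for the 100 Hz and 1 MHz signals

for bw in (100, 1e6):
    noise_dbm = 10 * math.log10(k * T * bw * 1000)   # noise floor grows with bandwidth
    print(f"BW {bw:>9.0f} Hz: received {rx_power_dbm:6.1f} dBm, "
          f"noise {noise_dbm:7.1f} dBm, SNR {rx_power_dbm - noise_dbm:5.1f} dB")
```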