Hi,
Here is a beginner's question. Despite having some experience with TV, CATV, and SAT equipment (both reception and broadcast), as well as other test equipment, I am not embarrassed to ask it:
What does 50 Ohm or 75 Ohm on cables, input connectors, etc., really mean?
I know that for networking it was decided to use 50 Ohm cables and terminations, while for TV applications 75 Ohm seemed to be the better choice.
My questions:
1) If I measure the same signal on a TV field meter (75 Ohm) or a spectrum analyzer (50 Ohm), I get the same reading, within the precision of the equipment used. I know that "it is not a measurable difference for these applications" - well, why, then, are two different values used at all? (My rough calculation of the expected mismatch error is below, after question 3.)
2) I have a 50/75 Ohm coupler from HP, which makes no difference whether it is inserted or not, apart from the roughly 2 dB loss from inserting it in the first place.
3) What does 50/75 Ohm mean? Is that the resistance of the cable? Of the terminator? What if the cable is 1 m or 100 m long?
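
For what it's worth, here is the back-of-envelope check behind question 1. This is just my own application of the standard reflection-coefficient formula to a direct 75-to-50 Ohm connection (the scenario and the ~0.18 dB result are my assumptions, not from any datasheet), so please correct me if I am misusing it:

```python
# Rough check: how much power is lost when a 75 Ohm source
# drives a 50 Ohm input directly (no matching pad)?
import math

z_source = 75.0  # Ohm, e.g. the TV/antenna cable side
z_load = 50.0    # Ohm, e.g. the spectrum analyzer input

# Reflection coefficient at the mismatched interface
gamma = abs(z_load - z_source) / (z_load + z_source)   # = 0.2

return_loss_db = -20 * math.log10(gamma)               # ~14 dB
mismatch_loss_db = -10 * math.log10(1 - gamma**2)      # ~0.18 dB

print(f"|Gamma|       = {gamma:.2f}")
print(f"Return loss   = {return_loss_db:.1f} dB")
print(f"Mismatch loss = {mismatch_loss_db:.2f} dB")
```

If that 0.18 dB figure is right, it would explain why I cannot see any difference between the two instruments, which only makes question 1 more puzzling to me.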
Any insight and simple-to-understand explanation is more than welcome!
Regards,
Vitor