Hi all!
My question is: why are frequency reference inputs (e.g. 10 MHz) not always 50 ohms?
I've been wondering about this for ages, but have never found an answer. The only reasonable explanation I can come up with is that a high-impedance input lets you daisy-chain the reference through several devices, with a single terminator at the last device in the chain.
I just tested this with my beautiful old Agilent 53131A, and its reference input is not even close to 50 ohms. I hooked a sig. gen. to a tee at the freq. ref. input of the 53131A, then ran through another tee to a scope, and compared the level with and without a 50 ohm terminator on the scope's tee. Instead of halving, the level only shrank about 10% when I added the terminator.
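For anyone who wants to check my reasoning, here's a quick sketch of the divider math for that tee test. It assumes an ideal 50 ohm source impedance in the sig gen, a high-impedance (1 Mohm) scope input, and cables short enough that everything sits at one node; none of those assumptions come from my actual bench setup, they're just the textbook case.

```python
# Sanity check of the tee-test divider math.
# Assumptions (not measured): ideal 50 ohm source impedance,
# 1 Mohm scope input, electrically short cables (single node).

RS = 50.0        # signal generator source impedance (ohms)
R_SCOPE = 1e6    # scope input impedance (ohms)
R_TERM = 50.0    # feedthrough terminator (ohms)

def parallel(*rs):
    """Parallel combination of resistances."""
    return 1.0 / sum(1.0 / r for r in rs)

def node_voltage(r_load):
    """Node voltage as a fraction of the source EMF (divider against RS)."""
    return r_load / (r_load + RS)

# Candidate reference-input impedances: 50 ohm, 1 kohm, 1 Mohm
for r_in in (50.0, 1e3, 1e6):
    v_open = node_voltage(parallel(r_in, R_SCOPE))
    v_term = node_voltage(parallel(r_in, R_SCOPE, R_TERM))
    print(f"ref input = {r_in:>9.0f} ohm: terminator drops level "
          f"to {v_term / v_open:.2f} of unterminated")
```

Under those assumptions, a high-impedance input would show the level halving when the terminator goes on, and a true 50 ohm input would drop it to about two-thirds, so my ~10% drop doesn't match either case cleanly.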
Any help would be appreciated!