So when I read the spec for, say, a function generator, it might say 10 Vpp max, 50 ohm output. But that's specified for the output of the device, after the resistor in the output stage. Here the spec is 5 V output high, but before the internal 50 ohm resistor. Why the difference in spec? Why would you spec the output voltage for a system that isn't terminated into the same impedance?
Thanks again.
Unless otherwise stated, the output voltage is specified with no load. The output impedance is the internal resistance of the output stage. In some devices it isn't constant: it can change with the power supply voltage (look at the specifications for CMOS logic ICs, such as the CD4011 or 74HC00) or with frequency, as is the case with an audio amplifier.
There aren't different specifications, and there is no requirement that the output be terminated into the same impedance unless stated otherwise. Ideally, the terminating impedance should be as high as possible to minimize the voltage dropped across the output resistance. The loaded output voltage can be calculated using Ohm's law.
Work out the current:

I_OUT = V_SUPPLY / (R_OUT + R_LOAD)

Now V_OUT is simply the supply voltage minus the voltage dropped across the output resistance:

V_OUT = V_SUPPLY - R_OUT * I_OUT
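As a quick sketch of the two equations above (my own illustration, with example values): a generator spec'd at 10 Vpp open-circuit with a 50 ohm output resistance delivers half that voltage into a matched 50 ohm load, but nearly the full voltage into a high-impedance load, which is exactly the discrepancy in the question.

```python
def v_out(v_supply, r_out, r_load):
    """Loaded output voltage: V_out = V_supply - R_out * I_out (Ohm's law)."""
    i_out = v_supply / (r_out + r_load)   # current through source and load in series
    return v_supply - r_out * i_out       # equivalently V_supply * r_load / (r_out + r_load)

# Hypothetical 10 Vpp (open-circuit) generator with 50 ohm output impedance:
print(v_out(10.0, 50.0, 50.0))   # matched 50 ohm load  -> 5.0 (half the voltage)
print(v_out(10.0, 50.0, 1e6))    # 1 Mohm scope input   -> ~10 (almost no drop)
```

This is why many generators carry two figures: the "into 50 ohm" number and the open-circuit number, which is twice as large.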
Signal generators typically have a 50Ω or greater output impedance to protect the output from damage. It limits the short-circuit current and any voltage spikes when driving an unterminated piece of coaxial cable.
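To see the protection effect numerically (a sketch with assumed example values, not figures from any datasheet): the series output resistor caps the worst-case current into a dead short at V_SUPPLY / R_OUT.

```python
# Assumed example: 10 V open-circuit source behind a 50 ohm output resistor.
v_supply = 10.0   # open-circuit output voltage, volts
r_out = 50.0      # internal output resistance, ohms

i_sc = v_supply / r_out   # short-circuit current: load resistance is zero
print(i_sc)               # 0.2 A worst case, regardless of what the cable does
```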