So we got our new cables with 50 ohm terminators. Some preliminary testing without the 450 ohm series resistor gave us this result (Ch1 is the old 75 ohm cable, Ch2 is the new cable).
We've gotten rid of a lot of the high-frequency crap that was throwing off our measurements before, probably just from eliminating reflections. Overall the result looks very good! The only strange thing is that I was expecting more DC attenuation from the 50 ohm terminator, but aside from the high-frequency noise the signal amplitude looks pretty much the same. Is the 450 ohm series resistor strictly necessary? I'm going to have to attenuate the +12V signal regardless, due to the terminator's 2 W power rating, but if I could skip it on our +5V and +3.3V signals that would be great.
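For my own sanity, here's the back-of-envelope dissipation math behind the 2 W concern. This is a sketch assuming the rails behave as stiff low-impedance sources driving the 50 ohm terminator directly (no series resistor), so the full rail voltage appears across the terminator:

```python
# Terminator power dissipation per rail with no series resistor,
# assuming an ideal low-impedance DC source (P = V^2 / R).
rails = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}
R_TERM = 50.0    # pass-through terminator, ohms
P_RATING = 2.0   # terminator power rating, watts

for name, v in rails.items():
    p = v ** 2 / R_TERM
    verdict = "OK" if p <= P_RATING else "exceeds 2 W rating"
    print(f"{name}: {p:.2f} W ({verdict})")
```

Only the +12V rail (2.88 W) is over the rating; +5V (0.50 W) and +3.3V (0.22 W) are comfortably under it, which is why I'd hoped to skip the series resistor on those two.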
EDIT: Actually I'm going to have to attenuate all of them anyway, or the power consumption will throw off our efficiency measurements. I'd still like to know for the record, though.
EDIT 2: In case it isn't clear, we're using a 50 ohm pass-through terminator with 1 MΩ termination on the scope. The reasoning is that if the technician accidentally applies a large DC voltage, or if DUT isolation breaks down, we'd rather lose a $25 terminator than a channel on our $3000 scope.
EDIT 3:
Whipped up a perfboard prototype of the test point with the 450 ohm series resistor. The results exceed expectations, though I'm not sure whether my amplitude readings are strictly accurate.
Ch1 = 50 ohm terminated cable with 450 ohm series resistor, X10 attenuation
Ch2 = 50 ohm terminated cable, X1 attenuation
Ch3 = 75 ohm unterminated cable, X1 attenuation
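Regarding whether the Ch1 amplitude should be trusted: a quick sketch of the expected scaling, assuming the 450 ohm series resistor and 50 ohm terminator form a plain resistive divider and the scope's 1 MΩ input only loads the node in parallel with the terminator:

```python
# Expected divider ratio for Ch1: 450 ohm series resistor into a
# 50 ohm terminator paralleled by the scope's 1 Mohm input.
R_SERIES = 450.0
R_TERM = 50.0
R_SCOPE = 1e6

# Effective load: terminator || scope input (barely shifts from 50 ohm).
r_load = 1 / (1 / R_TERM + 1 / R_SCOPE)
ratio = r_load / (R_SERIES + r_load)

print(f"effective load: {r_load:.4f} ohm")
print(f"divider ratio:  {ratio:.5f} (nominally 1/10)")
print(f"5 V rail reads: {5.0 * ratio * 10:.4f} V with the X10 setting")
```

The 1 MΩ input pulls the ratio only a few thousandths of a percent off 1/10, so with the X10 channel setting the displayed amplitude should be essentially exact; resistor tolerance on the 450 ohm and 50 ohm parts will dominate any error.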