EEVblog Electronics Community Forum
Products => Test Equipment => Topic started by: Zeyneb on July 08, 2022, 06:02:52 pm
-
Hi there,
I have the Picotech 3206MSO oscilloscope, which has 200 MHz bandwidth. I would like to measure signals with short rise times. The old book High-Speed Digital Design by Howard Johnson and Martin Graham recommends building a 20:1 attenuation transmission line probe, the kind with a 950 ohm resistor in series. I'm now looking at what I would need to buy to make this.
At the least I should buy these feed-through 50 ohm BNC terminators. Pico Technology offers these as the TA051, rated to 1 GHz bandwidth, at $21, which is significantly less expensive than the Pomona ones at about $50. Then some RG174 or RG58 coax. Digikey offers the Huber+Suhner brand sold per meter, and it is rated for RF. The question is: would it be hard to put a BNC connector on the end of it and maintain the performance? Or would it be better to buy a coax that already has the BNC connector fitted?
If it is fine to do myself, where can I find good instructions to get the expected performance?
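For anyone following along, the 20:1 figure is just a resistive divider into the terminated cable. A minimal sketch of the arithmetic (assuming the textbook 950 ohm series resistor and a 50 ohm termination at the scope end):

```python
# Quick sanity check of the 20:1 Z0 probe arithmetic (illustrative only;
# the classic 950 ohm + 50 ohm arrangement, not any specific product).
Z0 = 50.0          # coax characteristic impedance, terminated in 50 ohms at the scope
R_tip = 950.0      # series resistor at the probe tip

attenuation = (R_tip + Z0) / Z0     # voltage division ratio seen by the scope
R_load = R_tip + Z0                 # resistive loading presented to the circuit under test

print(f"Division ratio : {attenuation:.0f}:1")   # -> 20:1
print(f"Tip loading    : {R_load:.0f} ohms")     # -> 1000 ohms (plus a small tip capacitance)
```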
-
The problem with a 20:1 probe is that the loading is high; you would be better off finding a FET probe on eBay or similar.
-
To mount a BNC connector on a cable one needs a crimping tool to do a proper job; otherwise it is not that simple to make a good, lasting cable. I bought a toolkit for this a long time ago, back in the day when Ethernet still used coax cable for networking.
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1532527;image)
-
A crimping tool and crimp BNC connector for RG-174 sized cable works well; I have done it several times. I actually prefer clamp BNC connectors which are available for RG-174. Either will work to 1 GHz and beyond without issues.
A low-z probe will work, but at only 200 MHz, an active probe is feasible for home construction.
-
If you want to get the best from a passive Z0 probe I'd recommend using decent cable and avoiding cheap crimped BNC connectors. The VSWR of a basic crimped BNC can be quite poor up at UHF.
To minimise mismatch uncertainty issues the VSWR of the cable has to be good, the connector at the far end of the cable has to have low VSWR and the termination at the far end also has to have low VSWR. If the test gear at the far end is a basic 200MHz scope with a 1M input then I'd recommend having a good 10dB attenuator inline before the external 50R termination at the scope input. This will prevent the poor VSWR of the terminated scope from introducing too much uncertainty.
-
A low-z probe will work, but at only 200 MHz, an active probe is feasible for home construction.
Interesting, would there be some internet source to explain how to make such an active probe?
If you want to get the best from a passive z0 probe I'd recommend using decent cable and try and avoid cheap crimped BNC connectors. The VSWR of a basic crimped BNC can be quite poor up at UHF.
To minimise mismatch uncertainty issues the VSWR of the cable has to be good, the connector at the far end of the cable has to have low VSWR and the termination at the far end also has to have low VSWR. If the test gear at the far end is a basic 200MHz scope with a 1M input then I'd recommend having a good 10dB attenuator inline before the external 50R termination at the scope input. This will prevent the poor VSWR of the terminated scope from introducing too much uncertainty.
I'm no expert in RF. I don't understand what VSWR means. Would you take a look at Huber+Suhner, Inc. part RG_174_/U to see if that would be acceptable performance? If not any other coax recommendation?
https://ecatalog.hubersuhner.com/media/documents/datasheet/en/pdf/22510040
-
IIRC, this is one of the better Z0 probe construction threads, but a forum search will bring up many more... https://www.eevblog.com/forum/projects/lo-z-probe/
-
If not any other coax recommendation?
I used RG316 cable for mine but RG174 will probably be fine for use up through VHF. The real risk is the quality of the connector you put at the far end. If it has a poor match to 50 ohms (poor VSWR) then the probe will introduce lots of mismatch ripple. I used RG316 and a decent Suhner connector (not a crimped connector) for my x100 probe. I use this for looking at digital waveforms with my old 500MHz scope. The probe itself has 30dB loss and I add a decent 10dB attenuator after it to get the 40dB loss on the VNA as below.
For general purpose stuff it works fine although I keep meaning to make a proper version in a decent holder. The other niggle with probes like this is that it can be very difficult to get a reliable connection to the test point and the PCB ground at the same time. This connection has to be good over a huge bandwidth for both the tip and the ground simultaneously. I ended up fitting some really tiny 'spring' tips to my probe. These are about 0.5mm diameter but the centre part can spring in and out like a trombone. The inner part that springs is like a needle and is very sharp. This makes a huge difference in terms of reliability of measurement.
I took a plot up to 1GHz on 5dB/div but the scope only has a 500MHz bandwidth. See below. Note the lack of any mismatch ripple. It isn't quite compensated at 1GHz but it is useable up to about 3GHz for general faultfinding if used with a spectrum analyser for example.
-
I also made a precision preamp that can be used with it when connected to a VNA or a spectrum analyser. It can also be used with the 500MHz scope. This old preamp was mainly used as a preamp for my old HP8568B spectrum analyser when doing noise figure measurements with a noise source. The preamp noise figure has to be very low (just under 3dB) and the input VSWR has to be very low and the response has to be very flat from LF up to about 1GHz with no dips or ripple anywhere.
The input VSWR of the preamp is very low and is lower than 1.05:1 from 1MHz up to about 500MHz.
So this old preamp was designed with some care to achieve the results below. When used with the Z0 probe and the scope I don't need the 10dB attenuator but I have to fit a 2dB attenuator after the preamp to get 0dB net gain from about 300kHz to 500MHz.
-
A low-z probe will work, but at only 200 MHz, an active probe is feasible for home construction.
Interesting, would there be some internet source to explain how to make such an active probe?
Up to 200 MHz the simplest implementation would probably be a non-inverting current feedback operational amplifier configured for x2 gain driving a double-terminated line into the 50 ohm input of the oscilloscope. Some parts are fast enough to do this. The feedback resistor value can be raised to lower the bandwidth from its maximum and tweak the impulse response.
A lower-bandwidth but simpler setup might be the circuit shown below from Troubleshooting Analog Circuits by Robert Pease. It is slower, but the input capacitance is very low, which has advantages.
Either solution has a limited but still useful input voltage range. Increasing the input voltage range requires adding a compensated voltage divider to the input, which is a common feature of active probes, which often have x10 attenuation.
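As a rough sanity check of the gain budget described above (a sketch only; no specific amplifier or divider values are implied):

```python
# Gain budget for a x2 buffer driving a double-terminated 50 ohm line
# (illustrative numbers only; no particular op-amp is assumed).
amp_gain = 2.0            # non-inverting CFA configured for x2
back_termination = 0.5    # series 50R into a 50R-terminated line halves the signal
input_divider = 1.0       # set to 0.1 for an added compensated x10 attenuator

net_gain = input_divider * amp_gain * back_termination
print(f"Net gain at the scope: x{net_gain:g}")   # x1 without the divider, x0.1 with it
```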
If not any other coax recommendation?
I used RG316 cable for mine but RG174 will probably be fine for use up through VHF. The real risk is the quality of the connector you put at the far end. If it has a poor match to 50 ohms (poor VSWR) then the probe will introduce lots of mismatch ripple. I used RG316 and a decent Suhner connector (not a crimped connector) for my x100 probe. I use this for looking at digital waveforms with my old 500MHz scope. The probe itself has 30dB loss and I add a decent 10dB attenuator after it to get the 40dB loss on the VNA as below.
It is not an issue up to 200 MHz. The oscilloscope response is likely to have more variation than common coaxial cable and connectors will produce.
Tektronix encountered the problem you describe when they made their 7104 1 GHz scope, discovering that "50 ohm" cable and "50 ohm" BNCs are not 50 ohms. They made custom BNCs for exactly 50 ohms to go in their 1 GHz 7A29 vertical amplifier.
-
The problem with a 20:1 probe is that loading is high, better find a FET probe from eBay or similar
Numbers, not adjectives, please.
Include both resistance and capacitance; the latter is very significant at frequencies over 100MHz or so.
When doing modelling, don't forget the ground lead inductance.
If using a 50ohm terminator at the scope (rather than a pukka 50ohm input such as the Tek 485), don't forget that it will still have the scope's input capacitance in parallel with the 50ohm. That can be significant. Modelling and simulation will show the relative importance of the various parameters.
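To put a number on how much the scope's input capacitance spoils an external terminator, here is a small sketch (the 15 pF is an assumed value for illustration, not a measurement of any particular scope):

```python
# Effect of the scope's input capacitance sitting in parallel with an external
# 50 ohm feed-through terminator (1 Meg path ignored; 15 pF is an assumption).
import math

def parallel(a, b):
    return a * b / (a + b)

Z0 = 50.0
C_in = 15e-12                              # assumed scope input capacitance
for f in (50e6, 100e6, 200e6):
    Zc = 1 / (1j * 2 * math.pi * f * C_in) # capacitive reactance
    Z = parallel(50.0, Zc)                 # terminator || input capacitance
    gamma = (Z - Z0) / (Z + Z0)
    vswr = (1 + abs(gamma)) / (1 - abs(gamma))
    print(f"{f/1e6:5.0f} MHz: |Z| = {abs(Z):5.1f} ohm, VSWR = {vswr:4.2f}")
```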
-
It is not an issue up to 200 MHz. The oscilloscope response is likely to have more variation than common coaxial cable and the connector will produce.
Yes, it probably doesn't matter that much below 200MHz. In case it's useful to anyone I kept plots of the change I saw when going from a cheap BNC crimp to a decent RF connector when using RG316 cable. The plots cover up to 1.5GHz and are shown below. There's a significant difference up at UHF but less so below 200MHz.
At this stage I hadn't compensated the response but you can see how much the choice of connector can change the probe response in terms of mismatch ripple.
The other issue will be the quality of the BNC 50 ohm termination and how it and the input impedance of the scope may contribute to yet more mismatch ripple. It really is up to the individual to decide what is acceptable when the complete probe is tested.
-
To try and put some numbers behind this stuff you can crudely estimate the peak to peak mismatch ripple if you know the VSWR looking into the tip end of the coax cable with the 950R resistor removed. Assuming the cable itself is decent this will be defined by the quality of the BNC connector, the 50R BNC termination and the impedance of the scope input.
The input impedance of a typical 200MHz scope is going to be fairly low up at VHF so it can easily degrade the VSWR. The source impedance looking into the cable at the tip end is typically about 1000 ohms because of the series 950R resistor. Ideally, the cable will look like a resistive 50R so the VSWR of the source resistor is about 20:1. In reality, the input to the cable will not be a resistive 50R because the scope and the termination won't be a very accurate 50 ohms.
If the source VSWR is 20:1 and the VSWR looking into the cable is 1.3:1 then there will be about +/- 1dB ripple if the cable is long enough to explore all phase angles. If the cable VSWR is 2:1 then the ripple (uncertainty) increases to about +/- 3dB. I've seen a few probes like this with about +/- 2dB ripple up at UHF because the design hasn't achieved a low VSWR looking into the coaxial cable.
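The +/- figures above follow from the usual mismatch-uncertainty expression. A quick check of the arithmetic (a sketch using the standard formula for ripple between two mismatches separated by an electrically long line):

```python
# Rough peak-to-peak mismatch ripple from the two VSWRs, assuming the cable is
# long enough for the reflections to pass through all phase angles.
import math

def gamma(vswr):
    return (vswr - 1) / (vswr + 1)

def ripple_db(vswr_source, vswr_load):
    g = gamma(vswr_source) * gamma(vswr_load)
    return 20 * math.log10((1 + g) / (1 - g))   # peak-to-peak ripple in dB

for vswr_load in (1.3, 2.0):
    pp = ripple_db(20.0, vswr_load)
    print(f"Source 20:1, load {vswr_load}:1 -> ~{pp:.1f} dB p-p (about +/- {pp/2:.1f} dB)")
```

This lands close to the +/- 1 dB and +/- 3 dB figures quoted above.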
-
The other reason for my choice of RG316 cable is that the dielectric material is PTFE so it doesn't melt with soldering. RG174 has a polyethylene dielectric so it melts really easily if soldering the braid or the inner conductor.
RG174 is much more flexible though so if you can avoid melting the dielectric it is still a good choice for a 200MHz probe especially if you ever want to be able to fold it all up into a storage pouch.
-
To try and put some numbers behind this stuff you can crudely estimate the peak to peak mismatch ripple if you know the VSWR looking into the tip end of the coax cable with the 950R resistor removed. Assuming the cable itself is decent this will be defined by the quality of the BNC connector, the 50R BNC termination and the impedance of the scope input.
The input impedance of a typical 200MHz scope is going to be fairly low up at VHF so it can easily degrade the VSWR. The source impedance looking into the cable at the tip end is typically about 1000 ohms because of the series 950R resistor. Ideally, the cable will look like a resistive 50R so the VSWR of the source resistor is about 20:1. In reality, the input to the cable will not be a resistive 50R because the scope and the termination won't be a very accurate 50 ohms.
If the source VSWR is 20:1 and the VSWR looking into the cable is 1.3:1 then there will be about +/- 1dB ripple if the cable is long enough to explore all phase angles. If the cable VSWR is 2:1 then the ripple (uncertainty) increases to about +/- 3dB. I've seen a few probes like this with about +/- 2dB ripple up at UHF because the design hasn't achieved a low VSWR looking into the coaxial cable.
The above may not apply to an oscilloscope at all because of how they are calibrated. The calibration source for transient response at the input connector is a fast reference edge. When the transient response calibration is done, the ripple should be removed by the opposite ripple in the response of the oscilloscope, but that does not account for reflections in the cable and who knows how long that is? At higher frequencies no cable is used, and at lower frequencies it should not matter if the pulse source has a reverse termination to absorb the reflection, and the oscilloscope may not be fast enough to see it anyway. At higher frequencies the probe itself does part of the transient response calibration for the oscilloscope, which is why Tektronix releases otherwise identical probe revisions for different oscilloscope models, or at least they used to.
Probes themselves are calibrated with a fast edge and a 25 ohm source impedance from a 50 ohm termination. How realistic is that? Probing digital lines usually means a source impedance of about 60 ohms. As a practical matter probe bandwidth depends on source impedance. Probe bandwidth itself is a questionable concept; it is better thought of as the bandwidth below which the probe will perform as specified and faithfully reproduce the signal.
Where it gets really interesting is that different manufacturers are not identical. In the past Tektronix calibrated their probes so that the oscilloscope shows the signal with the probe loading. HP calibrated probes to show what the signal should look like without the probe. Who knows what they do now.
So I would not worry about how accurate the 50 ohm termination is for a 200 MHz oscilloscope because performance depends on too many other things, and 200 MHz is just not that high. However there are plenty of reasons to want a low-z or active probe even at 200 MHz because the low input capacitance at the probe tip will make many measurements better, and some measurements possible. Try to get 1 GHz performance out of a 1 GHz oscilloscope and things are very different.
The other reason for my choice of RG316 cable is that the dielectric material is PTFE so it doesn't melt with soldering. RG174 has a polyethylene dielectric so it melts really easily if soldering the braid or the inner connector.
RG174 is much more flexible though so if you can avoid melting the dielectric it is still a good choice for a 200MHz probe especially if you ever want to be able to fold it all up into a storage pouch.
I agree completely. The various Teflon dielectric cables are easier to work with because the dielectric will not melt, but they are stiffer. RG-174 is good for flexibility, but can be difficult to solder. RG-178 is 50 ohm Teflon but even thinner than RG-316 or RG-174 so might have its place in some probing applications. Double shielded RG-223 and RG-400 are nice because the double shield makes the connector interface much tougher. In RF applications double shielded may be mandatory to prevent leakage.
-
The above may not apply to an oscilloscope at all because of how they are calibrated. The calibration source for transient response at the input connector is a fast reference edge. When the transient response calibration is done, the ripple should be removed by the opposite ripple in the response of the oscilloscope, but that does not account for reflections in the cable and who knows how long that is? At higher frequencies no cable is used, and at lower frequencies it should not matter if the pulse source has a reverse termination to absorb the reflection, and the oscilloscope may not be fast enough to see it anyway. At higher frequencies the probe itself does part of the transient response calibration for the oscilloscope, which is why Tektronix releases otherwise identical probe revisions for different oscilloscope models, or at least they used to.
I was really just describing how the frequency response of the scope can be affected by any mismatch at the scope end of the cable. There are equations for mismatch uncertainty that can show how much extra uncertainty (ripple) will be introduced if the cable is fairly long.
If it helps anyone see the point I'm making I did a quick video showing my old 500MHz Infinium scope looking at the swept output of my VNA on ports 1 and 2. The video starts with the green trace and this is a 50R to 50R connection to the scope. This shows a small amount of droop up at 500MHz. The scope response isn't flat to 500MHz as expected but the response is smooth.
The yellow trace shows a probe made with a series 1k resistor feeding about a 1m long RG174 cable that has a crimped BNC at the far end and I've selected the 1M input of the scope and fitted a commercial (unbranded) BNC through terminator at the scope input. There is a lot of ripple in the yellow trace and it's up to the individual to decide if this matters or not.
Not everyone will be limited to a 200MHz scope so I'm just showing everyone what can happen with a 500MHz scope.
This scope only has 8pF input capacitance and a 200MHz scope might have >15pF input capacitance. So the input VSWR of the 200MHz scope will be worse at 200MHz compared to the VSWR of the 500MHz scope at 200MHz.
https://www.youtube.com/watch?v=UrAXDzjJU_4
This is a really old scope so sorry for the quality of the display.
-
If it helps anyone see the point I'm making I did a quick video showing my old 500MHz Infinium scope looking at the swept output of my VNA on ports 1 and 2. The video starts with the green trace and this is a 50R to 50R connection to the scope. This shows a small amount of droop up at 500MHz. The scope response isn't flat to 500MHz as expected but the response is smooth.
The yellow trace shows a probe made with a series 1k resistor feeding about a 1m long RG174 cable that has a crimped BNC at the far end and I've selected the 1M input of the scope and fitted a commercial (unbranded) BNC through terminator at the scope input. There is a lot of ripple in the yellow trace and it's up to the individual to decide if this matters or not.
If you had "Z0 cable"" -> 3dB pad -> 50ohm terminator -> 1M//xpF scope input, would the "ripple" be reduced as predicted by theory/simulation?
-
Why do you want to crimp? Just buy a premade cable of the length you need with connectors and cut one connector off.
-
If you had "Z0 cable"" -> 3dB pad -> 50ohm terminator -> 1M//xpF scope input, would the "ripple" be reduced as predicted by theory/simulation?
It should be reduced a bit in theory.
The OP is looking at fast edges so I'd expect the discontinuity at the scope input to cause visible reflections if the mismatch/discontinuity was significant. If the cable had (say) 4ns delay then the impact of the discontinuity would appear at about 8ns from the start of the waveform edge as it has to make two trips along the cable to arrive back at the scope. If a fast clock was being examined rather than a single edge then I'd expect to see each discontinuity spoil the look of the waveform on the scope but it depends on the individual how significant this is to them. They might not care.
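To put numbers on the "two trips along the cable" point, a trivial sketch (the cable length and velocity factor below are assumed for illustration, not taken from the post above):

```python
# Where a reflection from the scope end shows up on the trace, for a given cable.
length_m = 1.2          # assumed cable length
vf = 0.66               # assumed velocity factor for solid-PE RG174; adjust for other cables
c = 3.0e8               # speed of light, m/s

one_way_ns = length_m / (vf * c) * 1e9
print(f"One-way delay  : {one_way_ns:.1f} ns")
print(f"Blip appears at: {2 * one_way_ns:.1f} ns after the edge")  # two trips along the cable
```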
The input VSWR of my Infinium scope is very low when the internal 50R mode is selected but it does degrade up towards V/UHF. I would recommend placing an attenuator ahead of the scope input for critical measurements. This does demand a scope with a low noise input. Maybe try 3dB, 6dB or 10dB to see if it helps.
I found that using an SMA connector on the end of the cable, then an SMA attenuator followed by a decent SMA to BNC adaptor and then selecting the internal 50R termination works the best for me.
-
I didn't think I would get so many responses on this topic.
Ok, my 200MHz scope input is 1Mohm and 13pF. I do have two channels.
Still, some parts of the discussions go over my head.
Maybe G0HZU is right and if RG-316 is easier to solder I might go for that. And I also agree with what Bud said and just buy a coax that has the BNC connectors.
I found Amphenol RF part 115101-01-M3.00. This might be the coax most suitable for me. Cut it in 2 equal lengths for my two channel scope.
This cable has a stranded (7-strand) centre conductor. And as I have learned, a 1k SMT resistor has less parasitic capacitance than a THT resistor. Would it be doable to solder a 0603 or 0805 chip resistor to this 7-strand conductor? Also, the other end of the resistor is not a suitable probe tip. How do I do all this in practical terms? I know I do like to use these E-Z hook clips:
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1533685;image)
But they add about 50mm of stub length before I can have the 1k resistor. Would that be OK up to 200 MHz?
-
I didn't thought I would get so many responses on this topic.
Ok, my 200MHz scope input is 1Mohm and 13pF. I do have two channels.
Still, some parts of the discussions go over my head.
Maybe G0HZU is right and if RG-316 is easier to solder I might go for that. And I also agree with what Bud said and just buy a coax that has the BNC connectors.
But consider not cutting the cable at all. From Gyro's post https://www.eevblog.com/forum/projects/lo-z-probe/msg801772/#msg801772
(https://www.eevblog.com/forum/projects/lo-z-probe/?action=dlattach;attach=182472;image)
Or don't modify the cable but solder your resistor and ground stake to one of these, and attach the standard cable to that in the usual way. That gives you easily changeable tips.
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1533697)
-
The input VSWR of my Infinium scope is very low when the internal 50R mode is selected but it does degrade up towards V/UHF. I would recommend placing an attenuator ahead of the scope input for critical measurements. This does demand a scope with a low noise input. Maybe try 3dB, 6dB or 10dB to see if it helps. I found that using an SMA connector on the end of the cable, then an SMA attenuator followed by a decent SMA to BNC adaptor and then selecting the internal 50R termination works the best for me.
If the termination on the scope side of the cable isn't perfect (e.g. a feed-through terminator on a capacitive high-Z input), then an additional 50 ohm source termination on the cable also helps to keep reflections from bouncing back and forth (i.e. using a resistive divider at the probe tip, instead of a series resistor only).
-
I do think it's worth it to try and minimise the VSWR at the scope end of the cable. I've also used the SMA end launch system shown by tggzzz and it means you can use a good quality RF cable. Minicircuits make an inline SMA DC block that is handy when using it with a spectrum analyser. It does become quite stiff to use if a really good cable is used.
Even with a 200MHz scope I think there can be issues with the integrity of the displayed waveform because the reflections that occur with a poorly terminated system happen within the cable and this is independent of the scope bandwidth. It's a function of the mismatch at the scope input. So a fast pulse edge will hit the discontinuity at the scope input and this will cause energy to slosh back and forth through the cable. This can cause a blip in the scope trace several nanoseconds after the first edge.
A lot depends on the rise-time of the pulse being measured. A 200MHz scope could have a rise-time of just under 2ns, so when probing a digital signal with a similar rise-time I think the above blip issue could occur. It probably won't be that significant but I'd expect it to spoil the look of the 'flat' section just after the pulse edge.
If measuring much slower rise-time waveforms (eg 10ns rise-time) then there probably won't be a problem. If using the probe for general sniffing of fast logic then I think even a 100MHz scope can be affected to some degree. If a really short cable is used (eg 30cm) then the blip reflection could begin to eat into the rising edge displayed on the scope but a lot depends on the rise-time of the waveform being measured.
-
This topic is a great use for spice, since it enables a quick sanity-check of your understanding of theory.
All simulations below are with a 6ns 50ohm delay line and a 450ohm resistor at the source end of the cable and a 50ohm terminator by the scope. The variants are time domain vs frequency domain, scope input capacitance of 0.1pF (proper 50ohm scope input) or 12pF ("high" impedance scope input), with or without a 50ohm termination at the source end of the cable, and source end termination but no scope end termination. Look at the schematic to see which is which.
Note the very significant differences in the frequency domain Y axis! (Click to embiggen :) )
What is and isn't acceptable is left as an exercise for the student.
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534483)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534507)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534489)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534513)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534495)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534501)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534525)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1534519)
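For anyone who wants to reproduce sweeps like these without firing up SPICE, here is a minimal frequency-domain sketch using ABCD matrices. It follows the 450 ohm / 6 ns / 50 ohm values described above, but everything else (the sweep range, ignoring the 1 Meg path, a lossless line) is a simplifying assumption:

```python
import numpy as np

def series(Z):
    return np.array([[1, Z], [0, 1]], dtype=complex)

def shunt(Y):
    return np.array([[1, 0], [Y, 1]], dtype=complex)

def tline(Z0, delay, w):
    th = w * delay                       # electrical length in radians
    return np.array([[np.cos(th), 1j * Z0 * np.sin(th)],
                     [1j * np.sin(th) / Z0, np.cos(th)]], dtype=complex)

def response(freqs, C_scope, source_term):
    out = []
    for f in freqs:
        w = 2 * np.pi * f
        ZL = 1 / (1 / 50.0 + 1j * w * C_scope)      # 50R terminator || scope capacitance
        M = series(450.0)                            # tip resistor
        if source_term:
            M = M @ shunt(1 / 50.0)                  # optional 50R at the cable's source end
        M = M @ tline(50.0, 6e-9, w)                 # 6 ns, 50 ohm delay line
        A, B = M[0, 0], M[0, 1]
        out.append(abs(ZL / (A * ZL + B)))           # |V_load / V_source|
    return np.array(out)

freqs = np.linspace(1e6, 1e9, 2000)
for C, term in [(0.1e-12, False), (12e-12, False), (12e-12, True)]:
    H = response(freqs, C, term)
    ripple = 20 * np.log10(H.max() / H.min())
    print(f"C = {C*1e12:4.1f} pF, source term = {term!s:5}: "
          f"{ripple:4.1f} dB peak-to-peak variation, 1 MHz to 1 GHz")
```

With the 12 pF load the sweep shows periodic ripple; adding the source-end termination largely removes it, at the cost of extra attenuation.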
-
If your scope does not have a 50 ohms input option (selected) then you do need to add something like this on the scope BNC connector:
https://www.distrelec.nl/en/feed-through-termination-50-ohm-2w-rohde-schwarz-hz22/p/11085041
A 50 ohm coax cable connected to just a 1 Mohm scope input is a pure reflection point for sharp, fast pulses. The coax cable is a transmission line, a waveguide, and it should be terminated, on both ends, with a resistance equal to its characteristic impedance: 50 ohms for the coax you're considering. For a Z0 probe, one can omit the 50 ohm resistor at the probe end because there is no need to cancel echoes travelling in the cable back from scope to probe, but then we must first guarantee that nothing could have bounced at the scope BNC input. The 50 ohm scope input resistor gives that guarantee. If it is there…
The spice models posted just above confirm it all…
-
Yes thanks, the downward blip seen in the first spice simulation is what I'd expect to see from a poorly terminated cable. The input capacitance of a typical 100-200MHz scope will probably be enough to cause this.
Note that the input impedance of a typical 200MHz bench scope is best modelled with something other than 1Meg in parallel with about 17pF. A better model is shown below. Up to maybe 300MHz the network below would model the input impedance of such a scope a lot better than a simple model of 1Meg || 17pF. To model it better, there needs to be some series resistance at the input and this is often in the range of 25R to 50R. I'm not sure if a 200MHz Picoscope will be like this though. Pico might use a different input to this.
-
What's the cause of the "extra" 25-50ohm resistor? If a length of lead, then why not an inductor?
-
Most scope inputs deliberately have some series resistance fitted near or at the input connector. This can act as a damping resistor.
-
The input resistor also acts as a fusible link. In the past, a 1/8 watt carbon composition resistor was commonly used directly at the BNC. Since the input impedance is nominally very high, the series inductance of a leaded part has little effect and this makes a convenient way to connect a chassis mounted BNC to the board with the input circuits like attenuators. Obviously this resistor is *not* present if an internal 50 ohm switchable termination is present and other arrangements are required. There is also likely a resistor directly in series with the FET gate at the FET after the input protection network.
Higher frequency probes may implement a t-coil termination at the BNC which has all kinds of advantages when driving a capacitive load. The oldest schematics of Tektronix high impedance *and* low-z probes show this but they were intended to be used with 1M inputs because internal terminations were not used then. The low-z probe has the termination built in and the t-coil drives the capacitive load of the 1 megohm input. The advantage is low capacitance just like a normal low-z probe, but it is of course also limited to lower voltages because of the low impedance termination and 1 megohm inputs have limited bandwidth. The t-coil provides the necessary broadband impedance matching to minimize the effect of the lumped capacitive load. I do not think anybody makes low-z probes like this anymore, but Tektronix still was in the 80s.
Nothing would prevent doing the same thing with a low-z probe intended to be used with an oscilloscope input that has a switched termination. I suspect Tektronix made a probe like this at some point for their 250 to 500 MHz instruments but a quick search did not find it. Details are scarce for early instruments which implemented switched termination, but they might have implemented the t-coil impedance matching internally. If so then they went to some effort to keep it secret; the relevant schematics that I have found suspiciously lack detail in that one area.
My fastest oscilloscope with an internal switched termination is only 300 MHz and I have never seen any extra aberration from using a 50 ohm source or probe interacting with the input capacitance. There is something there, but a fast high impedance probe shows the same thing.
-
Here's an old plot comparing a basic model against the input of a Tek 465 scope when measured with a VNA. The plot shows parallel resistance and parallel capacitance. You can see the input is only 1Meg || 20pF below about 1MHz.
By 100MHz the parallel input resistance Rp has fallen to about 160 ohms. The model shows very good agreement with the real scope.
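To illustrate the mechanism (not the actual 465 or Picoscope model, just an assumed series resistor feeding a nominal 1 Meg || 20 pF input), here is a small sketch showing how the parallel-equivalent resistance collapses with frequency once a few tens of ohms of series resistance are included:

```python
# How a small series resistance turns a "1 Meg || 20 pF" input into a few hundred
# ohms of parallel-equivalent resistance at VHF. Illustrative values only.
import math

R_series = 40.0      # assumed series damping/fusible resistor
R_shunt = 1e6        # nominal 1 Meg input resistance
C_shunt = 20e-12     # nominal input capacitance

for f in (1e6, 10e6, 100e6, 200e6):
    w = 2 * math.pi * f
    Z_shunt = 1 / (1 / R_shunt + 1j * w * C_shunt)
    Z = R_series + Z_shunt                      # series R feeding the shunt RC
    Y = 1 / Z
    Rp = 1 / Y.real                             # parallel-equivalent resistance
    Cp = Y.imag / w                             # parallel-equivalent capacitance
    print(f"{f/1e6:6.0f} MHz: Rp = {Rp:10.0f} ohm, Cp = {Cp*1e12:5.1f} pF")
```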
-
My fastest oscilloscope with an internal switched termination is only 300 MHz and I have never seen any extra aberration from using a 50 ohm source or probe interacting with the input capacitance. There is something there, but a fast high impedance probe shows the same thing.
A typical shunt 15-20pF capacitance from a scope input will cause quite a discontinuity up at VHF. At the tip end of the cable is a 950 ohm resistor so the cable is therefore mismatched at both ends up at VHF. This means the initial positive edge of the pulse hits a shunt capacitance at the scope. This is a lower impedance than 50R so there will be an inverted reflection that lasts only a brief amount of time as the capacitance charges up. This inverted blip will travel back up the cable where it hits the 950R resistor. This huge mismatch causes nearly all of the blip to reflect back but this time it will stay inverted as 950R resistance is much higher than 50R so there is no waveform inversion with this reflection. If the cable has a delay of 5ns then after 10ns from the rising edge of the pulse the blip will arrive back at the scope input where it will put a little dent in the top edge of the waveform about 10ns after the rising edge seen on the scope just as in the spice simulations.
This all assumes the pulse has a fairly fast rise-time. I'd expect a rise time of 2-3ns to cause visible artefacts on a typical 200MHz scope. The Pico scope only has 13pF input capacitance so maybe it won't be as affected as some other scopes that can have 15-20pF input capacitance.
-
Quite obviously there are at least three different fundamental approaches when it comes to 50 ohms inputs:
- Providing no internal 50 ohms termination at all. Users are forced to use external through terminators. Depending on the actual input impedance of the scope, this might work if the requirements are low, but usually yields poor results. Btw, the popular solution with the BNC-T and external end-terminator works just as well in my experience. This is not too surprising, since the quality of the terminator becomes almost irrelevant in view of the input capacitance of the 1 meg scope input.
- Providing internal 50 ohms termination by a simple 50 ohms termination resistor switched in parallel to the input (by means of a relay). This can be as bad as the external termination. In any case it is far from ideal.
- Providing an internal 50 ohms termination with reduced input capacitance. This means more or less a dedicated 50 ohms signal path, but it’s up to the frontend designer how far that goes. Such a solution which avoids excessive input capacitance can provide a much better internal 50 ohms termination with reasonable VSWR.
I happen to have some old measurements. See first screenshot.
RL_M-Pico_Ext_C-SDS_Int_Y-SDS_Ext
It shows the internal 50 ohms termination of an old low end 300 MHz DSO (cyan) and compares it to an external through terminator (yellow). Internal is a little bit better than external, but both are of very limited use. Here the internal termination clearly was just a resistor switched in parallel to the input without minimizing the capacitance.
There appears to be some industry standard where a VSWR of 1.5 is considered acceptable for a 50 ohm input of an oscilloscope. This is equivalent to a return loss of 14 dB. Using this criterion, the scope in question barely reached 100 MHz with its internal termination and the external one barely exceeded 50 MHz.
As a comparison, the magenta trace shows the external terminator on a 200 MHz PicoScope (3000 series). This proves that this approach can actually work, as long as the requirements are low. Better than 12 dB return loss (VSWR < 1.67) up to 500 MHz is quite a respectable result for just an external termination. Since the input capacitance is 14 pF as well, there must be a simple means like a 50 ohms series resistor in the input to mitigate the reflections.
By contrast, a modern upper entry level DSO like the SDS2000X Plus or HD provides a much better input match:
SDS2354X+_SWR_200mV
The maximum VSWR is 1.31, which is equivalent to 17.45 dB return loss.
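For reference, the VSWR figures quoted above map onto return loss with the standard conversion:

```python
# VSWR <-> return loss conversions for the figures mentioned in the post above.
import math

def return_loss_db(vswr):
    gamma = (vswr - 1) / (vswr + 1)
    return -20 * math.log10(gamma)

for vswr in (1.31, 1.5, 1.67):
    print(f"VSWR {vswr:4.2f}  ->  return loss {return_loss_db(vswr):5.2f} dB")
```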
-
I don't know why I've never noticed the series resistor before. Embarrassing.
Here's a old plot comparing a basic model against the input of a Tek 465 scope when measured with a VNA. The plot shows parallel resistance and parallel capacitance. You can see the input is only 1Meg || 20pF below about 1MHz.
By 100MHz the parallel input resistance Rp has fallen to about 160 ohms. The model shows very good agreement with the real scope.
Good result; that's a good justification for my nanoVNA :)
-
Yes, I think this stuff should be predictable as long as the scope input impedance vs frequency is known.
To give a demo, I've measured my old Tek TDS2012 DSO input with a VNA and produced a model for the input to match the VNA measurement. See the first plot below. I then put this model into Genesys (I could have just used the raw s1p data rather than the model) and then ran a time domain analysis using a pulse with a 2ns risetime and a 950R probe tip with a cable 1.2m long feeding into the TDS2012 but with a 50 ohm termination modelled at the input.
The second plot below shows the simulation vs the actual measurement on a fast logic gate that has about a 2ns rise-time. I used a probe with a series 1k ohm resistor and 1.2m of RG174 cable followed by a crimped BNC and an unbranded 50R through terminator and this was fed to the scope input on channel 1. I think the agreement is quite good. The other trace on channel 2 of the scope is a measurement using my probe that is well terminated with an SMA connector, SMA attenuator, SMA through termination and a SMA to BNC adaptor at the scope input. This looks a lot better.
-
I loaded a different simulation engine into Genesys and this allows a real time analysis of the probe system. This simulator isn't as accurate as the previous transient simulator but it is really fast. Therefore, some people may find the simulation to be useful in understanding how the length of the probe cable and the capacitance of the scope affect the response seen on a scope.
Sorry, there's no sound but it's fairly obvious what is being done in the video. This should be a lot more insightful than looking at a few plots from a simulator. You can see I tinkered with the rise-time of the pulse and the length of the probe cable and the capacitance of the scope input. I also added a variable attenuator at the input at the end of the video.
https://www.youtube.com/watch?v=JWNhYGJHUqM
-
My fastest oscilloscope with an internal switched termination is only 300 MHz and I have never seen any extra aberration from using a 50 ohm source or probe interacting with the input capacitance. There is something there, but a fast high impedance probe shows the same thing.
A typical shunt 15-20pF capacitance from a scope input will cause quite a discontinuity up at VHF. At the tip end of the cable is a 950 ohm resistor so the cable is therefore mismatched at both ends up at VHF. This means the initial positive edge of the pulse hits a shunt capacitance at the scope. This is a lower impedance than 50R so there will be an inverted reflection that lasts only a brief amount of time as the capacitance charges up. This inverted blip will travel back up the cable where it hits the 950R resistor. This huge mismatch causes nearly all of the blip to reflect back but this time it will stay inverted as 950R resistance is much higher than 50R so there is no waveform inversion with this reflection. If the cable has a delay of 5ns then after 10ns from the rising edge of the pulse the blip will arrive back at the scope input where it will put a little dent in the top edge of the waveform about 10ns after the rising edge seen on the scope just as in the spice simulations.
This all assumes the pulse has a fairly fast rise-time. I'd expect a rise time of 2-3ns to cause visible artefacts on a typical 200MHz scope. The Pico scope only has 13pF input capacitance so maybe it won't be as affected as some other scopes that can have 15-20pF input capacitance.
Has anybody actually observed that with a real low-z probe when a feedthrough termination is used?
I thought Tektronix might have made a separate set of low-z probes for their early or late oscilloscopes that used feedthrough termination that include the t-coil termination to prevent the problem, but I have not been able to find them. The only low-z probes in that category are the ones I mentioned which work with a 1 megohm input and have the termination built in. I know those show no such glitch.
From what I remember, the t-coil termination sort of transforms the capacitive load into a resistive load.
-
This should be a lot more insightful than looking at a few plots from a simulator. You can see I tinkered with the rise-time of the pulse and the length of the probe cable and the capacitance of the scope input. I also added a variable attenuator at the input at the end of the video.
Thanks a lot. This is very enlightening indeed!
-
Does the Douglas Smith version with termination at the probe side perform better? (at the cost of more noise.)
-
My fastest oscilloscope with an internal switched termination is only 300 MHz and I have never seen any extra aberration from using a 50 ohm source or probe interacting with the input capacitance. There is something there, but a fast high impedance probe shows the same thing.
A typical shunt 15-20pF capacitance from a scope input will cause quite a discontinuity up at VHF. At the tip end of the cable is a 950 ohm resistor so the cable is therefore mismatched at both ends up at VHF. This means the initial positive edge of the pulse hits a shunt capacitance at the scope. This is a lower impedance than 50R so there will be an inverted reflection that lasts only a brief amount of time as the capacitance charges up. This inverted blip will travel back up the cable where it hits the 950R resistor. This huge mismatch causes nearly all of the blip to reflect back but this time it will stay inverted as 950R resistance is much higher than 50R so there is no waveform inversion with this reflection. If the cable has a delay of 5ns then after 10ns from the rising edge of the pulse the blip will arrive back at the scope input where it will put a little dent in the top edge of the waveform about 10ns after the rising edge seen on the scope just as in the spice simulations.
This all assumes the pulse has a fairly fast rise-time. I'd expect a rise time of 2-3ns to cause visible artefacts on a typical 200MHz scope. The Pico scope only has 13pF input capacitance so maybe it won't be as affected as some other scopes that can have 15-20pF input capacitance.
Has anybody actually observed that with a real low-z probe when a feedthrough termination is used?
I thought Tektronix might have made a separate set of low-z probes for their early or late oscilloscopes that used feedthrough termination that include the t-coil termination to prevent the problem, but I have not been able to find them. The only low-z probes in that category are the ones I mentioned which work with a 1 megohm input and have the termination built in. I know those show no such glitch.
From what I remember, the t-coil termination sort of transforms the capacitive load into a resistive load.
Tek 485 5ns/div with an HP10020A probe using a spear at the source (<500ps risetime).
First picture is with the 485's internal 50ohm attenuator. The inductive blip at the 3rd division indicates the probe cable length.
Second picture is with the 485's 1M input and an external 50ohm through termination.
Interpretation is left as an exercise for the student.
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1535239)
(https://www.eevblog.com/forum/testgear/making-an-201-coax-probe/?action=dlattach;attach=1535245)
-
The internal wire in a scope probe's coax isn't made of copper; it is usually a higher resistance wire, something like 200 ohms, AFAIK.
-
The internal wire in the oscope probe's coax isn't made of copper, but it is a higher resistance wire usually, something like 200ohm, afaik..
In a standard "high" impedance *10 probe, yes.
Not in the HP10020, which is a Z0 probe. There the coax's conductor is <1ohm.
-
The Tek 485 is an interesting example because I think this scope switches in a completely different signal path when the internal 50 ohm termination is selected. There is a dedicated signal path that is optimised for 50 ohms and so this will show a resistive input without the shunt capacitance. I think this partly explains the big difference in the two scope plots by tggzzz. When the external termination is fitted in 1M mode the scope will have 20pF input capacitance and this explains why the blip is so prominent with this arrangement. I tried a similar test on my old Tek 465 scope last night using an external 50 ohm BNC termination and this showed the blip in the expected place. Sadly, the old 465 scope doesn't have an internal 50 ohm termination switch.
-
My fastest oscilloscope with an internal switched termination is only 300 MHz and I have never seen any extra aberration from using a 50 ohm source or probe interacting with the input capacitance. There is something there, but a fast high impedance probe shows the same thing.
A typical shunt 15-20pF capacitance from a scope input will cause quite a discontinuity up at VHF. At the tip end of the cable is a 950 ohm resistor so the cable is therefore mismatched at both ends up at VHF. This means the initial positive edge of the pulse hits a shunt capacitance at the scope. This is a lower impedance than 50R so there will be an inverted reflection that lasts only a brief amount of time as the capacitance charges up. This inverted blip will travel back up the cable where it hits the 950R resistor. This huge mismatch causes nearly all of the blip to reflect back but this time it will stay inverted as 950R resistance is much higher than 50R so there is no waveform inversion with this reflection. If the cable has a delay of 5ns then after 10ns from the rising edge of the pulse the blip will arrive back at the scope input where it will put a little dent in the top edge of the waveform about 10ns after the rising edge seen on the scope just as in the spice simulations.
This all assumes the pulse has a fairly fast rise-time. I'd expect a rise time of 2-3ns to cause visible artefacts on a typical 200MHz scope. The Pico scope only has 13pF input capacitance so maybe it won't be as affected as some other scopes that can have 15-20pF input capacitance.
Has anybody actually observed that with a real low-z probe when a feedthrough termination is used?
I thought Tektronix might have made a separate set of low-z probes for their early or late oscilloscopes that used feedthrough termination that include the t-coil termination to prevent the problem, but I have not been able to find them. The only low-z probes in that category are the ones I mentioned which work with a 1 megohm input and have the termination built in. I know those show no such glitch.
From what I remember, the t-coil termination sort of transforms the capacitive load into a resistive load.
That's interesting stuff, thanks! I'll have a play this evening with that t-coil circuit.
-
The Tek 485 is an interesting example because I think this scope switches in a completely different signal path when the internal 50 ohm termination is selected. There is a dedicated signal path that is optimised for 50 ohms and so this will show a resistive input without the shunt capacitance.
That's why I chose it to display the difference :)
There are indeed two signal paths with two attenuators, one 50ohm and one 1Mohm, selected by an RF relay at the front end that pops out if you overload the 50ohm path. (A greatly underrated feature!)
N.B. it isn't optimised for 50ohm, it is 50ohm :)
I think this partly explains the big difference in the two scope plots by tggzzz. When the external termination is fitted in 1M mode the scope will have 20pF input capacitance and this explains why the blip is so prominent with this arrangement. I tried a similar test on my old Tek 465 scope last night using an external 50 ohm BNC termination and this showed the blip in the expected place. Sadly, the old 465 scope doesn't have an internal 50 ohm termination switch.
With most scopes, including that majority with an internal 50ohm resistor banged in parallel with the 1Mohm input, the best that can be hoped for is to use an attenuator to reduce the effect of the 15pF+1m cable.
Lose gain, gain fidelity; that's always the story at RF. But hey, gain is cheap nowadays :)
-
If your scope does not have a 50 ohms input option (selected) then you do need to add something like this on the scope BNC connector:
https://www.distrelec.nl/en/feed-through-termination-50-ohm-2w-rohde-schwarz-hz22/p/11085041
A 50 ohms coax cable connected to just a 1 Mohm scope input is a pure reflection point for sharp fast pulses. The coax cable is a transmission line, a waveguide. Waveguides shall be terminated, on both ends, with a resistor that is as many ohms as the waveguide’s characteristic impedance. 50 ohms that is for the coax you’re considering to use. For a Z0 probe, one can omit the 50 ohms resistor in the probe because there is no need to cancel echoes from what might travel as pulse/edge in the cable back from scope to probe. But then we must first guarantee that nothing could have bounced at the scope BNC input. The 50 ohms scope input resistor does give that guarantee. If it is there…
The spice models posted just above confirm it all…
But the whole point is to increase its impedance, as a pure 50Ω Z0 probe is practically useless; there's almost nothing you can observe with it. Put it on a µC's XO pin and the oscillator stops. On a 100Ω LVDS signal, and the transceiver faults. On a device's D+/D- USB pins, and the host wants to reset and re-enumerate, and likely quarantines the device completely unless you're very quick to remove the probe. On an Ethernet RMII interface and it goes belly-up. Its uses are so limited that its existence is mostly an academic novelty.
-
If your scope does not have a 50 ohms input option (selected) then you do need to add something like this on the scope BNC connector:
https://www.distrelec.nl/en/feed-through-termination-50-ohm-2w-rohde-schwarz-hz22/p/11085041
A 50 ohms coax cable connected to just a 1 Mohm scope input is a pure reflection point for sharp fast pulses. The coax cable is a transmission line, a waveguide. Waveguides shall be terminated, on both ends, with a resistor that is as many ohms as the waveguide’s characteristic impedance. 50 ohms that is for the coax you’re considering to use. For a Z0 probe, one can omit the 50 ohms resistor in the probe because there is no need to cancel echoes from what might travel as pulse/edge in the cable back from scope to probe. But then we must first guarantee that nothing could have bounced at the scope BNC input. The 50 ohms scope input resistor does give that guarantee. If it is there…
The spice models posted just above confirm it all…
But the thing is to increase its impedance as a pure 50Ω Z0 probe is practically useless; there's almost nothing you can observe with it. Put it on an XO µC pin and the oscillator stops. On a 100Ω LVDS signal, and the transceiver faults. On a device D+/D- USB pin, and the host wants to reset and reenumerate, and likely completely quarantines unless you're very quick to remove the probe. On an ethernet RMII interface and it goes belly-up. There are such extremely limited uses for it that its existence is mostly an academical novelty.
All tools have their limitations.
The (up to) 5kohm input impedance of an HP10020 Z0 probe is two orders of magnitude higher than a so-called "high" impedance *10 probe, and much more robust to misapplied voltages than an active probe.
-
I thought Tektronix might have made a separate set of low-z probes for their early or late oscilloscopes that used feedthrough termination that include the t-coil termination to prevent the problem, but I have not been able to find them. The only low-z probes in that category are the ones I mentioned which work with a 1 megohm input and have the termination built in. I know those show no such glitch.
From what I remember, the t-coil termination sort of transforms the capacitive load into a resistive load.
That's interesting stuff, thanks! I'll have a play this evening with that t-coil circuit.
There is quite a bit of documentation now on the t-coil derivation. Back when Tektronix first started using it, it was a big secret.
With most scopes, including that majority with an internal 50ohm resistor banged in parallel with the 1Mohm input, the best that can be hoped for is to use an attenuator to reduce the effect of the 15pF+1m cable.
Lose gain, gain fidelity; that's always the story at RF. But hey, gain is cheap nowadays :)
Double terminating the cable would halve the end impedance to 25 ohms, so the series resistor would be half value to keep the same attenuation. Is that such a big sacrifice? That would make it ... 225 instead of 450 or 2475 instead of 4950 ohms?
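Putting numbers on that (just the divider arithmetic, assuming an ideal 50 ohm cable and terminations):

```python
# Series resistor needed for a given division ratio, with and without an extra
# 50 ohm termination at the probe end of the cable (which halves the impedance
# the divider works into).
def series_resistor(ratio, end_impedance):
    return (ratio - 1) * end_impedance

for ratio in (10, 100):
    single = series_resistor(ratio, 50.0)   # cable terminated at the scope end only
    double = series_resistor(ratio, 25.0)   # 50R at both ends: 50 || 50 = 25 ohms
    print(f"x{ratio:<3}: {single:.0f} ohms single-terminated, {double:.0f} ohms double-terminated")
```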
But the thing is to increase its impedance as a pure 50Ω Z0 probe is practically useless; there's almost nothing you can observe with it. Put it on an XO µC pin and the oscillator stops. On a 100Ω LVDS signal, and the transceiver faults. On a device D+/D- USB pin, and the host wants to reset and reenumerate, and likely completely quarantines unless you're very quick to remove the probe. On an ethernet RMII interface and it goes belly-up. There are such extremely limited uses for it that its existence is mostly an academical novelty.
Mostly they were used for measuring ECL signals where the impedance would be 25 to 50 ohms from the 50 to 100 ohm terminated microstrip. But the DC offset caused by a low-z probe was still a problem, so they had "offset" probes where the DC bias could be adjusted preventing interference with the level of the signal:
https://w140.com/tekwiki/wiki/P6230