Two Tone Test with Scope and SA
G0HZU:
It's always good to revisit stuff like this. As you can tell my experience is really with spectrum analysers, either old school swept types or modern versions that can do FFT or both swept and FFT. So I'm used to using the classic equations for stuff like this.

I've never had much joy using scopes to measure IMD but then again I've only ever tried this using fairly old scopes and the results were usually quite grim. Some of the screenshots of the modern scopes on this thread look to be very impressive in terms of what the FFT mode can display.

I'm not sure where the significant IMD gets generated in modern scopes. In a typical scope there is usually an impedance converter stage followed by a selectable preamp and then on to the ADC. I think some scopes also split the signal path into low frequency and high frequency paths so it's difficult to know how much the front end affects the IMD generated by the scope.
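The classic equations G0HZU refers to can be sketched quickly. As a hedged illustration (the function name and the example numbers are mine, not taken from any instrument in this thread), the output third-order intercept point follows from a single two-tone measurement:

```python
def oip3_dbm(p_tone_dbm, imd_dbc):
    """Output IP3 from a two-tone test.

    p_tone_dbm: level of each of the two equal tones (dBm)
    imd_dbc:    how far the 2f1-f2 / 2f2-f1 products sit below the tones (dB, positive)

    Third-order products fall 3 dB for every 1 dB the tones drop, hence the /2.
    """
    return p_tone_dbm + imd_dbc / 2.0

# Tones at -10 dBm with IMD3 products 70 dB down imply OIP3 = +25 dBm.
print(oip3_dbm(-10.0, 70.0))  # 25.0
```

The same relation predicts IMD at other drive levels: drop the tones by 1 dB and the third-order products improve by 2 dB in dBc terms.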

2N3055:

--- Quote from: G0HZU on June 15, 2022, 02:25:08 pm ---It's always good to revisit stuff like this. As you can tell my experience is really with spectrum analysers, either old school swept types or modern versions that can do FFT or both swept and FFT. So I'm used to using the classic equations for stuff like this.

I've never had much joy using scopes to measure IMD but then again I've only ever tried this using fairly old scopes and the results were usually quite grim. Some of the screenshots of the modern scopes on this thread look to be very impressive in terms of what the FFT mode can display.

I'm not sure where the significant IMD gets generated in modern scopes. In a typical scope there is usually an impedance converter stage followed by a selectable preamp and then on to the ADC. I think some scopes also split the signal path into low frequency and high frequency paths so it's difficult to know how much the front end affects the IMD generated by the scope.

--- End quote ---

Scopes definitely have a dual DC/low-frequency plus high-frequency signal path, you're right about that.
And Mike's intention wasn't really to use the scope for measuring the IMD of devices (though you could, within limits, of course).

The two-tone technique is routinely used to test the linearity of ADCs (something Mike knows "a bit" about  :-DD), and he really wanted to test how linear the front end and ADC in the new Siglent scopes are. This has been an ongoing discussion. The results show that they are, in fact, very good...
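As a hypothetical sketch of that two-tone ADC linearity idea (the tone placement, record length, and 8-bit depth are illustrative choices of mine, not Siglent's actual test setup), one can quantize a synthetic two-tone record and read the IMD3 products straight off an FFT:

```python
import numpy as np

N = 4096                     # record length, chosen for coherent sampling
k1, k2 = 701, 721            # integer tone bins -> no leakage, no window needed
n = np.arange(N)
x = 0.45 * np.sin(2 * np.pi * k1 * n / N) + 0.45 * np.sin(2 * np.pi * k2 * n / N)

bits = 8
steps = 2 ** (bits - 1) - 1
q = np.round(x * steps) / steps                # ideal midtread quantizer

spec = np.abs(np.fft.rfft(q)) / N
db = 20 * np.log10(spec + 1e-20)
tone = max(db[k1], db[k2])
imd3 = max(db[2 * k1 - k2], db[2 * k2 - k1])   # 2f1-f2 and 2f2-f1 products
print(f"IMD3 ~ {imd3 - tone:.1f} dBc")
```

A real front end adds its own distortion ahead of the ADC, which is exactly what this thread is probing.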

Thank you for your comments. You really are an RF guru, like Performa said, and your contributions are very valued. Even if we get lost a bit sometimes..  :-DD

Best,
David Hess:

--- Quote from: G0HZU on June 14, 2022, 09:39:32 pm ---In the case of a DSO I guess a lot depends on the signal handling performance of every stage of the analogue front end before it gets into the digital domain. All these stages can generate IMD ahead of the ADC.
--- End quote ---

The impedance buffer and first stage contribute most of the distortion because they have to operate over the widest range.  There are circuit techniques like cascodes and bootstrapping to improve this considerably but they increase noise and limit bandwidth.  Integrated front ends have the additional problem of more parasitic elements and coupling; see below about dual integrated JFETs.


--- Quote from: G0HZU on June 15, 2022, 02:25:08 pm ---I've never had much joy using scopes to measure IMD but then again I've only ever tried this using fairly old scopes and the results were usually quite grim. Some of the screenshots of the modern scopes on this thread look to be very impressive in terms of what the FFT mode can display.

I'm not sure where the significant IMD gets generated in modern scopes. In a typical scope there is usually an impedance converter stage followed by a selectable preamp and then on to the ADC. I think some scopes also split the signal path into low frequency and high frequency paths so it's difficult to know how much the front end affects the IMD generated by the scope.
--- End quote ---

The impedance converter is what limits performance.  One reason DSOs were limited to 8 bits for a long time, and largely still are, is that the demands on the analog section limit performance to that level or lower without a heroic design, and perhaps even with one.  Settling time tests of 12-bit DSOs show that they do *not* have 12-bit performance in all respects.  Usually they are not even close, but 12-bit operation is still useful in other respects like noise, dynamic range, and marketing.  The performance of some modern DSOs operating as a spectrum analyzer is amazing, except for low frequency noise.
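For context on that "12-bit in name only" point, the usual quantization-limited SNR rule gives the gap between 8 and 12 bits. This is a textbook figure for an ideal ADC driven by a full-scale sine, not a claim about any specific scope:

```python
def ideal_snr_db(bits):
    # Quantization-noise-limited SNR for a full-scale sine into an ideal ADC.
    return 6.02 * bits + 1.76

print(ideal_snr_db(8))   # ~49.9 dB
print(ideal_snr_db(12))  # ~74.0 dB
```

An analog front end that settles and stays linear at the ~74 dB level across the full bandwidth is where the heroic design comes in.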

One thing modern DSOs can do, and have done for quite some time at the high end, is apply digital correction but this brings up calibration issues.


--- Quote from: 2N3055 on June 15, 2022, 03:39:59 pm ---Scopes definitely have DC-low frequency/high frequency dual path, you're right about that.
--- End quote ---

All budget oscilloscopes have used a two path design going back to at least the first integrated JFET operational amplifiers.  It removes the expensive requirement for grading JFETs (1) to get good DC performance and later had the advantage of providing AC/DC coupling without a relay.  Some modern instruments might still use a single path design to support faster overload recovery.

(1) An integrated dual matched JFET cannot be used because of cross-coupling.
Performa01:

--- Quote from: David Hess on June 15, 2022, 06:41:58 pm ---
--- Quote from: G0HZU on June 14, 2022, 09:39:32 pm ---In the case of a DSO I guess a lot depends on the signal handling performance of every stage of the analogue front end before it gets into the digital domain. All these stages can generate IMD ahead of the ADC.
--- End quote ---

The impedance buffer and first stage contribute most of the distortion because they have to operate over the widest range.  There are circuit techniques like cascodes and bootstrapping to improve this considerably but they increase noise and limit bandwidth.  Integrated front ends have the additional problem of more parasitic elements and coupling; ...

--- End quote ---

Well, at least in the case of the Siglent front ends discussed here (which have of course been designed with assistance from LeCroy), all my tests strongly indicate that the input buffer / impedance converter is not the limiting factor. At least not in the HF path, which becomes effective above a few kHz...

It is the PGA that follows the buffer stage. And how could we be surprised? It is a challenge to build a fully integrated variable gain amplifier with >2 GHz bandwidth (in the case of the SDS6000), programmable in e.g. 2 dB gain steps over a range of 40 dB.

Like you said, true 12-bit performance in a wideband general purpose oscilloscope isn't going to happen anytime soon. But as I demonstrated, there can be sweet spots (read: PGA settings) where the distortion is at its minimum. Alter the gain by just one step and the distortion might immediately rise by several dB.

EDIT: look at the attached datasheet of a typical 900 MHz PGA, like it may have been used for the current SDS2000X series. Look specifically at the distortion figures published on page 6. This proves that we actually need to look for sweet spots whenever we want 12-bit-like distortion performance near -70 dBc.
Performa01:

--- Quote from: G0HZU on June 15, 2022, 08:30:16 am ---If it helps I have an early/old Tek spectrum analyser here that has a digital IF. When used down at below 40MHz it is effectively a 40MHz baseband scope sampling at 102.4MHz but the display is trapped in the frequency domain. This is not a swept analyser, it samples the input and generates an FFT just like a scope.

If I inject two clean tones I can generate IMD in this analyser at about -83dBc if I drive it towards FSD of the ADC.

--- End quote ---

This performance got me curious. Of course, general purpose oscilloscopes cannot compete with dedicated higher end spectrum analyzers, even old ones. Yet I wanted to know how close an SDS2000X HD can get.

We want “near full scale” signals now, so the signals are 10 MHz with 20 kHz frequency offset and -10 dBm = 70.71 mVrms = 200 mVpp each, which results in a total amplitude of 400 mVpp.
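The level arithmetic above can be double-checked with a few lines (standard 50 Ω conversions, nothing instrument-specific; the function name is mine):

```python
import math

def dbm_to_vpp(p_dbm, r_ohms=50.0):
    """Sine-wave peak-to-peak voltage for a given power into r_ohms."""
    p_w = 10 ** (p_dbm / 10.0) / 1000.0
    v_rms = math.sqrt(p_w * r_ohms)
    return 2.0 * math.sqrt(2.0) * v_rms

per_tone = dbm_to_vpp(-10.0)   # ~0.200 Vpp, i.e. 70.71 mVrms per tone
total = 2 * per_tone           # two equal tones add in voltage at the envelope peak
print(per_tone, total)         # ~0.2, ~0.4
```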

First test at 50 mV/div. Visible screen range is exactly 400 mVpp. Yet the DSO already indicates possible overrange in its automatic measurements: >100.080001mV(↨)

See 1st screenshot.

SDS2504X HD_IMD_10MHz_-10dBm_50mV

IMD is about -70 dBc, so it is still worthy of a 12-bit oscilloscope. Even though it cannot touch this particular Tek instrument, there are quite a few spectrum analyzers out there that would struggle to achieve a result like this at an input level this high.

The point is that the SA is measured way below the 1 dB compression point of its input amplifier and/or mixer, whereas the SDS2000X HD has its maximum input at 100 mV/div (above that, the first attenuator kicks in). Consequently, 50 mV/div is just 6 dB below the designed “full scale” of the input buffer and PGA input. If we look at the datasheet I’ve attached in my previous post, we can see that the maximum permissible input of the typical PGA is 1 V. Consequently, 400 mVpp (after the unity gain buffer) is already close to the permissible input voltage of that stage, certainly not 30 dB below it.
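To put a rough number on that headroom, using the 1 V permissible PGA input quoted above and the 400 mVpp combined signal from this test (and assuming, as an illustration, that the 1 V figure refers to peak voltage):

```python
import math

v_peak_signal = 0.2   # 400 mVpp combined two-tone -> 200 mV envelope peak
v_max_pga = 1.0       # maximum permissible PGA input per the cited datasheet (assumed peak)
headroom_db = 20 * math.log10(v_max_pga / v_peak_signal)
print(f"headroom ~ {headroom_db:.1f} dB")   # ~14 dB, nowhere near 30 dB
```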


Let’s try to get rid of the overrange warning by lowering the input gain to 55 mV/div. Now we’ve found a sweet spot at an IMD of -77 dBc, which is not that far off compared to the -83 dBc of the dedicated Tek SA.

See 2nd attached screenshot:

SDS2504X HD_IMD_10MHz_-10dBm_55mV
