Author Topic: Dependence of a scope's frequency response on vertical attenuator setting  (Read 7887 times)


alm

  • Guest
In the thread Which oscilloscope is better?, I argued that to state that the analog front-end of the Rigol DS1052E has 160MHz bandwidth, you need to test every vertical attenuator setting:

However, I'm not sure doing a frequency sweep across different attenuator settings is necessary, since the attenuator is calibrated in V/div and it's accurate to within its spec, suggesting one setting is approximately good for all settings.
From past experiences, I got the impression that there was a significant difference in frequency response between various vertical attenuator settings. To quantify this, I tested the frequency response of a Tektronix TDS-220. This is a 100MHz/1GS/s DSO, the closest I could find to the current cheap low-end scopes by the likes of Rigol, although I would expect Tek's analog front-end to be superior to Rigol's. The relatively low bandwidth (100MHz) makes it easier to test, since you need to generate signals with a constant amplitude at frequencies beyond the scope's bandwidth.

I used a Tektronix SG-503 leveled sine wave oscillator (verified to be reasonably close to spec with a 500MHz DSO, but I don't have a spectrum analyzer or scope with bandwidth well beyond 250MHz to accurately verify the 3% amplitude flatness from 50kHz to 250MHz). It was connected to the scope with about a meter of RG-58 coax, and it was terminated with a Tektronix 011-0049-01 50-ohm feed-through terminator at the scope end (I only tested channel 1). The scope was set to average over 16 waveforms, and was set to measure frequency and Vp-p. I set the scope to each vertical setting, adjusted the SG-503 output amplitude to be slightly less than 8 divs high, and then increased the frequency (and adjusted the horizontal setting to show a few periods).

The first graph shows the frequency response at each vertical setting, the second shows the -3dB point versus the vertical setting. The vertical settings above 500mV/div were measured with slightly less accuracy, since the generator only goes up to ~5.5V, not enough for 8 divs. This made the (8-bit) measurements less accurate, especially at 5V/div.

Excluding the settings below 10mV/div (which are limited to 20MHz), the -3dB point varied between 119MHz and 158MHz (2V/div and 5V/div may have been even higher if the generator could have generated a signal with a larger amplitude), so depending on the vertical setting, I could claim this scope has 158MHz or 119MHz bandwidth. For a proper measurement, you should actually sweep the environmental conditions over the full operational temperature/humidity range, but I don't have an environmental chamber. It's quite likely that my lowest -3dB point would have been even lower in that case, perhaps barely above the specified 100MHz.
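For anyone who wants to crunch similar numbers themselves, here is a minimal sketch of the kind of post-processing involved: normalize each Vp-p reading to the 50 kHz value and interpolate the -3 dB crossing. The example readings are made up for illustration, not my measured data.

Code:
import math

def response_db(readings):
    """readings: list of (frequency_Hz, vpp) pairs; the first entry is the 50 kHz reference."""
    ref_vpp = readings[0][1]
    return [(f, 20 * math.log10(v / ref_vpp)) for f, v in readings]

def minus_3db_point(db_points):
    """Linearly interpolate the first -3 dB crossing; returns None if it is never reached."""
    for (f1, d1), (f2, d2) in zip(db_points, db_points[1:]):
        if d1 > -3.0 >= d2:
            return f1 + (f2 - f1) * (-3.0 - d1) / (d2 - d1)
    return None

# Made-up example readings (Vp-p in volts), NOT my measured data:
example = [(50e3, 8.00), (50e6, 7.90), (100e6, 6.40), (150e6, 5.55), (200e6, 4.20)]
print(minus_3db_point(response_db(example)) / 1e6)  # ~143 MHz for these made-up numbers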

I was impressed by the triggering: it had no problem triggering at 270MHz as long as the amplitude was at least 1.5 div (which required significantly increasing the vertical setting). At 270MHz, there is almost no increase in on-screen amplitude when increasing the vertical sensitivity from 20mV/div to 10mV/div, since the amplitude decreases from ~113mV to ~59mV (this difference is also visible in the first graph). From 50mV/div to 20mV/div, the amplitude decreases from ~189mV to ~113mV, not much change on screen either. This is what prompted me to do these measurements.
 

Online EEVblog

  • Administrator
  • *****
  • Posts: 37740
  • Country: au
    • EEVblog
Nice data.
It is fairly common for the bandwidth figures of scopes to not apply to the lowest few ranges as you have found. They will often have different bandwidth (and noise) figures for these lower ranges.
Presumably because those ranges are amplified instead of simply attenuated down to a base level range.

Dave.
 

Offline tyblu

  • Frequent Contributor
  • **
  • Posts: 287
  • Country: 00
    • blog.tyblu.ca
Great writeup!
Tyler Lucas, electronics hobbyist
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Awesome data and test setup!  Thanks a million for clarifying the issue you raised.  There are very few DSO gain vs frequency response analyses to be found in a Google search, so I presume these are rare and worth a detailed exposé.

If others have sources, please link them here, particularly say for LeCroy, R&S and other scope makers.

Agilent has many 'Agilent vs Tek' themed product application notes [i.e., ersatz propaganda?]; here's a snapshot of a very relevant one.

See photo #1, the Agilent curve, versus photo #2, the Tek curve.

It's coincidental that it reflects a distribution of gain vs frequency similar to the different experiences I've had with the Rigol 1052E [assuming Rigol follows Agilent's design requirements, having once been a source for Agilent's low-end scopes] and alm's experience with his Tek scope, brought up in another thread.

Agilent maintains a smaller, i.e., tighter, spread of gain vs frequency until its roll-off.  Note that these high-end Agilent and Tek scopes are brick-wall-filtered scopes, not Gaussian -3dB roll-offs.  But at issue is not the filter but the variability in the frequency response of the vertical amp gain.  Agilent prefers amps with a flat response; one can see there are differences in frequency for each vertical setting, but they fall within ~ +/- 0.5dB across its vertical settings [~ 6%].

The Tek has a similar 0.5 dB spread, but varies by ~ +/- 1.5dB past 13 GHz.  That has interesting implications for measurement.

Although we are considering 100 MHz DSOs, if this design philosophy is carried through Agilent's and Tek's scope lines, then even in the case of the low-end 1052E vs the Tek 220 it would explain my experience of a flat response until -3dB on my Rigol and the more varied response alm has had on his Tek.  The 0.5dB variation is ~ +/- 6% of reading on the scope divisions; that variation is close to the posted 1052E spec of 4% + other corrections for range, see Photo #3.
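As a sanity check on the percentages I keep quoting, here's a quick sketch of the dB-to-amplitude conversion I'm using (plain 20*log10 voltage ratios, nothing scope-specific):

Code:
def db_to_pct(db):
    # amplitude (voltage) ratio, so 20*log10, not the 10*log10 power relation
    return (10 ** (db / 20) - 1) * 100

print(round(db_to_pct(0.5), 1))   # 5.9  -> the "~6%" figure above
print(round(db_to_pct(-3.0), 1))  # -29.2 -> amplitude at the -3 dB point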

I'll try to reproduce some of alm's tests as best as possible given my meager gear, graph them, and reconfirm those impressions.  More later as I have time to analyze alm's good work in detail.

« Last Edit: January 31, 2011, 05:40:06 pm by saturation »
Best Wishes,

 Saturation
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
The -3dB roll-off is from the front-end filter, but this variation in gain vs frequency is something from the amplifier design.  It shouldn't make a difference between any scope in the Rigol 1000E lineup, but as far as I know, no one has really tested it with a procedure like alm's here, other than the scope manufacturers themselves, who rarely publicize it [except for the Agilent application note].

http://www.google.com/url?sa=t&source=web&cd=1&ved=0CBMQFjAA&url=http%3A%2F%2Fcp.literature.agilent.com%2Flitweb%2Fpdf%2F5989-9399EN.pdf&rct=j&q=tek%20agilent%20frequency%20response&ei=1PRGTcL3EZHQgAfdq62FAg&usg=AFQjCNEAeYWS0tKYyo4G0qjO2Y8y1MF3-g&cad=rja


nice. what about rigol 1052e hacked to 1102e?
Best Wishes,

 Saturation
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Here are some interesting regions in alm's graphs.

Red arrow: the point where the amps begin to diverge, ~ 20 MHz.  Prior to this point, I presume the frequency response is equal for all gain settings, except 2mV and 5mV, unless you move it back just a touch.

Point A: The maximum spread of most amp settings from ~ 30 MHz to 120 MHz, about +/- 0.5dB, or 6%.

Point B: Spread of gain at the -3dB mark.

Point C: maximum spread of different gain settings, ~ +/- 0.75dB, or 8%.


Discussion:

The Tek seems to have a desirably narrow spread of gain vs frequency up to ~ 20 MHz, the 5th harmonic cutoff.  At 30 MHz and above, the spread is roughly equal, with a 6% uncertainty up to the -3dB mark.  This means that if you had a complex waveform between A and B, the Tek scope would change the appearance and amplitude of the waveform, due to the roll-off, its amplifier characteristics and, naturally, the inherent ADC error.

What is the labeled frequency response of the scope?  If you had to choose, a manufacturer would be unlikely to base its rating on the 2V/div blue line, the best response at 158 MHz, but rather on the 120 MHz green line, since the worst-case frequency response forms the floor of the scope's gain vs frequency performance.  The scope could do far better than that on various gain settings, once the manufacturer made an explicit exception for the faster 2mV and 5mV roll-off.

How does this compare to the Rigol 1052 or 1102E?

I don't know anyone who has tested the Rigol as closely as alm has, but my experience has been that the roll-off is a Gaussian -3dB, without the color introduced by the amp gain settings.  However, the Rigol quoted specs state that its vertical amp also varies by +/-4% and more; for example, the ADC chip specs state its error rate is in the 6% range, see photo.

So the issues to confirm with the Rigol vs Tek response are:

Confirm vertical gain vs frequency response
Confirm the absence of a 2mV and 5mV -3dB roll off at 50 MHz


Best Wishes,

 Saturation
 

alm

  • Guest
Re: Dependence of a scope's frequency response on vertical attenuator setting
« Reply #6 on: February 01, 2011, 11:52:54 am »
It is fairly common for the bandwidth figures of scopes to not apply to the lowest few ranges as you have found. They will often have different bandwidth (and noise) figures for these lower ranges.
Presumably because those ranges are amplified instead of simply attenuated down to a base level range.
It was mainly the frequency response at 10mV/div and up that I was interested in; the 20MHz limit for the lowest ranges is well documented, and the scope also displays the BW limit icon. (Almost) all scopes do something similar, indeed because there is usually an extra amplification stage involved. It may also help with the amount of displayed noise (not sure if the BW limit is before or after the CCD), since the TDS-200 is known for high levels of noise (which is why I used averaging).

It's coincidental that it reflects a distribution of gain vs frequency similar to the different experiences I've had with the Rigol 1052E [assuming Rigol follows Agilent's design requirements, having once been a source for Agilent's low-end scopes] and alm's experience with his Tek scope, brought up in another thread.
I would be very careful about extrapolating from design philosophies for 20GHz scopes to 100MHz scopes; the issues and difficulties are very different. The rebadged Rigol scopes were fully designed by Rigol as far as I know; Agilent only requested some firmware modifications, so I wouldn't count on them being designed with the Agilent philosophy. I would also be careful about taking marketing material as fact, since it's likely that this test was set up to make Agilent look as good as possible. I wouldn't be surprised if Tek had similar data showing the opposite (perhaps not publicly). But I appreciate your efforts to find any data at all on this topic; Bode plots are not usually made public.

The 0.5dB variation is ~ +/- 6% of reading on the scope divisions; that variation is close to the posted 1052E spec of 4% + other corrections for range, see Photo #3.
I don't see the relevance of this; what does this have to do with DC gain? I didn't test at DC: I started at 50kHz and normalized everything relative to that level. Scope vendors don't usually specify gain flatness within the pass band, as far as I know.

In a different manual, I found that they derate the bandwidth by 1%/degC above 30 degC, so with an operational range of up to 50 degC, this might mean that the BW is down 20%. There is no such note in the TDS-220 manual, so I'm not sure how common this is.
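To spell out that arithmetic (my reading of the derating note, assuming it is linear above 30 degC):

Code:
def derated_bandwidth_mhz(bw_mhz, temp_c, derate_per_degc=0.01, start_degc=30):
    # linear derating above 30 degC, which is how I read that note
    return bw_mhz * (1 - derate_per_degc * max(0, temp_c - start_degc))

print(derated_bandwidth_mhz(100, 50))  # 80.0, i.e. down 20% at 50 degC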

nice. what about rigol 1052e hacked to 1102e?
Send me one and I'll find out ;). This is the closest I have, similar bandwidth and market segment, but somewhat older.

What is the labeled frequency response of the scope?  If you had to choose, a manufacturer would be unlikely to base its rating on the 2V/div blue line, the best response at 158 MHz, but rather on the 120 MHz green line, since the worst-case frequency response forms the floor of the scope's gain vs frequency performance.  The scope could do far better than that on various gain settings, once the manufacturer made an explicit exception for the faster 2mV and 5mV roll-off.
Labeled bandwidth is 100 MHz. The bandwidth is specified as >= 100 MHz, so 158 MHz is perfectly within specs. These measurements were at 20 degC or so, so at the max 50 degC, the 200mV/div line might be close to 100MHz.

Your lines B and C should be horizontal, IMO, but I assume your numbers are correct and you just quickly drew them in something like Paint. If you want the raw data or different graphs, I can post those, too.

Remember that the resolution is 8 bits at best (without any noise or instability), and even less at 1V/div and up, because I was unable to generate >5.5V signals and didn't have an amplifier that I trusted to be flat within 3% or so from 50kHz to 250MHz. I also can't prove that the SG-503 still meets its factory specs, although I don't have any indication that it doesn't.

This test is probably hard to do without a leveled signal source; you would have to calibrate either your generator's amplitude over time with a spectrum analyzer / scope with a much larger bandwidth, or the generator's amplitude linearity at high frequencies. The best you could do is set the generator to a constant frequency and amplitude, terminate at the scope end, and vary the scope's vertical setting, like what I mentioned about the 270MHz performance. Because of the constant amplitude, you would only be able to test a few vertical ranges at a time, and it would only tell you about relative spread, not roll-off.
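If you did go the route of characterizing your own generator, the correction itself is simple in principle: measure the generator's relative flatness once with whatever wide-bandwidth instrument you trust, then subtract it (in dB) from the scope readings. A rough sketch with made-up numbers, not something I have verified on a cheap generator:

Code:
def corrected_response_db(scope_db, generator_db):
    """Both arguments: dict of frequency -> dB relative to the 50 kHz reference.
    Subtracting the generator's own (separately measured) response leaves the scope's."""
    return {f: round(scope_db[f] - generator_db[f], 2) for f in scope_db}

scope = {50e3: 0.0, 100e6: -2.5, 150e6: -4.0}   # raw scope readings, made up
gen   = {50e3: 0.0, 100e6: -0.4, 150e6: -0.9}   # generator flatness table, made up
print(corrected_response_db(scope, gen))        # -2.1 dB at 100 MHz, -3.1 dB at 150 MHz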
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11631
  • Country: my
  • reassessing directives...
Re: Dependence of a scope's frequency response on vertical attenuator setting
« Reply #7 on: February 01, 2011, 03:13:18 pm »
Isn't there a possibility of DIY'ing a flat source generator, even if we have to manually reset the gain or the circuit to a known Vpp at a certain frequency? I have this crazy idea of testing the Vpp of a high-frequency oscillation (sine or square etc.) with a full-bridge rectifier and a cap, measured by a DMM, but I don't know, the rectifier might not be fast enough (up to 200MHz), and I'm too much of a noob for this kind of discussion.

Quote
This test is probably hard to do without a leveled signal source; you would have to calibrate either your generator's amplitude over time with a spectrum analyzer / scope with a much larger bandwidth, or the generator's amplitude linearity at high frequencies
Nature: Evolution and the Illusion of Randomness (Stephen L. Talbott): Its now indisputable that... organisms “expertise” contextualizes its genome, and its nonsense to say that these powers are under the control of the genome being contextualized - Barbara McClintock
 

alm

  • Guest
Re: Dependence of a scope's frequency response on vertical attenuator setting
« Reply #8 on: February 01, 2011, 04:01:47 pm »
Isn't there a possibility of DIY'ing a flat source generator, even if we have to manually reset the gain or the circuit to a known Vpp at a certain frequency? I have this crazy idea of testing the Vpp of a high-frequency oscillation (sine or square etc.) with a full-bridge rectifier and a cap, measured by a DMM, but I don't know, the rectifier might not be fast enough (up to 200MHz), and I'm too much of a noob for this kind of discussion.
You need some kind of reference, even for calibrating your probe. Being able to measure the level is about as good as having a constant level; it just takes more work. There are commercial RF detector probes (which is kind of what you're describing), but they usually have their own roll-off at high frequencies (for example, the HP 11096A/B is +/- 0.5dB to 100MHz and +/- 1.2dB to 500MHz). I wouldn't count on your standard diode/capacitor being flat over the frequency range; at some frequency, self-resonance becomes an issue. You also still need a stable generator with the frequency range (well beyond the rated bandwidth) and voltage range (ideally from a few mV for the low ranges up to tens of volts for the high ranges).
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Re: Dependence of a scope's frequency response on vertical attenuator setting
« Reply #9 on: February 03, 2011, 03:44:59 pm »
Here's a preliminary graph of relative spread: 50kHz as the baseline amplitude, compared to 25 MHz and 50 MHz, terminated into a 50 ohm resistor via 50 ohm cable.  The signal source was the Hantek 3x25; the ?? marks my uncertainty in the measurements before that point on the graph and ? after.

Vpp measurements were taken from Rigol's automated measurements; even if of questionable accuracy, that is a far more stable procedure than my eyeballing the divisions, and its precision is limited by the 1052E's vertical amp specification.  They were taken with the average acquisition mode to reduce per-waveform variation, which is also how the 1052E's vertical amp specification is reported.

The ?? and ? mark my uncertainty, and are a reason this is a preliminary graph; I'm checking the signal source to see how reliable it really is there.  The reason is that the peak-acquisition and average-acquisition Vpp readings are otherwise quite similar, varying at most <= 400uV, but at the ?? there are differences of over a mV, due to the Hantek's reduced output at 50 MHz and its inability to provide a stable output below 100mVpp.

So the region between the red lines contains the better measurements.

I'll be replacing the graph when my 50 ohm terminator and adapters arrive.  So while I think the absolute reduction in dB from 50 kHz is questionable, the relative change from one V/div setting to another is fairly consistent and should not vary much when the next measurement cycle is made.
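For reference, the number-crunching behind the graph is roughly this: for each V/div setting, take the averaged Vpp at 50 kHz as the baseline and express the 25 MHz and 50 MHz readings in dB relative to it.  The values below are placeholders, not my actual readings:

Code:
import math

# V/div setting -> averaged Vpp readings at each test frequency (placeholder values):
readings = {
    "100mV/div": {50e3: 0.600, 25e6: 0.592, 50e6: 0.578},
    "200mV/div": {50e3: 1.200, 25e6: 1.190, 50e6: 1.165},
}

for vdiv, by_freq in readings.items():
    baseline = by_freq[50e3]
    relative = {f: round(20 * math.log10(v / baseline), 2) for f, v in by_freq.items() if f != 50e3}
    print(vdiv, relative)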



The best you could do is set the generator to a constant frequency and amplitude, terminate at the scope end, and vary the scope's vertical setting, like what I mentioned about the 270MHz performance. Because of the constant amplitude, you would only be able to test a few vertical ranges at a time, and it would only tell you about relative spread, not roll-off.
« Last Edit: February 03, 2011, 03:53:38 pm by saturation »
Best Wishes,

 Saturation
 

alm

  • Guest
Re: Dependence of a scope's frequency response on vertical attenuator setting
« Reply #10 on: February 03, 2011, 06:34:34 pm »
The ?? and ? mark my uncertainty, and are a reason this is a preliminary graph; I'm checking the signal source to see how reliable it really is there.  The reason is that the peak-acquisition and average-acquisition Vpp readings are otherwise quite similar, varying at most <= 400uV, but at the ?? there are differences of over a mV, due to the Hantek's reduced output at 50 MHz and its inability to provide a stable output below 100mVpp.
It's hard to tell whether the variation is in the generator or in the scope, IMO. The generator is clearly not flat over the frequency range (I can't imagine the scope being -11dB at its rated bandwidth), and the low amplitudes look suspicious, so who knows what's going on over the rest of the range. It's not surprising to me to see cheap generators without a flat response; some don't even bother calibrating the output amplitude and just show it in relative units.
 

