Author Topic: Will Keysight upgrade the 2000, 3000T X-Series Oscilloscopes within a few months  (Read 36875 times)


Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Again, more bullshit from the math professors.

 :palm:  Look, it's pretty clear now that you don't understand basic signal processing, but that's no reason to become aggressive.

Quote
ALL digital sampling systems only provide you with a presentation of the data that is an approximation of the actual signal.  The more data points you have in a given time period, the better that approximation will be.  I don't understand why people don't get this.

*Any* scope only provides you with an approximation of the actual signal, even the analog scopes on which you probably spent most of your 40 years of "experience".

The key is to know how a specific instrument impacts the measurement.

Quote
If you have a digital camera, more pixels is better

 :palm: No, it's not. As others have explained, more pixels are useless if the optical path limits the physical resolution. There are tons of high-MPx cameras out there that take shit pictures because they only have a crappy plastic lens.

Quote
When I use a 'scope to look at a signal in the real world (as opposed to the dreamy Utopian academic world), I'm needing to look at the signal because there might be something there that I'm not expecting.  If the signal does not look like it's supposed to, then I need to take action to correct the problem, which might be any number of things.  Mathematical reconstruction algorithms (and I have used them too) only work well when you know in advance what the signal is supposed to look like.  When we are debugging a design, we don't know in advance what the signal will look like, especially if the circuit is misbehaving.  So, in this case, it's not a matter of applying a mathematical model to reconstruct the signal from a limited data set.  It's a matter of having an overwhelming amount of sample points such that reconstruction of the signal is not necessary other than perhaps a small amount of sin(x)/x smoothing for "nice looking" interpolation between data points.  More samples per horizontal division equals a better idea of what's really going on at the probe tip.

No, it doesn't.  :palm: 

Besides, signal theory isn't just an abstract thing, it's how the real world works, and basic knowledge for every engineer who works with any kind of signal processing. Dismissing it as "bullshit" is stupid and only highlights your ignorance.

Quote
So, you are not impressing me with your academic view of a real-world problem, especially when empirically derived data does not match with your theoretical nonsense.  And worse yet, you are fooling younger players into believing that their 500MHz 'scope will be able to see a 500MHz signal with astounding detail.

Frankly, the real world doesn't give a shit whether you're impressed or not; facts and realities don't go away just because you stick your fingers in your ears to avoid being confronted with them.

I have to say it's actually quite shocking to see someone who claims to be an engineer being so hostile to what really is basic knowledge for any EE these days. To some extent I understand that this isn't necessarily something that was taught 40 years ago, but a good engineer doesn't stop learning after graduation.

You'd be well advised to heed some of the advice that was given and get a better understanding of the basic principles that make DSOs work. Your understanding of what you're measuring will improve as well. A lot.

Quote
I don't know....  Kids these days ...

Yeah, stupid kids.  All young and dumb, right? |O 

FYI, many of the posters here are a lot closer to retirement than graduation, and that includes me. Which shows you don't have to be young to understand signal processing.  ;)
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3015
  • Country: gb
The issue here is that you are both looking at this from different angles. Whilst it's useful to have a basic grasp of sampling theory you do also have to consider the real world. This means real world signals and the limitations of real world instruments.

Hiding behind sampling theory with a perfect sinewave test signal only gets you so far. I think DM is trying to tell you that you usually won't 'know' the spectrum of signals that are arriving at the scope input.

So you can get issues when your scope doesn't have enough sample rate to cope with the full spectrum of signals that can get through to the sampler, and then you can see strange effects.

The scope sampling process itself won't be perfect in various ways and this can cause fake artifacts to appear too. So even a high sample rate isn't enough on its own if the scope hardware is flawed. All of this can mean that interpolation/reconstruction algorithms can let you down in the real world with real world signals and real world scopes.

You can quote sampling theory with your perfect sinewave until you are blue in the face, but that won't change the fact that some DSOs can suffer sampling problems, e.g. alias effects due to their inadequate (max) sample rate with respect to the spectrum of signals that can actually get through to the sampler. For example, there are going to be scopes marketed as 500MHz BW models with 2GS/s that have subtle alias issues despite the 4x ratio between the 500MHz BW logo and the 2GS/s sample rate. Other scopes may have subtle issues with the sample timing within the ADC hardware. The result will be a reconstruction algorithm that produces false artifacts on the display.

i.e. telling DM (again and again) that a perfect sinewave can in theory be reconstructed with a certain sample rate doesn't solve the problems that real scopes face with real signals.





« Last Edit: September 25, 2016, 11:17:25 am by G0HZU »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
You make it sound like black magic again. However, it is the other way around: if you have a good grasp of sampling theory you know the limitations of the DSO you are using, and thus can identify artifacts and aliasing issues more quickly even if you don't know the spectrum of a signal. Peak-detect is one of the tools on a DSO which is very helpful with that.

You also have to realise that a sine wave must have a significant amount of distortion before it starts looking odd on an oscilloscope. So even though the hardware and the signal reconstruction algorithm may not be perfect, chances are these do not result in visible distortion.
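As a rough numerical illustration of that point (a minimal sketch in Python; the harmonic levels are arbitrary example values, not measurements of any particular scope):

Code: [Select]
# How far a sine with some 3rd-harmonic distortion deviates from a pure sine.
# The harmonic levels below are arbitrary examples for illustration.
import numpy as np

t = np.linspace(0, 1, 10000, endpoint=False)
pure = np.sin(2*np.pi*t)

for h_dbc in (-20, -30, -40):
    a3 = 10**(h_dbc/20)                      # 3rd-harmonic amplitude relative to fundamental
    distorted = pure + a3*np.sin(2*np.pi*3*t)
    dev = np.max(np.abs(distorted - pure)) / 2.0   # peak deviation as a fraction of the 2 Vpp swing
    print(f"3rd harmonic at {h_dbc} dBc -> peak deviation about {dev*100:.1f}% of the trace height")

Even a harmonic at -30 dBc only moves the trace by a percent or two of its height, which is hard to spot by eye on a scope display.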
« Last Edit: September 25, 2016, 11:43:23 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline lem_ix

  • Regular Contributor
  • *
  • Posts: 192
  • Country: cs
This is engineering, you don't just put in a better ADC because you *feel* you need it, same with the analog front end. Bottom line, a 500 MHz scope only means that you'll be able to see a 500 MHz spectral component reasonably well. For starters maybe use FFT on a square wave and see what you're dealing with. For example you'd need a 100 THz scope to see why this discussion is going on under this topic :D Maybe it's a good topic for the next vid in Dave's scope series.
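To make the square-wave FFT suggestion concrete, here's a minimal sketch (Python/NumPy; the 10 MHz fundamental and 1 GS/s rate are arbitrary illustration values, not tied to any particular scope):

Code: [Select]
# FFT of an ideal square wave: the odd harmonics only fall off as 1/n,
# so there is still significant energy far above the fundamental.
# Fundamental and sample rate are arbitrary illustration values.
import numpy as np

fs = 1e9                               # assumed sample rate, 1 GS/s
f0 = 10e6                              # assumed square-wave fundamental, 10 MHz
t = np.arange(0, 100e-6, 1/fs)         # 100 us record, an integer number of periods
x = np.sign(np.sin(2*np.pi*f0*t))      # ideal square wave

X = np.fft.rfft(x * np.hanning(len(x)))
f = np.fft.rfftfreq(len(x), 1/fs)
mag_db = 20*np.log10(np.abs(X)/np.abs(X).max() + 1e-12)

for n in (1, 3, 5, 11, 21, 49):        # odd harmonics up to near Nyquist
    i = np.argmin(np.abs(f - n*f0))
    print(f"{n*f0/1e6:6.0f} MHz: {mag_db[i]:6.1f} dBc")

The point is simply that a "slow" square wave still puts energy all the way up to (and past) the front-end bandwidth, which is what makes the edges interesting on a scope.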
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3015
  • Country: gb
I'm not trying to make it sound like black magic. I'm just trying to say that real world signals are not just pure sinewaves and real world hardware doesn't always sample accurately (or fast enough) to prevent artifacts appearing on the display.

Also, I think you will agree that you can't just rely on the real time sampling mode when looking at wideband signals that cause aliasing. The repetitive mode on my old HP 2GS/s DSO can be useful here. But if I just looked at "500MHz, 2GS/s" on the front panel I might think that I can look at any waveform in real time mode with no risk of aliasing artefacts, and this isn't the case in reality.

Quote
You also have to realise that a sine wave must have a significant amount of distortion before it starts looking odd on an oscilloscope. So even though the hardware and the signal reconstruction algorithm may not be perfect chances are these do not result in visual distortions.
Someone with more scope experience than me may be best placed to answer this, but try using a scope that uses lots of (interleaved) ADCs that have to be carefully synched up in order to prevent sampling artefacts becoming too objectionable. If there is imbalance in the scaling or timing then you can begin to see distortion on a sinewave caused by reconstruction anomalies in the scope hardware.
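For anyone curious, here's a toy simulation of that interleave-mismatch effect (Python/NumPy; the 2 ps skew and 1% gain error are made-up illustration values, and real scopes calibrate much of this out):

Code: [Select]
# Two-way interleaved ADC with a small timing skew and gain error on one branch.
# The mismatch values are hypothetical; they just show where the spur lands.
import numpy as np

fs  = 10e9              # combined sample rate (assumed)
fin = 3e9               # input tone (assumed)
N = 1 << 14
t = np.arange(N) / fs

skew = 2e-12            # hypothetical 2 ps timing error on the odd branch
gain_err = 0.01         # hypothetical 1% gain error on the odd branch

t_eff = t.copy()
t_eff[1::2] += skew                     # odd samples taken slightly late
x = np.sin(2*np.pi*fin*t_eff)
x[1::2] *= (1 + gain_err)               # odd samples scaled slightly high

X = np.fft.rfft(x * np.blackman(N))
f = np.fft.rfftfreq(N, 1/fs)
mag_db = 20*np.log10(np.abs(X)/np.abs(X).max() + 1e-15)

i = np.argmin(np.abs(f - (fs/2 - fin)))  # interleave image lands at fs/2 - fin
print(f"interleave image near {f[i]/1e9:.2f} GHz: {mag_db[i]:.1f} dBc")

With these made-up numbers the image at 2 GHz sits roughly 35 dB below the 3 GHz tone; whether that matters on screen depends on the application.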

Going back a few posts, I could post up what my old HP54540C looks like with a 600MHz cw signal from a sig gen and it's going to look fairly similar to what KE5FX produced on that old Tek TDS694C scope in terms of wobble/jitter. But the reason I didn't comment on the scope video (until now) was because I can't trust what a youtube video is going to show me because I don't know what is caused by the scope and what is caused by the camera and/or youtube from frame to frame. Obviously, there is some wobble/jitter but I'd prefer to look at this directly rather than via a camera and youtube.

But with my admittedly limited experience of using fast DSOs I'd expect to see wobble on the signal when sampling a 3GHz signal at 10GS/s using interpolation on a real world scope like that old Tek TDS694C. But I haven't tried to make too much sense of the youtube video because of the risks of compression artefacts in the video. Also, I've never used a Tek TDS694C.
« Last Edit: September 25, 2016, 02:42:01 pm by G0HZU »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
When using interleaved ADCs the calibration procedure cancels all the errors, so together with a stable design it will produce good repeatable results. Otherwise the DSO would be useless and the whole exercise to build it futile.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline nfmax

  • Super Contributor
  • ***
  • Posts: 1560
  • Country: gb
The wobble/jitter on the video under discussion may simply be due to less than perfect reconstruction. In general, the signal displayed is not synchronous with the ADC clock. If the trigger is operating correctly, x position on the screen is a function purely of the time from the trigger point. On each successive acquisition, the ADC sample clocks will be at different offsets from the trigger instant because the ADC clock is not synchronous with the displayed signal. Hence on successive sweeps the 'dots' will move, wobble, or jitter with respect to the graticule. This is the correct behaviour. If you select linear interpolation, the straight lines joining these wobbling dots will jump around all over the place when the signal displayed occupies a large fraction of the Nyquist bandwidth. If you switch to perfect sinx/x interpolation (and the trigger is operating correctly), you should end up with a smooth, stable signal, on which the sample dots (if shown) jump around from acquisition to acquisition, but without changing the shape of the interpolated curve.

This effect is clearly evident when linear interpolation is used: it reduces dramatically, but does not vanish entirely, when sinx/x interpolation is used. This may be because the approximation to sinx/x used by the scope is not very accurate (it has to be an approximation as true sinx/x requires an infinitely long time record); or because a digital trigger interpolator in the scope is using linear interpolation instead of sinx/x; or because an analog trigger interpolator can select among only a discrete number of interpolation delays. Tektronix invented triggered sweep: I don't think the trigger will be to blame (unless there is an instrument fault, of course). My suspicion is a poor choice of window, or too short a window, on the sinx/x interpolator, probably a limitation of the technology at the time.
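To illustrate the point numerically, here's a minimal sketch (Python/NumPy; the 3 GHz / 10 GS/s numbers are just borrowed from the discussion, and the sin(x)/x interpolator is the ideal textbook one, not whatever approximation a real scope uses):

Code: [Select]
# Successive acquisitions of the same sine with a random ADC-clock offset from
# the trigger: the raw dots land in different places each sweep, but an ideal
# sin(x)/x reconstruction gives (almost) the same curve at a fixed screen position.
import numpy as np

fs, fin, N = 10e9, 3e9, 64
t_screen = 2.05e-9                       # a fixed position on the "screen" (arbitrary)

def sinc_reconstruct(samples, t_samp, t_out):
    # Whittaker-Shannon interpolation over a finite record (end effects ignored)
    return np.sum(samples * np.sinc((t_out - t_samp) * fs))

rng = np.random.default_rng(0)
for sweep in range(3):
    offset = rng.uniform(0, 1/fs)        # clock phase vs. trigger changes every sweep
    t_samp = np.arange(N)/fs + offset    # where the ADC actually sampled
    dots = np.sin(2*np.pi*fin*t_samp)
    nearest = dots[np.argmin(np.abs(t_samp - t_screen))]
    y = sinc_reconstruct(dots, t_samp, t_screen)
    print(f"sweep {sweep}: nearest dot = {nearest:+.3f}, reconstruction at 2.05 ns = {y:+.3f}")

The raw sample values near that screen position change a lot from sweep to sweep, while the reconstructed value stays put apart from small end effects of the finite record, which is exactly the dots-wobble-but-curve-is-stable behaviour described above.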
« Last Edit: September 25, 2016, 03:17:39 pm by nfmax »
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3015
  • Country: gb
When using interleaved ADCs the calibration procedure cancels all the errors, so together with a stable design it will produce good repeatable results. Otherwise the DSO would be useless and the whole exercise to build it futile.
So people who try and evaluate the effects of interleave distortion in scopes and write papers about it are just wasting their time because this form of distortion doesn't exist in real scopes because you can just calibrate it away?

It's also interesting that in your world the scope hardware is either error free or it is useless.
« Last Edit: September 25, 2016, 06:45:12 pm by G0HZU »
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3015
  • Country: gb
This is engineering, you don't just put in a better ADC because you *feel* you need it, same with the analog front end. Bottom line, a 500 MHz scope only means that you'll be able to see a 500 MHz spectral component reasonably well. For starters maybe use FFT on a square wave and see what you're dealing with. For example you'd need a 100 THz scope to see why this discussion is going on under this topic :D Maybe it's a good topic for the next vid in Dave's scope series.

Yes, I can give an extreme example here. My HP54540C has an AC test signal that is a 500Hz square wave (yes, just 500 Hertz), but the scope can't evaluate this waveform properly even at 1GS/s in real time mode. It looks better at 2GS/s in real time, but even then I can see some subtle alias effects on the signal. It does much better in repetitive mode. I think the scope would work slightly better if it could manage 4GS/s in real time, but the benefits would only be small. It therefore depends on how critical the requirements are and how forgiving the user is with respect to playing with the scope to find a workaround, e.g. the user can switch to repetitive mode if this mode is suitable for the waveform under test, and they can then get a much higher effective sample rate.



« Last Edit: September 25, 2016, 06:36:16 pm by G0HZU »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
When using interleaved ADCs the calibration procedure cancels all the errors, so together with a stable design it will produce good repeatable results. Otherwise the DSO would be useless and the whole exercise to build it futile.
So people who try and evaluate the effects of interleave distortion in scopes and write papers about it are just wasting their time because this form of distortion doesn't exist in real scopes because you can just calibrate it away?

It's also interesting that in your world the scope hardware is either error free or it is useless.
You are reading something which isn't there. If you don't calibrate the errors away then you'll see them for sure and the waveform gets distorted. Look at the thread about hacking the TDS744A into a TDS784A by changing an option jumper. Because the ADCs are used for interleaving at 4GS/s instead of 2GS/s, a new calibration/adjustment procedure is needed.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3015
  • Country: gb
The wobble/jitter on the video under discussion may simply be due to less than perfect reconstruction. In general, the signal displayed is not synchronous with the ADC clock. If the trigger is operating correctly, x position on the screen is a function purely of the time from the trigger point. On each successive acquisition, the ADC sample clocks will be at different offsets from the trigger instant because the ADC clock is not synchronous with the displayed signal. Hence on successive sweeps the 'dots' will move, wobble, or jitter with respect to the graticule. This is the correct behaviour. If you select linear interpolation, the straight lines joining these wobbling dots will jump around all over the place when the signal displayed occupies a large fraction of the Nyquist bandwidth. If you switch to perfect sinx/x interpolation (and the trigger is operating correctly), you should end up with a smooth, stable signal, on which the sample dots (if shown) jump around from acquisition to acquisition, but without changing the shape of the interpolated curve.

This effect is clearly evident when linear interpolation is used: it reduces dramatically, but does not vanish entirely, when sinx/x interpolation is used. This may be because the approximation to sinx/x used by the scope is not very accurate (it has to be an approximation as true sinx/x requires an infinitely long time record); or because a digital trigger interpolator in the scope is using linear interpolation instead of sinx/x; or because an analog trigger interpolator can select among only a discrete number of interpolation delays. Tektronix invented triggered sweep: I don't think the trigger will be to blame (unless there is an instrument fault, of course). My suspicion is a poor choice of window, or too short a window, on the sinx/x interpolator, probably a limitation of the technology at the time.
Yes, it's difficult to guess what is happening in the video. If I had access to the TDS694C then I'd try a few experiments. I'd make sure that the 3GHz test signal had very low harmonic distortion (add a lowpass filter?) and then feed it to the scope. This would rule out distortion issues with the signal itself.

Presumably the trigger issue could be ruled out if the scope was set to single shot mode. Just take one trace and then do an FFT on the data? If the scope can't do this then maybe dump out the raw data to Excel or Matlab and do an FFT offline.

I'm not a scope guru, but my guess is that the FFT may well show what is happening. If the problem is noise related then the FFT would show just noise plus the 3GHz signal and its low-level distortion terms.

However, if this was a noise/interpolation/distortion issue then I think the scope FFT may well show folded-back harmonic terms in the display. So the FFT may show the 3GHz signal but also folded-back harmonic distortion terms that appear as alias terms at 1GHz, 2GHz, 4GHz etc. These would cause wobble on the signal when viewed with the scope set back to 'run' mode such that several traces per second are displayed. If the display was a lot faster in terms of displayed waveforms per second, you would probably just see a fatter/graded trace that would smear the wobble, making it appear less obvious. But all this is just a guess; I don't do much with digital scopes, I'm really just an RF person.
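If anyone does dump a single-shot record out of the scope, the offline analysis is only a few lines (a sketch in Python/NumPy; the file name, format and sample rate are assumptions, not something the TDS694C actually produces in this form):

Code: [Select]
# Offline FFT of a dumped single-shot record, per the suggestion above.
# File name, record format and sample rate are assumptions for illustration.
import numpy as np

fs = 10e9                                  # assumed sample rate of the capture
x = np.loadtxt("tds694c_singleshot.csv")   # hypothetical one-column ASCII dump
x = x - np.mean(x)                         # remove DC before windowing

X = np.fft.rfft(x * np.blackman(len(x)))
f = np.fft.rfftfreq(len(x), 1/fs)
mag_db = 20*np.log10(np.abs(X)/np.abs(X).max() + 1e-15)

# Check the levels near the frequencies mentioned above; discrete spurs at
# 1/2/4 GHz would point at sampling artifacts, a raised floor at plain noise.
for fc in (1e9, 2e9, 3e9, 4e9):
    i = np.argmin(np.abs(f - fc))
    lo, hi = max(i - 5, 0), min(i + 6, len(f))   # small neighbourhood so leakage doesn't hide a spur
    peak = mag_db[lo:hi].max()
    print(f"{fc/1e9:.0f} GHz: {peak:6.1f} dBc")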
« Last Edit: September 26, 2016, 01:03:52 am by G0HZU »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
This guy doesn't seem to have problems triggering on a 3GHz signal but I just browsed through the video quickly since it is way past bedtime for me:
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline KE5FX

  • Super Contributor
  • ***
  • Posts: 1891
  • Country: us
    • KE5FX.COM
Yes, it's difficult to guess what is happening in the video. If I had access to the TDS694C then I'd try a few experiments. I'd make sure that the 3GHz test signal had very low harmonic distortion (add a lowpass filter?) and then feed it to the scope. This would rule out distortion issues with the signal itself.

As I pointed out earlier, the 3 GHz test signal is fine.  No harmonic problems, and it has about 140 fs of jitter in the 1 Hz-100 Hz range that would affect display stability.



It's tough to support a claim that "they all do that" based on N=2, but I have two TDS694Cs and they behave exactly the same.  (Edit: N=3, looking at the video in nctnico's post near 6:20.)  The only way that acquisition could look rock-solid from one sweep to the next is if either (a) the sampling rate were much higher -- perhaps 30 GHz or more -- or (b) the scope were capable of interpolating the trigger point to a position relative to the reconstructed sine wave.

The TDS694C's claim to fame was its realtime nature.  There were plenty of sampling scopes capable of doing a better job rendering a 3 GHz CW signal, but the 694C was meant for single-shot acquisitions, which in 1999 was comparatively rare in the 3 GHz class.   It's like watching a pig fly -- it would not normally occur to me to criticize its form, because I'm impressed that the thing got off the ground in the first place.   I'm sure LeCroy had something on the market that would blow the 694C away, of course, but it would have been relatively obscure and I probably haven't heard of it.  |O

Quote
Presumably the trigger issue could be ruled out if the scope was set to single shot mode. Just take one trace and then do an FFT on the data? If the scope can't do this then maybe dump out the raw data to Excel or Matlab and do an FFT offline.

I'm not a scope guru, but my guess is that the FFT may well show what is happening. If the problem is noise related then the FFT would show just noise plus the 3GHz signal and its low-level distortion terms.  However, if this was a noise+interpolation issue then I think the scope FFT may well show folded-back harmonic terms in the display caused by the less than perfect interpolation. So the FFT may show the 3GHz signal but also folded-back harmonic distortion terms that appear as alias terms at 1GHz, 2GHz and 4GHz caused by the imperfect interpolation and limited sample rate. These would cause wobble on the signal when viewed with the scope set back to 'run' mode such that several traces per second are displayed. If the display was a lot faster in terms of displayed waveforms per second, you would probably just see a fatter/graded trace that would smear the wobble, making it appear less obvious. But all this is just a guess.

Several different things could be happening.  Timebase jitter could do it (although it would have to be really severe), low-quality interpolation certainly wouldn't help, a missing (or possibly faulty) trigger interpolator could be responsible, inadequate antialias filtering could do it if the HP 8672A had significant harmonic content (which it doesn't), but IMHO the only real problem is a low oversampling margin in a scope that's too old to hide its faults in software.   

If I look at the FFT display, there's about 15 dB of suppression on the alias at 7 GHz:



However, that "15 dB" is an artifact of performing the FFT in screenspace, like a lot of older scopes do.   We're just seeing the Fourier transform of some connected dots on a screen.   That places Tektronix in a state of sin(c):



Again, the fact that this scope is almost 20 years old gives them some room for absolution, I think. 
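As a sanity check on that screenspace-FFT explanation, the numbers do roughly line up if you assume the dots are joined with straight lines (a linear interpolator has a sinc² frequency response); this is a back-of-the-envelope sketch, not a statement about what the firmware actually does:

Code: [Select]
# Back-of-the-envelope check: if the screen trace is the samples joined by straight
# lines, the linear interpolator's sinc^2 response attenuates the 7 GHz image more
# than the 3 GHz tone. 10 GS/s and 3/7 GHz are taken from the discussion above.
import numpy as np

fs = 10e9
f_sig, f_img = 3e9, 7e9

def lin_interp_db(freq):
    # |H(f)| of an ideal linear interpolator running at fs is sinc^2(f/fs)
    return 20*np.log10(np.sinc(freq/fs)**2)

extra = lin_interp_db(f_sig) - lin_interp_db(f_img)
print(f"extra suppression of the 7 GHz image vs the 3 GHz tone: {extra:.1f} dB")
# -> about 15 dB, in the same ballpark as the figure read off the screen.

So the ~15 dB is consistent with being an interpolation artifact rather than real analog suppression.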
« Last Edit: September 26, 2016, 02:20:14 am by KE5FX »
 

Offline MarkL

  • Supporter
  • ****
  • Posts: 2131
  • Country: us
...
Several different things could be happening.  Timebase jitter could do it (although it would have to be really severe), low-quality interpolation certainly wouldn't help, a missing (or possibly faulty) trigger interpolator could be responsible, inadequate antialias filtering could do it if the HP 8672A had significant harmonic content (which it doesn't), but IMHO the only real problem is a low oversampling margin in a scope that's too old to hide its faults in software.   
...
I think G0HZU's idea to examine the FFT is a good approach, but it has to be done with more points and no interpolation getting in the way.

Do you want to do a max points binary capture and post it?  I wouldn't mind playing around with it.  Could be interesting.
 

Offline KE5FX

  • Super Contributor
  • ***
  • Posts: 1891
  • Country: us
    • KE5FX.COM
...
Several different things could be happening.  Timebase jitter could do it (although it would have to be really severe), low-quality interpolation certainly wouldn't help, a missing (or possibly faulty) trigger interpolator could be responsible, inadequate antialias filtering could do it if the HP 8672A had significant harmonic content (which it doesn't), but IMHO the only real problem is a low oversampling margin in a scope that's too old to hide its faults in software.   
...
I think G0HZU's idea to examine the FFT is a good approach, but it has to be done with more points and no interpolation getting in the way.

Do you want to do a max points binary capture and post it?  I wouldn't mind playing around with it.  Could be interesting.

There's only so much it could tell us, unfortunately.  Being generated from a real signal, it would have no useful data past Nyquist (5 GHz in this case.)   And at 130K points, it would be much too short to reveal the close-in noise pedestal of the sampling clock.

I did try feeding in 7 GHz to see how good the analog antialiasing filter was.  I don't remember the exact figure but it was at least 30-40 dB down.  Sure wish somebody would leak the schematics for this scope...
 

Offline MarkL

  • Supporter
  • ****
  • Posts: 2131
  • Country: us
There's only so much it could tell us, unfortunately.  Being generated from a real signal, it would have no useful data past Nyquist (5 GHz in this case.)   And at 130K points, it would be much too short to reveal the close-in noise pedestal of the sampling clock.

I did try feeding in 7 GHz to see how good the analog antialiasing filter was.  I don't remember the exact figure but it was at least 30-40 dB down.  Sure wish somebody would leak the schematics for this scope...
I was also going to try running a sin(x)/x reconstruction on it to see if there was the amplitude variation evident in the video.  I agree there's probably nothing to be seen with only 130k pts unless there's something terribly wrong (I thought the record length was longer).
 

Offline EEVblog

  • Administrator
  • *****
  • Posts: 37740
  • Country: au
    • EEVblog
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?
 

Offline MarkL

  • Supporter
  • ****
  • Posts: 2131
  • Country: us
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?
I'm sure the answer will become clear after we figure out why Tektronix's sin(x)/x doesn't work well.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?
I don't think anyone cares. Stay tuned...
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online JPortici

  • Super Contributor
  • ***
  • Posts: 3461
  • Country: it
Pretty sure Daniel answered the question on the first or second page.
 

Offline Hydrawerk

  • Super Contributor
  • ***
  • Posts: 2600
  • Country: 00
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?
Well, there is still no industrial competitor for these Keysight scopes from Tektronix or LeCroy.
OK, someone might like WaveSurfer 3000 or Tek MSO2000B.
Keysight has no need to hurry with a new scope model.
Amazing machines. https://www.youtube.com/user/denha (It is not me...)
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3015
  • Country: gb
...
Several different things could be happening.  Timebase jitter could do it (although it would have to be really severe), low-quality interpolation certainly wouldn't help, a missing (or possibly faulty) trigger interpolator could be responsible, inadequate antialias filtering could do it if the HP 8672A had significant harmonic content (which it doesn't), but IMHO the only real problem is a low oversampling margin in a scope that's too old to hide its faults in software.   
...
I think G0HZU's idea to examine the FFT is a good approach, but it has to be done with more points and no interpolation getting in the way.

Do you want to do a max points binary capture and post it?  I wouldn't mind playing around with it.  Could be interesting.

There's only so much it could tell us, unfortunately.  Being generated from a real signal, it would have no useful data past Nyquist (5 GHz in this case.)   And at 130K points, it would be much too short to reveal the close-in noise pedestal of the sampling clock.

I did try feeding in 7 GHz to see how good the analog antialiasing filter was.  I don't remember the exact figure but it was at least 30-40 dB down.  Sure wish somebody would leak the schematics for this scope...
I was really thinking more in terms of the total wideband system noise. I'm sure your sig gen will not be this noisy (unless it was faulty) but a 3GHz BW is 95dBHz so a wideband noise floor of -135dBc/Hz would produce a S/N ratio of about 40dB in this BW. Presumably, this level of noise would be just enough to begin to show on the scope trace.
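For reference, the arithmetic behind that estimate (a tiny sketch; the -135 dBc/Hz floor is the same assumed figure as in the text above):

Code: [Select]
# Noise-bandwidth arithmetic behind the ~40 dB S/N estimate above.
import math

bw_hz = 3e9                       # system bandwidth
floor_dbc_hz = -135.0             # assumed wideband noise floor from the text

bw_dbhz = 10 * math.log10(bw_hz)          # ~94.8 dBHz
snr_db = -(floor_dbc_hz + bw_dbhz)        # integrated noise relative to the carrier
print(f"bandwidth = {bw_dbhz:.1f} dBHz, integrated S/N = {snr_db:.1f} dB")
# -> roughly 95 dBHz and 40 dB, matching the figures quoted above.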

Also, I don't know much about the wideband noise contribution in the scope itself. This could come from various places and some of it may even fold over to make things worse. But I'm just guessing.

That was the idea of the FFT analysis. Dump out a full single shot and see if it shows lots of noise or if there are discrete distortion terms visible caused by sampling effects etc.

It might help with the diagnosis.
« Last Edit: September 27, 2016, 12:05:47 am by G0HZU »
 

Offline LabSpokane

  • Super Contributor
  • ***
  • Posts: 1899
  • Country: us
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?

Yes.  I have the only one.  It has a 3dB brick wall bandwidth of 867.5309 GHz +/- 6.022x10^-23 Hz.

The front end is so advanced, I can use a rusty nail and a coat hanger covered in black tape for a probe with complete confidence in the results.  There's a firmware easter egg that has a video clip of Beavis and Butthead singing "fuck Nyquist and his Theorem."

No, I will not do a teardown.  No, I will not be mailing it to you. 
 
The following users thanked this post: Wuerstchenhund

Offline heavenfish

  • Newbie
  • Posts: 8
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?
Well, there is still no industrial competitor for these Keysight scopes from Tektronix or LeCroy.
OK, someone might like WaveSurfer 3000 or Tek MSO2000B.
Keysight has no need to hurry with a new scope model.

Agreed. The only thing not so great about the 2k/3k is the record length, because Keysight uses on-chip memory. I don't expect Keysight to change that since it's key to achieving the current cost structure. It doesn't make sense to try to design a better but more expensive 2k/3k successor. If Keysight had new technology to increase the memory size, sample rate or resolution of the ASIC, why not update the 4k or create a new 5k product? It certainly would have a better return on the investment.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
So after 5 pages of this did anyone figure out if Keysight are releasing a new 2000/3000 scope in the next few months?
Well, there is still no industrial competitor for these Keysight scopes from Tektronix or LeCroy.

Of course there are. You said it yourself:

Quote
OK, someone might like WaveSurfer 3000 or Tek MSO2000B.

Tek's MSO/DPO2000B is the competitor to the Keysight DSOX/MSOX2000A, and the MDO3000 to the DSOX/MSOX3000A/T. I know, pretty much no-one buys them but still they do exist ;)

The WaveSurfer 3000 is a competitor to the DSOX/MSOX3000A/T and DSOX/MSOX4000A (it sits right between both models), and is actually taking quite a few of the DSOX/MSOX sales. Ever wondered why Keysight bothered to come out with a touch screen version of the DSOX/MSOX3000A? ;)

Quote
Keysight has no need to hurry with a new scope model.

No, because most of the DSOX/MSOX models still sell well enough.
 

