Author Topic: DSO waveform update rate importance in practice  (Read 11912 times)


Offline VanitarNordic (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
DSO waveform update rate importance in practice
« on: October 05, 2015, 06:17:59 am »
Hello,

In what kinds of practical applications does the DSO waveform update rate matter?
 

Offline rf-loop

  • Super Contributor
  • ***
  • Posts: 4126
  • Country: fi
  • Born in Finland with DLL21 in hand
I drive a LEC (low el. consumption) BEV car. Smoke exhaust pipes - go to museum. In Finland quite all electric power is made using nuclear, wind, solar and water.

Wises must compel the mad barbarians to stop their crimes against humanity. Where have the wises gone?
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27193
  • Country: nl
    • NCT Developments
Re: DSO waveform update rate importance in practice
« Reply #2 on: October 05, 2015, 08:12:52 am »
Quote
Hello,
In what kinds of practical applications does the DSO waveform update rate matter?
Not many. Manufacturers keep touting the chance their machines offer to display an anomaly, but they forget that someone needs to be looking at the screen to actually see it! For something which occurs less than a few times per second that is useless. A well-devised trigger condition (or mask test) is much more useful. Don't forget that even on scopes with a very short blind time there is still a chance the anomaly doesn't get displayed, and the chance you look away or blink at that moment is also huge. If I need to check a signal I use infinite persistence, set up a trigger (preferred) or use the mask test feature and let it run for a day. At the end of the day I check the result.
« Last Edit: October 05, 2015, 08:16:25 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
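To put rough numbers on the point about blind time and rare events, here is a minimal back-of-the-envelope sketch in Python (the record length, anomaly rate and run time are made up for illustration; none of these figures come from the thread). It only models whether a rare anomaly even lands inside a captured record; actually noticing it on screen is another matter, which is why persistence or a mask test left running for a day is the more reliable approach.

Code:
def capture_stats(wfm_rate, record_len_s, event_rate, run_time_s):
    """Fraction of real time digitised, and expected anomalies captured."""
    coverage = min(wfm_rate * record_len_s, 1.0)   # the rest of the time the scope is blind
    expected = event_rate * run_time_s * coverage
    return coverage, expected

# Hypothetical values: 1 us records, one anomaly every 10 s, an 8 hour run.
for rate in (1_000, 50_000, 1_000_000):
    cov, n = capture_stats(rate, 1e-6, 0.1, 8 * 3600)
    print(f"{rate:>9} wfm/s: coverage {cov:7.3%}, ~{n:6.1f} anomalies captured")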
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: DSO waveform update rate importance in practice
« Reply #3 on: October 05, 2015, 11:57:30 am »
Quote
Not many. Manufacturers keep touting the chance their machines offer to display an anomaly, but they forget that someone needs to be looking at the screen to actually see it! For something which occurs less than a few times per second that is useless. A well-devised trigger condition (or mask test) is much more useful. Don't forget that even on scopes with a very short blind time there is still a chance the anomaly doesn't get displayed, and the chance you look away or blink at that moment is also huge. If I need to check a signal I use infinite persistence, set up a trigger (preferred) or use the mask test feature and let it run for a day. At the end of the day I check the result.

Nctnico is absolutely right; the waveform update rate is vastly overrated, mostly because a certain large T&M company that not too long ago changed its name again decided to make it the focus of their marketing campaign for their entry-level and mid-range oscilloscopes. Naturally this meant others like R&S had to follow suit to avoid being seen as lagging behind.

Back in the days of analog scopes the waveform rate was very important, as scopes had only very simple triggers and, at best, a primitive storage tube which could keep the waveform on the screen a short while longer. There was no memory, no analysis, no signal awareness, so for finding glitches and runts the operator's eye played an important role (although even if he found them, it didn't mean he could measure and quantify them).

These days even low end scopes have pretty versatile triggers, and even a bottom-of-the-barrel scope offers lots of memory for storage. Move to a mid-range scope and you'll get very advanced real-time analyzers that can fish out any signal deviation you want, plus keep a register of when, where and how often a glitch occurred, so you can go back to the event and have all the relevant parameters on the screen.

However, the main reason why waveform rates are still a marketing argument is that entry-level scopes in particular are often bought and used by people who are used to analog scopes and who treat a DSO like an analog scope (thereby of course foregoing all the advantages of a digital scope), and they want their DSO's screen to update fast enough to capture glitches by eye. In addition, the entry-level scopes of the aforementioned large T&M manufacturer who recently changed its name again use a proprietary ASIC which offers high update rates but also limits sample memory to a meager 4M, so naturally the positive (fast update rate) is highlighted for marketing purposes.

I do have a scope that is specified with a high waveform update rate (1.25M wfms/s) and one with only 250k waveforms/s, and I can find all glitches with the slower scope just fine (even more so since the slower scope has more analysis tools). Heck, even this old scope with a very low waveform rate could capture every problem I threw at it, simply because of its very advanced triggers and analysis tools.
 
The following users thanked this post: PixieDust

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4997
  • Country: si
Re: DSO waveform update rate importance in practice
« Reply #4 on: October 05, 2015, 03:28:11 pm »
It is important...up to a point.

Scopes with a very slow update rate, say a few hundred waveforms per second, won't be able to build a very nice intensity-graded image even if they support it. They simply don't capture enough data to build up a good image. But once you get past a few tens of thousands of updates per second the waveform starts looking pretty nice, with all the analog-like fuzziness on it. Going past that to 1 million updates won't really improve the waveform image any; it will just be much more likely to catch infrequent glitches.

It's one of those things that is nice to have, but past a certain point the advantage of more updates levels off. If you've got 50 000 updates per second there is no reason to upgrade to a new scope just to get 1 million updates. In general, scopes with update rates in the tens of thousands and beyond draw the waveform using hardware acceleration in an FPGA or ASIC. Scopes that draw the waveform in the CPU tend not to produce a nice-looking intensity-graded image, so getting a reasonably fast scope usually also means better-looking waveforms.

I recently bought a big boat anchor of a scope running Windows, and like all scopes with a PC hidden inside it doesn't get a ton of waveform updates per second. I think it only gets up to about 4,000 at small memory sizes (it does 100k updates using segmented memory, though). So waveforms don't quite look as analog-like on it, and I still prefer my old Agilent 6000 series scope for quick poking around because it's so fast (not only the waveform updates, the UI is super quick too). But if you really need to look at the "fuzziness" of a signal (perhaps you want to measure jitter or something) you can turn on color gradation and it will build up a nice soft image of the waveform over a few seconds.
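As an aside, the way the "analog-like fuzz" builds up from many acquisitions can be sketched in a few lines: each capture is rasterised onto a screen-sized grid and the hit counts accumulate into an intensity map. This is only an illustrative model (NumPy, with an invented signal and made-up noise and jitter figures), not how any particular scope's ASIC actually renders.

Code:
import numpy as np

WIDTH, HEIGHT = 800, 480          # display raster
N_ACQ = 20_000                    # acquisitions overlaid ("tens of thousands")
t = np.linspace(0, 4 * np.pi, WIDTH)

intensity = np.zeros((HEIGHT, WIDTH), dtype=np.uint32)
rng = np.random.default_rng(0)

for _ in range(N_ACQ):
    jitter = rng.normal(0, 0.02)                      # horizontal jitter
    noise = rng.normal(0, 0.03, WIDTH)                # vertical noise
    y = np.sin(t + jitter) + noise                    # one "captured" waveform
    rows = np.clip(((y + 1.5) / 3.0 * HEIGHT).astype(int), 0, HEIGHT - 1)
    intensity[rows, np.arange(WIDTH)] += 1            # accumulate hits per pixel

# Brighter pixels are where the signal spends most of its time; rare excursions
# stay dim, which is the soft gradation described above.
print("max hits per pixel:", intensity.max())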
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27193
  • Country: nl
    • NCT Developments
Re: DSO waveform update rate importance in practice
« Reply #5 on: October 05, 2015, 04:55:15 pm »
Quote
Scopes that draw the waveform in the CPU tend not to produce a nice-looking intensity-graded image, so getting a reasonably fast scope usually also means better-looking waveforms.
AFAIK Windows-based scopes let dedicated hardware draw the trace in a video overlay; neither the CPU nor the graphics card has anything to do with it. The downside is that the display is often limited to 640x480, because the ASIC development has to be earned back over several product generations. Just look at Tektronix' TDS5000 and TDS7000 series: large display, but still 640x480  :palm:
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Karel

  • Super Contributor
  • ***
  • Posts: 2230
  • Country: 00
Re: DSO waveform update rate importance in practice
« Reply #6 on: October 05, 2015, 05:09:19 pm »
In theory it sounds nice that scopes with low or medium waveform update rates can compensate for this with advanced trigger settings and/or analysis.
But in practice it doesn't help much, because with those scopes you have to actively search for the problem.

With a fast scope you often "accidentally" discover problems which would otherwise go undetected until a later stage of development.
A fast scope can save you time and money compared to scopes with low or mediocre update rates.

Don't believe the fairy tales of salesmen who tell you that advanced trigger analysis can compensate for slow waveform updates.

Once you have some practical experience with fast scopes, you never want to go back to a scope with slow waveform updates.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: DSO waveform update rate importance in practice
« Reply #7 on: October 05, 2015, 05:44:03 pm »
Quote
Scopes with a very slow update rate, say a few hundred waveforms per second, won't be able to build a very nice intensity-graded image even if they support it. They simply don't capture enough data to build up a good image.

That's not true; the capability to display intensity grading has nothing to do with the waveform update rate (aside from a slower scope requiring a bit longer to build up the gradation). My old LeCroy WaveRunner LTs had very low update rates, and still they provided the same great intensity grading (monochrome and color) as my current high-end scope (although they of course lack the 3D function).

Quote
In general, scopes with update rates in the tens of thousands and beyond draw the waveform using hardware acceleration in an FPGA or ASIC. Scopes that draw the waveform in the CPU tend not to produce a nice-looking intensity-graded image, so getting a reasonably fast scope usually also means better-looking waveforms.

Not true. In fact, the first scope offering 1.25M waveforms/s (the LeCroy WaveRunner Xi), which came out in 2006, didn't use ASICs but did all calculations in the CPU's L2 cache, and had the waveform drawn by the Intel chipset GPU.

Quote
I recently bought a big boat anchor of a scope running Windows, and like all scopes with a PC hidden inside it doesn't get a ton of waveform updates per second. I think it only gets up to about 4,000 at small memory sizes.

That's more down to your particular scope than a general rule. 4k waveforms/s sounds like an older Agilent Infiniium or Tek, and yes, their older Windows scopes were pretty slow. The reason is that both relied on ASICs to do the work (Agilent's implementation was particularly poor, usually maxing out at 2,500 waveforms/s, plus the scope app could only run fullscreen).
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: DSO waveform update rate importance in practice
« Reply #8 on: October 05, 2015, 06:01:24 pm »
Quote
In theory it sounds nice that scopes with low or medium waveform update rates can compensate for this with advanced trigger settings and/or analysis.
But in practice it doesn't help much, because with those scopes you have to actively search for the problem.

No, you don't.

Quote
With a fast scope you often "accidentally" discover problems which would otherwise go undetected until a later stage of development.
A fast scope can save you time and money compared to scopes with low or mediocre update rates.

If you need to stare at a scope's screen for an extended period of time to find out whether there are any problems with your signal, then I'd say you're doing it seriously wrong (or you have a really primitive DSO). Even a modern bottom-of-the-barrel scope can do mask testing, which allows you to find any deviation in a repetitive signal that you might not know about, and will do it much more reliably than a human staring at a screen (plus it captures the event, so you can examine the problem, which helps in finding out what's causing it).

On better scopes you get a wide range of very flexible triggers which can scan the signal in real time for any deviation, find it, capture it and analyze it. That even includes serial buses. Good luck doing that with your stare  8)
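The mask-test idea is easy to sketch in software: build a pass/fail envelope from known-good captures and flag anything that leaves it, without having to know in advance what the fault looks like. A minimal illustration (NumPy, invented signal and margins), not any scope vendor's actual implementation:

Code:
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 5 * t)

# Build the mask from reference captures plus a tolerance margin.
reference = np.array([clean + rng.normal(0, 0.01, t.size) for _ in range(100)])
upper = reference.max(axis=0) + 0.05
lower = reference.min(axis=0) - 0.05

def mask_violations(waveform):
    """Return sample indices where the waveform leaves the mask."""
    return np.flatnonzero((waveform > upper) | (waveform < lower))

# A capture with a narrow glitch we did not know to look for:
glitchy = clean + rng.normal(0, 0.01, t.size)
glitchy[400:403] += 0.5
print("violations at samples:", mask_violations(glitchy))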

Quote
Don't believe the fairy tales of salesmen who tell you that advanced trigger analysis can compensate for slow waveform updates.

At the moment those salesmen are busy telling the world and his dog that excessive waveform rates are the best thing since sliced bread.

Quote
Once you have some practical experience with fast scopes, you never want to go back to a scope with slow waveform updates.

Well, I have both: a fast scope with 1.25M waveforms/s, some slower scopes, and until not too long ago also a very slow scope. The latter did (like the others) have very advanced triggering, and there is really nothing I can think of that I wouldn't have found with the slow scope.

Aside from my personal experience, it's also telling that the touting of waveform rates is pretty much limited to entry-level scopes (aside from R&S, but they're pretty much a newcomer with their own non-Hameg scopes). It's not much of an issue for high-end scopes, which are pretty much universally fitted with highly capable triggers. Go figure.
 

Offline Karel

  • Super Contributor
  • ***
  • Posts: 2230
  • Country: 00
Re: DSO waveform update rate importance in practice
« Reply #9 on: October 05, 2015, 06:27:07 pm »
Quote
In theory it sounds nice that scopes with low or medium waveform update rates can compensate for this with advanced trigger settings and/or analysis.
But in practice it doesn't help much, because with those scopes you have to actively search for the problem.

No, you don't.

Yes, you do.

Quote
With a fast scope you often "accidentally" discover problems which would otherwise go undetected until a later stage of development.
A fast scope can save you time and money compared to scopes with low or mediocre update rates.

If you need to stare at a scope's screen for an extended period of time to find out whether there are any problems with your signal, then I'd say you're doing it seriously wrong (or you have a really primitive DSO). Even a modern bottom-of-the-barrel scope can do mask testing, which allows you to find any deviation in a repetitive signal that you might not know about, and will do it much more reliably than a human staring at a screen (plus it captures the event, so you can examine the problem, which helps in finding out what's causing it).

On better scopes you get a wide range of very flexible triggers which can scan the signal in real time for any deviation, find it, capture it and analyze it. That even includes serial buses. Good luck doing that with your stare  8)

You quote, but have you actually read the quote? You act like a salesman, carefully avoiding the point.

Quote
Don't believe the fairy tales of salesmen who tell you that advanced trigger analysis can compensate for slow waveform updates.

At the moment those salesmen are busy telling the world and his dog that excessive waveform rates are the best thing since sliced bread.

Not the salesman from LeCroy who was trying to sell us their scopes. He was pretty upset that I didn't agree with him.


Quote
Once you have some practical experience with fast scopes, you never want to go back to a scope with slow waveform updates.

Well, I have both: a fast scope with 1.25M waveforms/s, some slower scopes, and until not too long ago also a very slow scope. The latter did (like the others) have very advanced triggering, and there is really nothing I can think of that I wouldn't have found with the slow scope.

Aside from my personal experience, it's also telling that the touting of waveform rates is pretty much limited to entry-level scopes (aside from R&S, but they're pretty much a newcomer with their own non-Hameg scopes). It's not much of an issue for high-end scopes, which are pretty much universally fitted with highly capable triggers. Go figure.

Read my former post again and try to respond to my point instead of creating a smoke screen.

 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27193
  • Country: nl
    • NCT Developments
Re: DSO waveform update rate importance in practice
« Reply #10 on: October 05, 2015, 06:33:03 pm »
If a fault occurs many times per second you'll spot it with a scope which has a low waveform update rate as well. If it doesn't occur many times per second then you'd have to stare at the screen for a long time, so you are better off using a trigger condition, persistence or mask testing (which will catch the event with 100% certainty, something even the highest waveform update rate can't).
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Karel

  • Super Contributor
  • ***
  • Posts: 2230
  • Country: 00
Re: DSO waveform update rate importance in practice
« Reply #11 on: October 05, 2015, 06:46:07 pm »
Quote
If a fault occurs many times per second you'll spot it with a scope which has a low waveform update rate as well. If it doesn't occur many times per second then you'd have to stare at the screen for a long time, so you are better off using a trigger condition, persistence or mask testing (which will catch the event with 100% certainty, something even the highest waveform update rate can't).

I don't stare at the screen for a long time in order to catch possible errors. But during development and debugging it happens that I look at the scope's screen for a long time; not continuously, but the accumulated time adds up.
It has happened to me (and others!) that an anomaly was noticed that warned me of possible problems.
This happens only when you have a fast scope.
Afterwards you can always say: hey, if you had triggered like this or that, or had done this or that, you would have caught the error immediately.
Of course, with knowledge of the problem it's easy to say that afterwards.
The point is, a fast scope catches errors even when you are not actively searching for them.
If this is not important to you, fine. But it is for a lot of people.
« Last Edit: October 05, 2015, 06:55:30 pm by Karel »
 

Offline miguelvp

  • Super Contributor
  • ***
  • Posts: 5550
  • Country: us
Re: DSO waveform update rate importance in practice
« Reply #12 on: October 05, 2015, 07:06:02 pm »
Quote
If a fault occurs many times per second you'll spot it with a scope which has a low waveform update rate as well. If it doesn't occur many times per second then you'd have to stare at the screen for a long time, so you are better off using a trigger condition, persistence or mask testing (which will catch the event with 100% certainty, something even the highest waveform update rate can't).

What do you consider many? One thousand?

With a 26MHz signal, my old Tek 7613 was doing 500,000 waveforms per second while my Rigol DS2000 does 5,000 wfms/s when set to just 14kpoints. I can get up to 50,000 wfms/s in auto when looking at 5ns per division, where the scope reduces the capture to just 700 points.

https://www.eevblog.com/forum/testgear/first-personal-'scope-purchase/msg681691/#msg681691

So in normal operation the 42-year-old Tek will show you 100 times more waveforms.

Going back to 1,000 glitches per second, and say it's a 26MHz signal: that is 1 glitch per 26,000 cycles.
On the Tek, 9 million of those 26 million waveforms per second will be displayed (better than 1 in 3).
On the Rigol, 90,000 out of those 26 million waveforms per second will be displayed, so the likelihood that any of those 1,000 glitches per second shows up is pretty slim.

You can always put the scope on high res, capturing a long sequence, but you might still miss it, and that's if you are actively searching for such a glitch.
On the old Tek it will show up even if you were not looking for it.

Persistence won't do a thing if the scope doesn't see it to begin with.
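The arithmetic above generalises neatly: the fraction of signal cycles that ever reach the screen is roughly (waveforms per second × cycles per record) / (signal frequency). A small Python sketch reproducing the numbers above; the "18 cycles per record" figure is inferred from the 90,000-of-26-million claim, so treat it as an assumption rather than the actual scope settings.

Code:
def coverage(signal_freq_hz, wfm_per_s, cycles_per_record):
    """Fraction of all signal cycles that land inside a displayed record."""
    return min(wfm_per_s * cycles_per_record / signal_freq_hz, 1.0)

SIGNAL = 26e6          # 26 MHz signal from the example
GLITCH_RATE = 1_000    # 1000 glitches/s, i.e. 1 in 26,000 cycles

for name, rate in (("Tek 7613 (analog)", 500_000), ("Rigol DS2000", 5_000)):
    cov = coverage(SIGNAL, rate, 18)   # ~18 cycles on screen per sweep/record
    print(f"{name}: coverage {cov:.3%}, "
          f"~{GLITCH_RATE * cov:.1f} of the 1,000 glitches/s visible")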
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5326
  • Country: gb
Re: DSO waveform update rate importance in practice
« Reply #13 on: October 05, 2015, 07:12:06 pm »
I fall into the "nice to have, but not the most important factor, not by some way" camp.

I was trying to think of a scenario, other than a manufactured one, where being able to see a 1:100000 glitch or runt would have been a useful feature recently, and I'm afraid I can't. My point being, by the time you're suspecting something of that sort you'll already have had a bunch of symptoms that would lead you to use other methods such as triggering to solve them.

One other point, it's a bit like the deep memory debate and how long is a piece of string: how much memory is enough, vs how fast an update rate do I need? It is very much the law of diminishing returns.

One thing I would say, though, is that the headline updates/sec figures are quite often achieved only under certain contrived circumstances, so be wary of that.

So yes, it's a convenient feature, but I think it's overrated nowadays: given the choice, for example, between 10Mpts on all channels + 100kwfm/s versus 4Mpts shared + 1Mwfm/s, all other things being equal I'd prefer the former. But that's just an example; there are plenty of other things to consider besides update rate and deep memory, it is not a simple equation, and ultimately everyone will have different needs.


 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: DSO waveform update rate importance in practice
« Reply #14 on: October 05, 2015, 07:29:33 pm »
Quote
In theory it sounds nice that scopes with low or medium waveform update rates can compensate for this with advanced trigger settings and/or analysis.
But in practice it doesn't help much, because with those scopes you have to actively search for the problem.

No, you don't.

Yes, you do.

No, I don't. I just set up the scope to show me any non-conformities in my signal. Pretty simple if you know how.

Quote
Quote
With a fast scope you often "accidentally" discover problems which would otherwise go undetected until a later stage of development.
A fast scope can save you time and money compared to scopes with low or mediocre update rates.

If you need to stare at a scope's screen for an extended period of time to find out whether there are any problems with your signal, then I'd say you're doing it seriously wrong (or you have a really primitive DSO). Even a modern bottom-of-the-barrel scope can do mask testing, which allows you to find any deviation in a repetitive signal that you might not know about, and will do it much more reliably than a human staring at a screen (plus it captures the event, so you can examine the problem, which helps in finding out what's causing it).

On better scopes you get a wide range of very flexible triggers which can scan the signal in real time for any deviation, find it, capture it and analyze it. That even includes serial buses. Good luck doing that with your stare  8)

You quote, but have you actually read the quote? You act like a salesman, carefully avoiding the point.

I'm not avoiding the point; I believe I was pretty clear about why you don't need to know what problems your signal has in order to trigger on them. If that's beyond your understanding or doesn't fit your narrative then I'm sorry.

Quote
Quote
Don't believe the fairy tales of salesmen who tell you that advanced trigger analysis can compensate for slow waveform updates.

At the moment those salesmen are busy telling the world and his dog that excessive waveform rates are the best thing since sliced bread.

Not the salesman from LeCroy who was trying to sell us their scopes. He was pretty upset that I didn't agree with him.

Then take this up with him (although considering the attitude you're displaying, I'm starting to feel sympathy for him). If you don't like the service he gave you, go to someone else; there are other manufacturers who make good scopes.

Quote
Quote
Once you have some practical experience with fast scopes, you never want to go back to a scope with slow waveform updates.

Well, I have both: a fast scope with 1.25M waveforms/s, some slower scopes, and until not too long ago also a very slow scope. The latter did (like the others) have very advanced triggering, and there is really nothing I can think of that I wouldn't have found with the slow scope.

Aside from my personal experience, it's also telling that the touting of waveform rates is pretty much limited to entry-level scopes (aside from R&S, but they're pretty much a newcomer with their own non-Hameg scopes). It's not much of an issue for high-end scopes, which are pretty much universally fitted with highly capable triggers. Go figure.

Read my former post again and try to respond to my point instead of creating a smoke screen.

I did, pretty clearly, but of course feel free to ignore any of the points I made (which you are clearly unable to refute with tangible arguments anyway). Your call.
« Last Edit: October 05, 2015, 07:34:58 pm by Wuerstchenhund »
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27193
  • Country: nl
    • NCT Developments
Re: DSO waveform update rate importance in practice
« Reply #15 on: October 05, 2015, 07:33:10 pm »
Quote
If a fault occurs many times per second you'll spot it with a scope which has a low waveform update rate as well. If it doesn't occur many times per second then you'd have to stare at the screen for a long time, so you are better off using a trigger condition, persistence or mask testing (which will catch the event with 100% certainty, something even the highest waveform update rate can't).
Going back to 1,000 glitches per second, and say it's a 26MHz signal: that is 1 glitch per 26,000 cycles.
Set the timebase slower and that number will reduce drastically.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5326
  • Country: gb
Re: DSO waveform update rate importance in practice
« Reply #16 on: October 05, 2015, 07:44:59 pm »
Quote
With a 26MHz signal, my old Tek 7613 was doing 500,000 waveforms per second while my Rigol DS2000 does 5,000 wfms/s when set to just 14kpoints. I can get up to 50,000 wfms/s in auto when looking at 5ns per division, where the scope reduces the capture to just 700 points.

https://www.eevblog.com/forum/testgear/first-personal-'scope-purchase/msg681691/#msg681691

So in normal operation the 42-year-old Tek will show you 100 times more waveforms.

Going back to 1,000 glitches per second, and say it's a 26MHz signal: that is 1 glitch per 26,000 cycles.
On the Tek, 9 million of those 26 million waveforms per second will be displayed (better than 1 in 3).
On the Rigol, 90,000 out of those 26 million waveforms per second will be displayed, so the likelihood that any of those 1,000 glitches per second shows up is pretty slim.

You can always put the scope on high res, capturing a long sequence, but you might still miss it, and that's if you are actively searching for such a glitch.
On the old Tek it will show up even if you were not looking for it.

Coincidentally, just the other day I got my 2467B out; it's just under the desk doing floor-standing duty (remember those days when you popped the scope on its butt on the floor?) with an almost identical test, a 1:32768 runt pulse. No problem. Admittedly it has the MCP tube and you need to turn the volume up a tad, but that's part and parcel of using a CRO. In general, though, these glitch and runt scenarios remain largely fabricated in my world, although that's not to say they aren't real in someone else's. Of course, now I've said that so confidently, tomorrow I'll face exactly that situation!

Here's a vid of that 2467B by the way, I doubt that many of those pulses were missed at all: waveform update rate was 53.3kHz on that 10us span.

 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4997
  • Country: si
Re: DSO waveform update rate importance in practice
« Reply #17 on: October 05, 2015, 07:55:53 pm »
Sorry, perhaps I should have been clearer in my post. What I mean by a good waveform is a nice fuzzy waveform (horizontal and vertical "fuzz") that updates at 30fps or more on the screen.

I am not saying all Windows scopes use the CPU, but a lot of them do, including most of Keysight's top-of-the-range scopes. As a result they don't mind the scope window being resized to any size and will happily run at 1080p if you plug an external monitor into them. Waveforms still have that intensity-graded fuzziness to them and they do pretty well on the "Dave AM modulation test", but it only happens in the vertical direction, because the intensity gradation comes from drawing the few million points on a screen just 1024 pixels wide, so there are lots of points per pixel to determine where the waveform spends more time. My particular Agilent 9000 series Infiniium scope still makes a slightly worse image of the AM modulation signal than my other Agilent 6000 (an old MegaZoom ASIC in that one). It seems like in regular run mode the 9000 starts skipping samples when rendering to the screen at large memory sizes in order to work faster, but then again doing this allows the scope to still update the display reasonably quickly even when it's using 512Mpts of memory in free-run mode.

Where you don't see the fuzziness is in the horizontal direction when you look at a sharp edge in a signal. To show that, the scope needs to capture at least a few hundred waveforms and overlay them on top of each other. This is where doing it in hardware shows, as it can do this much quicker. Yes, a CPU-rendering scope will still be able to produce such an image when you set it up to do so, but it will take seconds to build it up, rather than showing a live 60fps-updating image of the waveform with all the detail hidden in a few Mpts of memory.

While this Agilent 9000 says it can do 100,000 waveform updates in burst, that only comes in useful in certain special cases, for example when you want to capture a few hundred pulses in sequence without missing one, using segmented memory. Or when you set up fancy triggering like zone trigger, where it will capture at that speed until it finds a waveform that meets the zone trigger criteria and sends it over to the CPU to be drawn. Internally it can run that fast, but it never actually puts that many waveforms on the screen due to the CPU bottleneck.

I was using Windows-based scopes as an example since they often do that. A lot of old or cheap non-Windows scopes also use CPU rendering of the waveform, and usually those won't even have intensity gradation to begin with.

Basically, what I am saying is that you need a reasonably high waveform update rate in order to display a smooth, live 60fps image of a signal that looks analog-like, with all the jitter and anomalies showing up as shadows around it. To capture 1,000 waveforms in every 60Hz update of the screen you need to capture waveforms continuously at 60,000 waveforms per second. This sort of thing is valuable for quick poking at stuff where you are not really making a serious measurement, but are just interested in what sort of activity is on a bus or something: the moment you touch the pin on the PCB you get a very detailed and complete image of the signal, so you can quickly see what's going on, and perhaps even spot patterns on a 1/10th-of-a-second time scale by watching the signal change live on the screen. This is the true everyday use for high waveform update rates.
« Last Edit: October 05, 2015, 07:58:51 pm by Berni »
 

Offline miguelvp

  • Super Contributor
  • ***
  • Posts: 5550
  • Country: us
Re: DSO waveform update rate importance in practice
« Reply #18 on: October 05, 2015, 09:33:59 pm »
Quote
If a fault occurs many times per second you'll spot it with a scope which has a low waveform update rate as well. If it doesn't occur many times per second then you'd have to stare at the screen for a long time, so you are better off using a trigger condition, persistence or mask testing (which will catch the event with 100% certainty, something even the highest waveform update rate can't).
Going back to 1,000 glitches per second, and say it's a 26MHz signal: that is 1 glitch per 26,000 cycles.
Set the timebase slower and that number will reduce drastically.

The slower the timebase, the more sample points are captured and the fewer waveforms per second; at least that's true on the Rigol DS2000 series. The blind time stays pretty much the same, you just select where it happens and for how long.

At 140kpoints the waveforms per second drop to about 500, but it's capturing 10 times more samples so that makes sense.

Try it on yours; if you don't have a second scope, just use your DMM and monitor the trigger-out frequency (if your DMM can measure a frequency that low).
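The drop from ~5,000 wfm/s at 14 kpts to ~500 wfm/s at 140 kpts is what you'd expect if the total capture-plus-processing cost scales with record length. A rough Python model; the ~14 ns-per-point figure is fitted to the numbers above, not a Rigol specification.

Code:
def update_rate(points, time_per_point_s):
    """Acquisitions per second when per-record cost scales with its length."""
    return 1.0 / (points * time_per_point_s)

TIME_PER_POINT = 14e-9   # assumed capture + processing cost per sample;
                         # at 1 GSa/s only ~1 ns of that is actual sampling,
                         # the rest is blind/processing time

for pts in (14_000, 140_000, 1_400_000):
    print(f"{pts:>9} pts: ~{update_rate(pts, TIME_PER_POINT):,.0f} wfm/s")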


 

Offline Karel

  • Super Contributor
  • ***
  • Posts: 2230
  • Country: 00
Re: DSO waveform update rate importance in practice
« Reply #19 on: October 05, 2015, 10:04:12 pm »
Quote
In theory it sounds nice that scopes with low or medium waveform update rates can compensate for this with advanced trigger settings and/or analysis.
But in practice it doesn't help much, because with those scopes you have to actively search for the problem.

No, you don't.

Yes, you do.

No, I don't. I just set up the scope to show me any non-conformities in my signal. Pretty simple if you know how.

You still don't get the point. Or, more probably, you don't want to get to the point.

Quote
Quote
With a fast scope you often "accidentally" discover problems which would otherwise go undetected until a later stage of development.
A fast scope can save you time and money compared to scopes with low or mediocre update rates.

If you need to stare at a scope's screen for an extended period of time to find out whether there are any problems with your signal, then I'd say you're doing it seriously wrong (or you have a really primitive DSO). Even a modern bottom-of-the-barrel scope can do mask testing, which allows you to find any deviation in a repetitive signal that you might not know about, and will do it much more reliably than a human staring at a screen (plus it captures the event, so you can examine the problem, which helps in finding out what's causing it).

On better scopes you get a wide range of very flexible triggers which can scan the signal in real time for any deviation, find it, capture it and analyze it. That even includes serial buses. Good luck doing that with your stare  8)

You quote, but have you actually read the quote? You act like a salesman, carefully avoiding the point.

I'm not avoiding the point; I believe I was pretty clear about why you don't need to know what problems your signal has in order to trigger on them. If that's beyond your understanding or doesn't fit your narrative then I'm sorry.

No arguments, just patronizing/insulting.

Quote
Quote
Don't believe the fairy tales of salesmen who tell you that advanced trigger analysis can compensate for slow waveform updates.

At the moment those salesmen are busy telling the world and his dog that excessive waveform rates are the best thing since sliced bread.

Not the salesman from LeCroy who was trying to sell us their scopes. He was pretty upset that I didn't agree with him.

Then take this up with him (although considering the attitude you're displaying, I'm starting to feel sympathy for him). If you don't like the service he gave you, go to someone else; there are other manufacturers who make good scopes.

I already did. I just wanted to say that it's not true that all these salesmen are busy telling the world and his dog that excessive waveform rates are the best thing since sliced bread.
At least not the ones from LeCroy. Probably because, especially in the past, the LeCroys were very slow.

Quote
Quote
Once you have some practical experience with fast scopes, you never want to go back to a scope with slow waveform updates.

Well, I have both: a fast scope with 1.25M waveforms/s, some slower scopes, and until not too long ago also a very slow scope. The latter did (like the others) have very advanced triggering, and there is really nothing I can think of that I wouldn't have found with the slow scope.

Aside from my personal experience, it's also telling that the touting of waveform rates is pretty much limited to entry-level scopes (aside from R&S, but they're pretty much a newcomer with their own non-Hameg scopes). It's not much of an issue for high-end scopes, which are pretty much universally fitted with highly capable triggers. Go figure.

Read my former post again and try to respond to my point instead of creating a smoke screen.

I did, pretty clearly, but of course feel free to ignore any of the points I made (which you are clearly unable to refute with tangible arguments anyway). Your call.

No, you didn't. And again you are patronizing/insulting.
 

Online Someone

  • Super Contributor
  • ***
  • Posts: 4593
  • Country: au
    • send complaints here
Re: DSO waveform update rate importance in practice
« Reply #20 on: October 05, 2015, 10:07:59 pm »
Quote
Hello,
In what kinds of practical applications does the DSO waveform update rate matter?
Several applications that I use frequently are:
building signal integrity plots of high speed logic, eye diagrams etc
averaging large numbers of captures to look at correlated noise, measuring tens of µV at full bandwidth with 1:1 probing
measuring phase coincidence in noisy signals with intensity grading to provide averaging

These are on Agilent X series scopes where I don't have to trade memory depth for update rate; when using similar Tek scopes I notice the loss of data when trying to get the same fast responses by reducing the memory depth.
Quote
I fall into the "nice to have, but not the most important factor, not by some way" camp.

I was trying to think of a scenario, other than a manufactured one, where being able to see a 1:100000 glitch or runt would have been a useful feature recently, and I'm afraid I can't. My point being, by the time you're suspecting something of that sort you'll already have had a bunch of symptoms that would lead you to use other methods such as triggering to solve them.

One other point, it's a bit like the deep memory debate and how long is a piece of string: how much memory is enough, vs how fast an update rate do I need? It is very much the law of diminishing returns.

One thing I would say, though, is that the headline updates/sec figures are quite often achieved only under certain contrived circumstances, so be wary of that.

So yes, it's a convenient feature, but I think it's overrated nowadays: given the choice, for example, between 10Mpts on all channels + 100kwfm/s versus 4Mpts shared + 1Mwfm/s, all other things being equal I'd prefer the former. But that's just an example; there are plenty of other things to consider besides update rate and deep memory, it is not a simple equation, and ultimately everyone will have different needs.
Great answer; it's still best to evaluate several instruments if possible and see what works for you.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: DSO waveform update rate importance in practice
« Reply #21 on: October 06, 2015, 05:05:54 am »
Quote
I am not saying all Windows scopes use the CPU, but a lot of them do, including most of Keysight's top-of-the-range scopes.

Actually, Keysight scopes rely heavily on proprietary ASICs rather than the CPU, pretty much all of them (including the DSO9k series).
 

