How do waveform updates on an oscilloscope work? Why do they work that way?

2N3055:

--- Quote from: Someone on April 19, 2023, 11:07:30 pm ---
--- Quote from: 2N3055 on April 19, 2023, 10:53:29 pm ---You can have a scenario where you have thousands of packets per second and a really fast scope that will render all of those (all 1000) to the screen in real time, keeping each on screen for 1 ms... That would obviously be useless because I can't read that fast.
--- End quote ---
Immediately you have taken the same strawman trope that has been explained over and over.

Modern scopes have persistence in their display, even when set to "zero" persistence, but it can be increased as desired to infinity. Even if the trace was only visible for a single display frame (typically a broadcast rate such as 30 Hz, 50 Hz, or 60 Hz) the eye would still retain the event by visual persistence.

As nctnico mentions above, a serial trigger can focus down onto one part/transaction of a system (such as filtering on SPI addresses) and show what states the data bits took across many words/packets. In real time, continuously, but with possibly some data lost to the dead time.


--- Quote from: 2N3055 on April 19, 2023, 10:53:29 pm ---Or I'm interested only in messages sent by one of the sensors... In which case I need the scope to TRIGGER only on certain messages (set by the trigger condition) and ignore all the rest. And if my sensor of interest sends data only once a second (or when I press a key or something), then I leave the scope running and wait for it to catch a message and show it on the screen. This is interactive work, this is real time, but only if the message rate is slow enough for a slow human operator.
--- End quote ---
So, now it's something "very" slow (from a scope perspective). Take a scope without a hardware serial trigger: how does it find that interesting message amongst all the others? It has a larger blind time, the problem at hand.

--- End quote ---

And again, you read what you want to see, so you can keep arguing...

There is no such thing as a software serial trigger. Get your terminology right. Triggering is a hardware process that starts a new cycle of capture. Anything done in software is not a trigger, by definition. If it is software, then it is search or decode. Not trigger.

I don't care if someone out there abuses the word. And I don't care about toy scopes that do it all in software on an ARM core. We are talking about real scopes here.

Picoscope (for instance) has no hardware serial triggers. It has no decoded-data search either. It only has data filtering, where you can filter messages in the table by a filter string...

I said that scopes without a hardware serial trigger are not good for real-time work because, well, they cannot trigger on specific packets... It's kinda obvious.

In real time, the screen refreshes fully with every trigger and decoded messages overwrite each other...
I said 1000 messages a second, one every 1 ms, overwriting each other on the same screen... Persistence doesn't apply to decode. For decode you have Normal trigger mode, which keeps the last message/screen up until a new trigger comes and a new capture lands on screen, on top.

As for missing packets in real time: if the scope has hardware serial triggers and you miss packets on the screen, it won't be because the trigger missed them, but because a fast burst of 3-4 packets arrived within 100 µs and you could only see the last one; the previous three went by in such a hurry you didn't see them.

When you blink, that is 100 ms of screen dead time....

And about visual persistence: if something is displayed on screen for 10 ms and then deleted, you will see that something happened, but good luck reading it with comprehension... You might as well not enable decoding at all and just look at the flicker of the waveform instead... You can obviously see bits flickering in the waveform...
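
To put rough numbers on that blind time, here is a back-of-envelope Python sketch. Every figure in it is an illustrative assumption, not the spec of any particular scope:

--- Code: ---# Rough blind-time arithmetic for a triggered acquisition.
# All numbers are illustrative assumptions.
capture_window_s = 100e-6    # time span captured per trigger (assumed)
rearm_dead_time_s = 900e-6   # re-arm/processing gap per acquisition (assumed)

blind_fraction = rearm_dead_time_s / (capture_window_s + rearm_dead_time_s)
print(f"scope is blind {blind_fraction:.0%} of the time")   # -> 90%

# A burst of 4 packets inside one 100 us window: one trigger fires, one
# acquisition reaches the screen, and anything arriving during the re-arm
# gap is never captured at all.
--- End code ---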

tggzzz:

--- Quote from: nctnico on April 19, 2023, 10:11:53 pm ---
--- Quote from: tggzzz on April 18, 2023, 08:01:14 am ---This whole discussion appears to be ending up down a traditional rathole, equivalent to trying to use a hammer to insert a screw.

Use a scope to ensure signal integrity, i.e. that the analogue waveforms will be correctly interpreted as digital signals.

Then flip to the digital domain, and use a logic analyser's superior clocking, triggering and filtering to discard (i.e. don't store) the irrelevant crap.

--- End quote ---
In theory. But in practice you don't always know what you are looking for, so it can be helpful to capture a lot of messages on an oscilloscope, save them to disk for analysis on a PC, and go back to the scope to inspect the waveform for the malformed packet.

--- End quote ---

What do you mean by "malformed packet"?

If the waveform shape is malformed, then that is a signal integrity issue and a scope is the only tool to use. Eye diagrams are extremely helpful here - but you must be able to prevent a digitising scope from interpolating between samples.
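
The folding behind an eye diagram is simple enough to sketch offline. Below is a minimal Python example using a synthetic NRZ waveform; the sample rate, bit rate, and noise level are all assumptions, and the generated record stands in for a real capture:

--- Code: ---import numpy as np
import matplotlib.pyplot as plt

fs = 1e9                    # sample rate, Hz (assumed)
bit_rate = 10e6             # data rate, bit/s (assumed)
spb = int(fs // bit_rate)   # samples per bit (one unit interval)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 200)
v = np.repeat(bits * 2.0 - 1.0, spb)       # ideal NRZ waveform, +/-1 V
v += 0.05 * rng.standard_normal(v.size)    # a little additive noise

# Fold the record at two unit intervals per trace and overlay everything.
trace_len = 2 * spb
t_ns = np.arange(trace_len) / fs * 1e9
for k in range(v.size // trace_len):
    plt.plot(t_ns, v[k * trace_len:(k + 1) * trace_len], "b", alpha=0.1)
plt.xlabel("time (ns)")
plt.ylabel("volts")
plt.title("eye diagram, 2 UI per trace")
plt.show()
--- End code ---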

If the data within the packet is malformed, then ignore the scope and capture many bits/packets with a logic analyser, protocol analyser, or printf() statements. If convenient, use a PC + application of your choice.

Don't forget that if you are using a scope to interpret an analogue waveform as a digital signal/message, it may have a different interpretation to the receiving device.

tggzzz:

--- Quote from: nctnico on April 19, 2023, 10:20:50 pm ---
--- Quote from: 2N3055 on April 17, 2023, 12:35:35 pm ---So you would have a trigger on every packet, and its waveform would be shown on screen in real time. And decoding would skip some decodes in a burst and show you only the last one. Instead of an unrecognizable blur and then the last one... Same difference...

--- End quote ---
In some cases it is useful to look at the decoded data in near real time. Think about checking which bits change every now and then when reverse engineering a protocol.

--- End quote ---

If you are looking for changes in bits, then use a digital domain tool (LA, protocol analyser, printf()) to capture the bits, and a PC application to show diffs.

A scope is a non-optimum tool for that, even if it can be bent to the purpose.
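
That last step is easy to sketch: XOR successive captured words and list the bits that changed. A minimal Python example (the packet values are made up for illustration):

--- Code: ---# Show which bits changed between successive captured words.
# Packet values below are invented for illustration.
packets = [0b10110010, 0b10110110, 0b10110110, 0b00110110]

for prev, cur in zip(packets, packets[1:]):
    changed = prev ^ cur
    bits = [i for i in range(8) if (changed >> i) & 1]
    print(f"{prev:08b} -> {cur:08b}  changed bits: {bits if bits else 'none'}")
--- End code ---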

Someone:

--- Quote from: 2N3055 on April 20, 2023, 07:54:56 am ---
--- Quote from: Someone on April 19, 2023, 11:07:30 pm ---
--- Quote from: 2N3055 on April 19, 2023, 10:53:29 pm ---You can have a scenario where you have thousands of packets per second and a really fast scope that will render all of those (all 1000) to the screen in real time, keeping each on screen for 1 ms... That would obviously be useless because I can't read that fast.
--- End quote ---
Immediately you have taken the same strawman trope that has been explained over and over.

Modern scopes have persistence in their display, even when set to "zero" persistence, but it can be increased as desired to infinity. Even if the trace was only visible for a single display frame (typically a broadcast rate such as 30 Hz, 50 Hz, or 60 Hz) the eye would still retain the event by visual persistence.

As nctnico mentions above, a serial trigger can focus down onto one part/transaction of a system (such as filtering on SPI addresses) and show what states the data bits took across many words/packets. In real time, continuously, but with possibly some data lost to the dead time.


--- Quote from: 2N3055 on April 19, 2023, 10:53:29 pm ---Or I'm interested only in messages sent by one of the sensors... In which case I need the scope to TRIGGER only on certain messages (set by the trigger condition) and ignore all the rest. And if my sensor of interest sends data only once a second (or when I press a key or something), then I leave the scope running and wait for it to catch a message and show it on the screen. This is interactive work, this is real time, but only if the message rate is slow enough for a slow human operator.
--- End quote ---
So, now it's something "very" slow (from a scope perspective). Take a scope without a hardware serial trigger: how does it find that interesting message amongst all the others? It has a larger blind time, the problem at hand.

--- End quote ---

And again, you read what you want to see, so you can keep arguing...

There is no such thing as a software serial trigger. Get your terminology right.
--- End quote ---
There certainly is: various scopes slow down their acquisition rate when using serial trigger/decode, including big-$ scopes from A-brands.


--- Quote from: 2N3055 on April 20, 2023, 07:54:56 am ---In real time, the screen refreshes fully with every trigger and decoded messages overwrite each other...
I said 1000 messages a second, one every 1 ms, overwriting each other on the same screen... Persistence doesn't apply to decode. For decode you have Normal trigger mode, which keeps the last message/screen up until a new trigger comes and a new capture lands on screen, on top.
--- End quote ---
As tggzzz says, why use a scope if all you wanted to see was the decoded data? Use something more appropriate that doesn't have dead time, like a USB-to-XXX serial adaptor or a protocol analyser. The scope is more appropriate when you want to see some analog characteristic of the signal correlated to the data, be that timing, collisions, or coincident signals like power. All of those do show in the persistence (as does the presence or absence of 0s and 1s for each bit, which may be all the information required: confirming aligned addresses, or correct data).
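
For reference, a minimal pyserial sketch of that dead-time-free capture path; the port name, baud rate, and output file are assumptions to adjust for your adaptor:

--- Code: ---# Gap-free serial capture to disk (pip install pyserial).
# Port name and baud rate are assumptions.
import serial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as ser, \
        open("capture.bin", "wb") as log:
    while True:
        chunk = ser.read(4096)   # returns whatever arrived within the timeout
        if chunk:
            log.write(chunk)     # every byte lands on disk; no re-arm gap
--- End code ---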


--- Quote from: 2N3055 on April 20, 2023, 07:54:56 am ---As for missing packets in real time: if the scope has hardware serial triggers and you miss packets on the screen, it won't be because the trigger missed them, but because a fast burst of 3-4 packets arrived within 100 µs and you could only see the last one; the previous three went by in such a hurry you didn't see them.
--- End quote ---
Finally you are actually getting to the point: the dead time can miss valid triggers. Scopes can run segmented or waveform-history modes to collect such bursts for inspection, but they are still limited by finite dead time... which can be a different length when the serial trigger or decode is enabled.
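
Illustrative arithmetic for why segmented capture helps without eliminating the problem (all numbers are assumptions):

--- Code: ---# Segmented memory re-arms far faster than a full acquisition,
# so bursts are caught, but the inter-segment gap is still dead time.
segments = 1000
segment_window_s = 100e-6    # capture window per segment (assumed)
rearm_gap_s = 2e-6           # inter-segment dead time (assumed)

span = segments * (segment_window_s + rearm_gap_s)
blind = segments * rearm_gap_s
print(f"covers {span * 1e3:.0f} ms of burst activity")     # ~102 ms
print(f"of which {blind * 1e3:.0f} ms falls in the gaps")  # 2 ms
--- End code ---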

2N3055:

--- Quote from: tggzzz on April 20, 2023, 09:22:08 am ---
--- Quote from: nctnico on April 19, 2023, 10:20:50 pm ---
--- Quote from: 2N3055 on April 17, 2023, 12:35:35 pm ---So you would have a trigger on every packet, and its waveform would be shown on screen in real time. And decoding would skip some decodes in a burst and show you only the last one. Instead of an unrecognizable blur and then the last one... Same difference...

--- End quote ---
In some cases it is useful to look at the decoded data in near real time. Think about checking which bits change every now and then when reverse engineering a protocol.

--- End quote ---

If you are looking for changes in bits, then use a digital domain tool (LA, protocol analyser, printf()) to capture the bits, and a PC application to show diffs.

A scope is a non-optimum tool for that, even if it can be bent to the purpose.

--- End quote ---

I agree, but the premise of the question is using a scope for that... not what the alternatives are.

I did agree with you many messages ago that you can even use an analog CRT scope for SI and some sort of LA for the actual decoding. That's how it was done for ages...
When you don't want to look at lots of decoded data, a digital scope with decoding is useful because you can do both at the same time. It saves time and fiddling with connections...

But I don't want to look at 1000 messages on a scope screen. Even super expensive scopes with HUGE 15" :-DD screens are a joke compared to the screen on a PC... So for that job, for me it's an MSO Pico or an LA... or the occasional print to a UART. For stuff I'm making, I bridge the Pico's trigger deficiencies by toggling a µC pin at critical points and triggering off that. Kind of "breakpoints" for the scope, as in the sketch below...
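
A minimal MicroPython sketch of that pin-toggle "breakpoint" trick; the pin number and the instrumented section are illustrative assumptions:

--- Code: ---# Raise a spare GPIO at the code point of interest and trigger the
# scope on its rising edge. Pin number is an assumption.
from machine import Pin
import time

dbg = Pin(2, Pin.OUT, value=0)   # spare pin routed to a scope channel

def handle_sensor_packet(packet):
    dbg.on()                     # scope triggers here (rising edge)
    time.sleep_us(50)            # stand-in for the real decode work
    dbg.off()                    # pulse width = time spent in this section

handle_sensor_packet(b"\x55")
--- End code ---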
