This secondary buffer is all the main CPU has direct access to, not the original sample data. This is the reason most calculations are done "on screen".
It's not impossible to process the whole of sample memory, but it would either have to be done in batches on the main CPU or directly inside the FPGA/ASIC.
Utter nonsense!
And you believe that is the only way to make a digital oscilloscope?
There are multiple different architectures that can and have been used in digital scopes. They all have advantages and disadvantages; some are geared toward a specific use case while others are more general purpose. Which is "better" is a subjective matter.
Unfortunately that doesn't always work. Sometimes it is a specific message which can cause a device to misbehave every now and then. At some point I had to deal with a problem where a message went wrong once every 30 minutes to an hour.

Was the problem in the analog domain?

Yes. A logic analyser would not have caught this problem:
What happens here is that an I2C device is stretching the clock to make the I2C master wait, but the I2C master doesn't care.
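To make that failure mode concrete, here's a rough sketch (my own simplified model, not any scope's actual decoder) of how you could flag it in a digitized capture: look for SCL held low far longer than a nominal bit time while SDA keeps toggling, i.e. the master clocking data out right through the slave's stretch.

```python
# Simplified sketch: flag I2C clock-stretch violations in digitized samples.
# Assumptions (mine, for illustration): scl and sda are lists of 0/1 samples
# at a fixed rate; a slave stretch shows up as SCL held low much longer than
# the nominal low time; a master that ignores the stretch keeps toggling SDA
# during that low period.

def find_stretch_violations(scl, sda, nominal_low_samples):
    violations = []
    i = 0
    while i < len(scl):
        if scl[i] == 0:                      # start of an SCL low phase
            j = i
            while j < len(scl) and scl[j] == 0:
                j += 1
            low_len = j - i
            # Count SDA edges while SCL was held low
            sda_edges = sum(1 for k in range(i + 1, j) if sda[k] != sda[k - 1])
            # Stretched low + master still clocking data out = trouble
            if low_len > 3 * nominal_low_samples and sda_edges > 1:
                violations.append((i, low_len, sda_edges))
            i = j
        else:
            i += 1
    return violations
```

With synthetic samples (a normal low phase, then a long stretched one with SDA toggling through it), the stretched phase gets reported with its start index, length, and edge count; a real implementation would of course have to cope with noise and glitch filtering.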
IMO if it's going to bother to do protocol decoding, it ought to do so on more than what is displayed on the screen at once.
Sorry if I missed it, but what model Agilent scope made the decoder view in the image you posted? Thx
A DSO7104A (but it has moved on to a new owner).
OTOH oscilloscopes aren't really designed internally for this sort of programming pattern, and I can see why they don't do it.
It depends entirely on whether or not the acquisition memory is available when the scope is stopped. Frankly, it almost has to be, because you can zoom and pan on that same data.
Nope. It has to be done in hardware because of the enormous amount of data.
The acquisition memory will be on one bus. Data is fed to it via DMA direct from the ADC.
Also connected to that RAM is some sort of FPGA/ASIC which takes data from there and copies it to another buffer, scaled to fit the number of pixels visible on screen (depending on your timebase). It will also do the sin(x)/x interpolation, etc. There's no way this scaling/interpolation could be done in software on a cheap CPU; data can be coming in at 1 GS/s even on low-end oscilloscopes.
This secondary buffer is all the main CPU has direct access to, not the original sample data. This is the reason most calculations are done "on screen".
It's not impossible to process the whole of sample memory, but it would either have to be done in batches on the main CPU or directly inside the FPGA/ASIC.
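The scaling step described above can be sketched in a few lines (a toy software model of what the FPGA/ASIC does, with names of my own choosing; real hardware does this in a streaming pipeline, plus sin(x)/x interpolation when you're zoomed in far enough that there are more pixels than samples):

```python
# Toy model of the display-scaling step: reduce a deep acquisition buffer to
# one (min, max) pair per visible pixel column, the way a scope's FPGA/ASIC
# decimates for display.  This only covers the zoomed-out (decimating) case
# and assumes len(samples) >= columns; when zoomed in, real scopes instead
# interpolate (e.g. sin(x)/x) between samples.

def decimate_for_display(samples, columns):
    n = len(samples)
    out = []
    for c in range(columns):
        lo = c * n // columns            # first sample for this pixel column
        hi = (c + 1) * n // columns      # one past the last sample
        chunk = samples[lo:hi]
        out.append((min(chunk), max(chunk)))
    return out
```

The point of keeping min *and* max per column is that narrow glitches stay visible even after millions of samples are squeezed into a thousand-odd pixels.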
No idea, but since we're talking about software decoding, there isn't anything in principle or in practice (that I'm aware of) that would prevent it.
Apart from (e.g.) trying to access the 24MB of sample memory in 1200 byte chunks (on a DS1054Z).
In principle it could be done - they did something like it for the DS1054Z's improved FFT.
In practice it doesn't seem like anybody is. Not for serial decoding.
I'm thinking it would be too slow.
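For what it's worth, the bookkeeping for that kind of batched readout is simple enough (hypothetical helper of my own; the 1200-point figure is from the post above, and real window limits vary by scope and readout mode):

```python
# Bookkeeping for reading deep sample memory in fixed-size windows, e.g. the
# 1200-point chunks mentioned above for a DS1054Z.  Returns 1-based inclusive
# (start, stop) pairs of the kind a SCPI waveform readout loop would use.
# (Illustrative helper, not taken from any real scope driver.)

def read_windows(total_points, window):
    windows = []
    start = 1                       # SCPI waveform indices are 1-based
    while start <= total_points:
        stop = min(start + window - 1, total_points)
        windows.append((start, stop))
        start = stop + 1
    return windows
```

For a 24 Mpt capture in 1200-point windows that's 20,000 round trips, which is exactly why doing it over the remote interface is slow - the per-query overhead dominates, not the data itself.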
Yes, the decode could be done in the FPGA/ASIC but that assumes there's enough gates left over and they bothered to do it.
it only has to be done once.
...assuming there's enough spare RAM to store all the decoded data along with its position on the timeline, etc.
How much serial data + position info could 24M samples actually decode to? What's the maximum?
Data is one byte; position would need 3 bytes. I'm thinking it could be a megabyte or more of data in total if you set the timebase to worst case.
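A quick back-of-envelope along those lines (assumptions are mine, for illustration: ~9 SCL periods per I2C data byte including the ACK bit, at least ~10 samples per bit for a usable capture, and 4 bytes per decoded record as suggested above):

```python
# Back-of-envelope for how much decoded data a deep capture could produce.
# Assumptions (mine): I2C needs ~9 SCL periods per data byte (8 bits + ACK),
# a usable capture has at least ~10 samples per bit, and each decoded record
# stores 1 data byte + a 3-byte sample position.

def worst_case_decode_bytes(sample_depth, samples_per_bit=10, bits_per_byte=9,
                            record_size=4):
    max_bytes_on_wire = sample_depth // (samples_per_bit * bits_per_byte)
    return max_bytes_on_wire * record_size

# 24 Mpts of samples works out to roughly 267k bytes on the wire,
# so around 1 MB of decoded records at worst case.
```

So "a megabyte or more" looks about right for 24 Mpts, and it scales down fast if you sample each bit more finely.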
I'm sure.
The only point I'm trying to make is that the decoding CPU doesn't necessarily have access to the sample data, and that doing a full-memory serial decode on a device that isn't designed for it could be a programming nightmare.
I believe it is changed in the Tek MDO3000 scopes, where decode and search occur across the whole sample memory and not just what's on the screen; that is why the Tek will take some time to complete decodes after a new trigger.
IMHO this whole battle DSO vs DMO (data mining oscilloscope) is weird.
I get the feeling that if the 'scope manufacturers give people a full memory decode they'll just start saying things like, "Wahhh! It's such a pain to have to scroll through the data by twisting a knob in zoom mode, why can't we have a vertical list view? CSV dump? Automatic search for bad signals?"
IMHO Pico should change their UI so it can work with a touch-screen and create an Android version so you can connect your scope to a tablet. I have used a Picoscope a couple of times many years ago and it didn't inspire me to buy one. Needing to mess around with the mouse to adjust something made using it way too tedious.
My GW Instek GDS-2204E can do all that, so I'm not complaining, and I'm sure it is not the only oscilloscope with such features.