A High-Performance Open Source Oscilloscope: development log & future ideas

nctnico:
When it comes to filtering, it may be better to do this as a (first) post-processing step before any other operation. From my experience it is useful to be able to adjust the filtering on existing acquisition data (GW Instek does this). Care must be taken though to avoid start-up issues and to use real data. The R&S RTM3004, for example, filters decimated data and doesn't take filter initialisation into account, leading to weird behaviour and thus limiting the usefulness of filtering.

Averaging is another interesting case. One of the problems is that ideally you'd save the averaged data so you can scroll left/right and zoom in/out. On some oscilloscopes (again the R&S RTM3004) the averaged trace disappears if you move the trace. I second the suggestion to be able to combine acquisition modes, but at some point you'll be creating a new trace in CPU memory and using the acquisition data only to update that trace.

gf:

--- Quote from: nctnico on December 18, 2020, 07:22:21 pm ---When it comes to filtering, it may be better to do this as a (first) post-processing step before any other operation. From my experience it is useful to be able to adjust the filtering on existing acquisition data (GW Instek does this).
--- End quote ---

For pre-decimation filtering this would imply that the data would need to be stored at the full sampling rate, so that the first processing steps can filter and decimate them. This likely defeats the purpose (lower memory usage) for which a lower sampling rate than the maximum was selected.

For any other kind of filtering which does not need to be done prior to decimation, I'm basically with you.
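To make the distinction concrete, here is a minimal sketch (not from the thread, and deliberately simplified) of decimation with and without a preceding anti-alias filter. A crude boxcar FIR stands in for a proper low-pass; the point is only that the filtered variant needs the full-rate samples as input, which is why it cannot be redone later on data already stored at the decimated rate.

```python
def boxcar_filter(samples, ntaps):
    """Moving-average FIR as a stand-in anti-alias low-pass filter."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - ntaps + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def decimate(samples, factor, prefilter=True, ntaps=4):
    """Keep every `factor`-th sample, optionally low-pass filtering first.

    With prefilter=False an alternating full-scale signal aliases into
    whatever phase the kept samples happen to fall on; with the filter,
    the decimated output settles to the signal's mean instead.
    """
    if prefilter:
        samples = boxcar_filter(samples, ntaps)
    return samples[::factor]
```

For example, decimating the alternating sequence `[0, 1, 0, 1, ...]` by 2 without filtering yields all zeros (pure aliasing), while the filtered version settles to 0.5, the true average.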


--- Quote ---Care must be taken though to avoid start-up issues and to use real data. The R&S RTM3004, for example, filters decimated data and doesn't take filter initialisation into account, leading to weird behaviour and thus limiting the usefulness of filtering.
--- End quote ---

That's a general issue when filtering is done in post-processing on the stored data, where only a set of records, but no continuous stream of data, is available. But where should the initial filter state come from? Do you want to ask the user to enter initial values for all state variables of the filter? Like: "Please enter the values of the 199 samples preceding the captured buffer" (for a 200-tap FIR filter). Another alternative could be to simply discard the samples falling into the fade-in/fade-out interval of the filter, thereby reducing the record size, of course.

EDIT:


--- Quote ---Averaging is another interesting case. One of the problems is that ideally you'd save the averaged data so you can scroll left/right and zoom in/out. On some oscilloscopes (again the R&S RTM3004) the averaged trace disappears if you move the trace. I second the suggestion to be able to combine acquisition modes, but at some point you'll be creating a new trace in CPU memory and using the acquisition data only to update that trace.
--- End quote ---

The question is at which waveform rate the averaged data are supposed to be recorded.

(1) At full trigger rate (storing a moving average)?
(2) At 1/N of the trigger rate (storing only a single averaged buffer after acquiring N triggers)?

In case (1) the acquisition engine could equally well store just the triggered waveforms, and the processing engine could average them.

In case (2) the averaging would need to be done in a pipeline stage of the acquisition engine.
This mode saves memory, but at the cost of a lower waveform rate.
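The two cases can be contrasted with a small sketch (illustrative only; buffer layouts and names are made up). Case (1) keeps the last N acquisitions around and emits an average after every trigger; case (2) only keeps a single accumulator and emits one buffer per N triggers:

```python
def moving_average(buffers, n):
    """Case (1): emit an average over the last N acquisitions after
    every trigger -- full waveform rate, but N buffers must be kept."""
    history = []
    for buf in buffers:
        history.append(buf)
        if len(history) > n:
            history.pop(0)
        yield [sum(col) / len(history) for col in zip(*history)]

def block_average(buffers, n):
    """Case (2): accumulate N acquisitions, then emit one averaged
    buffer -- 1/N of the waveform rate, but only one accumulator."""
    acc, count = None, 0
    for buf in buffers:
        acc = buf[:] if acc is None else [a + b for a, b in zip(acc, buf)]
        count += 1
        if count == n:
            yield [a / n for a in acc]
            acc, count = None, 0
```

Case (1) is what the processing engine could do on plain stored waveforms; case (2) is what would have to live in the acquisition pipeline to realise the memory saving.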

nctnico:
Skipping 1000 samples at the beginning and another 1000 at the end of a 100 Mpts record is something nobody will notice.

gf:

--- Quote from: nctnico on December 18, 2020, 08:26:09 pm ---Skipping 1000 samples at the beginning and another 1000 at the end of a 100Mpts long record is something nobody will notice.

--- End quote ---

Agreed, no problem for sufficiently long records, provided that the buffer management does not impose incompatible record length constraints (e.g. all record lengths must be a power of two, or all records must have the same fixed size, ...).

gf:

--- Quote from: tom66 on December 18, 2020, 05:31:53 pm ---Normal - Applies a downsampling filter* when sampling rate is under the normal ADC rate, otherwise, does not filter
Decimate - Applies only decimation, i.e. dropping samples when sampling rate is under the normal ADC rate,  otherwise identical to

--- End quote ---

I would actually tend to use the name "Normal" for the non-filtered mode.
[ Sure, names signify nothing - it's just my personal preference. I wonder how others think about it. ]
