downsampling a signal: blockwise average: danger of aliasing?
herc:
dear forum members, i have a question:
i want to downsample a 90 Hz signal quite a bit, down to 1 Hz. if i use simple blockwise averaging, am i risking aliasing?
i guess that taking the average of a 90-frame-wide block is already a form of the lowpass filtering that is required to avoid aliasing?
but i simply cannot find any further details on how blockwise averaging compares to the more sophisticated signal decimation methods, like the one used here:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.signal.decimate.html#scipy.signal.decimate
i don't have a good feeling about signal.decimate, because this statement in the docs irritates me: "The downsampling factor. When using IIR downsampling, it is recommended to call decimate multiple times for downsampling factors higher than 13." this is vague. how many times am i expected to call decimate? should i rather use FIR filtering? is FIR filtering also problematic with the strong downsampling i want to achieve?
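for concreteness, by blockwise averaging i mean something like this minimal numpy sketch (the signal here is just a placeholder):

import numpy as np

fs_in, fs_out = 90, 1             # input and target sample rates (Hz)
block = fs_in // fs_out           # 90 input samples per output sample

x = np.random.randn(fs_in * 600)  # placeholder for the recorded 90 Hz stream

# trim the tail so the length is a multiple of the block size,
# then average every 90-sample block down to one output sample
n_blocks = len(x) // block
y = x[:n_blocks * block].reshape(n_blocks, block).mean(axis=1)  # 1 Hz result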
any opinions?
ataradov:
With block-wise averaging you will see aliasing; it is not that different from just picking points at fixed intervals. A moving average is a form of FIR filter, so it would help prevent aliasing.
Almost all decimation implementations recommend not going more than about a factor of 10 in one pass; otherwise your filter size becomes huge. In your case, decimate by 10 and then by 9 and you get an overall decimation factor of 90.
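Roughly like this with scipy (just a sketch; the input array is a placeholder and the filter settings are scipy's defaults):

import numpy as np
from scipy import signal

fs = 90
x = np.random.randn(fs * 600)                # placeholder 90 Hz recording

# two stages instead of a single /90: 90 Hz -> 9 Hz -> 1 Hz;
# zero_phase=True filters forward and backward, so there is no net filter delay
y = signal.decimate(x, 10, zero_phase=True)
y = signal.decimate(y, 9, zero_phase=True)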
As far as efficient implementations go, cascaded integrator-comb (CIC) decimators are unbeatable: integer-only math, and the decimator/interpolator comes with an integrated filter. A CIC is itself a form of moving average, just implemented efficiently.
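A toy numpy model of that structure, just to show the integrator / decimate / comb shape (3 stages and differential delay 1 assumed; not a production fixed-point implementation):

import numpy as np

def cic_decimate(x, R, N=3):
    # minimal model of an N-stage CIC decimator with differential delay 1;
    # a hardware version picks a word width so that intermediate wraparound
    # cancels in the comb stages
    x = np.asarray(x, dtype=np.int64)

    # integrator section, running at the input rate
    for _ in range(N):
        x = np.cumsum(x)

    # decimate by R
    x = x[::R]

    # comb section, running at the output rate
    for _ in range(N):
        x = np.diff(x, prepend=0)

    # the cascade has a DC gain of R**N; divide it out to return to input scale
    return x / R ** N

# example: integer 90 Hz samples straight down to 1 Hz
x = (1000 * np.sin(2 * np.pi * 0.2 * np.arange(90 * 600) / 90)).astype(np.int64)
y = cic_decimate(x, R=90, N=3)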
herc:
thank you very much! so i will apply, as you suggested, two steps of decimation using scipy.signal.decimate.
i also found this document:
https://www.analog.com/media/en/technical-documentation/dsp-book/dsp_book_Ch15.pdf
which states that the moving-average FIR filter is indeed not a good lowpass filter, but a good noise filter that preserves the step response:
"Not only is the moving average filter very good for many applications, it is optimal for a common problem, reducing random white noise while keeping the sharpest step response. ..... In short, the moving average is an exceptionally good smoothing filter (the action in the time domain), but an exceptionally bad low-pass filter (the action in the frequency domain)."
Siwastaja:
Aliasing is always a problem; there is no way to completely prevent it. It's just a question of how much of it happens and whether it matters.
I recommend you get Matlab or Octave so you can build IIR/FIR filters and plot their response with the freqz function. Try a moving-average IIR, or a simple 1 1 1 1 1 1 1 1 coefficient box FIR first, to see what the attenuation is at half of the new (decimated) sample rate (the Nyquist limit for the output signal). Then compare to the more optimal FIR coefficients you can design with that software, the simplest way being basically to call the fir1 function to design the filter for you. You'll see huge differences in stopband attenuation.
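If you would rather stay in Python, roughly the same experiment looks like this with scipy (signal.firwin stands in for fir1; the tap count and cutoff are arbitrary example choices for a one-shot /90 decimation):

import numpy as np
from scipy import signal

fs = 90.0                                   # input sample rate (Hz)
f_nyq_out = 0.5                             # Nyquist of the 1 Hz output

# the "1 1 1 ... 1" box FIR: a 90-point moving average with unit DC gain
box = np.ones(90) / 90

# a designed lowpass, analogous to fir1; it needs to be long because a
# one-shot /90 decimation requires a very narrow transition band
lp = signal.firwin(2001, 0.45, fs=fs)

for name, taps in [("90-pt boxcar", box), ("firwin, 2001 taps", lp)]:
    w, h = signal.freqz(taps, worN=1 << 16, fs=fs)
    # worst-case gain for content a bit above the output Nyquist, i.e. the
    # part of the spectrum that folds back into 0..0.5 Hz after decimation
    alias = np.abs(h[w >= f_nyq_out + 0.05]).max()
    print(f"{name}: worst alias rejection {20 * np.log10(alias):.1f} dB")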
OTOH, the question often is: what signal is actually aliasing? Are you just preventing low levels of noise from aliasing, or do you have a strong offending signal outside the band (maybe much stronger than the in-band signal of interest)? Sometimes -10 dB of stopband attenuation is more than fine; sometimes -100 dB isn't enough.
As for performance: with just a 90 Hz signal, so assuming you are not sampling at more than a few hundred Hz, then unless you are running a tiny 8-bit CPU at 1 MHz you probably have plenty of headroom to run a decent FIR with a lot of taps if you need to; this would be easy to implement. The upside of a FIR is a consistent and predictable group delay (half the number of taps).
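For a symmetric (linear-phase) FIR that delay is exactly (ntaps - 1) / 2 samples, so the output is easy to line back up with the input; a small scipy sketch with arbitrary filter parameters:

import numpy as np
from scipy import signal

fs = 90.0
taps = signal.firwin(181, 0.45, fs=fs)   # arbitrary linear-phase lowpass
delay = (len(taps) - 1) // 2             # group delay in samples: 90

x = np.random.randn(90 * 600)            # placeholder 90 Hz stream
y = signal.lfilter(taps, 1.0, x)

# shift by the group delay so y_aligned[n] lines up with x[n]
y_aligned = y[delay:]
x_aligned = x[:len(y_aligned)]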
The question really, as always, is, what are you actually doing?
herc:
@Siwastaja: thank you for your detailed and knowledgeable answer.
i am training a neural network on several data streams (for example steering wheel angle, pedal values, current gear, etc.) from a vehicle simulator. for the time being this happens offline from a prerecorded dataset, so realtime requirements are not important for now. i want to predict driver expertise from these data streams over a sliding window of 50 seconds. this already works with the simple blockwise average; i just wonder if the results will improve with better downsampling that minimizes aliasing. i will try out and compare the different methods and let you know which works best for training the network.