Hi!
I'm working on a project where I measure the concentration of carbon dioxide gas with an infrared sensor. I need to find the optimal averaging time for the signal, i.e. the one that minimizes the Allan deviation. I have code that calculates the (overlapping) Allan variance using this formula:
sigma^2 = 1/(2*n^2*tau^2*(N - 2*n)) * sum_(i=0 to N-2*n-1) (x_(i+2*n) - 2*x_(i+n) + x_i)^2,
where N is the number of data points, n is the averaging multiplier, tau is the base sampling interval (so the averaging time is n*tau), and x_i is the i-th phase data point.
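For reference, here is a minimal NumPy sketch of how I understand that formula (the function name and input handling are my own; it assumes x is already a phase series and tau is the base sampling interval):

```python
import numpy as np

def overlapping_allan_variance(x, n, tau):
    """Overlapping Allan variance at averaging multiplier n.

    x   : 1-D sequence of phase data points
    n   : averaging multiplier (averaging time = n * tau)
    tau : base sampling interval
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    if N < 2 * n + 1:
        raise ValueError("need at least 2*n + 1 data points")
    # Second differences x[i+2n] - 2*x[i+n] + x[i] for i = 0 .. N-2n-1
    d = x[2 * n:] - 2.0 * x[n:-n] + x[:-2 * n]
    return np.sum(d ** 2) / (2.0 * n ** 2 * tau ** 2 * (N - 2 * n))
```

For example, a linear ramp in x gives zero second differences and hence zero Allan variance, which is a quick sanity check.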
However, I'm not sure what I should use as phase data. The signal is a time series of ppm values - should I transform it into radians somehow, like defining the minimum as 0 and the maximum as 2pi? Or could the raw data be passed as "phase"?