I have a large data set of magnitudes varying over time (roughly a sum of sine and cosine components). I want to apply a high-pass filter to the data, but I'm not sure how to calculate the time constant. I produce a power spectrum of the data that I can inspect visually to decide where I want the cutoff to occur (typically between 2 and 10 μHz). Here is how I am implementing the algorithm; I just need a way to calculate the time constant.

Code (Text):

    dt = (time[i] - time[i-1]) * 86400    ; converting from days to seconds
    timeConstant = ?
    alpha = timeConstant / (timeConstant + dt)
    for i = 1 to sizeOfData
        filtered[i] = alpha * (filtered[i-1] + mag[i] - mag[i-1])
    end

Can anyone help?
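For reference, here is a minimal sketch of the discrete first-order RC high-pass filter described above, assuming the standard relation tau = 1 / (2*pi*f_c) between the time constant and the -3 dB cutoff frequency; the function name `highpass` and the default cutoff of 5 μHz are hypothetical choices, not from the original post.

```python
import numpy as np

def highpass(time_days, mag, cutoff_hz=5e-6):
    """First-order RC high-pass filter for unevenly sampled data.

    time_days : sample times in days (converted to seconds internally)
    mag       : magnitudes at those times
    cutoff_hz : assumed -3 dB cutoff frequency in Hz
    """
    t = np.asarray(time_days, dtype=float) * 86400.0  # days -> seconds
    x = np.asarray(mag, dtype=float)
    # Time constant from the chosen cutoff frequency
    tau = 1.0 / (2.0 * np.pi * cutoff_hz)
    y = np.empty_like(x)
    y[0] = x[0]
    for i in range(1, len(x)):
        dt = t[i] - t[i - 1]              # per-sample spacing, in seconds
        alpha = tau / (tau + dt)
        # Standard discrete high-pass recurrence
        y[i] = alpha * (y[i - 1] + x[i] - x[i - 1])
    return y
```

Because alpha is recomputed from each sample spacing, the sketch also handles mildly irregular sampling; for a constant input the output decays toward zero, as a high-pass filter should.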