Approximate new autocorrelation given previous autocorrelation and a new set of data?

AI Thread Summary
The discussion focuses on approximating the autocorrelation coefficient ##\rho## when new data is introduced without recalculating from scratch. It highlights the need to multiply out terms in the autocorrelation formula to facilitate the approximation. By separating the sums of the terms, a new expression for ##\rho_k## is derived that incorporates the new data. The conversation emphasizes the importance of maintaining running sums for efficient computation. Ultimately, the approach requires careful coding and thorough testing to ensure accuracy.
member 428835
Hi PF!

The autocorrelation coefficient ##\rho## is defined as $$\rho_k \equiv \frac{\sum_{t=k+1}^T (x_t - \bar x)(x_{t-k} - \bar x)}{\sum_{t=1}^T(x_t-\bar x)^2}$$

Now suppose we calculate ##\rho## through ##T##, but are then given a new data point at time ##T + \Delta t##. Is there a way to approximate the new autocorrelation without recalculating from scratch? Obviously if we use the definition we'd have to recalculate the mean ##\bar x## and therefore redo the entire computation.
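For reference, a minimal from-scratch implementation of the definition above might look like the following (the function name and the use of NumPy are illustrative assumptions, not part of the original post); this is the computation one would like to avoid repeating for every new point.
[code]
import numpy as np

def rho_naive(x, k):
    # Autocorrelation coefficient at lag k, computed directly from the definition.
    x = np.asarray(x, dtype=float)
    T = len(x)
    xbar = x.mean()
    num = np.sum((x[k:] - xbar) * (x[:T - k] - xbar))  # sum over t = k+1 .. T
    den = np.sum((x - xbar) ** 2)                       # sum over t = 1 .. T
    return num / den
[/code]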
 
joshmccraney said:
Is there a way to approximate the new autocorrelation without recalculating from scratch?
Yes, can you multiply out the terms in ## (x_t - \bar x)(x_{t-k} - \bar x) ## and ## (x_t-\bar x)^2 ##?
 
pbuk said:
Yes, can you multiply out the terms in ## (x_t - \bar x)(x_{t-k} - \bar x) ## and ## (x_t-\bar x)^2 ##?
##\bar x^2 - \bar x x_{t-k}- \bar x x_{t} + x_t x_{t-k}## and ##\bar x^2 -2 \bar x x_{t} + x_t ^2##
 
joshmccraney said:
##\bar x^2 - \bar x x_{t-k}- \bar x x_{t} + x_t x_{t-k}## and ##\bar x^2 -2 \bar x x_{t} + x_t ^2##
Excellent, now you can sum over the terms separately:
$$ \rho_k \approx \frac{
\sum_{t=k+1}^T \bar x^2
- \sum_{t=k+1}^T \bar x x_{t-k}
- \sum_{t=k+1}^T \bar x x_{t}
+ \sum_{t=k+1}^T x_t x_{t-k}
}{
\sum_{t=1}^T \bar x^2
- 2 \sum_{t=1}^T \bar x x_{t}
+ \sum_{t=1}^T x_t^2
} $$
and finally rewrite all the ## \bar x ## terms e.g.
$$ \sum_{t=k+1}^T \bar x x_{t-k} = \bar x \sum_{t=k+1}^T x_{t-k} = \frac 1 T \sum_{t=1}^T x_t \sum_{t=k+1}^T x_{t-k} $$
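Collecting the pieces (the shorthand below is added here for clarity and is not in the original post): with the running sums $$S = \sum_{t=1}^T x_t,\quad Q = \sum_{t=1}^T x_t^2,\quad P_k = \sum_{t=k+1}^T x_t x_{t-k},\quad A_k = \sum_{t=k+1}^T x_t,\quad B_k = \sum_{t=k+1}^T x_{t-k},$$ the expression above becomes $$\rho_k \approx \frac{(T-k)\,\bar x^2 - \bar x\,(A_k + B_k) + P_k}{Q - S^2/T}, \qquad \bar x = \frac{S}{T},$$ so a new data point only changes the running sums, not the structure of the formula.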
Then you "just" need to
  1. translate that all into code which keeps track of the running sums (see the sketch below)
  2. test it thoroughly
  3. rewrite it to pick up mistakes in my or your algebra
  4. test it again
Good luck!
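As a starting point for step 1, here is a minimal sketch that keeps the running sums for a single lag ##k \ge 1## (class and variable names are my own choices for illustration, and it assumes ##T > k## before ##\rho_k## is requested):
[code]
from collections import deque

class RunningAutocorr:
    # Maintains the running sums needed for rho_k and updates them in O(1)
    # per new data point.
    def __init__(self, k):
        self.k = k
        self.T = 0                    # number of points seen so far
        self.S = 0.0                  # sum of x_t
        self.Q = 0.0                  # sum of x_t**2
        self.P = 0.0                  # sum of x_t * x_{t-k} for t = k+1 .. T
        self.A = 0.0                  # sum of x_t for t = k+1 .. T
        self.B = 0.0                  # sum of x_{t-k} for t = k+1 .. T
        self.buf = deque(maxlen=k)    # the last k values

    def add(self, x):
        if len(self.buf) == self.k:   # a value k steps back exists
            x_back = self.buf[0]
            self.P += x * x_back
            self.A += x
            self.B += x_back
        self.T += 1
        self.S += x
        self.Q += x * x
        self.buf.append(x)

    def rho(self):
        xbar = self.S / self.T
        num = (self.T - self.k) * xbar**2 - xbar * (self.A + self.B) + self.P
        den = self.Q - self.S**2 / self.T
        return num / den
[/code]
Feeding the same series through add() one point at a time should agree with the from-scratch definition up to floating-point rounding, which gives a convenient check for step 2.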
 