Pere Callahan
Hi,
While teaching myself time series analysis, and ARMA processes in particular, I came across the question of whether two ARMA(p,q) processes
[tex]
\varphi(B)X_t=\theta(B)Z_t \qquad\qquad \tilde \varphi(B)\tilde X_t=\tilde \theta(B)\tilde Z_t
[/tex]
with different autoregressive and/or moving average polynomials would necessarily have different covariance functions.
I know that the covariance function (taking the white noise to have unit variance) is given by
[tex]
\gamma_X(n)=\sum_{j\geq 0}{\psi_j\psi_{j+|n|}}
[/tex]
where
[tex]
\sum_{j\geq 0}{\psi_j z^j}=\psi(z)=\frac{\theta(z)}{\varphi(z)}
[/tex]
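To make this concrete for myself, here is a small numerical sketch (Python/NumPy; the recursion, the truncation length and the normalization are my own choices, not taken from any particular text) of how the [itex]\psi_j[/itex] can be obtained from the two polynomials by power-series division, with the sum for [itex]\gamma_X(n)[/itex] truncated:
[code]
import numpy as np

def psi_weights(phi, theta, n_terms=200):
    """Power-series coefficients psi_j of psi(z) = theta(z)/phi(z).

    phi, theta: polynomial coefficients in increasing powers of z,
    normalized so that phi[0] = theta[0] = 1.
    """
    phi = np.asarray(phi, dtype=float)
    theta = np.asarray(theta, dtype=float)
    psi = np.zeros(n_terms)
    for j in range(n_terms):
        t_j = theta[j] if j < len(theta) else 0.0
        # long division: psi_j = (theta_j - sum_{k>=1} phi_k psi_{j-k}) / phi_0
        acc = t_j
        for k in range(1, min(j, len(phi) - 1) + 1):
            acc -= phi[k] * psi[j - k]
        psi[j] = acc / phi[0]
    return psi

def acovf(phi, theta, max_lag=10, n_terms=200):
    """Truncated gamma_X(n) = sum_{j>=0} psi_j psi_{j+|n|}
    (white noise with unit variance assumed)."""
    psi = psi_weights(phi, theta, n_terms)
    return np.array([psi[: n_terms - n] @ psi[n:] for n in range(max_lag + 1)])
[/code]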
Equating the covariances of [itex]X_t[/itex] and [itex]\tilde X_t[/itex] at lags n=0,1,... gives an infinite number of relations between the [itex]\psi_j[/itex] and the [itex]\tilde \psi_j[/itex]. I was trying to use these relations to show that the coefficients actually coincide, but since the relations are not linear, there seems to be no easy inversion scheme available.
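Using the acovf sketch above, I can at least compare the two covariance functions numerically at the first few lags, e.g. for two arbitrarily chosen (purely illustrative) parameterizations:
[code]
# phi(z) = 1 - 0.5 z, theta(z) = 1 + 0.4 z   (coefficients chosen for illustration only)
print(acovf([1.0, -0.5], [1.0, 0.4], max_lag=5))
# a different hypothetical pair of polynomials
print(acovf([1.0, -0.6], [1.0, 0.3], max_lag=5))
[/code]
But this of course does not settle the general question.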
Any help would be greatly appreciated.
Pere