Hi
I have a stationary Gaussian process { X_t } with correlation function ρ(t) = Corr(X_s, X_(s+t)) = 1 - t/m for t in {1, 2, ..., m}, and I want to simulate this process.
The main idea is to write X_t as the sum of V_1, ..., V_N, then X_(t+1) as the sum of V_(p+1), ..., V_(N+p), and so on, where...
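In case it helps, here is a minimal sketch of that overlapping-sum construction in Python. It assumes the V_i are i.i.d. standard normal, the window length N equals m, and the shift p is 1 per time step, so consecutive windows share m - t terms and Corr(X_s, X_(s+t)) = (m - t)/m = 1 - t/m for t ≤ m. The function name and these parameter choices are my own reading of the construction, not something stated in the post.

```python
import numpy as np

def simulate_moving_sum(m, n_steps, seed=None):
    """Simulate X_t as overlapping sums of m i.i.d. N(0,1) variables.

    Assumes window length N = m and shift p = 1 per step, so that
    Corr(X_s, X_(s+t)) = (m - t)/m = 1 - t/m for 0 <= t <= m.
    (These choices are assumptions, not taken from the original post.)
    """
    rng = np.random.default_rng(seed)
    # Underlying i.i.d. variables V_1, ..., V_{n_steps + m - 1}
    v = rng.standard_normal(n_steps + m - 1)
    # X_t = V_t + V_{t+1} + ... + V_{t+m-1}; divide by sqrt(m) for unit variance
    x = np.convolve(v, np.ones(m), mode="valid") / np.sqrt(m)
    return x

# Quick empirical check of the lag-t correlation against 1 - t/m
x = simulate_moving_sum(m=10, n_steps=200_000, seed=0)
for t in range(1, 5):
    emp = np.corrcoef(x[:-t], x[t:])[0, 1]
    print(t, round(emp, 3), 1 - t / 10)
```

With this choice of window and shift, the empirical lag-t correlations should be close to 1 - t/m for small t, which is the triangular correlation you wrote down.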