Any help will be greatly appreciated.
Suppose that a simple moving average of span N is used to forecast a time series that varies randomly around a constant mean; that is, y_t = m + e_t, where m is the constant mean and e_t is random error with mean zero. At the start of period t1 the process shifts to a new mean level, say m + b...
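To make the setup concrete, here is a minimal simulation sketch in Python/NumPy of how I understand the problem. The specific values of m, b, N, t1, the sample size, and the noise level are illustrative assumptions I picked, not part of the problem statement; the sketch just shows the N-span moving average climbing from m toward m + b over the N periods after the shift.

```python
import numpy as np

# Sketch: simulate a series with a step change in its mean and track how an
# N-span simple moving average responds. The values of m, b, N, t1, T, and
# sigma below are illustrative assumptions, not part of the original problem.
rng = np.random.default_rng(0)

m = 10.0      # original mean level
b = 5.0       # size of the shift in the mean
N = 5         # span of the moving average
t1 = 50       # period at which the mean shifts to m + b
T = 100       # total number of observations
sigma = 1.0   # standard deviation of the random error e_t

# y_t = m + e_t before t1, and y_t = (m + b) + e_t from t1 onward
mean_level = np.where(np.arange(T) < t1, m, m + b)
y = mean_level + rng.normal(0.0, sigma, size=T)

# M_t = (y_t + y_{t-1} + ... + y_{t-N+1}) / N, used as the forecast of the
# next observation. Element i of the result below is M_{i + N - 1}.
moving_avg = np.convolve(y, np.ones(N) / N, mode="valid")

# After the shift, the expected moving average climbs toward m + b as the
# window fills with post-shift observations: for 0 <= k < N periods after
# the shift, E[M_{t1+k}] = m + b * (k + 1) / N, reaching m + b once k >= N - 1.
for k in range(N + 1):
    t = t1 + k
    expected = m + b * min(k + 1, N) / N
    print(f"t = {t}: M_t = {moving_avg[t - (N - 1)]:.2f}, E[M_t] = {expected:.2f}")
```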