Suppose that a simple moving average of span N is used to forecast a time series that varies randomly around a constant mean, so that E[y_t] = m (where m is the mean and y_t is y sub t). At the start of period t1 the process shifts to a new mean level, say m + b. Show that the expected value of the moving average is

m when T <= t1 - 1

m + b - (t1 + N - 1 - T)b/N when t1 <= T <= t1 + N - 2

m + b when T >= t1 + N - 1
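As a sanity check (not part of the original problem), the three cases can be verified numerically by replacing each y_t with its expected value, since the span-N average of E[y_t] equals the expected moving average. The values of m, b, N, and t1 below are arbitrary choices for illustration:

```python
m, b, N, t1 = 10.0, 4.0, 5, 20  # arbitrary example values

def expected_y(t):
    # E[y_t] = m before the shift, m + b from period t1 onward
    return m + b if t >= t1 else m

def expected_ma(T):
    # Expected span-N moving average at time T: the mean of E[y_t]
    # over the last N periods T-N+1, ..., T
    return sum(expected_y(t) for t in range(T - N + 1, T + 1)) / N

# Compare against the three claimed cases for T around the shift
for T in range(t1 - 3, t1 + N + 3):
    if T <= t1 - 1:
        formula = m
    elif T <= t1 + N - 2:
        formula = m + b - (t1 + N - 1 - T) * b / N
    else:  # T >= t1 + N - 1: all N observations are at the new level
        formula = m + b
    assert abs(expected_ma(T) - formula) < 1e-12
```

Note that at T = t1 exactly one of the N averaged observations sits at the new level, giving m + b/N, and the count grows by one each period until all N observations are at m + b.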

I can prove the first and the last parts; I am just really confused by the second. Is that condition supposed to be t1 <= T <= t1 + N - 1? Otherwise, aren't we missing values if the case stops at N - 2? I am sure I am missing something, but I don't know what. Any suggestions would help.

Thanks