- Thread starter Adel Makram
Ok, here is an attached image of the trajectory matrix X, whose columns are lagged vectors of length L, the window length of the series. Now suppose the time series represented by this matrix has a period exactly equal to the spacing between two successive x values: the period equals the time between x1 and x2, which also equals the time between x2 and x3, and so on (sorry, in the original post I wrote that the period = L). In other words, the time series looks constant if we only sample it at intervals equal to the spacing between two successive x values. In that case the trajectory matrix surely degenerates into a rank-one matrix under the Singular Value Decomposition (SVD). What is the interpretation of that case? And in general, what value of L should be used to guarantee that the matrix does not reduce to rank one?
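To make the question concrete, here is a minimal NumPy sketch of the situation described above. It assumes the standard SSA-style Hankel trajectory matrix; the helper name `trajectory_matrix` and the example series are mine, not from the thread. When the period coincides with the sampling interval, every lagged column is identical, so the matrix has a single nonzero singular value; a series sampled several times per period does not collapse this way.

```python
import numpy as np

def trajectory_matrix(x, L):
    """Build the L x K Hankel trajectory matrix of a 1-D series x,
    where K = len(x) - L + 1 and column i is x[i:i+L]."""
    K = len(x) - L + 1
    return np.column_stack([x[i:i + L] for i in range(K)])

N, L = 20, 5

# Period equal to the sampling interval: every sample is the same,
# so all columns coincide and the matrix is rank one.
x_const = np.ones(N)
X1 = trajectory_matrix(x_const, L)
rank_const = np.linalg.matrix_rank(X1)   # one nonzero singular value

# A sinusoid sampled 7 times per period: a pure sinusoid contributes
# a pair of singular values, so the rank is 2, not 1.
t = np.arange(N)
x_sine = np.sin(2 * np.pi * t / 7)
X2 = trajectory_matrix(x_sine, L)
rank_sine = np.linalg.matrix_rank(X2)

print(rank_const, rank_sine)
```

This illustrates the usual rule of thumb: the rank reflects how many independent components the window can resolve, so L (and the sampling rate relative to the period) must be large enough that successive lagged vectors are not copies of each other.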
