kingwinner
Let Σ be the variance-covariance matrix of a random vector X, whose first component is X1 and second component is X2.
Then det(Σ) = 0
<=> the inverse of Σ does not exist
<=> there exists c = (c1, c2) ≠ 0 such that (c1)(X1) + (c2)(X2) = d a.s. (i.e. (c1)(X1) + (c2)(X2) is equal to some constant d almost surely)
=======================
I don't understand the last part. Why is it true? How can we prove it?
Any help is appreciated! :)
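For intuition about the statement being asked about, the degenerate case is easy to check numerically. This is only an illustrative sketch (assuming NumPy), not a proof: X2 is constructed as an exact linear function of X1, so the (sample) variance-covariance matrix is singular and the vector c = (2, -1) makes c1·X1 + c2·X2 constant almost surely:

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=100_000)
x2 = 2.0 * x1 + 3.0            # X2 is a.s. a linear function of X1

sigma = np.cov(np.vstack([x1, x2]))   # sample variance-covariance matrix
print(np.linalg.det(sigma))           # ~0 up to floating-point error: singular

# With c = (c1, c2) = (2, -1): c1*X1 + c2*X2 = 2*X1 - (2*X1 + 3) = -3,
# a constant, matching the "= d almost surely" part of the statement.
combo = 2.0 * x1 - 1.0 * x2
print(combo.min(), combo.max())       # both -3, i.e. the combination is constant
```

Conversely, if the two components were not linearly related (up to a constant), no such c would exist and det(Σ) would be nonzero.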