Let Σ be the variance-covariance matrix of a random vector X, whose first component is X1 and whose second component is X2.

Then det(Σ) = 0

<=> the inverse of Σ does not exist

<=> there exists c = (c1, c2) ≠ (0, 0) such that

(c1)(X1) + (c2)(X2) = d a.s.

(i.e. (c1)(X1) + (c2)(X2) is equal to some constant d almost surely)
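(The last equivalence turns on the variance of a linear combination; the following sketch, in my own notation, is not part of the original post:)

```latex
\operatorname{Var}(c_1 X_1 + c_2 X_2) = c^{\mathsf T}\,\Sigma\,c,
\qquad c = (c_1, c_2)^{\mathsf T}.
```

Since Σ is symmetric positive semi-definite, det(Σ) = 0 exactly when Σ has a zero eigenvalue, i.e. when there is some c ≠ 0 with Σc = 0. For that c, cᵀΣc = 0, so Var(c1 X1 + c2 X2) = 0, and a random variable with zero variance equals its mean d almost surely.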

=======================

I don't understand the last part. Why is it true? How can we prove it?

Any help is appreciated! :)
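As a numerical illustration of the degenerate case (not from the original post; the data here are made up, with X2 constructed as an affine function of X1 so that c = (3, −1) works):

```python
import numpy as np

rng = np.random.default_rng(0)

# X1 is arbitrary; X2 = 3*X1 + 5 is an affine function of X1,
# so 3*X1 - X2 = -5 almost surely (c = (3, -1), d = -5).
x1 = rng.normal(size=100_000)
x2 = 3.0 * x1 + 5.0

# 2x2 sample variance-covariance matrix of (X1, X2)
sigma = np.cov(np.vstack([x1, x2]))

print(np.linalg.det(sigma))   # ~0 up to rounding: sigma is singular
print(np.var(3.0 * x1 - x2))  # ~0: the linear combination is constant
```

Conversely, if no such affine relation exists (e.g. X2 has independent noise added), the determinant is strictly positive and the inverse exists.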

**Physics Forums | Science Articles, Homework Help, Discussion**

# Determinant of the variance-covariance matrix
