How Can I Derive a Covariance Matrix from Known Eigenvalues and an Eigenvector?

SUMMARY

The discussion focuses on deriving a covariance matrix from known eigenvalues and an eigenvector in the context of a 2D Gaussian distribution. The covariance matrix P can be expressed as P = VDV^T, where V is the matrix formed by the eigenvectors and D is a diagonal matrix containing the eigenvalues. The user identifies the need for a closed-form solution to construct the covariance matrix using the provided eigenvalues and eigenvector directions. The formulation emphasizes the properties of symmetric matrices and orthogonal eigenvectors.

PREREQUISITES
  • Understanding of 2D Gaussian distributions
  • Knowledge of eigenvalues and eigenvectors
  • Familiarity with matrix diagonalization
  • Concept of symmetric positive semi-definite matrices
NEXT STEPS
  • Study the process of matrix diagonalization in linear algebra
  • Learn about the properties of symmetric matrices and their eigenvectors
  • Explore the derivation of covariance matrices in multivariate statistics
  • Investigate applications of Mahalanobis distance in statistical analysis
USEFUL FOR

Statisticians, data scientists, and anyone involved in multivariate analysis or machine learning who needs to understand covariance matrix derivation from eigenvalues and eigenvectors.

Lindley
I have a 2D Gaussian distribution. I know the variances in the directions of the first and second eigenvectors (the directions of maximum and minimum radius of the corresponding ellipse at any fixed Mahalanobis distance), and the direction of the first eigenvector.

Is there a simple closed form equation to derive the corresponding covariance matrix? It seems like there should be, since if H0 is the first eigenvector and H1 is the second, then

H0*P*H0^T = var0
H1*P*H1^T = var1

However, these are only two equations for three unknowns (Pxx, Pxy, and Pyy). Any help?
 
I don't fully understand your formulation, but I read it as follows:
- the problem is a 2D Gaussian distribution

you want to find the 2x2 covariance matrix,
[tex]P = \begin{pmatrix} P_{xx} & P_{xy} \\ P_{yx} & P_{yy} \end{pmatrix}[/tex]

which is a symmetric (Pxy = Pyx) positive semi-definite matrix. You know its eigenvalues and one eigenvector.

In fact, since P is symmetric, its eigenvectors are orthogonal, so you know the directions of both eigenvectors (call them v1 and v2).

Why not use the eigenvectors to build the matrix that diagonalises P? Call it V; then
[tex]V = [v_1, v_2][/tex]

[tex]V^T P V = D[/tex]

where D is a diagonal matrix with the eigenvalues on the diagonal, then

[tex]P = V D V^T[/tex]
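A minimal numerical sketch of this construction, assuming the variances var0 and var1 and an angle theta for the first eigenvector are given (all values here are made-up examples):

```python
import numpy as np

# Assumed example inputs: variance along each eigenvector and the
# direction (angle) of the first eigenvector.
var0, var1 = 4.0, 1.0          # eigenvalues of P
theta = np.deg2rad(30.0)       # direction of the first eigenvector

v1 = np.array([np.cos(theta), np.sin(theta)])   # first eigenvector (unit length)
v2 = np.array([-np.sin(theta), np.cos(theta)])  # orthogonal second eigenvector

V = np.column_stack([v1, v2])  # columns are the eigenvectors
D = np.diag([var0, var1])      # eigenvalues on the diagonal

P = V @ D @ V.T                # the covariance matrix

# Sanity checks: P is symmetric and reproduces the given variances
# along the eigenvector directions.
assert np.allclose(P, P.T)
assert np.isclose(v1 @ P @ v1, var0)
assert np.isclose(v2 @ P @ v2, var1)
```

Since V is orthogonal here (unit-length, perpendicular columns), V^T = V^-1, which is what lets P = V D V^T serve as the closed form: the two given variances and the one given direction pin down all three unknowns Pxx, Pxy, Pyy.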
 
