Lindley
I have a 2D Gaussian distribution. I know the variances along the first and second eigenvectors (the directions of maximum and minimum radius of the corresponding ellipse at any fixed Mahalanobis distance), and I know the direction of the first eigenvector.
Is there a simple closed form equation to derive the corresponding covariance matrix? It seems like there should be, since if H0 is the first eigenvector and H1 is the second, then
H0*P*H0^T = var0
H1*P*H1^T = var1
However, these are only two equations and there are three unknowns (Pxx, Pxy, and Pyy). Any help?
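For concreteness, here is a small numpy sketch of the setup with made-up values (the angle and variances are placeholders, not part of the question). It builds a candidate P from the eigenvectors and variances via P = R * diag(var0, var1) * R^T, where R has H0 and H1 as columns, and checks that it satisfies the two constraint equations above:

```python
import numpy as np

# Assumed example values: first eigenvector at 30 degrees, variances 4 and 1.
theta = np.deg2rad(30.0)
H0 = np.array([np.cos(theta), np.sin(theta)])   # first eigenvector (unit length)
H1 = np.array([-np.sin(theta), np.cos(theta)])  # second eigenvector, perpendicular to H0
var0, var1 = 4.0, 1.0

# Candidate closed form: P = R @ diag(var0, var1) @ R.T with R = [H0 H1]
R = np.column_stack([H0, H1])
P = R @ np.diag([var0, var1]) @ R.T

# Check the two constraints from the question.
print(H0 @ P @ H0)  # should equal var0
print(H1 @ P @ H1)  # should equal var1
```

Note that because H1 is perpendicular to H0, requiring both to be eigenvectors implicitly adds a third constraint, H0*P*H1^T = 0, which is what pins down the three unknowns.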