Orthogonally diagonalizing the matrix

AI Thread Summary
The discussion centers around the process of orthogonally diagonalizing a given matrix A by finding an orthogonal matrix Q and a diagonal matrix D. The characteristic equation derived from the matrix leads to eigenvalues of 5, -1, and -1, with the corresponding eigenvector for the eigenvalue 5 being [1, 1, 1]. The user encounters difficulties in finding additional independent eigenvectors for the eigenvalue -1, leading to confusion about the independence of the resulting vectors. The conversation emphasizes the importance of understanding the span of eigenvectors and the distinction between different vector spaces, highlighting that orthogonal diagonalization simplifies matrix operations.
war485

Homework Statement



This is for linear algebra/matrix:

Orthogonally diagonalize this matrix A by finding an orthogonal matrix Q and a diagonal matrix D such that Q^T A Q = D

A =
[ 1 2 2 ]
[ 2 1 2 ]
[ 2 2 1 ]

Homework Equations



det(A - λI) = 0

The Attempt at a Solution



D =
[5 0 0 ]
[0 -1 0 ]
[0 0 -1 ]

characteristic equation: -λ³ + 3λ² + 9λ + 5 = 0

λ = 5, -1, -1 (I got these after factoring the characteristic equation)

when λ = 5, I got v1 = [1, 1, 1]

Then I'm almost done, but I got stuck trying to find v2 and v3 for λ = -1, because when I tried it the system turned out weird (it reduced to a zero matrix!):
[ 0 0 0 ]
[ 0 0 0 ]
[ 0 0 0 ]

So I think it means that x1, x2, and x3 are all free variables for v2 and v3, but if that's the case, then how can I make v1, v2, v3 into an orthogonal matrix if they're not independent? I almost have it, but I have no idea what to do now. Does this mean that it is not possible to orthogonally diagonalize it?
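(One way to sanity-check the eigenvalues and the matrix Q numerically is sketched below, assuming Python with NumPy is available. np.linalg.eigh is used because A is symmetric and it returns orthonormal eigenvectors.)

```python
import numpy as np

# The matrix from the problem statement.
A = np.array([[1, 2, 2],
              [2, 1, 2],
              [2, 2, 1]], dtype=float)

# eigh is meant for symmetric matrices and returns orthonormal eigenvectors,
# so the columns of Q already form an orthogonal matrix.
eigenvalues, Q = np.linalg.eigh(A)

print(eigenvalues)                 # expect [-1., -1., 5.] (ascending order)
print(np.round(Q.T @ A @ Q, 10))   # expect diag(-1, -1, 5)
```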
 
Ok, I'm pretty sure I got it but I still have a problem.
[edit] sorry for making it complicated earlier.
I'll dumb down my problem:

I need help seeing that this matrix
[ 1 1 1 ]
[ 1 1 1 ]
[ 1 1 1 ]

has these two eigenvectors:
[-1, 1, 0]

[-1, 0, 1]

how?
I keep getting [ 0 -1 -1 ] and [ -1 0 -1 ]
 
war485 said:
has these two eigenvectors:
[-1, 1, 0]

[-1, 0, 1]

how?
I keep getting [ 0 -1 -1 ] and [ -1 0 -1 ]
It is impossible for a matrix to have exactly two eigenvectors. Instead, it might have a two-dimensional space of eigenvectors...

(incidentally, it's very easy to check if a given vector is an eigenvector...)
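A minimal sketch of that check (assuming Python with NumPy; the helper name is_eigenvector is made up for illustration): multiply the candidate vector by A and compare with the scaled vector.

```python
import numpy as np

A = np.array([[1, 2, 2],
              [2, 1, 2],
              [2, 2, 1]], dtype=float)

def is_eigenvector(A, v, lam):
    """Return True if A v equals lam * v (up to floating-point tolerance)."""
    v = np.asarray(v, dtype=float)
    return np.allclose(A @ v, lam * v)

print(is_eigenvector(A, [1, 1, 1], 5))     # True
print(is_eigenvector(A, [-1, 1, 0], -1))   # True
print(is_eigenvector(A, [0, -1, -1], -1))  # False
```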
 
Maybe I used the wrong terminology.
I think I meant that one of its eigenspaces is the span of { [-1, 1, 0], [-1, 0, 1] },
but I can't see how.

But I can see that its other eigenspace is spanned by [1, 1, 1].
 
First, I claim that it's very easy to show that that span is a subspace of the -1 eigenspace, just by direct verification.


Secondly, I was trying to give you a hint by making you use more precise terminology. The problem is to find a particular vector space. Your answer key specified a basis for some vector space. Your work computed a basis for some vector space. You're focusing too much on the fact that your basis is different from the answer key's basis... but you haven't spent any effort checking whether the answer key's vector space is equal to yours or different from it...

If you're given spanning sets for two vector spaces, how do you check if they're equal or not?
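One concrete way to do that check (a sketch assuming Python with NumPy; the helper name same_span is invented here): two spanning sets describe the same subspace exactly when stacking them together does not raise the rank of either set alone.

```python
import numpy as np

def same_span(vectors_1, vectors_2):
    """True if both lists of vectors span the same subspace."""
    U = np.array(vectors_1, dtype=float)
    V = np.array(vectors_2, dtype=float)
    r1 = np.linalg.matrix_rank(U)
    r2 = np.linalg.matrix_rank(V)
    r_both = np.linalg.matrix_rank(np.vstack([U, V]))
    return r1 == r2 == r_both

key  = [[-1, 1, 0], [-1, 0, 1]]        # the answer key's spanning set
mine = [[0, -1, -1], [-1, 0, -1]]      # the vectors found earlier in the thread
print(same_span(key, mine))                      # False: different subspaces
print(same_span(key, [[1, -1, 0], [0, 1, -1]]))  # True: same plane x + y + z = 0
```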
 
Why orthogonally diagonalize a matrix?
 
matqkks said:
Why orthogonally diagonalize a matrix?
Because your teacher requires it on homework or a test?

But there are many good reasons to diagonalize a matrix: diagonal matrices are far easier to work with than other matrices. It becomes easy to take any power, find the exponential, or, generally, evaluate any function that has a Taylor series.

"Orthogonally" diagonalizing a matrix is not quite as important, but any real symmetric matrix (like the one in this thread) can be diagonalized using orthogonal matrices, and orthogonal matrices are relatively easy to handle (their inverse is just their transpose).
 
