A question about proving diagonalizability

transgalactic
I have trouble understanding the way this question is solved.

We need to find the values of alpha for which the matrix can be diagonalized.

I have posted the problem at this link:

http://img179.imageshack.us/my.php?image=img86031xw0.jpg
 
Suppose you are given some (any) matrix M. How would you normally show that it is diagonalizable (apart from explicitly diagonalizing it, of course)?
 
I would try to convert as many entries as possible to zeros
in the transformation matrix P (not the original matrix A),
and if I get a row of zeros
then it's not diagonalizable.
 
Anyone?
 
Your matrix is
\left(\begin{array}{ccc}\alpha & \alpha & 1 \\ 0 & 2 & 2 \\ 0 & 0 & 1 \end{array}\right)
and the problem is to find values of \alpha so that the matrix is diagonalizable.
You have properly set up the "eigenvalue equation" which, since this is an upper triangular matrix, is just the product on the main diagonal: (\lambda- \alpha)(\lambda- 2)(\lambda - 1)= 0. The eigenvalues are simply \alpha, 2, and 1.
You ask "what is the logic behind taking the values of \lambda the same as \alpha?" The answer is simply that \alpha is an eigenvalue!

It would be a lot easier to help you if you would make it clear what you do know! Here, you should know that a matrix is diagonalizable if and only if it has a "complete" set of eigenvectors- that is, a basis consisting entirely of eigenvectors. In that basis, the matrix representing the linear transformation is diagonal.

Here, the matrix will be diagonalizable if and only if it has three independent eigenvectors. Since eigenvectors corresponding to different eigenvalues must be independent, that will be true if the matrix has three different eigenvalues. That is, this matrix is diagonalizable if \alpha is any number other than 1 or 2.
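The eigenvalue computation above can be checked symbolically. This is a sketch (not from the thread) using SymPy to confirm that the characteristic polynomial of the upper triangular matrix factors on the main diagonal:

```python
# Sketch: verify the characteristic polynomial of the matrix from the thread.
import sympy as sp

a = sp.Symbol('alpha')
A = sp.Matrix([[a, a, 1],
               [0, 2, 2],
               [0, 0, 1]])

lam = sp.Symbol('lambda')
charpoly = A.charpoly(lam).as_expr()
# For an upper triangular matrix this factors as
# (lambda - alpha)(lambda - 2)(lambda - 1), up to ordering.
print(sp.factor(charpoly))
```

The eigenvalues alpha, 2, and 1 can be read off directly, as stated above.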

But even if \alpha is either 1 or 2, the matrix still might be diagonalizable.

Suppose \alpha= 2 so that 2 is a double eigenvalue. What are the eigenvectors corresponding to \lambda= 2? Are there two independent eigenvectors? If so, then the matrix is diagonalizable even for \alpha= 2.

Suppose \alpha= 1 so that 1 is a double eigenvalue. What are the eigenvectors corresponding to \lambda= 1? Are there two independent eigenvectors? If so, then the matrix is diagonalizable even for \alpha= 1.
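The two checks suggested above can be sketched with SymPy: the geometric multiplicity of an eigenvalue \lambda is the dimension of the null space of A - \lambda I. This is an illustrative sketch, not part of the thread:

```python
# Sketch: geometric multiplicity of the repeated eigenvalue
# for alpha = 2 and alpha = 1.
import sympy as sp

def eigvecs_for(alpha, lam):
    A = sp.Matrix([[alpha, alpha, 1],
                   [0, 2, 2],
                   [0, 0, 1]])
    # Null space of (A - lam*I): its dimension is the number of
    # independent eigenvectors for the eigenvalue lam.
    return (A - lam * sp.eye(3)).nullspace()

print(len(eigvecs_for(2, 2)))  # 1 -> only one independent eigenvector for lambda = 2
print(len(eigvecs_for(1, 1)))  # 1 -> only one independent eigenvector for lambda = 1
```

In both cases the double eigenvalue has only one independent eigenvector, which matches what the original poster reports later in the thread.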
 
 
You said that in order for the matrix to be diagonalizable,
alpha needs to differ from the values 1 and 2,
and if we have 3 different eigenvalues then we have three independent vectors, and
that's it.

But I don't like this kind of explanation.
I prefer to build a transformation matrix P and show that it has 3 independent vectors.
How do I build it in this question?

You also said,
"But even if alpha is either 1 or 2 the matrix still might be diagonalizable."

The problem is that I showed that for these values we have only one eigenvector
instead of the desired two. Generally I solve this kind of question by building the transformation matrix P
and showing that its vectors are independent.

How do I do that here? And even if I accept your way:
in one part you say that alpha must differ from 1 and 2,
but separately you say it might still be diagonalizable for those values (although I showed that's not possible).

What's the right way?
 
OK, so suppose you have a matrix A which you want to diagonalize. You want to build the transformation matrix P such that A = P D P^{-1} (with the eigenvectors as columns of P, the inverse goes on the right). Here D is a diagonal matrix, and the values on the diagonal are the eigenvalues of A. The transformation matrix P contains as its columns exactly the eigenvectors of A. So if A has eigenvalues \lambda_1, \lambda_2, \cdots, \lambda_n with eigenvectors \vec v_1, \vec v_2, \cdots, \vec v_n (both not necessarily distinct) then
P = \begin{pmatrix} (v_1)_1 & (v_2)_1 & \cdots & (v_n)_1 \\ (v_1)_2 & (v_2)_2 & \cdots & (v_n)_2 \\ \vdots & \vdots & \ddots & \vdots \\ (v_1)_n & (v_2)_n & \cdots & (v_n)_n \end{pmatrix}; \qquad D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}
Agree? And A is diagonalizable if you can write it this way, that is: if P is invertible. This is what you already said when you said "if I get a row of zeros then it's not diagonalizable": if you get a row of zeros, then the rank of the matrix is smaller than n (a row of zeros means the row space has dimension < n). Equivalently, the column space has dimension < n: two columns are linearly dependent. This means exactly that two (or more) of the eigenvectors v_1, \cdots, v_n are linearly dependent. Now of course, they cannot be linearly dependent and still be eigenvectors for different eigenvalues, so this can only happen if there is an eigenvalue which occurs more than once. So if all eigenvalues are distinct, A will certainly be diagonalizable. If an eigenvalue does occur (say) twice, you must calculate its eigenvectors and see if they are independent.

I hope you see that the two approaches are equivalent. What you want to do when you say you explicitly want to write down the transformation matrix is calculate all eigenvectors and put them as columns in this matrix. What HallsofIvy told you is sort of a shortcut: if all eigenvalues are distinct, then you already know that you are not going to get a row of zeros in P when you calculate the eigenvectors and put them as columns in P. But if two eigenvalues are the same (which happens when \alpha = 1 or \alpha = 2) then it can happen that you get two linearly dependent eigenvectors, aka a non-maximal-rank transformation matrix, aka a non-invertible transformation matrix, aka a transformation matrix which will give a row of zeros when row-reduced; so in these cases you should explicitly calculate those eigenvectors. Of course, if you also calculate the eigenvectors for the other eigenvalues, you have all of them and can still explicitly write down the transformation matrix, although it is not necessary.
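The construction described above can be sketched with SymPy. Here alpha = 3 is an arbitrary choice with three distinct eigenvalues (any value other than 1 or 2 would do); `diagonalize` returns the eigenvector matrix P and diagonal D with A = P D P^{-1}:

```python
# Sketch: build P from eigenvectors and verify the factorization,
# for the arbitrary choice alpha = 3 (three distinct eigenvalues).
import sympy as sp

A = sp.Matrix([[3, 3, 1],
               [0, 2, 2],
               [0, 0, 1]])

P, D = A.diagonalize()       # columns of P are eigenvectors of A; D is diagonal
assert P * D * P.inv() == A  # the factorization reproduces A exactly
print(D)                     # diagonal entries are the eigenvalues 1, 2, 3 (in some order)
```

For alpha = 1 or alpha = 2 this call would fail, because the repeated eigenvalue contributes only one independent eigenvector and P cannot be made invertible.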
 
So if I understood you correctly,
then in the case of 3 different eigenvalues we know for a fact that it's diagonalizable without
building the transformation matrix.

But the matrix could also be diagonalizable if we have two equal eigenvalues;
in those cases we need to try to build the transformation matrix
and check whether its vectors are independent (if they are dependent, the matrix is not diagonalizable).

Is that correct?
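The summary above can be confirmed with a short SymPy sketch (not from the thread): distinct eigenvalues guarantee diagonalizability, while for alpha = 1 or alpha = 2 the repeated eigenvalue has too few eigenvectors.

```python
# Sketch: test diagonalizability of the thread's matrix for several alphas.
import sympy as sp

def M(alpha):
    return sp.Matrix([[alpha, alpha, 1],
                      [0, 2, 2],
                      [0, 0, 1]])

for alpha in [0, 1, 2, 5]:
    print(alpha, M(alpha).is_diagonalizable())
# alpha = 1 and alpha = 2 give False; the other values give True.
```

The values 0 and 5 are arbitrary representatives of "alpha other than 1 or 2".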
 
transgalactic said:
you said that in order for the matrix to be diagonalizable
alpha needs to differ from the values 1 and 2
and if we have 3 different eigenvalues then we have three independent vectors and
that's it.
No, I did NOT say that. I said that if \alpha is neither 1 nor 2 then the matrix is diagonalizable. I did not say that \alpha must not equal 1 or 2.

but I don't like this kind of explanation
I prefer to build a transformation matrix P and to show that it has 3 independent vectors
how do I build it in this question?
You "prefer" to do that even if you don't know how to do it? Or even whether it can be done? The transformation TO what?

also you said,
"But even if alpha is either 1 or 2 the matrix still might be diagonalizable."

the problem is that I showed that for these values we have only one eigenvector
instead of the desired two.
Excellent! Then you have shown that the matrix is diagonalizable for any \alpha other than 1 or 2 and not diagonalizable if \alpha is equal to either 1 or 2!


generally I solve this kind of question by building the transformation matrix P
and showing that its vectors are independent

how do I do it here?
What reason do you have to think that you can do it here? "If the only tool you have is a hammer, every problem looks like a nail." You need more tools. And you develop more tools by understanding what is happening, not just applying formulas.


and even if I accept your way:
in one part you say that alpha must differ from 1 and 2,
but separately you say it might still be diagonalizable (although I showed that's not possible)

what's the right way?
Again, I did not say anything like that! I said that IF \alpha is neither 1 nor 2 then the matrix is diagonalizable. If \alpha is either 1 or 2 then the matrix might still be diagonalizable. You have shown that, for this particular matrix, it is not.
 
thanks : )
 