# A question on proving diagonalization

1. Feb 29, 2008

### transgalactic

I have trouble understanding the way this question is solved.

We need to find the values of alpha for which the matrix can be diagonalized.

I have written the problem at the link below:

http://img179.imageshack.us/my.php?image=img86031xw0.jpg

2. Feb 29, 2008

### CompuChip

Suppose you are given some (any) matrix M. How would you normally show that it is diagonalizable (apart from explicitly diagonalizing it, of course)?

3. Feb 29, 2008

### transgalactic

I would try to convert as many entries as possible to zeros
in the transformation matrix P (not the original matrix A),
and if I get a row of zeros
then it's not diagonalizable.

4. Feb 29, 2008

Anyone?

5. Mar 1, 2008

### HallsofIvy

Staff Emeritus
$$\left(\begin{array}{ccc}\alpha & \alpha & 1 \\ 0 & 2 & 2 \\ 0 & 0 & 1 \end{array}\right)$$
and the problem is to find values of $\alpha$ so that the matrix is diagonalizable.
You have properly set up the "eigenvalue equation", which, since this is an upper triangular matrix, is just the product on the main diagonal: $(\lambda- \alpha)(\lambda- 2)(\lambda - 1)= 0$. The eigenvalues are simply $\alpha$, 2, and 1.
You ask "what is the logic behind taking the values of $\lambda$ the same as $\alpha$?" The answer is simply that $\alpha$ is an eigenvalue!
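As a quick numerical check (a numpy sketch, not from the thread, with $\alpha = 3$ chosen arbitrarily), the eigenvalues of this upper triangular matrix are exactly its diagonal entries:

```python
import numpy as np

alpha = 3.0  # arbitrary sample value, just for illustration
A = np.array([[alpha, alpha, 1.0],
              [0.0,   2.0,   2.0],
              [0.0,   0.0,   1.0]])

# For a triangular matrix the characteristic polynomial factors along
# the diagonal, so the eigenvalues are the diagonal entries.
eigenvalues = np.sort(np.linalg.eigvals(A).real)
print(np.allclose(eigenvalues, [1.0, 2.0, alpha]))  # True
```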

It would be a lot easier to help you if you would make it clear what you do know! Here, you should know that a matrix is diagonalizable if and only if it has a "complete" set of eigenvectors; that is, a basis consisting entirely of eigenvectors. In that basis, the matrix representing the linear transformation is diagonal.

Here, the matrix will be diagonalizable if and only if it has three independent eigenvectors. Since eigenvectors corresponding to different eigenvalues must be independent, that will be true if the matrix has three different eigenvalues. That is, this matrix is diagonalizable if $\alpha$ is any number other than 1 or 2.
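This criterion can be checked numerically (a numpy sketch, not part of the original thread; the helper names are mine): sum the geometric multiplicities $3 - \operatorname{rank}(A - \lambda I)$ over the distinct eigenvalues and see whether you get 3 independent eigenvectors:

```python
import numpy as np

def make_matrix(alpha):
    # the matrix from the thread, for a given alpha
    return np.array([[alpha, alpha, 1.0],
                     [0.0,   2.0,   2.0],
                     [0.0,   0.0,   1.0]])

def num_independent_eigenvectors(A, tol=1e-8):
    """Sum of geometric multiplicities: for each distinct eigenvalue lam,
    the eigenspace has dimension n - rank(A - lam*I)."""
    n = A.shape[0]
    distinct = []
    for lam in np.linalg.eigvals(A):
        if all(abs(lam - mu) > tol for mu in distinct):
            distinct.append(lam)
    return sum(n - np.linalg.matrix_rank(A - lam * np.eye(n))
               for lam in distinct)

print(num_independent_eigenvectors(make_matrix(5.0)))  # 3: diagonalizable
print(num_independent_eigenvectors(make_matrix(2.0)))  # 2: defective
```

With three distinct eigenvalues the count is always 3; with a repeated eigenvalue it can drop below 3, which is exactly the case to examine.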

But even if $\alpha$ is either 1 or 2, the matrix still might be diagonalizable.

Suppose $\alpha= 2$ so that 2 is a double eigenvalue. What are the eigenvectors corresponding to $\lambda= 2$? Are there two independent eigenvalues? If so, then the matrix is diagonalizable even for $\alpha= 2$.

Suppose $\alpha= 1$ so that 1 is a double eigenvalue. What are the eigenvectors corresponding to $\lambda= 1$? Are there two independent eigenvalues? If so, then the matrix is diagonalizable even for $\alpha= 1$.
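The two checks above can be carried out numerically (a numpy sketch, not from the thread): for a repeated eigenvalue $\lambda$, the number of independent eigenvectors is $3 - \operatorname{rank}(A - \lambda I)$:

```python
import numpy as np

I3 = np.eye(3)

# Case alpha = 2: the eigenvalue 2 is repeated.
A2 = np.array([[2.0, 2.0, 1.0],
               [0.0, 2.0, 2.0],
               [0.0, 0.0, 1.0]])
print(3 - np.linalg.matrix_rank(A2 - 2.0 * I3))  # 1 eigenvector for lambda = 2

# Case alpha = 1: the eigenvalue 1 is repeated.
A1 = np.array([[1.0, 1.0, 1.0],
               [0.0, 2.0, 2.0],
               [0.0, 0.0, 1.0]])
print(3 - np.linalg.matrix_rank(A1 - 1.0 * I3))  # 1 eigenvector for lambda = 1
```

In both cases the repeated eigenvalue contributes only one independent eigenvector, so neither value of $\alpha$ gives a diagonalizable matrix here, which matches what transgalactic found.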

6. Mar 1, 2008

### CompuChip

Small typo: In both occurrences, read "eigenvalues" as "eigenvectors" there.

7. Mar 1, 2008

### transgalactic

You said that in order for the matrix to be diagonalizable,
alpha needs to differ from 1 and 2,
and if we have 3 different eigenvalues then we have three independent vectors, and
that's it.

But I don't like this kind of explanation.
I prefer to build a transformation matrix P and to show that it has 3 independent vectors.
How do I build it in this question?

You also said:
"But even if alfa is either 1 or 2 the matrix still might be diagonalizable."

The problem is I showed that for these values we have only one vector.

Generally I solve this kind of question by building the transformation matrix P
and showing that its vectors are independent.

How do I do it here?

And even if I accept your way:
in one part you say that alpha must differ from 1 and 2,
but separately you say it can be diagonalized (although I showed it's not possible).

What's the right way?

Last edited: Mar 1, 2008
8. Mar 1, 2008

### CompuChip

OK, so suppose you have a matrix A which you want to diagonalize. You want to build the transformation matrix P, s.t. $A = P D P^{-1}$ (with the eigenvectors as the columns of P, the inverse goes on the right). Here D is a diagonal matrix, and the values on the diagonal are the eigenvalues of A. The transformation matrix P contains as its columns exactly the eigenvectors of A. So if A has eigenvalues $\lambda_1, \lambda_2, \cdots, \lambda_n$ (not necessarily distinct) with eigenvectors $\vec v_1, \vec v_2, \cdots, \vec v_n$ then
$$P = \begin{pmatrix} (v_1)_1 & (v_2)_1 & \cdots & (v_n)_1 \\ (v_1)_2 & (v_2)_2 & \cdots & (v_n)_2 \\ \vdots & \vdots & \ddots & \vdots \\ (v_1)_n & (v_2)_n & \cdots & (v_n)_n \end{pmatrix}; \qquad D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}$$
Agree? And A is diagonalizable if you can write it this way, that is: if P is invertible. This is what you already said with "if I get a line of zeros then it's not diagonalizable": if you get a line of zeros, then the rank of the matrix is smaller than n (a line of zeros means the row space has dim < n). Equivalently, the column space has dimension < n: some columns are linearly dependent. This means exactly that two (or more) of the eigenvectors $v_1, \cdots, v_n$ are linearly dependent. Now of course, they cannot be linearly dependent and still be eigenvectors for different eigenvalues, so this can only happen if there is an eigenvalue which occurs more than once. So if all eigenvalues are distinct, A will certainly be diagonalizable. If an eigenvalue does occur (say) twice, you must calculate the eigenvectors and see whether they are independent.

I hope you see that the two problems are equivalent. What you want to do when you say you explicitly want to write down the transformation matrix is calculate all eigenvectors and put them as columns in this matrix. What HallsofIvy told you is sort of a shortcut: if all eigenvalues are distinct, then you already know that you are not going to get a row of zeros in P when you calculate them and put them in as columns. But if two eigenvalues are the same (which happens when $\alpha = 1, 2$) then it can happen that you get two linearly dependent eigenvectors, aka a non-maximal-rank transformation matrix, aka a non-invertible transformation matrix, aka a transformation matrix which will give a row of zeros when row-reduced; so in these cases you should explicitly calculate those eigenvectors. Of course, if you also calculate those for the other eigenvalues you have all the eigenvectors and you can still explicitly write down the transformation matrix, although it is not necessary.

9. Mar 1, 2008

### transgalactic

So if I understood you correctly,
then in the case of 3 different eigenvalues we know for a fact that it's diagonalizable without
building the transformation matrix.

But it could also be diagonalizable if we have two equal eigenvalues;
in these cases we need to try to build the transformation matrix
and show whether its vectors are dependent, i.e. the matrix is not diagonalizable (in those cases).

Is that correct?

10. Mar 1, 2008

### HallsofIvy

Staff Emeritus
No, I did NOT say that. I said that if $\alpha$ is neither 1 nor 2 then the matrix is diagonalizable. I did not say that $\alpha$ must be different from 1 or 2.

You "prefer" to do that even if you don't know how to do it? Or even if it can be done? The transformation TO what?

Excellent! Then you have shown that the matrix is diagonalizable for any $\alpha$ other than 1 or 2 and not diagonalizable if $\alpha$ is equal to either 1 or 2!

What reason do you have to think that you can do it here? "If the only tool you have is a hammer, every problem looks like a nail." You need more tools. And you develop more tools by understanding what is happening, not just applying formulas.

Again, I did not say anything like that! I said that IF $\alpha$ is neither 1 nor 2 then the matrix is diagonalizable. If $\alpha$ is either 1 or 2, then the matrix might be diagonalizable. You have shown that, for this particular matrix, it is not.

11. Mar 1, 2008

thanks : )