# A question on proving diagonalizability

In summary, the conversation discusses how to find the values of $\alpha$ for which a given matrix is diagonalizable. The main methods mentioned are setting up an eigenvalue equation and finding independent eigenvectors, and building a transformation matrix P from those eigenvectors. It is noted that if all eigenvalues are distinct, the matrix is diagonalizable, but if two eigenvalues coincide, further calculation is needed to determine whether the eigenvectors are independent.
transgalactic
I have trouble understanding the way this question is solved.

We need to find the values of $\alpha$ for which the matrix can be diagonalized.

I have written up the problem at the link below:

http://img179.imageshack.us/my.php?image=img86031xw0.jpg

Suppose you are given some (any) matrix M. How would you normally show that it is diagonalizable (apart from explicitly diagonalizing it, of course)?

I would try to convert as many entries as possible to zeros
in the transformation matrix P (not the original matrix A),
and if I get a row of zeros,
then it's not diagonalizable.

Anyone?

$$\left(\begin{array}{ccc}\alpha & \alpha & 1 \\ 0 & 2 & 2 \\ 0 & 0 & 1 \end{array}\right)$$
and the problem is to find values of $\alpha$ so that the matrix is diagonalizable.
You have properly set up the "eigenvalue equation," which, since this is an upper triangular matrix, is just the product of the terms on the main diagonal: $(\lambda- \alpha)(\lambda- 2)(\lambda - 1)= 0$. The eigenvalues are simply $\alpha$, 2, and 1.
You ask "what is the logic behind taking the values of $\lambda$ the same as $\alpha$?" The answer is simply that $\alpha$ is an eigenvalue!

It would be a lot easier to help you if you would make it clear what you do know! Here, you should know that a matrix is diagonalizable if and only if it has a "complete" set of eigenvectors, that is, a basis consisting entirely of eigenvectors. In that basis, the matrix representing the linear transformation is diagonal.

Here, the matrix will be diagonalizable if and only if it has three independent eigenvectors. Since eigenvectors corresponding to different eigenvalues must be independent, that will be true if the matrix has three different eigenvalues. That is, this matrix is diagonalizable if $\alpha$ is any number other than 1 or 2.

But even if $\alpha$ is either 1 or 2 the matrix still might be diagonalizable.

Suppose $\alpha= 2$ so that 2 is a double eigenvalue. What are the eigenvectors corresponding to $\lambda= 2$? Are there two independent eigenvectors? If so, then the matrix is diagonalizable even for $\alpha= 2$.

Suppose $\alpha= 1$ so that 1 is a double eigenvalue. What are the eigenvectors corresponding to $\lambda= 1$? Are there two independent eigenvectors? If so, then the matrix is diagonalizable even for $\alpha= 1$.
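These two cases can be checked numerically. The sketch below (my own check, not part of the original thread) uses numpy's `matrix_rank` to compute the geometric multiplicity $n - \operatorname{rank}(A - \lambda I)$ of the doubled eigenvalue for $\alpha = 2$ and $\alpha = 1$:

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """Dimension of the eigenspace for eigenvalue lam: n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# The matrix from the question, with alpha set to the doubled eigenvalue.
for alpha, lam in [(2.0, 2.0), (1.0, 1.0)]:
    A = np.array([[alpha, alpha, 1.0],
                  [0.0,   2.0,   2.0],
                  [0.0,   0.0,   1.0]])
    # Algebraic multiplicity of lam is 2 here; diagonalizability needs
    # geometric multiplicity 2 as well.
    print(f"alpha = {alpha}: geometric multiplicity of {lam} is "
          f"{geometric_multiplicity(A, lam)}")
```

In both cases the geometric multiplicity comes out as 1, smaller than the algebraic multiplicity 2, so neither $\alpha = 1$ nor $\alpha = 2$ gives a diagonalizable matrix, consistent with what is found later in the thread.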


You said that in order for the matrix to be diagonalizable,
$\alpha$ needs to differ from the values 1 and 2,
and if we have 3 different eigenvalues then we have three independent eigenvectors, and
that's it.

But I don't like this kind of explanation.
I prefer to build a transformation matrix P and show that it has 3 independent columns.
How do I build it in this question?

You also said:
"But even if $\alpha$ is either 1 or 2 the matrix still might be diagonalizable."

The problem is, I showed that for these values we have only one eigenvector
instead of the desired two. Generally I solve this kind of question by building the transformation matrix P
and showing that its columns are independent.

How do I do that here? And even if I accept your way,
in one part you say that $\alpha$ differs from 1 and 2,
but separately that it can be diagonalized (although I showed that's not possible).

What's the right way?

OK, so suppose you have a matrix A which you want to diagonalize. You want to build the transformation matrix P such that $A = P D P^{-1}$ (equivalently, $D = P^{-1} A P$). Here D is a diagonal matrix, and the values on the diagonal are the eigenvalues of A. The transformation matrix P contains as its columns exactly the eigenvectors of A. So if A has eigenvalues $\lambda_1, \lambda_2, \cdots, \lambda_n$ with eigenvectors $\vec v_1, \vec v_2, \cdots, \vec v_n$ (both not necessarily distinct) then
$$P = \begin{pmatrix} (v_1)_1 & (v_2)_1 & \cdots & (v_n)_1 \\ (v_1)_2 & (v_2)_2 & \cdots & (v_n)_2 \\ \vdots & \vdots & \ddots & \vdots \\ (v_1)_n & (v_2)_n & \cdots & (v_n)_n \end{pmatrix}; \qquad D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}$$
agree? And A is diagonalizable if you can write it this way, that is: if P is invertible. This is what you already said when you said "and if I get a row of zeros, then it's not diagonalizable": if you get a row of zeros, then the rank of the matrix is smaller than n (a row of zeros means the row space has dimension < n). Equivalently, the column space has dimension < n: two columns are linearly dependent. This means exactly that two (or more) of the eigenvectors $v_1, \cdots, v_n$ are linearly dependent. Now of course, they cannot be linearly dependent and still be eigenvectors for different eigenvalues, so this can only happen if there are eigenvalues which occur more than once. So if all eigenvalues are distinct, A will certainly be diagonalizable. If an eigenvalue does occur (say) twice, you must calculate the eigenvectors and see if they are independent.

I hope you see that the two problems are equivalent. What you want to do when you say you explicitly want to write down the transformation matrix is calculate all eigenvectors and put them as columns in this matrix. What HallsOfIvy told you is sort of a shortcut: if all eigenvalues are distinct, then you already know that you are not going to get a row of zeros in P if you calculate them and put them as columns in P. But if two eigenvalues are the same (which happens when $\alpha = 1, 2$) then it can happen that you get two linearly dependent eigenvectors, aka a non-maximal-rank transformation matrix, aka a non-invertible transformation matrix, aka a transformation matrix which will give a row of zeros when row-reduced; so in these cases you should explicitly calculate those eigenvectors. Of course, if you also calculate the eigenvectors for the other eigenvalues, you have all the eigenvectors and you can still explicitly write down the transformation matrix, although it is not necessary.
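As a concrete illustration of building P this way, here is a small numpy sketch (my own, not from the thread; the choice $\alpha = 0$ is arbitrary and just gives three distinct eigenvalues 0, 2, 1):

```python
import numpy as np

# The matrix from the question with alpha = 0 (three distinct eigenvalues).
A = np.array([[0.0, 0.0, 1.0],
              [0.0, 2.0, 2.0],
              [0.0, 0.0, 1.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are the eigenvectors
D = np.diag(eigvals)

# Distinct eigenvalues => independent eigenvectors => P is invertible.
assert np.linalg.matrix_rank(P) == 3

# Verify the diagonalization A = P D P^{-1}.
assert np.allclose(P @ D @ np.linalg.inv(P), A)
print("eigenvalues:", np.round(eigvals, 10))
```

The two assertions are exactly the two steps discussed above: P having full rank (no row of zeros after row reduction) is what makes the factorization $A = P D P^{-1}$ possible.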

So if I understood you correctly,
then in the case of 3 different eigenvalues we know for a fact that it's diagonalizable, without
building the transformation matrix.

But it could also be diagonalizable if we have two equal eigenvalues;
in those cases we need to try to build the transformation matrix
and check whether its columns are dependent (in which case the matrix is not diagonalizable).

Is that correct??

transgalactic said:
you said that in order for the matrix to be diagonalizable
$\alpha$ needs to differ from the values 1 and 2
and if we have 3 different eigenvalues then we have three independent eigenvectors and
that's it.
No, I did NOT say that. I said that if $\alpha$ is neither 1 nor 2 then the matrix is diagonalizable. I did not say that $\alpha$ must be unequal to 1 or 2.

but I don't like this kind of explanation
I prefer to build a transformation matrix P and show that it has 3 independent columns
how do I build it in this question?
You "prefer" to do that even if you don't know how to do it? Or even whether it can be done? The transformation TO what?

also you said
"But even if $\alpha$ is either 1 or 2 the matrix still might be diagonalizable."

the problem is I showed that for these values we have only one eigenvector
Excellent! Then you have shown that the matrix is diagonalizable for any $\alpha$ other than 1 or 2 and not diagonalizable if $\alpha$ is equal to either 1 or 2!

generally I solve this kind of question by building the transformation matrix P
and showing that its columns are independent

how do I do it here??
What reason do you have to think that you can do it here? "If the only tool you have is a hammer, every problem looks like a nail." You need more tools. And you develop more tools by understanding what is happening, not just applying formulas.

and even if I accept your way,
in one part you say that $\alpha$ differs from 1 and 2,
but separately that it can be diagonalized (although I showed it's not possible)

what's the right way??
Again, I did not say anything like that! I said that IF $\alpha$ is neither 1 nor 2 then the matrix is diagonalizable. If $\alpha$ is either 1 or 2 then the matrix might still be diagonalizable. You have shown that, for this particular matrix, it is not.

thanks : )

## 1. How do I prove that a matrix is diagonalizable?

To prove that a matrix is diagonalizable, you need to show that it has n linearly independent eigenvectors, where n is the dimension of the matrix. This can be done by finding the eigenvalues of the matrix and then finding the corresponding eigenvectors.
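The procedure above can be turned into a numerical check. This sketch (my own illustration; `is_diagonalizable` is a hypothetical helper, not a standard numpy function) counts the independent eigenvectors via the rank of the eigenvector matrix:

```python
import numpy as np

def is_diagonalizable(A, tol=1e-9):
    """A is diagonalizable over the complex numbers iff the n eigenvectors
    returned by np.linalg.eig (the columns of P) span the whole space."""
    n = A.shape[0]
    _, P = np.linalg.eig(A)
    return np.linalg.matrix_rank(P, tol=tol) == n

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 3.0]])))  # distinct eigenvalues
print(is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]])))  # repeated eigenvalue, defective
```

Note that this is a floating-point rank test, so the tolerance matters for matrices that are close to defective; for an exact answer one would work symbolically.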

## 2. What is the significance of diagonalization in linear algebra?

Diagonalization is important because it simplifies matrix operations and makes them easier to understand. It also allows for the efficient computation of powers of matrices and the solving of systems of linear differential equations.
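For instance, powers of a diagonalizable matrix reduce to elementwise powers of its eigenvalues, since $A^k = P D^k P^{-1}$. A short numpy sketch (the matrix here is an arbitrary example of my own, not from the question above):

```python
import numpy as np

# A symmetric 2x2 example with eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w, P = np.linalg.eig(A)
k = 5

# A^k = P D^k P^{-1}; raising the diagonal matrix D to a power
# is just raising each eigenvalue to that power.
A_k = P @ np.diag(w**k) @ np.linalg.inv(P)

# Agrees with computing the power directly.
assert np.allclose(A_k, np.linalg.matrix_power(A, k))
print(A_k.round(6))
```

For large k this is much cheaper than repeated matrix multiplication, which is one reason diagonalization shows up in the study of long-term behavior of linear systems.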

## 3. Can any matrix be diagonalized?

No, not every matrix can be diagonalized. A matrix can only be diagonalized if it has n linearly independent eigenvectors, where n is the dimension of the matrix. If there are not enough eigenvectors, the matrix cannot be diagonalized.
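The standard counterexample is a Jordan block: it has a repeated eigenvalue but only one independent eigenvector. A quick numpy check of this (my own illustration):

```python
import numpy as np

# 2x2 Jordan block: eigenvalue 1 with algebraic multiplicity 2.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The eigenspace ker(J - I) is only 1-dimensional:
# rank(J - I) = 1, so there is just 2 - 1 = 1 independent eigenvector.
gm = 2 - np.linalg.matrix_rank(J - np.eye(2))
print(gm)  # fewer eigenvectors than the dimension, so J is not diagonalizable
```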

## 4. How do I find the diagonal matrix after diagonalizing a matrix?

To find the diagonal matrix D after diagonalizing a matrix A, you need to find the eigenvalues and eigenvectors of A. The diagonal matrix D has the eigenvalues as its diagonal entries, while the eigenvectors form the columns of the transformation matrix P, so that $A = PDP^{-1}$.

## 5. What are the applications of diagonalization in real life?

Diagonalization has many applications in fields such as engineering, physics, and computer science. It is used in image processing, signal processing, and data compression. It is also used in solving systems of differential equations and in calculating the long-term behavior of complex systems.
