# Diagonalization and similar matrices

1. Apr 8, 2012

### stripes

So when dealing with a linear transformation, after we have computed its matrix and are asked "is this matrix diagonalizable?", I begin by finding the eigenvalues and eigenvectors using the characteristic equation.

Once I have found the eigenvectors, if there are n linearly independent ones (for an n×n matrix), then a theorem tells us the matrix of the linear transformation is, in fact, diagonalizable. Another theorem tells us that this diagonal matrix, D, has entries along its main diagonal that are the eigenvalues of the matrix of the linear transformation.

My questions are as follows: what about the order of these eigenvalues? When I solve the characteristic equation, I can order the roots however I want. This means that the matrix of the transformation is diagonalizable to up to n! different diagonal matrices (exactly n! distinct ones when the n eigenvalues are all distinct; over the complex numbers, a degree-n polynomial always has n roots, counted with multiplicity). My other question is: when choosing eigenvectors that correspond to eigenvalues, can we just choose any vector in the eigenspace, since oftentimes our eigenvector contains arbitrary constants that can be any real number? I will illustrate my question with an example:

For the matrix A =
$\left[ \begin{array}{cc} 1 & 1 \\ -2 & 4 \end{array} \right]$

Solving the characteristic equation $\lambda^2 - 5\lambda + 6 = 0$, we have

$\lambda = 2, \lambda = 3$

whose associated eigenvectors are

$\left[ \begin{array}{c} 1 \\ 1 \end{array} \right]$

and

$\left[ \begin{array}{c} 1 \\ 2 \end{array} \right]$. These eigenvectors were chosen by fixing the arbitrary scalar in each eigenspace to a particular value; can we choose any other eigenvectors?

These two vectors are linearly independent, thus the matrix A can be diagonalized. We see that matrix A is similar to matrix D, where D =

$\left[ \begin{array}{cc} 2 & 0 \\ 0 & 3 \end{array} \right]$.
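As a quick sanity check (a sketch using NumPy, with the matrix and eigenpairs from the example above), the eigenvectors can be verified directly from the definition $Av = \lambda v$:

```python
import numpy as np

A = np.array([[1.0, 1.0], [-2.0, 4.0]])

v1 = np.array([1.0, 1.0])  # claimed eigenvector for lambda = 2
v2 = np.array([1.0, 2.0])  # claimed eigenvector for lambda = 3

# A v should equal lambda v for each eigenpair
assert np.allclose(A @ v1, 2 * v1)
assert np.allclose(A @ v2, 3 * v2)
```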

Could we not have stated the roots in a different order, and then eventually conclude that A is similar to D, where D =

$\left[ \begin{array}{cc} 3 & 0 \\ 0 & 2 \end{array} \right]$?

I'm confused as to what order we are to choose, if it even matters (which it seems it does). Sorry if this is a "homework type question", but I think my question is more general and not a specific textbook-style question. Sorry if my question is confusing. Any help is appreciated.
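Both orderings from the question can be checked numerically (a sketch using NumPy; the matrices U1, U2, U3 below are just the eigenvectors above stacked as columns, in the two orders and with one rescaling):

```python
import numpy as np

A = np.array([[1.0, 1.0], [-2.0, 4.0]])

# Ordering 1: eigenvalues (2, 3), with matching eigenvector columns
U1 = np.array([[1.0, 1.0],
               [1.0, 2.0]])
D1 = np.diag([2.0, 3.0])
assert np.allclose(U1 @ D1 @ np.linalg.inv(U1), A)

# Ordering 2: swap the eigenvalues AND swap the columns of U to match
U2 = np.array([[1.0, 1.0],
               [2.0, 1.0]])
D2 = np.diag([3.0, 2.0])
assert np.allclose(U2 @ D2 @ np.linalg.inv(U2), A)

# Rescaled eigenvectors also work: any nonzero multiple spans the same eigenspace
U3 = np.array([[5.0, -1.0],
               [5.0, -2.0]])
assert np.allclose(U3 @ D1 @ np.linalg.inv(U3), A)
```

Both diagonal matrices reproduce A, confirming that the order is a free choice as long as the columns of U match the order of the eigenvalues in D.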

2. Apr 8, 2012

### micromass

Yeah, sure. That's alright too. As long as you end up with a diagonal matrix, there is no preferred order for the eigenvalues.

3. Apr 9, 2012

### DocZaius

The order in which you put the eigenvalues on the diagonal of D only matters in the construction of U and U inverse, where:

$A = U D U^{-1}$

Each possible D will have its own U and $U^{-1}$ pair associated with it.

edit: To be clearer, the way you'd construct U and D (U inverse then follows) is: U is the n×n matrix whose columns are the eigenvectors of A, in whichever order you wish, and D is the n×n diagonal matrix whose entries are the eigenvalues corresponding to the eigenvectors from U (in the same order).
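That construction can be sketched in general with NumPy's `np.linalg.eig` (which returns the eigenvalues and a matrix whose columns are the eigenvectors); the permutation `perm` below is an arbitrary choice of ordering, as described above:

```python
import numpy as np

A = np.array([[1.0, 1.0], [-2.0, 4.0]])

# evecs has the eigenvectors of A as its columns, matching evals by index
evals, evecs = np.linalg.eig(A)

# Choose any order for the eigenvalues; here, ascending.
# Reorder the columns of U the same way so they stay paired with D.
perm = np.argsort(evals)
U = evecs[:, perm]
D = np.diag(evals[perm])

assert np.allclose(U @ D @ np.linalg.inv(U), A)
```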

Last edited: Apr 9, 2012
4. Apr 9, 2012

### stripes

Yes that is right, U's columns are the eigenvectors in the order of the eigenvalues along the main diagonal of D. Thanks everyone for your help.