Diagonalization and similar matrices

In summary, after finding the eigenvalues and eigenvectors of a linear transformation, its matrix is diagonalizable if there are enough linearly independent eigenvectors to form a basis.
  • #1
stripes
When dealing with a linear transformation, once we have computed its matrix and are asked "is this matrix diagonalizable?", I begin by finding the eigenvalues and eigenvectors using the characteristic equation.

Once I have found the eigenvectors, if these vectors are linearly independent, then a theorem tells us the matrix of the linear transformation is, in fact, diagonalizable. Another theorem tells us that the resulting diagonal matrix, D, has the eigenvalues of the original matrix as the entries along its main diagonal.

My questions are as follows: what about the order of these eigenvalues? When I solve the characteristic equation, I can order the roots however I want. This means that the matrix of the transformation is similar to up to n! different diagonal matrices (exactly n! of them if we allow complex roots and the roots are all distinct, since a polynomial of degree n has exactly n roots counted with multiplicity). My other question is: when choosing eigenvectors that correspond to the eigenvalues, can we just choose any nonzero vector in the eigenspace, since oftentimes our eigenvector contains an arbitrary constant that can be any real number? I will illustrate my question with an example:

For the matrix A =
[itex]\left[ \begin{array}{ccc}
1 & 1 \\
-2 & 4 \end{array} \right][/itex]

Solving the characteristic equation [itex]\det(A - \lambda I) = \lambda^2 - 5\lambda + 6 = 0[/itex], we obtain the eigenvalues

[itex]\lambda = 2, \lambda = 3[/itex]

whose associated eigenvectors are

[itex] \left[ \begin{array}{ccc}
1 \\
1 \end{array} \right][/itex]

and

[itex]\left[ \begin{array}{ccc}
1 \\
2 \end{array} \right][/itex]. These eigenvectors were obtained by setting the arbitrary scalar constant to a particular value; could we have chosen any other eigenvectors?

These two vectors are linearly independent, thus the matrix A can be diagonalized. We see that matrix A is similar to matrix D, where D =

[itex]\left[ \begin{array}{ccc}
2 & 0 \\
0 & 3 \end{array} \right][/itex].

Could we not have listed the roots in a different order, and then concluded that A is similar to D, where D =

[itex]\left[ \begin{array}{ccc}
3 & 0 \\
0 & 2 \end{array} \right][/itex]?

I'm confused as to which order we are supposed to choose, if the order even matters (it seems like it does). Sorry if this is a "homework-type" question, but I think my question is more general than a specific textbook exercise. Any help is appreciated.
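(As a quick numerical sanity check, here is a small NumPy snippet I would use to confirm the eigenvalues and eigenvectors above; it is purely illustrative, and NumPy normalizes its eigenvectors, so they come out as scalar multiples of mine.)

[code]
import numpy as np

A = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# np.linalg.eig returns the eigenvalues and unit-length eigenvectors (as columns)
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # approximately [2., 3.] (the order is not guaranteed)

# Verify A v = lambda v for each eigenpair
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))  # True for both
[/code]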
 
  • #2
stripes said:
Could we not have listed the roots in a different order, and then concluded that A is similar to D, where D =

[itex]\left[ \begin{array}{ccc}
3 & 0 \\
0 & 2 \end{array} \right][/itex]?

Yeah, sure. That's alright too, as long as you end up with a diagonal matrix. There is no preferred order for the eigenvalues.
 
  • #3
The order in which you put the eigenvalues on the diagonal of D only matters in the construction of U and U inverse, where:

[itex]A = UDU^{-1}[/itex]

Each possible D will have its own U and U inverse pair associated with it.

Edit: To be clearer, here is how you would construct U and D (U inverse then follows from U): U is the n×n matrix whose columns are the eigenvectors of A, in whichever order you wish, and D is the n×n diagonal matrix whose diagonal entries are the eigenvalues corresponding to the columns of U, in the same order.
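To make that concrete, here is a rough NumPy sketch using the 2×2 example from post #1 (the eigenpairs are assumed from that post); both orderings of the eigenvalues reproduce A, each with its own U:

[code]
import numpy as np

A = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# Columns of U are eigenvectors; D carries the matching eigenvalues in the same order
U1 = np.array([[1.0, 1.0],
               [1.0, 2.0]])        # eigenvector for 2, then eigenvector for 3
D1 = np.diag([2.0, 3.0])

U2 = U1[:, ::-1]                   # swap the columns ...
D2 = np.diag([3.0, 2.0])           # ... and swap the eigenvalues to match

print(np.allclose(U1 @ D1 @ np.linalg.inv(U1), A))  # True
print(np.allclose(U2 @ D2 @ np.linalg.inv(U2), A))  # True
[/code]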
 
  • #4
Yes, that's right: U's columns are the eigenvectors, in the same order as the eigenvalues along the main diagonal of D. Thanks, everyone, for your help.
 
  • #5


I understand your confusion and am happy to provide some clarification on diagonalization and similar matrices. Diagonalization is a process in linear algebra where a square matrix is transformed into a diagonal matrix through a similarity transformation. This process can be useful in simplifying calculations and understanding the behavior of a linear transformation.

To answer your first question, the order of the eigenvalues in the diagonal matrix does not affect the diagonalizability of the original matrix. As you noted, the characteristic equation has several roots, and they can be listed in any order; each ordering gives a diagonal matrix with the same eigenvalues on its main diagonal, and the original matrix is similar to every one of them.

In terms of choosing eigenvectors, you are correct that any nonzero vector in the eigenspace can be chosen. This is because the eigenspace is a subspace of the original vector space, and every nonzero vector in that subspace is an eigenvector for the same eigenvalue. Rescaling an eigenvector by a nonzero constant does not affect the linear independence of the chosen set, and therefore does not affect the diagonalizability of the matrix.
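For instance, a small NumPy check (using the example matrix from your post, with the eigenvectors rescaled by arbitrary nonzero constants) still reproduces A:

[code]
import numpy as np

A = np.array([[1.0, 1.0],
              [-2.0, 4.0]])
D = np.diag([2.0, 3.0])

# Rescale the eigenvectors (1, 1) and (1, 2) by arbitrary nonzero constants
U = np.array([[5.0, -0.3],
              [5.0, -0.6]])        # columns: 5*(1, 1) and -0.3*(1, 2)

print(np.allclose(U @ D @ np.linalg.inv(U), A))  # True: the scaling does not matter
[/code]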

In your example, you have correctly found two linearly independent eigenvectors for matrix A, and thus it is diagonalizable. However, it is important to note that not all matrices are diagonalizable: in some cases there are not enough linearly independent eigenvectors to form a basis, so no invertible matrix of eigenvectors exists. Such a matrix is said to be "defective" and cannot be diagonalized.
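A standard illustration is the 2×2 matrix in the short NumPy sketch below: the eigenvalue 1 is repeated, but its eigenspace is only one-dimensional, so there is no basis of eigenvectors.

[code]
import numpy as np

# Eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

lam = 1.0
eigenspace_dim = 2 - np.linalg.matrix_rank(J - lam * np.eye(2))
print(eigenspace_dim)  # 1 < 2, so J is defective and cannot be diagonalized
[/code]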

I hope this helps to clarify your questions about diagonalization and similar matrices. Keep in mind that these concepts can be complex and may require further study and practice to fully understand. As a scientist, it is important to have a strong understanding of linear algebra and its applications in order to effectively analyze and interpret data.
 

1. What is diagonalization and why is it important in linear algebra?

Diagonalization is the process of finding a diagonal matrix that is similar to a given square matrix. This is important in linear algebra because it simplifies the calculation of powers and inverses of matrices, and allows for easier analysis of matrix properties such as eigenvalues and eigenvectors.
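For example, a short NumPy sketch (using the 2×2 matrix from the thread above) shows how diagonalization reduces a matrix power to powers of scalars:

[code]
import numpy as np

A = np.array([[1.0, 1.0],
              [-2.0, 4.0]])
U = np.array([[1.0, 1.0],
              [1.0, 2.0]])
D_diag = np.array([2.0, 3.0])      # diagonal entries of D

k = 10
A_k = U @ np.diag(D_diag ** k) @ np.linalg.inv(U)       # A^k = U D^k U^{-1}

print(np.allclose(A_k, np.linalg.matrix_power(A, k)))   # True
[/code]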

2. How do you determine if a matrix is diagonalizable?

An n×n matrix is diagonalizable if and only if it has n linearly independent eigenvectors. Having n distinct eigenvalues guarantees this, but it is not necessary: a matrix with repeated eigenvalues can still be diagonalizable if each eigenspace is large enough.

3. Can any matrix be diagonalized?

No, not all matrices can be diagonalized. A matrix can only be diagonalized if it has a full set of linearly independent eigenvectors, as mentioned in the previous question. If a matrix does not have this property, it is called non-diagonalizable.

4. What is the relationship between diagonalization and similarity of matrices?

Diagonalization and similarity are closely related concepts in linear algebra. Similar matrices have the same eigenvalues (their eigenvectors are related by the change-of-basis matrix), and diagonalizing a matrix means finding a diagonal matrix that is similar to it. Essentially, diagonalization produces a similar matrix that is easier to work with.

5. How is diagonalization used in applications?

Diagonalization has many applications in fields such as physics, engineering, and computer science. Some examples include simplifying systems of differential equations, finding optimal solutions in optimization problems, and computing matrix powers and inverses efficiently.
