Can a matrix A be decomposed if it doesn't have n distinct eigenvalues?

In summary, the conversation discusses the eigenvalue decomposition of an n×n matrix and the conditions under which it exists. A matrix is diagonalizable exactly when it has n linearly independent eigenvectors; having n distinct eigenvalues guarantees this, but it is not necessary. If the matrix is not diagonalizable, alternatives such as the Jordan form or the singular value decomposition can be used.
  • #1
JanClaesen
If an n×n matrix A has n different eigenvalues, it has an eigenvalue decomposition. Is it correct that an n×n matrix that doesn't have n different eigenvalues can't be decomposed? Are there more situations in which it can't be decomposed? Why can't I just put the same eigenvalue more than once in the diagonal matrix D? Is it perhaps that P is then no longer invertible?

A=PDP^-1

Ax = PDP^-1 * x
The vector P^-1 * x holds the coordinates of x with respect to the basis formed by the columns of P (the eigenvectors), while x itself holds the coordinates of the same vector with respect to the standard unit vectors.
D then scales every component of this transformed vector, and finally P expresses the scaled vector in the standard basis again.
If a matrix A doesn't have an eigenvalue decomposition, does this imply that the vector x isn't just scaled along fixed directions, but is also rotated or otherwise transformed?
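A small numerical check of this reading, as a sketch only (NumPy is my choice here, and the matrix and vector are made-up illustrative values, not from the thread):

[code]
import numpy as np

# Hypothetical diagonalizable matrix and vector, chosen only for illustration.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
x = np.array([1.0, 1.0])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigvals)
P_inv = np.linalg.inv(P)

coords = P_inv @ x              # coordinates of x in the eigenvector basis (columns of P)
scaled = D @ coords             # each coordinate is scaled by its eigenvalue
result = P @ scaled             # express the scaled vector in the standard basis again

print(np.allclose(result, A @ x))  # True: P D P^{-1} x equals A x
[/code]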
 
  • #2
Since no-one has replied in over a week I'll take a swing at your questions.

First, your post has so many questions that I will not attempt to answer them all.

By "eigenvalue decomposition" I'm assuming you mean that the nxn matrix is similar to a diagonal matrix, so that [tex]A = P D P^{-1}[/tex], where [tex]D[/tex] is diagonal. In order to do this, you form [tex]P[/tex] out of the eigenvectors of [tex]A[/tex]. Hence, you need n linearly independent eigenvectors in order for [tex]P^{-1}[/tex] to exist. If [tex]A[/tex] has n distinct eigenvalues, then you are guaranteed to have n linearly independent eigenvectors, so this case is easy. Another easy case is when [tex]A[/tex] is Hermitian (equal to its conjugate transpose), in which case you are also guaranteed to be able to diagonalize, and furthermore [tex]P[/tex] is unitary (I am assuming you are in a complex vector space, so eigenvalues are allowed to be complex, etc.). Otherwise, in general you cannot expect a matrix to be diagonalizable.

If it isn't, then you have a couple of options. First, you can find a [tex]P[/tex] that makes [tex]A[/tex] as diagonal as possible - this is called the Jordan form, and can be useful at times. Another option is to abandon similarity transformations altogether, and use the singular value decomposition which is another matrix decomposition that can be useful in practice.
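As an illustration of those two fallback options, a hedged sketch: the Jordan form computed symbolically with SymPy's jordan_form, and the SVD with NumPy (the defective matrix is just a standard textbook example, not one from the thread):

[code]
import numpy as np
import sympy as sp

# Jordan form of a defective matrix, computed symbolically with SymPy.
A = sp.Matrix([[1, 1], [0, 1]])
P, J = A.jordan_form()                  # A == P * J * P**-1, with J "as diagonal as possible"
print(J)                                # Matrix([[1, 1], [0, 1]]): a single Jordan block

# The singular value decomposition exists for every matrix, square or not.
B = np.array([[1.0, 1.0], [0.0, 1.0]])
U, s, Vt = np.linalg.svd(B)             # B = U @ diag(s) @ Vt
print(np.allclose(U @ np.diag(s) @ Vt, B))  # True
[/code]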

I hope this helps.

jason
 

1. What is Eigenvalue Decomposition?

Eigenvalue decomposition is a technique from linear algebra that factors a square matrix into its eigenvectors and eigenvalues, writing A = PDP^-1, where the columns of P are eigenvectors and D is a diagonal matrix of eigenvalues. It has applications in various fields of science and engineering.

2. Why is Eigenvalue Decomposition important?

Eigenvalue decomposition is important because it lets us work with a matrix through its eigenvalues, which simplifies operations such as computing matrix powers and running iterative methods like power iteration. It also provides insight into properties of the matrix, such as its determinant (the product of the eigenvalues) and, for a diagonalizable matrix, its rank (the number of nonzero eigenvalues).
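A quick NumPy check of those two claims on a made-up example matrix (a sketch, not part of the original discussion):

[code]
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 3.0]])     # arbitrary illustrative matrix
eigvals, P = np.linalg.eig(A)
P_inv = np.linalg.inv(P)

# Matrix powers become cheap: A^5 = P D^5 P^{-1}, with D^5 just the eigenvalues to the 5th power.
lhs = np.linalg.matrix_power(A, 5)
rhs = P @ np.diag(eigvals**5) @ P_inv
print(np.allclose(lhs, rhs))               # True

# The determinant equals the product of the eigenvalues (here 2 * 3 = 6).
print(np.isclose(np.linalg.det(A), np.prod(eigvals)))  # True
[/code]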

3. How is Eigenvalue Decomposition performed?

Eigenvalue decomposition involves finding the eigenvectors and eigenvalues of a matrix. This can be done by solving the characteristic equation det(A - λI) = 0 for the eigenvalues and then solving (A - λI)v = 0 for the corresponding eigenvectors, or numerically with specialized algorithms such as the QR algorithm or the power method.
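As an illustration of the power method mentioned above, a minimal sketch (my own example, assuming NumPy and a matrix with a unique eigenvalue of largest magnitude; no convergence test is included):

[code]
import numpy as np

def power_method(A, iters=1000, seed=0):
    # Repeatedly multiply and renormalise; the iterate converges to the dominant
    # eigenvector when one eigenvalue is strictly largest in magnitude.
    v = np.random.default_rng(seed).standard_normal(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    return v @ A @ v, v                    # Rayleigh quotient (v is unit norm) estimates the eigenvalue

lam, v = power_method(np.array([[2.0, 0.0], [1.0, 3.0]]))
print(lam)                                 # approximately 3, the largest eigenvalue of this example
[/code]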

4. What are the applications of Eigenvalue Decomposition?

Eigenvalue decomposition has various applications in fields such as physics, engineering, statistics, and data analysis. It is commonly used in image and signal processing, control systems, and quantum mechanics.

5. Is Eigenvalue Decomposition always possible?

No, eigenvalue decomposition is not always possible. It only applies to square matrices, and not every square matrix is diagonalizable: the matrix must have n linearly independent eigenvectors, and so-called defective matrices (a repeated eigenvalue with too few eigenvectors) do not. In addition, a real matrix may have complex eigenvalues and eigenvectors, so its decomposition may only exist over the complex numbers.
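For example, a real rotation matrix has no real eigenvalues, yet it is still diagonalizable once complex eigenvalues are allowed; a quick NumPy check on this illustrative case (not taken from the discussion above):

[code]
import numpy as np

# A 90-degree rotation has no real eigenvalues, but it is diagonalizable over the complex numbers.
R = np.array([[0.0, -1.0], [1.0, 0.0]])
eigvals, P = np.linalg.eig(R)
print(eigvals)                                       # [0.+1.j, 0.-1.j]
D = np.diag(eigvals)
print(np.allclose(P @ D @ np.linalg.inv(P), R))      # True
[/code]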
