Eigenproblem for non-normal matrix

In summary, the conversation discusses the concept of eigendecomposition for normal and non-normal matrices. A normal matrix has a complete, canonical eigendecomposition with a unitary modal eigenvector matrix and a diagonal modal eigenvalue matrix. A non-normal matrix also has an eigendecomposition whenever its modal eigenvector matrix is invertible, although the inverse is then no longer simply the conjugate transpose. The proof of diagonalizability for matrices with distinct eigenvalues is discussed, and it is shown that any matrix with distinct eigenvalues is diagonalizable, whether or not it is normal.
  • #1
swampwiz
I understand that a normal matrix has a complete, canonical eigendecomposition in which the normalized modal eigenvector matrix is unitary, and thus its inverse is simply its conjugate transpose, and the modal eigenvalue matrix is diagonal (let's presume distinct eigenvalues). But I wonder if there is also an eigendecomposition for a non-normal matrix so long as the modal eigenvector matrix is invertible, although it would be clunky to have to take the inverse of [ Φ ], and it would be unclear whether there is a diagonalization (as opposed to it being automatic in the case of a normal matrix). Perhaps what is termed an eigensolution is something that only diagonalizes, and therefore only normal matrices are considered eigensolvable?

[ A ] { x } = λ { x }

get → { λ } & [ Φ ]

{ x } = [ Φ ] { q }

[ A ] [ Φ ] { q } = λ [ Φ ] { q } = [ Φ ] [ Λ ] { q }

[ Φ ]⁻¹ [ A ] [ Φ ] { q } = [ Λ ] { q }
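For concreteness, here is a minimal numerical sketch of these equations (numpy, with a small non-normal example matrix I made up):

Code:
import numpy as np

# a small non-normal matrix: A @ A.conj().T != A.conj().T @ A
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

# eig returns the eigenvalues { lambda } and the modal matrix [ Phi ]
lam, Phi = np.linalg.eig(A)
Lam = np.diag(lam)

# [ Phi ]^-1 [ A ] [ Phi ] recovers the diagonal [ Lambda ]
print(np.allclose(np.linalg.inv(Phi) @ A @ Phi, Lam))  # True
# ...even though [ Phi ] is not unitary here:
print(np.allclose(Phi.conj().T @ Phi, np.eye(2)))      # False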
 
  • #2
StoneTemplePython
(This post assumes we're dealing with ##n \times n## matrices with scalars in ##\mathbb C##.)

Yes, normal matrices are the good stuff. And yes, the conjugate transpose of a unitary matrix (collecting the eigenvectors in this case) is the inverse.

swampwiz said:
the modal eigenvalue matrix is diagonal (let's presume distinct eigenvalues). But I wonder if there is also an eigendecomposition for a non-normal matrix so long as the modal eigenvector matrix is invertible, although it would be clunky to have to take the inverse of [ Φ ], and it would be unclear whether there is a diagonalization (as opposed to it being automatic in the case of a normal matrix)

No: you already stated that the eigenvalues are distinct. As you've pointed out, normal matrices are always diagonalizable. But any matrix with ##n## distinct eigenvalues must be diagonalizable irrespective of normality. That is, eigenvectors belonging to distinct eigenvalues are linearly independent. (Why?) There are several proofs of this, most of which rely on contradiction. By far my favorite proof shows this directly, using a Vandermonde matrix.
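Not the proof itself, but a quick numerical illustration of the fact it hinges on (with eigenvalues I made up): the Vandermonde matrix built from distinct scalars is invertible, since its determinant is the product of all pairwise differences.

Code:
import numpy as np

lam = np.array([2.0, -1.0, 5.0])        # distinct eigenvalues (made up)
V = np.vander(lam, increasing=True)     # row i is [1, lam_i, lam_i**2]

# det(V) equals the product of pairwise differences (lam_j - lam_i), j > i,
# which is nonzero precisely when all the lam_i are distinct
pairwise = np.prod([lam[j] - lam[i]
                    for i in range(3) for j in range(i + 1, 3)])
print(np.isclose(np.linalg.det(V), pairwise))  # True, and nonzero here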
 
  • #3
StoneTemplePython said:
(This post assumes we're dealing with ##n \times n## matrices with scalars in ##\mathbb C##.)

Yes, normal matrices are the good stuff. And yes, the conjugate transpose of a unitary matrix (collecting the eigenvectors in this case) is the inverse.
No: you already stated that the eigenvalues are distinct. As you've pointed out, normal matrices are always diagonalizable. But any matrix with ##n## distinct eigenvalues must be diagonalizable irrespective of normality. That is, eigenvectors belonging to distinct eigenvalues are linearly independent. (Why?) There are several proofs of this, most of which rely on contradiction. By far my favorite proof shows this directly, using a Vandermonde matrix.

OK, so the result of the null-space solutions to the eigenproblem is an "eigendecomposition", no matter what the eigenvector matrix looks like. It is only for a normal matrix that this solution is easily found and also complete. And a complete eigendecomposition only requires that there be a diagonalizing matrix. Should I presume that a non-normal matrix never has a complete eigendecomposition?
 
  • #4
swampwiz said:
OK, so the result of the null-space solutions to the eigenproblem is an "eigendecomposition", no matter what the eigenvector matrix looks like. It is only for a normal matrix that this solution is easily found and also complete. And a complete eigendecomposition only requires that there be a diagonalizing matrix. Should I presume that a non-normal matrix never has a complete eigendecomposition?

Wait, what? Did you see what I wrote in my last post about linear independence among eigenvectors for distinct eigenvalues?

It really is as simple as going through each eigenvalue.

You iterate through the defining relation for each one:

##\mathbf A \mathbf x_k = \lambda_k \mathbf x_k##

collect the relationships in a matrix

where

##\mathbf X := \bigg[\begin{array}{c|c|c|c|c}
\mathbf x_1 & \mathbf x_2 &\cdots & \mathbf x_{n-1} & \mathbf x_n\end{array}\bigg]
##

and diagonal matrix ##\mathbf D## is given by

##\mathbf D := \begin{bmatrix}
\lambda_1 & 0 & \dots & 0 & 0 \\
0 & \lambda_2 & \dots & 0 & 0 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \dots & \lambda_{n-1} & 0 \\
0 & 0 & \dots & 0 & \lambda_n
\end{bmatrix}##

Those eigenvalue relationships can be shown all at once, in matrix form, as:

##\mathbf A \mathbf X = \mathbf X \mathbf D##

(check the multiplication)

- - - -
Now ##\mathbf X## is ##n \times n##, and you said all eigenvalues are unique, so ##\mathbf X## has ##n## linearly independent columns and hence is invertible. Multiply on the right by ##\mathbf X^{-1}## and get

##\mathbf A = \mathbf A \mathbf X \mathbf X^{-1} = \mathbf X \mathbf D \mathbf X^{-1}##

which is the diagonalization decomposition. If you prefer, you can show that ##\mathbf A## is similar to a diagonal matrix, i.e.

##\mathbf X^{-1} \mathbf A \mathbf X = \mathbf D ##
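A quick numerical sanity check of this construction (numpy; the non-normal example matrix is my own):

Code:
import numpy as np

# non-normal, with distinct eigenvalues 4, 2, 1
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 1.0]])

lam, X = np.linalg.eig(A)
D = np.diag(lam)

# each column satisfies A x_k = lambda_k x_k ...
for k in range(3):
    assert np.allclose(A @ X[:, k], lam[k] * X[:, k])

# ... which, collected in matrix form, is A X = X D
assert np.allclose(A @ X, X @ D)

# the eigenvalues are distinct, so X is invertible and A = X D X^-1
assert np.allclose(X @ D @ np.linalg.inv(X), A)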

- - - -

I also don't know what you mean about the decomposition being "easily found"? For large matrices, finding eigenvalues is almost always not easy (though there are some great numerical approaches).
 
  • #5
StoneTemplePython said:
I also don't know what you mean about the decomposition being "easily found"? For large matrices, finding eigenvalues is almost always not easy (though there are some great numerical approaches).

What I meant is that X⁻¹ is easily found since it is simply X†; of course, this doesn't say anything about how easily X itself is found.

I guess that my overall question is whether the case where X⁻¹ is not X† is still a valid "eigenproblem solution". (The only way to have X⁻¹ = X† is for X to be unitary, which can only happen if the original matrix is normal.)

So if I have some Φ that is computed from the null-space solution, I define Λ (which is not necessarily diagonal, if A is not normal, or there are shared eigenvalues) as

Λ = Φ⁻¹ A Φ

and from which follows

Λ† = Φ† A† Φ⁻†

so the normal products of the left & right hand sides of the eigenvalue equation are

LHS = [ A Φ ]† [ A Φ ] = Φ† A† A Φ

RHS = [ Φ Λ ]† [ Φ Λ ] = Λ† Φ† Φ Λ

so LHS = RHS, after substituting A = Φ Λ Φ⁻¹, yields

Φ† Φ⁻† Λ† Φ† Φ Λ Φ⁻¹ Φ = Λ† Φ† Φ Λ

which is known to work if Φ is orthogonal (i.e., unitary with possibly differently scaled columns) and Λ is normal; if A is normal and the eigenvalues are all distinct, then Φ is orthogonal and Λ is diagonal, and thus normal.

But even if these conditions are not met, it seems that it could still turn out to work if the matrices just happen to multiply out to the same net matrix product.

And it seems that even if it doesn't work out completely like this, there is still a "solution", just one that is not complete.
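For what it's worth, here is a small numerical experiment (example matrices are my own) on the Φ†Φ product that keeps appearing above: it collapses to the identity exactly when the eigenvector matrix is unitary, which is what happens for a normal A with distinct eigenvalues.

Code:
import numpy as np

def eigvec_gram(A):
    # returns Phi^dagger Phi for the normalized eigenvector matrix of A
    _, Phi = np.linalg.eig(A)
    return Phi.conj().T @ Phi

A_normal = np.array([[2.0, 1.0],
                     [1.0, 2.0]])       # symmetric, hence normal
A_nonnormal = np.array([[2.0, 1.0],
                        [0.0, 3.0]])    # non-normal, distinct eigenvalues

print(np.allclose(eigvec_gram(A_normal), np.eye(2)))     # True
print(np.allclose(eigvec_gram(A_nonnormal), np.eye(2)))  # False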
 
  • #6
I'm sorry, your post doesn't make much sense to me. You seem to be creating new terms and ignoring standard, well-defined ones.

swampwiz said:
What I meant is that X⁻¹ is easily found since it is simply X†; of course, this doesn't say anything about how easily X itself is found.

I guess that my overall question is whether the case where X⁻¹ is not X† is still a valid "eigenproblem solution". (The only way to have X⁻¹ = X† is for X to be unitary, which can only happen if the original matrix is normal.)

I don't know what constitutes a valid "eigenproblem solution".

swampwiz said:
So if I have some Φ that is computed from the null-space solution, I define Λ (which is not necessarily diagonal, if A is not normal, or there are shared eigenvalues) as

I have no idea what it means to come up with ##\mathbf \Phi## "computed from the null-space solution". If you are using ##\mathbf \Lambda##, it had better be diagonal. If you want it to be 'merely' upper triangular in general, use ##\mathbf J## or ##\mathbf T## or something like that.

If you want to write a matrix as ##\mathbf A = \mathbf{PJP}^{-1}## in terms of a Jordan Canonical Form, that's canonical and fine, but it involves a lot of heavy-duty machinery. You may be interested in Schur Triangularization, ##\mathbf A = \mathbf{UTU}^{*}## with unitary ##\mathbf U## and upper triangular ##\mathbf T##. Schur Triangularization unlocks spectral theory for these normal matrices you keep bringing up.
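For reference, a minimal sketch of Schur Triangularization in code (scipy's schur; the example matrix is arbitrary):

Code:
import numpy as np
from scipy.linalg import schur

A = np.array([[1.0, 2.0],
              [0.5, 3.0]])    # an arbitrary non-normal example

# Schur: A = U T U*, with U unitary and T upper triangular
T, U = schur(A, output='complex')

assert np.allclose(U @ T @ U.conj().T, A)       # A = U T U*
assert np.allclose(U.conj().T @ U, np.eye(2))   # U is unitary
print(np.diag(T))  # the eigenvalues of A sit on the diagonal of T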

Note: "which is not necessarily diagonal, if A is not normal, or there are shared eigenvalues" is wrong.

The correct statement:
##\mathbf A## isn't necessarily diagonalizable (if ##\mathbf A## is not normal, and there are shared / repeated eigenvalues).
- - - - -
swampwiz said:
which is known to work if Φ is orthogonal (i.e., unitary with possibly differently scaled columns) and Λ is normal; if A is normal and the eigenvalues are all distinct, then Φ is orthogonal and Λ is diagonal, and thus normal.

Again, huh?

What I think you mean is that ##\mathbf \Phi## is unitary. The way you toggle between unitary and orthogonal for a matrix is wrong; you seem to be confusing orthogonal (column) vectors with the well-understood definition of an orthogonal matrix. And again, you don't need the eigenvalues to be distinct: if you have repeated eigenvalues, you may still wisely select eigenvectors for any normal ##\mathbf A## such that the resulting ##\mathbf \Phi## is unitary.

- - - -
If you want to understand this stuff, you'll need to do some reading and exercises. In particular, consider the freely available Linear Algebra Done Wrong, chapters 4, 5, and 6. If you want the Jordan decomposition, then chapter 9 as well, though that may be out of reach (and not that great a payoff per amount of time invested).

https://www.math.brown.edu/~treil/papers/LADW/LADW_2017-09-04.pdf
 

1. What is an eigenproblem for non-normal matrix?

An eigenproblem for a non-normal matrix consists of finding the eigenvalues and eigenvectors of a matrix that is not normal. A normal matrix is a square matrix that commutes with its conjugate transpose, while a non-normal matrix does not. This makes the eigenproblem more delicate and can require different methods to solve.

2. What are the applications of solving an eigenproblem for non-normal matrix?

The eigenproblem for non-normal matrices has applications in fields such as physics, engineering, and computer science. Common examples include solving differential equations, analyzing complex systems, and image processing.

3. How do you solve an eigenproblem for non-normal matrix?

The eigenproblem for a non-normal matrix cannot always be solved with the tools that work for normal matrices, such as the spectral theorem. Instead, it calls for more general techniques such as the Schur decomposition, the Jordan decomposition, or the QR algorithm. These methods can be used to find the eigenvalues and, where a full set exists, the eigenvectors of a non-normal matrix.
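As a rough illustration (a sketch only; the condition-number cutoff below is an arbitrary choice, not a standard library convention), one can attempt a diagonalization numerically and fall back to the Schur form when the eigenvector matrix is nearly singular:

Code:
import numpy as np
from scipy.linalg import schur

def decompose(A, cond_limit=1e12):
    # try A = X D X^-1; fall back to Schur A = U T U* if X is ill-conditioned
    lam, X = np.linalg.eig(A)
    if np.linalg.cond(X) < cond_limit:
        return "eig", X, np.diag(lam)
    T, U = schur(A, output='complex')
    return "schur", U, T

print(decompose(np.array([[2.0, 1.0], [0.0, 3.0]]))[0])  # eig
print(decompose(np.array([[2.0, 1.0], [0.0, 2.0]]))[0])  # schur (Jordan block)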

4. What are the challenges of solving an eigenproblem for non-normal matrix?

One of the main challenges is that the eigenvectors of a non-normal matrix need not be orthogonal, unlike those of a normal matrix. Moreover, when eigenvalues are repeated, a non-normal matrix may fail to have a complete set of eigenvectors at all, i.e., it may not be diagonalizable. The eigenvalues may also be complex, which can complicate the calculations.

5. Can a non-normal matrix have complex eigenvalues?

Yes, a non-normal matrix can have complex eigenvalues, even when all of its entries are real. (Note that normality alone does not guarantee real eigenvalues; it is Hermitian matrices whose eigenvalues are always real.) The eigenvalues can also be repeated, which can make it challenging, or outright impossible, to find a complete set of eigenvectors.
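For instance, a quick numerical check with a made-up real, non-normal matrix:

Code:
import numpy as np

A = np.array([[1.0, 2.0],
              [-3.0, 1.0]])            # real entries, non-normal

print(np.allclose(A @ A.T, A.T @ A))   # False: A is not normal
print(np.linalg.eigvals(A))            # complex pair 1 +/- i*sqrt(6)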
