Eigenvalues of A: Same Eigenspaces for A^-1, Transpose, A^k

In summary, the conversation discusses the relationship between the eigenspaces and eigenvalues of a matrix A and its variants A^-1, the transpose of A, and A^k. Eigenvectors of A carry over to these variants, but the converse is not guaranteed for A^k with integer k > 1: an eigenvector of A^k need not be an eigenvector of A. A counterexample shows that the eigenspaces of A and A^k may fail to match for a non-diagonalizable matrix, and it is stated that the correspondence holds for diagonalizable matrices.
  • #1
JinM
Is this a correct realization? The eigenspaces corresponding to the eigenvalues of A are the same as the eigenspaces corresponding to the eigenvalues of A^-1, transpose of A, and A^k for any k > 1.

It took me some time to realize this, but the v's don't change when you manipulate these equations. So I'm led to believe that the eigenvectors are actually the same for all such variants of A.
 
  • #2


If A is invertible, then clearly v is an eigenvector for A if and only if it is an eigenvector for A^-1.
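For the record, the computation behind this: if Av = λv with v ≠ 0, then λ ≠ 0 because A is invertible, and multiplying both sides by A^-1 gives

    A^-1 v = (1/λ) v,

so v is an eigenvector of A^-1 with eigenvalue 1/λ. The converse is the same argument applied to A^-1.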

Unless you define 'variants' of A, we can't answer your second question. I'll attempt a guess: no, the eigenvectors of A^2 are not the same as the eigenvectors of A.
 
  • #3


Sorry for the ambiguity -- you knew what I meant anyway.

Why aren't the eigenvectors of A^k for k > 1 the same as the eigenvectors of A?
 
  • #4


Why should they be?
 
  • #5


JinM said:
Sorry for the ambiguity -- you knew what I meant anyway.

Why aren't the eigenvectors of A^k for k > 1 the same as the eigenvectors of A?

Assuming that you are referring to an integer k, it's true that eigenvectors of A are automatically eigenvectors of A^k. However, you don't have any guarantee that eigenvectors of A^k are eigenvectors of A.
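The forward direction is a one-line induction: if Av = λv, then

    A^k v = A^(k-1)(Av) = λ A^(k-1) v = ... = λ^k v,

so v is an eigenvector of A^k with eigenvalue λ^k.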
 
  • #6


Wait a second.

If x is an eigenvalue of A, then x^k is an eigenvalue of A^k (k is an integer).

Are you guys saying that there could possibly be other eigenvalues of A^k that differ from all x^k's (the eigenvalues of A raised to the k)? That's why we can't guarantee matching eigenvectors -- is that it?
 
  • #7


But is that even possible? A^k is always a square matrix, whose order matches that of matrix A.

Hmm..

If we take a 3x3 matrix A with eigenvalues -1, 1, and 4, then A^2 will have eigenvalues 1 (with algebraic multiplicity 2) and 4. A will have three eigenspaces, while A^2 will have two eigenspaces. So the eigenspace of A corresponding to the eigenvalue -1 will not "live" in the eigenspace of A^2 corresponding to the eigenvalue (-1)^2 = 1. But that contradicts what Manchot is saying.. hrmph
 
  • #8


JinM said:
Wait a second.

If x is an eigenvalue of A, then x^k is an eigenvalue of A^k (k is an integer).

Are you guys saying that there could possibly be other eigenvalues of A^k that differ from all x^k's (the eigenvalues of A raised to the k)?

No, we're definitely not saying that.

That's why we can't guarantee matching eigenvectors -- is that it?

No.
 
  • #9


JinM said:
But is that even possible? A^k is always a square matrix, whose order matches that of matrix A.

What do you mean by order?

Hmm..

If we take a 3x3 matrix A with eigenvalues -1, 1, and 4, then A^2 will have eigenvalues 1 (with algebraic multiplicity 2) and 4. A will have three eigenspaces, while A^2 will have two eigenspaces. So the eigenspace of A corresponding to the eigenvalue -1 will not "live" in the eigenspace of A^2 corresponding to the eigenvalue (-1)^2 = 1. But that contradicts what Manchot is saying.. hrmph

You mean 16, not 4, for the e-value of A^2.

Manchot stated that you cannot assume an e-vector of A^k is an e-vector of A for all matrices. You're saying he's wrong just because you can do it for one (diagonalizable) matrix.

Certainly if A is diagonalizable, then e-vectors of A are e-vectors of A^k, and as long as the k-th powers of the eigenvalues remain distinct, the converse holds as well. However, being diagonalizable is a very special property.
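To make this concrete, here is a minimal NumPy sketch (the nilpotent matrix below is a standard textbook counterexample, not necessarily the one from the thread): every non-zero vector is an eigenvector of A^2, yet most of them are not eigenvectors of A.

```python
import numpy as np

# A non-diagonalizable (nilpotent) 2x2 matrix: A^2 is the zero matrix,
# so *every* non-zero vector is an eigenvector of A^2 with eigenvalue 0.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
A2 = A @ A

v = np.array([1.0, 1.0])
print(A2 @ v)   # [0. 0.]  -> v is an eigenvector of A^2 (eigenvalue 0)
print(A @ v)    # [1. 0.]  -> not a scalar multiple of v, so v is NOT
                #             an eigenvector of A itself
```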
 

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are mathematical concepts used to analyze linear transformations. An eigenvector is a non-zero vector whose direction is unchanged by the transformation, and the corresponding eigenvalue is the scalar factor by which the transformation scales it.
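In symbols: a non-zero vector v is an eigenvector of a square matrix A with eigenvalue λ exactly when

    Av = λv,  v ≠ 0.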

What is the significance of having the same eigenspaces for A^-1, Transpose, and A^k?

Having the same eigenspaces for A^-1, the transpose, and A^k means that these transformations share the same set of eigenvectors. Their eigenvalues are then related in a simple way: an eigenvector of A with eigenvalue λ has eigenvalue 1/λ for A^-1 and λ^k for A^k, while the transpose has the same eigenvalues as A. This can provide useful information about the relationship between these transformations and can simplify calculations.

How can I find the eigenvalues and eigenvectors of a matrix?

To find the eigenvalues and eigenvectors of a matrix A, solve the characteristic equation det(A - λI) = 0; its roots are the eigenvalues. Then substitute each eigenvalue λ into (A - λI)v = 0 and solve for the non-zero solutions v, which are the corresponding eigenvectors.
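In practice a numeric library does both steps at once. A minimal sketch with NumPy, using the thread's eigenvalues -1, 1, and 4 on a diagonal matrix purely for illustration:

```python
import numpy as np

# Illustrative matrix with eigenvalues -1, 1, and 4 (as in the thread).
A = np.diag([-1.0, 1.0, 4.0])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors`
                                              # are the eigenvectors
print(eigenvalues)                            # -1, 1, and 4

# Check A v = lambda v for each eigenpair:
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```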

What does it mean when a matrix has repeated eigenvalues?

When a matrix has repeated eigenvalues, an eigenvalue appears more than once as a root of the characteristic polynomial, i.e., it has algebraic multiplicity greater than 1. The matrix may then fail to have enough linearly independent eigenvectors for that eigenvalue, which can prevent it from being diagonalizable.
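A minimal sketch of this situation (the 2x2 shear matrix is an illustrative choice, not one from the thread):

```python
import numpy as np

# Shear matrix: characteristic polynomial (1 - lambda)^2, so the
# eigenvalue 1 has algebraic multiplicity 2...
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
w, V = np.linalg.eig(A)
print(w)   # [1. 1.]  -> repeated eigenvalue
print(V)   # ...but both columns point (up to rounding) along [1, 0]:
           # only one independent eigenvector, so A is not diagonalizable
```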

Can a matrix have complex eigenvalues and eigenvectors?

Yes, a matrix can have complex eigenvalues and eigenvectors. This can happen even when all of the matrix's entries are real, as with rotation matrices. The complex eigenvalues and eigenvectors can still be used to analyze the matrix's behavior and properties.
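For instance, a real rotation by 90 degrees keeps no real vector pointing in the same direction, so its eigenvalues come out complex. A quick NumPy check:

```python
import numpy as np

# Rotation by 90 degrees: all entries real, yet no real vector keeps
# its direction, so the eigenvalues are the complex pair i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
w, V = np.linalg.eig(R)
print(w)   # [0.+1.j 0.-1.j]
```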
