Linear Algebra: Diagonalization, Transpose, and Distinct Eigenvectors

Summary
If an n×n matrix A has n linearly independent eigenvectors, then its transpose A^T also has n linearly independent eigenvectors. This is established by noting that A is diagonalizable, A = PDP^-1, where the columns of P are the independent eigenvectors. The eigenvalues of A and A^T are the same, so when the eigenvalues are distinct, each matrix has n eigenvectors belonging to distinct eigenvalues, and such eigenvectors are automatically linearly independent. In the case of repeated eigenvalues, the argument instead applies the algebraic properties of the transpose to the diagonalization, giving A^T = (P^T)^-1 D P^T. Hence A^T is also diagonalizable, confirming it possesses n linearly independent eigenvectors.
JTemple

Homework Statement



Show that if an n×n matrix A has n linearly independent eigenvectors, then so does A^T.

The Attempt at a Solution



Well, I understand the following:

(1) A is diagonalizable.

(2) A = PDP^-1, where the columns of P are the independent eigenvectors.

(3) P is invertible, since its columns (the independent eigenvectors) are linearly independent.

(4) A^T has the same eigenvalues as A (a quick sketch of why is given just below).
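
For reference, here is a minimal sketch of why (4) holds, using only the fact that transposition does not change the determinant:

```latex
% A and A^T share the same characteristic polynomial,
% because det(M) = det(M^T) for any square matrix M.
p_{A^{T}}(\lambda) = \det(A^{T} - \lambda I_n)
                   = \det\big((A - \lambda I_n)^{T}\big)
                   = \det(A - \lambda I_n)
                   = p_{A}(\lambda)
```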


So, based mainly on (4), I can prove the "easy" case, in which there are n distinct eigenvalues.

I.e., if A has n distinct eigenvalues, then A^T has those same distinct eigenvalues. Thus, if lambda_1 through lambda_n are distinct, they correspond to eigenvectors v_1 through v_n for A and w_1 through w_n for A^T, and eigenvectors belonging to distinct eigenvalues are automatically linearly independent.

In this case, the eigenvectors could be the same (in the case that A=A^T), but don't have to be.
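
As a quick numerical illustration of this distinct-eigenvalue case (not part of the proof; the non-symmetric matrix below is an arbitrary example chosen only for illustration), a NumPy check might look like this:

```python
import numpy as np

# Arbitrary non-symmetric 3x3 matrix, chosen only for illustration;
# its eigenvalues happen to be distinct.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])

eigvals_A, eigvecs_A = np.linalg.eig(A)
eigvals_AT, eigvecs_AT = np.linalg.eig(A.T)

# Same eigenvalues (up to ordering) ...
print(np.sort_complex(eigvals_A))
print(np.sort_complex(eigvals_AT))

# ... but the eigenvector matrices are generally different, and each has
# rank 3, i.e. n linearly independent eigenvectors for A and for A^T.
print(np.linalg.matrix_rank(eigvecs_A), np.linalg.matrix_rank(eigvecs_AT))
```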


My problem!

What if the eigenvalues are not distinct, i.e., there is some lambda_i with multiplicity k?

I understand that in the case of A, the homogeneous system (A - lambda_i*I_n)x = 0 must have k free variables in order to give the (stated) n linearly independent eigenvectors.

So, how can I prove that the repeated eigenvalues in the A^T case ALSO correspond to (A^T-lambda_i*I_n) having k free variables and thus n linearly independent eigenvectors?
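
For what it's worth, one relevant fact here is that rank(M) = rank(M^T) for any matrix M, so A^T - lambda_i*I_n = (A - lambda_i*I_n)^T has the same rank, and hence the same number of free variables, as A - lambda_i*I_n. A small NumPy check on a diagonalizable matrix with a repeated eigenvalue (the matrices P and D below are chosen only for illustration):

```python
import numpy as np

# Diagonalizable example with eigenvalue 2 of multiplicity k = 2 and a
# simple eigenvalue 5; P and D are chosen only for illustration.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
D = np.diag([2.0, 2.0, 5.0])
A = P @ D @ np.linalg.inv(P)

lam, n = 2.0, A.shape[0]

rank_A  = np.linalg.matrix_rank(A   - lam * np.eye(n))
rank_AT = np.linalg.matrix_rank(A.T - lam * np.eye(n))

# Both ranks equal n - k = 1, so both homogeneous systems have k = 2
# free variables, i.e. a 2-dimensional eigenspace for lambda = 2.
print(rank_A, rank_AT)
```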
 
Using just the equations you've written, and the elementary algebraic properties of the transpose, I bet you can diagonalize A^T, and thus get an algebraic formula for its eigenvectors.
 
I still can't piece the last bit together. I realize that A^T has linearly independent columns, and rows that aren't scalar multiples of each other. I also see that the transpose has the same diagonal entries as the original. However, the systems I'm setting up with the transpose to find eigenvectors aren't the same at all. In the cases where I try with numbers they still work, which is pretty cool, but I can't seem to guarantee k free variables in the (A^T - lambda*I) system.
 
I think I got it.

A = PDP^(-1)
A^T = (P^-1)^T * D^T * P^T
D^T = D
(P^-1)^T = (P^T)^-1

A^T = (P^T)^-1 * D * P^T

Since P is invertible, it has linearly independent columns and so does P^T.

So let M = (P^T)^-1, which is also invertible (it is the inverse of an invertible matrix) and therefore has linearly independent columns.

A^T = MDM^-1

Therefore A^T is diagonalizable.

Therefore A^T has n linearly independent eigenvectors (the columns of M).
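
A quick numerical sanity check of this construction (the matrices P and D below are the same illustrative, arbitrarily chosen ones as in the earlier check):

```python
import numpy as np

# Check A^T = M D M^{-1} with M = (P^T)^{-1}, on an illustrative
# diagonalizable matrix with a repeated eigenvalue.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
D = np.diag([2.0, 2.0, 5.0])
A = P @ D @ np.linalg.inv(P)

M = np.linalg.inv(P.T)

# The diagonalization of A^T holds up to floating-point error ...
print(np.allclose(A.T, M @ D @ np.linalg.inv(M)))

# ... and each column of M is an eigenvector of A^T for the
# corresponding diagonal entry of D.
for i in range(3):
    print(np.allclose(A.T @ M[:, i], D[i, i] * M[:, i]))
```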
 
