Simultaneous Diagonalizability of A and B

  • Thread starter: Daron

Homework Help Overview

The discussion revolves around the simultaneous diagonalizability of two commuting diagonalizable linear operators, A and B. The original poster seeks to prove that if A is diagonal and commutes with B, then B must also be diagonal, leading to a shared Jordan basis.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the implications of the commutation relation AB = BA, particularly how it affects the eigenvectors of A and B. There is discussion about the behavior of eigenvectors when the multiplicity of eigenvalues is considered, and whether B can be represented as a block diagonal matrix mirroring A's structure.

Discussion Status

Participants have provided insights into the relationship between the eigenvectors of A and B, particularly in the context of eigenvalue multiplicities. There is an ongoing exploration of how B interacts with the eigenspaces of A, with some guidance offered on the implications of linear combinations of eigenvectors.

Contextual Notes

There is a focus on the uniqueness of Jordan forms and the constraints imposed by the properties of diagonalizable matrices. The discussion also touches on the potential restrictions regarding the block diagonal form of diagonalizable matrices.

Daron

Homework Statement



A and B are commuting diagonalizable linear operators. Prove that they are simultaneously diagonalizable.

Homework Equations



AB = BA


The Attempt at a Solution



We deal with the problem in the Jordan basis of A, where A is diagonal, as Jordan forms are unique.

Then by rearranging the basis vectors, we can treat A as a block diagonal matrix, where the blocks are of the form λiI.

I aim to prove that, if A is diagonal, and commutes with B, then B must also be diagonal, so they have the same Jordan basis.

I can prove that B must also be a block diagonal matrix, with the dimensions of the blocks mirroring those of A.
This is because if a nonzero entry of B existed outside of the blocks, the corresponding entries of AB and BA would be that entry multiplied by two different eigenvalues of A, so the products could not be equal.

But from here I don't know what to do next. Is there some restriction preventing a diagonalizable matrix from having a block diagonal form in which the blocks are not themselves diagonalizable?
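For anyone who wants to see the block-diagonal observation concretely, here is a small NumPy sketch (the matrices are made up for illustration; A has a repeated eigenvalue, and B is nonzero only where A's diagonal entries agree):

```python
import numpy as np

# A diagonal with a repeated eigenvalue (2, 2, 5).
A = np.diag([2.0, 2.0, 5.0])

# For AB = BA with A diagonal, entry (i, j) of AB - BA is
# B[i, j] * (lambda_i - lambda_j), so B must vanish wherever
# A's diagonal entries differ -- B is block diagonal like A.
B = np.array([[1.0, 3.0, 0.0],
              [4.0, 2.0, 0.0],
              [0.0, 0.0, 7.0]])

assert np.allclose(A @ B, B @ A)  # they commute

# A nonzero entry outside the blocks breaks commutativity:
C = B.copy()
C[0, 2] = 1.0  # couples the 2-eigenspace to the 5-eigenspace
print(np.allclose(A @ C, C @ A))  # False
```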
 


You can use the fact that AB=BA to show that if x is an eigenvector of A with eigenvalue λ, then Bx is also an eigenvector of A with eigenvalue λ.

If the eigenvalue has multiplicity 1, then what can you say about x in relation to B?

What happens when the multiplicity is greater than 1?
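The hint can be checked numerically: since A(Bx) = B(Ax) = λ(Bx), the vector Bx stays in the λ-eigenspace of A. A quick sketch (same made-up commuting pair as above):

```python
import numpy as np

A = np.diag([2.0, 2.0, 5.0])
B = np.array([[1.0, 3.0, 0.0],
              [4.0, 2.0, 0.0],
              [0.0, 0.0, 7.0]])  # commutes with A (illustrative example)
assert np.allclose(A @ B, B @ A)

x = np.array([1.0, 0.0, 0.0])    # eigenvector of A with eigenvalue 2
Bx = B @ x
# A(Bx) = B(Ax) = 2 * Bx, so Bx is still in the 2-eigenspace of A:
print(np.allclose(A @ Bx, 2.0 * Bx))  # True
```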
 


If the multiplicity is 1, then A(Bx) = λ(Bx), so Bx is an eigenvector of A with eigenvalue λ. But since the λ-eigenspace is one-dimensional, Bx must be a scalar multiple of x, so x is also an eigenvector of B.

I've moved on to trying to prove how commuting matrices share a basis of eigenvectors, which implies that both are diagonal in the basis of eigenvectors.

If multiplicity is over 1, then B can shuffle through the eigenvectors. For example if there are two eigenvectors with eigenvalue λ, then I can't see why the following can't be true:

Bxi = λxj
Bxj = λxi.
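That kind of shuffle really can happen, and it is worth noting that B is still diagonalizable when it does: its eigenvectors are then sums and differences of the x's. A small sketch (μ is an arbitrary made-up scalar, unrelated to A's eigenvalue):

```python
import numpy as np

# Inside the 2-eigenspace of A, let B swap the two basis
# eigenvectors, scaled by mu: B x1 = mu x2, B x2 = mu x1.
A = np.diag([2.0, 2.0])
mu = 3.0
B = np.array([[0.0, mu],
              [mu, 0.0]])
assert np.allclose(A @ B, B @ A)

# B is still diagonalizable: x1 + x2 and x1 - x2 are its eigenvectors.
v_plus = np.array([1.0, 1.0])
v_minus = np.array([1.0, -1.0])
print(np.allclose(B @ v_plus, mu * v_plus))     # True
print(np.allclose(B @ v_minus, -mu * v_minus))  # True
```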
 


That can be true, but what's important is that B merely shuffles among the eigenvectors of A with the same eigenvalue so that any linear combination of xi and xj will still be an eigenvector of A.
 


Is the idea that every eigenvector of B with eigenvalue λ can be formed from a linear combination of the eigenvectors of A with eigenvalue λ, and that these combinations are still eigenvectors of A due to linearity?
 


Yes, that's the idea.

I'm not sure if you meant to say that the eigenvalue associated with B and the eigenvalue associated with A are equal. They're generally not.
 


They're equal up to a scalar multiple.

So for every eigenvalue λ with multiplicity m, we will get a system of m linear equations of the form Bxi = Σj aij xj, which shows that the eigenspace is invariant under B.

And because B is diagonalizable, it has a basis of eigenvectors, so we may consider B acting only on this invariant eigenspace and find a basis of m linearly independent eigenvectors of B within it. Each will be a linear combination of eigenvectors of A with eigenvalue λ, and hence an eigenvector of A itself.

And then vice versa. Thanks.
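The whole construction can be sketched in code: diagonalize A, then diagonalize B restricted to each eigenspace of A. The helper name `simultaneous_eigenbasis` and the matrices are made up; for simplicity this uses symmetric matrices so eigenvectors can be taken orthonormal via `np.linalg.eigh`, but the general diagonalizable case works the same way with linearly independent (not necessarily orthogonal) bases:

```python
import numpy as np

def simultaneous_eigenbasis(A, B, tol=1e-9):
    """Common eigenbasis of commuting symmetric A, B (illustrative sketch)."""
    lams, P = np.linalg.eigh(A)   # columns of P: orthonormal eigenbasis of A
    blocks = []
    i = 0
    while i < len(lams):
        # group the columns belonging to one (possibly repeated) eigenvalue of A
        j = i
        while j < len(lams) and abs(lams[j] - lams[i]) < tol:
            j += 1
        V = P[:, i:j]                  # basis of this eigenspace of A
        # B maps the eigenspace to itself, so restrict B and diagonalize there
        _, W = np.linalg.eigh(V.T @ B @ V)
        blocks.append(V @ W)           # eigenvectors of both A and B
        i = j
    return np.hstack(blocks)

# two commuting symmetric matrices (made up for the demo)
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
B = np.array([[1.0, 3.0, 0.0],
              [3.0, 1.0, 0.0],
              [0.0, 0.0, 7.0]])
assert np.allclose(A @ B, B @ A)

Q = simultaneous_eigenbasis(A, B)
DA = Q.T @ A @ Q
DB = Q.T @ B @ Q
print(np.allclose(DA, np.diag(np.diag(DA))))  # True: A is diagonal in basis Q
print(np.allclose(DB, np.diag(np.diag(DB))))  # True: B is diagonal in basis Q
```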
 
