Eigenvectors of commuting matrices

I can't follow an argument in Horn and Johnson's Matrix Analysis, in a suggestion (actually an outline of a proof) that follows Problem 8 in the problems after Section 1.3 (p. 55 in my copy).

They argue that if A and B are complex square matrices of order n which commute, and if all eigenvalues of B are distinct, then for any eigenvector x of B corresponding to an eigenvalue u, Ax is also an eigenvector of B. This apparently follows simply from the fact that B(Ax) = A(Bx) = u(Ax).
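Written out (just my own unpacking of that one line, using the commutativity hypothesis), the computation seems to be
\[
B(Ax) = (BA)x = (AB)x = A(Bx) = A(u x) = u(Ax),
\]
so Ax lies in the eigenspace of B associated with u, and hence is an eigenvector of B whenever it is nonzero.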

However, they skip over why Ax cannot be the zero vector. Is this obvious? I am clearly missing something.
 
There's no reason why Ax can't be zero: if you pick A = 0, then clearly A and B commute, but Ax is zero.
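Concretely, a minimal instance of that counterexample (my own made-up numbers):
\[
A = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}, \qquad
B = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}, \qquad
x = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.
\]
B has the distinct eigenvalues 1 and 2, A commutes with B, and x is an eigenvector of B with eigenvalue 1, yet Ax = 0, which is not an eigenvector of anything.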
 
Right. That is a simple counterexample.
 
Are you talking about eigenvalue 0 or eigenvector 0?

Typically, we say that an "eigenvector" is a NON-ZERO vector x such that Ax = \lambda x, simply because A0 = \lambda 0 for any linear transformation A and any number \lambda. Of course, the 0 vector is in the "eigenspace" for any eigenvalue.
 
I meant the eigenvector. My problem was with the claim that Ax is an eigenvector of B whenever x is an eigenvector of B, even though it was not obvious to me why Ax cannot be the zero vector.
 
Well, if Ax is the zero vector then it cannot be an eigenvector. I think they did forget to mention that possibility, but you managed to figure it out anyway.
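So the corrected statement of the hint would presumably read:
\[
AB = BA, \quad Bx = u x, \quad x \neq 0 \;\Longrightarrow\; B(Ax) = u(Ax),
\]
so either Ax = 0, or Ax is an eigenvector of B associated with the same eigenvalue u.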
 