Matrix Multiplication and Commuting
Why might two matrices commute? That is, why would AB = BA, given that in general matrices do not commute? What properties do commuting matrices share?
Ben 
Diagonal matrices commute.
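A quick numerical check of this (using numpy, which is not part of the thread, just an illustration): multiplying two diagonal matrices simply multiplies corresponding diagonal entries, so order cannot matter.

```python
import numpy as np

# Two diagonal matrices: their product is the diagonal matrix of
# entrywise products, so AB and BA must agree.
A = np.diag([2.0, 3.0, 5.0])
B = np.diag([7.0, 11.0, 13.0])

print(np.allclose(A @ B, B @ A))                          # True
print(np.allclose(A @ B, np.diag([14.0, 33.0, 65.0])))    # True
```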

I forgot to mention there is a geometrical way of looking at this: suppose the diagonal is filled with scalars; then each a_{ii} stretches any object the matrix acts on in the i-th direction. For example, if a_{11} is 5, then any object acted on by the matrix will be stretched by a factor of 5 in the x direction, if you are using an x, y, z coordinate system. In a diagonal matrix the eigenvalues are precisely the entries on the diagonal.
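The stretching picture can be seen directly by acting on a vector (a small numpy sketch, not from the thread):

```python
import numpy as np

# A diagonal matrix stretches each coordinate axis independently:
# a_{11} = 5 stretches the x direction by a factor of 5, while the
# y and z directions are left alone.
A = np.diag([5.0, 1.0, 1.0])
v = np.array([1.0, 2.0, 3.0])

print(A @ v)  # [5. 2. 3.]
```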

I completely left out the geometric intuition I sought to express: stretching in the x direction, then the y, is the same as stretching in the y direction, then the x. The two operations commute, and so do the matrices associated with them.

More generally: two matrices, A and B, commute if and only if they are "simultaneously diagonalizable": that is, if there exists some invertible matrix M such that MAM^{-1} = D1 and MBM^{-1} = D2, where D1 and D2 are diagonal matrices (more generally, Jordan normal form if A and B are not diagonalizable).
It follows then that A = M^{-1}D1M and B = M^{-1}D2M. Then AB = M^{-1}D1MM^{-1}D2M = M^{-1}D1D2M = M^{-1}D2D1M (since all diagonal matrices commute) = M^{-1}D2MM^{-1}D1M = BA.
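The computation above can be verified numerically; here's a sketch in numpy (the random M is assumed invertible, which holds for a generic random matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))   # generic matrix, assumed invertible
Minv = np.linalg.inv(M)

# Build A and B from the SAME change of basis M but different
# diagonal matrices D1, D2 -- i.e. simultaneously diagonalizable.
D1 = np.diag([1.0, 2.0, 3.0])
D2 = np.diag([4.0, 5.0, 6.0])
A = Minv @ D1 @ M
B = Minv @ D2 @ M

# AB = M^{-1} D1 D2 M = M^{-1} D2 D1 M = BA
print(np.allclose(A @ B, B @ A))  # True
```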
That is the proof I couldn't remember.

How does the proof in the other direction go? I fiddled around with it a bit and wasn't getting very far.

It wasn't a proof, just geometric intuition, given by the fact that it doesn't matter which order something is stretched in (the final result will be the same), and the fact that a diagonal matrix stretches any object it acts on, by the degree given by the i-th eigenvalue, in the i-th direction. There is an excellent geometric discussion of how eigenvalues and eigenvectors act on an object available here:
http://hverrill.net/courses/linalg/linalg8.html I don't have anything better to add to it. 
Ok... I can prove that if A is not defective then A and B commute iff they are simultaneously diagonalizable. Does anyone know how to do it when both are defective?
Here's a sketch of what I have so far. Assume A is not defective and finite dimensional. Then choose a basis that diagonalizes A. The (i,k)-th entry of AB is A_{ii}B_{ik}, and the (i,k)-th entry of BA is B_{ik}A_{kk}. So AB = BA iff, for all (i,k), B_{ik} = 0 or A_{ii} = A_{kk}. If A_{ii} = A_{jj}, then A is a scalar multiple of the identity over the subspace spanned by the i-th and j-th basis vectors, so we can replace these two vectors with basis vectors that diagonalize B over that subspace. By repeating this process, we can produce a basis that simultaneously diagonalizes both A and B. I don't know what to do if A is defective or infinite dimensional (though I admit not having taken a crack at modifying the above proof to use transfinite induction to tackle the infinite dimensional case). :frown:
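The entrywise condition in the sketch above (B_{ik} = 0 or A_{ii} = A_{kk}) can be illustrated concretely; here's a numpy check (my own example, not from the thread) with A diagonal and a repeated eigenvalue:

```python
import numpy as np

# A diagonal with a repeated eigenvalue: A = diag(2, 2, 7).
A = np.diag([2.0, 2.0, 7.0])

# B commutes with A iff B_{ik} = 0 whenever A_{ii} != A_{kk}:
# B may be arbitrary on the 2x2 block for eigenvalue 2, but the
# entries coupling eigenvalue 2 to eigenvalue 7 must vanish.
B_good = np.array([[1.0, 4.0, 0.0],
                   [5.0, 9.0, 0.0],
                   [0.0, 0.0, 3.0]])
B_bad = B_good.copy()
B_bad[0, 2] = 1.0  # couples A_{11} = 2 to A_{33} = 7

print(np.allclose(A @ B_good, B_good @ A))  # True
print(np.allclose(A @ B_bad, B_bad @ A))    # False
```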
That proof should work for the infinite dimensional case as is, because it already uses induction in the repeated conversion of basis vectors, two by two, into basis vectors that diagonalize B over subspaces, as long as A is not defective. A proof when both A and B are defective is not easy. Good luck.

Re: Matrix Multiplication and Commuting
Quote:
this is not an iff, is it? AB can equal BA without A being diagonalizable
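A concrete instance of this objection (my example, not from the thread): a Jordan block is not diagonalizable, yet it commutes with any polynomial in itself.

```python
import numpy as np

# A 2x2 Jordan block: not diagonalizable (it has only one
# independent eigenvector), yet it commutes with any polynomial
# in itself, such as J^2 + I.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
P = J @ J + np.eye(2)  # a polynomial in J

print(np.allclose(J @ P, P @ J))  # True
```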
Re: Matrix Multiplication and Commuting
Can you give me an example of how to find the matrices that commute with a given matrix A?

© 2014 Physics Forums