What can commute with a diagonal matrix?

Summary
The discussion centers on the conditions under which a diagonal matrix can commute with another matrix, denoted B. It is initially proposed that either the diagonal matrix is proportional to the identity or B itself must be diagonal, and the cases where B is symmetric or where some diagonal entries are equal are explored. If B is symmetric, the two matrices commute only if their product is symmetric, which appears to lead back to the original conditions. A counterexample is then found: when the diagonal matrix has repeated entries, a non-diagonal B can commute with it, since B may be arbitrary on the diagonal blocks corresponding to groups of equal entries. The participants conclude that the initial conjecture needs revision.
weetabixharry
I have two matrices which commute, one of which is definitely diagonal:

\textbf{B}diag\{\underline{\lambda}\} = diag\{\underline{\lambda}\}\textbf{B}

and I want to know what I can say about \textbf{B} and/or \underline{\lambda}. Specifically, I feel that either one or both of the following must be correct:

(1) diag\{\underline{\lambda}\} is proportional to identity.
(2) \textbf{B} is diagonal.
[ignoring the trivial cases where one or both matrices equal the zero matrix]

But are there other cases where these two matrices can commute? That is, is it possible for \textbf{B} to be non-diagonal while the elements of \underline{\lambda} are not all identical?
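The question can be probed numerically. The sketch below (using numpy; the specific matrices are illustrative, not from the thread) checks that a diagonal matrix with all-distinct entries fails to commute with a non-diagonal B, while any diagonal B commutes:

```python
import numpy as np

# Diagonal matrix with all-distinct entries.
lam = np.array([1.0, 2.0, 3.0])
D = np.diag(lam)

# A non-diagonal B: with distinct lambdas, it should fail to commute.
B = np.array([[1.0, 5.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
print(np.allclose(B @ D, D @ B))            # False

# A diagonal B always commutes with D.
B_diag = np.diag([4.0, 5.0, 6.0])
print(np.allclose(B_diag @ D, D @ B_diag))  # True
```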
 
Hi weetabixharry! :smile:

What about if B is a symmetric matrix?
Or what if some of the lambda's are identical, forming a submatrix that is proportional to identity?
Or what if a lambda is zero?
 
I like Serena said:
...

If \textbf{B} is symmetric, then:

\textbf{B}diag\{\underline{\lambda}\} = [diag\{\underline{\lambda}\}\textbf{B}]^T

Therefore the two matrices only commute if diag\{\underline{\lambda}\}\textbf{B} is symmetric. I feel like this can only happen in the 2 cases I stated above, because each element of \underline{\lambda} multiplies across an entire row of \textbf{B}.

I'm not sure how to approach the other two cases you mentioned.
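The symmetric case can be checked numerically. In this sketch (illustrative 2x2 matrices, using numpy), B is symmetric but not diagonal: the identity \textbf{B}diag\{\underline{\lambda}\} = [diag\{\underline{\lambda}\}\textbf{B}]^T holds automatically, but diag\{\underline{\lambda}\}\textbf{B} is not symmetric, so the matrices fail to commute:

```python
import numpy as np

lam = np.array([1.0, 2.0])   # distinct diagonal entries
D = np.diag(lam)
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # symmetric but not diagonal

print(np.allclose((D @ B).T, B @ D))   # True: follows from B = B.T
print(np.allclose(D @ B, (D @ B).T))   # False: D @ B is not symmetric
print(np.allclose(B @ D, D @ B))       # False: so they do not commute
```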
 
weetabixharry said:
...

Yes, you are right.
B being symmetric doesn't help.

I just checked a 2x2 matrix with a zero on the diagonal.
It still yields that B must be diagonal if all the lambdas are different.

If we have a set of equal lambdas, we can split B into sub-blocks and multiply the matrices block-wise.
A sub-block of B on the diagonal that corresponds to a block of equal lambdas always commutes.
A sub-block of B that is not on the diagonal has to be zero.
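A quick sketch of this block-wise observation (illustrative matrices, using numpy): the lambdas (3, 3, 2) group the indices into {1, 2} and {3}; a B that is arbitrary on the corresponding diagonal blocks and zero elsewhere commutes, and any nonzero entry in an off-diagonal block breaks commutation:

```python
import numpy as np

# lambda = (3, 3, 2): the first two entries are equal, so the
# corresponding 2x2 diagonal block of B may be arbitrary.
D = np.diag([3.0, 3.0, 2.0])

# B is block-diagonal with respect to the groups {1, 2} and {3}:
# an arbitrary 2x2 block plus a 1x1 block, zero off-diagonal blocks.
B = np.array([[7.0, 2.0, 0.0],
              [5.0, 1.0, 0.0],
              [0.0, 0.0, 4.0]])
print(np.allclose(B @ D, D @ B))  # True

# Any entry in an off-diagonal block breaks commutation.
B[0, 2] = 1.0
print(np.allclose(B @ D, D @ B))  # False
```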
 
I like Serena said:
...

To prove this it is useful to write the commutator in components:
\sum_j B_{ij}\lambda_j \delta_{jl}=\sum_j \lambda_i \delta_{ij}B_{jl}
B_{il}\lambda_l=\lambda_i B_{il}
B_{il}(\lambda_l-\lambda_i)=0
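The final line B_{il}(\lambda_l-\lambda_i)=0 can be verified numerically for arbitrary B: elementwise, the commutator diag\{\underline{\lambda}\}\textbf{B} - \textbf{B}diag\{\underline{\lambda}\} equals (\lambda_i - \lambda_l)B_{il}. A sketch with numpy (the random matrices are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = rng.standard_normal(4)
B = rng.standard_normal((4, 4))
D = np.diag(lam)

# Componentwise, (DB - BD)_{il} = (lambda_i - lambda_l) B_{il},
# so the commutator vanishes exactly where lambda_i == lambda_l
# or B_{il} == 0.
lhs = D @ B - B @ D
rhs = np.subtract.outer(lam, lam) * B
print(np.allclose(lhs, rhs))  # True
```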
 
aesir said:
...

Ah yes, this is an excellent way of seeing it. Many thanks for that! (Though I feel the RHS of the first line should have \lambda_j instead of \lambda_i... although, since \lambda_i \delta_{ij} = \lambda_j \delta_{ij}, the result is the same.)

Quick example of a non-diagonal matrix commuting with a non-proportional-to-identity diagonal matrix:

\left[\begin{array}{lll}
7&2&0 \\
0&1&0 \\
0&0&4
\end{array}\right]
\left[\begin{array}{lll}
3&0&0 \\
0&3&0 \\
0&0&2
\end{array}\right]
=
\left[\begin{array}{lll}
3&0&0 \\
0&3&0 \\
0&0&2
\end{array}\right]
\left[\begin{array}{lll}
7&2&0 \\
0&1&0 \\
0&0&4
\end{array}\right]
=
\left[\begin{array}{lll}
21&6&0 \\
0&3&0 \\
0&0&8
\end{array}\right]

Unfortunately, this ruins the proof I was writing... back to the drawing board I guess...
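For completeness, the counterexample above checks out numerically (a one-off verification with numpy):

```python
import numpy as np

# The counterexample: lambda = (3, 3, 2) is not proportional to the
# identity, yet this non-diagonal B commutes with diag(lambda).
B = np.array([[7.0, 2.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 4.0]])
D = np.diag([3.0, 3.0, 2.0])

print(np.allclose(B @ D, D @ B))  # True
print(B @ D)                      # [[21, 6, 0], [0, 3, 0], [0, 0, 8]]
```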
 
