How does matrix non-commutativity relate to eigenvectors?

In summary, the given conditions state that two matrices A and B do not commute (Condition 1) and have no common eigenvectors (Condition 2). The question is whether these conditions are equivalent and, if not, how they are related. The conversation also raises a possible connection between non-commuting matrices and the uncertainty principle in quantum mechanics. It is noted that two commuting Hermitian matrices share a complete set of common eigenvectors, but the converse is not necessarily true: sharing an eigenvector does not force two matrices to commute. The conversation ends with a discussion of the relationship between commutativity, simultaneous observations, and common eigenvectors. Ultimately, it is concluded that if two Hermitian matrices satisfy Condition 3 (they commute), then they satisfy Condition 4 (they share at least one eigenvector), but Condition 4 alone does not imply Condition 3.
  • #1
snoopies622
Given matrices A,B and
Condition 1: AB does not equal BA
Condition 2: A and B do not have common eigenvectors

are these two conditions equivalent? If not, exactly how are they related? Since I'm thinking about quantum mechanics I'm wondering specifically about Hermitian matrices, but I'm also curious about how this applies to matrices in general. Thank you!
 
  • #2
If in Condition 2, by 'have common eigenvectors', you mean that A and B have exactly the same sets of eigenvectors, with the same multiplicities, then I think Condition 1 implies Condition 2 (i.e. identical eigenvector sets imply that the two matrices commute). To see that, just compare the actions of AB and BA on each element of a common eigenbasis. They will be the same.
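A quick numerical sanity check of that direction, as a sketch in numpy (the basis and eigenvalues below are arbitrary illustrations, not taken from the thread): build A and B from the same orthonormal eigenbasis with different eigenvalues and confirm they commute.

```python
import numpy as np

# An arbitrary orthonormal basis Q, obtained via QR decomposition.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# A and B share the eigenbasis Q but have different eigenvalues.
A = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
B = Q @ np.diag([5.0, -1.0, 4.0]) @ Q.T

# Identical eigenvector sets => AB = BA (up to floating-point error).
print(np.allclose(A @ B, B @ A))  # True
```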

But I do not think that A and B commuting implies they have the same sets of eigenvectors. Counterexample: ##A=\pmatrix{0&1\\1&0},B=\pmatrix{1&0\\0&2}##. These commute but have different eigenvectors.
 
  • #3
Actually for Condition 2 I meant that they have no common eigenvectors at all. So to express the whole thing in contrapositives, let's call

Condition 3: AB=BA
Condition 4: A and B have at least one common eigenvector.

Are these equivalent? (Also, unless I can't do matrix multiplication, I'm getting AB not quite equal to BA for the two matrices in entry #2)
 
  • #4
snoopies622 said:
Actually for Condition 2 I meant that they have no common eigenvectors at all. So to express the whole thing in contrapositives, let's call

Condition 3: AB=BA
Condition 4: A and B have at least one common eigenvector.

Are these equivalent? (Also, unless I can't do matrix multiplication, I'm getting AB not quite equal to BA for the two matrices in entry #2)
What about ##A=0## and ##B=1\,##?
 
  • #5
fresh_42 said:
What about ##A=0## and ##B=1\,##?

Hmm, funny case. It appears to me that both Conditions 3 and 4 are satisfied here, but I'm wondering if they imply one another in every case.
 
  • #6
snoopies622 said:
Hmm, funny case. It appears to me that both Conditions 3 and 4 are satisfied here, but I'm wondering if they imply one another in every case.
How do they have a common eigenvector? Or do you allow the eigenvectors to coincide while the eigenvalues differ?
 
  • #7
That's what I had in mind, yes. Never thought about these cases before but I guess any vector is an eigenvector of matrix 1 with eigenvalue 1 and likewise any vector is an eigenvector of matrix 0 with eigenvalue 0.
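A one-line check of that observation, as a sketch in numpy (the test vector is an arbitrary choice): every vector is an eigenvector of the identity matrix with eigenvalue 1, and of the zero matrix with eigenvalue 0.

```python
import numpy as np

# Any vector works here; this one is arbitrary.
v = np.array([3.0, -1.0, 2.0])

print(np.allclose(np.eye(3) @ v, 1.0 * v))         # True: eigenvalue 1
print(np.allclose(np.zeros((3, 3)) @ v, 0.0 * v))  # True: eigenvalue 0
```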

My wondering about this matter springs from quantum mechanics, and specifically what the connection is between non-commuting matrices and the uncertainty principle. Specifically - if two matrices do not commute, does that mean that they have no common eigenvectors? But I would also like a more general and precise understanding of the relationship here between conditions 1 and 2 above (or 3 and 4).
 
  • #8
There isn't the direct connection you're looking for. ##[A,B]=0## has a priori nothing to do with eigenvectors. Linear transformations can be split into a semisimple and a nilpotent part, and there are various theorems about these parts. E.g. for complex vector spaces, semisimple transformations are diagonalizable, and this diagonalization can be done simultaneously for commuting semisimple transformations.

If you allow arbitrary transformations, or fields that are not algebraically closed, then counterexamples of all kinds can be found. One important remark: a semisimple linear transformation is basically defined by diagonalizability (over the algebraic closure). It has nothing to do, at least not on this level, with the semisimplicity of Lie algebras, whether they are defined as matrix algebras or not. However, the classification of semisimple Lie algebras starts with Jordan decompositions, i.e. the decomposition of linear transformations into a semisimple and a nilpotent part. An important theorem is: a semisimple, finite-dimensional, linear Lie algebra contains all semisimple and nilpotent parts of its elements. So there are connections between the various concepts, just not as simple as you might have thought.
 
  • #9
Well that was a much deeper answer than I anticipated, thank you! Here's the section of H.S. Green's "Matrix Mechanics" (1965) that got me thinking.

"A hermitian linear operator (observable) represents a measurable quantity, and its eigenvalues the results of the measurement. If the simultaneous measurement of two different quantities is possible, the observables representing these quantities commute; otherwise, they fail to commute." (p. 34)

So if two matrices A,B representing two different observable quantities have a common eigenvector, then the physical system can be in a state that's an eigenvector for both, and can therefore have a precisely defined value for both physical quantities. If A and B have no common eigenvector, then no such state exists and so the two corresponding physical quantities cannot be simultaneously precisely defined.

Hence a relationship between commutativity, simultaneous observations, and common eigenvectors.
 
  • #10
Hermitian is a very strong property. Hermitian matrices are diagonalizable, so two commuting Hermitian matrices are simultaneously diagonalizable, i.e. we can find a basis of eigenvectors in this case.
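A numerical sketch of simultaneous diagonalization in numpy (the two matrices below are illustrative choices, not from the thread): for a commuting Hermitian pair, the eigenbasis of a generic linear combination ##A + cB## diagonalizes both at once.

```python
import numpy as np

# Two commuting real symmetric (hence Hermitian) matrices, chosen for illustration.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
assert np.allclose(A @ B, B @ A)  # they commute

# For a generic coefficient c, the eigenvectors of A + c*B are eigenvectors
# of both A and B, so this one eigendecomposition diagonalizes the pair.
c = 0.7
_, V = np.linalg.eigh(A + c * B)

DA = V.T @ A @ V
DB = V.T @ B @ V
print(np.allclose(DA, np.diag(np.diag(DA))))  # True: A diagonal in this basis
print(np.allclose(DB, np.diag(np.diag(DB))))  # True: B diagonal in this basis
```

(A degenerate choice of c could fail to separate eigenspaces; for this pair any nonzero c works.)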
 
  • #11
I am just browsing here but think of your space as decomposed into an orthogonal product of a line and a hyperplane, say L x W. To say the two operators have a common eigenvector, which we may assume lies in L, says nothing at all about the behavior of the two operators on the complementary hyperplane W. I.e. for an operator that preserves orthogonality, if it maps L into itself it also maps W into itself, but that is the only common property of the two operators we seem to have. So this is like asking if we have two operators on LxW that both preserve not only L but also W, then they do have a common eigenvector, namely any ≠0 vector in L, but do they have to have any common properties when restricted to W? Well, no, as long as dim(W) > 1.
 
  • #12
So I guess my original question should be as follows: Given hermitian matrices A,B and
Condition 3: AB=BA
Condition 4: A and B have at least one common eigenvector
Are Conditions 3 and 4 equivalent?
 
  • #13
If ##[A,B]=0## then there is even a common basis of eigenvectors for both of them, and therefore in particular at least one common eigenvector. For the other direction, I think it is not true. I would follow the way @mathwonk has sketched, with three or four dimensions and only one eigenvector in common, to look for a counterexample. An easy example could be one common eigenvector plus, for the rest, some permutation matrices with different signatures (orientations) - but this is only a guess, which looks easy to verify.
 
  • #14
snoopies622 said:
So I guess my original question should be as follows: Given hermitian matrices A,B and
Condition 3: AB=BA
Condition 4: A and B have at least one common eigenvector
Are Conditions 3 and 4 equivalent?
As far as QM is concerned, no, these are not equivalent. If two Hermitian operators commute, then they share a complete set of eigenvectors.

If they do not commute, then they may have common eigenvectors, but not a complete set.

To put it another way, if two operators share a single eigenvector, then they do not necessarily commute. Simple counterexamples can be found as advised by @mathwonk.

Some QM texts may give the impression that non-commuting operators cannot share any common eigenvectors, but that is false.
 
  • #15
andrewkirk said:
Counterexample: ##A=\pmatrix{0&1\\1&0},B=\pmatrix{1&0\\0&2}##. These commute but have different eigenvectors.
##\left[\pmatrix{0&1\\1&0},\pmatrix{1&0\\0&2} \right]=\pmatrix{0&1\\-1&0}\neq 0##
If we replace ##2## by ##1## then they commute, are Hermitian, but have both eigenvectors in common.
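The two computations above are easy to reproduce numerically; a sketch in numpy:

```python
import numpy as np

A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])

# The commutator [A, B] = AB - BA is nonzero, so A and B do not commute.
comm = A @ B - B @ A
print(comm)  # [[ 0.  1.]
             #  [-1.  0.]]

# Replacing 2 by 1 turns B into the identity, which commutes with everything.
B2 = np.eye(2)
print(np.allclose(A @ B2 - B2 @ A, 0))  # True
```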
 
  • #16
So in the context of quantum mechanics, suppose matrices A and B do not commute but have a common eigenvector. This implies that, if the system is in the state corresponding to this eigenvector, it simultaneously has an exact value for both of the physical observables corresponding to A and B, even though A and B do not commute.

Am I correct? This definitely seems to contradict what H.S. Green implies in the quote in entry #9 above.
 
  • #17
PeroK said:
Some QM texts may give the impression that non-commuting operators cannot share any common eigenvectors, but that is false.

I think block matrices are instructive. Consider the following example:

##\mathbf A := \begin{bmatrix} \mathbf I_m & \mathbf 0\\
\mathbf 0 & \mathbf X_n \end{bmatrix}##

##\mathbf B := \begin{bmatrix} \mathbf I_m & \mathbf 0\\
\mathbf 0 & \mathbf Y_n \end{bmatrix}##

where ##m## and ##n## are positive integers.

If ##\mathbf A## and ##\mathbf B## don't commute (i.e. if ##\mathbf X_n## and ##\mathbf Y_n## don't commute), they still certainly have common eigenvectors; in fact we can say that ##\{\mathbf e_1, \mathbf e_2, ..., \mathbf e_m\}## are all common eigenvectors, and there may be even more.

We can of course also consider things like
##\mathbf A := \begin{bmatrix} \mathbf C_m & \mathbf 0\\
\mathbf 0 & \mathbf X_n \end{bmatrix}##

##\mathbf B := \begin{bmatrix} \mathbf C_m & \mathbf 0\\
\mathbf 0 & \mathbf Y_n \end{bmatrix}##

where ##\mathbf C_m## is Hermitian; still we can say ##\mathbf A## and ##\mathbf B## have at least ##m## common (mutually orthonormal) eigenvectors. From a graph-theory standpoint, all I did was put two disjoint/disconnected graphs together into one adjacency matrix, so the result isn't very surprising. But there are lots of other variations that can be done as well.
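This block construction is easy to verify numerically; a sketch in numpy with ##m=1## and the Pauli x and z matrices as an assumed choice of non-commuting Hermitian blocks:

```python
import numpy as np

# Non-commuting 2x2 Hermitian blocks (Pauli x and z, an illustrative choice).
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

# Block matrices sharing the 1x1 identity block, as in the construction above.
A = np.block([[np.eye(1), np.zeros((1, 2))], [np.zeros((2, 1)), X]])
B = np.block([[np.eye(1), np.zeros((1, 2))], [np.zeros((2, 1)), Z]])

e1 = np.array([1.0, 0.0, 0.0])
print(np.allclose(A @ B, B @ A))  # False: A and B do not commute
print(np.allclose(A @ e1, e1))    # True: e1 is an eigenvector of A
print(np.allclose(B @ e1, e1))    # True: e1 is an eigenvector of B
```

So Condition 4 (a common eigenvector) holds here while Condition 3 (commuting) fails.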
 
  • #18
Thanks all, will pick this up tomorrow after I (hopefully) find a couple of very good QM texts in my storage unit. There are some details about the axioms that I'm not remembering exactly and that are probably critical here.
 
  • #19
fresh_42 said:
##\left[\pmatrix{0&1\\1&0},\pmatrix{1&0\\0&2} \right]=\pmatrix{0&1\\-1&0}\neq 0##
If we replace ##2## by ##1## then they commute, are Hermitian, but have both eigenvectors in common.
D'oh!
 
  • #21
So is there a rule in quantum mechanics such that any non-commuting operators (representing physical observables) do not have any common eigenvectors? Otherwise I don't know how to avoid the apparent contradiction I described in entry #16.
 
  • #22
snoopies622 said:
So is there a rule in quantum mechanics such that any non-commuting operators (representing physical observables) do not have any common eigenvectors? Otherwise I don't know how to avoid the apparent contradiction I described in entry #16.

There's no contradiction. You may have two observables whose compatibility depends on the state.
 
  • #23
Ok, I just want to be clear: You're saying that even within the limited context of quantum mechanics it's possible for two non-commuting operators to have a common eigenvector. Does this not imply that a physical state exists that is an eigenstate of both operators? I thought one way the uncertainty principle is stated is that if two observables are represented by non-commuting operators then no state exists in which both observables simultaneously have exact values.
 
  • #24
snoopies622 said:
Ok, I just want to be clear: You're saying that even within the limited context of quantum mechanics it's possible for two non-commuting operators to have a common eigenvector. Does this not imply that a physical state exists that is an eigenstate of both operators? I thought one way the uncertainty principle is stated is that if two observables are represented by non-commuting operators then no state exists in which both observables simultaneously have exact values.

As this thread has proved, there are operators that do not commute on the whole vector space, but do commute on a subspace.

That translates in QM to a contradiction of your version of the uncertainty principle.

The general uncertainty principle - look it up - involves the expected value of the commutator. That requires a specific state.

However, some operators have a minimum uncertainty across all states.
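The state dependence of the general (Robertson) uncertainty relation ##\sigma_A \sigma_B \geq |\langle [A,B] \rangle|/2## can be sketched in numpy; the Pauli x and z matrices and the two states below are illustrative choices, not from the thread:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli x
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli z

def sigma(op, psi):
    """Standard deviation of a Hermitian operator in the (normalized) state psi."""
    exp = np.vdot(psi, op @ psi).real
    exp2 = np.vdot(psi, op @ op @ psi).real
    return np.sqrt(exp2 - exp**2)

def bound(opA, opB, psi):
    """Right-hand side |<[A,B]>|/2 of the Robertson uncertainty relation."""
    comm = opA @ opB - opB @ opA
    return abs(np.vdot(psi, comm @ psi)) / 2

# In an eigenstate of Z the commutator expectation vanishes: the bound is 0,
# and indeed sigma_Z = 0, even though X and Z do not commute.
psi1 = np.array([1, 0], dtype=complex)
print(bound(X, Z, psi1))                              # 0.0
print(sigma(X, psi1) * sigma(Z, psi1))                # 0.0

# In another state the bound is nontrivial (and here saturated).
psi2 = np.array([1, 1j]) / np.sqrt(2)
print(round(bound(X, Z, psi2), 10))                   # 1.0
print(round(sigma(X, psi2) * sigma(Z, psi2), 10))     # 1.0
```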
 
  • #25
PeroK, I would very much appreciate your thoughts on the essay I linked in entry #20. It's pretty short, and the author's thinking seems very close to what brought me here. Thanks.
 
  • #26
snoopies622 said:
PeroK I would very much appreciate your thoughts on the essay I linked in entry #20. It's pretty short and the author's thinking seems very close to what brought me here. Thanks.

I would post on the QM forum to see whether anyone has an example of state-dependent uncertainty.

I would be interested in the answer.

Note that there are a lot of Hermitian operators, but most physics is concerned with just a few observables!

Again, it would be interesting to know how many operators are in practical use.
 
  • #27
PS spin-1 states are represented by 3d vectors, so you could definitely construct two observables with the "partially" commuting property.
 

1. How does matrix non-commutativity affect eigenvectors?

Matrix non-commutativity refers to the fact that the order of matrix multiplication matters: if A and B do not commute, then A*B is not equal to B*A. This constrains their eigenvectors. Two non-commuting matrices cannot share a complete basis of common eigenvectors, although, as the discussion above shows, they may still share some individual eigenvectors.

2. What is the relationship between matrix non-commutativity and eigenvalues?

Commutativity is closely tied to simultaneous diagonalization. If two diagonalizable matrices commute, they can be diagonalized in the same basis, so their eigenvalues can be read off along a shared set of eigenvectors. If they do not commute, no such shared basis exists, and the two eigenvalue problems must be treated separately.

3. Can matrix non-commutativity affect the diagonalizability of a matrix?

Diagonalizability is a property of a single matrix, while commutativity is a property of a pair. Non-commutativity does not make an individual matrix non-diagonalizable, but it does rule out simultaneous diagonalizability: two diagonalizable matrices can be diagonalized by one and the same similarity transformation if and only if they commute.

4. How does matrix non-commutativity relate to the Jordan canonical form?

The Jordan canonical form represents a matrix that is not diagonalizable in terms of Jordan blocks built from its eigenvalues and generalized eigenvectors. As noted above, the underlying Jordan decomposition splits a transformation into commuting semisimple and nilpotent parts; for non-commuting matrices, the Jordan forms of A and B generally cannot be realized in a common basis.

5. Are there any real-world applications of matrix non-commutativity and eigenvectors?

Yes. In quantum mechanics, observables are represented by Hermitian operators, and non-commuting operators correspond to incompatible observables, which is the content of the uncertainty principle. Eigenvectors and eigenvalues also appear throughout applied mathematics, for example in principal component analysis, which is used for dimensionality reduction in data analysis.
