Is Matrix Addition Commutative?

Discussion Overview

The discussion revolves around the properties of matrix addition, specifically whether it is commutative. Participants explore the implications of defining the sum of linear operators and their corresponding matrices, examining the nuances of dummy variables in the context of Einstein summation convention.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant presents a proof that if ##C'=A'+B'## for linear operators, then the corresponding matrices satisfy ##C=A+B##, using the Einstein summation convention.
  • Another participant questions the validity of changing dummy variables in the expression ##C_{ij}v_j##, suggesting that it could lead to confusion in subsequent steps.
  • It is noted that while dummy variables can be renamed, doing so in a way that changes the structure of the equation can obscure the relationships between terms.
  • A participant emphasizes that removing dummy variables from expressions can lead to nonsensical results, particularly when free indices are involved.
  • There is a discussion about the implications of manipulating dummy variables and how it affects the ability to factor out components from sums.
  • Another participant reinforces the definition of matrix multiplication and its relationship to the addition of matrices, indicating that if the equality holds for all vectors v, then the matrix equality follows.

Areas of Agreement / Disagreement

Participants generally agree on the properties of dummy variables and the structure of matrix addition, but there is some contention regarding the manipulation of these variables and its implications for clarity in mathematical expressions.

Contextual Notes

Participants express uncertainty about the consequences of changing dummy variables and the potential for misinterpretation in mathematical expressions. The discussion highlights the importance of maintaining clarity in notation when dealing with indices in matrix operations.

albega
Suppose we have linear operators ##A'## and ##B'##. We define their sum ##C'=A'+B'## such that
##C'|v\rangle=(A'+B')|v\rangle=A'|v\rangle+B'|v\rangle##.

Now we can represent ##A',B',C'## by matrices ##A,B,C## respectively. I have a question about proving that if ##C'=A'+B'##, then ##C=A+B##. The proof is

Using the above with the Einstein summation convention,
##C|v\rangle=A|v\rangle+B|v\rangle##,
and so component ##i## on each side matches. Then
##C_{ij}v_j=A_{ij}v_j+B_{ij}v_j##,
which holds for any ##|v\rangle##, so
##C=A+B##,
as this is how we define matrix addition.

However, why couldn't I have written
##C_{ij}v_j=A_{ik}v_k+B_{il}v_l##
because I have changed only dummy variables, not affecting the sum. This would then not lead to ##C_{ij}v_j=A_{ij}v_j+B_{ij}v_j##. I'm assuming the next step somehow stops this from happening anyway, but I'm not sure why.
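The point about dummy indices can be checked numerically. A minimal sketch (the matrices and vector below are arbitrary random examples, not from the thread): `np.einsum` lets us spell the same contraction with different dummy-index letters, and the result is identical.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
v = rng.standard_normal(3)
C = A + B

# The dummy-index label is irrelevant: 'ij,j->i', 'ik,k->i' and
# 'il,l->i' all denote the same sum over the repeated index.
lhs = np.einsum('ij,j->i', C, v)
rhs = np.einsum('ik,k->i', A, v) + np.einsum('il,l->i', B, v)
assert np.allclose(lhs, rhs)
```

Renaming a dummy index only changes the label of the summation variable, never the value of the sum, which is why both spellings give the same components here.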
 
albega said:
However, why couldn't I have written
##C_{ij}v_j=A_{ik}v_k+B_{il}v_l##
because I have changed only dummy variables, not affecting the sum.

Yes, you could have written that, but it would change nothing, since the only difference is in what you call the dummy variables. You could just rename them back and recover your original form.
 
Orodruin said:
Yes, you could have written that, but it would change nothing, since the only difference is in what you call the dummy variables. You could just rename them back and recover your original form.

It's just that if I do say
##C_{ij}v_j=A_{ik}v_k+B_{il}v_l##
and then I write
##C_{ij}=A_{ik}+B_{il}##
it looks like a different story, and we can't see that ##k## and ##l## were initially dummy variables, so we can't simply say it's OK to change them both to ##j##.

Ahh, actually, if we do that then we can't cancel the ##v## components out of each of the sums any more, can we? That answers my question...
 
##k## and ##l## are dummy indices; you cannot just take them away from the ##v##, or you would end up with an expression that does not make sense (you cannot have different free indices on the right when the left-hand side is ##C_{ij}##). However, you can rename them both to ##j## (they are dummy indices after all) and then factorise the ##v_j## out.
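The rename-then-factorise step above can be sketched numerically (again with arbitrary random matrices, assuming numpy): the two separately-labelled contractions ##A_{ik}v_k+B_{il}v_l## equal the single factored contraction ##(A_{ij}+B_{ij})v_j##.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
v = rng.standard_normal(4)

# A_{ik}v_k + B_{il}v_l: two sums with differently named dummy indices.
separate = np.einsum('ik,k->i', A, v) + np.einsum('il,l->i', B, v)

# Rename both dummies to j, then factor v_j out: (A_{ij} + B_{ij}) v_j.
factored = np.einsum('ij,j->i', A + B, v)

assert np.allclose(separate, factored)
```

Factoring ##v_j## out is only legal once both dummy indices carry the same name, which the renaming provides.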
 
albega said:
It's just that if I do say
##C_{ij}v_j=A_{ik}v_k+B_{il}v_l##
and then I write
##C_{ij}=A_{ik}+B_{il}##
it looks like a different story, and we can't see that ##k## and ##l## were initially dummy variables, so we can't simply say it's OK to change them both to ##j##.

Ahh, actually, if we do that then we can't cancel the ##v## components out of each of the sums any more, can we? That answers my question...
Use the definition of matrix-vector multiplication: ##(Av)_i=A_{ij}v_j##.
\begin{align}
&C_{ij}v_j=(Cv)_i\\
&A_{ik}v_k+B_{il}v_l=(Av)_i+(Bv)_i=(Av+Bv)_i
\end{align} Since the left-hand sides are equal, the right-hand sides are equal, and we have ##Cv=Av+Bv=(A+B)v##. If this holds for all ##v##, then ##C=A+B##.
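The final step ("if this holds for all ##v##, then ##C=A+B##") can be made concrete by feeding in the standard basis vectors, which read off the columns of ##C## one at a time. A sketch with arbitrary random matrices, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Suppose all we know is the action v -> Av + Bv.  Feeding in the
# standard basis vector e_j recovers column j of the matrix C that
# represents this action, since (C e_j)_i = C_{ij}.
C = np.empty((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = 1.0
    C[:, j] = A @ e + B @ e  # column j of C, read off from C e_j

# Every column agrees with the corresponding column of A + B,
# so the matrices are equal entry by entry.
assert np.allclose(C, A + B)
```

Equality on the ##n## basis vectors already pins down every entry of ##C##, which is why "for all ##v##" forces the matrix identity.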
 
