Prove trace of matrix: Tr(AB) = Tr(BA)

SUMMARY

The discussion centers on proving the cyclic property of the trace of matrices, specifically that Tr(AB) = Tr(BA) for matrices A and B. Participants emphasize the importance of understanding matrix multiplication and the properties of the trace, including its behavior under permutations of matrix products. Key insights include the use of rank one matrices for intuitive proofs and the applicability of the trace identity to non-square matrices. The conversation highlights the need for clarity in notation, particularly regarding row and column vectors.

PREREQUISITES
  • Understanding of matrix multiplication and properties of traces
  • Familiarity with linear algebra concepts, particularly rank one matrices
  • Knowledge of matrix notation, including row and column vectors
  • Basic comprehension of linear algebra in the context of quantum mechanics
NEXT STEPS
  • Study the properties of matrix traces in detail, focusing on cyclic permutations
  • Learn about rank one matrices and their role in matrix multiplication
  • Explore the implications of trace identities for non-square matrices
  • Review linear algebra resources, such as "Linear Algebra Done Wrong," for deeper insights
USEFUL FOR

Students and professionals in mathematics, physics, and engineering, particularly those working with linear algebra and quantum mechanics, will benefit from this discussion.

DrMCoeus

Homework Statement


The trace of a matrix is defined to be the sum of its diagonal matrix elements.
1. Show that Tr(ΩΛ) = Tr(ΛΩ)
2. Show that Tr(ΩΛθ) = Tr(θΩΛ) = Tr(ΛθΩ) (the permutations are cyclic)

my note: the cross here, ##U^+##, is supposed to signify the adjoint of the unitary matrix U

Homework Equations



$$
\operatorname{Tr}(\Omega) = \sum_{i=1}^n \Omega_{ii} \\
I = \sum_{k=1}^n |k\rangle\langle k|
$$

The Attempt at a Solution



$$
\operatorname{Tr}(\Omega) = \sum_{i=1}^n \Omega_{ii} \\
\operatorname{Tr}(\Lambda\Omega) = \sum_{i=1}^n (\Lambda\Omega)_{ii} \\
\Omega_{ij} = \langle i|\Omega|j\rangle
$$

(This is saying that when we take the product of the matrices, we sum the diagonal entries, i.e. the elements in the ith row and ith column.)

1.
$$
\operatorname{Tr}(\Lambda\Omega) = \sum_{i=1}^n (\Lambda\Omega)_{ii} \\
= \sum_{i=1}^n \langle i|\Lambda\Omega|i\rangle \\
= \sum_{i=1}^n \langle i|\Lambda I \Omega|i\rangle \\
= \sum_{k=1}^n \sum_{i=1}^n \langle i|\Lambda|k\rangle\langle k|\Omega|i\rangle \\
= \sum_{k=1}^n \sum_{i=1}^n \Lambda_{ik}\,\Omega_{ki}
$$

I'm unsure how to finish; I think I'm on the right track, but my thinking is a bit cloudy.

To finish the proof, I need to show the following, since this is where we end up if we start from the reversed product, as we want:

$$
\sum_{k=1}^n \sum_{i=1}^n \Lambda_{ik}\,\Omega_{ki} = \sum_{k=1}^n \sum_{i=1}^n \Omega_{ki}\,\Lambda_{ik}
$$

So correct me where/if I'm wrong.

The way I'm thinking about it is in terms of matrix multiplication. The trace only sums the diagonal elements of a matrix, so when multiplying two matrices/operators, the only terms that 'survive' are those that land on the diagonal, i.e. the ith row times the ith column. This is reflected in the indices: ##\Lambda_{ik}## picks out the ith row of Λ, which multiplies the ith column of Ω via ##\Omega_{ki}## as we sum over k = 1 to k = n.

Is it possible that we can just swap the two indices k and i, since they both run to n? Can I just swap their positions, since the matrix elements are scalars and thus commute?
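
If that is allowed, I think the finish would look like this (my attempt, so please check it):

$$
\sum_{k=1}^n \sum_{i=1}^n \Lambda_{ik}\,\Omega_{ki}
= \sum_{k=1}^n \sum_{i=1}^n \Omega_{ki}\,\Lambda_{ik}
= \sum_{k=1}^n \big(\Omega\Lambda\big)_{kk}
= \operatorname{Tr}(\Omega\Lambda)
$$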

For part 2, I can use a similar method to get to:
$$
\sum_{k=1}^n \sum_{i=1}^n \sum_{j=1}^n \Lambda_{ik}\,\Omega_{kj}\,\theta_{ji}
$$
I notice that the last index of each matrix matches the first index of the next operator, which may be a clue to why it's cyclic, but I can't figure out why.

Apologies for the long-winded attempt; I just wanted to be clear for anybody trying to understand my confusion. In my text this proof appears in the Unitary Matrix/Determinant section. Those topics are obviously relevant to later parts of this question, but they don't appear to be relevant here.

Any help greatly appreciated
 
It might be easier to work with matrices ##A_1,A_2,A_3## and calculate ##\operatorname{tr}(A_1A_2)## and ##\operatorname{tr}(A_1A_2A_3)##. Then the result should be symmetric in ##(1,2)## and cyclic in ##(1,2,3)##. In any case, you have finite sums here, so changing the order of summation is no issue, and the rest follows from the fact that your matrix entries are (hopefully) from a commutative and associative domain.
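
For instance, here is a quick numerical sanity check (a minimal sketch using numpy with random ##4\times 4## matrices, just to see the claims before proving them):

import numpy as np

rng = np.random.default_rng(0)
A1, A2, A3 = (rng.standard_normal((4, 4)) for _ in range(3))

# Symmetric in (1,2): tr(A1 A2) == tr(A2 A1)
print(np.trace(A1 @ A2), np.trace(A2 @ A1))

# Cyclic in (1,2,3): tr(A1 A2 A3) == tr(A3 A1 A2) == tr(A2 A3 A1)
print(np.trace(A1 @ A2 @ A3), np.trace(A3 @ A1 @ A2), np.trace(A2 @ A3 @ A1))

# A non-cyclic reordering generally gives a different value
print(np.trace(A2 @ A1 @ A3))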
 
My advice would be to spend most of your time on the first statement; you ultimately need to find a way to prove it that is intuitively satisfying to you, as traces are ridiculously useful and the cyclic property is vital. I.e. spend your time on proving

##trace\Big(\mathbf {AB}\Big) = trace\Big(\mathbf {BA}\Big)##

Personally, I like to start simple and build, which in this context means rank one matrices, i.e. that

##trace\Big(\mathbf a \tilde{ \mathbf b}^T \Big) = trace\Big(\tilde{ \mathbf b}^T \mathbf a \Big)##

If you look at matrix multiplication in terms of a series of rank one updates vs a bunch of dot products, you can build on this simple vector setup.
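
Spelling the rank one case out (a small sketch, writing ##\tilde{ \mathbf b}^T## for a row vector with components ##b_i##):

##trace\Big(\mathbf a \tilde{ \mathbf b}^T \Big) = \sum_{i} \big(\mathbf a \tilde{ \mathbf b}^T\big)_{ii} = \sum_{i} a_i b_i = \tilde{ \mathbf b}^T \mathbf a##

where the right-hand side is a ##1\times 1## matrix, hence equal to its own trace.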

- - - -
Once you have this locked down, use that fact plus the associativity of matrix multiplication to get the cyclic property. Specifically, consider

##trace\Big(\mathbf {XYZ}\Big) = trace\Big(\big(\mathbf {XY}\big)\big(\mathbf Z \big)\Big)##

now call on the fact that ##trace\Big(\mathbf {AB}\Big) = trace\Big(\mathbf {BA}\Big)##, which, if you assign ##\mathbf A := \big(\mathbf {XY}\big)## and ##\mathbf B := \big(\mathbf Z\big)##, gives you the below:

##trace\Big(\big(\mathbf {XY}\big)\big(\mathbf Z \big)\Big) = trace\Big(\big(\mathbf Z \big)\big(\mathbf {XY}\big)\Big)##

and use associativity and the original result once more to finish this off.
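
Written out, that finish might look like:

##trace\Big(\big(\mathbf Z \big)\big(\mathbf {XY}\big)\Big) = trace\Big(\big(\mathbf {ZX}\big)\big(\mathbf Y \big)\Big) = trace\Big(\big(\mathbf Y \big)\big(\mathbf {ZX}\big)\Big) = trace\Big(\mathbf {YZX}\Big)##

so ##trace\Big(\mathbf {XYZ}\Big) = trace\Big(\mathbf {ZXY}\Big) = trace\Big(\mathbf {YZX}\Big)##, which is exactly the cyclic property.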
 
Thanks for the help, guys. I am beginning to understand some of my issues.

However, I realized that this identity applies to non-square matrix products as well: if we have a matrix A (2×3) and a matrix B (3×2), then AB is a 2×2 matrix and BA is a 3×3 matrix, yet the traces are still the same. That was something I hadn't even considered, and it's very interesting.
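
A quick numpy check of that (my own sanity test, random entries):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))  # 2x3 matrix
B = rng.standard_normal((3, 2))  # 3x2 matrix

# A @ B is 2x2 and B @ A is 3x3, yet the traces agree
print(np.trace(A @ B), np.trace(B @ A))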

StoneTemplePython said:
Personally, I like to start simple and build, which in this context means rank one matrices, i.e. that

##trace\Big(\mathbf a \tilde{ \mathbf b}^T \Big) = trace\Big(\tilde{ \mathbf b}^T \mathbf a \Big)##

If you look at matrix multiplication in terms of a series of rank one updates vs a bunch of dot products, you can build on this simple vector setup.

I was just wondering if you could elaborate on this a bit more? More specifically, why do you denote the complex conjugate and transpose of b? Also, the term 'rank one updates' is unfamiliar to me. My linear algebra is geared towards QM and lacks some generality.

Also, I had figured out the answer to why they are cyclic. Very nice proof indeed :)
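
For the record, the step I was missing: since each factor is just a scalar, ##\theta_{ji}## can be moved to the front and the sums reordered, e.g.

$$
\sum_{k=1}^n \sum_{i=1}^n \sum_{j=1}^n \Lambda_{ik}\,\Omega_{kj}\,\theta_{ji}
= \sum_{j=1}^n \sum_{i=1}^n \sum_{k=1}^n \theta_{ji}\,\Lambda_{ik}\,\Omega_{kj}
= \sum_{j=1}^n (\theta\Lambda\Omega)_{jj}
= \operatorname{Tr}(\theta\Lambda\Omega)
$$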
 
DrMCoeus said:
Thanks for the help, guys. I am beginning to understand some of my issues.

However, I realized that this identity applies to non-square matrix products as well: if we have a matrix A (2×3) and a matrix B (3×2), then AB is a 2×2 matrix and BA is a 3×3 matrix, yet the traces are still the same. That was something I hadn't even considered, and it's very interesting.
I was just wondering if you could elaborate on this a bit more? More specifically, why do you denote the complex conjugate and transpose of b? Also, the term 'rank one updates' is unfamiliar to me. My linear algebra is geared towards QM and lacks some generality.

Also, I had figured out the answer to why they are cyclic. Very nice proof indeed :)

I would probably start off by assuming the scalars are in ##\mathbb R##; that way you don't get lost in things like dot product versus inner product. If you right-click the LaTeX and do 'show math as -> TeX commands', you'll see that the symbol above the b vector in

##trace\Big(\mathbf a \tilde{ \mathbf b}^T \Big) = trace\Big(\tilde{ \mathbf b}^T \mathbf a \Big)##

is actually the "tilde" sign.

People don't always use it, but basically ##\tilde{ \mathbf b}^T## indicates that the vector is a row vector and was originally a row vector (as opposed to a column vector that has been transposed so that it is now a row vector, which would be written without the tilde).

Notation is not uniform, though... FWIW, if I were showing a conjugate transpose I would write ##\mathbf b^H## or ##\mathbf b^*##, but in the example I show here, conjugation has nothing to do with anything... you're certainly welcome to ignore the tilde signs if you want. There is a hint, though, tied in with the tilde sign -- consider how to partition ##\mathbf A## and ##\mathbf B## appropriately into row and column vectors, and look at the two different types of multiplication mentioned.

As for rank one updates -- you can just call it a finite series of rank one matrices if you want. Sometimes people call it the outer product interpretation of matrix multiplication.
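
To sketch how that builds up (with ##\mathbf a_k## the ##k##th column of ##\mathbf A## and ##\tilde{ \mathbf b}_k^T## the ##k##th row of ##\mathbf B##), matrix multiplication becomes a finite series of rank one matrices:

##\mathbf {AB} = \sum_{k} \mathbf a_k \tilde{ \mathbf b}_k^T##

so, by linearity of the trace and the rank one identity above,

##trace\Big(\mathbf {AB}\Big) = \sum_k trace\Big(\mathbf a_k \tilde{ \mathbf b}_k^T\Big) = \sum_k \tilde{ \mathbf b}_k^T \mathbf a_k = \sum_k \big(\mathbf {BA}\big)_{kk} = trace\Big(\mathbf {BA}\Big)##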

Again, the key thing is finding a way to prove

##trace\Big(\mathbf {AB}\Big) = trace\Big(\mathbf {BA}\Big)##

in a way that is satisfying and intuitive to you. Most proofs I've seen involve juggling double sigmas; I prefer something a bit more visual, building off of rank one matrices.

In Linear Algebra Done Wrong, the author strongly hints at using, in effect, indicator variables for ##\mathbf B## to prove this, but ultimately the problem is left as an exercise. It is a very good book, and free; you may want to read it at some point. It is available here:

https://www.math.brown.edu/~treil/papers/LADW/book.pdf
 
DrMCoeus said:
I was just wondering if you could elaborate on this a bit more?
You have already done all the work needed in your opening post under point ##1##. Just use ##\Lambda_{ik}\Omega_{ki}=\Omega_{ki}\Lambda_{ik}##, switch the summation order, and walk the same way back to end up with ##\operatorname{Tr}(\Omega \Lambda)##. The ##3##-cycles follow from what @StoneTemplePython said in post #3, using the associativity of matrix multiplication. No need for ranks, row or column vectors, transposes, conjugates, or special fields, except that the scalar field has to be commutative and associative.
 
