Sum of Hermitian Matrices Proof


Homework Help Overview

The discussion revolves around proving that the sum of two n×n Hermitian matrices is also Hermitian. Participants explore the properties of Hermitian matrices and the implications of their definitions, focusing on the Hermitian conjugate and its relationship to matrix addition.

Discussion Character

  • Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the definition of Hermitian matrices and the properties of their conjugates. There are attempts to express the proof using matrix notation and summation, with some participants questioning the necessity of certain steps in their reasoning.

Discussion Status

Several participants have provided insights and clarifications regarding the proof, indicating that the relationship between the Hermitian conjugate and matrix addition is simpler than initially thought. There is ongoing exploration of notation and indexing, with some participants expressing confusion about their approach.

Contextual Notes

Some participants express a lack of experience with matrix notation and seek clarification on their understanding of the properties of Hermitian matrices. There is also a mention of a follow-up question regarding the product of Hermitian matrices, indicating that the discussion may extend beyond the initial proof.

RJLiberator
Gold Member

Homework Statement


Show that the sum of two n×n Hermitian matrices is Hermitian.

Homework Equations


The Hermitian conjugate of a matrix is obtained by taking the complex conjugate of each element and transposing; I will denote it with a †.
I will denote the complex conjugate with a *.

The Attempt at a Solution



This proof seems, in theory, rather simple.
But I'm just not connecting the dots; there's too much inexperience on my part.

Let me explain:

A matrix ##A## is Hermitian if ##A^\dagger = A##.
My thinking: since both matrices are Hermitian, adding them should yield a Hermitian matrix. This, while not obvious, seems to make sense because of how the transpose works: we are simply adding two Hermitian matrices entry by entry, so the symmetry should be preserved in the sum.

If I am looking at 3×3 matrices, I note that:

$$\begin{pmatrix} a & b & c\\ d & e & f\\ g & h & i \end{pmatrix}+\begin{pmatrix} a_1 & b_1 & c_1\\ d_1 & e_1 & f_1\\ g_1 & h_1 & i_1 \end{pmatrix}=\begin{pmatrix} a+a_1 & b+b_1 & c+c_1\\ d+d_1 & e+e_1 & f+f_1\\ g+g_1 & h+h_1 & i+i_1 \end{pmatrix}$$

So I am now beginning to work through conditions.
On the diagonal, only the complex conjugate matters, since the transpose leaves those entries fixed.
The off-diagonal entries get transposed as well.

This is where my thinking starts to get muddled; I feel like I went down the wrong path. There's too much going on. It feels like there is an easier way.
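Before hunting for the proof, the claim can at least be spot-checked numerically. A minimal sketch (not part of the original thread; NumPy is assumed, and the two matrices below are arbitrary illustrative choices):

```python
import numpy as np

# Two arbitrary 3x3 Hermitian matrices: real diagonal entries,
# off-diagonal entries mirrored by complex conjugation.
A = np.array([[1.0, 2 + 1j, 0 - 3j],
              [2 - 1j, 5.0, 4 + 2j],
              [0 + 3j, 4 - 2j, -2.0]])
B = np.array([[3.0, 1 - 1j, 2 + 0j],
              [1 + 1j, 0.0, 5 - 4j],
              [2 - 0j, 5 + 4j, 7.0]])

def is_hermitian(M):
    # M is Hermitian iff it equals its own conjugate transpose.
    return np.allclose(M, M.conj().T)

assert is_hermitian(A) and is_hermitian(B)
assert is_hermitian(A + B)  # the sum is Hermitian too
```

A numerical check is no proof, of course, but it makes the target of the proof concrete.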
 
RJLiberator said:
[...] It feels like there is an easier way.

How about using (or showing) that ##(A+B)^{\dagger}= A^\dagger+B^\dagger##?
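For reference, the identity in this hint can be checked entrywise from the definitions given in the thread, using only the fact that the conjugate of a sum is the sum of the conjugates:

```latex
\left[(A+B)^\dagger\right]_{ij}
  = \overline{(A+B)_{ji}}
  = \overline{A_{ji}} + \overline{B_{ji}}
  = \left[A^\dagger\right]_{ij} + \left[B^\dagger\right]_{ij},
\qquad\text{hence}\qquad
(A+B)^\dagger = A^\dagger + B^\dagger .
```

No indices are summed anywhere; the conjugate transpose only relabels and conjugates entries.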
 
Doh! My intuition for matrices is vastly inexperienced.
That does seem to clear things up, mate! Wow, what a revelation that invoked within me!

With this, we can use summation to show that the ij-th elements are equal to each other.
I am still weak in my notation and use of indices, so let me try to be as explicit as possible (and try to use proper LaTeX, heh):

1. ##(A+B)^\dagger = A^\dagger + B^\dagger##

2. LHS: ##(A+B)^\dagger = \Big( \sum_{k=1}^{j} (A+B)^{T*}_{ij} \Big)##

3. The transpose flips the subscript ##ij## to ##ji##:
LHS: ##(A+B)^\dagger = \Big( \sum_{k=1}^{j} (A+B)^{*}_{ji} \Big)##

4. By an earlier exercise already proved, the complex conjugate of the sum of two complex numbers equals the sum of their individual complex conjugates:
##(A+B)^\dagger = \Big( \sum_{k=1}^{j} (A^{*}_{ji}+B^{*}_{ji}) \Big)##

And similarly, for the RHS we show that:

5. RHS: ##A^\dagger + B^\dagger = \Big( \sum_{k=1}^{j} (A^{\dagger}_{ij} + B^{\dagger}_{ij}) \Big)##

which equals:
##A^\dagger + B^\dagger = \Big( \sum_{k=1}^{j} (A^{*}_{ji} + B^{*}_{ji}) \Big)##

My questions left:
1) Is this the correct idea? Did I miss any blatantly poor step?
2) How do my indices look? I am trying to sum from k = 1 to j. Or is something going wrong when I take the transpose?

:D
 
RJLiberator said:
[...] How do my indexes look? I am trying to sum up from k=1 to j.

Why are you summing at all? Taking the Hermitian conjugate does not involve a sum.
 
Hm.
Well, my thought with the summing was to show, in general, that each element on each side is identical.

But let me take a step back and see whether it is necessary.

So you are saying that this proof is as simple as:

##(A+B)^\dagger = A^\dagger + B^\dagger##
##(A+B)^{*}_{ji} = A^{*}_{ji} + B^{*}_{ji}##
##A^{*}_{ji} + B^{*}_{ji} = A^{*}_{ji} + B^{*}_{ji}##

I suppose that does make sense; the third line follows from the earlier proposition, and indeed the second line as well.

Oh, no, it's even easier...
One-step proof:
##(A+B)^\dagger = A^\dagger + B^\dagger##
by the earlier proposition...
 
Whoops... I am forgetting where I am trying to go. Let's make this really simple now:

##(A+B)^\dagger = A^\dagger + B^\dagger = A + B##, so it is Hermitian.

The first equal sign holds by the previous proposition.
The second holds by the conditions of the question (##A## and ##B## are Hermitian).
So we have shown that ##(A+B)^\dagger = A + B##, i.e. the sum is Hermitian.
 
You don't need to write out matrices at all. You only need to consider the elements ##a_{ij}##, ##a_{ji}##, ##b_{ij}##, ##b_{ji}##. Pretty easy.
 
RJLiberator said:
[...] we have shown that ##(A+B)^\dagger = A+B##, so it is Hermitian.

That's what I was looking for.
 
Thank you kindly for the help. As usual, it is easier and more beautiful than I make it out to be. :)

@epenguin, @Dick,
Any guidance on how to prove that the same is NOT true for the product of two n×n Hermitian matrices? It is part B of the question.
I tried to work on it using the same methods as in this thread:

##(AB)^\dagger \ne AB##

Is this where we would use summation notation for the ij-th elements?
 
RJLiberator said:
[...] This is where we would use summation notation for the ij'th elements?

Don't write out explicit indices unless you have to. Where do you go from ##(AB)^\dagger##?
 
Oh dear, think simple again. I think you need only consider ##c_{ij}## and ##c_{ji}## of the product matrix and where they come from. You can avoid hard-to-think-about complete generality: to disprove a statement you need only one counterexample, so you can look for, and hopefully find, a simple one.
 
Dick said:
Don't write out explicit indices unless you have to. Where do you go from ##(AB)^\dagger##?

Here's what I have:

##(AB)^\dagger = [(AB)^T]^{*} = (B^T A^T)^{*} = (B^T)^{*}(A^T)^{*} = B^\dagger A^\dagger##

So, since ##A## and ##B## are Hermitian matrices, this becomes ##BA##, which in general is not equal to ##AB##, because matrices need not commute.
 
RJLiberator said:
##(AB)^\dagger = [(AB)^T]^{*} = (B^T A^T)^{*} = (B^T)^{*}(A^T)^{*} = B^\dagger A^\dagger##

Yes, the product is not Hermitian unless ##AB = BA##, i.e. unless they commute. In general ##AB## is not equal to ##BA##, though it can happen. You might want to find a counterexample to show it isn't always true.
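A concrete counterexample along these lines can be sketched with the Pauli matrices, which are Hermitian but do not commute (not from the thread; NumPy assumed):

```python
import numpy as np

# Pauli matrices sigma_x and sigma_y: both Hermitian, non-commuting.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

assert np.allclose(sx, sx.conj().T)  # sigma_x is Hermitian
assert np.allclose(sy, sy.conj().T)  # sigma_y is Hermitian

P = sx @ sy  # the product works out to [[1j, 0], [0, -1j]]
assert not np.allclose(P, P.conj().T)    # the product is NOT Hermitian
assert np.allclose(P.conj().T, sy @ sx)  # consistent with (AB)^† = B^† A^† = BA
```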
 
I should have thought the easiest and best proof is just to write out the multiplication of two arbitrary 2×2 Hermitian matrices (you can leave the diagonal product terms blank, as it doesn't matter what they are), and you will see that the diagonally matching terms contain elements totally unrelated to anything in the opposite entry. The elements can themselves be submatrices, so there is nothing special about 2×2.
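The 2×2 argument above can be sketched explicitly (the symbols below are my own notation, with ##\alpha,\beta,\gamma,\delta## real so that the diagonals are Hermitian):

```latex
A=\begin{pmatrix}\alpha & z\\ \bar z & \beta\end{pmatrix},\qquad
B=\begin{pmatrix}\gamma & w\\ \bar w & \delta\end{pmatrix}
\;\Longrightarrow\;
(AB)_{12}=\alpha w+z\delta,\qquad
\overline{(AB)_{21}}=\gamma z+\beta w .
```

For ##AB## to be Hermitian these two expressions would have to agree, which fails for generic entries (e.g. ##z=w=1,\ \alpha=\delta=1,\ \beta=\gamma=0## gives ##2\ne 0##).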
 
