Sum of Hermitian Matrices Proof

RJLiberator

Homework Statement


Show that the sum of two n×n Hermitian matrices is Hermitian.

Homework Equations


The Hermitian conjugate is obtained by taking the complex conjugate of each element and then transposing the matrix. I will denote it with a †.
I will denote the complex conjugate with a *.

The Attempt at a Solution



In theory, this proof seems rather simple.
But I'm just not connecting the dots; there's too much inexperience on my part.

Let me explain:

A matrix A is Hermitian if ##A^\dagger = A##.
My thinking: since we have one matrix that is Hermitian, adding it to another Hermitian matrix will result in a Hermitian matrix. This, while not obvious, seems to make sense from how the transpose works: since we are simply adding the two Hermitian matrices entry by entry, the symmetry should carry over to the sum.

If I look at 3×3 matrices, I note that:

$$\left(\begin{pmatrix} a & b & c\\ d & e & f\\ g & h & i \end{pmatrix} + \begin{pmatrix} a_1 & b_1 & c_1\\ d_1 & e_1 & f_1\\ g_1 & h_1 & i_1 \end{pmatrix}\right)^\dagger = \begin{pmatrix} a+a_1 & b+b_1 & c+c_1\\ d+d_1 & e+e_1 & f+f_1\\ g+g_1 & h+h_1 & i+i_1 \end{pmatrix}^\dagger$$

So now I am starting to work through the conditions.
On the diagonal, we only have the complex conjugate to deal with (the transpose leaves those entries in place).
The off-diagonal entries get both conjugated and transposed.

This is where my thinking starts to get muddled; I feel like I've gone down the wrong path. There's too much going on, and it feels like there should be an easier way.
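As a quick numerical sanity check of the claim (not a proof, just to see that it is plausible), here is a small NumPy sketch with arbitrary randomly generated Hermitian matrices:

```python
import numpy as np

# Build two arbitrary Hermitian matrices: (M + M^dagger)/2 is always Hermitian.
rng = np.random.default_rng(0)
M1 = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
M2 = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M1 + M1.conj().T) / 2
B = (M2 + M2.conj().T) / 2

S = A + B
# The sum is Hermitian exactly when it equals its own conjugate transpose.
print(np.allclose(S, S.conj().T))  # prints True
```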
 
How about using (or showing) that ##(A+B)^{\dagger}= A^\dagger+B^\dagger##?
 
Doh! My intuition for matrices is still very undeveloped.
That does clear things up, mate! Wow, what a revelation that sparked!

With this, we can use summation to show that the ij-th elements on each side are equal to each other.
I am still weak with notation and indices, so let me try to be as explicit as possible (and try to use proper LaTeX, heh):

1. ##(A+B)^\dagger = A^\dagger + B^\dagger##

2. LHS: ##(A+B)^\dagger = \Big( \sum_{k=1}^j \big((A+B)^T\big)^*_{ij} \Big)##

3. The transpose flips the subscript ij to ji:
LHS: ##(A+B)^\dagger = \Big( \sum_{k=1}^j (A+B)^*_{ji} \Big)##

4. Using an earlier exercise we already proved: when taking the complex conjugate of a sum of two complex numbers, you can conjugate each part first and then sum:
##(A+B)^\dagger = \Big( \sum_{k=1}^j (A_{ji}^* + B_{ji}^*) \Big)##

And similarly, for the RHS we show that:

5. RHS: ##A^\dagger + B^\dagger = \Big( \sum_{k=1}^j (A^\dagger_{ij} + B^\dagger_{ij}) \Big)##

which equals:
##A^\dagger + B^\dagger = \Big( \sum_{k=1}^j (A^*_{ji} + B^*_{ji}) \Big)##

My remaining questions:
1) Is this the correct idea? Did I miss any blatantly poor step?
2) How do my indices look? I am trying to sum from k = 1 to j. Or is something going wrong when I take the transpose?

:D
 
Why are you summing at all? Taking the Hermitian conjugate does not involve a sum.
 
Hm.
Well, my thought with the summing was to show, in general, that each element on each side is identical.

But let me take a step back and see whether it is really necessary.

So you are saying that this proof is as simple as:

##(A+B)^\dagger = A^\dagger + B^\dagger##
##(A+B)^*_{ji} = A^*_{ji} + B^*_{ji}##
##A^*_{ji} + B^*_{ji} = A^*_{ji} + B^*_{ji}##

I suppose that does make sense: the third line follows from the earlier proposition, and yes, indeed so does the second line. Oh, no, it's even easier...
One-step proof:
##(A+B)^\dagger = A^\dagger + B^\dagger##
by the earlier proposition...
 
Whoops... I am forgetting where I was trying to go. Let's make this really simple now:

##(A+B)^\dagger = A^\dagger + B^\dagger = A + B##, so it is Hermitian.

The first equal sign holds by the previous proposition.
The second equal sign holds by the hypothesis of the question (A and B are Hermitian).
And so we have shown that ##(A+B)^\dagger = A + B##, i.e. the sum is Hermitian.
 
You don't need to write out matrices at all. You only need to consider the elements ##a_{ij}##, ##a_{ji}##, ##b_{ij}##, ##b_{ji}##. Pretty easy.
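Spelled out at the level of matrix elements, that argument is just the following sketch (using the ##\dagger## and ##*## notation defined at the top of the thread):

$$\big((A+B)^\dagger\big)_{ij} = \big((A+B)_{ji}\big)^* = (A_{ji}+B_{ji})^* = A_{ji}^* + B_{ji}^* = A_{ij} + B_{ij} = (A+B)_{ij},$$

where the last two steps use the earlier result about conjugating a sum and the hypothesis that ##A## and ##B## are Hermitian (##A_{ji}^* = A_{ij}##, ##B_{ji}^* = B_{ij}##). Hence ##(A+B)^\dagger = A+B##.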
 
RJLiberator said:
##(A+B)^\dagger = A^\dagger + B^\dagger = A + B##, so it is Hermitian.

That's what I was looking for.
 
  • Like
Likes RJLiberator
Thank you kindly for the help. As usual, it is easier and more beautiful than I make it out to be. :)

@epenguin, @Dick,
Any guidance on how to show that the same is NOT true for the product of two n×n Hermitian matrices? It is part B of the question.
I tried to work on it using the same methods as in this thread.

##(AB)^\dagger \neq AB##

Is this where we would use summation notation for the ij-th elements?
 
Don't write out explicit indices unless you have to. Where do you go from ##(AB)^\dagger##?
 
Oh dear, think simple again. I think you need only consider ##c_{ij}## and ##c_{ji}## of the product matrix and where they come from. You can avoid hard-to-think-about complete generality, because to disprove a statement you only need one counterexample, so you can look for (and hopefully find) a simple one.
 
Dick said:
Don't write out explicit indices unless you have to. Where do you go from ##(AB)^\dagger##?

Here's what I have:

##(AB)^\dagger = [(AB)^T]^* = (B^T A^T)^* = (B^T)^*(A^T)^* = B^\dagger A^\dagger##

So, since B and A are Hermitian matrices, this becomes ##BA##, which, by the properties of matrix multiplication, is not in general equal to ##AB##.
 
Yes, the product is not Hermitian unless ##AB=BA##, i.e. they commute. In general ##AB## is not equal to ##BA##, although it can happen. You might want to find a counterexample to show the product isn't always Hermitian.
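For a concrete counterexample (one possible choice): the Pauli matrices ##\sigma_x## and ##\sigma_y## are both Hermitian, but ##\sigma_x\sigma_y = i\sigma_z##, which is not Hermitian since ##(i\sigma_z)^\dagger = -i\sigma_z##. A quick NumPy check of this particular pair:

```python
import numpy as np

# Pauli matrices: both sigma_x and sigma_y are Hermitian.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

P = sx @ sy  # equals i*sigma_z = [[1j, 0], [0, -1j]]
print(np.allclose(sx, sx.conj().T))  # True:  sigma_x is Hermitian
print(np.allclose(sy, sy.conj().T))  # True:  sigma_y is Hermitian
print(np.allclose(P, P.conj().T))    # False: the product is not Hermitian
```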
 
I should have thought the easiest and best proof is just to write out the product of two arbitrary 2×2 Hermitian matrices (you can leave the diagonal product terms blank, as it doesn't matter what they are), and you will see that the two off-diagonal entries of the product involve combinations of elements that are not, in general, complex conjugates of each other. The elements can themselves be submatrices, so there is nothing special about 2×2.
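For instance (a sketch of the suggested calculation; here ##a, d, p, s## are real, since they sit on the diagonals of Hermitian matrices, and the diagonal entries of the product are left blank):

$$\begin{pmatrix} a & b \\ b^* & d \end{pmatrix}\begin{pmatrix} p & q \\ q^* & s \end{pmatrix}=\begin{pmatrix} \ast & aq+bs \\ b^*p+dq^* & \ast \end{pmatrix}$$

For the product to be Hermitian we would need ##b^*p+dq^* = (aq+bs)^* = aq^*+b^*s##, i.e. ##(a-d)\,q^* = b^*(p-s)##, which fails for generic entries.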
 