# Sum of Hermitian Matrices Proof

In summary: since ##A^\dagger = A## and ##B^\dagger = B##, the Hermitian conjugate of the sum satisfies ##(A+B)^\dagger = A^\dagger + B^\dagger = A + B##, so the sum of two n×n Hermitian matrices is Hermitian.
RJLiberator
Gold Member

## Homework Statement

Show that the sum of two nxn Hermitian matrices is Hermitian.

## Homework Equations

Hermitian conjugate means that you take the complex conjugate of the elements and transpose the matrix. I will denote it with a †.
I will denote the complex conjugate with a *.

## The Attempt at a Solution

This proof, theoretically, seems rather simple.
But, I'm just not connecting the dots. There's too much inexperience on my part.

Let me explain:

A hermitian matrix is hermitian if A†=A.
My thinking: Since we have one matrix that is hermitian, adding it to another hermitian matrix will result in a hermitian matrix. This, while not obvious, seems to make sense due to how the transpose definition works. Since we are simply adding together two hermitian matrices, the result should also be hermitian as the sum is even throughout.

If I am looking at 3x3 matrices, I note that:

$\begin{pmatrix} a & b & c\\ d & e & f\\ g & h & i\\ \end{pmatrix}+ \begin{pmatrix} a_1 & b_1 & c_1\\ d_1 & e_1 & f_1\\ g_1 & h_1 & i_1\\ \end{pmatrix}= \begin{pmatrix} a+a_1 & b+b_1 & c+c_1\\ d+d_1 & e+e_1 & f+f_1\\ g+g_1 & h+h_1 & i+i_1\\ \end{pmatrix}$

and I still need to take the † of this sum.

So I am now beginning to work with conditions.
We know the diagonal entries only involve the complex conjugate (they stay in place under the transpose).
The off-diagonal entries get both conjugated and swapped by the transpose.

This is where my thinking starts to get muddled; I feel like I went down the wrong path. There's too much going on. It feels like there is an easier way.


How about using (or showing) that ##(A+B)^{\dagger}= A^\dagger+B^\dagger##?

RJLiberator
Doh! My intuition for matrices is vastly inexperienced.
That does seem to clear things up, mate! Wow, what a revelation that invoked within me!

With this, we can use summation to find the ij'th elements are equal to each other.
I still am weak in my notation and use of indexes, so let me try to be as explicit as possible (and try to use proper latex, heh):

1. $(A+B)^†=A^†+B^†$

2. LHS: $(A+B)^† = \Big( \sum_{k=1}^j(A+B)^{T^*}_{ij} \Big)$

3. The transpose flips the subscript ij to ji:
LHS: $(A+B)^† = \Big( \sum_{k=1}^j(A+B)^*_{ji} \Big)$

4. With an earlier question already proved, when taking the complex conjugate of the sum of two complex numbers you can take the complex conjugate of each individual part first and then sum:
$(A+B)^† = \Big( \sum_{k=1}^j(A_{ji}^*+B_{ji}^*) \Big)$

And similarly, for the RHS we show that:

5. RHS: $A^† +B^† = \Big( \sum_{k=1}^j(A^†_{ij} +B^† _{ij}) \Big)$

which equals:
$A^† +B^† = \Big( \sum_{k=1}^j(A^*_{ji} +B^* _{ji}) \Big)$

My questions left:
1) Is this the correct idea? Did I miss any blatantly poor step?
2) How do my indexes look? I am trying to sum up from k=1 to j. Or is something messing up when I take the transpose?

:D


Why are you summing at all? Taking the Hermitian conjugate does not involve a sum.

RJLiberator
Hm.
Well, my thoughts on summing were to show, in general, that each element on each side is identical.

But let me take a step back and see if I think it is necessary.

So you are saying, that this proof is as simple as:

##(A+B)^\dagger=A^\dagger+B^\dagger##
##(A+B)^*_{ji}=A^*_{ji}+B^*_{ji}##
##A^*_{ji}+B^*_{ji}=A^*_{ji}+B^*_{ji}##

I suppose that does make sense; the third line follows from the earlier proposition, and indeed so does the second.

Oh, no, it's even easier.
One step proof:
##(A+B)^\dagger = A^\dagger+B^\dagger##
by earlier proposition...

Whoops... I am forgetting where I am trying to go. Let's make this really simple now:

##(A+B)^\dagger = A^\dagger+B^\dagger = A+B##, so it is Hermitian.

The first equal sign holds by the previous proposition.
The second equal sign holds by the conditions of the question (A and B are Hermitian).
And we have shown that ##(A+B)^\dagger = A+B##, so it is Hermitian.
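The two-step argument above can be checked numerically. This is a minimal sketch assuming numpy is available; the helper `random_hermitian` is my own construction, not from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    """Build an n x n Hermitian matrix as M + M^dagger (always Hermitian)."""
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return m + m.conj().T

A = random_hermitian(3)
B = random_hermitian(3)

lhs = (A + B).conj().T  # (A + B)^dagger
# First equality: (A + B)^dagger = A^dagger + B^dagger
assert np.allclose(lhs, A.conj().T + B.conj().T)
# Second equality: A^dagger + B^dagger = A + B, so the sum is Hermitian
assert np.allclose(lhs, A + B)
```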

You don't need to write out matrices at all. You only need to consider the elements ##a_{ij}, a_{ji}, b_{ij}, b_{ji}##. Pretty easy.

RJLiberator

That's what I was looking for.

RJLiberator
Thank you kindly for the help. As usual, it is easier and more beautiful than I make it out to be. :)

@epenguin, @Dick,
Any guidance on how to prove that the same is NOT true for the product of two nxn hermitian matrices? It is part B of the question.
I tried to work on it using the same methods in this thread.

##(AB)^\dagger \neq AB##

This is where we would use summation notation for the ij'th elements?


Don't write out explicit indices unless you have to. Where do you go from ##(AB)^\dagger##?

RJLiberator
Oh dear, think simple again. I think you need only consider ##c_{ij}## and ##c_{ji}## of the product matrix and where each has come from. You can avoid hard-to-think-about complete generality, because to disprove a statement you only need one counterexample, so you can look for and hopefully find a simple one.

RJLiberator
Dick said:
Don't write out explicit indices unless you have to. Where do you go from ##(AB)^\dagger##?

Here's what I have:

$(AB)^†=[(AB)^T]^*=(B^TA^T)^*=(B^T)^*(A^T)^*=B^†A^†$

So, since A and B are Hermitian matrices, this becomes ##BA##, which in general is not equal to ##AB##.


Yes, the product is not Hermitian unless ##AB=BA## i.e. they commute. In general ##AB## is not equal to ##BA##, but it might happen. You might want to find a counterexample to show it isn't always true.
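One concrete counterexample, sketched in numpy (my choice of example, not from the thread): the Pauli matrices ##\sigma_x## and ##\sigma_y## are both Hermitian, but their product is not, precisely because they fail to commute.

```python
import numpy as np

# sigma_x and sigma_y: both Hermitian
A = np.array([[0, 1], [1, 0]], dtype=complex)
B = np.array([[0, -1j], [1j, 0]], dtype=complex)
assert np.allclose(A, A.conj().T) and np.allclose(B, B.conj().T)

AB = A @ B
# (AB)^dagger = B^dagger A^dagger = B A, which here differs from A B
assert np.allclose(AB.conj().T, B @ A)
assert not np.allclose(AB, AB.conj().T)  # the product is not Hermitian
```

Here ##AB = i\sigma_z## and ##BA = -i\sigma_z##, so ##(AB)^\dagger = BA = -AB \neq AB##.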

RJLiberator
I should have thought the easiest and best proof is just to write out the multiplication of two arbitrary 2×2 Hermitian matrices (you can leave the diagonal product terms blank, as it doesn't matter what they are), and you see that the diagonally opposite terms contain elements totally unrelated to each other. The elements can be submatrices, so there is nothing special about 2×2.
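The 2×2 calculation suggested above can be sketched symbolically. This assumes sympy is available; the symbol names are mine.

```python
from sympy import Matrix, conjugate, simplify, symbols

a, d, e, f = symbols('a d e f', real=True)  # real diagonal entries
p, q = symbols('p q', complex=True)         # off-diagonal entries

# Two arbitrary 2x2 Hermitian matrices
A = Matrix([[a, p], [conjugate(p), d]])
B = Matrix([[e, q], [conjugate(q), f]])

AB = A * B
upper = AB[0, 1]                  # a*q + p*f
lower_conj = conjugate(AB[1, 0])  # e*p + d*q, since d and e are real
# The (1,2) entry and the conjugate of the (2,1) entry involve
# unrelated elements, so AB is not Hermitian in general.
assert simplify(upper - lower_conj) != 0
```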

RJLiberator

## 1. What is the definition of a Hermitian matrix?

A Hermitian matrix is a square matrix that is equal to its own conjugate transpose. This means that the elements on the main diagonal are all real numbers, and the (i, j) and (j, i) elements are complex conjugates of each other.
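A small numpy sketch of the definition (example matrix chosen by me):

```python
import numpy as np

# Real diagonal, and the off-diagonal entries form a conjugate pair
H = np.array([[1, 2 + 1j],
              [2 - 1j, 3]], dtype=complex)

assert np.allclose(H, H.conj().T)         # equal to its conjugate transpose
assert np.allclose(H.diagonal().imag, 0)  # diagonal entries are real
```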

## 2. How do you prove that the sum of two Hermitian matrices is also Hermitian?

To prove that the sum of two Hermitian matrices is also Hermitian, we must show that the sum of the matrices is equal to their own conjugate transpose. This can be done by using the properties of matrix addition and the definition of a Hermitian matrix.
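Spelled out entrywise, the argument is a short chain of equalities (using an overline for complex conjugation):

```latex
$$\big((A+B)^\dagger\big)_{ij}
  = \overline{(A+B)_{ji}}
  = \overline{A_{ji}} + \overline{B_{ji}}
  = (A^\dagger)_{ij} + (B^\dagger)_{ij}
  = A_{ij} + B_{ij}
  = (A+B)_{ij}.$$
```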

## 3. Can you give an example of two Hermitian matrices whose sum is not Hermitian?

No. As the proof above shows, the sum of two Hermitian matrices is always Hermitian, so no such example exists. For instance, A = [1 2 + i; 2 - i 3] and B = [2 4 + 2i; 4 - 2i 6] are both Hermitian, and their sum A + B = [3 6 + 3i; 6 - 3i 9] is again Hermitian: the (2,1) entry is the complex conjugate of the (1,2) entry.

## 4. What is the significance of Hermitian matrices in linear algebra?

Hermitian matrices have many important properties that make them useful in linear algebra. They have real eigenvalues and orthogonal eigenvectors, which are important in diagonalization and solving systems of linear equations. They also have applications in quantum mechanics, signal processing, and statistics.
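The real-eigenvalue and orthogonal-eigenvector properties can be checked with numpy's Hermitian eigensolver (example matrix chosen by me):

```python
import numpy as np

H = np.array([[2, 1 - 1j],
              [1 + 1j, 3]], dtype=complex)

# eigh is numpy's routine for Hermitian matrices
vals, vecs = np.linalg.eigh(H)
assert np.allclose(vals.imag, 0)                      # eigenvalues are real
assert np.allclose(vecs.conj().T @ vecs, np.eye(2))   # eigenvectors orthonormal
```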

## 5. Is the sum of three or more Hermitian matrices always Hermitian?

Yes, the sum of any number of Hermitian matrices is also Hermitian. This can be proven by mathematical induction, applying the two-matrix result at each step. The analogous closure property holds for skew-Hermitian matrices as well, but their sum is skew-Hermitian, not Hermitian.
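The induction step can be mirrored numerically: a running sum of Hermitian matrices stays Hermitian after every addition. A sketch assuming numpy; the helper `random_hermitian` is mine.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_hermitian(n):
    """Build an n x n Hermitian matrix as M + M^dagger."""
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return m + m.conj().T

# Each addition is one induction step: Hermitian + Hermitian = Hermitian.
total = np.zeros((4, 4), dtype=complex)
for _ in range(5):
    total += random_hermitian(4)
    assert np.allclose(total, total.conj().T)
```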
