Something about Hermitian matrices

twoflower
Hi all,

I don't understand one part of the proof of this theorem:

All eigenvalues of a Hermitian matrix A are real numbers and, moreover, there exists a unitary matrix R such that

R^{-1}AR

is diagonal.
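
Not part of the original post, just a quick NumPy sketch of what the theorem claims; the random Hermitian test matrix and the use of np.linalg.eigh are my own choices for illustration.

import numpy as np

# Build a random Hermitian matrix and diagonalize it with a unitary matrix.
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (B + B.conj().T) / 2                        # A = A^H, so A is Hermitian

eigvals, R = np.linalg.eigh(A)                  # real eigenvalues, unitary R
print(np.allclose(R.conj().T @ R, np.eye(n)))   # R is unitary: True
print(np.allclose(np.linalg.inv(R) @ A @ R,
                  np.diag(eigvals)))            # R^{-1} A R is diagonal: True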


Proof: By induction on n (the order of the matrix A).

For n = 1 it's obvious.
Suppose that the theorem holds for 1, 2, ..., n-1.
We know that there exists an eigenvalue \lambda and a corresponding eigenvector x \in \mathbb{C}^{n}.
Using Steinitz's theorem, we can extend x to an orthonormal basis of \mathbb{C}^{n}.
Suppose that ||x|| = 1 and construct the matrix P_n from the vectors of this basis (P_n has these vectors as its columns).

P_n is unitary, because P_{n}^{H}P_n = I: the standard inner product of two different vectors of the orthonormal basis is zero, and the inner product of a vector with itself is 1.

This holds:

<br /> \left(P_{n}^{H}A_{n}P_{n}\right)^{H} = P_{n}^{H}A_{n}^{H}\left(P_{n}^{H}\right)^{H} = P_{n}^{H}A_{n}P_{n}<br />

The last line is what I don't understand; it's probably trivial, but I can't see that

<br /> \left(P_{n}^{H}A_{n}P_{n}\right)^{H} = \left(P_{n}^{H}\right)^{H}A_{n}^{H}P_{n}^{H} = P_{n}^{H}A_{n}^{H}\left(P_{n}^{H}\right)^{H}<br />

(the second equality)

Thank you for the explanation.
 
Because you're forgetting that taking the dagger reverses the order of the matrices. I'll use star instead: (AB)* = B*A*.

The second equality as you have it is wrong, but then it isn't supposed to be true.
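
A minimal NumPy sanity check of the order-reversal rule (AB)* = B*A*; the random complex test matrices are my own illustration, not from the thread.

import numpy as np

def dagger(M):
    # Conjugate transpose (Hermitian adjoint) of M.
    return M.conj().T

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

print(np.allclose(dagger(A @ B), dagger(B) @ dagger(A)))  # True: order reverses
print(np.allclose(dagger(A @ B), dagger(A) @ dagger(B)))  # False in general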
 
matt grime said:
Because you're forgetting that taking the dagger reverses the order of the matrices. I'll use star instead: (AB)* = B*A*.

The second equality as you have it is wrong, but then it isn't supposed to be true.

Thanks a lot Matt, I was looking at it for ten minutes and it's as simple as with ordinary transposition :rolleyes:
 
Well, the first part (that the eigenvalues of Hermitian operators are real) can be proven quite easily for a Hermitian linear operator defined on an everywhere-dense subset of a separable Hilbert space.

Daniel.
 
The key to the second part is to remark that the matrix

M^{\dagger}AM \ ,\ M\in U(n,\mathbb{C})

is Hermitian, which means that the associated linear operator is Hermitian. A Hermitian linear operator on a finite-dimensional complex Hilbert space admits a spectral decomposition (moreover, the spectrum is purely discrete), which means that the operator M^{\dagger}AM has zero off-diagonal matrix elements.

Daniel.
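
A small NumPy illustration of that statement in the finite-dimensional case, a sketch only; the random Hermitian matrix is my own test input.

import numpy as np

# Spectral decomposition of a Hermitian matrix: A = sum_i lambda_i v_i v_i^H,
# with real eigenvalues lambda_i and orthonormal eigenvectors v_i.
rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (B + B.conj().T) / 2

lam, V = np.linalg.eigh(A)                    # real lam, unitary V
A_rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i].conj()) for i in range(n))
print(np.allclose(A, A_rebuilt))              # True
print(np.allclose(V.conj().T @ A @ V, np.diag(lam)))  # off-diagonal elements vanish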
 
dextercioby said:
The key to the second part is to remark that the matrix

M^{\dagger}AM \ ,\ M\in U(n,\mathbb{C})

is Hermitian, which means that the associated linear operator is Hermitian. A Hermitian linear operator on a finite-dimensional complex Hilbert space admits a spectral decomposition (moreover, the spectrum is purely discrete), which means that the operator M^{\dagger}AM has zero off-diagonal matrix elements.

Daniel.

Thank you Daniel for the explanation, but I don't have a clue what a Hilbert space is (I've only heard of it) or what a Hermitian linear operator is.

However, I've been studying the proof further and I've again run into a place I don't understand.

If I continue from where I finished in my initial post:

...

And thus P_{n}^{H}A_{n}P_{n} is a Hermitian matrix.

Next,

<br /> \left( \begin{array}{cc} \lambda &amp; 0 ... 0 \\ 0 &amp; A_{n-1} \\ 0 \end{array} \right)<br />

Because this matrix is equal to its Hermitian transpose, \lambda \in \mathbb{R}.

// I'm not sure why this matrix appears here and whether it is meant to be the matrix P_{n}^{H}A_{n}P_{n}; I really don't know... anyway, let's continue

From the induction hypothesis there exists a unitary matrix R_{n-1} such that

<br /> R_{n-1}^{-1}A_{n-1}R_{n-1} = D_{n-1}<br />

Let's take

<br /> S = \left(\begin{array}{cc} 1 &amp; 0 ... 0 \\ 0 &amp; R_{n-1} \\ 0 \end{array} \right)<br />

<br /> R_n = P_{n}S<br />

S is unitary, as is P_{n}. Is their product also unitary? (In other words, is the product of two unitary matrices a unitary matrix?) Let's see.

R_{n}^{H}R_{n} = \left(P_{n}S\right)^{H}P_{n}S = S^{H}P_{n}^{H}P_{n}S = S^{H}S = I

So R_{n} is also unitary. Is R_{n} the matrix we're looking for?

<br /> R_{n}^{-1}A_{n}R_{n} = \left(P_{n}S\right)^{H}AP_{n}S = S^{H}P_{n}^{H}AP_{n}S = \left(\begin{array}{cc} 1 &amp; 0 ... 0 \\ 0 &amp; R_{n-1}^{H} \\ 0 \end{array} \right)<br /> \left(\begin{array}{cc} \lambda &amp; 0 ... 0 \\ 0 &amp; A_{n-1} \\ 0 \end{array} \right)<br /> \left(\begin{array}{cc} 1 &amp; 0 ... 0 \\ 0 &amp; R_{n-1} \\ 0 \end{array} \right) = <br /> \left(\begin{array}{cc} \lambda &amp; 0 ... 0 \\ 0 &amp; D_{n-1} \\ 0 \end{array} \right) = D<br />

Q.E.D
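
As a side check of the step above (that the product of two unitary matrices is unitary), here is a small NumPy sketch; generating the unitary factors from QR decompositions of random matrices is my own choice, just to get something unitary.

import numpy as np

rng = np.random.default_rng(3)
n = 4
# QR of a random complex matrix gives a unitary Q factor (with probability 1).
P, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
S, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

R = P @ S
print(np.allclose(R.conj().T @ R, np.eye(n)))   # True: the product is unitary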

What I don't understand is that according to this,

<br /> P_{n}^{H}AP_{n} = \left(\begin{array}{cc} \lambda &amp; 0 ... 0 \\ 0 &amp; A_{n-1} \\ 0 \end{array} \right)<br />

Why?

Thank you.
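
The block form comes from the fact that the first column of P_{n} is the unit eigenvector x: the first column of P_{n}^{H}A_{n}P_{n} is P_{n}^{H}A_{n}x = \lambda P_{n}^{H}x = \lambda e_{1}, and since P_{n}^{H}A_{n}P_{n} is Hermitian (with \lambda real), its first row is \lambda, 0, ..., 0 as well. Below is a small NumPy sketch of this step; completing x to an orthonormal basis via QR is my own construction, not something from the thread.

import numpy as np

rng = np.random.default_rng(4)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (B + B.conj().T) / 2                      # Hermitian test matrix

lam, vecs = np.linalg.eigh(A)
x = vecs[:, 0]                                # unit eigenvector, A x = lam[0] x

# Complete x to an orthonormal basis of C^n: QR of [x | random columns],
# then put x itself back as the first column (it spans the same line).
M = np.column_stack([x, rng.standard_normal((n, n - 1))])
P, _ = np.linalg.qr(M)
P[:, 0] = x

T = P.conj().T @ A @ P
print(np.allclose(T[0, 0], lam[0]))           # (1,1) entry is the eigenvalue
print(np.allclose(T[0, 1:], 0))               # rest of the first row is zero
print(np.allclose(T[1:, 0], 0))               # rest of the first column is zero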
 
