twoflower

Hi all,

I don't understand one part of the proof of this theorem:

All eigenvalues of a Hermitian matrix A are real numbers and, moreover, there exists a unitary matrix R such that

$$R^{-1}AR$$

is diagonal.

Proof: By induction on n (the order of the matrix A).

For n = 1 it's obvious.
Suppose that the theorem holds for orders 1, 2, ..., n-1.
We know that there exists an eigenvalue $\lambda$ and a corresponding eigenvector $x \in \mathbb{C}^{n}$.
Using Steinitz's theorem, we can extend $x$ to an orthonormal basis of $\mathbb{C}^{n}$.
Suppose that $||x|| = 1$ and construct the matrix $P_n$ from the vectors of this basis ($P_n$ will have these vectors as its columns).

$P_n$ is unitary, since $P_{n}^{H}P_n = I$: the standard inner product of two different vectors of the orthonormal basis is zero, and the inner product of a vector with itself is 1.
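This step can be checked numerically; here is a rough sketch (not part of the proof, all names are mine) that extends a unit eigenvector $x$ to an orthonormal basis and verifies $P_{n}^{H}P_n = I$, using NumPy's Householder QR in place of Gram-Schmidt:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T                       # a random Hermitian test matrix

w, V = np.linalg.eigh(A)
x = V[:, 0]                              # a unit eigenvector, ||x|| = 1

# Extend x to an orthonormal basis of C^n: put x first, pad with the
# standard basis vectors, and orthonormalize the columns via QR
# (this plays the role of the Steinitz extension + Gram-Schmidt).
P, _ = np.linalg.qr(np.column_stack([x, np.eye(n, dtype=complex)]))

assert np.allclose(P.conj().T @ P, np.eye(n))   # P is unitary
```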

This holds:

$$\left(P_{n}^{H}A_{n}P_{n}\right)^{H} = P_{n}^{H}A_{n}^{H}\left(P_{n}^{H}\right)^{H} = P_{n}^{H}A_{n}P_{n}$$
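A quick numerical sanity check of this identity (my own sketch, not part of the proof): $P^{H}AP$ comes out Hermitian whenever $A$ is, for any matrix $P$, not just a unitary one.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T                       # A is Hermitian: A^H = A
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

M = P.conj().T @ A @ P
# (P^H A P)^H = P^H A^H (P^H)^H = P^H A P
assert np.allclose(M.conj().T, M)
```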

The last line is what I don't understand. It's probably trivial, but I can't see why

$$\left(P_{n}^{H}A_{n}P_{n}\right)^{H} = \left(P_{n}^{H}\right)^{H}A_{n}^{H}P_{n}^{H} = P_{n}^{H}A_{n}^{H}\left(P_{n}^{H}\right)^{H}$$

(the second equality)

Thank you for the explanation.


matt grime

Homework Helper
Because you're forgetting that taking the dagger reverses the order of the matrices. I'll use star instead: (AB)* = B*A*.

The second equality as you have it is wrong, but then it isn't supposed to be true.
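The reversal rule is easy to confirm numerically; a small sketch (mine, with NumPy) showing that the order of the factors flips under the conjugate transpose, and that keeping the order fails in general:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

H = lambda M: M.conj().T                 # the dagger / star operation

assert np.allclose(H(A @ B), H(B) @ H(A))      # order reverses: (AB)* = B*A*
assert not np.allclose(H(A @ B), H(A) @ H(B))  # keeping the order is wrong
```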

twoflower

matt grime said:
Because you're forgetting that taking the dagger reverses the order of the matrices. I'll use star instead: (AB)* = B*A*.

The second equality as you have it is wrong, but then it isn't supposed to be true.
Thank you a lot, Matt. I was staring at it for ten minutes, and it's as simple as with ordinary transposition.

dextercioby

Homework Helper
Well, the first part (that the eigenvalues of Hermitian operators are real) can be proven quite easily for a Hermitian linear operator defined on an everywhere-dense subset of a separable Hilbert space.

Daniel.

dextercioby

Homework Helper
The key to the second part is to remark that the matrix

$$M^{\dagger}AM \ ,\ M\in U(n,\mathbb{C})$$

is Hermitian, which means that the associated linear operator is Hermitian. A Hermitian linear operator on a finite-dimensional complex Hilbert space admits a spectral decomposition (moreover, the spectrum is purely discrete), which means that the operator $M^{\dagger}AM$ has zero off-diagonal matrix elements in its eigenbasis.
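The finite-dimensional spectral decomposition can be demonstrated in a few lines (a sketch of mine, using NumPy's `eigh` for Hermitian matrices): real eigenvalues, a unitary eigenvector matrix, and $A = U\,\mathrm{diag}(w)\,U^{H}$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T                       # a random Hermitian matrix

w, U = np.linalg.eigh(A)                 # spectral decomposition of A
assert np.allclose(U.conj().T @ U, np.eye(n))        # U is unitary
assert np.allclose(U.conj().T @ A @ U, np.diag(w))   # diagonal, real w
assert np.allclose(A, U @ np.diag(w) @ U.conj().T)   # A = U diag(w) U^H
```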

Daniel.

twoflower

dextercioby said:
The key to the second part is to remark that the matrix

$$M^{\dagger}AM \ ,\ M\in U(n,\mathbb{C})$$

is Hermitian, which means that the associated linear operator is Hermitian. A Hermitian linear operator on a finite-dimensional complex Hilbert space admits a spectral decomposition (moreover, the spectrum is purely discrete), which means that the operator $M^{\dagger}AM$ has zero off-diagonal matrix elements in its eigenbasis.

Daniel.
Thank you, Daniel, for this explanation, but I don't have a clue what a Hilbert space is (I've only heard of it) or what a Hermitian linear operator is.

However, I've been studying the proof further, and I again encountered a place I don't understand.

If I continue from where I left off in my initial post:

...

And thus $P_{n}^{H}A_{n}P_{n}$ is a Hermitian matrix.

Next,

$$\left( \begin{array}{cc} \lambda & \mathbf{0}^{T} \\ \mathbf{0} & A_{n-1} \end{array} \right)$$

(here $\mathbf{0}$ denotes the zero column vector of length $n-1$)

Because this matrix is equal to its Hermitian transpose, $\lambda \in \mathbb{R}$.

// I'm not sure why this matrix appears here, or whether it is meant to be the matrix $P_{n}^{H}A_{n}P_{n}$; I really don't know... anyway, let's continue

By the induction hypothesis, there exists a unitary matrix $R_{n-1}$ such that

$$R_{n-1}^{-1}A_{n-1}R_{n-1} = D_{n-1}$$

Let's take

$$S = \left(\begin{array}{cc} 1 & \mathbf{0}^{T} \\ \mathbf{0} & R_{n-1} \end{array} \right)$$

$$R_n = P_{n}S$$

S is unitary, and so is $P_{n}$. Is their product also unitary? (In other words, is the product of two unitary matrices unitary?) Let's see.

$$R_{n}^{H}R_{n} = \left(P_{n}S\right)^{H}P_{n}S = S^{H}P_{n}^{H}P_{n}S = I$$

So, it holds that $R_{n}$ is also unitary. Is $R_{n}$ the matrix we're looking for?

$$R_{n}^{-1}A_{n}R_{n} = \left(P_{n}S\right)^{H}A_{n}P_{n}S = S^{H}P_{n}^{H}A_{n}P_{n}S = \left(\begin{array}{cc} 1 & \mathbf{0}^{T} \\ \mathbf{0} & R_{n-1}^{H} \end{array} \right) \left(\begin{array}{cc} \lambda & \mathbf{0}^{T} \\ \mathbf{0} & A_{n-1} \end{array} \right) \left(\begin{array}{cc} 1 & \mathbf{0}^{T} \\ \mathbf{0} & R_{n-1} \end{array} \right) = \left(\begin{array}{cc} \lambda & \mathbf{0}^{T} \\ \mathbf{0} & D_{n-1} \end{array} \right) = D$$

Q.E.D
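The whole inductive construction can also be run numerically; here is a rough sketch (my own, not part of the textbook proof; all names are mine) that follows the proof step by step: find one eigenpair, extend to an orthonormal basis, recurse on the $(n-1)\times(n-1)$ block, and assemble $R_n = P_n S$.

```python
import numpy as np

def diagonalize_hermitian(A):
    """Return a unitary R such that R^H A R is diagonal (inductive proof)."""
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex)              # base case n = 1
    w, V = np.linalg.eigh(A)
    x = V[:, 0]                                      # a unit eigenvector of A
    # extend x to an orthonormal basis -> unitary P with first column ~ x
    P, _ = np.linalg.qr(np.column_stack([x, np.eye(n, dtype=complex)]))
    T = P.conj().T @ A @ P                           # block form [[lam, 0], [0, A_{n-1}]]
    S = np.eye(n, dtype=complex)
    S[1:, 1:] = diagonalize_hermitian(T[1:, 1:])     # induction hypothesis
    return P @ S                                     # R_n = P_n S

rng = np.random.default_rng(4)
n = 5
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T                                   # random Hermitian matrix

R = diagonalize_hermitian(A)
D = R.conj().T @ A @ R
assert np.allclose(R.conj().T @ R, np.eye(n))        # R is unitary
assert np.allclose(D, np.diag(np.diag(D)))           # D is diagonal
assert np.allclose(np.diag(D).imag, 0)               # eigenvalues are real
```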

What I don't understand is that, according to this,

$$P_{n}^{H}A_{n}P_{n} = \left(\begin{array}{cc} \lambda & \mathbf{0}^{T} \\ \mathbf{0} & A_{n-1} \end{array} \right)$$

Why?

Thank you.
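One way to see this step concretely: the first column of $P_n$ is the unit eigenvector $x$, so $P_{n}^{H}A P_{n}e_1 = P_{n}^{H}(\lambda x) = \lambda e_1$ (the other columns are orthogonal to $x$), which gives the first column $(\lambda, 0, \dots, 0)^{T}$; since $P_{n}^{H}A P_{n}$ is Hermitian, the first row is zero past $\lambda$ as well. A numerical sketch of this (my own; names are mine):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T                       # a random Hermitian matrix

w, V = np.linalg.eigh(A)
lam, x = w[0], V[:, 0]                   # A @ x == lam * x, ||x|| = 1
# unitary P whose first column is proportional to x
P, _ = np.linalg.qr(np.column_stack([x, np.eye(n, dtype=complex)]))

T = P.conj().T @ A @ P
assert np.allclose(T[0, 0], lam)         # top-left entry is lambda
assert np.allclose(T[1:, 0], 0)          # rest of first column is zero
assert np.allclose(T[0, 1:], 0)          # first row zero by Hermitian symmetry
```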
