# Polar decomposition of a Hermitian matrix

## Homework Statement

I need the steps to follow when finding the polar decomposition of a hermitian matrix.

If someone could direct me to a website that would help, or put up an example here, please do.
Thanks :)

## The Attempt at a Solution

The polar decomposition as stated in Horn and Johnson's Matrix Analysis (corollary 7.3.3):

"If A is an $n\times n$ complex matrix, then it may be written in the form

$$A = PU,$$

where P is a positive semidefinite matrix and U is unitary. The matrix P is always uniquely determined as $P = ( AA^*)^{1/2}$; if A is nonsingular, then U is uniquely determined as $U = P^{-1}A$."
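As a quick numerical cross-check of the corollary, here is a sketch using SciPy's `scipy.linalg.polar` (with `side='left'` to match the $A = PU$ ordering above) on a small hermitian matrix I made up for illustration:

```python
import numpy as np
from scipy.linalg import polar

# A hypothetical hermitian example (eigenvalues 3 and -1).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# side='left' gives the factorization A = PU used in Horn and Johnson.
U, P = polar(A, side='left')

# Sanity checks: A = PU, U unitary, P positive semidefinite.
assert np.allclose(P @ U, A)
assert np.allclose(U @ U.conj().T, np.eye(2))
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)
```

This is just a way to verify hand computations; the steps below show how to obtain $P$ and $U$ yourself.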

So, we need to know how to take square roots of matrices. Let M be any positive semidefinite matrix. It is therefore hermitian. We know by the spectral theorem for hermitian matrices (theorem 4.1.5, Horn and Johnson) that it is unitarily equivalent to a diagonal matrix, i.e., it is diagonalizable via a unitary:

$$M = U \Lambda U^*$$

where $\Lambda$ is the usual matrix of eigenvalues. Define $M^{1/2} = U \Lambda^{1/2} U^*$, where

$$\Lambda^{1/2} = \operatorname{diag}( \lambda_1^{1/2}, \ldots, \lambda_n^{1/2})$$

and the unique NONNEGATIVE square root is taken in each case. This is possible since the spectrum of a positive semidefinite matrix is nonnegative. Then $M^{1/2}$ is hermitian (this is the converse of the spectral theorem) and it is positive semidefinite since its spectrum is nonnegative (theorem 7.2.1, Horn and Johnson).
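The square-root recipe above can be sketched numerically. The matrix `M` below is a made-up positive semidefinite example, not one from the thread:

```python
import numpy as np

# Hypothetical positive semidefinite example (eigenvalues 1 and 3).
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Spectral decomposition M = U Lambda U* (eigh is for hermitian
# matrices: real eigenvalues, orthonormal eigenvector columns).
eigvals, U = np.linalg.eigh(M)

# Take the unique NONNEGATIVE square root of each eigenvalue
# (clip guards against tiny negative values from roundoff).
sqrt_Lambda = np.diag(np.sqrt(np.clip(eigvals, 0.0, None)))

# M^{1/2} = U Lambda^{1/2} U*
M_sqrt = U @ sqrt_Lambda @ U.conj().T

# Squaring the root recovers M, and the root is hermitian.
assert np.allclose(M_sqrt @ M_sqrt, M)
assert np.allclose(M_sqrt, M_sqrt.conj().T)
```

The same steps work by hand: diagonalize, take nonnegative roots of the eigenvalues, and undo the diagonalization.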

OK, so now we are to compute $(AA^*)^{1/2} = (A^2)^{1/2}$ (these are equal because A is hermitian, so $A^* = A$). Note that this is in general NOT equal to A itself. Why? Because the eigenvalues of A may be negative, and we take nonnegative roots when computing the square root. First, we had better verify that $AA^*=A^2$ is positive semidefinite: let x be any column vector; then

$$x^*AA^*x = (A^*x)^*(A^*x) = ||A^*x||^2 \geq 0,$$

and $(AA^*)^* = AA^*,$ as required.

Now, A is hermitian and therefore we may invoke the spectral theorem one more time to write

$$A = W D W^*,$$

where D is the diagonal matrix of eigenvalues and W is unitary. Then

$$AA^* = A^2 = W D^2 W^*$$

and thus

$$(AA^*)^{1/2} = (A^2)^{1/2} = W (D^2)^{1/2} W^*.$$

Remember to take the nonnegative roots, i.e. $(D^2)^{1/2} = \operatorname{diag}(|\lambda_1|, \ldots, |\lambda_n|)$! And we've found our P! Now, if A is nonsingular, then we can find U via the formula in the original theorem. The verification that U is unitary (EDIT: and the uniqueness of P and U!) can be found in Horn and Johnson.
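Putting the whole procedure together, here is a sketch on a hypothetical hermitian matrix with a negative eigenvalue, so that $P$ genuinely differs from $A$:

```python
import numpy as np

# Hypothetical hermitian example; eigenvalues are 3 and -1.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# Step 1: spectral decomposition A = W D W*.
d, W = np.linalg.eigh(A)

# Step 2: P = (A^2)^{1/2} = W (D^2)^{1/2} W*, taking nonnegative
# roots, so the diagonal entries of (D^2)^{1/2} are |lambda_i|.
P = W @ np.diag(np.abs(d)) @ W.conj().T

# Step 3: A is nonsingular here, so U = P^{-1} A
# (solve is numerically nicer than forming the inverse).
U = np.linalg.solve(P, A)

# Checks: A = PU, U unitary, P positive semidefinite.
assert np.allclose(P @ U, A)
assert np.allclose(U @ U.conj().T, np.eye(2))
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)
```

For this particular $A$ the computation gives $P = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ and $U = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$, which you can verify by hand.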

If you have any more questions or want to see an example, write back.

Could you please put up a worked example?
I want to follow the steps you posted with the example and see if I can get the correct answer.
Thanks so much.
I'm studying at the University of South Africa. Our prescribed text is "Matrices and Linear Transformations" by Cullen.
I'll see if I can get a copy of the text you suggested.
Thanks again.

No problem. PhysicsForums doesn't have the LaTeX commands that I'm used to working with when typing up matrices (I have my own custom commands), so I'll type up a quick .pdf file and post it in this thread when I'm done.

Here we go. I think all my computations are correct.

#### Attachments

• Temp.pdf
Hi,
Will this still work if my matrix is a REAL hermitian matrix?
I'm going to find some time today at work to work through your example.
Thank you so much!
The textbook I have just has theorems and no examples.
:(

thanks again.

Yup, it'll work! In the case that the matrix is real and hermitian, we call it symmetric.

Is it necessary to normalize the eigenvectors to unit length?
For the question I am working on, the answer is given. I notice that if I do not use unit eigenvectors, then I still get the same answer as the book does for the matrix P.
I am still working out the matrix U.
The matrix P is quite ugly (well, to me it's ugly) and getting its inverse is taking me a while.
I'm sure I am just making calculation errors with this, as finding the inverse of a matrix is pretty straightforward.
Thanks again.
PS: Did you see my post on finding the projectors for a matrix? Can you help there?

No, unit eigenvectors are not necessary when diagonalizing. We just needed a guarantee that the matrix was diagonalizable, and the spectral theorem for hermitian matrices provides us with that guarantee. But the spectral theorem says something stronger: not only is the matrix diagonalizable, it is diagonalizable via a unitary. What I mean by stronger is this: unitarily equivalent matrices are similar, but the converse does not hold in general.

But you are correct, we don't need those eigenvectors to be unit length. We just need to diagonalize it (just remember that if W is not unitary, you must use $W^{-1}$ in place of $W^*$ in the formulas above).
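To illustrate, here is a sketch using non-unit eigenvectors of the same hypothetical matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$; with $W^{-1}$ in place of $W^*$, the same $P$ comes out:

```python
import numpy as np

# Hypothetical hermitian example with eigenvalues 3 and -1.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# Deliberately NON-unit eigenvectors of A, as columns of W:
# (5, 5) for eigenvalue 3 and (7, -7) for eigenvalue -1.
W = np.array([[5.0, 7.0],
              [5.0, -7.0]])
D = np.diag([3.0, -1.0])

# A = W D W^{-1} still holds, even though W is not unitary.
assert np.allclose(W @ D @ np.linalg.inv(W), A)

# P = W |D| W^{-1} agrees with the unitary-route answer.
P = W @ np.abs(D) @ np.linalg.inv(W)
assert np.allclose(P, np.array([[2.0, 1.0],
                                [1.0, 2.0]]))
```

So normalizing only matters if you want $W$ itself to be unitary; the decomposition of $A$ is unaffected.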