# How to Prove a Matrix is Diagonal?

I have a complex matrix, $\textbf{A}$, whose columns are linearly independent. In other words, $\textbf{A}$ is either tall or square, and $\left( \textbf{A}^H\textbf{A}\right)^{-1}$ exists (where $(\cdot)^H$ denotes the conjugate transpose). I am trying to prove that the matrix:

$\textbf{B} \triangleq \left( \textbf{A}^H\textbf{A}\right)$

must be diagonal, based on the following:

$\textbf{A}= diag(\underline{\lambda})\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}$

for some real diagonal matrix $diag(\underline{\lambda})$. It may or may not also be useful to note that $\textbf{A}$ is also subject to the constraint:

$\underline{diag}(\textbf{A}\textbf{A}^H) = \underline{1}$

by which I mean that all the diagonal entries of $(\textbf{A}\textbf{A}^H)$ are equal to 1 (i.e. the Euclidean norms of the rows of $\textbf{A}$ are all 1).

I have deduced all sorts of properties of $\textbf{A}$ and strongly believe that it should be possible to show that $\textbf{B}$ is diagonal, but a proof escapes me. Any help is greatly appreciated!


micromass
Staff Emeritus
Homework Helper
I might misunderstand your problem but

$$\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right)^H\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right) = \left(\begin{array}{cc} 10 & 14\\ 14 & 20\end{array}\right)$$

This is not diagonal. It IS Hermitian, though (as is easily proven).
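For anyone who wants to check this product by hand, a quick pure-Python computation reproduces it (the helper function names here are my own, not from the thread):

```python
def ct(M):
    """Conjugate transpose of a matrix given as a list of rows."""
    return [[M[r][c].conjugate() for r in range(len(M))]
            for c in range(len(M[0]))]

def matmul(X, Y):
    """Plain matrix product of two lists-of-rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = matmul(ct(A), A)

print(B)             # [[10, 14], [14, 20]]
print(B == ct(B))    # True: B is Hermitian
print(B[0][1] == 0)  # False: B is not diagonal
```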

> I might misunderstand your problem but
>
> $$\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right)^H\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right) = \left(\begin{array}{cc} 10 & 14\\ 14 & 20\end{array}\right)$$
>
> This is not diagonal. It IS Hermitian, though (as is easily proven).

I'm not sure what the consequences of that are in this context. The matrix you suggested cannot satisfy either of the equations:

$\textbf{A}= diag(\underline{\lambda})\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}$

$\underline{diag}(\textbf{A}\textbf{A}^H) = \underline{1}$
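This can be checked numerically; a small pure-Python sketch (helper names are my own) shows that for the suggested matrix the diagonal of $\textbf{A}\textbf{A}^H$ is not all ones, and that no diagonal $diag(\underline{\lambda})$ can satisfy the first equation, since no row of $\textbf{A}$ is a scalar multiple of the corresponding row of $\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}$:

```python
def ct(M):
    """Conjugate transpose of a matrix given as a list of rows."""
    return [[M[r][c].conjugate() for r in range(len(M))]
            for c in range(len(M[0]))]

def matmul(X, Y):
    """Plain matrix product of two lists-of-rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[1, 2], [3, 4]]

# Diagonal of A A^H = squared Euclidean norms of the rows of A.
G = matmul(A, ct(A))
print([G[i][i] for i in range(2)])  # [5, 25] -- not [1, 1]

# Row i of A would have to be lambda_i times row i of M = A A^H A A^H A;
# a nonzero 2x2 cross product rules out proportionality for every row.
B = matmul(ct(A), A)
M = matmul(A, matmul(B, B))
print([A[i][0] * M[i][1] - A[i][1] * M[i][0] for i in range(2)])  # [-660, 660]
```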

I like Serena
Homework Helper
> by which I mean that all the diagonal entries of $(\textbf{A}\textbf{A}^H)$ are equal to 1 (i.e. the Euclidean norms of the rows of $\textbf{A}$ are all 1).

Does this mean that $(\textbf{A}\textbf{A}^H)$ is the identity matrix?

> Does this mean that $(\textbf{A}\textbf{A}^H)$ is the identity matrix?

It could be any matrix with ones on the diagonal. For example:

$(\textbf{A}\textbf{A}^H) = \left[ \begin{array}{lll} 1 & 3 & 2 \\ 3 & 1 & 7 \\ 2 & 7 & 1 \end{array} \right]$

would be suitable in this sense.

I like Serena
Homework Helper
Oh, okay, so are the Euclidean norms of the rows of A not 1?

> Oh, okay, so are the Euclidean norms of the rows of A not 1?

They are: the Euclidean norms of the rows are all equal to 1.

Consider, for example:

$\textbf{A}=\left[ \begin{array}{ll} 1 & 0 \\ 1 & 0 \\ 0 & 1 \end{array} \right], \quad \underline{\lambda}=\left[ \begin{array}{l} 0.25 \\ 0.25 \\ 1 \end{array} \right]$

The norms of all rows of $\textbf{A}$ are equal to one, but $\textbf{A}\textbf{A}^H$ is not the identity matrix:

$\textbf{A}\textbf{A}^H=\left[ \begin{array}{lll} 1 & 1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]$

(but $\textbf{A}^H\textbf{A}$ is diagonal, and I want to show that this must always be true).
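This example can be verified end to end with a short pure-Python check (the helper names are my own): the rows have unit norm, $\textbf{A}\textbf{A}^H$ is not the identity, $\textbf{A}^H\textbf{A}$ is diagonal, and the fixed-point equation holds with the stated $\underline{\lambda}$.

```python
def ct(M):
    """Conjugate transpose of a matrix given as a list of rows."""
    return [[M[r][c].conjugate() for r in range(len(M))]
            for c in range(len(M[0]))]

def matmul(X, Y):
    """Plain matrix product of two lists-of-rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[1, 0], [1, 0], [0, 1]]
lam = [0.25, 0.25, 1]

# Rows all have Euclidean norm 1.
print([sum(x * x for x in row) for row in A])  # [1, 1, 1]

# A A^H is not the identity, but A^H A is diagonal.
print(matmul(A, ct(A)))  # [[1, 1, 0], [1, 1, 0], [0, 0, 1]]
print(matmul(ct(A), A))  # [[2, 0], [0, 1]]

# The fixed-point equation A = diag(lam) A A^H A A^H A holds.
M = matmul(matmul(A, ct(A)), matmul(A, matmul(ct(A), A)))
scaled = [[lam[i] * M[i][j] for j in range(2)] for i in range(3)]
print(scaled == A)  # True
```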
