How to Prove a Matrix is Diagonal?

  • #1
I've been stuck on this problem for so long it's getting ridiculous. Please help!

I have a complex matrix, [itex]\textbf{A}[/itex], whose columns are linearly independent. In other words, [itex]\textbf{A}[/itex] is either tall or square and [itex] \left( \textbf{A}^H\textbf{A}\right)^{-1}[/itex] exists (where [itex]\left(\right)^H[/itex] denotes conjugate transpose). I am trying to prove that the matrix:

[itex]\textbf{B} \triangleq \left( \textbf{A}^H\textbf{A}\right)[/itex]

must be diagonal, based on the following:

[itex]\textbf{A}= diag(\underline{\lambda})\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}[/itex]

for some real diagonal matrix [itex]diag(\underline{\lambda})[/itex]. It may or may not also be useful to note that [itex]\textbf{A}[/itex] is also subject to the constraint:

[itex]\underline{diag}(\textbf{A}\textbf{A}^H) = \underline{1} [/itex]

by which I mean that all the diagonal entries of [itex](\textbf{A}\textbf{A}^H) [/itex] are equal to 1 (i.e. the Euclidean norms of the rows of [itex]\textbf{A}[/itex] are all 1).

I have deduced all sorts of properties of [itex]\textbf{A}[/itex] and strongly believe that it should be possible to show that [itex]\textbf{B}[/itex] is diagonal, but a proof escapes me. Any help is greatly appreciated!
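
In case it helps, here is how I am stating the two conditions numerically (a small numpy sketch, nothing beyond the definitions above):

[code]
import numpy as np

def check_conditions(A, lam, tol=1e-10):
    """Check the two given conditions:
    (1) A = diag(lam) A A^H A A^H A
    (2) every diagonal entry of A A^H equals 1 (unit-norm rows of A).
    Returns a pair of booleans."""
    AH = A.conj().T
    fixed_point = np.allclose(A, np.diag(lam) @ A @ AH @ A @ AH @ A, atol=tol)
    unit_diag = np.allclose(np.diag(A @ AH), 1.0, atol=tol)
    return fixed_point, unit_diag

def gram_is_diagonal(A, tol=1e-10):
    """Check whether B = A^H A is diagonal (the property I want to prove)."""
    B = A.conj().T @ A
    return np.allclose(B, np.diag(np.diag(B)), atol=tol)
[/code]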
 

Answers and Replies

  • #2
I might misunderstand your problem but

[tex]\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right)^H\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right) = \left(\begin{array}{cc} 10 & 14\\ 14 & 20\end{array}\right)[/tex]

This is not diagonal. It IS Hermitian though (as can easily be proven).
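
For reference, a quick numpy check of that product (just the numbers above, nothing assumed beyond them):

[code]
import numpy as np

A = np.array([[1, 2],
              [3, 4]], dtype=complex)
B = A.conj().T @ A                  # A^H A
print(B.real)                       # [[10. 14.]
                                    #  [14. 20.]]  -> not diagonal
print(np.allclose(B, B.conj().T))   # True: B is Hermitian
[/code]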
 
  • #3
I might misunderstand your problem but

[tex]\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right)^H\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right) = \left(\begin{array}{cc} 10 & 14\\ 14 & 20\end{array}\right)[/tex]

This is not diagonal. It IS Hermitian though (as can easily be proven).
I'm not sure what the consequences of that are in this context. The matrix you suggested cannot satisfy either of the equations:

[itex]\textbf{A}= diag(\underline{\lambda})\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}[/itex]

[itex]\underline{diag}(\textbf{A}\textbf{A}^H) = \underline{1} [/itex]
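
A quick numerical sketch of why that particular matrix fails both conditions (assuming numpy; the invertibility step in the comment is the only thing not spelled out above):

[code]
import numpy as np

A = np.array([[1, 2],
              [3, 4]], dtype=complex)
AH = A.conj().T

# Condition (2): the diagonal of A A^H would have to be all ones, but here it is [5, 25].
print(np.diag(A @ AH).real)      # [ 5. 25.]

# Condition (1): A = diag(lambda) A A^H A A^H A.  This A is invertible, so the equation
# would force (A A^H)^2 to be diagonal (equal to diag(lambda)^{-1}), but it is not:
print((A @ AH @ A @ AH).real)    # [[146. 330.]
                                 #  [330. 746.]]
[/code]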
 
  • #4
I like Serena
Homework Helper
by which I mean that all the diagonal entries of [itex](\textbf{A}\textbf{A}^H) [/itex] are equal to 1 (i.e. the Euclidean norms of the rows of [itex]\textbf{A}[/itex] are all 1).
Does this mean that [itex](\textbf{A}\textbf{A}^H) [/itex] is the identity matrix? :confused:
 
  • #5
Does this mean that [itex](\textbf{A}\textbf{A}^H) [/itex] is the identity matrix? :confused:
It could be any matrix with ones on the diagonal. For example:

[itex]
(\textbf{A}\textbf{A}^H) =
\left[
\begin{array}{lll}
1 & 3 & 2 \\
3 & 1 & 7 \\
2 & 7 & 1
\end{array}
\right]
[/itex]

would be suitable in this sense.
 
  • #6
I like Serena
Homework Helper
Oh, okay, so are the Euclidean norms of the rows of A not 1?
 
  • #7
Oh, okay, so are the Euclidean norms of the rows of A not 1?
They are: the Euclidean norms of the rows are all 1.

Consider, for example:

[itex]
\textbf{A}=\left[
\begin{array}{ll}
1 & 0 \\
1 & 0 \\
0 & 1
\end{array}
\right] ,\quad \underline{\lambda }=\left[
\begin{array}{l}
0.25 \\
0.25 \\
1
\end{array}
\right]
[/itex]

The norms of all rows of [itex]\textbf{A}[/itex] are equal to one, but [itex]\textbf{A}\textbf{A}^H[/itex] is not the identity matrix:

[itex]
\textbf{A}\textbf{A}^H=\left[
\begin{array}{lll}
1 & 1 & 0 \\
1 & 1 & 0 \\
0 & 0 & 1
\end{array}
\right]
[/itex]

(but [itex]\textbf{A}^H\textbf{A}[/itex] is diagonal, and I want to show that this must always be true).
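
A numpy check of this example (the values are exactly the ones above): both conditions hold and [itex]\textbf{A}^H\textbf{A}[/itex] comes out diagonal.

[code]
import numpy as np

A = np.array([[1, 0],
              [1, 0],
              [0, 1]], dtype=complex)
lam = np.array([0.25, 0.25, 1.0])
AH = A.conj().T

# Condition (1): A = diag(lam) A A^H A A^H A
print(np.allclose(A, np.diag(lam) @ A @ AH @ A @ AH @ A))   # True

# Condition (2): unit diagonal of A A^H, i.e. unit-norm rows of A
print(np.diag(A @ AH).real)     # [1. 1. 1.]

# A A^H is not the identity ...
print((A @ AH).real)            # [[1. 1. 0.]
                                #  [1. 1. 0.]
                                #  [0. 0. 1.]]
# ... but B = A^H A is diagonal:
print((AH @ A).real)            # [[2. 0.]
                                #  [0. 1.]]
[/code]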
 
