How to Prove a Matrix is Diagonal?

  • Context: Graduate
  • Thread starter: weetabixharry
  • Tags: Matrix

Discussion Overview

The discussion revolves around proving that a complex matrix \textbf{B}, defined as \textbf{B} = \textbf{A}^H\textbf{A}, is diagonal, given that the columns of the matrix \textbf{A} are linearly independent and \textbf{A} satisfies certain constraints. The scope includes theoretical exploration and mathematical reasoning.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested, Mathematical reasoning

Main Points Raised

  • One participant presents a complex matrix \textbf{A} with linearly independent columns and seeks to prove that \textbf{B} = \left( \textbf{A}^H\textbf{A}\right) is diagonal based on specific properties of \textbf{A} and \textbf{B}.
  • Another participant provides a counterexample using a specific matrix to show that \textbf{A}^H\textbf{A} can be Hermitian but not diagonal, questioning the validity of the initial claim.
  • Further discussion raises the question of whether \textbf{A}\textbf{A}^H must be the identity matrix, with one participant suggesting it could be any matrix with ones on the diagonal.
  • Another participant confirms that the Euclidean norms of the rows of \textbf{A} are indeed 1, but provides an example where \textbf{A}\textbf{A}^H is not the identity matrix, while \textbf{A}^H\textbf{A} is diagonal.

Areas of Agreement / Disagreement

Participants express differing views on whether \textbf{B} must be diagonal, with some providing counterexamples and questioning the assumptions made. The discussion remains unresolved regarding the proof of \textbf{B} being diagonal.

Contextual Notes

There are limitations regarding the assumptions made about the properties of \textbf{A} and the implications of the constraints on \textbf{A}\textbf{A}^H and \textbf{A}^H\textbf{A}. The relationship between these matrices is not fully established in the discussion.

weetabixharry
I've been stuck on this problem for so long it's getting ridiculous. Please help!

I have a complex matrix, \textbf{A}, whose columns are linearly independent. In other words, \textbf{A} is either tall or square and \left( \textbf{A}^H\textbf{A}\right)^{-1} exists (where \left(\cdot\right)^H denotes the conjugate transpose). I am trying to prove that the matrix:

\textbf{B} \triangleq \left( \textbf{A}^H\textbf{A}\right)

must be diagonal, based on the following:

\textbf{A}= diag(\underline{\lambda})\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}

for some real diagonal matrix diag(\underline{\lambda}). It may or may not also be useful to note that \textbf{A} is also subject to the constraint:

\underline{diag}(\textbf{A}\textbf{A}^H) = \underline{1}

by which I mean that all the diagonal entries of (\textbf{A}\textbf{A}^H) are equal to 1 (i.e. the Euclidean norms of the rows of \textbf{A} are all 1).

I have deduced all sorts of properties of \textbf{A}, but strongly believe that it should be possible to show that \textbf{B} is diagonal... but a proof escapes me. Any help is greatly appreciated!
 
I might misunderstand your problem but

\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right)^H\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right) = \left(\begin{array}{cc} 10 & 14\\ 14 & 20\end{array}\right)

This is not diagonal. It IS Hermitian, though (as can easily be proven).
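The product above is easy to confirm with a few lines of code. Here is a minimal pure-Python sketch (the helper names `matmul` and `ctranspose` are mine, not from the thread); since this example has real entries, the conjugate transpose reduces to the ordinary transpose:

```python
def matmul(X, Y):
    """Naive matrix product of nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def ctranspose(X):
    """Conjugate transpose; for real entries this is just the transpose."""
    return [[X[j][i] for j in range(len(X))] for i in range(len(X[0]))]

A = [[1, 2], [3, 4]]
print(matmul(ctranspose(A), A))  # [[10, 14], [14, 20]]: symmetric, not diagonal
```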
 
micromass said:
I might misunderstand your problem but

\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right)^H\left(\begin{array}{cc} 1 & 2\\ 3 & 4\end{array}\right) = \left(\begin{array}{cc} 10 & 14\\ 14 & 20\end{array}\right)

This is not diagonal. It IS Hermitian, though (as can easily be proven).

I'm not sure what the consequences of that are in this context. The matrix you suggested cannot satisfy either of the equations:

\textbf{A}= diag(\underline{\lambda})\textbf{A}\textbf{A}^H \textbf{A} \textbf{A}^H \textbf{A}

\underline{diag}(\textbf{A}\textbf{A}^H) = \underline{1}
 
weetabixharry said:
by which I mean that all the diagonal entries of (\textbf{A}\textbf{A}^H) are equal to 1 (i.e. the Euclidean norms of the rows of \textbf{A} are all 1).

Does this mean that (\textbf{A}\textbf{A}^H) is the identity matrix? :confused:
 
I like Serena said:
Does this mean that (\textbf{A}\textbf{A}^H) is the identity matrix? :confused:

It could be any matrix with ones on the diagonal. For example:

(\textbf{A}\textbf{A}^H) = \left[ \begin{array}{ccc} 1 & 3 & 2 \\ 3 & 1 & 7 \\ 2 & 7 & 1 \end{array} \right]

would be suitable in this sense.
 
Oh, okay, so are the Euclidean norms of the rows of A not 1?
 
I like Serena said:
Oh, okay, so are the Euclidean norms of the rows of A not 1?

Yes, the Euclidean norms of the rows are 1.

Consider, for example:

\textbf{A} = \left[ \begin{array}{cc} 1 & 0 \\ 1 & 0 \\ 0 & 1 \end{array} \right], \quad \underline{\lambda} = \left[ \begin{array}{c} 0.25 \\ 0.25 \\ 1 \end{array} \right]

The norms of all rows of \textbf{A} are equal to one, but \textbf{A}\textbf{A}^H is not the identity matrix:

\textbf{A}\textbf{A}^H = \left[ \begin{array}{ccc} 1 & 1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]

(but \textbf{A}^H\textbf{A} is diagonal, and I want to show that this must always be true).
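For this example, both constraints and the defining equation \textbf{A} = diag(\underline{\lambda})\textbf{A}\textbf{A}^H\textbf{A}\textbf{A}^H\textbf{A} can all be confirmed with a short pure-Python check (the helper names are mine, not from the thread; the entries are real, so the conjugate transpose is just the transpose):

```python
def matmul(X, Y):
    """Naive matrix product of nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def ctranspose(X):
    """Conjugate transpose; for real entries this is just the transpose."""
    return [[X[j][i] for j in range(len(X))] for i in range(len(X[0]))]

A = [[1, 0], [1, 0], [0, 1]]
lam = [0.25, 0.25, 1.0]
D = [[lam[i] if i == j else 0.0 for j in range(3)] for i in range(3)]

AAH = matmul(A, ctranspose(A))   # A A^H
AHA = matmul(ctranspose(A), A)   # A^H A

# Every row of A has unit Euclidean norm: diag(A A^H) = (1, 1, 1).
assert [AAH[i][i] for i in range(3)] == [1, 1, 1]

# A A^H is not the identity, but A^H A is diagonal.
assert AAH == [[1, 1, 0], [1, 1, 0], [0, 0, 1]]
assert AHA == [[2, 0], [0, 1]]

# Defining equation: A = diag(lam) A A^H A A^H A = diag(lam) A (A^H A)^2.
rhs = matmul(D, matmul(A, matmul(AHA, AHA)))
assert rhs == A
print("all checks pass")
```

Note that the right-hand side simplifies to diag(\underline{\lambda})\textbf{A}(\textbf{A}^H\textbf{A})^2, which is what the last line computes.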
 
