Tricky complex square matrix problem

weetabixharry
#1
Oct26-11, 06:46 AM
P: 108
I have a complex square matrix, [itex]\textbf{C}[/itex], which satisfies:

[itex]\textbf{C}\textbf{C} = (\textbf{I} \odot \textbf{C})[/itex]

where [itex]\textbf{I}[/itex] is the identity matrix and [itex]\odot[/itex] denotes the Hadamard (element-by-element) product. In other words, [itex]\textbf{C}\textbf{C}[/itex] is a diagonal matrix whose diagonal entries are the same as the diagonal entries of [itex]\textbf{C}[/itex], which is not necessarily diagonal itself.

Furthermore, [itex]\textbf{C}[/itex] is Hermitian:

[itex]\textbf{C}^{H}=\textbf{C}[/itex]

and [itex]\textbf{C}[/itex] must be full rank (because actually, in my problem, [itex]\textbf{C} \triangleq (\textbf{A}^{H}\textbf{A})^{-1}[/itex], where [itex]\textbf{A}[/itex] is complex square invertible).

I want to determine whether [itex]\textbf{C} = \textbf{I}[/itex] is the only solution (because this would imply that [itex]\textbf{A}[/itex] is unitary). (This is equivalent to proving that [itex]\textbf{C}[/itex] is diagonal). By expanding out terms, I've shown that [itex]\textbf{C} = \textbf{I}[/itex] is the only invertible solution for [itex](3 \times 3)[/itex] matrices, but I can't seem to obtain a general proof.

Any help or insight would be very much appreciated - I'm completely stumped!
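For concreteness, here is a quick numerical sanity check with numpy (just a sketch with arbitrary sizes and a fixed seed, not a proof): a unitary [itex]\textbf{A}[/itex] satisfies the equation, while a generic invertible [itex]\textbf{A}[/itex] does not.

```python
import numpy as np

rng = np.random.default_rng(0)

def condition_holds(A, tol=1e-10):
    """Check whether C = (A^H A)^{-1} satisfies C C = I o C (Hadamard)."""
    C = np.linalg.inv(A.conj().T @ A)
    lhs = C @ C
    rhs = np.diag(np.diag(C))  # I o C keeps only the diagonal of C
    return np.max(np.abs(lhs - rhs)) < tol

# A unitary A (Q factor of a QR factorization) gives C = I, which satisfies it.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
print(condition_holds(Q))   # True

# A generic (non-unitary) invertible A violates it almost surely.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
print(condition_holds(A))   # False for this seed
```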
aesir
#2
Oct27-11, 08:42 AM
P: 27
You can take the square root of both sides of your equation.

Since C is positive definite ([itex]C=(A^\dagger A)^{-1}[/itex]), C is the unique positive-semidefinite square root of the diagonal matrix on the right, and the positive-semidefinite square root of a diagonal matrix is diagonal. In components:
[tex]C_{ij}=\sqrt{C_{ii}}\,\delta_{ij}[/tex]

So C is diagonal, and its diagonal entries satisfy [itex]C_{ii}=\sqrt{C_{ii}}[/itex], i.e. [itex]C_{ii}\in\{0,1\}[/itex]; full rank then forces every [itex]C_{ii}=1[/itex], from which you can conclude that C is the identity matrix.
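To illustrate the square-root step numerically (a sketch using an eigendecomposition-based square root; the example matrix is arbitrary): the unique positive-semidefinite square root of a diagonal matrix is diagonal, which is what forces C to be diagonal.

```python
import numpy as np

def psd_sqrt(M):
    """Unique positive-semidefinite square root of a Hermitian PSD matrix."""
    w, V = np.linalg.eigh(M)
    # V * sqrt(w) scales the columns of V, so this is V diag(sqrt(w)) V^H.
    return (V * np.sqrt(np.clip(w, 0, None))) @ V.conj().T

# C C = I o C says C^2 equals the diagonal matrix D = diag(C).
# Since C is positive definite, C is the unique PSD square root of D,
# and the PSD square root of a diagonal matrix is diagonal:
D = np.diag([4.0, 9.0, 0.25])
S = psd_sqrt(D)
print(np.allclose(S, np.diag([2.0, 3.0, 0.5])))  # True
```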
weetabixharry
#3
Oct27-11, 01:22 PM
P: 108
Quote Quote by aesir View Post
You can take the square root of both sides of your equation.
Brilliant! Thank you very much indeed! I had really been scratching my head over that one. Many thanks again for your help!

AlephZero
#4
Oct27-11, 07:42 PM
Engineering
Sci Advisor
HW Helper
Thanks
P: 7,106
You can show this from "first principles". Let the matrix be
[tex]\left(\begin{array}{cc} a & b \\ b* & c \end{array}\right)[/tex]
where a and c are real.

Multiplying the matrices out gives 3 equations:

a^2 + bb* = a
c^2 + bb* = c
ab + bc = 0

Subtracting the first two equations, either a = c, or a + c = 1.
From the third equation, either b = 0, or a + c = 0.

Combining these: if b is nonzero then a + c = 0, which rules out a + c = 1, so a = c and hence a = c = 0.

But from the first two equations, if a = c = 0 then bb* = 0, so b = 0 also, a contradiction. Hence b = 0.

So, the first two equations reduce to a^2 = a and c^2 = c, and the only solution which gives a matrix of full rank is C = I.
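This 2x2 case can also be checked by brute force (a sketch; the grid and tolerance are arbitrary choices): searching a grid that contains the candidate values 0 and 1 recovers exactly the solutions with b = 0 and a, c in {0, 1}.

```python
import numpy as np

# Grid over a, c (real) and b = x + iy; the grid contains the candidate
# values 0 and 1 exactly, so every exact solution on it is detected.
vals = np.arange(-1.0, 1.75, 0.25)
solutions = []
for a in vals:
    for c in vals:
        for x in vals:
            for y in vals:
                bb = x * x + y * y                    # bb* = |b|^2
                eq1 = a * a + bb - a                  # a^2 + bb* = a
                eq2 = c * c + bb - c                  # c^2 + bb* = c
                eq3 = bb * (a + c) ** 2               # |ab + bc|^2 = 0
                if abs(eq1) < 1e-12 and abs(eq2) < 1e-12 and eq3 < 1e-12:
                    solutions.append((a, c, x, y))

# Exactly four solutions, all with b = 0 and a, c in {0, 1};
# the only full-rank one is a = c = 1, i.e. C = I.
print(len(solutions))
print(all(x == 0 and y == 0 and a in (0, 1) and c in (0, 1)
          for a, c, x, y in solutions))
```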
weetabixharry
#5
Oct28-11, 02:40 AM
P: 108
Thanks AlephZero. That is the approach I took in order to obtain a proof for [itex](2 \times 2)[/itex] and [itex](3 \times 3)[/itex] matrices. (If I understand correctly, your [itex]a[/itex], [itex]b[/itex] and [itex]c[/itex] are scalars.) However, aesir's solution is valid for the general [itex](n \times n)[/itex] case, which is especially important for me.

A final question on positive definiteness:
If [itex]\textbf{A}[/itex] is not square, but instead is tall (with linearly independent columns) then is it correct to say that [itex](\textbf{A}^{H}\textbf{A})^{-1}[/itex] is now positive semi-definite?

My reasoning is that [itex]\textbf{z}^H\textbf{A}^H\textbf{A}\textbf{z} = \left\Vert{\textbf{Az}}\right\Vert^2 \geq 0[/itex] for any [itex]\textbf{z}[/itex] (with equality when [itex]\textbf{z}[/itex] lies in the null space of [itex]\textbf{A}[/itex]).

(Therefore aesir's square root still exists in this case).
aesir
#6
Oct28-11, 03:20 AM
P: 27
Quote Quote by weetabixharry View Post
...
A final question on positive definiteness:
If [itex]\textbf{A}[/itex] is not square, but instead is tall (with linearly independent columns) then is it correct to say that [itex](\textbf{A}^{H}\textbf{A})^{-1}[/itex] is now positive semi-definite?

My reasoning is that [itex]\textbf{z}^H\textbf{A}^H\textbf{A}\textbf{z} = \left\Vert{\textbf{Az}}\right\Vert^2 \geq 0[/itex] for any [itex]\textbf{z}[/itex] (with equality when [itex]\textbf{z}[/itex] lies in the null space of [itex]\textbf{A}[/itex]).

(Therefore aesir's square root still exists in this case).
I don't think so.
It is true that if [itex]\textbf{z}[/itex] is in the null space of [itex]\textbf{A}[/itex] then [itex]\textbf{z}^H\textbf{A}^H\textbf{A}\textbf{z} = \left\Vert{\textbf{Az}}\right\Vert^2 = 0[/itex], but this means that [itex](\textbf{A}^{H}\textbf{A})[/itex] is positive semidefinite, not its inverse (which does not exist if the null space is non-trivial). By the way, if [itex]\textbf{A}[/itex] has linearly independent columns, its null space is [itex]\{0\}[/itex].
weetabixharry
#7
Oct28-11, 05:08 AM
P: 108
Quote Quote by aesir View Post
I don't think so.
It is true that if [itex]\textbf{z}[/itex] is in the null space of [itex]\textbf{A}[/itex] then [itex]\textbf{z}^H\textbf{A}^H\textbf{A}\textbf{z} = \left\Vert{\textbf{Az}}\right\Vert^2 = 0[/itex], but this means that [itex](\textbf{A}^{H}\textbf{A})[/itex] is positive semidefinite, not its inverse (which does not exist if the null space is non-trivial). By the way, if [itex]\textbf{A}[/itex] has linearly independent columns, its null space is [itex]\{0\}[/itex].
Ah yes, of course. Thanks for clearing that up!

So are the following two statements correct?
(1) [itex] (\textbf{A}^H\textbf{A}) [/itex] is positive definite when the columns of [itex]\textbf{A}[/itex] are independent (which requires that [itex]\textbf{A}[/itex] is tall or square). Therefore [itex] (\textbf{A}^H\textbf{A})^{-1} [/itex] is also positive definite.

(2) When the rank of [itex]\textbf{A}[/itex] is less than its number of columns (which includes all fat matrices), [itex](\textbf{A}^H\textbf{A})[/itex] is positive semidefinite. In this case, [itex](\textbf{A}^H\textbf{A})^{-1}[/itex] does not exist.
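Both statements are easy to check numerically (a sketch with arbitrary sizes and a fixed seed; "almost surely" because the test matrices are random):

```python
import numpy as np

rng = np.random.default_rng(1)

# (1) Tall A with independent columns: A^H A is positive definite,
# and so is its inverse (its eigenvalues are the reciprocals).
A = rng.standard_normal((6, 3)) + 1j * rng.standard_normal((6, 3))
G = A.conj().T @ A
w = np.linalg.eigvalsh(G)
print(np.all(w > 0))                                       # True
print(np.all(np.linalg.eigvalsh(np.linalg.inv(G)) > 0))    # True

# (2) Fat A: rank < number of columns, so A^H A is singular
# (only positive *semi*definite) and has no inverse.
B = rng.standard_normal((3, 6)) + 1j * rng.standard_normal((3, 6))
H = B.conj().T @ B
print(np.linalg.matrix_rank(H))   # 3, strictly less than 6
```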
aesir
#8
Oct28-11, 07:10 AM
P: 27
Quote Quote by weetabixharry View Post
Ah yes, of course. Thanks for clearing that up!

So are the following two statements correct?
(1) [itex] (\textbf{A}^H\textbf{A}) [/itex] is positive definite when the columns of [itex]\textbf{A}[/itex] are independent (which requires that [itex]\textbf{A}[/itex] is tall or square). Therefore [itex] (\textbf{A}^H\textbf{A})^{-1} [/itex] is also positive definite.

(2) When the rank of [itex]\textbf{A}[/itex] is less than its number of columns (which includes all fat matrices), [itex](\textbf{A}^H\textbf{A})[/itex] is positive semidefinite. In this case, [itex](\textbf{A}^H\textbf{A})^{-1}[/itex] does not exist.
Yes, that's true.
In case (2) you can say a little more. If you split the vector space into null(A) and its orthogonal complement [itex]V_1[/itex], you have [tex]A^H A = \left(\begin{array}{cc} B^HB & 0 \\ 0 & 0 \end{array} \right)[/tex]
where [itex]B[/itex] is [itex]A[/itex] restricted to [itex]V_1[/itex]; the [itex]B^HB[/itex] block has a positive definite inverse when restricted to [itex]V_1[/itex].
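This block structure can be checked numerically via the SVD (a sketch; here [itex]V_1[/itex] is identified with the row space of A, and the matrix sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# A fat matrix, so null(A) is nontrivial.
A = rng.standard_normal((3, 5)) + 1j * rng.standard_normal((3, 5))

# The SVD gives an orthonormal basis: the first r columns of Vh^H span
# the row space (the complement V_1), the rest span null(A).
U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))        # rank
V1 = Vh.conj().T[:, :r]           # orthonormal basis of V_1
N = Vh.conj().T[:, r:]            # orthonormal basis of null(A)

G = A.conj().T @ A
# In this basis G is block diagonal: the V_1 block is positive definite,
# and every block touching null(A) vanishes.
block_11 = V1.conj().T @ G @ V1
print(np.all(np.linalg.eigvalsh(block_11) > 0))   # True
print(np.allclose(N.conj().T @ G @ N, 0))         # True
print(np.allclose(V1.conj().T @ G @ N, 0))        # True
```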

