Tricky complex square matrix problem

  1. Oct 26, 2011 #1
    I have a complex square matrix, [itex]\textbf{C}[/itex], which satisfies:

    [itex]\textbf{C}\textbf{C} = (\textbf{I} \odot \textbf{C})[/itex]

    where [itex]\textbf{I}[/itex] is the identity matrix and [itex]\odot[/itex] denotes the Hadamard (element-by-element) product. In other words, [itex]\textbf{C}\textbf{C}[/itex] is a diagonal matrix whose diagonal entries are the same as the diagonal entries of [itex]\textbf{C}[/itex], which is not necessarily diagonal itself.

    Furthermore, [itex]\textbf{C}[/itex] is Hermitian:

    [itex]\textbf{C}^{H}=\textbf{C}[/itex]

    and [itex]\textbf{C}[/itex] must be full rank (because actually, in my problem, [itex]\textbf{C} \triangleq (\textbf{A}^{H}\textbf{A})^{-1}[/itex], where [itex]\textbf{A}[/itex] is complex square invertible).

    I want to determine whether [itex]\textbf{C} = \textbf{I}[/itex] is the only solution (because this would imply that [itex]\textbf{A}[/itex] is unitary). (This is equivalent to proving that [itex]\textbf{C}[/itex] is diagonal). By expanding out terms, I've shown that [itex]\textbf{C} = \textbf{I}[/itex] is the only invertible solution for [itex](3 \times 3)[/itex] matrices, but I can't seem to obtain a general proof.

    Any help or insight would be very much appreciated - I'm completely stumped!
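
    For concreteness, here is a quick numerical sanity check of the property (a numpy sketch; the helper name and test size are just illustrative):

    [code]
    import numpy as np

    def satisfies_constraint(C, tol=1e-10):
        """Check whether C @ C equals diag(diag(C))."""
        return np.max(np.abs(C @ C - np.diag(np.diag(C)))) < tol

    n = 4
    rng = np.random.default_rng(0)

    # C = I satisfies the constraint trivially: I @ I = I = diag(diag(I)).
    print(satisfies_constraint(np.eye(n)))    # True

    # A generic Hermitian full-rank C = (A^H A)^{-1} does not.
    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    C = np.linalg.inv(A.conj().T @ A)
    print(satisfies_constraint(C))            # False (almost surely)
    [/code]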
     
  3. Oct 27, 2011 #2
You can take the square root of your equation.

    Since C is positive definite ([itex]C=(A^\dagger A)^{-1}[/itex]), it is the unique positive square root of [itex]\textbf{C}\textbf{C} = (\textbf{I} \odot \textbf{C})[/itex], and the positive square root of a diagonal matrix is itself diagonal. So on the left you have C, and you obtain (in components):
    [tex]C_{ij}=\sqrt{C_{ii}}\,\delta_{ij}[/tex]

    In particular [itex]C_{ii} = \sqrt{C_{ii}}[/itex], so each diagonal entry is 0 or 1; since C is full rank, you can conclude that C is the identity matrix.
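
    The key fact used here: a positive semi-definite matrix has a unique positive semi-definite square root, and for a diagonal matrix that root is simply the diagonal matrix of square roots. A small numpy/scipy sketch of this (the entries of D are arbitrary):

    [code]
    import numpy as np
    from scipy.linalg import sqrtm

    # The unique PSD square root of a diagonal PSD matrix is diagonal.
    D = np.diag([4.0, 9.0, 1.0])
    S = sqrtm(D)
    print(np.allclose(S, np.diag([2.0, 3.0, 1.0])))   # True

    # So if C is positive definite and C @ C = diag(diag(C)), then C itself
    # equals sqrtm(diag(diag(C))), which is diagonal; C_ii = sqrt(C_ii)
    # gives C_ii in {0, 1}, and full rank forces C = I.
    [/code]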
     
  4. Oct 27, 2011 #3
Brilliant! Thank you very much indeed! I had really been scratching my head over that one. Many thanks again for your help!
     
  5. Oct 27, 2011 #4

    AlephZero

    Science Advisor
    Homework Helper

    You can show this from "first principles". Let the matrix be
[tex]\left(\begin{array}{cc} a & b \\ b^* & c \end{array}\right)[/tex]
    where a and c are real.

Multiplying the matrices out gives 3 equations:

    a^2 + bb* = a
    c^2 + bb* = c
    ab + bc = 0

Subtracting the first two equations gives (a - c)(a + c - 1) = 0, so either a = c or a + c = 1.
    From the third equation, b(a + c) = 0, so either b = 0 or a + c = 0.

    So either b = 0, or a = c = 0 (if b is non-zero then a + c = 0, which rules out a + c = 1, so a = c and hence a = c = 0).

    But from the first two equations, if a = c = 0 then b = 0 also.

    So, the first two equations reduce to a^2 = a and c^2 = c, and the only solution which gives a matrix of full rank is C = I.
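
    The case analysis can also be checked symbolically. A sympy sketch (writing b = x + iy with x and y real, which solve handles more robustly than an explicit conjugate symbol):

    [code]
    import sympy as sp

    a, c, x, y = sp.symbols('a c x y', real=True)
    b = x + sp.I * y

    C = sp.Matrix([[a, b], [sp.conjugate(b), c]])
    E = sp.expand(C * C - sp.diag(a, c))   # must vanish entrywise

    eqs = [E[0, 0], E[1, 1], sp.re(E[0, 1]), sp.im(E[0, 1])]
    print(sp.solve(eqs, [a, c, x, y], dict=True))
    # Four solutions, all with b = 0 and a, c in {0, 1}; the only
    # full-rank one is a = c = 1, i.e. C = I.
    [/code]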
     
  6. Oct 28, 2011 #5
    Thanks AlephZero. That is the approach I took in order to obtain a proof for [itex](2 \times 2)[/itex] and [itex](3 \times 3)[/itex] matrices. (If I understand correctly, your [itex]a[/itex], [itex]b[/itex] and [itex]c[/itex] are scalars.) However, aesir's solution is valid for the general [itex](n \times n)[/itex] case, which is especially important for me.

    A final question on positive definiteness:
    If [itex]\textbf{A}[/itex] is not square, but instead is tall (with linearly independent columns) then is it correct to say that [itex](\textbf{A}^{H}\textbf{A})^{-1}[/itex] is now positive semi-definite?

    My reasoning is that [itex]\textbf{z}^H\textbf{A}^H\textbf{A}\textbf{z} = \left\Vert{\textbf{Az}}\right\Vert^2 \geq 0[/itex] for any [itex]\textbf{z}[/itex] (with equality when [itex]\textbf{z}[/itex] lies in the null space of [itex]\textbf{A}[/itex]).

    (Therefore aesir's square root still exists in this case).
     
  7. Oct 28, 2011 #6
I don't think so.
    It is true that if [itex]\textbf{z}[/itex] is in the null space of [itex]\textbf{A}[/itex] then [itex]\textbf{z}^H\textbf{A}^H\textbf{A}\textbf{z} = \left\Vert{\textbf{Az}}\right\Vert^2 = 0[/itex], but this shows that [itex](\textbf{A}^{H}\textbf{A})[/itex] is positive semi-definite, not its inverse (which does not exist if the null space is non-trivial). BTW, if [itex]\textbf{A}[/itex] has linearly independent columns, its null space is [itex]\{0\}[/itex].
     
  8. Oct 28, 2011 #7
    Ah yes, of course. Thanks for clearing that up!

    So are the following two statements correct?
    (1) [itex] (\textbf{A}^H\textbf{A}) [/itex] is positive definite when the columns of [itex]\textbf{A}[/itex] are independent (which requires that [itex]\textbf{A}[/itex] is tall or square). Therefore [itex] (\textbf{A}^H\textbf{A})^{-1} [/itex] is also positive definite.

    (2) When the rank of [itex]\textbf{A}[/itex] is less than its number of columns (which includes all fat matrices), [itex](\textbf{A}^H\textbf{A})[/itex] is positive semidefinite. In this case, [itex](\textbf{A}^H\textbf{A})^{-1}[/itex] does not exist.
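
    Both statements are easy to see numerically from the eigenvalues of [itex]\textbf{A}^H\textbf{A}[/itex]. A numpy sketch (the sizes and seed are arbitrary):

    [code]
    import numpy as np

    rng = np.random.default_rng(1)

    def gram_eigs(A):
        """Eigenvalues of A^H A (real, non-negative up to round-off)."""
        return np.linalg.eigvalsh(A.conj().T @ A)

    # (1) Tall A with independent columns: A^H A is positive definite.
    A_tall = rng.standard_normal((6, 3)) + 1j * rng.standard_normal((6, 3))
    print(gram_eigs(A_tall))   # all strictly positive -> invertible

    # (2) Fat A (rank < number of columns): A^H A is only semi-definite.
    A_fat = rng.standard_normal((3, 6)) + 1j * rng.standard_normal((3, 6))
    print(gram_eigs(A_fat))    # three (near-)zero eigenvalues
    [/code]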
     
  9. Oct 28, 2011 #8
    Yes, that's true.
In case (2) you can say a little more. If you split the vector space into null(A) and its orthogonal complement [itex]V_1[/itex], you have [tex]A^H A = \left(\begin{array}{cc} B^HB & 0 \\ 0 & 0 \end{array} \right)[/tex]
    and the block [itex]B^H B[/itex] has a positive definite inverse when restricted to [itex]V_1[/itex].
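
    In matrix terms, that splitting can be computed with an SVD: the right singular vectors give an orthonormal basis adapted to null(A) and [itex]V_1[/itex]. A numpy sketch (sizes and tolerance are arbitrary):

    [code]
    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 6)) + 1j * rng.standard_normal((3, 6))  # fat

    # The first r right singular vectors span V_1 (the orthogonal
    # complement of null(A)); the remaining ones span null(A).
    U, s, Vh = np.linalg.svd(A)
    r = int(np.sum(s > 1e-12))
    V = Vh.conj().T

    # In this basis A^H A is block diagonal: diag(B^H B, 0),
    # with B^H B = diag(s_1^2, ..., s_r^2) positive definite on V_1.
    G = V.conj().T @ (A.conj().T @ A) @ V
    print(np.allclose(G[:r, :r], np.diag(s[:r] ** 2)))            # True
    print(np.allclose(G[r:, :], 0) and np.allclose(G[:, r:], 0))  # True
    [/code]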
     