I Is a symmetric matrix with positive eigenvalues always real?

  1. Jan 26, 2017 #1

    TeethWhitener

    Science Advisor
    Gold Member

    I split off this question from the thread here:

    https://www.physicsforums.com/threads/error-in-landau-lifshitz-mechanics.901356/

    In that thread, I was told that a symmetric matrix ##\mathbf{A}## with real, positive eigenvalues ##\{\lambda_i\} \subset \mathbb{R}^+## is always real. I feel that I must be overlooking something simple, as I can't seem to prove it. Clearly the determinant and trace are positive (so the matrix is nonsingular), and if it's diagonalizable, then the matrix is similar to a real matrix (namely the diagonal eigenvalue matrix). But I'm not seeing how this implies that the original ##\mathbf{A}## is real.

    I've seen various claims that a symmetric matrix can be written as ##\mathbf{A} = \mathbf{O}^T\mathbf{D}\mathbf{O}##, where ##\mathbf{O}## is an orthogonal matrix and ##\mathbf{D}## is the diagonal eigenvalue matrix, but sometimes (e.g., Wikipedia) the claim is explicitly for real symmetric matrices, and sometimes (e.g., MathWorld) it is unspecified whether the claim is for real matrices or for all symmetric matrices. If it's true for all symmetric matrices, then we're done, because orthogonal matrices are necessarily real. But if it's only true for real symmetric matrices, then we're back to square one.
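    For the real symmetric case, at least, the ##\mathbf{O}^T\mathbf{D}\mathbf{O}## factorization is easy to check numerically. A minimal sketch (assuming numpy; `eigh` returns an orthogonal eigenvector matrix for a real symmetric input):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random real symmetric matrix
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

# eigh returns real eigenvalues w and an orthogonal eigenvector matrix Q
w, Q = np.linalg.eigh(A)

# Q is orthogonal (Q^T Q = I), and A = Q D Q^T reconstructs the original
assert np.allclose(Q.T @ Q, np.eye(4))
assert np.allclose(Q @ np.diag(w) @ Q.T, A)
```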

    I also tried working out the case for 2x2 matrices explicitly. The characteristic polynomial is
    $$0 = \lambda^2-\lambda\mathrm{tr}(\mathbf{A})+\mathrm{det}(\mathbf{A})$$
    Since ##\lambda^2 = \lambda\,\mathrm{tr}(\mathbf{A})-\mathrm{det}(\mathbf{A})## and ##\lambda^2 > 0##, the quantity
    $$\lambda(a_{11}+a_{22})-(a_{11}a_{22} - a_{12}^2) > 0$$
    must be real and positive. Writing this out explicitly in terms of complex entries gives:
    $$\lambda((a+bi)+(c+di)) > [(a+bi)(c+di)-(e+fi)^2]$$
    where ##a,b,c,d,e,f \in \mathbb{R}##. To get rid of the imaginary part, we need to satisfy:
    $$0 = \lambda(b+d)-bc-ad+2ef$$
    Since ##\mathrm{tr}(\mathbf{A}) > 0##, we get ##b+d = 0##. Since ##\mathrm{det}(\mathbf{A}) > 0##, we get ##bc+ad-2ef = 0##. So we have two conditions to satisfy:
    $$b(c-a)-2ef=0$$
    ensures reality, and:
    $$\lambda(a+c)>ac+b^2-e^2+f^2$$
    ensures positivity.
    However, I don't see how, in general, this implies ##b=d=f=0##, so that the overall matrix is real. (EDIT: if one of ##b,d,f=0##, then all of them must be zero by the first condition.)
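    The imaginary-part bookkeeping above can be double-checked symbolically; a quick sketch, assuming sympy is available:

```python
import sympy as sp

a, b, c, d, e, f, lam = sp.symbols('a b c d e f lambda', real=True)

# Complex symmetric 2x2 matrix, matching the post:
# a11 = a+bi, a22 = c+di, a12 = a21 = e+fi
A = sp.Matrix([[a + b*sp.I, e + f*sp.I],
               [e + f*sp.I, c + d*sp.I]])

# lambda^2 = lambda*tr(A) - det(A), so for real lambda the RHS must be real
expr = lam * A.trace() - A.det()
im_part = sp.im(sp.expand(expr))

# The imaginary part should match lambda*(b+d) - bc - ad + 2ef
assert sp.simplify(im_part - (lam*(b + d) - b*c - a*d + 2*e*f)) == 0
```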

    It's a weird problem because I'm so used to dealing with either real symmetric matrices or complex Hermitian matrices that I'm not sure what linear algebra rules apply to complex symmetric matrices. Thanks for any insight you can provide.
     
  3. Jan 26, 2017 #2

    TeethWhitener


    Thanks for your response. Quick clarification: I'm assuming you mean every complex symmetric matrix may be diagonalized with a unitary matrix.

    Follow up questions:

    The Wikipedia link gives ##A=UDU^T##, possibly indicating the transpose of the unitary matrix, while you give ##A=UDU^*##, possibly indicating the conjugate transpose. Which is it? I'm guessing it's the latter; otherwise none of the rest of the proof goes through. (##A=UDU^T## doesn't imply that ##A## is Hermitian.) However, this link:

    https://en.wikipedia.org/wiki/Matrix_decomposition#Takagi.27s_factorization

    seems to imply that it is the transpose, and not the conjugate transpose. It even mentions explicitly that it's not a special case of eigendecomposition.

    Assuming that the correct decomposition is ##A=UDU^*##, is the ##AA^*=A^*A## step required? It follows naturally from ##A=UDU^*##, but I think I can just start with ##A=UDU^*## directly and take the conjugate transpose. I get
    $$A^*=(UDU^*)^* = U(UD)^* = UD^*U^* = UDU^* = A,$$
    where ##D^* = D## because ##D## is real and diagonal.
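    As a sanity check on this argument, one can build a random ##A=UDU^*## with real ##D## and confirm it comes out Hermitian. A sketch assuming numpy (the QR trick for generating a unitary ##U## is just one convenient choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Build a random unitary U via QR of a random complex matrix
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(Z)
assert np.allclose(U.conj().T @ U, np.eye(n))

# Real positive diagonal D
D = np.diag(rng.uniform(1, 10, n))

# A = U D U^* is Hermitian whenever D is real
A = U @ D @ U.conj().T
assert np.allclose(A, A.conj().T)
```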
     
  4. Jan 26, 2017 #3

    TeethWhitener


    I think I found a counterexample:
    $$\begin{pmatrix}
    3+i & 2 \\
    2 & 3-i
    \end{pmatrix}$$
    The matrix is complex symmetric, and the eigenvalues are ##\lambda_1 = 3+\sqrt{3}## and ##\lambda_2 = 3-\sqrt{3}##, both of which are real and positive.

    EDIT: link to Wolframalpha:
    http://www.wolframalpha.com/input/?i=eigenvalues+{{3+i,2},{2,3-i}}
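    The counterexample is also easy to confirm numerically; a sketch assuming numpy:

```python
import numpy as np

A = np.array([[3 + 1j, 2],
              [2, 3 - 1j]])

# A is complex symmetric (equal to its transpose) but not Hermitian
assert np.allclose(A, A.T)
assert not np.allclose(A, A.conj().T)

# Its eigenvalues are 3 ± sqrt(3): real and positive, yet A is not real
w = np.linalg.eigvals(A)
assert np.allclose(w.imag, 0)
assert np.allclose(sorted(w.real), [3 - np.sqrt(3), 3 + np.sqrt(3)])
```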
     
  5. Jan 26, 2017 #4

    StoneTemplePython

    User Avatar
    Gold Member

    If your matrix itself is complex, I presume that the word you're looking for is Hermitian. Is that right?

    To my mind the most fundamental and constructive way to understand this is to learn the Schur decomposition. You may want to check out chapter 6 of Linear Algebra Done Wrong ( https://www.math.brown.edu/~treil/papers/LADW/book.pdf ). It's an immensely important result in linear algebra from both a theoretical and a practical standpoint: in effect (via the QR algorithm), it is how eigenvalues are actually computed numerically.

    Not everyone learns Schur though, for whatever reason. If you want to skip the building blocks and already have some familiarity with singular values, consider the SVD of some matrix ##A##; then, using that factorization, consider the SVD of ##A^H##; then, using those decompositions, consider what happens when you multiply ##A^H A## and what happens when you multiply ##A A^H##. Recall that singular values are always real and non-negative. https://en.wikipedia.org/wiki/Singular_value_decomposition
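    The SVD exercise suggested above can be sketched numerically (assuming numpy); the point is that the eigenvalues of ##A^H A## are the squared singular values of ##A##:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# SVD: A = U S V^H with real, non-negative singular values s
U, s, Vh = np.linalg.svd(A)

# Then A^H A = V S^2 V^H, so its eigenvalues are the squared singular values
w = np.linalg.eigvalsh(A.conj().T @ A)
assert np.allclose(sorted(w), sorted(s**2))
```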
     
  6. Jan 26, 2017 #5

    TeethWhitener


    Thank you for your help. This question was specifically aimed at complex symmetric matrices, which are not, in general, Hermitian.
     
  7. Jan 26, 2017 #6

    StoneTemplePython


    It was kind of hard to figure out what exactly the question was, since several of your posts in here seem to be replying to someone, but the post they respond to doesn't appear.

    I also struggled to figure out why you'd bring up Takagi's factorization, where ##A## is complex symmetric and ##A = VDV^T## with ##D## having non-negative real entries: it tells you nothing about the individual eigenvalues of ##A##, because ##V^T \neq V^{-1}## in general. (Obviously we can get summary comparisons with determinants, and you can bound the trace via ##\mathrm{tr}(AA) \leq \mathrm{tr}(AA^H)##, with equality only when ##A## is Hermitian (or, in the reals, symmetric).)
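    For what it's worth, that trace bound can be spot-checked on the counterexample matrix from earlier in the thread; a sketch assuming numpy (comparing the real part of ##\mathrm{tr}(AA)## against ##\mathrm{tr}(AA^H)##, which is my reading of the inequality):

```python
import numpy as np

# The thread's complex symmetric counterexample
A = np.array([[3 + 1j, 2],
              [2, 3 - 1j]])

# Re tr(AA) <= tr(AA^H), with equality iff A is Hermitian
lhs = np.trace(A @ A).real           # 24 for this matrix
rhs = np.trace(A @ A.conj().T).real  # 28 = sum of |a_ij|^2
assert lhs < rhs

# Equality holds for a Hermitian matrix
H = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
assert np.isclose(np.trace(H @ H).real, np.trace(H @ H.conj().T).real)
```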

    I suppose the question is resolved -- I guess this is what you were getting at with your counterexample post. But I only got this just now.
     
  8. Jan 26, 2017 #7

    TeethWhitener


    Yeah originally another user had posted between my first and second posts, but that post seems to have disappeared. Sorry for the confusion.
     
  9. Jan 26, 2017 #8

    FactChecker

    Science Advisor
    Gold Member

    In fact, a matrix with complex elements can not be both symmetric and Hermitian.
     
  10. Jan 27, 2017 #9

    TeethWhitener


    Well...imaginary elements. Teeeeeechnically the identity matrix is complex. :wink:
     