Is a symmetric matrix with positive eigenvalues always real?

In summary: the claim is false once complex symmetric matrices are allowed; a 2×2 counterexample appears in post #3 below. Takagi's factorization of a complex symmetric matrix is ##A = UDU^T## (with ##U## unitary and ##D## non-negative diagonal), not ##A = UDU^*##; since in general ##U^T \neq U^{-1}##, it says nothing about the eigenvalues, whereas ##A = UDU^*## would force ##A## to be Hermitian.
  • #1
TeethWhitener
I split off this question from the thread here:

https://www.physicsforums.com/threads/error-in-landau-lifshitz-mechanics.901356/

In that thread, I was told that a symmetric matrix ##\mathbf{A}## with real, positive eigenvalues ##\lambda_i \in \mathbb{R}^+## is always real. I feel that I must be overlooking something simple, as I can't seem to prove it. Clearly the determinant and trace are positive (so the matrix is nonsingular), and if it's diagonalizable, then the matrix is similar to a real matrix (namely the diagonal eigenvalue matrix). But I'm not seeing how this implies that the original ##\mathbf{A}## is real.

I've seen various claims that a symmetric matrix can be written as ##\mathbf{A} = \mathbf{O}^T\mathbf{D}\mathbf{O}##, where ##\mathbf{O}## is an orthogonal matrix and ##\mathbf{D}## is the diagonal eigenvalue matrix, but sometimes (e.g., Wikipedia) the claim is made explicitly for real symmetric matrices, and sometimes (e.g., MathWorld) it is left unspecified whether it covers only real matrices or all symmetric matrices. If it's true for all symmetric matrices, then we're done, because orthogonal matrices are necessarily real. But if it's only true for real symmetric matrices, then we're back to square one.
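For the real case, at least, the factorization is easy to check numerically. A minimal numpy sketch (my own check, not from either reference; `np.linalg.eigh` assumes its input is symmetric/Hermitian):

```python
import numpy as np

# Build an arbitrary real symmetric matrix.
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2

# eigh is meant for symmetric/Hermitian input; columns of V are eigenvectors.
w, V = np.linalg.eigh(A)
O = V.T                                 # rows are eigenvectors, so A = O^T D O
D = np.diag(w)

print(np.allclose(A, O.T @ D @ O))      # True: the factorization holds
print(np.allclose(O @ O.T, np.eye(4)))  # True: O is orthogonal (and real)
```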

I also tried working out the case for 2x2 matrices explicitly. The characteristic polynomial is
$$0 = \lambda^2-\lambda\mathrm{tr}(\mathbf{A})+\mathrm{det}(\mathbf{A})$$
Since each ##\lambda## is real with ##\lambda^2 > 0##, the quantity
$$\lambda(a_{11}+a_{22})-(a_{11}a_{22} - a_{12}^2)$$
must be real and positive. Writing the entries out explicitly as complex numbers gives:
$$\lambda((a+bi)+(c+di)) > [(a+bi)(c+di)-(e+fi)^2]$$
where ##a,b,c,d,e,f \in \mathbb{R}##. To get rid of the imaginary part, we need to satisfy:
$$0 = \lambda(b+d)-bc-ad+2ef$$
Since ##\mathrm{tr}(\mathbf{A}) = \lambda_1 + \lambda_2## is real (and positive), we get ##b+d = 0##. Since ##\det(\mathbf{A}) = \lambda_1\lambda_2## is real (and positive), we get ##bc+ad-2ef = 0##. So we have two conditions to satisfy:
$$b(c-a)-2ef=0$$
ensures reality, and:
$$\lambda(a+c)>ac+b^2-e^2+f^2$$
ensures positivity.
However, I don't see how, in general, this implies ##b=d=f=0## so that the overall matrix is real. (EDIT: if ##b=0##, then ##d=0## as well, and the first condition then forces ##ef=0##.)
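To make the two conditions concrete, here's a small numpy sketch with entries of my own choosing that satisfy both conditions with ##b, f \neq 0##; the eigenvalues come out real and positive even though the matrix is plainly not real:

```python
import numpy as np

# Choose a,b,c,e,f so that the determinant condition b(c-a) - 2ef = 0
# holds with b and f nonzero; d is fixed by the trace condition.
a, b, c, e, f = 1.0, 1.0, 3.0, 1.0, 1.0
d = -b                                   # trace condition: b + d = 0
assert abs(b*(c - a) - 2*e*f) < 1e-12    # determinant condition

A = np.array([[a + 1j*b, e + 1j*f],
              [e + 1j*f, c + 1j*d]])
print(np.allclose(A, A.T))               # True: complex symmetric
print(np.linalg.eigvals(A))              # approx. [2, 2]: real, positive, yet A is not real
```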

It's a weird problem because I'm so used to dealing with either real symmetric matrices or complex Hermitian matrices, that I'm not sure what linear algebra rules apply to complex symmetric matrices. Thanks for any insight you can provide.
 
  • #2
Thanks for your response. Quick clarification: I'm assuming you mean every complex symmetric matrix may be diagonalized with a unitary matrix.

Follow up questions:

The Wikipedia link gives ##A=UDU^T##, apparently meaning the transpose of the unitary matrix, while you give ##A=UDU^*##, apparently meaning the conjugate transpose. Which is it? I'm guessing it's the latter; otherwise none of the rest of the proof goes through. (##A=UDU^T## doesn't imply that ##A## is Hermitian.) However, this link:

https://en.wikipedia.org/wiki/Matrix_decomposition#Takagi.27s_factorization

seems to imply that it is the transpose, and not the conjugate transpose. It even mentions explicitly that it's not a special case of eigendecomposition.

Assuming that the correct decomposition is ##A=UDU^*##, is the ##AA^*=A^*A## step required? It follows naturally from ##A=UDU^*##, but I think I can just start with ##A=UDU^*## directly and take the conjugate transpose. Since ##D## is real and diagonal, ##D^*=D##, and I get
$$A^*=(UDU^*)^* = U(UD)^* = UD^*U^* = UDU^* = A$$
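A quick numerical check of this algebra (my own sketch; the random unitary comes from a QR factorization):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(M)                 # Q factor of a complex matrix is unitary
D = np.diag(rng.uniform(1.0, 2.0, 3))  # real positive diagonal

A = U @ D @ U.conj().T                 # A = U D U^*
print(np.allclose(A, A.conj().T))      # True: A is Hermitian (A^* = A)
print(np.allclose(A, A.T))             # False in general: A need not be symmetric
```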
 
  • #3
I think I found a counterexample:
$$\begin{pmatrix}
3+i & 2 \\
2 & 3-i
\end{pmatrix}$$
The matrix is complex symmetric (but clearly not real), and the eigenvalues are ##\lambda_1 = 3+\sqrt{3}## and ##\lambda_2 = 3-\sqrt{3}##, both real and positive.

EDIT: link to Wolframalpha:
http://www.wolframalpha.com/input/?i=eigenvalues+{{3+i,2},{2,3-i}}
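The counterexample is easy to confirm with numpy as well (a quick sketch):

```python
import numpy as np

A = np.array([[3 + 1j, 2],
              [2, 3 - 1j]])
print(np.allclose(A, A.T))   # True: complex symmetric
print(np.linalg.eigvals(A))  # approx. 4.732 and 1.268, i.e. 3 +/- sqrt(3)
```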
 
  • #4
If your matrix itself is complex, I presume that the word you're looking for is Hermitian. Is that right?

To my mind the most fundamental and constructive way to understand this is by learning the Schur decomposition. You may want to check out chapter 6 of Linear Algebra Done Wrong ( https://www.math.brown.edu/~treil/papers/LADW/book.pdf ). It's an immensely important result in linear algebra, both theoretically and practically: via the QR algorithm, it is in effect how eigenvalues are actually computed numerically.

Not everyone learns Schur, though, for whatever reason. If you want to skip the building blocks and already have some familiarity with singular values, consider the SVD of some matrix ##A##; then, using that factorization, consider the SVD of ##A^H##; then consider what happens when you multiply ##A^H A## and also when you multiply ##A A^H##. Recall that singular values are always real and non-negative. https://en.wikipedia.org/wiki/Singular_value_decomposition
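For concreteness, here is a short sketch of both suggestions (my own test matrix; `schur` is from scipy):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Schur: A = Q T Q^H with Q unitary and T upper triangular;
# the eigenvalues of A sit on the diagonal of T.
T, Q = schur(A, output='complex')
print(np.allclose(A, Q @ T @ Q.conj().T))
print(np.allclose(np.sort_complex(np.diag(T)),
                  np.sort_complex(np.linalg.eigvals(A))))

# SVD: A = U S V^H, so A^H A = V S^2 V^H and A A^H = U S^2 U^H.
# Both products are Hermitian with the same non-negative eigenvalues S^2.
s = np.linalg.svd(A, compute_uv=False)
print(np.allclose(np.linalg.eigvalsh(A.conj().T @ A), np.sort(s**2)))
print(np.allclose(np.linalg.eigvalsh(A @ A.conj().T), np.sort(s**2)))
```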
 
  • #5
Thank you for your help. This question was specifically aimed at complex symmetric matrices, which are not, in general, Hermitian.
 
  • #6
It was kind of hard to figure out what exactly the question was, as I see multiple posts from you in here that seem to respond to yourself but also not to yourself.

I also struggled to figure out why you'd bring up Takagi's factorization, where ##A## is complex and symmetric and ##A = VDV^T## with ##D## having non-negative real entries: it tells you nothing about the individual eigenvalues of ##A##, because in general ##V^T \neq V^{-1}##. (Obviously we can get summary comparisons with determinants, and you can bound the trace, where ##\operatorname{Re}\operatorname{tr}(AA) \leq \operatorname{tr}(AA^H)##, with equality only when ##A## is Hermitian (or, over the reals, symmetric).)
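A quick illustration on the counterexample matrix from post #3 (my own sketch): the diagonal in Takagi's factorization holds the singular values, which here differ from the eigenvalues.

```python
import numpy as np

A = np.array([[3 + 1j, 2],
              [2, 3 - 1j]])
print(np.linalg.eigvals(A))                # approx. 4.732 and 1.268 (3 +/- sqrt(3))
print(np.linalg.svd(A, compute_uv=False))  # approx. 5.162 and 1.162 -- not the eigenvalues

# The trace bound mentioned above, on this example:
print(np.trace(A @ A).real, np.trace(A @ A.conj().T).real)  # 24.0 <= 28.0
```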

I suppose the question is resolved -- I guess this is what you were getting at with your counterexample post. But I only saw it just now.
 
  • #7
StoneTemplePython said:
It was kind of hard to figure out what exactly the question was, as I see multiple posts from you in here that seem to respond to yourself but also not to yourself.
Yeah, originally another user had posted between my first and second posts, but that post seems to have disappeared. Sorry for the confusion.
 
  • #8
TeethWhitener said:
Thank you for your help. This question was specifically aimed at complex symmetric matrices, which are not, in general, Hermitian.
In fact, a matrix with complex elements cannot be both symmetric and Hermitian.
 
  • Like
Likes TeethWhitener
  • #9
FactChecker said:
In fact, a matrix with complex elements cannot be both symmetric and Hermitian.
Well...imaginary elements. Teeeeeechnically the identity matrix is complex. :wink:
 

1. What is a symmetric matrix?

A symmetric matrix is a square matrix that equals its own transpose: ##\mathbf{A} = \mathbf{A}^T##, i.e., the entry in row ##i##, column ##j## equals the entry in row ##j##, column ##i##. The matrix is mirrored across its main diagonal.

2. What are eigenvalues?

Eigenvalues are the scalars ##\lambda## for which ##\mathbf{A}\mathbf{v} = \lambda\mathbf{v}## holds for some nonzero vector ##\mathbf{v}## (an eigenvector). They give the factor by which the matrix stretches each eigenvector and are central to understanding the matrix's behavior as a linear transformation.

3. Why are positive eigenvalues significant?

Positive eigenvalues mean the matrix scales every eigenvector by a positive factor, so no direction is reversed or collapsed. For real symmetric (or Hermitian) matrices this is precisely positive definiteness, a stability property important in many scientific and mathematical applications.

4. Can a symmetric matrix with positive eigenvalues ever be not real?

Yes, if complex entries are allowed. The thread above gives a counterexample: the complex symmetric matrix in post #3 has eigenvalues ##3+\sqrt{3}## and ##3-\sqrt{3}##, both real and positive, yet the matrix itself is not real. The conclusion that the matrix is real only follows when it is assumed real (or Hermitian) from the start.

5. Are there any exceptions to a symmetric matrix with positive eigenvalues always being real?

Yes. The statement only holds when the matrix is assumed real to begin with: a complex symmetric matrix can have real, positive eigenvalues while some of its entries have nonzero imaginary parts. In most scientific and mathematical applications, however, symmetric matrices with positive eigenvalues are taken to be real, or the relevant notion is the Hermitian (positive definite) one.
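One way to see where such exceptions come from, sketched in numpy (my own construction, using a complex orthogonal matrix, i.e. ##Q^TQ = I## with complex entries): conjugating a positive diagonal matrix by a complex orthogonal ##Q## preserves both the symmetry and the spectrum, but destroys reality.

```python
import numpy as np

# A "rotation" by a complex angle z: cos^2 z + sin^2 z = 1 still holds,
# so Q^T Q = I even though Q has complex entries.
z = 1j
c, s = np.cos(z), np.sin(z)
Q = np.array([[c, -s],
              [s,  c]])
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is complex orthogonal

D = np.diag([2.0, 5.0])                 # prescribed positive eigenvalues
A = Q @ D @ Q.T                         # similar to D, since Q^T = Q^{-1}
print(np.allclose(A, A.T))              # True: A is complex symmetric
print(np.linalg.eigvals(A))             # eigenvalues 2 and 5 (real, positive), yet A is not real
```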
