Is a symmetric matrix with positive eigenvalues always real?

SUMMARY

A symmetric matrix with positive eigenvalues is not necessarily real. The discussion highlights that while a symmetric matrix can be expressed in the form ##\mathbf{A} = \mathbf{O}^T\mathbf{D}\mathbf{O}##, where ##\mathbf{O}## is an orthogonal matrix and ##\mathbf{D}## is a diagonal matrix of eigenvalues, this does not guarantee that the original matrix ##\mathbf{A}## is real. A counterexample provided is the complex symmetric matrix $$\begin{pmatrix} 3+i & 2 \\ 2 & 3-i \end{pmatrix}$$, which has real positive eigenvalues but is not real itself. The discussion emphasizes the distinction between symmetric and Hermitian matrices, particularly in the context of complex matrices.

PREREQUISITES
  • Understanding of symmetric matrices and their properties
  • Knowledge of eigenvalues and eigenvectors
  • Familiarity with matrix decompositions, specifically Schur and singular value decomposition (SVD)
  • Basic concepts of linear algebra, including Hermitian matrices
NEXT STEPS
  • Study the Schur decomposition and its implications for eigenvalue computation
  • Learn about singular value decomposition (SVD) and its applications in matrix analysis
  • Explore the properties of complex symmetric matrices versus Hermitian matrices
  • Investigate Takagi's factorization and its relevance to complex symmetric matrices
USEFUL FOR

Mathematicians, linear algebra students, and researchers in fields requiring matrix analysis, particularly those working with complex matrices and eigenvalue problems.

TeethWhitener
I split off this question from the thread here:

https://www.physicsforums.com/threads/error-in-landau-lifshitz-mechanics.901356/

In that thread, I was told that a symmetric matrix ##\mathbf{A}## with real positive definite eigenvalues ##\{\lambda_i\} \in \mathbb{R}^+## is always real. I feel that I must be overlooking something simple, as I can't seem to prove it. Clearly the determinant and trace are positive, (so the matrix is nonsingular) and if it's diagonalizable, then the matrix is similar to a real matrix (namely the diagonal eigenvalue matrix). But I'm not seeing how this implies that the original ##\mathbf{A}## is real.

I've seen various claims that a symmetric matrix can be written as ##\mathbf{A} = \mathbf{O}^T\mathbf{D}\mathbf{O}##, where ##\mathbf{O}## is an orthogonal matrix and ##\mathbf{D}## is the diagonal eigenvalue matrix, but sometimes (e.g., Wikipedia) the claim is explicitly for real symmetric matrices, and sometimes (e.g., MathWorld) it is unspecified whether the claim holds for real matrices only or for all symmetric matrices. If it's true for all symmetric matrices, then we're done, because orthogonal matrices are necessarily real. But if it's only true for real symmetric matrices, then we're back to square one.
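For real symmetric matrices, the decomposition is easy to verify numerically. Here is a minimal NumPy sketch with an arbitrary example matrix; note that `np.linalg.eigh` returns the convention ##\mathbf{A} = \mathbf{O}\mathbf{D}\mathbf{O}^T##, the transposed orientation of the one above:

```python
import numpy as np

# An arbitrary real symmetric example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# For real symmetric input, eigh returns real eigenvalues and a real
# orthogonal eigenvector matrix O with A = O @ diag(eigvals) @ O.T.
eigvals, O = np.linalg.eigh(A)

assert np.allclose(O.T @ O, np.eye(2))   # O is orthogonal
A_rec = O @ np.diag(eigvals) @ O.T       # reconstruct A from the factors
assert np.allclose(A_rec, A)
```

This only exercises the real case, of course; it says nothing about whether the factorization extends to complex symmetric matrices.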

I also tried working out the case for 2x2 matrices explicitly. The characteristic polynomial is
$$0 = \lambda^2-\lambda\mathrm{tr}(\mathbf{A})+\mathrm{det}(\mathbf{A})$$
Since ##\lambda^2 > 0## and ##\lambda^2 = \lambda\,\mathrm{tr}(\mathbf{A})-\mathrm{det}(\mathbf{A})##, the quantity
$$\lambda(a_{11}+a_{22})-(a_{11}a_{22} - a_{12}^2)$$
must be real and positive. Writing this out explicitly in terms of complex entries gives:
$$\lambda((a+bi)+(c+di)) - [(a+bi)(c+di)-(e+fi)^2] > 0$$
where ##a,b,c,d,e,f \in \mathbb{R}##. To get rid of the imaginary part, we need to satisfy:
$$0 = \lambda(b+d)-bc-ad+2ef$$
Since ##\mathrm{tr}(\mathbf{A})## is real (the sum of the eigenvalues), we get ##b+d = 0##. Since ##\mathrm{det}(\mathbf{A})## is real (their product), we get ##bc+ad-2ef = 0##. Substituting ##d=-b##, we have two conditions to satisfy:
$$b(c-a)-2ef=0$$
ensures reality, and:
$$\lambda(a+c)>ac+b^2-e^2+f^2$$
ensures positivity.
However, I don't see how, in general, this implies ##b=d=f=0## so that the overall matrix is real. (EDIT: if one of ##b,d,f## is zero, then all of them must be zero by the first condition.)
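As a sanity check, the two conditions can be tested numerically on a candidate matrix. The entries below are hypothetical, chosen by hand so that ##b+d=0## and ##bc+ad-2ef=0## hold:

```python
import numpy as np

# Hypothetical entries: a11 = a + bi, a22 = c + di, a12 = a21 = e + fi.
a, b = 2.0, 1.0
c, d = 2.0, -1.0
e, f = 2.0, 0.0

A = np.array([[a + b*1j, e + f*1j],
              [e + f*1j, c + d*1j]])

# Reality conditions: Im tr(A) = 0 and Im det(A) = 0.
assert abs(b + d) < 1e-12
assert abs(b*c + a*d - 2*e*f) < 1e-12

# The eigenvalues come out real and positive, yet A itself is not real,
# hinting that the conditions alone do not force b = d = f = 0.
lam = np.linalg.eigvals(A)
assert np.allclose(lam.imag, 0, atol=1e-10)
assert (lam.real > 0).all()
assert np.iscomplexobj(A) and not np.allclose(A.imag, 0)
```

(The eigenvalues here are ##2\pm\sqrt{3}##, both real and positive.)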

It's a weird problem because I'm so used to dealing with either real symmetric matrices or complex Hermitian matrices, that I'm not sure what linear algebra rules apply to complex symmetric matrices. Thanks for any insight you can provide.
 
Thanks for your response. Quick clarification: I'm assuming you mean every complex symmetric matrix may be diagonalized with a unitary matrix.

Follow up questions:

The Wikipedia link gives ##A=UDU^T##, possibly indicating the transpose of the unitary matrix, while you give ##A=UDU^*##, possibly indicating the conjugate transpose. Which is it? I'm guessing it's the latter; otherwise none of the rest of the proof goes through. (##A=UDU^T## doesn't imply that ##A## is Hermitian.) However, this link:

https://en.wikipedia.org/wiki/Matrix_decomposition#Takagi.27s_factorization

seems to imply that it is the transpose, and not the conjugate transpose. It even mentions explicitly that it's not a special case of eigendecomposition.

Assuming that the correct decomposition is ##A=UDU^*##, is the ##AA^*=A^*A## step required? It follows naturally from ##A=UDU^*##, but I think I can just start with ##A=UDU^*## directly and take the conjugate transpose. I get
$$A^*=(UDU^*)^* = U(UD)^* = UD^*U^* = UDU^* = A$$
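That chain can be sketched numerically: with a random unitary ##U## and a hypothetical real diagonal ##D##, the product ##UDU^*## always comes out Hermitian.

```python
import numpy as np

# Sketch: A = U D U^* with U unitary and D real diagonal forces A^* = A.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(X)            # QR of a random complex matrix gives a unitary Q
D = np.diag([1.0, 2.0, 3.0])      # hypothetical real, positive eigenvalues

A = U @ D @ U.conj().T
assert np.allclose(A, A.conj().T)  # A is Hermitian, as the chain above shows
```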
 
I think I found a counterexample:
$$\begin{pmatrix}
3+i & 2 \\
2 & 3-i
\end{pmatrix}$$
The matrix is complex symmetric and the eigenvalues are ##\lambda_1 = 3+\sqrt{3}## and ##\lambda_2 = 3-\sqrt{3}##, both of which are real and positive.

EDIT: link to Wolframalpha:
http://www.wolframalpha.com/input/?i=eigenvalues+{{3+i,2},{2,3-i}}
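For anyone who wants to verify the counterexample offline, a short NumPy check:

```python
import numpy as np

A = np.array([[3 + 1j, 2],
              [2, 3 - 1j]])

# Complex symmetric (A = A^T) but neither Hermitian nor real.
assert np.allclose(A, A.T)
assert not np.allclose(A, A.conj().T)

# Eigenvalues are 3 +/- sqrt(3): real and positive.
raw = np.linalg.eigvals(A)
assert np.allclose(raw.imag, 0, atol=1e-10)
lam = np.sort(raw.real)
assert np.allclose(lam, [3 - np.sqrt(3), 3 + np.sqrt(3)])
```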
 
If your matrix itself is complex, I presume that the word you're looking for is Hermitian. Is that right?

To my mind the most fundamental and constructive way to understand this is by learning the Schur decomposition. You may want to check out chapter 6 of Linear Algebra Done Wrong ( https://www.math.brown.edu/~treil/papers/LADW/book.pdf ). It's an immensely important result in linear algebra, both theoretically and practically: in effect (via the QR algorithm), it is how eigenvalues are actually computed numerically.

Not everyone learns Schur though, for whatever reason. If you want to skip the building blocks and already have some familiarity with singular values, consider the SVD of some matrix ##A##, and then, using that factorization, consider the SVD of ##A^H##. Using those decompositions, consider what happens when you multiply ##A^H A## and also what happens when you multiply ##A A^H##. Recall that singular values are always real and non-negative. https://en.wikipedia.org/wiki/Singular_value_decomposition
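The exercise above can be sketched in NumPy (random complex ##A##; the seed is arbitrary):

```python
import numpy as np

# If A = U S V^H is the SVD, then A^H A = V S^2 V^H, so the eigenvalues
# of the Hermitian matrix A^H A are the squared singular values.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

U, s, Vh = np.linalg.svd(A)
assert np.all(s >= 0)   # singular values: real and non-negative

evals = np.sort(np.linalg.eigvalsh(A.conj().T @ A))
assert np.allclose(evals, np.sort(s**2))
```

The same calculation with ##AA^H## gives ##US^2U^H##, so both products share the squared singular values as their (real, non-negative) eigenvalues.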
 
Thank you for your help. This question was specifically aimed at complex symmetric matrices, which are not, in general, Hermitian.
 
It was kind of hard to figure out what exactly the question was, as I see multiple responses to yourself in here that seem to respond to yourself but also not to yourself.

I also struggled to figure out why you'd bring up Takagi's factorization, where ##A## is complex symmetric and ##A = VDV^T## with ##D## having non-negative real entries: it tells you nothing about the individual eigenvalues of ##A##, because ##V^T \neq V^{-1}##. (Obviously we can get summary comparisons with determinants, and you can bound the trace, where ##\mathrm{trace}(AA) \leq \mathrm{trace}(AA^H)## with equality only when ##A## is Hermitian (or, over the reals, symmetric).)
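For what it's worth, that trace bound can be checked on the counterexample from earlier in the thread (reading it as ##\mathrm{Re}\,\mathrm{tr}(AA) \leq \mathrm{tr}(AA^H)##, since ##\mathrm{tr}(AA)## need not be real for a general complex symmetric ##A##):

```python
import numpy as np

A = np.array([[3 + 1j, 2],
              [2, 3 - 1j]])

t1 = np.trace(A @ A)             # tr(AA); equals 24 for this matrix
t2 = np.trace(A @ A.conj().T)    # tr(AA^H) = sum of |a_ij|^2; 28 here
assert abs(t2.imag) < 1e-12      # tr(AA^H) is always real
assert t1.real <= t2.real        # strict here, since A is not Hermitian
```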

I suppose the question is resolved -- I guess this is what you were getting at with your counterexample post, but I only saw it just now.
 
StoneTemplePython said:
It was kind of hard to figure out what exactly the question was, as I see multiple responses to yourself in here that seem to respond to yourself but also not to yourself.
Yeah, originally another user had posted between my first and second posts, but that post seems to have disappeared. Sorry for the confusion.
 
TeethWhitener said:
Thank you for your help. This question was specifically aimed at complex symmetric matrices, which are not, in general, Hermitian.
In fact, a matrix with complex elements cannot be both symmetric and Hermitian.
 
FactChecker said:
In fact, a matrix with complex elements cannot be both symmetric and Hermitian.
Well...imaginary elements. Teeeeeechnically the identity matrix is complex. :wink:
 
