Prove matrix has all real eigenvalues

In summary: one consequence of the spectral theorem for real symmetric matrices is that every real symmetric matrix has only real eigenvalues.
  • #1
Jameson
Problem: Let $A$ be an $n \times n$ matrix with real entries. Prove that if $A$ is symmetric, that is, $A = A^T$, then all eigenvalues of $A$ are real.

Attempt: I'm not seeing how to approach this problem. I know that to calculate the eigenvalues of a matrix I need to solve $\det(A-\lambda I)=0$, and I have experience calculating them, but I've never seen any commentary on whether the values will be real or complex. Any ideas to get started?
 
  • #2
Jameson said:
Problem: Let $A$ be an $n \times n$ matrix (with real entries). Prove that if $A$ is symmetric, that is, $A = A^T$, then all eigenvalues of $A$ are real.
Think of $A$ as acting on the complex inner-product space $\mathbb{C}^n$ (whose inner product satisfies $\langle y,x\rangle = \overline{\langle x,y\rangle}$). If $\lambda$ is an eigenvalue of $A$, with eigenvector $x$, then $$\lambda\langle x,x\rangle = \langle Ax,x\rangle = \langle x,A^{\mathrm{\scriptsize T}}x\rangle = \langle x,Ax\rangle = \overline{\langle Ax,x\rangle} = \overline{\lambda}\langle x,x\rangle.$$ Since $x$ is an eigenvector it is nonzero, so $\langle x,x\rangle \neq 0$, and dividing through gives $\overline{\lambda} = \lambda$, that is, $\lambda$ is real.
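
As a quick numerical sanity check of this argument (an editorial sketch, not part of the proof; the matrix size and seed are arbitrary choices), one can build a random real symmetric matrix in NumPy and confirm that every computed eigenvalue is real:

```python
import numpy as np

rng = np.random.default_rng(42)

# B + B^T is symmetric for any square matrix B.
B = rng.standard_normal((5, 5))
A = B + B.T

# eigvals makes no symmetry assumption, yet every eigenvalue comes out real.
eigenvalues = np.linalg.eigvals(A)
print(np.allclose(eigenvalues.imag, 0))  # True
```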
 
  • #3
Thank you very much for the quick reply, Opalg! I don't believe this is the intended method for the problem, though, since complex analysis and complex vector spaces aren't prerequisites for the course. I'll take some time to read over your solution and digest it, but I'm still looking for another potential method.
 
  • #4
Jameson said:
Problem: Let $A$ be an $n \times n$ matrix. Prove that if $A$ is symmetric, that is, $A = A^T$, then all eigenvalues of $A$ are real.

I suppose there is a typo. In order to be precise: "Let $A$ be a real $n \times n$ matrix ...". Otherwise, the result is false. Choose for example $A=\text{diag}(1,i).$ :)
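
A short NumPy check of this counterexample (a sketch; the variable names are mine) confirms that $\text{diag}(1,i)$ is symmetric yet has the non-real eigenvalue $i$:

```python
import numpy as np

A = np.diag([1, 1j])                 # symmetric, but with a complex entry
print(np.array_equal(A, A.T))        # True: A = A^T
print(np.linalg.eigvals(A))          # [1.+0.j 0.+1.j] -- the eigenvalue i is not real
```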
 
  • #5
Yes, you are correct, and my apologies for this error. :eek: I've fixed the OP now.
 
  • #6
The way that Opalg approached the problem is the most common way that I've seen it done.

So, if you don't want to use complex numbers or spaces...

The other way to do this then would be to change the statement slightly and prove the following:

An $n\times n$ matrix $A$ has all real eigenvalues and $n$ mutually orthogonal real eigenvectors if and only if $A$ is real symmetric.

The proof of the revised statement would require a lot more (and I do mean a lot more) work than the way Opalg presented.
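
For what it's worth, the content of that revised statement is easy to observe numerically (a NumPy sketch under my own choice of matrix and seed, not the proof itself): `np.linalg.eigh`, which is specialized to symmetric/Hermitian input, returns real eigenvalues together with an orthonormal set of real eigenvectors, and these reassemble $A$:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B + B.T                                   # real symmetric

# eigh returns real eigenvalues w and an orthogonal eigenvector matrix Q.
w, Q = np.linalg.eigh(A)
print(w.dtype)                                # float64: the eigenvalues are real
print(np.allclose(Q.T @ Q, np.eye(4)))        # True: columns are orthonormal
print(np.allclose(A, Q @ np.diag(w) @ Q.T))   # True: A = Q diag(w) Q^T
```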
 
  • #7
If you're talking about roots of real polynomials, it's really hard to avoid a discussion of complex numbers.
 
  • #8
My mistake, everybody, and thank you for the reassurance. :) Seems I have some reading to do. I took linear algebra last semester but we didn't cover this method at all, so it's completely new material. I'll try to work through Opalg's solution.
 
  • #9
Hi Jameson!

What you are asking for is the proof of a general theorem called the "spectral theorem for real symmetric matrices".

Its proof is far from trivial, as you can see if you google for it, although Opalg's proof is quite elegant.
His proof is a bit concise though. I had to puzzle a bit to understand why the steps he's taking are valid.

First step is to realize that the characteristic polynomial you have, $\det(A-\lambda I)=0$, is a polynomial equation of degree $n$, meaning it has $n$ (possibly repeated) roots if we allow complex numbers.
What's left to prove is that these eigenvalues are actually real numbers.
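
To make that first step concrete (a NumPy sketch; the matrix is an arbitrary non-symmetric example), one can read off the coefficients of the degree-$n$ characteristic polynomial and check that its $n$ complex roots coincide with the eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))          # generic real matrix, not symmetric

# np.poly(A) gives the coefficients of det(lambda*I - A), a degree-3 polynomial.
coeffs = np.poly(A)
roots = np.roots(coeffs)                 # its 3 (possibly complex) roots

print(np.sort_complex(roots))
print(np.sort_complex(np.linalg.eigvals(A)))  # the same values
```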
 
  • #10
If you work with complex numbers from the start, you can actually say a bit more:

Every Hermitian matrix has real eigenvalues (Opalg's proof goes through as before).

As ILikeSerena has indicated, this is a consequence of the spectral theorem for normal matrices: every normal matrix is unitarily diagonalizable, and every unitarily diagonalizable matrix is normal

(normal matrices are those for which \(\displaystyle AA^H = A^HA\)).
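
A small NumPy illustration of both points (a sketch; the construction $B + B^H$ is just one convenient way to obtain a Hermitian matrix): a Hermitian matrix with genuinely complex entries is normal and still has purely real eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = B + B.conj().T                                    # Hermitian: H = H^H

print(np.allclose(H, H.conj().T))                     # True: Hermitian
print(np.allclose(H @ H.conj().T, H.conj().T @ H))    # True: hence normal
print(np.allclose(np.linalg.eigvals(H).imag, 0))      # True: real eigenvalues
```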

This is another instance of a statement about real numbers that makes more sense when you consider the wider context of complex numbers.

As far as I understand it (which is poorly), physicists prefer to work with complex vector spaces, because you can always restrict to the special case of real numbers, and complex numbers are in a sense "more complete". There's not much more difficulty involved in doing so: many of the results in linear algebra hold for an arbitrary field (there are some special cases where a field of characteristic 2 is inappropriate), and the usual definition of an inner product in a vector space over \(\displaystyle \Bbb C\) reduces to a real-valued inner product when the scalars and coordinates are real.

What I'm trying to get across here is the idea that the "algebra" part of linear algebra is based on the classical operations of addition, subtraction, multiplication and division (operations with matrices are "built up" from operations of these kinds on individual entries), and a field is precisely the kind of algebraic object we can do those things IN.

For 99% of the examples one actually encounters in practice, the algebraic closure of the rational numbers would suffice (we rarely work with the full spectrum of transcendental numbers; a few important ones keep popping up). It is common practice when studying inner product spaces to assume the underlying field is a subfield of \(\displaystyle \Bbb C\), and it is perhaps unfortunate that in introductory courses so much emphasis is given to \(\displaystyle \Bbb R^n\), when you can do more without any extra effort.
 
  • #11
Opalg said:
Think of $A$ as acting on the complex inner-product space $\mathbb{C}^n$ (whose inner product satisfies $\langle y,x\rangle = \overline{\langle x,y\rangle}$). If $\lambda$ is an eigenvalue of $A$, with eigenvector $x$, then $$\lambda\langle x,x\rangle = \langle Ax,x\rangle = \langle x,A^{\mathrm{\scriptsize T}}x\rangle = \langle x,Ax\rangle = \overline{\langle Ax,x\rangle} = \overline{\lambda}\langle x,x\rangle,$$ and so $\overline{\lambda} = \lambda$.

Ok, I've done some reading on inner-product spaces and feel comfortable with the basic axioms and some other conclusions drawn from them. The steps I don't follow are $ \langle Ax,x\rangle = \langle x,A^{\mathrm{\scriptsize T}}x\rangle = \langle x,Ax\rangle$. The first equality in the chain and the last two are clear, but these two are not. Can someone explain?
 
  • #12
In some developments, one actually DEFINES the transpose of a square matrix \(\displaystyle A \) as the unique matrix \(\displaystyle B\) such that:

\(\displaystyle \langle Ax,y \rangle = \langle x,By \rangle, \forall x,y \)

(this assumes a REAL inner product space; quantifying over both vectors is what pins \(\displaystyle B\) down uniquely).

However, we can follow this entry-by-entry:

\(\displaystyle \langle Ax,x \rangle = \sum_i\left(\sum_j a_{ij}x_j\right)x_i\)

\(\displaystyle = a_{11}x_1x_1 + a_{12}x_2x_1 + \cdots + a_{1n}x_nx_1 + a_{21}x_1x_2 + a_{22}x_2x_2 + \cdots + a_{2n}x_nx_2 +\)

\(\displaystyle \cdots + a_{n1}x_1x_n + a_{n2}x_2x_n + \cdots + a_{nn}x_nx_n\)

\(\displaystyle = x_1(a_{11}x_1 + a_{21}x_2 + \cdots + a_{n1}x_n) + x_2(a_{12}x_1 + a_{22}x_2 + \cdots + a_{n2}x_n) +\)

\(\displaystyle \cdots + x_n(a_{1n}x_1 + a_{2n}x_2 + \cdots + a_{nn}x_n)\) <---plucking out the \(\displaystyle x_j\) in the "middle position"

\(\displaystyle = \sum_j x_j\left(\sum_i a_{ij}x_i \right) = \langle x,A^Tx \rangle\), since \(\displaystyle \sum_i a_{ij}x_i = (A^Tx)_j\).

The equality \(\displaystyle \langle x,A^Tx \rangle = \langle x,Ax \rangle \) then follows because \(\displaystyle A\) is symmetric, and thus \(\displaystyle A^T = A\).

(Note: this assumes the \(\displaystyle x_j\) are the coordinates of \(\displaystyle x\) in some basis (the standard basis works well), and that the \(\displaystyle a_{ij}\) are the entries of \(\displaystyle A\) in that same basis).

(Note #2: I hope it's clear that when we rearrange the order of the \(\displaystyle x_i\)'s and the \(\displaystyle x_j\)'s in the complex case, the sesquilinearity of the complex inner product requires we use \(\displaystyle \overline{a_{ij}}\) instead, leading to:

\(\displaystyle \langle Ax,x \rangle = \langle x,A^Hx \rangle\)

This reduces to the symmetric case for the reals, since real numbers are self-conjugate).
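
These identities are easy to spot-check numerically (a sketch; the helper `inner` is my own, matching the convention $\langle u,v \rangle = v^\dagger u$ used in the next post):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)

def inner(u, v):
    # <u, v> = v^dagger u; note np.vdot conjugates its *first* argument.
    return np.vdot(v, u)

AH = A.conj().T  # the conjugate transpose A^H

# <Ax, x> = <x, A^H x> holds for every complex matrix A.
print(np.isclose(inner(A @ x, x), inner(x, AH @ x)))  # True
```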
 
  • #13
Alternatively:

The standard inner product is given by $\langle x,y \rangle = y^\dagger x$, where $\dagger$ denotes the conjugate transpose.
Since A is a real matrix, its conjugate transpose is the same as its transpose.
And since A is symmetric, its transpose is the same as A: $A^\dagger = A^T = A$.

So:
$$\langle Ax,x \rangle = x^\dagger(Ax) = (x^\dagger A)x = (A^\dagger x)^\dagger x = \langle x, A^\dagger x \rangle = \langle x, A^T x \rangle = \langle x, A x \rangle$$

Oh, and you may already know that in general $(MN)^\dagger = N^\dagger M^\dagger$ (just as $(MN)^T = N^TM^T$), which is the reason that we have here that $(x^\dagger A) = (A^\dagger x)^\dagger$.
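
Both product rules can be verified numerically in one short NumPy sketch (the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
N = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

dagger = lambda X: X.conj().T

print(np.allclose((M @ N).T, N.T @ M.T))                  # (MN)^T = N^T M^T
print(np.allclose(dagger(M @ N), dagger(N) @ dagger(M)))  # (MN)^dag = N^dag M^dag
```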
 

1. What does it mean for a matrix to have real eigenvalues?

Having real eigenvalues means that all of the eigenvalues of the matrix are real numbers. Equivalently, every root of the characteristic polynomial $\det(A - \lambda I) = 0$ is real; if the matrix happens to be diagonalizable, the resulting diagonal matrix then has only real numbers on the diagonal.

2. How do you prove that a matrix has all real eigenvalues?

The eigenvalues of a matrix are the roots of its characteristic polynomial $\det(A - \lambda I) = 0$. For small matrices one can find the roots directly, for example with the quadratic formula for a $2 \times 2$ matrix, and check that they are real. In general, one argues structurally, as in this thread: if $A$ is real symmetric (or, more generally, Hermitian), the inner-product argument shows that every eigenvalue equals its own complex conjugate and is therefore real.
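
For the $2 \times 2$ case mentioned above, a short SymPy computation (a sketch; the symbol names are mine) shows why the quadratic formula always yields real roots for a symmetric matrix: the discriminant of the characteristic polynomial is a sum of squares, hence nonnegative:

```python
import sympy as sp

a, b, c = sp.symbols('a b c', real=True)
lam = sp.symbols('lambda')

# General 2x2 real symmetric matrix and its characteristic polynomial.
A = sp.Matrix([[a, b], [b, c]])
p = sp.expand((A - lam * sp.eye(2)).det())

# The discriminant (a + c)^2 - 4(ac - b^2) equals (a - c)^2 + 4b^2 >= 0.
disc = sp.discriminant(p, lam)
print(sp.expand(disc))  # a**2 - 2*a*c + 4*b**2 + c**2
```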

3. Can a matrix have both real and imaginary eigenvalues?

Yes, a matrix can have both real and non-real eigenvalues, but for a real matrix this can only happen when the matrix is not symmetric. The non-real eigenvalues of a real matrix always come in complex-conjugate pairs; a standard example is a $2 \times 2$ rotation matrix, whose eigenvalues are $\cos\theta \pm i\sin\theta$.
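
A minimal NumPy sketch of the rotation example (the angle is an arbitrary choice):

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # real, but not symmetric

print(np.linalg.eigvals(R))   # the conjugate pair cos(theta) +/- i*sin(theta)
```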

4. What are some real-life applications of matrices with real eigenvalues?

Matrices with real eigenvalues have various real-life applications, such as in physics and engineering. For example, in quantum mechanics, the energy levels of a system are the eigenvalues of a Hermitian matrix (the Hamiltonian), which is exactly why they are guaranteed to be real. In structural engineering, the eigenvalues of symmetric stiffness and mass matrices determine the natural frequencies of a structure, which is important for earthquake analysis.

5. Can a matrix have an infinite number of real eigenvalues?

No, a matrix cannot have an infinite number of eigenvalues. An $n \times n$ matrix has exactly $n$ eigenvalues over the complex numbers, counted with multiplicity, so it has at most $n$ distinct eigenvalues. Eigenvalues can be repeated, however, so several of the $n$ may coincide.
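
A trivial NumPy sketch of a repeated eigenvalue:

```python
import numpy as np

A = 2 * np.eye(3)               # 3x3, eigenvalue 2 with multiplicity 3
print(np.linalg.eigvals(A))     # [2. 2. 2.]: exactly n = 3 eigenvalues, all equal
```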
