Eigenvalues of a symmetric operator

psholtz
I'm reading from Wikipedia:
The spectrum of any bounded symmetric operator is real; in particular all its eigenvalues are real, although a symmetric operator may have no eigenvalues.

http://en.wikipedia.org/wiki/Self-adjoint_operator

I thought linear operators always had eigenvalues, since you can always form the characteristic equation of the corresponding matrix and solve it.

Is that not the case? Are there linear operators that don't have eigenvalues?
 
A 2D rotation matrix has no real eigenvalues. You can write down the characteristic equation, but you quickly find it has no real solutions.
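
To spell that out: for a rotation through angle \phi the characteristic equation works out to

(\cos\phi - \lambda)^2 + \sin^2\phi = \lambda^2 - 2\lambda\cos\phi + 1 = 0

and its discriminant 4\cos^2\phi - 4 = -4\sin^2\phi is negative whenever \sin\phi \neq 0, so there are no real roots.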
 
That's true, but a 2D rotation matrix still has eigenvalues; they just aren't real. The eigenvalues still exist over the complex numbers.

Moreover, the 2D rotation matrix isn't symmetric/Hermitian. It's usually of the form:

T = \left(\begin{array}{cc} \cos\phi & \sin\phi \\ -\sin\phi & \cos\phi \end{array} \right)

which is not symmetric/Hermitian.
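
If you want to sanity-check both claims numerically, here's a quick NumPy sketch (my own illustration, not from the article):

import numpy as np

phi = 0.7  # arbitrary rotation angle

# The rotation matrix quoted above
T = np.array([[ np.cos(phi), np.sin(phi)],
              [-np.sin(phi), np.cos(phi)]])

print(np.allclose(T, T.T))   # False -- not symmetric (unless sin(phi) = 0)

# Over the complex numbers the eigenvalues always exist:
# they are exp(-i*phi) and exp(+i*phi).
print(np.linalg.eigvals(T))  # e.g. [0.7648-0.6442j  0.7648+0.6442j]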
 
Reading more from Wikipedia:
It follows that we can compute all the eigenvalues of a matrix A by solving the equation pA(λ) = 0. If A is an n-by-n matrix, then pA has degree n and A can therefore have at most n eigenvalues. Conversely, the fundamental theorem of algebra says that this equation has exactly n roots (zeroes), counted with multiplicity.

http://en.wikipedia.org/wiki/Eigenvalue_algorithm

To me, it would seem that there must be n roots (counting multiplicities) of the characteristic polynomial for every square matrix of size n. In other words, every square matrix of size n must have n eigenvalues (counting multiplicities, i.e., the eigenvalues are possibly non-distinct).

The only way I can reconcile this with the statement above, that "symmetric operators may have no eigenvalues," is if the symmetric operator being described is not square? Is it possible to have a non-square matrix, and call it "symmetric" if only the "square" part of it is symmetric?
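
For what it's worth, the "exactly n roots over the complex numbers" statement is easy to see numerically; here's a small NumPy sketch (my own illustration):

import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))  # a generic real n x n matrix

# Over C the characteristic polynomial has exactly n roots counted
# with multiplicity, so eigvals always returns n values...
vals = np.linalg.eigvals(A)
print(len(vals))  # 5

# ...but some of them may be genuinely complex (for a real matrix
# they come in conjugate pairs).
print(vals)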
 
Complex numbers are not always automatically assumed. If you are working in a real vector space, the rotation matrix is a simple example of a linear operator that doesn't have eigenvectors.
 
Do you think that's what they were getting at in the Wikipedia article?

If we suppose the existence of complex numbers, or allow them at any rate, is it safe to say that a square matrix of size n will always have n eigenvalues (counting multiplicities)?
 
psholtz said:
Do you think that's what they were getting at in the Wikipedia article?

If we suppose the existence of complex numbers, or allow them at any rate, is it safe to say that a square matrix of size n will always have n eigenvalues (counting multiplicities)?

Yes, and yes. (At least, I think so!)
 
Nonono, this is not what the article is trying to say at all!

You are completely correct that a symmetric matrix always has eigenvalues. But the article isn't talking about matrices; it's talking about operators, that is, bounded linear maps on a possibly infinite-dimensional Hilbert space.

Matrices correspond to operators on a finite-dimensional Hilbert space. What the article is saying is that operators on an infinite-dimensional Hilbert space need not have eigenvalues.

The standard example: take a strictly increasing, bounded function u:[a,b]\rightarrow \mathbb{R}. Then the operator

T:L^2([a,b])\rightarrow L^2([a,b]):f\mapsto uf

is called a multiplication operator. This is a Hermitian/symmetric operator without eigenvalues. Indeed, suppose T(f)=cf for some scalar c and some nonzero f\in L^2([a,b]). Then uf=cf, i.e. (u(x)-c)f(x)=0 for almost every x, so f(x)=0 wherever u(x)\neq c. But u is strictly increasing, so u(x)=c for at most one point x, a set of measure zero. Hence f=0 almost everywhere, i.e. f=0 in L^2, a contradiction.

So what the Wikipedia article is getting at is that symmetric operators on infinite-dimensional Hilbert spaces do not necessarily have eigenvalues. Symmetric operators on finite-dimensional spaces, however, always have eigenvalues, since they correspond to symmetric matrices.
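
If it helps to see why the finite-dimensional intuition breaks down, here is a rough NumPy caricature (my own sketch, using u(x) = x on [0,1]). Any finite truncation of the multiplication operator is a symmetric matrix and so does have eigenvalues; the point is what its eigenvectors look like as the grid gets finer:

import numpy as np

# Finite-dimensional caricature of (Tf)(x) = u(x) f(x) with u(x) = x,
# sampled on an n-point grid of [0, 1].  (Illustration only: every
# finite truncation is a symmetric matrix, so it DOES have eigenvalues.)
n = 8
x = np.linspace(0.0, 1.0, n)
T = np.diag(x)                 # multiplication by u becomes a diagonal matrix

vals, vecs = np.linalg.eigh(T)
print(vals)                    # just the grid values u(x_k)
print(vecs[:, 3])              # each eigenvector is a spike on one grid point

# As n grows, these spike eigenvectors concentrate on single points; the
# limiting objects would be delta functions, which are not in L^2.  That
# is the finite-dimensional shadow of the operator having no eigenvalues.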
 
Thanks for that, micromass. A very instructive example and discussion! I was wrong, and I learned something.
 
Thanks, micromass... excellent explanation.
 