# Eigenvalues of a symmetric operator

1. Jul 26, 2011

### psholtz

I thought linear operators always had eigenvalues, since you could always form a characteristic equation for the corresponding matrix and solve it?

Is that not the case? Are there linear operators that don't have eigenvalues?

2. Jul 26, 2011

### chogg

A 2D rotation matrix has no real eigenvalues. You can write down the characteristic equation, but you quickly find it has no real solutions.

3. Jul 26, 2011

### psholtz

That's true, but a 2D rotation matrix still has eigenvalues; they just aren't real. The eigenvalues still exist, as complex numbers.

Moreover, the 2D rotation matrix isn't symmetric/Hermitian. It's usually of the form:

$$T = \left(\begin{array}{cc} \cos\phi & \sin\phi \\ -\sin\phi & \cos\phi \end{array} \right)$$

which is not symmetric/Hermitian.
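For anyone who wants to check this numerically, here is a small stdlib-only Python sketch (my own addition, not from the thread) that solves the characteristic equation of $T$ over the complex numbers; the eigenvalues come out as $e^{\pm i\phi}$:

```python
import cmath
import math

def rotation_eigenvalues(phi):
    # Characteristic polynomial of the 2D rotation matrix:
    #   lambda^2 - 2*cos(phi)*lambda + 1 = 0
    # Solve it with the quadratic formula over the complex numbers.
    a, b, c = 1.0, -2.0 * math.cos(phi), 1.0
    disc = cmath.sqrt(b * b - 4 * a * c)   # imaginary for 0 < phi < pi
    return ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

lam1, lam2 = rotation_eigenvalues(math.pi / 3)
# lam1, lam2 are approximately 0.5 +/- 0.866j, i.e. exp(+/- i*pi/3):
# complex conjugates on the unit circle, and not real for 0 < phi < pi.
```

So over $\mathbb{C}$ the eigenvalues always exist; the issue in the real case is only that the quadratic has a negative discriminant.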

4. Jul 26, 2011

### psholtz

To me, it would seem that the characteristic polynomial of every square matrix of size n must have n roots (counting multiplicities). In other words, every square matrix of size n must have n eigenvalues, counting multiplicities, i.e., the eigenvalues are possibly non-distinct.
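The "counting multiplicities" part can be illustrated with a quick stdlib-only Python check (a sketch of mine, not from the thread): the shear matrix $\left(\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right)$ has characteristic polynomial $(\lambda - 1)^2$, so the single eigenvalue $1$ is counted twice:

```python
import cmath

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic polynomial
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return ((tr + disc) / 2, (tr - disc) / 2)

# Shear matrix [[1, 1], [0, 1]]: characteristic polynomial (lambda - 1)^2,
# so the eigenvalue 1 appears with algebraic multiplicity 2.
lam1, lam2 = eig2(1, 1, 0, 1)   # both equal 1
```

A 2x2 matrix thus always yields exactly two eigenvalues over $\mathbb{C}$, possibly repeated.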

The only way I can reconcile this with the statement that "symmetric operators may not have eigenvalues" is if the symmetric operator being described is not square. Is it possible to have a non-square matrix and call it "symmetric" if only the "square" part of it is symmetric?

5. Jul 26, 2011

### chogg

Complex numbers are not always automatically assumed. If you are working in a real vector space, the rotation matrix is a simple example of a linear operator that has no eigenvalues, and hence no eigenvectors.

6. Jul 26, 2011

### psholtz

Do you think that's what they were getting at in the Wikipedia article?

If we suppose the existence of complex numbers, or allow them at any rate, is it safe to say that a square matrix of size n will always have n eigenvalues (counting multiplicities)?

7. Jul 26, 2011

### chogg

Yes, and yes. (At least, I think so!)

8. Jul 26, 2011

### micromass

Nonono, this is not what the article is trying to say at all!!

You are completely correct that a symmetric matrix always has eigenvalues. But the article isn't talking about matrices; it's talking about operators, that is, bounded linear maps on a possibly infinite-dimensional Hilbert space.

Matrices correspond to operators on a finite-dimensional Hilbert space. What the article is saying is that an operator on an infinite-dimensional Hilbert space need not have any eigenvalues.

The standard example: take a strictly increasing, bounded function $u:[a,b]\rightarrow \mathbb{R}$. Then the operator

$$T:L^2([a,b])\rightarrow L^2([a,b]):f\mapsto uf$$

is called a multiplication operator. This is a Hermitian/symmetric operator without eigenvalues. Indeed, suppose $T(f)=cf$ for a complex number $c$ and a nonzero $f$. Then $uf=cf$, so $u(x)=c$ wherever $f(x)\neq 0$. Since $f$ is nonzero in $L^2$, it is nonzero on a set of positive measure, so $u(x)=c$ for more than one value of $x$, which contradicts that $u$ is strictly increasing.

So the thing that the Wikipedia article is getting at is that symmetric operators on infinite-dimensional Hilbert spaces do not necessarily have eigenvalues. Symmetric operators on finite-dimensional spaces, however, always have eigenvalues, since they can be represented by matrices.
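To connect this back to the finite-dimensional intuition, here is a small Python sketch of my own (the discretization and the choice $u(x)=x$ on $[0,1]$ are my assumptions, not from the post): truncating the multiplication operator to an $n$-point grid gives a diagonal matrix, which of course has eigenvalues. The trouble only appears in the infinite-dimensional limit, where the would-be eigenvectors are single grid spikes that don't converge to any nonzero $L^2$ function.

```python
# Discretize the multiplication operator (Tf)(x) = u(x) f(x), with the
# illustrative choice u(x) = x on [0, 1].  On an n-point grid, T becomes
# the diagonal matrix diag(u(x_0), ..., u(x_{n-1})), so every finite
# truncation has eigenvalues (the diagonal entries), with eigenvectors
# that are single grid spikes e_i.
def multiplication_matrix(n):
    xs = [i / (n - 1) for i in range(n)]          # grid on [0, 1]
    return [[(xs[i] if i == j else 0.0) for j in range(n)]
            for i in range(n)]

T = multiplication_matrix(5)
eigs = [T[i][i] for i in range(5)]   # eigenvalues of a diagonal matrix
# eigs == [0.0, 0.25, 0.5, 0.75, 1.0]: one per grid point, all distinct
# because u is strictly increasing.
```

Each finite truncation behaves like an ordinary symmetric matrix; the spikes $e_i$ have no nonzero limit in $L^2([0,1])$, which is how the eigenvalues disappear in the infinite-dimensional case.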

9. Jul 26, 2011

### chogg

Thanks for that, micromass. A very instructive example and discussion! I was wrong, and I learned something.

10. Jul 26, 2011

### psholtz

Thanks, micromass... excellent explanation.