# Elementary Hilbert space concepts/definitions.

I'd appreciate it if anyone could help me clear up some concepts. The last chapter of one of my math courses is a (highly mysterious) introduction to Hilbert spaces (very very basic):

What does it mean for a function to be "square-summable"? Has something to do with the scalar product in Hilbert space.

What is meant by a "complete set"? Something about multiplying a set of vectors in the Hilbert space by an orthonormal basis set and that being equal to a summation of scalars*basis vectors?

What is actually meant by eigenfunction/eigenvalues in the context of Hilbert spaces? All I know is if I apply some Hilbert space operator to a function, I get another function with a scalar out in front of it. That scalar is an eigenvalue and the whole function is an eigenfunction.

All I know about eigenvalues is that they're the roots of characteristic polynomials in linear algebra, 2nd order homogenous ODEs, the square of the frequencies in small oscillations problems... They're numbers that can make an equation go to zero maybe?

Hi Lavabug! I'd appreciate it if anyone could help me clear up some concepts. The last chapter of one of my math courses is a (highly mysterious) introduction to Hilbert spaces (very very basic):

What does it mean for a function to be "square-summable"? Has something to do with the scalar product in Hilbert space.

I fear you're mixing up two concepts here. A sequence $$(x_0,x_1,x_2,\ldots)$$ is square-summable if

$$\sum_{n=0}^{+\infty}{|x_n|^2}<+\infty$$.

On the other hand, a function $$f:\mathbb{R}\rightarrow \mathbb{R}$$ is square-integrable if

$$\int_{-\infty}^{+\infty}{|f(x)|^2\,dx}<+\infty$$.

The set of all square-summable sequences is called $$\ell^2$$. The set of all square-integrable functions is $$L^2$$.
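To make the distinction concrete, here is a small numerical sketch (my own addition, not from the thread; all names are illustrative) watching the partial sums of $$\sum |x_n|^2$$ for a sequence that is square-summable and one that is not:

```python
import math

def partial_sum_of_squares(x, n_terms):
    """Compute sum_{n=1}^{n_terms} |x(n)|^2 for a sequence given as a function of n."""
    return sum(abs(x(n)) ** 2 for n in range(1, n_terms + 1))

# x_n = 1/n is square-summable: sum 1/n^2 converges (to pi^2/6).
print(partial_sum_of_squares(lambda n: 1.0 / n, 100_000))   # close to pi^2/6 ~ 1.6449
print(math.pi ** 2 / 6)

# x_n = 1/sqrt(n) is NOT square-summable: sum 1/n diverges (slowly).
print(partial_sum_of_squares(lambda n: 1.0 / math.sqrt(n), 100_000))  # keeps growing
```

Doubling `n_terms` barely changes the first partial sum but keeps pushing the second one up, which is the numerical signature of convergence versus divergence.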

What is meant by a "complete set"? Something about multiplying a set of vectors in the Hilbert space by an orthonormal basis set and that being equal to a summation of scalars*basis vectors?

Did you mean total instead of complete? (The two words are used interchangeably in this context.) A complete set X is one such that $$\overline{Span(X)}=H$$. Thus the closure of the span of X is the entire Hilbert space. An orthonormal set that is total is called a basis.

What is actually meant by eigenfunction/eigenvalues in the context of Hilbert spaces? All I know is if I apply some Hilbert space operator to a function, I get another function with a scalar out in front of it. That scalar is an eigenvalue and the whole function is an eigenfunction.

Well, if we have a continuous linear map $$F:H\rightarrow H$$, then an eigenvector is a nonzero element x of H such that F(x)=kx. The number k is called the eigenvalue. This is exactly the same definition as in linear algebra.
Note that the elements of a Hilbert space are often functions. This is probably why you called them eigenfunctions...

All I know about eigenvalues is that they're the roots of characteristic polynomials in linear algebra, 2nd order homogenous ODEs, the square of the frequencies in small oscillations problems... They're numbers that can make an equation go to zero maybe?

These notions are all fine for finite-dimensional vector spaces. But for infinite-dimensional vector spaces (such as the usual Hilbert spaces), they no longer work: there is no characteristic polynomial whose roots you can take. This is why the study of eigenvalues and eigenvectors is significantly harder than in the finite-dimensional case (= ordinary linear algebra).
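To see how different the infinite-dimensional situation can be, here is a standard example (my own addition to the discussion): the right-shift operator $$R(a_1,a_2,a_3,\ldots)=(0,a_1,a_2,\ldots)$$ on the space of square-summable sequences has no eigenvalues at all, even though every operator on a finite-dimensional complex vector space has at least one. Indeed, suppose $$Ra=ka$$ for some nonzero a. Comparing components gives

$$0=ka_1,\quad a_1=ka_2,\quad a_2=ka_3,\ldots$$

If k is nonzero, the first equation forces $$a_1=0$$, the second then forces $$a_2=0$$, and so on, so a = 0. If k = 0, then Ra = 0 forces every $$a_n=0$$ directly. Either way a = 0, contradicting the assumption that a is nonzero.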

I'd appreciate it if anyone could help me clear up some concepts. The last chapter of one of my math courses is a (highly mysterious) introduction to Hilbert spaces (very very basic):

A Hilbert space is an extension of the idea of Euclidean space $\mathbb{R}^n$ or $\mathbb{C}^n$. These spaces have some structure:
1. They're vector spaces. You can add vectors and multiply them by scalars.
2. They have a scalar product, i.e. a generalization of the dot product. You can measure lengths and angles.
3. They're complete: this means you can do calculus on them, and Cauchy sequences converge (limits that "ought" to exist actually do).
If you're not sure what something means in a Hilbert space, it's usually worth thinking about one of these more familiar spaces for a bit of intuition.

What does it mean for a function to be "square-summable"? Has something to do with the scalar product in Hilbert space.
You might talk about a sequence $\{a_n\}$ being square-summable, meaning that the series $\sum |a_n|^2$ converges. The space of such sequences is a Hilbert space, with inner product
$$\langle \{a_n\},\{b_n\}\rangle=\sum_n \bar{a_n} b_n$$
which is guaranteed to converge if $\{a_n\}$ and $\{b_n\}$ are both square-summable.

You might talk about a function f being square-integrable, meaning that the integral $\int |f|^2$ converges. The space of such functions also forms a Hilbert space, with inner product
$$\langle f,g\rangle=\int \bar{f} g$$
which converges if f and g are both square integrable.
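As a sketch (my own, not from the thread; the function names are illustrative), both inner products can be approximated numerically, truncating the sum and the integral:

```python
import math

def seq_inner(a, b, n_terms=10_000):
    """Truncated l^2 inner product <a,b> = sum_n conj(a_n) b_n (real sequences here)."""
    return sum(a(n) * b(n) for n in range(1, n_terms + 1))

def func_inner(f, g, lo=-50.0, hi=50.0, steps=100_000):
    """Crude midpoint-rule approximation of the L^2 inner product <f,g> = int f g."""
    h = (hi - lo) / steps
    return h * sum(f(lo + (k + 0.5) * h) * g(lo + (k + 0.5) * h) for k in range(steps))

# <{1/n}, {1/n}> = sum 1/n^2 = pi^2/6
print(seq_inner(lambda n: 1.0 / n, lambda n: 1.0 / n), math.pi ** 2 / 6)

# <e^{-x^2/2}, e^{-x^2/2}> = int e^{-x^2} dx = sqrt(pi)
print(func_inner(lambda x: math.exp(-x * x / 2), lambda x: math.exp(-x * x / 2)),
      math.sqrt(math.pi))
```

In both cases the approximation lands on the exact value because the tails being thrown away are tiny, which is precisely what square-summability/integrability buys you.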

I suspect what you are interested in is one of these.

What is meant by a "complete set"? Something about multiplying a set of vectors in the Hilbert space by an orthonormal basis set and that being equal to a summation of scalars*basis vectors?
A complete set is very similar to a basis: it is a set of vectors (i.e. elements of the Hilbert space) such that every member of the space can be expressed as a sum of scalar multiples of elements of the set. The difference is that the sum is allowed to be infinite (converging in the norm of the space). For example, in the space of square-summable sequences, every sequence can be expressed as a (usually infinite) sum of the sequences with 1 in the nth position and 0 elsewhere. So the set of all these sequences is a complete set.
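That expansion can be written out explicitly. Here is a small sketch (my own addition; the names are illustrative) reconstructing a truncated sequence as a finite sum of scalar multiples of the "standard basis" sequences:

```python
def standard_basis(n, length):
    """The truncated sequence e_n: 1 in position n (0-indexed), 0 elsewhere."""
    return [1.0 if k == n else 0.0 for k in range(length)]

# First few terms of the square-summable sequence (1/2, 1/4, 1/8, ...).
a = [1.0 / 2 ** (n + 1) for n in range(8)]

# Reconstruct a as sum_n a_n * e_n; in the full Hilbert space this sum
# would be infinite, converging in the l^2 norm.
recon = [0.0] * len(a)
for n, coeff in enumerate(a):
    recon = [r + coeff * x for r, x in zip(recon, standard_basis(n, len(a)))]

assert recon == a  # the finite truncation is recovered exactly
```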

What is actually meant by eigenfunction/eigenvalues in the context of Hilbert spaces? All I know is if I apply some Hilbert space operator to a function, I get another function with a scalar out in front of it. That scalar is an eigenvalue and the whole function is an eigenfunction.
That's exactly it. An operator O on a Hilbert space is just a linear map, taking a vector v to a vector Ov. Linearity means that it respects the addition and scalar multiplication on the space. (We often require the extra technical restriction that it's 'bounded', but don't worry too much about that for now). So for u,v in the Hilbert space and a,b scalars, we have O(a.u+b.v) = a.Ou + b.Ov. An eigenvector of O satisfies Ov=av for some scalar a, the eigenvalue. If the space is a space of functions, v is sometimes called an eigenfunction, and if we are doing quantum mechanics, it's sometimes called an eigenstate.

As an example, again on the space of square-summable sequences, take the operator that moves every element one to the left, so $O(a_1,a_2,a_3,\ldots)=(a_2,a_3,a_4,\ldots)$. This is linear, and the sequence $\{1/2^n\}$ is an eigenvector with eigenvalue 1/2: $O(1/2,1/4,1/8,\ldots)=(1/4,1/8,1/16,\ldots)=\tfrac{1}{2}(1/2,1/4,1/8,\ldots)$.
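That left-shift example is easy to check on a truncated sequence. A quick sketch (my addition, with illustrative names):

```python
def left_shift(a):
    """Apply the left-shift operator to a truncated sequence: drop the first term."""
    return a[1:]

a = [1.0 / 2 ** n for n in range(1, 12)]   # (1/2, 1/4, ..., 1/2^11)
shifted = left_shift(a)
half_a = [x / 2 for x in a]

# Compare on the overlapping terms: O a = (1/2) a.
assert all(s == h for s, h in zip(shifted, half_a))
```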

Hope some of that is helpful.