Elementary Hilbert space concepts/definitions.

SUMMARY

This discussion clarifies foundational concepts in Hilbert spaces, focusing on square-summability, completeness, and eigenfunctions/eigenvalues. A sequence is square-summable if the series ∑|x_n|² converges, while a function is square-integrable if ∫|f|² converges. A complete (total) set in a Hilbert space is one whose span is dense, so every vector can be expressed as a (possibly infinite) sum of its elements. Eigenfunctions and eigenvalues of linear operators keep the same definitions as in linear algebra, but extend to infinite-dimensional spaces.

PREREQUISITES
  • Understanding of basic calculus and limits
  • Familiarity with linear algebra concepts, particularly eigenvalues and eigenvectors
  • Knowledge of vector spaces and scalar products
  • Basic comprehension of integrals and series convergence
NEXT STEPS
  • Study the properties of ℓ² and L² spaces in detail
  • Explore the concept of orthonormal bases in Hilbert spaces
  • Learn about linear operators and their applications in quantum mechanics
  • Investigate the implications of completeness in functional analysis
USEFUL FOR

Mathematicians, physics students, and anyone interested in functional analysis or quantum mechanics will benefit from this discussion, particularly those seeking to understand the foundational aspects of Hilbert spaces.

Lavabug
I'd appreciate it if anyone could help me clear up some concepts, the last chapter of one of my math courses is a (highly mysterious) introduction to Hilbert spaces (very very basic):

What does it mean for a function to be "square-summable"? Has something to do with the scalar product in Hilbert space.

What is meant by a "complete set"? Something about multiplying a set of vectors in the Hilbert space by an orthonormal basis set and that being equal to a summation of scalars*basis vectors?

What is actually meant by eigenfunction/eigenvalues in the context of Hilbert spaces? All I know is if I apply some Hilbert space operator to a function, I get another function with a scalar out in front of it. That scalar is an eigenvalue and the whole function is an eigenfunction.

All I know about eigenvalues is that they're the roots of characteristic polynomials in linear algebra, 2nd order homogeneous ODEs, the square of the frequencies in small oscillations problems... They're numbers that can make an equation go to zero maybe?
 
Hi Lavabug! :smile:

Lavabug said:
I'd appreciate it if anyone could help me clear up some concepts, the last chapter of one of my math courses is a (highly mysterious) introduction to Hilbert spaces (very very basic):

What does it mean for a function to be "square-summable"? Has something to do with the scalar product in Hilbert space.

I fear you're mixing two concepts here. A sequence (x_0,x_1,x_2,\ldots) is square-summable if

\sum_{n=0}^{+\infty}{|x_n|^2}<+\infty.

On the other hand, a function f:\mathbb{R}\rightarrow \mathbb{R} is square-integrable if

\int_{-\infty}^{+\infty}{|f(x)|^2}\,dx<+\infty.

The set of all square-summable sequences is called \ell^2. The set of all square-integrable functions is L^2.
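As a quick numerical sanity check (a sketch of my own, not from the posts): the sequence x_n = 1/(n+1) lies in \ell^2, since the partial sums of |x_n|^2 approach \pi^2/6 (the Basel sum).

```python
# Numerical sketch (illustrative, not from the post): x_n = 1/(n+1) is
# square-summable, since the partial sums of |x_n|^2 approach pi^2/6.
import math

def partial_square_sum(x, terms):
    """Partial sum of |x_n|^2 over n = 0, ..., terms - 1."""
    return sum(abs(x(n)) ** 2 for n in range(terms))

x = lambda n: 1.0 / (n + 1)
print(partial_square_sum(x, 100_000), math.pi ** 2 / 6)  # nearly equal
```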

What is meant by a "complete set"? Something about multiplying a set of vectors in the Hilbert space by an orthonormal basis set and that being equal to a summation of scalars*basis vectors?

Did you mean total instead of complete? A complete set X is one for which \overline{\mathrm{Span}(X)}=H: the closure of the span of X is the entire Hilbert space. An orthonormal set that is complete (total) is called an orthonormal basis.

What is actually meant by eigenfunction/eigenvalues in the context of Hilbert spaces? All I know is if I apply some Hilbert space operator to a function, I get another function with a scalar out in front of it. That scalar is an eigenvalue and the whole function is an eigenfunction.

Well, if we have a continuous linear operator F:H\rightarrow H, then an eigenvector is an element x of H such that F(x)=kx. The number k is called the eigenvalue. This is exactly the same definition as in linear algebra.
Note that the elements of a Hilbert space are often functions; this is probably why they are called eigenfunctions there.
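The definition F(x)=kx can be checked concretely in finite dimensions. A minimal sketch (the 2×2 matrix and the vector below are invented for illustration): applying F to an eigenvector just rescales it by the eigenvalue.

```python
# Finite-dimensional sketch of F(x) = k*x, with an invented 2x2 matrix F:
# applying F to the eigenvector v just rescales it by the eigenvalue k.
def apply(F, v):
    """Matrix-vector product, i.e. the linear map F acting on v."""
    return [sum(F[i][j] * v[j] for j in range(len(v))) for i in range(len(F))]

F = [[2.0, 1.0],
     [0.0, 3.0]]
v = [1.0, 1.0]        # an eigenvector of F
k = 3.0               # its eigenvalue
print(apply(F, v))    # [3.0, 3.0], which equals [k * c for c in v]
```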

All I know about eigenvalues is that they're the roots of characteristic polynomials in linear algebra, 2nd order homogeneous ODEs, the square of the frequencies in small oscillations problems... They're numbers that can make an equation go to zero maybe?

These notions all work for finite-dimensional vector spaces. But for infinite-dimensional vector spaces (such as the usual Hilbert spaces), they no longer apply: you cannot work with characteristic polynomials and the like anymore. This is why the study of eigenvalues and eigenvectors is significantly harder than in the finite-dimensional case (i.e., in linear algebra).
 
Lavabug said:
I'd appreciate it if anyone could help me clear up some concepts, the last chapter of one of my math courses is a (highly mysterious) introduction to Hilbert spaces (very very basic):

A Hilbert space is an extension of the idea of Euclidean space \mathbb{R}^n or \mathbb{C}^n. These spaces have some structure:
1. They're vector spaces. You can add vectors and multiply them by scalars.
2. They have a scalar product: i.e. the dot product. You can measure lengths and angles.
3. They're complete: this means you can do calculus on them, and sequences converge under sensible conditions.
If you're not sure what something means in a Hilbert space, it's usually worth thinking about one of these more familiar spaces for a bit of intuition.

What does it mean for a function to be "square-summable"? Has something to do with the scalar product in Hilbert space.
You might talk about a sequence {an} being square-summable, meaning that the series \sum |a_n|^2 converges. The space of such sequences is a Hilbert space, with inner product
\langle{a_n},{b_n}\rangle=\sum_n \bar{a_n} b_n
which is guaranteed to converge if an and bn are both square summable.
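A small sketch of this inner product, truncated to finitely many terms (the two sequences are real, so conjugation is trivial, and they are chosen purely for illustration):

```python
# Sketch: a truncated version of <a, b> = sum of conj(a_n) * b_n; the
# sequences here are real (conjugation trivial) and invented for illustration.
def inner_seq(a, b, terms=1_000):
    return sum(a(n) * b(n) for n in range(terms))

a = lambda n: 0.5 ** n
b = lambda n: (1.0 / 3.0) ** n
# a_n * b_n = (1/6)^n, a geometric series summing to 6/5
print(inner_seq(a, b))
```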

You might talk about a function f being square-integrable, meaning that the integral \int |f|^2 converges. The space of such functions also forms a Hilbert space, with inner product
\langle f,g\rangle=\int \bar{f} g
which converges if f and g are both square integrable.
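A rough numerical sketch of this inner product for real-valued functions, using a midpoint Riemann sum (the interval [0,1] and the functions are my choice, not from the post):

```python
# Sketch: approximate <f, g> = integral of f*g with a midpoint Riemann sum.
# Real-valued functions, so the complex conjugate is trivial; the interval
# [0, 1] and the functions below are chosen purely for illustration.
def inner(f, g, a=0.0, b=1.0, n=100_000):
    h = (b - a) / n
    mids = (a + (i + 0.5) * h for i in range(n))
    return sum(f(x) * g(x) for x in mids) * h

# <x, x> over [0, 1] is the integral of x^2, i.e. 1/3
print(inner(lambda x: x, lambda x: x))
```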

I suspect what you are interested in is one of these.

What is meant by a "complete set"? Something about multiplying a set of vectors in the Hilbert space by an orthonormal basis set and that being equal to a summation of scalars*basis vectors?
A complete set is very similar to a basis, in that it is a set of vectors (i.e. elements of the Hilbert space) such that every member of the space can be expressed as a sum of elements of the complete set. But it is different in that the sum can be infinite. For example, in the space of square-summable sequences, every sequence can be expressed as a (usually infinite) sum of the sequences with 1 in the nth position and 0 elsewhere. So the set of all these sequences is a complete set.
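The "possibly infinite sum" point can be made quantitative: for a square-summable sequence, what the first N basis sequences fail to capture is the tail sum ∑_{n≥N} |a_n|², which shrinks to 0 as N grows. A sketch (names and the example sequence are invented):

```python
# Sketch: a_n = (1/2)^n is square-summable; the tail sum over n >= N measures
# what the first N basis sequences e_0, ..., e_{N-1} fail to capture.
def tail_norm_sq(a, N, terms=1_000):
    """Sum of |a_n|^2 for n = N, ..., terms - 1 (a truncated tail)."""
    return sum(abs(a(n)) ** 2 for n in range(N, terms))

a = lambda n: 0.5 ** n
for N in (1, 5, 10):
    print(N, tail_norm_sq(a, N))   # shrinks toward 0 as N grows
```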

What is actually meant by eigenfunction/eigenvalues in the context of Hilbert spaces? All I know is if I apply some Hilbert space operator to a function, I get another function with a scalar out in front of it. That scalar is an eigenvalue and the whole function is an eigenfunction.
That's exactly it. An operator O on a Hilbert space is just a linear map, taking a vector v to a vector Ov. Linearity means that it respects the addition and scalar multiplication on the space. (We often require the extra technical restriction that it's 'bounded', but don't worry too much about that for now). So for u,v in the Hilbert space and a,b scalars, we have O(a.u+b.v) = a.Ou + b.Ov. An eigenvector of O satisfies Ov=av for some scalar a, the eigenvalue. If the space is a space of functions, v is sometimes called an eigenfunction, and if we are doing quantum mechanics, it's sometimes called an eigenstate.

As an example, again on the space of square-summable sequences, take the operator that moves every element one to the left, so O(a_1,a_2,a_3,\ldots)=(a_2,a_3,a_4,\ldots). This is linear, and the sequence \{1/2^n\} is an eigenvector with eigenvalue 1/2: O(1/2,1/4,1/8,\ldots)=(1/4,1/8,1/16,\ldots)=\tfrac{1}{2}(1/2,1/4,1/8,\ldots).

Hope some of that is helpful.
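The shift-operator example can be checked termwise in a few lines (a sketch of my own, treating a sequence as a function of its 0-based index):

```python
# Sketch: the left-shift operator O(a_1, a_2, ...) = (a_2, a_3, ...),
# represented on sequences-as-functions with 0-based indexing.
def shift(a):
    return lambda n: a(n + 1)

a = lambda n: 0.5 ** (n + 1)   # the sequence 1/2, 1/4, 1/8, ...
Oa = shift(a)
print([Oa(n) for n in range(4)])        # 1/4, 1/8, 1/16, 1/32
print([0.5 * a(n) for n in range(4)])   # the same: eigenvalue 1/2
```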
 
