# Hilbert Spaces

#### jwsiii

I'm trying to understand Hilbert spaces and I need a little help. I know that it's a vector space of vectors with an infinite number of components, but a finite length. My biggest question is: how is a Hilbert space used to represent a function? Is each component of the vector a point on the function? Also, the text I'm reading says that the inner product of a function equals the length of that function and I don't really understand what this length is. Any help would be appreciated. Thanks!


#### AKG

Homework Helper
I'm not entirely sure, but if I remember correctly, then each basis vector of your Hilbert space is itself a function. Perhaps you're familiar with a vector space of polynomials. In such a space, x and x² are linearly independent, because there is no element of the field, k, such that kx = x² (for all x). Similarly, if you have two functions f and g, and there is no scalar k such that kf = g (where kf = g if and only if, for all x, kf(x) = g(x)), then f and g are linearly independent. But, for example, the functions f defined by f(x) = sin(x) and g defined by g(x) = 3sin(x) are clearly linearly dependent, because there is a k such that kf = g, namely k = 3. Also, note that sin(x) is a function and x² is a function, and the two are linearly independent. But the sum, 3sin(x) - 1.2x² (a linear combination of sin(x) and x²) is also a function.

So the best way to think of this is perhaps to generalize from the example of polynomial vector spaces. They are spanned by a basis of independent vectors {1, x, x², ...} and consist of elements like:

ax + bx³ + cx²³ + d

A Hilbert space of functions would be spanned by independent functions/vectors:

{1, x, x², ...., sin, sin², sin³, ..., cos, cos², cos³, ..., exp, exp², ..., f, g, h, ...}

You can see, for example, why sin³ and exp² are linearly independent, right? Remember, they are lin. indep. iff there is no k such that ksin³ = exp², iff there is no k such that for all x, ksin³x = exp²x, which is certainly true.

"The inner product of a function" doesn't make sense on its own, since the inner product takes two arguments. Do you mean "the inner product of a function with itself"? If so, then the length of a vector v in an inner product space is generally defined to be <v,v>^(1/2), where <v,v> is the inner product of v with v. Maybe that's what they mean? In your second sentence, you also mention something about length, but it seems to be in a different context. As far as I can tell, all that "finite length" is saying is that you can't have, for example, an infinite sum:

a + bx + c·sin(x) + eˣ + ...

the Hilbert space only contains all linear combinations of finitely many basis vectors.
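If it helps, here is a small Python sketch of the idea that functions are vectors: addition and scalar multiplication are defined pointwise. (The helper names `add` and `scale` are just made up for illustration.)

```python
import math

# Functions R -> R behave like vectors: we add them and scale them pointwise.

def add(f, g):
    """Vector addition: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(k, f):
    """Scalar multiplication: (k*f)(x) = k * f(x)."""
    return lambda x: k * f(x)

# A linear combination of the "vectors" sin and x², like 3sin(x) - 1.2x² above:
h = add(scale(3, math.sin), scale(-1.2, lambda x: x ** 2))

# h is itself a function, i.e. another vector in the same space.
print(h(2.0))  # equals 3*sin(2) - 1.2*2**2
```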

#### jwsiii

Yeah, I meant the inner product of a function with itself. Doesn't a space in R^n have to be spanned by n vectors? A Hilbert space would have to be spanned by infinitely many vectors, each with an infinite number of components. That vector you gave is only one vector.

#### George Jones

Staff Emeritus
Gold Member
AKG said:
the Hilbert space only contains all linear combinations of finitely many basis vectors.
When generalizing finite-dimensional inner product spaces to infinite-dimensional (separable) Hilbert spaces, there are two ways that the concept of basis generalizes.

Given the axiom of choice, any vector space, finite-dimensional or infinite dimensional, inner product space or not, has a Hamel basis, i.e., a set of linearly independent vectors in terms of which any vector in the space can be written as a finite linear combination.

The other generalization is the one that, as far as I can tell, gets used more often in functional analysis and the theory of Hilbert spaces. An orthonormal basis for a separable Hilbert space is a countable set of linearly independent, pairwise orthogonal unit vectors in terms of which any vector in the space can be written as a finite linear combination, *or* as the strong limit of finite linear combinations.

Basically, this allows vectors to be expanded in terms of "infinite linear combinations" of basis elements.

Regards,
George

#### HallsofIvy

Homework Helper
I think you have it backwards. It is not that a Hilbert space is used to "represent" functions. It is just that certain sets of functions are Hilbert spaces.
jwsiii said:
I know that it's a vector space of vectors with an infinite number of components
No, that's not true. A "Hilbert space" is a vector space having an inner product and such that the "Cauchy criterion" is satisfied: that is, every Cauchy sequence of vectors converges.

It is true that "l²", the set of all sequences of real (or complex) numbers such that $$\sum_{n=0}^\infty a_n^2$$ converges, forms a Hilbert space with inner product $$<\{a_n\},\{b_n\}> = \sum_{n=0}^\infty a_n b_n$$. I think that's what you are talking about when you say "infinite number of components". That's kind of a "toy" example. Actually, all finite-dimensional inner product spaces are Hilbert spaces, again trivially, and some texts explicitly forbid them.
More to the point, the set of all functions, f(x), defined on some set A, such that $$\int_A f^2(x)dx$$ exists forms a Hilbert space.

As far as "but a finite length" is concerned, I think you are talking about the fact that a "linear combination" is, by definition, a finite sum. Every vector space, even an infinite-dimensional one, has a basis such that every vector can be written as a finite sum of basis vectors. However, for function spaces in general, such a basis may be very hard to find, and it is more common to represent functions in terms of infinite series.
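To make the l² example concrete, here is a rough numerical sketch (Python; the cutoff N is an arbitrary choice) showing that the sequence a_n = 1/n has finite length, because the sum of the squares converges (to π²/6):

```python
import math

# A truncated version of the sequence a_n = 1/n, n = 1, 2, ..., N.
N = 100000
a = [1.0 / n for n in range(1, N + 1)]

# "Length squared" is the (partial) sum of the squares of the components.
length_squared = sum(x * x for x in a)
length = math.sqrt(length_squared)  # the Euclidean length of this "vector"

print(length_squared)  # approaches pi^2/6 ≈ 1.6449 as N grows
print(length)

# Note the contrast: sum(a) itself (the harmonic series) diverges,
# so square-summability ("finite length") is a genuine restriction.
```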


#### mathwonk

Homework Helper
A sequence is a vector with an infinite number of components: the entries of the sequence.

The Euclidean length of such a sequence is the square root of the sum of the squares of the entries. Hence the vectors included in the Hilbert space are those of finite length, i.e. those for which the sum of the squares converges.

A sequence can be considered as a function from the set of positive integers to the reals (or complexes), with the discrete measure. More generally, a function on some other measure space can be considered as a vector with as many entries as there are points in the space; the length is again the square root of the integral of the squares of the values of the function, and we include only those functions of finite length in the Hilbert space.

#### jcsd

Gold Member
jwsiii said:
I'm trying to understand Hilbert spaces and I need a little help. I know that it's a vector space of vectors with an infinite number of components, but a finite length. My biggest question is: how is a Hilbert space used to represent a function? Is each component of the vector a point on the function? Also, the text I'm reading says that the inner product of a function equals the length of that function and I don't really understand what this length is. Any help would be appreciated. Thanks!

Consider the set of real functions (that is, functions from R to R).

Clearly, for any real functions f, g, and h and any real numbers a and b:

f(x)+g(x) = g(x)+f(x)

(f(x)+g(x))+h(x) = f(x)+(g(x)+h(x))

f(x)+0(x) = 0(x)+f(x) = f(x) where 0(x) is the constant real function 0(x) = 0

f(x) + -f(x) = 0(x)

a(b*f(x)) = (ab)*f(x)

(a+b)*f(x) = a*f(x)+b*f(x)

a*(f(x)+g(x)) = a*f(x) + a*g(x)

1*f(x) = f(x)

Clearly then the set of real functions satisfies all the axioms of a vector space (see: http://mathworld.wolfram.com/VectorSpace.html) over the real numbers, and for exactly the same reasons the set of functions from C to C forms a vector space over the complex numbers.

Consider the set of all real functions with the property that f(a) = 1 for some single a and f(x) = 0 otherwise. Clearly any subset of this set is linearly independent (see: http://mathworld.wolfram.com/LinearlyIndependent.html), and these are all real functions, so a maximal set of linearly independent real functions is infinite. Thus the set of real functions is an infinite-dimensional vector space over R (as dim(V) for a vector space V is simply the cardinality of a maximal set of linearly independent vectors).

If you really must talk about components then b where f(a) = b can be considered a component of f(x) in some sense.

The above explains why functions may be considered to be elements of a vector space and how such spaces can be infinite-dimensional.

Now the vector space of real functions and the vector space of complex functions have subspaces, for example as:

$$\int af(x) dx = a\int f(x)dx$$

and

$$\int (f(x)+g(x))dx = \int f(x)dx + \int g(x)dx$$

the set of functions whose integral over the whole real line is finite is a subspace.

Sometimes we like to define an inner product on a vector space; for example, an inner product on the above space is (note the below does not define an inner product on the whole vector space of real functions, just on suitable subspaces):

$$<f,g> = \int^{+\infty}_{-\infty}f(x)g(x)dx$$

For a real or complex vector space we may automatically use this inner product to define a norm and in turn a metric; where this metric is complete, the space is a Hilbert space. So the above example (the set of all square-integrable functions on the real line) is a Hilbert space.
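For what it's worth, here is a rough numerical sketch (Python) of that inner product, approximating the integral by the trapezoid rule on a finite interval; the Gaussian test function and the interval [-10, 10] are just illustrative choices:

```python
import math

def inner(f, g, a=-10.0, b=10.0, n=100001):
    """Approximate <f,g> = ∫ f(x) g(x) dx by the trapezoid rule.

    Truncating the real line to [a, b] is fine for rapidly decaying
    functions like the Gaussian below.
    """
    h = (b - a) / (n - 1)
    ys = [f(a + i * h) * g(a + i * h) for i in range(n)]
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

f = lambda x: math.exp(-x * x)   # a square-integrable function

norm_f = math.sqrt(inner(f, f))  # the norm induced by the inner product

print(inner(f, f))  # ≈ sqrt(pi/2), the exact value of ∫ e^(-2x²) dx
print(norm_f)
```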


#### masudr

jcsd said:
...we may automatically use this inner product to define a norm and in turn a metric; where this metric is complete, the space is a Hilbert space.
I'd just like to quickly add that this norm is defined as follows:

$$\|a\|=\sqrt{\langle a,a\rangle}$$

#### HallsofIvy

Homework Helper
And, just to complicate things more, it is possible to have a metric (a 'distance function') that does not derive from any norm, or a norm that does not derive from any inner product.

L¹(A), the set of all functions such that $$\int_A |f(x)|dx$$ exists, forms a vector space with precisely that integral as norm. But it includes functions for which $$\int_A f^2(x)dx$$ does not exist. There is no inner product that will give that norm.
It is true that Cauchy Sequences of such functions still converge (in norm). This is a "Banach" space.
I believe (this is off the top of my head- I haven't see the phrase in years) that a complete (Cauchy sequences converge) metric space is called a "Frechet" space.

#### jcsd

Gold Member
I must admit I was more than a little bit lazy introducing inner products, norms, and metrics and how an inner product may define a norm which induces a metric.

I think the main points in functional analysis are to understand how a set of functions can form a vector space and also, tangentially (as it is true for many interesting cases), why a vector space may be infinite-dimensional.

I think generally a Cauchy-complete metric space is, in the modern parlance, simply known as a "complete metric space".

#### jwsiii

So if we were talking about a Fourier Series, would this be the basis for the Hilbert space:

{0.5, cos x, sin x, cos 2x, sin 2x, cos 3x, sin 3x, cos 4x, sin 4x, ... }

and this be the vector that represents the function?:

{a0, a1, b1, a2, b2, a3, b3, a4, b4, a5, b5, a6, b6, ...}

If you take the dot product of these two vectors you would get a Fourier Series wouldn't you?

#### HallsofIvy

Homework Helper
Properly normalized, yes.

#### jwsiii

How would you normalize it?

#### jcsd

Gold Member
jwsiii said:
So if we were talking about a Fourier Series, would this be the basis for the Hilbert space:
{0.5, cos x, sin x, cos 2x, sin 2x, cos 3x, sin 3x, cos 4x, sin 4x, ... }
and this be the vector that represents the function?:
{a0, a1, b1, a2, b2, a3, b3, a4, b4, a5, b5, a6, b6, ...}
If you take the dot product of these two vectors you would get a Fourier Series wouldn't you?

You can see that the basis functions for a Fourier series are e^(inx), with the Fourier coefficients being the components; these can be further broken down into a basis which includes sine and cosine functions. An inner product on a vector space V over a field K is a map V×V → K. A Fourier series represents a function (a vector), so the inner product will not be a Fourier series.

#### DrGreg

Gold Member
jwsiii said:
So if we were talking about a Fourier Series, would this be the basis for the Hilbert space:
{0.5, cos x, sin x, cos 2x, sin 2x, cos 3x, sin 3x, cos 4x, sin 4x, ... }
and this be the vector that represents the function?:
{a0, a1, b1, a2, b2, a3, b3, a4, b4, a5, b5, a6, b6, ...}
If you take the dot product of these two vectors you would get a Fourier Series wouldn't you?
I don't think you've quite grasped the concept.

You shouldn't really think of a vector as a sequence of components. Vectors are just things that you can add to each other and multiply by scalars. In the context of a Hilbert space of functions, the word "vector" means a function.

So cos t is one vector, sin t is another vector, and cos 2t is another. The expression

{0.5, cos t, sin t, cos 2t, ...}

is not a vector; it is a set of basis vectors. It plays the same role in this example as does the set {i, j, k} in a 3D vector space.

The Fourier series formula

$$f(t) = \frac{1}{2} a_0 + a_1 \cos t + b_1 \sin t + a_2 \cos 2t + ...$$

is a Hilbert space equivalent of the 3D equation

$$\textbf r = x \textbf{i} + y \textbf{j} + z \textbf{k}$$

The Fourier series is not an inner product. In fact $a_n$ is the inner product between the vector f(t) and the vector $\frac{1}{\pi} \cos n t$ (in the same way that x is the inner product between r and i).

"Normalizing" means multiplying each basis vector by a scalar so that its length is equal to 1.
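To see the coefficients-as-inner-products point numerically, here is a quick Python sketch. The test function f is a hypothetical example, and the integral is approximated by the trapezoid rule over one period:

```python
import math

def inner(f, g, n=20001):
    """Approximate <f,g> = ∫_{-π}^{π} f(t) g(t) dt by the trapezoid rule."""
    h = 2 * math.pi / (n - 1)
    ys = [f(-math.pi + i * h) * g(-math.pi + i * h) for i in range(n)]
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

# A made-up test function with a known Fourier series: a_2 = 1, b_3 = 0.5.
f = lambda t: math.cos(2 * t) + 0.5 * math.sin(3 * t)

# Each coefficient is recovered as an inner product with the basis function,
# divided by π (the squared length of cos nt and sin nt on [-π, π]).
a2 = inner(f, lambda t: math.cos(2 * t)) / math.pi
b3 = inner(f, lambda t: math.sin(3 * t)) / math.pi

print(a2, b3)  # ≈ 1.0 and 0.5
```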


#### Lonewolf

HallsofIvy said:
I believe (this is off the top of my head- I haven't see the phrase in years) that a complete (Cauchy sequences converge) metric space is called a "Frechet" space.
Going off topic slightly, a Frechet space is a locally convex topological linear space whose topology can be induced by a countable set of semi-norms, and the metric space is complete. Better to call a Cauchy-complete space just a complete metric space.
