# Prove that the linear space is infinite dimensional


#### Hall

Homework Statement
Prove that the linear space of all sequences whose limit is zero is infinite dimensional. (And one more.)
Relevant Equations
Infinite dimension means having an infinite basis.
A space is infinite dimensional when its basis is infinite. But how can I ensure that the basis of the space of all sequences whose limit is zero is infinite?

(After solving that, I would like to have a hint on this very similar problem: let ##V## be the linear space of all continuous functions on the interval ##[-\pi, \pi]##. Let ##S## be the subset of ##V## that contains all those functions which satisfy:

1. $$\int_{-\pi}^{\pi} f dt =0$$
2. $$\int_{-\pi}^{\pi} f \cos t dt =0$$
3. $$\int_{-\pi}^{\pi} f \sin t dt =0$$

Prove that S is infinite dimensional.)

You don't need to find a basis which is infinite (what that even means is a little tricky). You just have to find an infinite sequence of linearly independent elements. For the set of sequences whose limit is 0, this is pretty easy actually - you have something that kind of looks like ##\mathbb{R}^{\infty}##.

Office_Shredder said:
You just have to find an infinite sequence of linearly independent elements
All right, that idea is clear to me.

Office_Shredder said:
For the set of sequences whose limit is 0, this is pretty easy actually - you have something that kind of looks like
I find myself unable to do that. Can you give me a little example?

Hall said:
All right, that idea is clear to me.

I find myself unable to do that. Can you give me a little example?

If I asked you to give me linearly independent elements of ##\mathbb{R}^n## for large n, what would you say?

Office_Shredder said:
If I asked you to give me linearly independent elements of ##\mathbb{R}^n## for large n, what would you say?
$$\{ (1,0,0,0,\cdots 0), (0,1,0,0\cdots 0), \cdots, (0,0,0,0, \cdots 1) \}$$

Hall said:
$$\{ (1,0,0,0,\cdots 0), (0,1,0,0\cdots 0), \cdots, (0,0,0,0, \cdots 1) \}$$
Can you make those into linearly independent sequences?

PeroK said:
Can you make those into linearly independent sequences?
I'm not very sure, but
$$1, 0, 0, 0, \cdots$$
$$0, 1, 0, 0, \cdots$$
##\cdots##
$$0, 0, \cdots, 0, 1, 0, \cdots$$

are sequences with limit zero, and they are independent (in the sense that no one of them can be produced as a linear combination of the others).

Hall said:
I'm not very sure, but
$$1, 0, 0, 0, \cdots$$
$$0, 1, 0, 0, \cdots$$
##\cdots##
$$0, 0, \cdots, 0, 1, 0, \cdots$$

are sequences with limit zero, and they are independent (in the sense that no one of them can be produced as a linear combination of the others).
So, that's ##n## linearly independent sequences. And, ##n## is arbitrary, therefore ...
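The ##\mathbb{R}^n## analogy can be sanity-checked numerically. A minimal sketch, assuming NumPy (the function name is mine, not from the thread): truncate the first ##n## "unit" sequences ##e_k## to their first ##m## terms and confirm the resulting matrix has full row rank ##n##.

```python
import numpy as np

def truncated_basis_sequences(n, m):
    """First n 'unit' sequences e_k (1 in position k, 0 elsewhere),
    truncated to their first m terms, as the rows of an n x m matrix."""
    E = np.zeros((n, m))
    for k in range(n):
        E[k, k] = 1.0
    return E

# Each e_k is eventually zero, so it certainly converges to 0, and the n of
# them are linearly independent: the matrix has full row rank.
E = truncated_basis_sequences(5, 10)
print(np.linalg.matrix_rank(E))  # -> 5
```

Since ##n## here is arbitrary, the same check goes through for any ##n##, which is the point of the argument.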

PeroK said:
So, that's ##n## linearly independent sequences. And, ##n## is arbitrary, therefore ...
Can we construct any sequence with limit zero from them?

(Can we move to the second problem?)

Hall said:
Can we construct any sequence with limit zero from them?
We don't need to try to prove that. We don't need to find a basis; simply an infinite set of linearly independent sequences.
Hall said:
(Can we move to the second problem?)
The key here must be harmonic functions.

Hall said:
Can we construct any sequence with limit zero from them?
The idea here is that if a vector space has finite dimension ##n##, then any set of vectors in it contains at most ##n## linearly independent vectors. Since you found an infinite set of independent vectors, what does that imply?
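That dimension bound is easy to illustrate numerically. A sketch, assuming NumPy, with three arbitrarily chosen vectors in ##\mathbb{R}^2##: they always admit a nontrivial vanishing linear combination.

```python
import numpy as np

# In R^2 (dimension 2), any 3 vectors are linearly dependent: the 2 x 3
# matrix with them as columns has rank at most 2, so a nonzero coefficient
# vector lies in its null space. (The particular vectors are arbitrary.)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
print(np.linalg.matrix_rank(A))  # -> 2

# The last right-singular vector spans the null space here.
_, _, Vt = np.linalg.svd(A)
c = Vt[-1]
print(np.allclose(A @ c, 0))  # -> True: a nontrivial dependence relation
```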

For the second problem, I went for the same idea, i.e. finding an infinite number of independent elements, but I could find only four kinds of functions that are independent and belong to ##S##: ##f=0##, ##f = t^{2n-1}##, ##f = \cos nx##, and ##f = \sin nx## (where ##n## is a natural number).

Hall said:
For the second problem, I went for the same idea, i.e. finding an infinite number of independent elements, but I could find only four kinds of functions that are independent and belong to ##S##: ##f=0##, ##f = t^{2n-1}##, ##f = \cos nx##, and ##f = \sin nx## (where ##n## is a natural number).
Can ##n## be any natural number?

PeroK said:
Can ##n## be any natural number?
Yes.

Hall said:
Yes.

So... How many choices of n are there?

How many different functions have you actually found?

Hall said:
Yes.
So, you have an infinite number of such functions. Can you prove that any finite subset of them is linearly independent?

Office_Shredder said:
So... How many choices of n are there?

How many different functions have you actually found?
PeroK said:
So, you have an infinite number of such functions. Can you prove that any finite subset of them is linearly independent?
##0##, ##t##, ##t^3##, ##t^5##, ##\cdots##, ##\cos x##, ##\cos 2x##, ##\cos 3x##, ##\cdots##, ##\sin x##, ##\sin 2x##, ##\cdots##

Oh! ##\cos 2x## is not a linear combination of ##\cos x## and ##\sin x##; it involves products of elements.

Yes, all of those trig functions are independent.

Since we can find as many independent elements in ##S## as we like, ##S## cannot have a basis with any fixed finite number of elements.
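A side note worth checking: not every candidate listed above actually satisfies the three integral conditions (for instance ##\int_{-\pi}^{\pi} t \sin t \, dt = 2\pi \ne 0## and ##\int_{-\pi}^{\pi} \cos^2 t \, dt = \pi \ne 0##), but the family ##\cos nx##, ##\sin nx## for ##n \ge 2## does lie in ##S##, and that is already an infinite family. A minimal numerical check, assuming NumPy (grid size and tolerances are my choices):

```python
import numpy as np

t = np.linspace(-np.pi, np.pi, 200001)
h = t[1] - t[0]

def integral(f):
    """Composite trapezoid rule for sampled values f on the grid above."""
    return np.sum((f[:-1] + f[1:]) / 2) * h

def in_S(f):
    """Check the three defining integral conditions of S numerically."""
    weights = (np.ones_like(t), np.cos(t), np.sin(t))
    return all(abs(integral(f * w)) < 1e-6 for w in weights)

print(in_S(np.cos(2 * t)))  # -> True: cos(nt), sin(nt) for n >= 2 are in S
print(in_S(np.sin(3 * t)))  # -> True
print(in_S(np.cos(t)))      # -> False: the integral of cos^2 is pi, not 0
print(in_S(t))              # -> False: the integral of t*sin(t) is 2*pi, not 0
```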

Proving these are independent is not easy. Just because ##\cos(2x)## can be expressed as a sum of products of ##\sin(x)## and ##\cos(x)## doesn't mean there isn't a cleverer way to write it as a linear combination.

The orthogonality conditions in the OP are actually the typical way to prove these are all linearly independent. For example:
If ##\cos(2x)=a\sin(x)+b\cos(x)##, we have
$$0<\int_{-\pi}^{\pi} \cos^2(2x) dx = a\int_{-\pi}^{\pi} \sin(x) \cos(2x)dx + b \int_{-\pi}^{\pi} \cos(x) \cos(2x) dx=0$$
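The two vanishing cross terms and the positive squared term in that chain can be confirmed by quadrature. A sketch, assuming NumPy (the grid size and tolerances are my choices):

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 200001)  # fine grid over [-pi, pi]
h = x[1] - x[0]

def integral(f):
    """Composite trapezoid rule for sampled values f on the grid above."""
    return np.sum((f[:-1] + f[1:]) / 2) * h

# The two cross terms vanish by orthogonality...
print(abs(integral(np.sin(x) * np.cos(2 * x))) < 1e-8)   # -> True
print(abs(integral(np.cos(x) * np.cos(2 * x))) < 1e-8)   # -> True
# ...while the squared term equals pi > 0, giving the contradiction 0 < pi = 0.
print(abs(integral(np.cos(2 * x) ** 2) - np.pi) < 1e-6)  # -> True
```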

What about an induction on ##n##? Differentiating each ##\cos(kx)## twice multiplies it by ##-k^2##, so:
$$A_1\cos(x) + A_2\cos(2x) + \dots + A_n\cos(nx) = 0$$$$\Rightarrow A_1\cos(x) + 4A_2\cos(2x) + \dots + n^2A_n\cos(nx) = 0$$
Multiplying the first identity by ##n^2## and subtracting the second eliminates the ##\cos(nx)## term:
$$\Rightarrow \ (n^2 - 1)A_1\cos(x) + (n^2 - 4)A_2\cos(2x) + \dots + (n^2 - (n-1)^2)A_{n-1}\cos((n-1)x) = 0$$
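The elimination step can be verified symbolically. A sketch, assuming SymPy, with ##n = 4## as an arbitrary illustration (the coefficient symbols are mine):

```python
import sympy as sp

x = sp.symbols('x')
n = 4  # small n chosen just to illustrate the induction step
A = sp.symbols(f'A1:{n + 1}')  # symbolic coefficients A1..A4

# S is the assumed vanishing combination A1*cos(x) + ... + An*cos(nx).
S = sum(A[k - 1] * sp.cos(k * x) for k in range(1, n + 1))

# Differentiating twice multiplies each cos(kx) term by -k**2.
S2 = sp.diff(S, x, 2)

# n^2 * S + S'' has coefficients (n^2 - k^2) * A_k, so the cos(nx) term
# drops out: exactly the elimination used in the induction step above.
combo = sp.expand(n**2 * S + S2)
print(combo)  # the cos(4x) term is gone
```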
