Is the Subset A Closed in C([0,1]) with the Given Metric?

r4nd0m
This seems to be a very easy exercise, but I am completely stuck:
Prove that in C([0,1]) with the metric
\rho(f,g) = (\int_0^1|f(x)-g(x)|^2 dx)^{1/2}

the subset
A = \{f \in C([0,1]) ; \int_0^1 f(x)\,dx = 0\} is closed.

I tried to show that the complement of A is open. This could easily be done if the metric were \rho(f,g) = \sup_{x \in [0,1]}|f(x)-g(x)|, but with the integral metric it's not that easy.

Am I missing something?
Thanks for any help.
 
I would attempt to show that the map f \mapsto \int_0^1 f(x)\,dx is continuous.
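To spell out why that would be enough (my own addition, writing I(f) = \int_0^1 f(x)\,dx for the map above): if I is continuous, then
A = I^{-1}(\{0\}),
the preimage of the closed set \{0\} \subset \mathbb{R} under a continuous map, and such preimages are closed.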

 
Consider: if
\left| \int_0^1 g(x)\,dx \right| = \epsilon > 0,
then you might be able to show that
N_{\epsilon}(g) \cap A = \emptyset.
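A sketch of how that could go (not from the original post, and using the bound \left|\int_0^1 h(x)\,dx\right| \le \rho(h,0) established later in the thread): if f \in A, then
\rho(f,g) \ge \left|\int_0^1 (g(x)-f(x))\,dx\right| = \left|\int_0^1 g(x)\,dx\right| = \epsilon,
so the open ball N_{\epsilon}(g) misses A, and the complement of A is open.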
 
Well, I tried both, but the problem is that I'm still missing some kind of inequality that I could use.

I mean - if I want to show continuity, for example - I have to show that:
\forall \varepsilon > 0 \quad \exists \delta > 0 \quad \forall g \in C([0,1]) : \rho(f,g) < \delta \implies \left|\int_0^1 (f(x)-g(x))\,dx\right| < \varepsilon.

But what then?
\left|\int_0^1 (f(x) - g(x))\,dx\right| \leq \int_0^1 |f(x) - g(x)|\,dx,
but I'm missing another inequality that would let me compare this with \rho(f,g).
 
Well, there is another inequality lying around, so use it. The right-hand side above is also \leq \rho(f,g).
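To make the hint explicit (this is the Cauchy-Schwarz step identified in the next post, applied to |f-g| and the constant function 1):
\int_0^1 |f(x)-g(x)| \cdot 1\,dx \le \left(\int_0^1 |f(x)-g(x)|^2\,dx\right)^{1/2} \left(\int_0^1 1^2\,dx\right)^{1/2} = \rho(f,g).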
 
Of course :rolleyes: - a slightly modified Cauchy-Schwarz inequality is the key.
I hate algebraic tricks :smile:
Thanks for the help.
 
It's definitely not an algebraic trick. It is an application of Jensen's inequality from analysis. You might know it from probability theory, since it just states that the variance of a random variable is nonnegative, i.e.

E(X^2) \geq E(X)^2

where E is the expectation operator and X is a random variable.
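Spelling out the connection (my own reading): Lebesgue measure on [0,1] is a probability measure, so Jensen's inequality with the convex function \varphi(t) = t^2 gives
\left(\int_0^1 h(x)\,dx\right)^2 \le \int_0^1 h(x)^2\,dx
for h \in C([0,1]); taking h = f - g and square roots yields exactly the bound \left|\int_0^1 (f(x)-g(x))\,dx\right| \le \rho(f,g) needed above.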
 
How did you prove it for the supremum case? If you can prove it for the supremum metric, this proof is self-contained, because the given metric is always at most the supremum metric.
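For reference, the comparison claimed here is immediate (a quick check, not from the original post):
\rho(f,g)^2 = \int_0^1 |f(x)-g(x)|^2\,dx \le \sup_{x \in [0,1]} |f(x)-g(x)|^2,
so \rho(f,g) \le \sup_{x \in [0,1]} |f(x)-g(x)|.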
 