
Closed subset of a metric space

  May 17, 2007 #1
    This seems to be a very easy exercise, but I am completely stuck:
    Prove that in C([0,1]) with the metric
    [tex] \rho(f,g) = \left( \int_0^1 |f(x)-g(x)|^2 \, dx \right)^{1/2} [/tex]

    the subset
    [tex]A = \{f \in C([0,1]) : \int_0^1 f(x) \, dx = 0\}[/tex] is closed.

    I tried to show that the complement of A is open. It could easily be done if the metric were [tex] \rho(f,g) = \sup_{x \in [0,1]} |f(x)-g(x)| [/tex], but with the integral metric it's not that easy.
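    (There the estimate would be immediate:
    [tex] \left| \int_0^1 f(x)-g(x) \, dx \right| \leq \int_0^1 |f(x)-g(x)| \, dx \leq \sup_{x \in [0,1]} |f(x)-g(x)| = \rho(f,g) [/tex].)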

    Am I missing something?
    Thanks for any help.
     
    Last edited: May 17, 2007
  May 17, 2007 #2

    matt grime


    I would attempt to show that the map [tex] f \mapsto \int_0^1 f(x) \, dx [/tex] is continuous.
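    (Sketch of why continuity would suffice: if [tex] \Phi(f) = \int_0^1 f(x) \, dx [/tex] is continuous as a map from [tex]C([0,1])[/tex] to [tex]\mathbb{R}[/tex], then [tex] A = \Phi^{-1}(\{0\}) [/tex] is the preimage of the closed set [tex]\{0\}[/tex] under a continuous map, hence closed.)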

    And you need / not \ in your closing tex tags.
     
  May 17, 2007 #3

    NateTG


    Consider: if
    [tex]\left| \int_0^1 g(x) \, dx \right| = \epsilon > 0[/tex]
    then you might be able to show that
    [tex]N_{\epsilon}(g) \cap A = \emptyset[/tex]
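    (A sketch of how that could go: for any f with [tex]\rho(f,g) < \epsilon[/tex],
    [tex] \left| \int_0^1 f(x) \, dx \right| \geq \left| \int_0^1 g(x) \, dx \right| - \left| \int_0^1 f(x)-g(x) \, dx \right| \geq \epsilon - \rho(f,g) > 0, [/tex]
    granting the inequality [tex]\left| \int_0^1 f(x)-g(x) \, dx \right| \leq \rho(f,g)[/tex] established below, so [tex]f \notin A[/tex].)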
     
    Last edited: May 17, 2007
  May 17, 2007 #4
    Well, I tried both, but the problem is that I am still missing some inequality that I could use.

    I mean, if I want to show continuity, for example, I have to show that
    [tex] \forall \varepsilon > 0 \quad \exists \delta > 0 \quad \forall g \in C([0,1]) : \rho(f,g) < \delta \implies \left| \int_0^1 f(x)-g(x) \, dx \right| < \varepsilon. [/tex]

    But what then?
    [tex]\left| \int_0^1 f(x) - g(x) \, dx \right| \leq \int_0^1 |f(x) - g(x)| \, dx, [/tex]
    but I am missing some other inequality to compare this with [tex]\rho(f,g)[/tex].
     
  May 17, 2007 #5

    matt grime


    Well, there is another inequality lying around, so use it: that last integral is also [tex]\leq \rho(f,g)[/tex].
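    (Spelled out, this would be the Cauchy-Schwarz inequality applied to [tex]|f-g|[/tex] and the constant function 1:
    [tex] \int_0^1 |f(x)-g(x)| \cdot 1 \, dx \leq \left( \int_0^1 |f(x)-g(x)|^2 \, dx \right)^{1/2} \left( \int_0^1 1^2 \, dx \right)^{1/2} = \rho(f,g) [/tex].)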
     
  May 17, 2007 #6
    Of course :rolleyes: - a slightly modified Cauchy-Schwarz inequality is the key.
    I hate algebraic tricks :smile:
    Thanks for the help.
     
  May 18, 2007 #7

    matt grime


    It's definitely not an algebraic trick. It is an application of Jensen's inequality from analysis. You might know it from probability theory, since it just states that the variance of a random variable is nonnegative, i.e.

    E(X^2) >= E(X)^2

    where E is the expectation operator and X a random variable.
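    (In general, Jensen's inequality says that for a convex function [tex]\varphi[/tex] and a probability measure, [tex]\varphi[/tex] of the average is at most the average of [tex]\varphi[/tex]; with [tex]\varphi(t) = t^2[/tex] and Lebesgue measure on [0,1] this reads
    [tex] \left( \int_0^1 h(x) \, dx \right)^2 \leq \int_0^1 h(x)^2 \, dx, [/tex]
    which with [tex]h = |f-g|[/tex] gives the same bound [tex]\int_0^1 |f(x)-g(x)| \, dx \leq \rho(f,g)[/tex] as Cauchy-Schwarz.)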
     
  Oct 30, 2010 #8
    How did you prove it for the supremum case? If you can prove it for the supremum metric, the proof here is essentially the same, because the given metric is always at most the supremum metric.
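    (The comparison in question:
    [tex] \rho(f,g) = \left( \int_0^1 |f(x)-g(x)|^2 \, dx \right)^{1/2} \leq \left( \int_0^1 \left( \sup_{t \in [0,1]} |f(t)-g(t)| \right)^2 dx \right)^{1/2} = \sup_{t \in [0,1]} |f(t)-g(t)|. [/tex]
    Note, though, that this comparison alone only shows that sets closed in the given metric are closed in the sup metric; the Cauchy-Schwarz estimate above is what makes the argument work directly for [tex]\rho[/tex].)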
     