f: [0,∞) -> R is continuous at every point of its domain

  1. Problem:
    Assume that f: [0,∞) -> R is continuous at every point of its domain. Show that if there exists b > 0 such that f is uniformly continuous on the set [b,∞), then f is uniformly continuous on [0,∞).

    I don't really know where to start with this one; any help would be greatly appreciated.
     
  2. mathman

    Re: Continuity

    The main point would be to prove that if f is continuous on [0,b], then it is uniformly continuous on that interval; combined with the assumed uniform continuity on [b,∞), that gives uniform continuity on all of [0,∞).
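
    One common way to do the final gluing step (a sketch, assuming the standard ε-δ definition of uniform continuity): given ε > 0, take δ_1 for ε/2 from uniform continuity on [0,b] and δ_2 for ε/2 from uniform continuity on [b,∞), and set δ = min(δ_1, δ_2). If |x - y| < δ with x ≤ y, then either both points lie in [0,b], or both lie in [b,∞), or x < b < y; in the last case |x - b| < δ and |b - y| < δ, so

    \[
    |f(x) - f(y)| \le |f(x) - f(b)| + |f(b) - f(y)| < \tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2} = \varepsilon.
    \]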
     
  3. Re: Continuity

    As mathman suggested: f is continuous on [0,∞) by hypothesis, and f is uniformly continuous on [b,∞) for some b > 0, again by hypothesis. So to show that f is uniformly continuous on [0,∞), all you need to do is show that the continuity of f on [0,b] implies the uniform continuity of f on that interval.

    Actually, there is a more general theorem here: if f: A -> R is continuous on A, with A compact, then f is uniformly continuous on A. Have you proved this theorem in your course? If yes, use it; if not, then prove it. The theorem can be proved by exploiting compactness, or without using it at all, so if you don't know what compact means, there is a way around it. But compactness makes things easier in this case, since every closed and bounded subset of R is compact (the Heine-Borel theorem), which for your problem means [0,b] is compact.
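
    For reference, one standard formulation of that theorem (often called the Heine-Cantor theorem), together with a proof sketch that does use compactness; your course may state it slightly differently:

    \[
    \text{If } K \subseteq \mathbb{R} \text{ is compact and } f : K \to \mathbb{R} \text{ is continuous, then } f \text{ is uniformly continuous on } K.
    \]

    Sketch: given ε > 0, continuity supplies, for each x in K, a δ_x > 0 with |f(y) - f(x)| < ε/2 whenever |y - x| < δ_x. The intervals (x - δ_x/2, x + δ_x/2) form an open cover of K, so compactness yields finitely many points x_1, ..., x_n whose intervals still cover K; set δ = min_i δ_{x_i}/2. If |u - v| < δ, then u lies within δ_{x_i}/2 of some x_i, hence v lies within δ_{x_i} of x_i, and so |f(u) - f(v)| ≤ |f(u) - f(x_i)| + |f(x_i) - f(v)| < ε.
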
    As I said, if you haven't learned compactness, you can prove it by contradiction. That is, start by assuming that even though f is continuous on [a,b] (let's take the more general case), it is not uniformly continuous on [a,b]. What does this mean in terms of the definition of uniform continuity? Try to generate two sequences a_n and b_n in [a,b] with |a_n - b_n| < 1/n but |f(a_n) - f(b_n)| >= e for some fixed e > 0, and work towards a contradiction along those lines; a sketch follows below.
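
    One way that contradiction can be reached (a sketch using the Bolzano-Weierstrass theorem; a standard route, though not necessarily the one intended above): negating uniform continuity gives a fixed ε > 0 such that for every n there exist a_n, b_n in [a,b] with |a_n - b_n| < 1/n but |f(a_n) - f(b_n)| ≥ ε. Since [a,b] is bounded, Bolzano-Weierstrass gives a convergent subsequence a_{n_k} → c, and c ∈ [a,b] because the interval is closed; from |a_{n_k} - b_{n_k}| < 1/n_k → 0 we get b_{n_k} → c as well. Continuity of f at c then forces

    \[
    |f(a_{n_k}) - f(b_{n_k})| \le |f(a_{n_k}) - f(c)| + |f(c) - f(b_{n_k})| \longrightarrow 0,
    \]

    which contradicts |f(a_{n_k}) - f(b_{n_k})| ≥ ε for all k.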

    This theorem (proved without using the notion of compactness) was part of one of my projects for Honors Real Analysis, so if you're interested, let me know and I can email the whole project to you. But try it first on your own.

    Cheers!
     