
Simple Extension of a Function

  1. Nov 8, 2008 #1
    Suppose we have a function f from R to R that is continuous on (a,b]. Define g by g(x) = f(x) if x <> a and g(a) = lim f(x) as x approaches a. Is it true that g is continuous on [a,b]?

    I would think it is, but I'm having a hard time proving it. I'm trying to use sequences to do this: Suppose {s_n} is a sequence in [a,b] that converges to s.

    If s <> a, then {s_n} contains only finitely many a's, so there exists an M such that s_n <> a for all n > M. Thus the tail {s_n : n > M} contains no a's, so g(s_n) = f(s_n) on that tail, and {f(s_n)} converges to f(s) = g(s) by the continuity of f; i.e. for any e > 0 there is an N > M such that |g(s_n) - g(s)| < e for all n > N. Hence g is continuous at every s <> a.

    Now suppose s = a. If {s_n} contains finitely many a's, then I can use the same strategy as above to conclude that {g(s_n)} converges to g(a). If {s_n} contains infinitely many a's, then surely there's a subsequence {s_m} of {s_n} such that {g(s_m)} converges to g(a). How can I extend this so that the whole sequence {g(s_n)} converges to g(a)?
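A quick numerical sanity check of the setup (a hypothetical concrete instance, not from the thread: a = 0, b = 1, and f(x) = sin(x)/x on (0, 1], whose right-hand limit at 0 is 1):

```python
import math

# Hypothetical concrete instance of the setup: a = 0, b = 1,
# f(x) = sin(x)/x on (0, 1]; its limit as x -> 0+ is 1.
def g(x):
    if x == 0:
        return 1.0               # g(a) := lim_{x -> a+} f(x)
    return math.sin(x) / x       # g(x) = f(x) for x != a

# A sequence in [0, 1] converging to 0 that equals a = 0 infinitely often:
# 1, 0, 1/2, 0, 1/3, 0, ...
seq = []
for n in range(1, 20):
    seq.append(1.0 / n)
    seq.append(0.0)

values = [g(x) for x in seq]
print(values[-1])                # -> 1.0
```

Even though the sequence keeps revisiting a = 0 (where g(s_n) = g(0) exactly), the remaining terms g(1/n) = f(1/n) also approach g(0) = 1, so the mixed sequence {g(s_n)} converges to g(0).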
  3. Nov 9, 2008 #2



    Assuming that limit exists, g will be continuous from the right at x = a. Of course, you can't say anything about what happens when x < a.

    Also: in order to define g(a) to be [itex]\lim_{x\rightarrow a} f(x)[/itex], that limit must exist, and you never stated that assumption. If {x_n} is a sequence of numbers in (a,b] converging to a, then [itex]\lim_{n\rightarrow \infty}g(x_n)= \lim_{n\rightarrow\infty}f(x_n)[/itex] and, by definition, that last limit is g(a). That's all you need. I have no idea why you are even worrying about how many a's such a sequence must have.
  4. Nov 9, 2008 #3
    Oops. I forgot to state that. Let's assume the limit exists.

    But what happens if the sequence is in [a,b]? I want to state that if {x_n} is a sequence in [a,b], then {g(x_n)} converges to g(a). Is this not true?
  5. Nov 9, 2008 #4



    Just after I wrote that, it finally dawned on me that you were considering the case where the sequence contains a itself. No problem. If a occurs only a finite number of times, there is no difficulty: you can discard those terms. If a appears an infinite number of times, look at the two subsequences: the constant sequence of all a's (on which g is constantly g(a)) and the sequence with the a's discarded. Obviously the first sequence converges to g(a), and for the other sequence you can use the fact that g(x_n) = g(x_n) and argue as I said.
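The two-subsequence argument can be written out as follows (a sketch, assuming the one-sided limit exists and that the set of indices with x_n ≠ a is infinite; if it is finite, the tail of the sequence is constantly a and there is nothing to prove):

```latex
\textbf{Claim.} If $\{x_n\} \subseteq [a,b]$ and $x_n \to a$, then $g(x_n) \to g(a)$.

\textbf{Sketch.} Fix $\varepsilon > 0$ and split the indices into
$J = \{n : x_n = a\}$ and $K = \{n : x_n \neq a\}$.

For $n \in J$: \quad $|g(x_n) - g(a)| = 0 < \varepsilon$.

For $n \in K$: \quad $x_n \in (a,b]$ and $x_n \to a$, so
$g(x_n) = f(x_n) \to \lim_{x \to a^+} f(x) = g(a)$;
hence there is an $N$ with $|g(x_n) - g(a)| < \varepsilon$ for all $n \in K$, $n > N$.

Combining the two cases, $|g(x_n) - g(a)| < \varepsilon$ for every $n > N$,
so $g(x_n) \to g(a)$.
```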
  6. Nov 9, 2008 #5
    I think you meant to write g(x_n) = f(x_n).

    But what if the sequence with the a's discarded isn't an infinite sequence anymore? I guess I would argue as follows: if {x_n} has only finitely many terms not equal to a, then {g(x_n)} is eventually constantly g(a), so it converges to g(a). If {x_n} has infinitely many terms not equal to a, then I would argue as you said.
  7. Nov 9, 2008 #6
    I think I misunderstood what you said. Are you saying that if {x_n} has infinitely many a's, then I can break it up into two subsequences: one with all a's, say {x_j}, and the other with no a's, say {x_k}? And that since both {g(x_j)} and {g(x_k)} converge to g(a), the original sequence must converge to g(a)?
  8. Nov 9, 2008 #7
    Nevermind. I've figured it out. Thanks for the help.