
## Homework Statement

The problem (Spivak's Calculus, chapter 6, problem 17):

"Let ##f## be a function with the property that every point of discontinuity is a removable discontinuity. This means that ##\underset {y \rightarrow x} {\lim} {f(y)}## exists for all ##x##, but ##f## may be discontinuous at some (even infinitely many) numbers ##x##. Define ##g(x)=\underset {y \rightarrow x} {\lim} {f(y)}##. Prove that ##g## is continuous."

After struggling to articulate a proof, I checked the answer, which I understood except for the last part:

"Since ##g(a)=\underset {y \rightarrow a} {\lim} {f(y)}##, by definition, it follows that for every ##\epsilon>0## there is a ##\delta>0## such that ##|f(y)-g(a)|<\epsilon## for ##|y-a|<\delta##. This means that

##g(a)-\epsilon<f(y)<g(a)+\epsilon## for ##|y-a|<\delta##."

I understand everything in this part, the next part is where I struggle:

"So if ##|x-a|<\delta##, we have

##g(a)-\epsilon≤\underset {y \rightarrow x} {\lim} f(y)≤g(a)+\epsilon##

which shows that ##|g(x)-g(a)|≤\epsilon## for all ##x## satisfying ##|x-a|<\delta##. Thus ##g## is continuous at ##a##."

I don't understand how he made the leap from ##g(a)-\epsilon<f(y)<g(a)+\epsilon## for ##0<|y-a|<\delta## to ##g(a)-\epsilon \leq \underset {y \rightarrow x} {\lim} f(y) \leq g(a)+\epsilon## for ##0<|x-a|<\delta##. It makes intuitive sense: if the values of ##f## on the interval all lie within these bounds, then the limit of ##f## at any point ##x## of the interval should also lie within the bounds, since it is determined by values of ##f## at points arbitrarily close to ##x##, which are themselves within the bounds. But I can't find a rigorous proof of it.
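In case it helps, here is the precise statement I believe I need, written out in my own words (this is my phrasing, not the book's): if ##L = \underset {y \rightarrow x} {\lim} f(y)## exists and there are numbers ##m, M## with

##m<f(y)<M## for all ##y## with ##0<|y-x|<\eta##,

then ##m \leq L \leq M##. Note that the conclusion can only involve non-strict inequalities: for example, ##f(y)=|y|## satisfies ##0<f(y)<1## for ##0<|y|<1##, yet ##\underset {y \rightarrow 0} {\lim} f(y) = 0##, which sits on the boundary.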

## Homework Equations

Continuity definition: ##\underset {x \rightarrow a} {\lim} {f(x)} = f(a)##
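Equivalently, in ##\epsilon##-##\delta## form: ##f## is continuous at ##a## if for every ##\epsilon>0## there is a ##\delta>0## such that ##|f(x)-f(a)|<\epsilon## whenever ##|x-a|<\delta##.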

## The Attempt at a Solution

After racking my brain over this for several hours, I eventually came up with a stupidly long proof, which is not even complete: it relies on the claim that a bounded function always attains a least value ##f(x')## on an interval ##b_1 \leq x \leq b_2##, meaning ##f(x') \leq f(x)## for all ##x## with ##b_1 \leq x \leq b_2##, a claim I have not proven and probably can't prove at this point. Even if I did, I am pretty sure there is a much simpler explanation, considering that Spivak allowed himself to skip this step entirely. It's important to note that at this point the book has only covered limits and the basic ideas of continuity: no derivatives, integrals, series, or anything of the sort.
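(As an aside, I suspect this least-value claim is not even true for arbitrary bounded functions. For example, the function with ##f(0)=1## and ##f(x)=x## for ##0<x \leq 1## is bounded on ##0 \leq x \leq 1## but has no least value there, since its values get arbitrarily close to ##0## without ever reaching it.)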

Any help on the matter would be greatly appreciated.