Why Continuous Functions Don't Preserve Cauchy Sequences

I like number

Homework Statement


Why is it that continuous functions do not necessarily preserve Cauchy sequences?


Homework Equations


Epsilon-delta definition of continuity
Sequential characterisation of continuity
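Written out, for reference:

```latex
% Epsilon-delta definition: f is continuous at x_0 if
\forall \epsilon > 0 \ \exists \delta > 0 : |x - x_0| < \delta \implies |f(x) - f(x_0)| < \epsilon
% Uniform continuity strengthens this: one \delta(\epsilon) must work for every x_0 in the domain.

% Sequential characterisation: f is continuous at x_0 if and only if
x_n \to x_0 \implies f(x_n) \to f(x_0) \quad \text{for every sequence } x_n \text{ in the domain}
```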


The Attempt at a Solution


I can't see why the proof that uniformly continuous functions preserve Cauchy sequences doesn't hold for 'normal' continuous functions, in particular for the example of ##f(x) = 1/x## on ##(0,1)##.
I have worked through the examples here
http://www.mathcs.org/analysis/reals/cont/answers/fcont3.html
and here
http://www.mathcs.org/analysis/reals/cont/answers/contuni4.html

where they address this issue directly, but I can't get my head around it.

I understand that if we have a Cauchy sequence converging to ##0##, then ##f(x_n)## is going to diverge to infinity, but I still can't see what the problem is.

Any explanation you can offer would be appreciated.

Kind regards
 


I like number said:
I understand that if we have a Cauchy sequence converging to ##0##, then ##f(x_n)## is going to diverge to infinity, but I still can't see what the problem is.

Recall that Cauchy sequences are bounded. So if ##\{f(x_n)\}_{n \in \mathbb{N}}## diverges, then it cannot be Cauchy. In particular, ##f## does not take Cauchy sequences to Cauchy sequences.
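If it helps, here is a quick numerical sketch of this (my own illustration, using ##x_n = 1/n##, which is Cauchy in ##(0,1)##):

```python
# Numerical sketch: x_n = 1/n is Cauchy in (0,1), but f(x_n) = 1/x_n = n
# keeps consecutive gaps of size 1 forever, so {f(x_n)} is unbounded, not Cauchy.

def f(x):
    return 1.0 / x

xs = [1.0 / n for n in range(1, 10001)]   # Cauchy sequence in (0,1), -> 0
fxs = [f(x) for x in xs]                  # f(x_n) = n, diverges

# Consecutive gaps |x_n - x_{n+1}| in the tail shrink toward 0 ...
print(max(abs(a - b) for a, b in zip(xs[5000:], xs[5001:])))    # ~4e-8

# ... but the gaps |f(x_n) - f(x_{n+1})| stay at 1, so {f(x_n)} cannot be Cauchy.
print(max(abs(a - b) for a, b in zip(fxs[5000:], fxs[5001:])))  # ~1.0
```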
 


The reason we need uniform continuity is that we need one ##\delta## for each ##\epsilon## that works for all ##x## in the domain at once. That is exactly what the proof uses: given a Cauchy sequence ##\{x_n\}## and ##\epsilon > 0##, first take the ##\delta## from uniform continuity, then take ##N## so that ##|x_n - x_m| < \delta## whenever ##n, m \geq N##. Then ##|f(x_n) - f(x_m)| < \epsilon## for all ##n, m \geq N##, so ##\{f(x_n)\}## is Cauchy.

With ordinary continuity, the ##\delta## is allowed to depend on the point, ##\delta = \delta(\epsilon, x)##, and the argument breaks at the first step: the Cauchy condition hands you a single tail of points close to each other, but there may be no single ##\delta## small enough for every point in that tail.

That is exactly what happens with ##f(x) = 1/x## on ##(0,1)##. Take ##x_n = 1/n##, which is Cauchy. Then ##|f(x_n) - f(x_{n+1})| = |n - (n+1)| = 1## for every ##n##, so ##\{f(x_n)\}## is not Cauchy: near ##0## the function is so steep that the ##\delta## you need at ##x_n## shrinks to ##0## as ##n \to \infty##.
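To see the shrinking ##\delta## concretely, here is a rough numerical sketch (my own illustration; ##\epsilon = 0.5## is an arbitrary choice). For ##f(x) = 1/x##, requiring ##1/y - 1/x < \epsilon## for ##0 < x - y < \delta## forces ##\delta \leq \epsilon x^2/(1 + \epsilon x)##, which goes to ##0## with ##x##:

```python
# For f(x) = 1/x on (0,1), the largest delta that works at a point x for a
# given eps is eps*x**2 / (1 + eps*x): we need y > x/(1 + eps*x) to keep
# 1/y - 1/x below eps, so delta = x - x/(1 + eps*x).

eps = 0.5  # arbitrary choice of epsilon for the demonstration

def max_delta(x, eps):
    # Largest delta with: 0 < |y - x| < delta  =>  |1/y - 1/x| < eps
    return eps * x**2 / (1 + eps * x)

for x in [0.5, 0.1, 0.01, 0.001]:
    print(f"x = {x:<6} largest workable delta = {max_delta(x, eps):.2e}")

# x = 0.5    largest workable delta = 1.00e-01
# x = 0.1    largest workable delta = 4.76e-03
# x = 0.01   largest workable delta = 4.98e-05
# x = 0.001  largest workable delta = 5.00e-07
#
# No positive delta works at every x simultaneously, which is exactly the
# failure of uniform continuity on (0,1).
```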
 


Thanks very much to you both.
I think I can see it more clearly now (and a good night's sleep always helps too!).
I'll keep playing around with these ideas, and if I have any more questions I'll be back.

Thanks again
 