Is a Continuously Converging Function on [0,∞) Uniformly Continuous?

  • Context: Graduate
  • Thread starter: A-ManESL
  • Tags: Continuous Function

Discussion Overview

The discussion revolves around the uniform continuity of a continuous function defined on the interval [0, ∞) that converges to 1 as x approaches infinity. Participants explore the implications of continuity and limits in the context of uniform continuity, particularly in relation to the properties of compact sets.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant proposes that if f:[0, ∞) → ℝ is continuous and converges to 1 as x approaches infinity, then it should be uniformly continuous.
  • Another participant points out that the earlier statement about compact sets does not apply since [0, ∞) is not compact.
  • A participant suggests that for large x, f(x) will be close to 1, implying a potential approach to uniform continuity.
  • There is a discussion about the definition of uniform continuity and how it might relate to the observed behavior of f as x increases.
  • One participant outlines a method to show uniform continuity by dividing the analysis into two cases: one for x values in a bounded interval and another for x values beyond a certain threshold.
  • The method involves finding a delta for a given epsilon and using properties of limits and continuity on compact intervals.

Areas of Agreement / Disagreement

The discussion converges on an affirmative answer: after the domain notation is clarified, one participant posts a complete epsilon-delta argument that combines the limit at infinity with uniform continuity on a compact interval.

Contextual Notes

Participants note the importance of understanding the proof that continuous functions on compact sets are uniformly continuous, suggesting that this knowledge is crucial for addressing the main question.

A-ManESL
Suppose [tex]f:[0,\infty]\rightarrow \mathbb{R}[/tex] is a continuous function such that [tex]\lim_{x\rightarrow \infty} f(x)=1[/tex]. I want to show that f is uniformly continuous. Thanks.
 
Every continuous function on a compact set is uniformly continuous.

Did you mean [itex][0, +\infty)[/itex]?
 
Yes I meant [tex]f:[0,\infty)\rightarrow \mathbb{R}[/tex]. Sorry for the mistake. Obviously [tex][0,\infty)[/tex] is not compact and so the above stated result doesn't apply.
 
P.S. I was being sneaky -- my question is also a hint.
 
Can you please be more explicit? All I can think of is that for large x, f(x) will be close to 1.
 
A-ManESL said:
Can you please be more explicit?
Well, I do have to give you a chance to see for yourself how to fit the ideas you've learned together. Otherwise, you won't learn!

If f really were a function on [itex][0, +\infty][/itex], the problem would be easy...



All I can think of is that for large x, f(x) will be close to 1.
Or, if you don't want to think my way, then we can think your way. What is the definition of uniform continuity? How might we be able to apply this fact you've observed?

p.s. do you understand the proof that any continuous function on a compact set is uniformly continuous? If not, then you really don't have much chance of getting this proof to work... It would definitely be worth your time to review that proof.
 
That's easy to show once you can use the theorem "f continuous on a compact set => f uniformly continuous", though there is a technical detail to stress.

Take eps > 0.

You have to find a delta > 0 such that for every pair of elements x1, x2 in [0, +infty) with | x1 - x2 | < delta you have |f(x1) - f(x2)| < eps.

From the definition of the limit, there exists a positive real M such that |f(x) - 1| < eps/2 for every x > M.

Now f is uniformly continuous on the compact interval [0, M+1], so there is a delta1 > 0 such that for every x1, x2 in [0, M+1] with | x1 - x2 | < delta1 we have |f(x1) - f(x2)| < eps.

Take delta = min(delta1, 1).

Since delta <= delta1, it follows immediately that for every x1, x2 in [0, M+1] with | x1 - x2 | < delta we get |f(x1) - f(x2)| < eps; note also that delta lies in (0, 1].



Ok. Now take x1, x2 in [0, +infty) with | x1 - x2 | < delta. If the larger of the two points lies in [0, M+1], then both do, and of course |f(x1) - f(x2)| < eps.
If the larger point lies in (M+1, +infty), then since | x1 - x2 | < delta <= 1, the smaller point lies in (M, +infty). In conclusion we have x1, x2 > M.

So | f(x1) - f(x2)| = | f(x1) - 1 + 1 - f(x2) | <= |f(x1) -1| + |1 - f(x2)| =
|f(x1) - 1| + |f(x2) -1| < eps/2 + eps/2 = eps.
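As a sanity check on the construction above (not part of the thread), here is a short Python sketch that instantiates the two-case argument for the hypothetical example f(x) = x/(x+1), which is continuous on [0, ∞) and tends to 1. For this f the constants M and delta1 can be computed analytically, so the script just verifies the final inequality on random close pairs:

```python
import random

# Hypothetical example: f(x) = x/(x+1) is continuous on [0, inf)
# and tends to 1 as x -> infinity.
def f(x):
    return x / (x + 1.0)

eps = 0.1

# Step 1 (limit): |f(x) - 1| = 1/(x+1) < eps/2 whenever x > M below.
M = 2.0 / eps - 1.0

# Step 2 (compact interval [0, M+1]): for this particular f,
# |f(x1) - f(x2)| <= |x1 - x2|, so delta1 = eps works analytically.
delta1 = eps

delta = min(delta1, 1.0)

# Step 3: spot-check |f(x1) - f(x2)| < eps on random pairs closer
# than delta, drawn from both regimes [0, M+1] and (M+1, +inf).
random.seed(0)
for _ in range(10000):
    x1 = random.uniform(0.0, 2 * M)
    x2 = max(0.0, x1 + random.uniform(-delta, delta))
    assert abs(f(x1) - f(x2)) < eps

print("all sampled pairs satisfy |f(x1) - f(x2)| < eps")
```

The script only illustrates the skeleton of the proof; in general delta1 comes from the compactness theorem rather than an explicit Lipschitz bound.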
 
