Real Analysis Homework: Show Uniform Continuity & Density in R

  • Thread starter: MIT2014

Homework Help Overview

The problem involves demonstrating uniform continuity of a function defined in terms of a set X in the real numbers, as well as establishing a condition for the density of X in R. The subject area is real analysis, focusing on properties of functions and set density.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the definition of uniform continuity and explore how to manipulate the expression for |f(x) - f(a)|. There is also a consideration of how the density of the set X relates to the behavior of the function f.

Discussion Status

Some participants have provided insights into the proof for uniform continuity, suggesting specific approaches and reasoning. Others are exploring the implications of density in R and how to formalize their ideas into a proof, indicating a productive exchange of ideas without reaching a consensus.

Contextual Notes

Participants express uncertainty about the formal proof structure and the implications of the definitions involved, highlighting the need for clarity in the assumptions regarding the set X.

MIT2014

Homework Statement


Let X[tex]\subset[/tex]R be nonempty. Let f:R[tex]\rightarrow[/tex]R be defined by [tex]f(x)=\inf_{t\in X}|t-x|[/tex].

Show:
1. f is uniformly continuous
2. f[tex]\equiv[/tex]0 if and only if X is dense in R

Homework Equations


none


The Attempt at a Solution


I am clueless as to how to go about this. Help!
 
For the first part, you'll need to show

[tex]\forall \epsilon >0: \exists \delta>0: \forall x,a: |x-a|<\delta~\Rightarrow |f(x)-f(a)|<\epsilon[/tex]

So we'll need to estimate |f(x)-f(a)|. This is what we do (make sure you understand all steps): for every [tex]t\in X[/tex], the triangle inequality gives

[tex]|t-x|\leq |t-a|+|a-x|[/tex]

Taking the infimum over [tex]t\in X[/tex] on both sides yields [tex]f(x)\leq f(a)+|x-a|[/tex], and swapping the roles of x and a gives [tex]f(a)\leq f(x)+|x-a|[/tex]. Together these say

[tex]|f(x)-f(a)|\leq |x-a|[/tex]

So take [tex]\delta=\epsilon[/tex]


Another way you can prove 1 is to note that each map [tex]x\mapsto |t-x|[/tex] is 1-Lipschitz, and a pointwise infimum of 1-Lipschitz functions is again 1-Lipschitz, hence uniformly continuous. Picking [tex]x_0\in X[/tex] and noting [tex]0\leq f(x)\leq |x-x_0|[/tex] shows in particular that f is finite everywhere...
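The Lipschitz bound [tex]|f(x)-f(a)|\leq |x-a|[/tex] is easy to check numerically. A minimal sketch, assuming a finite set X (so the infimum is just a minimum; the particular X and the sample range are arbitrary choices for illustration):

```python
import random

# An arbitrary finite X chosen for this sketch; for finite X the inf is a min.
X = [-2.0, 0.5, 1.0, 3.7]

def f(x):
    # f(x) = inf_{t in X} |t - x|
    return min(abs(t - x) for t in X)

# Spot-check the bound |f(x) - f(a)| <= |x - a| on random pairs;
# with delta = epsilon this is exactly the uniform-continuity estimate.
for _ in range(1000):
    x = random.uniform(-10.0, 10.0)
    a = random.uniform(-10.0, 10.0)
    assert abs(f(x) - f(a)) <= abs(x - a) + 1e-12
```

Of course this checks only one finite X at sampled points; the proof above is what covers the general case.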
 
Thanks for the answer! It makes a lot of sense.

For the second part, I have a general idea of how to solve the problem. If X is dense in R, then inf|t-x| will always be zero. However, how can this be expressed with a formal proof?
 
Take X dense and take x arbitrary. For every [tex]\epsilon>0[/tex], there exists t in X such that [tex]|t-x|<\epsilon[/tex]. Thus [tex]\inf_{t\in X}{|t-x|}<\epsilon[/tex]. This holds for every epsilon, and f is nonnegative, so the value must be zero. Conversely, if [tex]f\equiv 0[/tex], then for every x and every [tex]\epsilon>0[/tex] the infimum is less than [tex]\epsilon[/tex], so there is some t in X with [tex]|t-x|<\epsilon[/tex]; hence X is dense in R.
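One way to see part 2 concretely: for the grids [tex]X_n=\{k/n : k\in\mathbb{Z}\}[/tex], which fill out R as n grows, the corresponding [tex]f_n(x)[/tex] satisfies [tex]f_n(x)\leq \tfrac{1}{2n}\to 0[/tex]. A minimal sketch (the grids and the test point are arbitrary choices for illustration):

```python
def f_grid(x, n):
    # Distance from x to the nearest point of the grid {k/n : k in Z};
    # the nearest grid point is k/n with k = round(x * n).
    k = round(x * n)
    return abs(x - k / n)

x = 0.3141
dists = [f_grid(x, n) for n in (1, 10, 100, 1000)]
# Each distance is at most 1/(2n), so the values shrink toward 0
# as the grid gets finer -- the "approaching density" picture.
assert all(f_grid(x, n) <= 1 / (2 * n) for n in (1, 10, 100, 1000))
```

For a genuinely dense X (e.g. the rationals) the infimum is exactly 0 at every x, which is what the proof above formalizes.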
 
