# Uniformly continuous functions and the distance between sets

Homework Statement

Let ##f: (X,d) \to (Y,d')## be a uniformly continuous function, and let ##A, B \subseteq X## be non-empty sets such that ##d(A,B)=0##. Prove that ##d'(f(A),f(B))=0##.

I've been thinking about this exercise, but I don't have any idea how or where to start. Could someone give me a hint?

jbunniii
Homework Helper
Gold Member
Maybe start by stating the definition of a uniformly continuous function, and the definition of the distance between two sets.
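For reference, the two standard definitions (with the infimum form of the set distance, which the rest of the thread uses) can be written as:

```latex
% f : (X,d) \to (Y,d') is uniformly continuous:
\forall \epsilon > 0 \ \exists \delta > 0 \ \forall x, y \in X :
    d(x, y) < \delta \implies d'(f(x), f(y)) < \epsilon .

% Distance between non-empty sets A, B \subseteq X:
d(A, B) = \inf \{\, d(x, y) : x \in A,\ y \in B \,\} .
```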

pasmith
Homework Helper
Homework Statement

Let ##f: (X,d) \to (Y,d')## be a uniformly continuous function, and let ##A, B \subseteq X## be non-empty sets such that ##d(A,B)=0##. Prove that ##d'(f(A),f(B))=0##.

I've been thinking about this exercise, but I don't have any idea how or where to start. Could someone give me a hint?

You need to show that for all $\epsilon > 0$ there exist $x \in A$ and $y \in B$ such that $d'(f(x),f(y)) < \epsilon$.
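As a sanity check (not a proof), this condition can be explored numerically. The sets ##A##, ##B## and the map ##f## below are my own hypothetical choices, not from the exercise:

```python
import math

# Hypothetical example sets: A = {1 - 1/n} and B = {1 + 1/n} for
# n = 1..1000, which approximate sets with d(A, B) = 0.
A = [1 - 1/n for n in range(1, 1001)]
B = [1 + 1/n for n in range(1, 1001)]

# sqrt is uniformly continuous on [0, oo), so the theorem applies.
f = math.sqrt

# Smallest pairwise gap between the (finite) sets and between their images.
d_AB = min(abs(x - y) for x in A for y in B)
d_fA_fB = min(abs(f(x) - f(y)) for x in A for y in B)

print(d_AB)     # small: the sets nearly touch at 1
print(d_fA_fB)  # small: the images nearly touch at f(1) = 1
```

Both minima shrink toward 0 as more points are sampled, matching the claim that for every ##\epsilon > 0## suitable ##x, y## exist.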

Ok, I see it. I want to prove that if ##d'(f(A),f(B))≠0## but ##d(A,B)=0## then f is not uniformly continuous.

Assume ##d'(f(A),f(B))≠0## and ##d(A,B)=0##. By hypothesis, ##d(A,B)=0##, which means that for every ##δ>0## there exist ##x \in A## and ##y \in B## such that ##d(x,y)<δ##. But the distance between the images of the sets ##A## and ##B## is not 0, so there exists ##ε>0## such that ##d'(f(x),f(y))≥ε## for all ##x \in A## and ##y \in B##. It follows that ##f## is not uniformly continuous.

pasmith
Homework Helper
Ok, I see it. I want to prove that if ##d'(f(A),f(B))≠0## but ##d(A,B)=0## then f is not uniformly continuous.

Are you sure you want to prove that? I think it's equivalent to the proposition you are asked to prove, but it's not obviously so.

By hypothesis, ##d(A,B)=0##, which means that for every ##δ>0## there exist ##x \in A## and ##y \in B## such that ##d(x,y)<δ##.

This is correct.

But the distance between the images of the sets ##A## and ##B## is not 0, so there exists ##ε>0## such that ##d'(f(x),f(y))≥ε## for all ##x \in A## and ##y \in B##.

This is true.

It follows that f is not uniformly continuous.

You have so far that

"There exists $\epsilon > 0$ such that for all $\delta > 0$ there exist $x \in A \subset X$ and $y \in B \subset X$ such that $d(x,y) < \delta$ and $d'(f(x),f(y)) \geq \epsilon$."

That is indeed the negation of the definition of uniform continuity, so you have shown that if $d(A,B) = 0$ and $d'(f(A),f(B)) > 0$ then $f$ is not uniformly continuous.

But it's an excessively convoluted and not at all obvious proof of the result you were asked to prove.

There is an easier way. These exercises usually solve themselves if you just write down the formal definitions of the concepts involved and string them together in the right order. The trick is to find the right order.

First let's think about what we want to prove. From the definition of $d'(f(A),f(B))$ we see that we want to prove that the greatest lower bound of $\{d'(f(x),f(y)) : x \in A, y \in B\}$ is zero.

Now zero is trivially a lower bound, because metrics are by definition non-negative. So all we really need to show is that there is no greater lower bound. In other words, we must show that for all $\epsilon > 0$ there exist $x \in A$ and $y \in B$ such that $d'(f(x),f(y)) < \epsilon$, so that $\epsilon$ is not a lower bound.
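In symbols, the reduction just described is:

```latex
d'(f(A), f(B)) = \inf \{\, d'(f(x), f(y)) : x \in A,\ y \in B \,\} = 0
\iff
\forall \epsilon > 0 \ \exists x \in A,\ y \in B :\ d'(f(x), f(y)) < \epsilon .
```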

That statement includes the formula "$d'(f(x),f(y)) < \epsilon$", which suggests that the definition of uniform continuity is a good place to start.

As you said, 0 is always a lower bound by the definition of a metric. So suppose 0 is not ##\inf\{d'(f(x),f(y)) : x \in A, y \in B\}##. Then there is ##β>0## such that ##β≤d'(f(x),f(y))## for all ##x \in A## and ##y \in B##. The function is uniformly continuous, which means that for every ##ε>0## there is some ##δ_ε>0## such that ##d(x,y)<δ_ε## implies ##d'(f(x),f(y))<ε##. Take ##ε=β##. We know that ##d(A,B)=0##, so in particular there exist ##x \in A## and ##y \in B## with ##d(x,y)<δ_β##. But then ##β≤d'(f(x),f(y))<β##, which is absurd. The contradiction comes from the assumption that 0 was not the greatest lower bound. It follows that ##0=\inf\{d'(f(x),f(y)) : x \in A, y \in B\}##, and by definition this is the distance between the sets ##f(A)## and ##f(B)##.

Is this ok? Thanks for your help!
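For comparison, the direct argument pasmith was pointing at uses the same ingredients without a contradiction; a sketch:

```latex
% Fix \epsilon > 0. Uniform continuity of f gives \delta > 0 such that
d(x, y) < \delta \implies d'(f(x), f(y)) < \epsilon .
% Since d(A, B) = 0 < \delta, some x \in A, y \in B satisfy d(x, y) < \delta,
% hence d'(f(x), f(y)) < \epsilon. As \epsilon > 0 was arbitrary,
d'(f(A), f(B)) = \inf \{\, d'(f(x), f(y)) : x \in A,\ y \in B \,\} = 0 .
```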