# Metric Spaces

1. Feb 4, 2010

### kingwinner

"Given a metric space (X,ρ), define a new metric on X by σ(x,y)=min{ρ(x,y),1}. The reader can check that σ is indeed a metric on X.
Claim 1: xn->x in (X,ρ) iff xn->x in (X,σ).
Claim 2: {xn} is Cauchy in (X,ρ) iff it is Cauchy in (X,σ). (Hence completeness is the same for these two metrics.)"
=================================
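The quoted problem says the reader can check that σ is indeed a metric. As a quick numerical sanity check (not a proof), here is a sketch that tests the axioms on random sample points, assuming for illustration that ρ is the Euclidean distance on the plane:

```python
import itertools
import math
import random

def rho(p, q):
    # An example choice of rho: the Euclidean distance on the plane
    return math.hypot(p[0] - q[0], p[1] - q[1])

def sigma(p, q):
    # The truncated metric: sigma(x, y) = min(rho(x, y), 1)
    return min(rho(p, q), 1.0)

random.seed(0)
pts = [(random.uniform(-3, 3), random.uniform(-3, 3)) for _ in range(15)]

for x in pts:
    assert sigma(x, x) == 0.0                                 # sigma(x, x) = 0
for x, y, z in itertools.product(pts, repeat=3):
    assert sigma(x, y) == sigma(y, x)                         # symmetry
    assert sigma(x, z) <= sigma(x, y) + sigma(y, z) + 1e-12   # triangle inequality
print("all metric axioms hold on the sample")
```

Of course this only checks finitely many points; the actual proof of the triangle inequality uses min(ρ(x,z),1) ≤ min(ρ(x,y)+ρ(y,z),1) ≤ min(ρ(x,y),1)+min(ρ(y,z),1).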

Right now, I'm trying to prove claim 1.
xn->x in (X,ρ) means:
for all ε>0, there exists N s.t. if n>N, then ρ(xn,x)<ε

xn->x in (X,σ) means:
for all ε>0, there exists M s.t. if n>M, then σ(xn,x)<ε

Now if 0<ε<1, then ρ(xn,x)<ε <=> σ(xn,x)<ε, so taking N=M above works in the definition of limit. Am I correct so far?
But if ε>1, I think it is NOT true that ρ(xn,x)<ε <=> σ(xn,x)<ε, right? So what should I take N or M to be in the case when ε>1? What should I do for this case?

Thanks for any help!

2. Feb 4, 2010

Let ε > 1. By definition, σ(xn,x) = min{ρ(xn,x), 1} ≤ 1 < ε for every n, so σ(xn,x) < ε holds automatically; taking M = N works (indeed, any M works).
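For intuition: σ is capped at 1 by construction, so any ε > 1 is met by every term. A minimal sketch, with arbitrary nonnegative values standing in for ρ-distances:

```python
import random

def sigma_of(d):
    # sigma truncates any rho-distance d at 1
    return min(d, 1.0)

random.seed(1)
eps = 1.5  # any eps > 1
for _ in range(1000):
    d = random.uniform(0.0, 100.0)  # an arbitrary nonnegative rho-distance
    assert sigma_of(d) <= 1.0 < eps
print("sigma never exceeds 1, so sigma < eps is automatic for eps > 1")
```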

3. Feb 4, 2010

### kingwinner

So for this problem, to prove claim 1, we really need to consider two cases 0<ε<1, ε>1 separately?

thanks.

4. Feb 4, 2010

Yes, since the definition of σ(x,y) depends on these cases.

5. Feb 4, 2010

### kingwinner

OK, but how can we prove the other direction?

Let ε > 1. Then there exists M s.t. σ(xn,x) < 1.5 for all n > M. By definition, σ(x,y)=min{ρ(x,y),1}, but σ(xn,x) < 1.5 does NOT imply that ρ(xn,x) < 1.5. So how can we prove it?

Thanks!

6. Feb 4, 2010

### JSuarez

It's not necessary to consider the cases $\epsilon \ge 1$ and $\epsilon < 1$ separately in the first implication, only in the second; just note that $\sigma\left(x,y\right) \leq \rho\left(x,y\right)$ for all $x,y\in X$.

Then, if $x_n\rightarrow x$ in $\left(X,\rho\right)$, then the above inequality immediately implies that it also converges in $\left(X,\sigma\right)$.

On the other hand, if $x_n$ does not converge to $x$ in $\left(X,\rho\right)$, then there exist an $\epsilon>0$ and a subsequence $x_{n_k}$ with $\rho\left(x_{n_k},x\right) \ge \epsilon$ for all $k$. Passing to a further subsequence, one of two cases holds:

(a) $\rho\left(x_{n_k},x\right) \ge 1$ for all $k$; in this case, $\sigma\left(x_{n_k},x\right)= 1$ and we're done.

(b) $\epsilon \le \rho\left(x_{n_k},x\right) < 1$ for all $k$; in this case, $\sigma\left(x_{n_k},x\right) = \rho\left(x_{n_k},x\right) \ge \epsilon$, and we are, again, done.
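As an illustration of case (a), here is a small sketch, assuming for concreteness that ρ is the usual distance on ℝ, with a sequence alternating between 0 and 2:

```python
def rho(a, b):
    return abs(a - b)           # example rho: usual distance on the real line

def sigma(a, b):
    return min(rho(a, b), 1.0)  # the truncated metric

# x_n alternates between 0 and 2, so it does not converge to x = 0 in rho
seq = [0.0 if n % 2 == 0 else 2.0 for n in range(20)]
x = 0.0

# Along the subsequence of odd indices, rho(x_n, x) = 2 >= 1, so sigma(x_n, x) = 1
bad = [n for n in range(20) if rho(seq[n], x) >= 1]
assert all(sigma(seq[n], x) == 1.0 for n in bad)
print("sigma stays equal to 1 along the bad subsequence")
```

Since σ(x_n, x) = 1 along that subsequence, the sequence cannot converge to x in σ either.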

7. Feb 4, 2010

### kingwinner

Is it possible to prove directly from the definition that:
xn->x in (X,σ) => xn->x in (X,ρ)
for the case ε>1?

Thanks!

8. Feb 4, 2010

### JSuarez

Remember that a sequence fails to converge to a point iff it has a subsequence whose terms all stay at least some fixed $\epsilon > 0$ away from that point.

Yes. For $\epsilon > 1$, take as the index $n_0$ (the one such that, for all $n>n_0$, the terms of the sequence belong to the given neighborhood of the limit) the $n_0$ obtained for $\epsilon' = 1/\epsilon < 1$.

9. Feb 5, 2010

### kingwinner

hmm...what is n_0? And how can this help us prove that xn->x in (X,σ) => xn->x in (X,ρ) for the case ε>1? Could you please explain a little more?

Let ε > 1. Then there exists M s.t. σ(xn,x) < 1.5 for all n > M. By definition, σ(x,y)=min{ρ(x,y),1}, but σ(xn,x) < 1.5 does NOT imply that ρ(xn,x) < 1.5. For the case ε > 1, I'm facing troubles like this, so I'm not sure what to do...

Thanks for helping!

10. Feb 5, 2010

### JSuarez

It seems that you are stuck on a few points in the definition of convergence. When do we say that a sequence $x_n\in \left(X,d\right)$ converges to a point $x$? When the following is satisfied:

$$\forall \epsilon>0 \; \exists n_0 \in \mathbb N \; \forall n>n_0 \; \left(d(x_n,x)<\epsilon\right)$$

For this particular case, suppose that $x_n\rightarrow x$ in $\left(X,\sigma\right)$, and you want to prove that it also converges in $\left(X,\rho\right)$. Then, if you give me an $\epsilon = 3/2$ (for example), I have to come up with an $n_0$, such that the above is satisfied, that is, for all $n>n_0$, we must have $\rho\left(x_n,x\right)<\epsilon$.

How can I find this $n_0$? I take $\epsilon'=1/\epsilon<1$ and find the $n_0$ that satisfies the definition for the metric $\sigma$, where I know it exists by hypothesis. But for this $n_0$, we have, for all $n>n_0$, $\rho\left(x_n,x\right)=\sigma\left(x_n,x\right)<\epsilon'<\epsilon$.
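To make the trick concrete, here is a numerical sketch, assuming for illustration that ρ is the usual distance on ℝ and that the sequence is x_n = 1/n converging to 0:

```python
def rho(a, b):
    return abs(a - b)           # example rho: usual distance on the real line

def sigma(a, b):
    return min(rho(a, b), 1.0)  # the truncated metric

# Example sequence x_n = 1/n, converging to x = 0 in both metrics
seq = [1.0 / n for n in range(1, 200)]
x = 0.0

eps = 1.5               # the epsilon > 1 we are handed for rho
eps_prime = 1.0 / eps   # the trick: eps' = 1/eps < 1

# n0 from the sigma-definition of convergence, applied at eps'
n0 = next(i for i in range(len(seq))
          if all(sigma(xn, x) < eps_prime for xn in seq[i:]))

# The same n0 works for rho at the original eps (in fact already at eps')
assert all(rho(xn, x) < eps_prime < eps for xn in seq[n0:])
print("n0 =", n0, "found via sigma works for rho as well")
```

The key step mirrors the argument above: once σ(x_n, x) < ε' < 1, the minimum is attained by ρ, so ρ(x_n, x) = σ(x_n, x) < ε' < ε.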

11. Feb 6, 2010

### kingwinner

OK, now I understand. But how did you come up with the brilliant idea of using ε' = 1/ε < 1? Is there any natural/systematic way to see this?

Can I also prove the claim like this...?

Claim: xn->x in (X,σ) => xn->x in (X,ρ) for the case ε>1

Proof:(?)
Given ε > 1.
By hypothesis, there exists M s.t. σ(xn,x)<0.5 for all n > M. By definition, σ(x,y)=min{ρ(x,y),1}, so σ(xn,x)<0.5 implies that ρ(xn,x) < 0.5 < 1 < ε. (i.e. we can take N=M)

Is this "proof" correct?
(and the converse (for the case ε>1) can be proved in the EXACT same manner (i.e. by taking the epsilon in the hypothesis to be 0.5), right?)

Can somebody confirm this, please? (or correct my mistake if there is one)

Thanks! :)

12. Feb 6, 2010

### Werg22

Yes, but you really don't need to break it up into cases. Once you've proved it for all ε below some fixed bound (in this case, 1), you are immediately done.

13. Feb 7, 2010

### kingwinner

But actually, do we really need 2 separate cases for 0<ε<=1 and ε>1?
If we've proved it for 0<ε<=1, then that exact same N would also work in the definition of convergence for any ε>1, right?

14. Feb 7, 2010

### Werg22

If I understand you correctly, yes.

15. Feb 7, 2010

### kingwinner

How can we prove claim 2: {xn} is Cauchy in (X,σ) iff it is Cauchy in (X,ρ) ?

Definition: (xn) is Cauchy iff for all ε>0, there exists N s.t. if n,m≥N, then d(xn,xm)<ε.

I know the definition, but I am not too sure how to prove that something is Cauchy. How can we find a workable N here? (Proving that something is Cauchy has always been a struggle for me, and it's one of my biggest challenges and frustrations so far...)

Any help is greatly appreciated!

16. Feb 7, 2010

### JSuarez

Exactly the same way as before; the only difference is that, instead of estimating the distance to a limit point $x$, you have to prove that, given $\epsilon$, there is an $n_0$ such that, for all $m,n>n_0$:

$$\sigma\left(x_m,x_n\right)<\epsilon \Leftrightarrow \rho\left(x_m,x_n\right)<\epsilon$$

17. Feb 7, 2010

### kingwinner

Do we actually need to PROVE that $$\sigma\left(x_m,x_n\right)<\epsilon \Leftrightarrow \rho\left(x_m,x_n\right)<\epsilon$$? It is trivially true for any 0<ε≤1, isn't it?

I'm trying to prove that {xn} is Cauchy in (X,σ) => it is Cauchy in (X,ρ).

Proof:(?)
Suppose {xn} is Cauchy in (X,σ), i.e. for all ε'>0, there exists N s.t. if n,m≥N, then σ(xn,xm)<ε'.
Given any 0<ε≤1, there exists N1 s.t. if n,m≥N1, then σ(xn,xm)<ε
=> ρ(xn,xm)<ε
The same N1 also works for any ε>1.
So {xn} is Cauchy in (X,ρ).
(the proof of the converse is identical)
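For intuition, the same equivalence can be sketched numerically; here ρ is assumed (for illustration) to be the usual distance on ℝ and the sequence is x_n = 1/n:

```python
def rho(a, b):
    return abs(a - b)           # example rho: usual distance on the real line

def sigma(a, b):
    return min(rho(a, b), 1.0)  # the truncated metric

# A Cauchy sequence in R: x_n = 1/n (an illustrative choice)
seq = [1.0 / n for n in range(1, 100)]

def cauchy_index(d, eps):
    # Smallest N with d(x_n, x_m) < eps for all n, m >= N (within this finite sample)
    for N in range(len(seq)):
        tail = seq[N:]
        if all(d(a, b) < eps for a in tail for b in tail):
            return N
    return None

# For eps <= 1 the two metrics pick out the same N,
# and that same N then also works for every eps > 1
for eps in (0.25, 0.5, 1.0):
    assert cauchy_index(rho, eps) == cauchy_index(sigma, eps)
print("rho and sigma give the same Cauchy cutoffs at eps <= 1")
```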

Is this a valid proof? Please kindly confirm/correct me if it's wrong.
Thank you!

18. Feb 7, 2010

### JSuarez

Yes; when the distances are below 1 the two metrics agree, and the $\Leftarrow$ direction is also immediate from the inequality $\sigma\left(x,y\right) \leq \rho\left(x,y\right)$.

Yes, because then $\rho\left(x_n,x_m\right) < \epsilon' < \epsilon$. Your proof is correct.

19. Feb 8, 2010

### kingwinner

I mean, if the N1 works for 0<ε≤1, then certainly the same N1 also works for any ε>1. (e.g. Say if the distance between xn and xm is less than 0.5, then certainly the distance between xn and xm is less than 1.5. The same N1 must work.)

20. Feb 8, 2010

### kingwinner

But there is another part of claim 2.
Claim 2: {xn} is Cauchy in (X,ρ) iff it is Cauchy in (X,σ). (Hence completeness is the same for these two metrics.)

Now we've proved that {xn} is Cauchy in (X,ρ) iff it is Cauchy in (X,σ). But then how can we show that completeness is the same for these two metrics?

By definition, a metric space X is complete iff every Cauchy sequence in X converges in X.

To say that completeness is the same for these two metrics, we need to prove that:
with the metric ρ, every Cauchy sequence in X converges in X
<=> with the metric σ, every Cauchy sequence in X converges in X.

But this doesn't seem so obvious to me. How can we actually prove it?

I am still puzzled on this part and I would appreciate if anyone can help.
Thank you!