# Subsequences and sequential compactness

## The Attempt at a Solution

If $(x_n)_{n\in\mathbb{N}}$ is a sequence in $(X,d)$ and $(n_k)_{ k \in \mathbb{N}}$ is a strictly increasing sequence in $\mathbb{N}$ then $(x_ {n_k} )_{k \in \mathbb{N}}$ is a subsequence of $(x_n)_{n\in\mathbb{N}}$.

How would you prove that if $x_n$ converges to $x$ in $X$ then any subsequence of $x_n$ converges to $x$?

In (a) how would you justify that the sequence $$x_n = n\;\;(n\in\mathbb{N})$$ has no convergent subsequence?

For (b) the alternating sequence $$0, 1, 0, 1, 0, 1, 0, 1, ...$$ diverges but if $n_k = 2k$ then $x_{n_k} = 1$ which trivially converges to 1.
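A one-line numeric check of this (the indexing convention $x_1 = 0$, $x_2 = 1$ is an assumption made for the snippet):

```python
# The alternating sequence 0, 1, 0, 1, ... under the (assumed) convention
# x_1 = 0, x_2 = 1, so that the even-indexed subsequence n_k = 2k is constant.

def x(n):
    return (n + 1) % 2   # x_1 = 0, x_2 = 1, x_3 = 0, ...

# Every term of the subsequence x_{2k} equals 1, so it converges trivially.
print(all(x(2 * k) == 1 for k in range(1, 1001)))  # True
```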

$(X,d)$ is sequentially compact if every sequence has a convergent subsequence.

For (a), the open interval $(0,1) \subset \mathbb{R}$ is not sequentially compact: letting $x_n = \frac{1}{n}$, we have $x_n \to 0$ in $\mathbb{R}$, hence every subsequence converges to $0$. But $0\notin (0,1)$, so by uniqueness of limits no subsequence of $(\frac{1}{n})$ converges in $(0,1)$, and $(0,1)$ is not sequentially compact.

What counterexample could I use for (b)?


How would you prove that if $x_n$ converges to $x$ in $X$ then any subsequence of $x_n$ converges to $x$?
What is the definition of $(x_n)$ converging to $x$ in $X$? And remember that, since $(n_k)$ is strictly increasing, for every $n\in \mathbb{N}$ there exists $k$ with $n_k\geq n.$

So since $x_n \to x$, for any $\varepsilon > 0$ there exists $N\in\mathbb{N}$ such that if $n>N$ then $d(x_n , x) < \varepsilon$.

Since $(n_k)$ is strictly increasing, induction gives $n_k \geq k$ for all $k\in\mathbb{N}$.

Then for any $\varepsilon > 0$, with $N$ as above, every $k > N$ satisfies $n_k \geq k > N$, so $d(x_{n_k} , x) < \varepsilon$, showing $x_{n_k} \to x$.
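A numeric illustration (not a proof) of this argument, using $x_n = \frac{1}{n} \to 0$ and the arbitrary strictly increasing index sequence $n_k = k^2$:

```python
# Convergent sequence x_n = 1/n -> 0; the index sequence n_k = k**2 is an
# arbitrary strictly increasing choice, so n_k >= k holds automatically.

def x(n):
    return 1.0 / n

def index(k):
    return k ** 2             # strictly increasing, so index(k) >= k

# With eps = 1e-3 the definition of x_n -> 0 gives N = 1000; every k with
# index(k) > N then satisfies d(x_{n_k}, 0) = 1/k**2 < eps.
eps, N = 1e-3, 1000
tail_ok = all(abs(x(index(k))) < eps
              for k in range(1, 2001) if index(k) > N)
print(tail_ok)  # True
```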

How would you give a brief justification that the sequence $x_n = n\;(n\in\mathbb{N})$ cannot have a convergent subsequence? (It's divergent, but (b) shows that a divergent sequence can have a convergent subsequence.)

I suppose you would prove it by the same reasoning as you'd prove that the sequence $\{n\}_{n\in \mathbb{N}}$ diverges: reductio ad absurdum using the definition of convergence. Alternatively, note that a convergent sequence is bounded, while every subsequence of $\{n\}$ is unbounded since $n_k \geq k$.
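A sketch of the boundedness obstruction (the index sequence $n_k = 3k+1$ is an arbitrary illustrative choice):

```python
# Convergent sequences are bounded, but for x_n = n any subsequence
# satisfies x_{n_k} = n_k >= k, hence is unbounded.

def index(k):
    return 3 * k + 1          # any strictly increasing sequence in N works

# n_k >= k, so the subsequence eventually exceeds any candidate bound M.
M = 10**6
escapes = any(index(k) > M for k in range(1, M + 2))
print(escapes)  # True
```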

For the last problem, try to find a sequence of continuous functions that converges to a non-continuous function. I have one in mind, but haven't thought of the details yet. Consider piecewise functions...

edit: Sorry, the path I suggested might not lead you anywhere. It seems that if a sequence of continuous functions converges to a function (using the metric $d = d_{\infty}$), then the limit function is continuous, if the following proof is correct:

Suppose that the sequence of continuous functions $\{f_n\}$ converges to $g.$ Then for all $n$ and $x_0 \in [0,1]$ we have:
$(\forall \epsilon>0) (\exists \delta_{\epsilon} > 0) (\forall x \in [0,1]) : |x-x_0|<\delta_{\epsilon} \Rightarrow |f_n(x)-f_n(x_0)| < \epsilon$
and
$(\forall \epsilon' > 0) (\exists n_0) (\forall n>n_0) :\ d(f_n,g) < \epsilon'.$

Now for any $\epsilon>0$ and $x_0\in [0,1]$ let $n$ be s.t. $d(f_n,g) < \frac{\epsilon}{3}$ and let $|x-x_0|< \displaystyle \delta_{\frac{\epsilon}{3}} (\Rightarrow |f_n(x)-f_n(x_0)|<\frac{\epsilon}{3}).$

Now $|g(x)-g(x_0)|=|g(x)-f_n(x)+f_n(x)-f_n(x_0)+f_n(x_0)-g(x_0)| \leq |g(x)-f_n(x)|+|f_n(x)-f_n(x_0)|+|f_n(x_0)-g(x_0)|$
$< 2\cdot d(f_n,g) + \frac{\epsilon}{3} < 3\cdot \frac{\epsilon}{3}=\epsilon,$
consequently $g$ is continuous.
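A numeric illustration of the $\epsilon/3$ bound, using the hypothetical uniformly convergent example $f_n(x) = x + \sin(nx)/n \to g(x) = x$ on $[0,1]$ (my own choice, not from the thread), for which $d_\infty(f_n, g) \leq 1/n$:

```python
import math

def f(n, x):
    return x + math.sin(n * x) / n   # continuous, converges uniformly to g

def g(x):
    return x

grid = [i / 1000 for i in range(1001)]

def d_inf(n):                 # grid approximation of the sup metric
    return max(abs(f(n, x) - g(x)) for x in grid)

eps = 0.03
n = 100                       # then d_inf(f_n, g) <= 1/100 < eps/3
x0, x1 = 0.5, 0.5005          # |x1 - x0| small, as in the proof

# The three terms of the triangle-inequality split from the proof above.
three_terms = (abs(g(x1) - f(n, x1)) + abs(f(n, x1) - f(n, x0))
               + abs(f(n, x0) - g(x0)))

print(d_inf(n) < eps / 3)                          # True
print(abs(g(x1) - g(x0)) <= three_terms < eps)     # True
```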

I haven't got any ideas for the problem.

Ted, I think the usual example shown on the cover of this book works. Just ordered this--they should pay me for advertising. :)

FurryGoat, I'm pretty sure one needs to assume uniform convergence to prove that the limiting function is continuous. Remember, point-wise convergence only tells us that for each x, ##\lim_{n \to \infty} f_n(x) = f(x)##. Using your notation, the ##n_0## you choose depends on the x in question.

I think the example I gave checks out, but let me know if I'm mistaken.


Maybe I'm confused by the notation, but I thought that $\displaystyle d_{\infty}(f,g)=\sup_{x\in[a,b]}|f(x)-g(x)|<\epsilon$ meant uniform convergence, since doesn't this mean that
$\forall x\in[a,b]:\ |f(x)-g(x)|<\epsilon?$


Yeah, you're right, convergence w.r.t. ##d_\infty## does imply uniform convergence... Well, the example sequence I had in mind was ##f_n(x) = x^n##, which converges pointwise to
$$f(x) = \begin{cases} 0 & \text{if} \; \; x \in [0,1) \\ 1 & \text{if} \; \; x = 1 \end{cases}$$
but now I realize that this is not the limit with respect to ##d_\infty## since ##\sup\{x^n - f(x) : x \in [0,1]\} = 1## for all n. Sorry!
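A grid approximation (an assumption: $10^5$ sample points stand in for the true sup) confirming that $x^n$ does not converge to its pointwise limit in $d_\infty$:

```python
# Pointwise limit of x^n on [0, 1]: 0 on [0, 1), 1 at x = 1.
def f(x):
    return 1.0 if x == 1.0 else 0.0

grid = [i / 100000 for i in range(100001)]

for n in (1, 10, 100, 1000):
    gap = max(abs(x**n - f(x)) for x in grid)
    print(n, round(gap, 3))   # the gap stays close to 1 as n grows
```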

Yeah, that's the part that caught me as well. I remember seeing your function somewhere also; it seems to be the most straightforward. The function sequence I had in mind was
$f_n(x) = \begin{cases} n\cdot x & \text{if} \; \; 0 \leq x < \frac{1}{n} \\ 1 & \text{if} \; \; \frac{1}{n} \leq x \leq 1 \end{cases}$ which would converge pointwise to a similar non-continuous function $g$, but still $d_{\infty}(f_n,g)=1.$ So it seems we have to come up with a sequence that converges to a function not defined at some points of $[0,1]$, or are there other possibilities?
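A grid approximation of $d_\infty(f_n, g)$ for this piecewise sequence; the sup is attained near $x = 0$, where $f_n$ climbs linearly while the pointwise limit has already jumped to 1:

```python
def f(n, x):
    return n * x if x < 1.0 / n else 1.0

def g(x):
    return 0.0 if x == 0.0 else 1.0   # pointwise limit of f_n

grid = [i / 100000 for i in range(100001)]

for n in (2, 10, 100):
    gap = max(abs(f(n, x) - g(x)) for x in grid)
    print(n, round(gap, 3))   # stays near 1: no convergence in d_inf
```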

edit: Or a function with an absolute value $> 1.$

I now realize I was confusing myself--I was trying to prove that the subspace is not complete, when I should be thinking about sequential compactness. That is, we need to find a sequence that has no convergent subsequence. And as Ted mentioned above, this sequence needs to be divergent since if it is convergent then every subsequence will converge to the original limit.

So now I do think my example works. I haven't proved it, but given ##n_0##, I'm fairly certain it's possible to choose ##m,n \geq n_0## such that ##d_\infty(x^m,x^n) > 1/2##. One just has to pick m much larger than n so that ##|x^m - x^n|## approaches 1 near x=1. I think one can show similarly that the sequence has no convergent subsequence.
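A sketch supporting this claim via a grid approximation; the pairing $m = 100n$ is one workable choice, not the only one:

```python
# For any n0, pick m >> n >= n0: then x^n - x^m gets close to 1 near x = 1,
# because x^m is still tiny where x^n is already large.

grid = [i / 10000 for i in range(10001)]

def d_inf(m, n):              # grid approximation of the sup metric
    return max(abs(x**m - x**n) for x in grid)

for n0 in (1, 10, 100):
    n, m = n0, 100 * n0       # m much larger than n
    print(n0, d_inf(m, n) > 0.5)  # True for each n0
```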


I agree with this. I got carried away and forgot what was to be proved.