MHB The Metric Space R^n and Sequences: Remark by Carothers, Page 47

Summary: The discussion focuses on the convergence of sequences of vectors in R^n as described in N. L. Carothers' "Real Analysis." It establishes that a sequence of vectors converges if and only if each of its coordinate sequences converges in R. Participants explore proofs using different norms, particularly the 1-norm, the 2-norm, and the max norm, to demonstrate both sufficiency and necessity of component-wise convergence. They also discuss the parallel statement for Cauchy sequences, emphasizing the importance of strict inequalities in the proofs.
I am reading N. L. Carothers' book: "Real Analysis". ... ...

I am focused on Chapter 3: Metrics and Norms ... ...

I need help with a remark by Carothers concerning convergent sequences in $$\mathbb{R}^n$$ ...

On page 47 Carothers writes the following:

" ... ... it follows that a sequence of vectors $$x^{ (k) } = ( x_1^k, \ ... \ ... \ , x_n^k)$$ in $$\mathbb{R}^n$$ converges (is Cauchy) if and only if each of the coordinate sequences $$( x_j^k )$$ converges in $$\mathbb{R}$$ ... ... "
My question is as follows:

Why exactly does it follow that a sequence of vectors $$x^{(k)} = (x_1^k, \dots, x_n^k)$$ in $$\mathbb{R}^n$$ converges (is Cauchy) if and only if each of the coordinate sequences $$(x_j^k)$$ converges in $$\mathbb{R}$$?
Help will be appreciated ...

Peter
 

it may be convenient to switch norms slightly here...
in particular with

$\mathbf z:= \mathbf x - \mathbf y$

and $\mathbf z \in \mathbb R^n$

convince yourself that
$\big \Vert \mathbf z \big \Vert_2 \leq \big \Vert \mathbf z \big \Vert_1 \leq \sqrt{n}\cdot \big \Vert \mathbf z \big \Vert_2$

where the first inequality is the triangle inequality and the second one is Cauchy-Schwarz (with the 1's trick). To a large extent the 1-norm lets you linearize the distance computed on each component... can you prove Carothers' comment that the vector sequence converges iff each coordinate sequence $(x_j^{(k)})$ converges in $\mathbb R$ using the 1-norm? The first leg should be easy -- select $\frac{\epsilon}{n}$ for each component and import your favorite single-variable real analysis results. The second leg is similar...
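
If it helps, the "1's trick" here is just Cauchy-Schwarz against the all-ones vector $\mathbf 1 := (1, 1, \dots, 1)$:

$\big \Vert \mathbf z \big \Vert_1 = \sum_{i=1}^n \vert z_i \vert \cdot 1 \leq \left( \sum_{i=1}^n z_i^2 \right)^{1/2} \cdot \left( \sum_{i=1}^n 1^2 \right)^{1/2} = \sqrt{n} \cdot \big \Vert \mathbf z \big \Vert_2$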
- - - -
The above chain of inequalities then gives you the result with the 2-norm / standard metric on $\mathbf x, \mathbf y$.
 
steep said:
it may be convenient to switch norms slightly here ...
Thanks for the help steep ...

Will try to prove the following based on your advice ...

... a sequence of vectors $$x^{(k)} = (x_1^k, \dots, x_n^k)$$ in $$\mathbb{R}^n$$ converges if and only if each of the coordinate sequences $$(x_j^k)$$ converges in $$\mathbb{R}$$ ...

I think we may proceed as follows, where $$z = x - y$$ ...

$$\| z \|_2 = \Big\| \sum_{j=1}^n z_j e_j \Big\|_2 \leq \sum_{j=1}^n \| z_j e_j \|_2 = \sum_{j=1}^n | z_j | \, \| e_j \|_2 = \sum_{j=1}^n | z_j |$$

Thus

$$\| x - y \|_2 = \left( \sum_{j=1}^n | x_j - y_j |^2 \right)^{\frac{1}{2}} \leq \sum_{j=1}^n | x_j - y_j |$$ ... ... ... (1)

Now ... given (1) above ...

... if each coordinate sequence $$(x_j^k)_{k=1}^{\infty}$$ converges to a limit $$y_j \in \mathbb{R}$$, where $$y = (y_1, \dots, y_n) \in \mathbb{R}^n$$ ...

... then for every $$\epsilon \gt 0 \ \exists \ N(j; \epsilon)$$ such that for $$k \geq N(j; \epsilon)$$ ...

... we have $$| x_j^k - y_j | \lt \frac{\epsilon}{n}$$ ...

But then $$x^{(k)}$$ converges to $$y$$ since ...

... for every $$\epsilon \gt 0 \ \exists \ N(\epsilon)$$ such that for $$k \geq N(\epsilon)$$ we have ...

... $$\| x^{(k)} - y \|_2 \leq \sum_{j=1}^n | x_j^k - y_j | \lt \frac{\epsilon}{n} + \dots + \frac{\epsilon}{n} = \epsilon$$
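
Here $$N(\epsilon) := \max_{1 \leq j \leq n} N(j; \epsilon)$$ works, since for $$k \geq N(\epsilon)$$ all $$n$$ coordinate estimates hold simultaneously.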
Is that correct?

The proof is similar for Cauchy sequences in $$\mathbb{R}^n$$ ...

Peter
 
Peter said:
Thanks for the help steep ... Will try to prove the following based on your advice ... Is that correct?

I think this is basically right. You may be approaching it in a more succinct manner than I am... I have this in my head as 2 steps, first sufficiency, then necessity. The above clearly gives sufficiency. I'm not sure I saw the second leg, necessity, though.

Another way to get it is to use the infinity/max norm, so

$\big\Vert \mathbf z \big \Vert_\infty^2 = \max\big(z_1^2, z_2^2, ..., z_n^2\big) \leq \sum_{i=1}^n z_i^2 = \big \Vert \mathbf z\big \Vert_2^2$
hence taking square roots over non-negative numbers gives

$\big\Vert \mathbf z \big \Vert_\infty \leq \big \Vert \mathbf z\big \Vert_2$

so for the second leg, suppose that there is (at least one) coordinate $j$ that doesn't converge -- i.e. there is some $\epsilon_0 \gt 0$ for which no $N$ is large enough that $\vert z_j^{(n)} \vert \lt \epsilon_0$ for all $n\geq N$... then if the sequence still converges you'd have

$\epsilon_0 \leq \vert z_j^{(n)} \vert \leq \big \Vert \mathbf z^{(n)}\big \Vert_\infty \leq \big \Vert \mathbf z^{(n)}\big \Vert_2 \lt \epsilon$
for some $n \geq N$ for any $N$. Selecting $\epsilon := \epsilon_0$ then contradicts the definition of convergence.

There's probably a slightly nicer way of showing it, but this is at the heart of the necessity of component-wise convergence.
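
For what it's worth, the same inequality also runs the necessity leg directly, without the contradiction: if the vector sequence converges, then for each coordinate $j$

$\vert z_j^{(n)} \vert \leq \big\Vert \mathbf z^{(n)} \big \Vert_\infty \leq \big \Vert \mathbf z^{(n)} \big \Vert_2 \lt \epsilon$

for all $n \geq N$, which is exactly convergence of the $j$-th coordinate sequence.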
 
steep said:
I think this is basically right. ... There's probably a slightly nicer way of showing it, but this is at the heart of the necessity of component-wise convergence.
Hi steep ...

Thanks again for your considerable assistance ... Thought I would try a direct approach to demonstrate that ...

... if a sequence of vectors $$x^{(k)} = (x_1^k, \dots, x_n^k)$$ in $$\mathbb{R}^n$$ converges ...

... then ... each of the coordinate sequences $$(x_j^k)$$ converges in $$\mathbb{R}$$ ...

Proceed as follows, where $$z = x - y$$ ...

$$| z_j | = (z_j^2)^{\frac{1}{2}} \leq \left( \sum_{i=1}^n z_i^2 \right)^{\frac{1}{2}} = \| z \|_2$$ ... ... ... (2)

Given (2) above ... we have ...

... if $$x^{ (k) }$$ converges to $$y$$ in $$\mathbb{R}^n$$ ...

... then ... for every $$\epsilon \gt 0 \ \exists \ N( \epsilon )$$ such that for $$k \geq N( \epsilon )$$ we have ...

... $$\| x^{(k)} - y \|_2 \leq \epsilon$$ ...

But then, for arbitrary $$j$$, we have that $$(x_j^k)_{k=1}^{\infty}$$ converges to the limit $$y_j \in \mathbb{R}$$ ...

... since for every $$\epsilon \gt 0 \ \exists \ N( \epsilon )$$ such that for $$k \geq N( \epsilon )$$ we have ...

$$| x_j^k - y_j | \leq \| x^{(k)} - y \|_2 \leq \epsilon$$ ...
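
The same inequality (2) should also give the Cauchy version: for $$k, m \geq N(\epsilon)$$ we have

$$| x_j^k - x_j^m | \leq \| x^{(k)} - x^{(m)} \|_2 \leq \epsilon$$

so each coordinate sequence is Cauchy in $$\mathbb{R}$$ whenever $$(x^{(k)})$$ is Cauchy in $$\mathbb{R}^n$$.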
Is that correct?
Thanks once again for the help!

Peter
 
Peter said:
...
But then, for arbitrary j, we have $$( x_j^k)_{ k = 1}^{ \infty }$$ converges to a limit $$y_j \in \mathbb{R}$$ ...

... since for every $$\epsilon \gt 0 \ \exists \ N( \epsilon )$$ such that for $$k \geq N( \epsilon )$$ we have ...

$$| x_j^k - y_j | \leq \| x^{(k)} - y \|_2 \leq \epsilon$$ ...

re-reading this thread with a fresh set of eyes I see that your post 3 really was

$\text{convergence in each component} \longrightarrow \text{convergence of vector in } \mathbb R^n$

and this current post 5 is the other leg
$\text{convergence of vector in } \mathbb R^n \longrightarrow \text{convergence in each component} $

and yes I think it works. The only nitpick I'll make is to insist on a strict inequality with the epsilon, i.e.
$\| x^{ (k) } - y \|_2 \lt \epsilon$

any other items are immaterial... and looking back through my posts it looks like I overloaded $n$ both for the dimension of $\mathbb R^n$ and for the sequence index, so no need to nitpick too much ;)
 
steep said:
re-reading this thread with a fresh set of eyes ... and yes I think it works.
Thanks for all your help, steep ...

I really appreciate it ...

Peter
 
