Absolute and Conditional Convergence .... Sohrab Proposition 2.3.22 ....

In summary: if $\mid x_n \mid \leq c_n$ for all $n \geq N_0$ and $\sum c_n$ converges, then for all $m \gt n$ sufficiently large we have $\mid s_m - s_n \mid \ \lt \epsilon$. So by the Cauchy Criterion, $\sum x_n$ is convergent, which proves Proposition 2.3.22.
  • #1
Math Amateur
Gold Member
I am reading Houshang H. Sohrab's book: "Basic Real Analysis" (Second Edition).

I am focused on Chapter 2: Sequences and Series of Real Numbers ... ...

I need help with the proof of Proposition 2.3.22 ...

Proposition 2.3.22 reads as follows:

View attachment 9054

Can someone please demonstrate (formally and rigorously) how Proposition 2.3.22 is ( ... as Sohrab claims but does not show ... ) an immediate consequence of Cauchy's Criterion and the Triangle Inequality ...

Help will be appreciated ...

Peter
 

Attachments

  • Sohrab - Proposition 2.3.22 ... .png
  • #2
Peter said:
Can someone please demonstrate (formally and rigorously) how Proposition 2.3.22 is ( ... as Sohrab claims but does not show ... ) an immediate consequence of Cauchy's Criterion and the Triangle Inequality ...

Peter -- can I ask that you take the 1st half of my response here:
https://mathhelpboards.com/analysis-50/convergence-geometric-series-sohrab-proposition-2-3-8-a-26202-post115842.html#post115842

and re-write it as a "special case" of the above criterion? I mean explicitly show what is contained in the telescoping sum -- write the partial sums $s_n$ and $s_m$ in explicit form and you do get the desired result of a Cauchy sequence (with help from the triangle inequality) almost immediately. You don't have a telescope in general, but you do have partial sums...

For what it's worth, I often get annoyed when authors say things are immediate or obvious, but I really think it is immediate here. Please give it a shot and I'll revert if needed.
 
  • #3
steep said:
Peter -- can I ask that you take the 1st half of my response here:
https://mathhelpboards.com/analysis-50/convergence-geometric-series-sohrab-proposition-2-3-8-a-26202-post115842.html#post115842

and re-write it as a "special case" of the above criterion? I mean explicitly show what is contained in the telescoping sum -- write the partial sums $s_n$ and $s_m$ in explicit form and you do get the desired result of a Cauchy sequence (with help from the triangle inequality) almost immediately. You don't have a telescope in general, but you do have partial sums...

For what it's worth, I often get annoyed when authors say things are immediate or obvious, but I really think it is immediate here. Please give it a shot and I'll revert if needed.
Thanks steep ...

... here is an attempt (which I fear is a bit rough around the edges ...)

To show \(\displaystyle \sum x_n\) is convergent using the Cauchy Criterion (for the sequence of partial sums ...) we need to show ...

... that for each \(\displaystyle \epsilon \gt 0 \ \exists\) an integer N such that \(\displaystyle \ \forall \ m, n \geq N\) we have \(\displaystyle \mid s_m - s_n \mid \lt \epsilon\) ...

Assume \(\displaystyle m \gt n\) ... then ...

\(\displaystyle \mid s_m - s_n \mid \ = \ \left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert \)

\(\displaystyle \Longrightarrow \mid s_m - s_n \mid \ \leq \ \sum_{ k= n + 1 }^m \mid x_k \mid\) ( I think this step is correct ... maybe this was the time to use the Triangle Inequality ? )

\(\displaystyle \Longrightarrow \mid s_m - s_n \mid \ \leq \ \sum_{ k= n + 1 }^m c_k \text{ for } n \geq N_0\)

\(\displaystyle \Longrightarrow \mid s_m - s_n \mid \ \lt \ \epsilon\) since \(\displaystyle \sum c_n\) is convergent ( and hence \(\displaystyle \lim_{ n \to \infty} c_n = 0\) )

I am very dubious that the above is correct as I didn't use the triangle inequality ... which is supposedly necessary ...

Peter
 
  • #4
Peter said:
Thanks steep ...

... here is an attempt (which I fear is a bit rough around the edges ...)

To show \(\displaystyle \sum x_n\) is convergent using the Cauchy Criterion (for the sequence of partial sums ...) we need to show ...

... that for each \(\displaystyle \epsilon \gt 0 \ \exists\) an integer N such that \(\displaystyle \ \forall \ m, n \geq N\) we have \(\displaystyle \mid s_m - s_n \mid \lt \epsilon\) ...

Assume \(\displaystyle m \gt n\) ... then ...

\(\displaystyle \mid s_m - s_n \mid \ = \ \left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert \)

\(\displaystyle \Longrightarrow \mid s_m - s_n \mid \ \leq \ \sum_{ k= n + 1 }^m \mid x_k \mid\) ( I think this step is correct ... maybe this was the time to use the Triangle Inequality ? )

\(\displaystyle \Longrightarrow \mid s_m - s_n \mid \ \leq \ \sum_{ k= n + 1 }^m c_k \text{ for } n \geq N_0\)

\(\displaystyle \Longrightarrow \mid s_m - s_n \mid \ \lt \ \epsilon\) since \(\displaystyle \sum c_n\) is convergent ( and hence \(\displaystyle \lim_{ n \to \infty} c_n = 0\) )

I am very dubious that the above is correct as I didn't use the triangle inequality ... which is supposedly necessary ...

Peter

You did use the triangle inequality: the second line, where you wrote "I think this step is correct...", is precisely the triangle inequality.
If it helps, you can re-write it in vector algebra as

$\left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert$
$ = \big \Vert \mathbf x_{n+1} + \mathbf x_{n+2} + ... + \mathbf x_m \big \Vert_1$
$ \leq \big \Vert \mathbf x_{n+1}\big \Vert_1 + \big \Vert \mathbf x_{n+2} \big \Vert_1 + ... + \big \Vert \mathbf x_m \big \Vert_1 $
$= \sum_{ k= n + 1 }^m \mid x_k \mid $

where $\mathbf x_k = x_k \cdot \mathbf e_1$ (the 1st standard basis vector in, say, $\mathbb C^2$). Writing it out like this isn't needed, but it may make it more obvious that you are applying the triangle inequality.

= = = =

So taking what you have, I suppose I'd write it as \(\displaystyle \mid s_m - s_n \mid \ \)

\(\displaystyle = \ \left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert \)

\(\displaystyle \leq \ \sum_{ k= n + 1 }^m \mid x_k \mid\)
\(\displaystyle \leq \ \sum_{ k= n + 1 }^m c_k \) (for \(\displaystyle n \geq N_0\))
\(\displaystyle \lt \ \epsilon\)

i.e. we know the partial sums for $c_n$ form a Cauchy sequence because we know the series involving $c_n$ converges to a finite limit (and one implies the other in single variable analysis).
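To spell out the bookkeeping in one place (just a sketch; the name $t_n$ for the partial sums of $\sum c_n$ and the choice $N := \max(N_0, N_1)$ are my own notation, not Sohrab's):

Given \(\displaystyle \epsilon \gt 0\), write \(\displaystyle t_n = \sum_{k=1}^n c_k\). Since \(\displaystyle \sum c_n\) converges, \(\displaystyle (t_n)\) is Cauchy, so there is \(\displaystyle N_1\) with \(\displaystyle \vert t_m - t_n \vert \lt \epsilon\) for all \(\displaystyle m \gt n \geq N_1\). Put \(\displaystyle N := \max(N_0, N_1)\). Then for all \(\displaystyle m \gt n \geq N\) each term in the range satisfies \(\displaystyle c_k \geq \vert x_k \vert \geq 0\), and

\(\displaystyle \mid s_m - s_n \mid \ = \ \left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert \ \leq \ \sum_{ k= n + 1 }^m \mid x_k \mid \ \leq \ \sum_{ k= n + 1 }^m c_k \ = \ t_m - t_n \ \leq \ \vert t_m - t_n \vert \ \lt \ \epsilon \)

where the first inequality is the triangle inequality and the second uses \(\displaystyle \mid x_k \mid \leq c_k\) for \(\displaystyle k \geq N_0\). So \(\displaystyle (s_n)\) is Cauchy and \(\displaystyle \sum x_n\) converges.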

= = = =
edit:
one word of caution -- I didn't follow why you said
\(\displaystyle \sum c_n\) is convergent ( and hence $\displaystyle\lim_{ n \to \infty} c_n = 0 )$
and in particular that $c_n \to 0$
The point really is that $c_n \to 0$ rapidly -- i.e. the series converges and hence the sequences of partial sums involving $c_n$ are Cauchy.
 
  • #5
steep said:
You did use the triangle inequality: the second line, where you wrote "I think this step is correct...", is precisely the triangle inequality.
If it helps, you can re-write it in vector algebra as

$\left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert$
$ = \big \Vert \mathbf x_{n+1} + \mathbf x_{n+2} + ... + \mathbf x_m \big \Vert_1$
$ \leq \big \Vert \mathbf x_{n+1}\big \Vert_1 + \big \Vert \mathbf x_{n+2} \big \Vert_1 + ... + \big \Vert \mathbf x_m \big \Vert_1 $
$= \sum_{ k= n + 1 }^m \mid x_k \mid $

where $\mathbf x_k = x_k \cdot \mathbf e_1$ (the 1st standard basis vector in, say, $\mathbb C^2$). Writing it out like this isn't needed, but it may make it more obvious that you are applying the triangle inequality.

= = = =

So taking what you have, I suppose I'd write it as \(\displaystyle \mid s_m - s_n \mid \ \)

\(\displaystyle = \ \left\vert \ \sum_{ k= n + 1 }^m x_k \ \right\vert \)

\(\displaystyle \leq \ \sum_{ k= n + 1 }^m \mid x_k \mid\)
\(\displaystyle \leq \ \sum_{ k= n + 1 }^m c_k \) (for \(\displaystyle n \geq N_0\))
\(\displaystyle \lt \ \epsilon\)

i.e. we know the partial sums for $c_n$ form a Cauchy sequence because we know the series involving $c_n$ converges to a finite limit (and one implies the other in single variable analysis).

= = = =
edit:
one word of caution -- I didn't follow why you said
\(\displaystyle \sum c_n\) is convergent ( and hence $\displaystyle\lim_{ n \to \infty} c_n = 0 )$
and in particular that $c_n \to 0$
The point really is that $c_n \to 0$ rapidly -- i.e. the series converges and hence the sequences of partial sums involving $c_n$ are Cauchy.
Hi steep ...

Thanks again for all your help ...

You write:

" ... ... I didn't follow why you said
\(\displaystyle \sum c_n\) is convergent ( and hence $\displaystyle\lim_{ n \to \infty} c_n = 0 )$
and in particular that $c_n \to 0$ ... ... "

Just to give you an idea of what I was thinking ...

... \(\displaystyle \sum c_n\) is convergent and therefore \(\displaystyle \lim_{ n \to \infty} c_n = 0\) [Sohrab: Proposition 2.3.4 ]

Therefore ... taking \(\displaystyle t = m - n\) ... we have ...

... for every \(\displaystyle \frac{ \epsilon }{ t } \gt 0 \ \exists \ N\) such that \(\displaystyle \ \forall \ n \geq N\) ...

... we have \(\displaystyle \vert c_n - 0 \vert = c_n \lt \frac{ \epsilon }{ t } \)

... so then ... \(\displaystyle \sum_{ k= n + 1 }^m c_k = c_{ n + 1} + c_{ n + 2} + \ ... \ ... \ + c_m \lt \frac{ \epsilon }{ t } + \frac{ \epsilon }{ t } + \ ... \ ... \ + \frac{ \epsilon }{ t } = \frac{ t \epsilon }{ t } = \epsilon\) ... ...

... that is \(\displaystyle \sum_{ k= n + 1 }^m c_k \lt \epsilon\) ... ...
Now I think the above analysis is valid ... but is it? ... but the analysis is unnecessary ... because ...

... since we are given that \(\displaystyle \sum c_n\) is convergent we have immediately that the sequence of partial sums is Cauchy ...

... and so ... for every \(\displaystyle \epsilon \gt 0 \ \exists \ N\) such that \(\displaystyle \ \forall \ m \geq n \geq N\) ...

... we have \(\displaystyle \mid s_m - s_n \mid = \sum_{ k= n + 1 }^m c_k \lt \epsilon\) ...
Hope that makes sense and clarifies what I was thinking ...

Is the above analysis basically correct?

Peter
 
  • #6
Peter said:
but the analysis is unnecessary ... because ...

... since we are given that \(\displaystyle \sum c_n\) is convergent we have immediately that the sequence of partial sums is Cauchy ...

... and so ... for every \(\displaystyle \epsilon \gt 0 \ \exists \ N\) such that \(\displaystyle \ \forall \ m \geq n \geq N\) ...

... we have \(\displaystyle \mid s_m - s_n \mid = \sum_{ k= n + 1 }^m c_k \lt \epsilon\) ...

Hope that makes sense and clarifies what I was thinking ...

Is the above analysis basically correct?

Peter
This second part I like. I infer that $s_n$ refers to the partial sums involving $c_n$, not $x_n$, here, and yes, this is correct. We have
$s_n \to L$ where $\vert L\vert \lt \infty$, which is equivalent to $s_n$ satisfying the Cauchy criterion. Great.

Peter said:
Just to give you an idea of what I was thinking ...

... \(\displaystyle \sum c_n\) is convergent and therefore \(\displaystyle \lim_{ n \to \infty} c_n = 0\) [Sohrab: Proposition 2.3.4 ]

Therefore ... taking \(\displaystyle t = m - n\) ... we have ...

... for every \(\displaystyle \frac{ \epsilon }{ t } \gt 0 \ \exists \ N\) such that \(\displaystyle \ \forall \ n \geq N\) ...

... we have \(\displaystyle \vert c_n - 0 \vert = c_n \lt \frac{ \epsilon }{ t } \)

... so then ... \(\displaystyle \sum_{ k= n + 1 }^m c_k = c_{ n + 1} + c_{ n + 2} + \ ... \ ... \ + c_m \lt \frac{ \epsilon }{ t } + \frac{ \epsilon }{ t } + \ ... \ ... \ + \frac{ \epsilon }{ t } = \frac{ t \epsilon }{ t } = \epsilon\) ... ...

... that is \(\displaystyle \sum_{ k= n + 1 }^m c_k \lt \epsilon\) ... ...

Now I think the above analysis is valid ... but is it?
This concerns me on two qualitative grounds. One is that when you focus on $c_n \to 0$ you are giving up something you need, i.e. how rapidly it decays, and you're not getting anything in return. What if you had $v_n := \frac{1}{n}$, i.e. the harmonic sequence / series ... ?

A related issue is the way the Cauchy criterion is set up:

(1.) I choose some $\epsilon \gt 0$
(2.) then you pronounce some N and certify that
$\big \vert s_n - s_m \big \vert \lt \epsilon$
(3.) I get to then play around and select any and all $m\geq n \geq N$ that I like. At this point it is my turn indefinitely -- your turn really is turn (2.). But your argument here has a selection of $t$ that strictly depends on what I choose for $m$ and $n$. In particular, I can select $n:= N$ and make $m$ arbitrarily large, like 1 trillion times $n$, and then 1 trillion squared times $n$, and so on, which tells you that $t$ is unbounded and hence you can't possibly select some satisfactory finite $t$ in turn (2.).

(And, for the avoidance of doubt, you can't duck this by using a limiting value of $t$, because $\lim_{t \to \infty}\frac{\epsilon}{t} = 0$ and you cannot use the value $0$ in the argument here.)

Put differently the dependencies are backwards in your argument.

If this isn't quite clicking, you could also try running your argument here on the harmonic series $\sum_n v_n$ and then 'prove' that its sequence of partial sums is Cauchy -- a contradiction.
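For concreteness, here is the standard estimate (my own worked example; I'm writing $h_n$ for the harmonic partial sums, a name not used above):

\(\displaystyle h_n := \sum_{k=1}^n \frac{1}{k} \), and for every \(\displaystyle n\)

\(\displaystyle h_{2n} - h_n = \sum_{k=n+1}^{2n} \frac{1}{k} \ \geq \ \sum_{k=n+1}^{2n} \frac{1}{2n} = n \cdot \frac{1}{2n} = \frac{1}{2} \)

so for \(\displaystyle \epsilon \leq \frac{1}{2}\) no choice of \(\displaystyle N\) can force \(\displaystyle \vert h_m - h_n \vert \lt \epsilon\) for all \(\displaystyle m \gt n \geq N\): the partial sums are not Cauchy even though \(\displaystyle v_n = \frac{1}{n} \to 0\).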
 

FAQ: Absolute and Conditional Convergence .... Sohrab Proposition 2.3.22 ....

1. What is the difference between absolute and conditional convergence?

Absolute convergence means that the series of absolute values, $\sum \vert x_n \vert$, converges; in that case $\sum x_n$ also converges, and its sum does not depend on the order of the terms. Conditional convergence means that $\sum x_n$ converges but $\sum \vert x_n \vert$ diverges; such a series is sensitive to the order of its terms, and rearranging them can change the sum or destroy convergence.
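A standard pair of examples (not from Sohrab's text, just the usual illustrations): $\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n^2}$ converges absolutely, because $\sum_{n=1}^\infty \frac{1}{n^2}$ converges, whereas $\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}$ converges (by the alternating series test, to $\ln 2$) but only conditionally, because the harmonic series $\sum_{n=1}^\infty \frac{1}{n}$ diverges.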

2. How is Sohrab Proposition 2.3.22 related to absolute and conditional convergence?

As used in this thread, Sohrab's Proposition 2.3.22 is a comparison-type result: if the terms of a series are dominated in absolute value by the terms of a convergent series, then the original series converges. In particular, an absolutely convergent series is convergent, which is how the proposition connects to the notions of absolute and conditional convergence.
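In symbols, as the proposition is used in the thread above (a paraphrase, not Sohrab's exact wording): if there are constants $c_n$ and an index $N_0$ with $\vert x_n \vert \leq c_n$ for all $n \geq N_0$, and $\sum c_n$ converges, then $\sum x_n$ converges, and in fact $\sum \vert x_n \vert$ converges as well.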

3. Can a series be absolutely convergent but not conditionally convergent?

Yes; in fact no series is both. By definition, a conditionally convergent series converges but does not converge absolutely, so an absolutely convergent series is automatically not conditionally convergent. What is true (and what results such as Proposition 2.3.22 deliver) is that every absolutely convergent series converges.

4. What is the significance of absolute and conditional convergence in mathematics?

Absolute and conditional convergence are important in mathematical analysis because they measure how robust the convergence of a series is: an absolutely convergent series can be rearranged without changing its sum, whereas a conditionally convergent series can be rearranged to converge to any prescribed value, or to diverge (the Riemann rearrangement theorem). The distinction also matters in applications in fields such as physics, engineering, and economics.

5. How can one test for absolute and conditional convergence?

The Ratio Test and the Root Test apply to the series of absolute values: if the limit of the ratio (or root) of $\vert x_n \vert$ is less than 1, the series converges absolutely; if it is greater than 1, the series diverges; if it equals 1, the test is inconclusive. To establish conditional convergence one typically shows that $\sum \vert x_n \vert$ diverges (for example, by comparison with the harmonic series) while $\sum x_n$ itself converges, often via the alternating series (Leibniz) test.
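A quick worked illustration of these tests (my own example, not from the thread): for $\sum_{n=1}^\infty \frac{2^n}{n!}$ the ratio test gives $\lim_{n \to \infty} \frac{2^{n+1}/(n+1)!}{2^n/n!} = \lim_{n \to \infty} \frac{2}{n+1} = 0 \lt 1$, so the series converges absolutely. For $\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}$ the ratio test gives the limit $1$ and is inconclusive; instead, $\sum \frac{1}{n}$ diverges (so the convergence is not absolute) while the alternating series test shows $\sum \frac{(-1)^{n+1}}{n}$ converges, so the series converges conditionally.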
