# Convergence of an alternating series


## Main Question or Discussion Point

Consider a sequence with $n^{th}$ term $u_n$. Let $S_{2m}$ denote the sum of $2m$ consecutive terms starting from $u_N$, for some $N\geq1$.

If $\lim_{N\rightarrow\infty}S_{2m}=0$ for all $m$, then the series converges. Why?
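The original post does not say which series is meant, so as a purely illustrative check, here is a short numeric sketch assuming the alternating harmonic series $u_n=(-1)^{n+1}/n$: for a fixed block length $2m$, the block sum $S_{2m}$ shrinks as the starting index $N$ grows.

```python
# Illustrative assumption: u_n = (-1)^(n+1)/n (alternating harmonic series).
# Check numerically that S_{2m}, the sum of 2m consecutive terms starting
# at u_N, tends to 0 as N grows (here with 2m = 4).

def u(n):
    return (-1) ** (n + 1) / n

def S(N, two_m):
    # Sum of two_m consecutive terms starting at u_N.
    return sum(u(n) for n in range(N, N + two_m))

for N in (10, 100, 1000, 10000):
    print(N, S(N, 4))  # block sums shrink toward 0 as N grows
```

The hypothesis in the question is exactly this behavior, required for every block length $2m$.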

This is not explained in the following proof:


**mfb** (Mentor) replied:
> Consider a sequence with $n^{th}$ term $u_n$. Let $S_{2m}$ be the sum of the $2m$ terms starting from $u_N$ for some $N\geq1$.
>
> If $\lim_{N\rightarrow\infty}S_{2m}=0$ for all $m$, then the series converges. Why?
This is not true as stated; consider $u_n=(-1)^n$. You also have to add that the $u_n$ are alternating and decreasing in magnitude.

The proof looks a bit sloppy, but you can use a similar approach to show that every partial sum lies between two numbers that approach the same limit; the sandwich (squeeze) theorem then gives convergence.
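The squeezing can be seen numerically. Assuming again the alternating harmonic series as an example, the even-indexed partial sums increase and the odd-indexed partial sums decrease, and both approach the same limit $\ln 2$, trapping every partial sum between them:

```python
import math

# For the alternating harmonic series sum (-1)^(n+1)/n, the even partial
# sums increase toward ln(2) and the odd partial sums decrease toward
# ln(2), so every partial sum is squeezed between the two sequences.

def partial(k):
    return sum((-1) ** (n + 1) / n for n in range(1, k + 1))

for k in (10, 100, 1000):
    even, odd = partial(2 * k), partial(2 * k + 1)
    print(even, math.log(2), odd)  # even <= ln 2 <= odd, gap shrinking
```

This is exactly the structure the sandwich theorem needs: two monotone bracketing sequences with a common limit.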