davidge

## Homework Statement

Given ##b_n = 1 / n## if ##n## odd and ##b_n = 1 / n^2## if ##n## even, show that the series $$\sum_{n=1}^{\infty} (-1)^n b_n$$ diverges.

## Homework Equations

Didn't find any for this problem.

## The Attempt at a Solution

I assumed that ##\sum_{n=1}^{\infty} (-1)^n b_n = \sum_{n \ \text{even}} b_n - \sum_{n \ \text{odd}} b_n##.

If we make the substitution ##n = 2k##, ##k \in \mathbb{N}##, for ##n## even and ##n = 2k + 1##, ##k \in \mathbb{N} \cup \{0\}##, for ##n## odd, then the original series becomes

$$\sum_{n=1}^{\infty} (-1)^n b_n = \sum_{k=1}^{\infty} \frac{1}{(2k)^2} - \sum_{k=0}^{\infty} \frac{1}{2k+1}$$
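As a quick sanity check on the re-indexing (a sketch, not part of the proof): for any finite cutoff ##N = 2M##, the partial sum of the original series equals the difference of the two re-indexed partial sums exactly, since ##\{2k : k \ge 1\}## and ##\{2k+1 : k \ge 0\}## partition the integers up to ##N##. The helper names `b` and `M` below are my own:

```python
# Finite check: sum_{n=1}^{2M} (-1)^n b_n equals
# sum_{k=1}^{M} 1/(2k)^2 - sum_{k=0}^{M-1} 1/(2k+1) exactly.

def b(n):
    # b_n = 1/n for odd n, 1/n^2 for even n
    return 1 / n if n % 2 == 1 else 1 / n**2

M = 50
N = 2 * M

original = sum((-1) ** n * b(n) for n in range(1, N + 1))
even_part = sum(1 / (2 * k) ** 2 for k in range(1, M + 1))
odd_part = sum(1 / (2 * k + 1) for k in range(0, M))

print(abs(original - (even_part - odd_part)))  # should be ~0
```

For finite partial sums this identity is uncontroversial; the question below is about what happens in the limit.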

By the integral test, the second sum on the RHS diverges (while the first converges, being a constant multiple of ##\sum 1/k^2##), which causes the whole series to diverge.
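A quick numerical check (illustrative only, not a proof): the partial sums ##S_N = \sum_{n=1}^{N} (-1)^n b_n## should decrease without bound if the series diverges to ##-\infty##, since the surviving ##-1/n## terms dominate the convergent ##+1/n^2## part. The helper names `b` and `partial_sum` are my own:

```python
# Partial sums of sum_{n=1}^N (-1)^n b_n, where
# b_n = 1/n for odd n and 1/n^2 for even n.

def b(n):
    return 1 / n if n % 2 == 1 else 1 / n**2

def partial_sum(N):
    return sum((-1) ** n * b(n) for n in range(1, N + 1))

for N in (10, 100, 1_000, 10_000, 100_000):
    print(N, partial_sum(N))
```

The printed values keep drifting downward roughly like ##-\tfrac{1}{2}\ln N##, consistent with divergence.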

The professor said it can't be done the way I did, because any rearrangement of the series changes the final result. But this doesn't seem true to me, since we are dealing with __all__ the terms. How can the result be different?

BTW, this was an exam question and she gave me a __zero__, claiming that I was wrong on that.