# Let's suppose that both sequences are convergent

1. Dec 22, 2004

Consider the following statement:

If $$\left\{ a_n \right\}$$ and $$\left\{ b_n \right\}$$ are divergent, then $$\left\{ a_n b_n \right\}$$ is divergent.

I need to decide whether it is true or false, and explain why. The real problem is that I checked the answer in my book; it's false, but I don't understand it. Here is what I think:

Let's suppose that both sequences are convergent. Then, it follows that

$$\lim _{n\to \infty} a_n \cdot \lim _{n\to \infty} b_n = \lim _{n\to \infty} \left( a_n b_n \right) \tag{1}$$

But in fact both are divergent. So $$\lim _{n\to \infty} a_n \neq 0$$ and $$\lim _{n\to \infty} b_n \neq 0$$. If neither limit is zero, then how can $$\lim _{n\to \infty} \left( a_n b_n \right) = 0$$ (so that the statement is false)? That doesn't seem reasonable in view of (1).

Thank you very much.

2. Dec 22, 2004

### quasar987

If a sequence is convergent, that doesn't necessarily mean it converges to 0, as you seemed to be implying in your post.

To decide whether the proposition is true or false, a simple counterexample suffices. Consider $a_n = (-1)^n$ and $b_n = (-1)^n$. These are both divergent sequences: when $n$ is even, $a_n = 1$, and when $n$ is odd, $a_n = -1$, so the terms keep alternating between two values ==> the limit is not unique ==> it doesn't exist ==> the sequences diverge. But $a_n b_n = (-1)^{2n} = 1 \ \forall n \in \mathbb{N}$ is a sequence that converges to 1.

Or take $a_n = (-1)^n$ and $b_n = (-1)^{n+1}$. Then $a_n b_n = (-1)^{2n+1} = -1 \ \forall n \in \mathbb{N}$, which converges to -1.
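As a quick numerical sanity check (my addition, not part of the original post), both counterexamples can be verified in a few lines of Python: the factor sequences oscillate, yet the product sequences are constant.

```python
# Verify that the product of two divergent sequences can converge.
# a_n = (-1)^n diverges (oscillates between 1 and -1), yet both
# products below are constant sequences, hence trivially convergent.

def a(n):
    return (-1) ** n          # divergent: oscillates between 1 and -1

def b1(n):
    return (-1) ** n          # first counterexample: b_n = a_n

def b2(n):
    return (-1) ** (n + 1)    # second counterexample: opposite sign at every n

products1 = [a(n) * b1(n) for n in range(20)]
products2 = [a(n) * b2(n) for n in range(20)]

print(products1[:5])  # every term is 1  -> converges to 1
print(products2[:5])  # every term is -1 -> converges to -1
```

The point is purely illustrative: a constant sequence converges no matter how wildly its factors behave individually.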

Last edited: Dec 22, 2004
3. Dec 22, 2004

### quasar987

N.B. But in the case where $a_n$ and $b_n$ are divergent because they increase or decrease without bound (i.e., because their limits are plus or minus infinity), then it is true that $a_n b_n$ is also a divergent sequence.
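A sketch of why this special case works (my wording, not quasar987's): if both sequences blow up in absolute value, so does their product.

```latex
% Claim: if |a_n| -> infinity and |b_n| -> infinity,
% then |a_n b_n| -> infinity, so {a_n b_n} diverges.
Given $M > 0$, choose $N_1$ such that $|a_n| > \sqrt{M}$ for all $n > N_1$,
and $N_2$ such that $|b_n| > \sqrt{M}$ for all $n > N_2$.
Then for every $n > \max(N_1, N_2)$,
$$|a_n b_n| = |a_n|\,|b_n| > \sqrt{M}\cdot\sqrt{M} = M,$$
so $\left\{ a_n b_n \right\}$ is unbounded and therefore divergent.
```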

4. Dec 22, 2004

Oh... I see what you mean. I thought that way because I had in mind the theorem that says that when a series is convergent, the terms go to 0. It doesn't apply in this case, since we only have sequences. Thanks.

5. Dec 22, 2004

I see, so it isn't true in general.

6. Dec 22, 2004

### NateTG

The following is the problematic step: "Let's suppose that both sequences are convergent."
The hypothesis is that both sequences are divergent, not convergent, so that supposition does not apply to this question in any useful way.

For example, $a_n=b_n=(-1)^n$ are two divergent sequences, but $\{a_nb_n\}$ is constant, so it clearly converges.

By the way, the theorem about series only indicates that the terms of a convergent series go to zero. There are divergent series whose terms go to zero, like the harmonic series:
$$\sum_{i=1}^{\infty} \frac{1}{i}$$
And since, taking $a_i = b_i = \frac{1}{i}$, both series diverge while
$$\sum_{i=1}^{\infty} \frac{1}{i^2} = \frac{\pi^2}{6}$$
is convergent, the proposition is false for series as well as sequences.
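To illustrate numerically (my addition, not in the original thread): partial sums of the harmonic series keep growing without bound (roughly like $\log n$), while partial sums of $\sum 1/i^2$ settle near $\pi^2/6 \approx 1.6449$.

```python
import math

def partial_sum(f, n):
    """Sum f(i) for i = 1..n."""
    return sum(f(i) for i in range(1, n + 1))

checkpoints = (10, 100, 10000)
harmonic = [partial_sum(lambda i: 1 / i, n) for n in checkpoints]
basel    = [partial_sum(lambda i: 1 / i ** 2, n) for n in checkpoints]

print(harmonic)          # keeps increasing without bound
print(basel)             # approaches pi^2 / 6
print(math.pi ** 2 / 6)  # the Basel limit, about 1.6449
```

Both factor series (each term $1/i$) diverge, yet the series of term-by-term products converges, mirroring the sequence counterexample above.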