# Convergence of a series

1. Jan 30, 2010

### talolard

Hey guys,
1. The problem statement, all variables and given/known data
Prove: if for every n, $$a_{n}>0$$ and $$\frac{a_{n+1}}{a_{n}}<1$$, then the series converges to zero: $$\lim_{n\rightarrow\infty} a_{n}=0$$

3. The attempt at a solution
We know that $$a_{n}$$ is bounded below by 0 and above by $$a_{1}$$. We also know that it is monotonically decreasing, and so it converges. But how do I show that it converges to 0? What is to stop it from converging to, say, 0.5?
Thanks
Tal

Last edited: Jan 30, 2010
2. Jan 30, 2010

### VeeEight

When you say series, are you referring to a summation or a sequence?

If $$a_{n} < 0$$ for all n, then all your terms are negative. So apply this fact to $$\frac{a_{n+1}}{a_{n}} < 1$$

3. Jan 30, 2010

### talolard

Sorry, I made a typo; that should be $$a_{n}>0$$.

4. Jan 30, 2010

### talolard

And I am referring to a sequence, not a summation. Pardon me, English is not my native language.

5. Jan 30, 2010

### talolard

Ahh, I misread the question: it was prove or disprove, and I found a counterexample.
Thanks anyway.
Tal
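Tal doesn't say which counterexample he found, but here is a minimal sketch of one (the specific sequence is my own illustration, not taken from the thread): $$a_{n} = \frac{1}{2} + \frac{1}{n}$$ is positive and strictly decreasing, so every ratio $$\frac{a_{n+1}}{a_{n}}$$ is strictly below 1, yet the limit is 1/2 rather than 0. A quick numerical check:

```python
# Illustrative counterexample (my own choice, not necessarily the one
# found in the thread): a_n = 1/2 + 1/n.
# It satisfies a_n > 0 and a_{n+1}/a_n < 1 for every n,
# but lim a_n = 1/2, not 0.

def a(n):
    return 0.5 + 1.0 / n

# Every consecutive ratio is strictly below 1 ...
assert all(a(n + 1) / a(n) < 1 for n in range(1, 10_000))

# ... yet the terms settle near 1/2, not near 0.
print(a(10), a(1_000), a(1_000_000))
```

This shows why the monotone convergence argument in the first post can only give convergence to *some* limit L ≥ 0: the ratio condition never forces L to be 0, since the inequality $$\frac{a_{n+1}}{a_{n}}<1$$ can hold while the ratios approach 1.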