# Sequence Convergence Proof

1. Mar 14, 2013

### DotKite

1. The problem statement, all variables and given/known data
Let $\{a_n\}$ be a sequence such that $a_{n+1}^2 < a_n^2$ and $0 < a_{n+1} + a_n$. Show that the sequence is convergent.

2. Relevant equations

n/a

3. The attempt at a solution

So I am feeling like the monotone convergence theorem is the way to go here. It seems to me that $a_{n+1}^2 < a_n^2$ would imply the sequence is decreasing, but I do not know what to do with $0 < a_{n+1} + a_n$ to show it is bounded.

2. Mar 14, 2013

### Staff: Mentor

Without the second inequality, you could construct sequences like $1+\frac12,\ -(1+\frac13),\ 1+\frac14,\ -(1+\frac15),\ \dots$ Such a sequence has to be bounded by the first inequality alone, but boundedness is not sufficient for convergence.
With both inequalities, you can rule out sign switches of $a_n$ and get monotonicity.
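A quick numerical check of that counterexample (my own sketch; the closed form $a_k = (-1)^k\,(1 + \frac{1}{k+2})$ is an assumption matching the listed terms):

```python
# a_k = (-1)^k * (1 + 1/(k+2)) gives 1+1/2, -(1+1/3), 1+1/4, -(1+1/5), ...
a = [(-1) ** k * (1 + 1 / (k + 2)) for k in range(10)]

# The first inequality holds: the squares strictly decrease...
squares_decrease = all(a[k + 1] ** 2 < a[k] ** 2 for k in range(9))

# ...but the second inequality 0 < a_{k+1} + a_k fails for odd k,
# and the sequence oscillates near +-1 instead of converging.
second_holds = all(a[k + 1] + a[k] > 0 for k in range(9))

print(squares_decrease, second_holds)  # True False
```

So the first inequality alone bounds the sequence without forcing convergence.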
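One way the sign argument might go (a sketch, not necessarily the intended route):

```latex
% Claim: the two inequalities force a_n > 0 for every n.
% Suppose a_n \le 0 for some n. The second inequality gives
%     a_{n+1} > -a_n = |a_n|,
% while the first gives |a_{n+1}| < |a_n|. Combining,
%     |a_n| < a_{n+1} \le |a_{n+1}| < |a_n|,
% a contradiction. Hence a_n > 0 for all n, and
%     a_{n+1}^2 < a_n^2  \implies  a_{n+1} < a_n,
% so the sequence is decreasing and bounded below by 0.
```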

3. Mar 14, 2013

### DotKite

I do not understand how the first inequality shows that a_n is bounded.

4. Mar 15, 2013

### Staff: Mentor

$a_{n+1}^2 < a_n^2$ is equivalent to $|a_{n+1}| < |a_n|$, which leads to $|a_n| < |a_0|$ for all $n \geq 1$, so the sequence is bounded.