# Sequence Convergence Proof

## Homework Statement

Let {a_n} be a sequence such that (a_{n+1})^2 < (a_n)^2 and 0 < a_{n+1} + a_n for all n. Show that the sequence is convergent.


## The Attempt at a Solution

So I feel like the monotone convergence theorem is the way to go here. It seems to me that (a_{n+1})^2 < (a_n)^2 would imply the sequence is decreasing, but I do not know what to do with 0 < a_{n+1} + a_n to show it is bounded.

mfb
Mentor
Without the second inequality, you could construct sequences like 1+1/2, -1-1/3, 1+1/4, -1-1/5, ...: such a sequence has to be bounded based on the first inequality alone, but boundedness is not sufficient for convergence.
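To see that counterexample concretely, here is a quick numerical check (a sketch; the formula `(-1)**k * (1 + 1/(k+2))` is my own labeling of the terms listed above):

```python
# Check the counterexample: squares strictly decrease, yet the
# sequence alternates sign forever, so it cannot converge.
terms = [(-1) ** k * (1 + 1 / (k + 2)) for k in range(10)]
# First few terms: 1+1/2, -(1+1/3), 1+1/4, -(1+1/5), ...

# The first inequality holds: each square is smaller than the last.
assert all(terms[k + 1] ** 2 < terms[k] ** 2 for k in range(len(terms) - 1))

# But the second inequality fails for some consecutive pairs.
assert any(terms[k] + terms[k + 1] <= 0 for k in range(len(terms) - 1))

# And the sequence does not converge: consecutive terms always differ in sign.
assert all(terms[k] * terms[k + 1] < 0 for k in range(len(terms) - 1))
```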
With both inequalities, you can rule out sign switches of ##a_n## and get monotonicity.
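One way to fill in that step (a sketch of the argument outlined above; the contradiction at ##a_n \le 0## is my own phrasing):

```latex
\begin{aligned}
&\text{Claim: } a_n > 0 \text{ for all } n.\\
&\text{If } a_n \le 0 \text{ for some } n, \text{ then } 0 < a_{n+1} + a_n
  \implies a_{n+1} > -a_n = |a_n|,\\
&\text{while } a_{n+1}^2 < a_n^2 \implies |a_{n+1}| < |a_n|, \text{ so }
  a_{n+1} \le |a_{n+1}| < |a_n| < a_{n+1}, \text{ a contradiction.}\\
&\text{Hence every } a_n > 0; \text{ then } a_{n+1}^2 < a_n^2
  \implies a_{n+1} < a_n,\\
&\text{so } (a_n) \text{ is strictly decreasing and bounded below by } 0,
  \text{ and converges by the monotone convergence theorem.}
\end{aligned}
```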

I do not understand how the first inequality shows that a_n is bounded.

mfb
Mentor
##a_{n+1}^2 < a_n^2## is equivalent to ##|a_{n+1}| < |a_n|##, which leads to ##|a_{n}| < |a_0|\, \forall n \in \mathbb{N}##.
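A quick numerical sanity check of that bound, using ##a_n = 1 + 1/(n+1)## as one concrete sequence satisfying both hypotheses (my own choice of example):

```python
# One concrete sequence satisfying both hypotheses: a_n = 1 + 1/(n+1).
a = [1 + 1 / (n + 1) for n in range(50)]

# Both hypotheses hold:
assert all(a[n + 1] ** 2 < a[n] ** 2 for n in range(49))  # squares decrease
assert all(a[n + 1] + a[n] > 0 for n in range(49))        # positive pair sums

# The bound above: |a_n| < |a_0| for all n >= 1, so the sequence is bounded.
assert all(abs(a[n]) < abs(a[0]) for n in range(1, 50))

# And it is strictly decreasing, hence convergent (here to 1).
assert all(a[n + 1] < a[n] for n in range(49))
```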