Is the Sequence {a_n} Convergent?

DotKite

Homework Statement


Let ##\{a_n\}## be a sequence such that ##a_{n+1}^2 < a_n^2## and ##0 < a_{n+1} + a_n## for all ##n##. Show that the sequence is convergent.


Homework Equations



n/a

The Attempt at a Solution



So I am feeling like the monotone convergence theorem is the way to go here. It seems to me that ##a_{n+1}^2 < a_n^2## would imply the sequence is decreasing, but I do not know what to do with ##0 < a_{n+1} + a_n## to show it is bounded.
 
Without the second inequality, you could construct sequences like ##1+\frac{1}{2},\ -1-\frac{1}{3},\ 1+\frac{1}{4},\ -1-\frac{1}{5},\ \dots## Such a sequence has to be bounded based on the first inequality alone, but that is not sufficient for convergence.
With both inequalities, you can rule out sign switches of ##a_n## and get monotonicity.
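For a concrete check of that example (using my own indexing, which is not stated above), one can write it as ##a_n = (-1)^{n+1}\left(1+\frac{1}{n+1}\right)## for ##n \ge 1##. Then
$$a_{n+1}^2 = \left(1+\tfrac{1}{n+2}\right)^2 < \left(1+\tfrac{1}{n+1}\right)^2 = a_n^2,$$
so the first inequality holds and ##|a_n| \le \tfrac{3}{2}##, yet the terms keep jumping between values near ##+1## and ##-1##, so the sequence diverges. The second inequality is what fails here, e.g. ##a_2 + a_3 = -\tfrac{4}{3} + \tfrac{5}{4} = -\tfrac{1}{12} < 0##.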
 
I do not understand how the first inequality shows that a_n is bounded.
 
##a_{n+1}^2 < a_n^2## is equivalent to ##|a_{n+1}| < |a_n|##, which leads to ##|a_{n}| < |a_0|\, \forall n \in \mathbb{N}##.
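One way to put the two hints together (a sketch of my own, assuming this is the intended route rather than something stated above): factor the first inequality,
$$a_{n+1}^2 - a_n^2 = (a_{n+1} - a_n)(a_{n+1} + a_n) < 0.$$
Since ##a_{n+1} + a_n > 0##, the other factor must be negative, so ##a_{n+1} < a_n## and the sequence is decreasing. Moreover ##2a_n > a_{n+1} + a_n > 0## gives ##a_n > 0## for every ##n##, so the sequence is decreasing and bounded below, and the monotone convergence theorem gives convergence.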
 