# Question on convergence.

1. Sep 1, 2006

### MathematicalPhysicist

Let a_n be a sequence with a_n >= 0 for every n, which satisfies lim (a_n)^1/n < 1 as n approaches infinity. Prove that lim a_n = 0.
What I did is as follows: for every e>0 there exists n0 such that for every n>=n0, |a_n^1/n - a| < e.
Then 0 <= a_n < (a+e)^n,
and since a < 1, we get a_n < (1+e)^n.
But how do I proceed from here?

2. Sep 1, 2006

### 0rthodontist

You should mention what a is (the limit of {(a_n)^1/n}).
Remember, you can choose e to be whatever you like, and you know that a < 1. So why not choose e small enough that a + e < 1? Then use the sandwich theorem.
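Spelling out that suggestion as a short derivation (taking a to be the limit of {(a_n)^1/n}, as in the original post, and picking one concrete choice of e as an illustration):

```latex
Let $a = \lim_{n\to\infty} (a_n)^{1/n} < 1$ and choose
$\varepsilon = \tfrac{1-a}{2} > 0$, so that $c := a + \varepsilon = \tfrac{1+a}{2} < 1$.
By the definition of the limit there is an $n_0$ such that for all $n \ge n_0$,
\[
  (a_n)^{1/n} < a + \varepsilon = c,
  \qquad\text{hence}\qquad
  0 \le a_n < c^n .
\]
Since $0 \le c < 1$ we have $c^n \to 0$, so by the sandwich theorem $a_n \to 0$.
```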

3. Sep 2, 2006

### MathematicalPhysicist

I thought about this, but even then I get that 0 <= a_n < 1^n. I also thought to write e = 1/a - 1 > 0, but (1/a)^n doesn't converge.
Perhaps I should write 0 <= a_n < 1/n <= 1, but nothing guarantees us that a_n is smaller than 1/n; a_n could equal 2/n.

4. Sep 2, 2006

### MathematicalPhysicist

OK, I think I got it: (a+e)^n approaches 0 when e is small enough.

5. Sep 2, 2006

### nocturnal

Yes, in fact you want e small enough so that a + e < 1, as Ortho mentioned. The reason is that one can prove that if c < 1, then lim (c)^n = 0 as n goes to infinity.
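One standard way to prove that lemma (a sketch using Bernoulli's inequality; nocturnal may have had a different proof in mind):

```latex
For $0 < c < 1$ write $c = \frac{1}{1+h}$ with $h = \frac{1}{c} - 1 > 0$.
Bernoulli's inequality gives $(1+h)^n \ge 1 + nh$, so
\[
  0 < c^n = \frac{1}{(1+h)^n} \le \frac{1}{1+nh} \longrightarrow 0
  \quad\text{as } n \to \infty .
\]
The case $c = 0$ is trivial, so $\lim_{n\to\infty} c^n = 0$ for all $0 \le c < 1$.
```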

6. Sep 3, 2006

### nocturnal

It is too late to edit the above post, so I would just like to add a correction. There should be an absolute value sign around c, so it should say if |c| < 1, then lim(c)^n = 0.