Why Does (r^n) Converge to 0?

  • Thread starter: Lily@pie
  • Tags: Convergence
Lily@pie
I know this is very basic, but somehow my brain just couldn't accept it!

Homework Statement


I don't understand why the sequence (r^n) converges to 0 as n -> infinity when |r| < 1 (i.e. -1 < r < 1).

The Attempt at a Solution


I tried a few ways to convince myself.
Firstly, for 0 < r < 1 we have 0 < r^(n+1) < r^n < 1, so the sequence (r^n) is decreasing and bounded below by 0, and by the squeeze theorem r^n converges to 0. It then also holds for -1 < r < 0 because |r^n| = |r|^n.

This approach seems to be correct, but it doesn't feel convincing enough.
 
If you want to make it more tangible, you could try it with some numbers:

If r = 0.1, then you get { 0.1, 0.01, 0.001, 0.0001, ... } and you see this quickly tends to 0.
If r = 0.5, then again { 0.5, 0.25, 0.125, 0.0625, ...}
Even for r = 0.99, { 0.99, 0.9801, ... } decreases much more slowly, but 0.99^1000 is already of the order of 5 x 10^-5.

Clearly, the boundary case is r = 1, as { 1, 1, 1, ... } never converges to 0.
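
A quick numerical check (a minimal Python sketch; the particular values of r and n are just illustrative) reproduces these orders of magnitude:

```python
# Print r**n for a few ratios |r| < 1 and increasing n,
# to see how quickly each sequence approaches 0.
for r in (0.1, 0.5, 0.99):
    for n in (1, 10, 100, 1000):
        print(f"r = {r:<4}  n = {n:<4}  r**n = {r ** n:.3e}")
```

For r = 0.99 this gives roughly 4.3 x 10^-5 at n = 1000, consistent with the estimate above.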

For the rigorous proof, refer to your own post :-) What you did there is correct.
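
For completeness, here is one standard way to pin down the limit from the monotonicity in your attempt (a sketch, using the monotone convergence theorem): for 0 < r < 1 the sequence (r^n) is decreasing and bounded below by 0, so it converges to some L >= 0, and then

\[
L = \lim_{n\to\infty} r^{n+1} = r \cdot \lim_{n\to\infty} r^{n} = rL
\;\Longrightarrow\; L(1-r) = 0
\;\Longrightarrow\; L = 0 .
\]

The case -1 < r < 0 then follows from |r^n| = |r|^n, exactly as you argued.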
 
|r| < 1, so for r ≠ 0 (the case r = 0 is trivial) we can write r = 1/x where |x| > 1. Then r^n = 1/(x^n). If |x| > 1, then |x^n| = |x|^n grows without bound as n goes to infinity, so 1/(a huge number) goes to zero.
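
The step "|x|^n grows without bound" can be made precise with Bernoulli's inequality (a standard estimate, sketched here): write |x| = 1 + h with h > 0, so that

\[
|x|^n = (1+h)^n \ge 1 + nh \xrightarrow[n\to\infty]{} \infty,
\qquad
|r^n| = \frac{1}{|x|^n} \le \frac{1}{1+nh} \xrightarrow[n\to\infty]{} 0 .
\]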
 
Ok. Thanks so much
 