I know this is very basic, but my brain just somehow couldn't accept it!!

1. The problem statement, all variables and given/known data

I don't understand why the sequence (r^n) converges to 0 as n -> infinity when |r| < 1.

3. The attempt at a solution

I tried quite a few ways to convince myself. Firstly, for 0 < r < 1 we have 1/r > 1, so (1/r)^n grows without bound, and therefore r^n = 1/(1/r)^n converges to 0. Then it also holds for -1 < r < 0, because |r^n| = |r|^n, and by the squeeze theorem -|r|^n <= r^n <= |r|^n forces r^n to 0 as well. This way seems to be correct, but it doesn't feel convincing enough.
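For what it's worth, one standard way to make the positive case fully rigorous (this goes a bit beyond my attempt above; it uses Bernoulli's inequality, which is not something I invoked in the original argument) is the following sketch:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $0 < |r| < 1$ and write $\tfrac{1}{|r|} = 1 + h$ with $h > 0$.
By Bernoulli's inequality,
\[
  \left(\frac{1}{|r|}\right)^{n} = (1+h)^{n} \ge 1 + nh ,
\]
so
\[
  0 \le |r^{n}| = |r|^{n} \le \frac{1}{1 + nh} \longrightarrow 0
  \quad (n \to \infty),
\]
and the squeeze theorem then gives $r^{n} \to 0$, covering both
$0 < r < 1$ and $-1 < r < 0$ at once.
\end{document}
```

This quantifies exactly how fast (1/|r|)^n blows up, which is the step my "grows without bound" hand-wave was missing.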