I'm reading How to Ace the Rest of Calculus, and on page 31 there's a test for divergence that says: if the limit (as n goes to infinity) of [tex]a_n[/tex] is NOT equal to zero, then the infinite series [tex]\sum_{n=1}^{\infty} a_n[/tex] diverges.

Three pages earlier, on page 28, there's an example of a series that converges to 1. The sum is represented by [tex]S_n = \frac{2^n - 1}{2^n}[/tex].

Here's where I'm confused: since the limit of that [tex]S_n[/tex] equals 1, shouldn't the series diverge? Come to think of it, I know I'm missing something, because as I understand it that test would make ANY series whose limit is any number other than 0 diverge, which makes no sense. I know the test does not work in reverse, btw (the limit going to 0 doesn't mean the series converges), but this still sounds like every convergent series must converge to 0... which I know is wrong.
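For reference, here is the page-28 example as I understand it (I'm assuming it's the geometric series with terms [tex]a_n = 1/2^n[/tex]; the book may set it up slightly differently). The sums come out to

[tex]S_n = \sum_{k=1}^{n} \frac{1}{2^k} = 1 - \frac{1}{2^n} = \frac{2^n - 1}{2^n}, \qquad \lim_{n \to \infty} S_n = 1.[/tex]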