As far as I understand, a sequence converges if and only if it is Cauchy. So suppose that for some sequence a_n we have, for every \epsilon > 0, that |a_n - a_{n+1}| < \epsilon for all large enough n.
We could then say a_n converges if and only if \lim_{n \rightarrow \infty} (a_n - a_{n+1}) = 0.
But what if a_n = ln(n)?
ln(n) - ln(n+1) = ln(n/(n+1)), so ln(n) - ln(n+1) goes to 0 as n tends to infinity. By the above, I should then be able to say that the sequence converges, but ln(n) obviously goes to infinity as n increases.
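Spelling out the limit I'm relying on:

\lim_{n \rightarrow \infty} (\ln(n) - \ln(n+1)) = \lim_{n \rightarrow \infty} \ln\left(\frac{n}{n+1}\right) = \ln(1) = 0.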
What's the mistake in my reasoning?