CaptainADHD said:
Both series are just 1 divided by a gigantic number... 1/n somehow does not converge, yet 1/n^2 does...
It looks like you're still missing my earlier point about the distinction between a sequence of numbers {a_n} and an infinite series whose terms are those of the sequence.
A sequence is nothing more than a list of numbers. For the first, it is {1, 1/2, 1/3, 1/4, ..., 1/n, ...}. For the second, it is {1, 1/4, 1/9, 1/16, ..., 1/n^2, ...}.
As n gets large, both of the sequences you gave exhibit the same behavior; namely, their nth terms both converge to zero. In symbols,
\lim_{n \rightarrow \infty} \frac{1}{n} = 0 and
\lim_{n \rightarrow \infty} \frac{1}{n^2} = 0
BUT, the infinite series do very different things.
\sum_{n = 1}^\infty \frac{1}{n} diverges, while
\sum_{n = 1}^\infty \frac{1}{n^2} converges (in fact, to \frac{\pi^2}{6}).
This means that the partial sums of the first series grow without bound, while the partial sums of the second series also grow, but never exceed a certain fixed number.
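If it helps to see that concretely, here is one standard sketch of each claim (not the only argument, but a short one). Group the harmonic series into blocks of 1, 2, 4, 8, ... terms:
1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \cdots + \frac{1}{8}\right) + \cdots \geq 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots
since each block of 2^k terms is at least 2^k \cdot \frac{1}{2^{k+1}} = \frac{1}{2}, and adding \frac{1}{2} forever passes any bound you name. For the second series, use \frac{1}{n^2} \leq \frac{1}{n(n-1)} = \frac{1}{n-1} - \frac{1}{n} for n \geq 2, so every partial sum satisfies
\sum_{n = 1}^N \frac{1}{n^2} \leq 1 + \left(1 - \frac{1}{N}\right) < 2,
and an increasing sum that stays below 2 has to settle down to some limit.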
CaptainADHD said:
I know that you just integrate 1/n to get ln(n) which goes to infinity, but that does not really say anything beyond just "the formula says so, so it does"...
The real question is why two patterns of numbers that have the same behavior somehow have different convergence. Think about it logically: why would 1/n^1.000001 converge, yet 1/n not? They both do almost the exact same thing. The only real difference (from my perspective) is that some arbitrary math person declared "the harmonic series diverges".
How about this for an explanation? Can you accept that maybe \sum \frac{1}{n^{.1}} diverges? After all, n^{.1} is smaller than n, so each term \frac{1}{n^{.1}} is even larger than the corresponding term \frac{1}{n}.
Can you accept that \sum \frac{1}{n^{10}} converges? Here the successive terms shrink very quickly. Take it on faith for the moment that these two series behave as I have described; a short check of both claims is sketched below. Once you accept that, it becomes plausible that there is some exponent on the denominator that marks the dividing line between exponents giving a convergent series and exponents giving a divergent one, and in fact that dividing line is an exponent of exactly 1 (with the boundary case, exponent 1 itself, falling on the divergent side). So \sum \frac{1}{n^{.999}} diverges while \sum \frac{1}{n^{1.001}} converges.
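In fact, neither extreme case needs to be taken purely on faith; both follow by direct comparison with the two series we already discussed. For n \geq 1 we have n^{.1} \leq n, so
\frac{1}{n^{.1}} \geq \frac{1}{n},
and a series whose terms are at least as large as those of a divergent series must itself diverge, so \sum \frac{1}{n^{.1}} diverges. Likewise n^{10} \geq n^2 for n \geq 1, so
\frac{1}{n^{10}} \leq \frac{1}{n^2},
and a positive series whose terms are no larger than those of a convergent series must itself converge, so \sum \frac{1}{n^{10}} converges.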
Now, so that your mathematical training doesn't rest on faith in dogmatic rules: there was no single mathematician, or even a cartel of them in collusion, who decided arbitrarily that one series would converge while another diverged. Mathematics doesn't work that way. Someone proved it, namely by comparing these series to a definite integral that itself either converges or diverges (the integral test).
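For these particular series the relevant integral is \int_1^\infty \frac{dx}{x^p}, and a quick computation shows exactly where the dividing line sits:
\int_1^N \frac{dx}{x^p} = \frac{N^{1-p} - 1}{1 - p} \text{ for } p \neq 1, \qquad \int_1^N \frac{dx}{x} = \ln N.
As N \rightarrow \infty, the first expression stays bounded precisely when p > 1 and grows without bound when p < 1, while \ln N grows without bound as well. That is why the exponent 1 is the cutoff, and why it sits on the divergent side, rather than being anyone's arbitrary decree.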