Divergent series and the limit of the nth term as n approaches infinity

SUMMARY

This discussion focuses on convergence and divergence of series, specifically the behavior of the nth term as n approaches infinity. It is established that while a convergent series must have an nth term that approaches 0, a divergent series can also have an nth term that approaches 0, so terms tending to zero is necessary but not sufficient for convergence. Examples such as the series \(\sum \frac{1}{n}\) (which diverges) and \(\sum \frac{(-1)^n}{n}\) (which converges conditionally) illustrate these principles. The discussion emphasizes that convergence depends not only on the terms approaching zero but also on how fast they decrease and, for series with alternating signs, on cancellation between terms.

PREREQUISITES
  • Understanding of series and sequences in mathematics
  • Familiarity with convergence tests in calculus
  • Knowledge of absolute and conditional convergence
  • Basic grasp of limits and their properties
NEXT STEPS
  • Study the comparison test for series convergence
  • Learn about the ratio test and its applications
  • Explore the concept of absolute convergence versus conditional convergence
  • Investigate the implications of the p-series test for convergence
USEFUL FOR

Mathematicians, students studying calculus, and anyone interested in understanding the nuances of series convergence and divergence.

GeoMike
I'm looking for help with my conceptual understanding of part of the following:

1) If a series is convergent, its nth term approaches 0 as n approaches infinity
This makes perfect sense to me.

2) If the nth term of a series does not approach 0 as n approaches infinity, the series is divergent
Again, makes perfect sense.

3) A divergent series can have an nth term that approaches 0 as n approaches infinity. Thus #1 cannot be used as a test FOR convergence.
Here's where I'm thrown a little. I can follow the proofs in my textbook fine, and I think I see what they all suggest.
Essentially: The RATE at which the terms of a series approach zero (assuming they do at all) is what really determines convergence/divergence -- am I understanding this right?

Thanks,
-GM-
 
Yes, you're right. The sequence has to tend to zero fast enough for the sum to converge.

For example, the sum of 1/n does not converge although 1/n goes to zero; it goes to zero too slowly. The same holds for 1/log(n). In fact, as you know, the sum of 1/n^s converges for all s > 1 and diverges for s <= 1, so this gives you an idea of how fast the sequence should go to zero.
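
To make the "too slowly" point concrete, here is the classic grouping argument for the divergence of the harmonic series (a standard textbook proof, sketched for reference; it is not quoted from the thread):

\[
\sum_{n=1}^{\infty} \frac{1}{n}
= 1 + \frac{1}{2}
+ \underbrace{\frac{1}{3} + \frac{1}{4}}_{\ge \frac{1}{2}}
+ \underbrace{\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}}_{\ge \frac{1}{2}}
+ \cdots
\ge 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots
\]

Each block of 2^k consecutive terms contributes at least 1/2, so the partial sums grow without bound even though the individual terms go to zero.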
 
GeoMike said:
Essentially: The RATE at which the terms of a series approach zero (assuming they do at all) is what really determines convergence/divergence -- am I understanding this right?
Thanks,
-GM-

Not exactly. It is known that the series \sum \frac{1}{n} is divergent, while \sum \frac{(-1)^n}{n} is convergent.
Both have terms whose absolute values converge to zero at the same "rate".
However, the latter is, of course, not absolutely convergent.
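
For reference, the convergence of \sum \frac{(-1)^n}{n} follows from the alternating series (Leibniz) test; here is a minimal statement of that standard result, which the thread relies on without spelling out:

\[
\text{If } a_1 \ge a_2 \ge a_3 \ge \cdots \ge 0 \text{ and } a_n \to 0, \text{ then } \sum_{n=1}^{\infty} (-1)^n a_n \text{ converges.}
\]

With \(a_n = 1/n\) this gives \(\sum_{n=1}^{\infty} \frac{(-1)^n}{n} = -\ln 2\): it is the cancellation between consecutive terms, not the decay rate alone, that produces convergence here.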
 
Another example: \sum \frac{(-1)^n}{\sqrt{n}} also converges, but again not absolutely. In fact we can put any positive power of n in the denominator; even \sum \frac{(-1)^n}{n^{1/1000}} converges.
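
The same test covers this whole family; a sketch of the general statement (standard result, stated here for completeness rather than taken from the thread):

\[
\text{For any } p > 0,\ \frac{1}{n^p} \text{ decreases to } 0, \text{ so } \sum_{n=1}^{\infty} \frac{(-1)^n}{n^p} \text{ converges,}
\]

but it converges absolutely only when p > 1, since \(\sum \frac{1}{n^p}\) is the p-series. Even p = 1/1000 therefore works, though only conditionally.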
 
