Divergence Test for an Infinite Series (General question)


Discussion Overview

The discussion concerns how to prove divergence or convergence of an infinite series, and specifically whether the starting index of the series affects its convergence properties.

Discussion Character

  • Conceptual clarification

Main Points Raised

  • One participant questions if an infinite series must always start at n = 1 to prove divergence or convergence.
  • Another participant asserts that starting at n = 0 is ill-defined for the series \(\sum_{n=0}^{+\infty} \frac{1}{n}\) due to division by zero, thus excluding it from consideration.
  • It is proposed that starting the series at different indices, such as n = 1 or n = 10000, does not affect the convergence of the series.
  • A further point is made that the behavior of the series is not influenced by a finite number of initial terms, suggesting that \(\sum_{n=1}^{\infty} a_n\) and \(\sum_{n=k}^{\infty} a_n\) (for k > 1) exhibit the same convergence properties.

Areas of Agreement / Disagreement

Participants agree that the starting index can be adjusted without affecting convergence. The only caveat raised is that this particular series cannot start at n = 0, because the term 1/n is undefined there.

Contextual Notes

The discussion centers on the harmonic series as its example, but the closing post states the general principle: discarding any finite number of initial terms does not change whether a series converges or diverges.

MHCFesh
This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1?

For example, would doing a test for ##\sum_{n=1}^{\infty}\frac{1}{n}## be any different from ##\sum_{n=0}^{\infty}\frac{1}{n}##?
 
MHCFesh said:
This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1?

No

For example, would doing a test for ##\sum_{n=1}^{\infty}\frac{1}{n}## be any different from ##\sum_{n=0}^{\infty}\frac{1}{n}##?

Well, in this special case,

$$\sum_{n=0}^{+\infty} \frac{1}{n}$$

is ill-defined at ##n=0## (division by zero), so that term has to be excluded; you can't start at ##0## here. But doing a test at

$$\sum_{n=1}^{+\infty}\frac{1}{n}$$

or at

$$\sum_{n=10000}^{+\infty}\frac{1}{n}$$

is the same thing. It won't affect convergence.
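
As a quick numerical illustration (a minimal Python sketch; the helper name `partial_sum` and the cutoffs below are arbitrary choices, not from the posts above), the partial sums starting at ##n = 1## and at ##n = 10000## differ by the same fixed constant no matter how far out you sum, so they grow without bound together:

```python
def partial_sum(start, stop):
    """Sum of 1/n for n = start, ..., stop (inclusive)."""
    return sum(1.0 / n for n in range(start, stop + 1))

for stop in (10_000, 100_000, 1_000_000):
    s1 = partial_sum(1, stop)        # series started at n = 1
    s2 = partial_sum(10_000, stop)   # series started at n = 10000
    # The difference is always the fixed "head" sum over n = 1..9999,
    # independent of how far out we go.
    print(f"N = {stop:>9}: S_1 = {s1:.6f}, S_10000 = {s2:.6f}, diff = {s1 - s2:.6f}")
```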
 
Oh, that was silly of me. But thanks a lot for taking the time to explain!
 
And more generally, what happens in any finite number of terms at the beginning of the series doesn't affect its convergence or divergence. So these two series have the same behavior:
$$ \sum_{n = 1}^{\infty}a_n$$
and
$$ \sum_{n = k}^{\infty}a_n$$
In the latter series, ##k## is assumed to be larger than 1.
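
To spell out why: for any ##N \ge k##, the partial sums split at the fixed index ##k##,
$$\sum_{n = 1}^{N}a_n = \underbrace{\sum_{n = 1}^{k-1}a_n}_{\text{a fixed constant}} + \sum_{n = k}^{N}a_n,$$
so the two sequences of partial sums differ by a constant. One converges exactly when the other does (with limits differing by that same constant), and one diverges exactly when the other does.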
 
