Divergence Test for an Infinite Series (General question)

In summary: when proving convergence or divergence of an infinite series, the starting value of n does not affect the outcome, provided every term is defined. In special cases where the series is ill-defined at n = 0 (such as 1/n), that value must be excluded. More generally, the first finitely many terms of a series have no effect on whether it converges or diverges.
  • #1
MHCFesh
This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1?

For example, would doing a test for [itex]\sum^{∞}_{n=1}\frac{1}{n}[/itex] be any different from [itex]\sum^{∞}_{n=0}\frac{1}{n}[/itex]?
 
  • #2
MHCFesh said:
This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1?

No.

For example, would doing a test for [itex]\sum^{∞}_{n=1}\frac{1}{n}[/itex] be any different from [itex]\sum^{∞}_{n=0}\frac{1}{n}[/itex]?

Well, in this special case,

[tex]\sum_{n=0}^{+\infty} \frac{1}{n}[/tex]

is ill-defined at ##n=0## (division by zero), so that term must be excluded; you can't start at ##0## here. But applying a test to

[tex]\sum_{n=1}^{+\infty}\frac{1}{n}[/tex]

or at

[tex]\sum_{n=10000}^{+\infty}\frac{1}{n}[/tex]

is the same thing. It won't affect convergence.
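
If you want to see this numerically, here's a minimal Python sketch (the cutoff values are arbitrary choices for illustration). The two partial sums differ only by the fixed constant ##\sum_{n=1}^{9999} \frac{1}{n}##, so they grow without bound together:

[code]
# Partial sums of the harmonic series starting at n = 1 and at n = 10000.
# Both grow without bound; they differ only by a fixed constant.

def partial_sum(start, stop):
    """Return the sum of 1/n for n = start, ..., stop."""
    return sum(1.0 / n for n in range(start, stop + 1))

for stop in (10**5, 10**6):
    s1 = partial_sum(1, stop)
    s2 = partial_sum(10000, stop)
    print(f"up to n = {stop}: from n=1 -> {s1:.4f}, "
          f"from n=10000 -> {s2:.4f}, difference = {s1 - s2:.4f}")
[/code]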
 
  • #3
Oh, that was silly of me. But thanks a lot for taking the time to explain!
 
  • #4
And more generally, changing any finite number of terms at the beginning of a series doesn't affect its convergence or divergence. So these two series have the same behavior:
$$ \sum_{n = 1}^{\infty}a_n$$
and
$$ \sum_{n = k}^{\infty}a_n$$
In the latter series, k is assumed to be larger than 1.
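One way to see why: if every term is defined, then
$$ \sum_{n = 1}^{\infty}a_n = \sum_{n = 1}^{k - 1}a_n + \sum_{n = k}^{\infty}a_n, $$
and the first sum on the right is just a finite number. Adding a finite number can't change whether the tail converges or diverges.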
 

FAQ: Divergence Test for an Infinite Series (General question)

1. What is the Divergence Test for an Infinite Series?

The Divergence Test is a tool used in mathematics to determine whether an infinite series diverges. It states that if the terms of a series do not approach zero (their limit is nonzero or does not exist), then the series diverges.
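
In symbols, for a series with terms ##a_n##:
$$ \lim_{n \to \infty} a_n \neq 0 \quad\Longrightarrow\quad \sum_{n=1}^{\infty} a_n \text{ diverges}, $$
and the same conclusion holds if the limit does not exist.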

2. How do you apply the Divergence Test to an Infinite Series?

To apply the Divergence Test, take the limit of the terms of the series as n approaches infinity. If the limit is nonzero (or does not exist), the series diverges. If the limit is zero, the test is inconclusive, and other methods must be used to determine convergence or divergence.
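
For instance, here is a small SymPy sketch of this procedure (the two example series are illustrative choices):

[code]
# Applying the Divergence Test with SymPy; the series a_n = n/(n+1)
# and a_n = 1/n are illustrative examples.
import sympy as sp

n = sp.symbols('n', positive=True, integer=True)

# a_n = n/(n+1): the limit is 1, not 0, so the series diverges.
print(sp.limit(n / (n + 1), n, sp.oo))  # prints 1

# a_n = 1/n: the limit is 0, so the test is inconclusive
# (the harmonic series in fact diverges, but by other tests).
print(sp.limit(1 / n, n, sp.oo))        # prints 0
[/code]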

3. Can the Divergence Test be used on all types of infinite series?

Yes. The Divergence Test applies to any infinite series, regardless of the signs of its terms: if the terms do not tend to zero, the series diverges. Other tests, such as the comparison test, do require nonnegative terms; series with mixed signs are often handled by checking for absolute convergence or by the alternating series test once the Divergence Test is inconclusive.

4. What is the difference between a convergent and a divergent series?

A convergent series is one whose partial sums approach a finite value as more terms are added. In contrast, a divergent series is one whose partial sums either grow without bound or fail to approach any single value.
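
As a numerical illustration (using two standard example series), the partial sums of 1/n² settle toward the finite limit π²/6, while those of 1/n keep growing:

[code]
# Partial sums of a convergent series (1/n^2) versus a divergent one (1/n).
import math

for N in (10**2, 10**4, 10**6):
    conv = sum(1 / k**2 for k in range(1, N + 1))  # approaches pi^2 / 6
    div = sum(1 / k for k in range(1, N + 1))      # grows like ln(N)
    print(f"N = {N}: sum 1/n^2 = {conv:.6f} "
          f"(limit {math.pi**2 / 6:.6f}), sum 1/n = {div:.4f}")
[/code]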

5. Are there any limitations or drawbacks to using the Divergence Test?

One limitation of the Divergence Test is that it can only show that a series diverges; it can never establish convergence. Additionally, the test is inconclusive whenever the terms tend to zero, in which case other tests must be used to determine convergence or divergence.
