- Thread starter MHCFesh

In summary, when testing an infinite series for convergence or divergence, the starting value of n does not affect the outcome, provided every term is defined. For the harmonic series, the term 1/n is undefined at n = 0, so that value must be excluded. More generally, the first few terms of a series never affect whether it converges or diverges.

- #1

MHCFesh

This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1? For example, would doing a test for [itex]\sum^{∞}_{n=1}\frac{1}{n}[/itex] be any different from [itex]\sum^{∞}_{n=0}\frac{1}{n}[/itex]?


- #2


MHCFesh said: This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1?

No.

MHCFesh said: For example, would doing a test for [itex]\sum^{∞}_{n=1}\frac{1}{n}[/itex] be any different from [itex]\sum^{∞}_{n=0}\frac{1}{n}[/itex]?

Well, in this special case,

[tex]\sum_{n=0}^{+\infty} \frac{1}{n}[/tex]

is ill-defined at ##n=0## (division by zero), so that value must be excluded: you can't start at ##0## here. But doing a test at

[tex]\sum_{n=1}^{+\infty}\frac{1}{n}[/tex]

or at

[tex]\sum_{n=10000}^{+\infty}\frac{1}{n}[/tex]

is the same thing. The starting index won't affect convergence.
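The point above can be checked numerically. A minimal sketch (the function name is illustrative): partial sums of the harmonic series started at n = 1 and at n = 10000 differ only by the fixed, finite sum of the skipped terms, so the two series grow without bound together.

```python
# Partial sums of the harmonic series from two different starting indices.
# They differ by the fixed, finite sum of the skipped terms, so the two
# series converge or diverge together.

def partial_sum(start, stop):
    """Sum of 1/n for n = start, ..., stop - 1."""
    return sum(1.0 / n for n in range(start, stop))

N = 100_000
s_from_1 = partial_sum(1, N)           # starts at n = 1
s_from_10000 = partial_sum(10_000, N)  # starts at n = 10000
head = partial_sum(1, 10_000)          # the 9999 skipped terms

# The gap between the two partial sums is the constant `head`,
# no matter how large N grows.
print(abs((s_from_1 - s_from_10000) - head) < 1e-9)
```

Increasing N makes both partial sums larger without bound, but the gap between them stays fixed at `head`.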

- #3

MHCFesh


Oh, that was silly of me. But thanks a lot for taking the time to explain!

- #4

Mark44

Mentor


$$ \sum_{n = 1}^{\infty}a_n$$

and

$$ \sum_{n = k}^{\infty}a_n$$

In the latter series, k is assumed to be larger than 1.
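The reason the starting index is irrelevant can be written out explicitly: the full series splits into a finite head plus the tail,

$$ \sum_{n = 1}^{\infty}a_n = \sum_{n = 1}^{k-1}a_n + \sum_{n = k}^{\infty}a_n$$

and since the head is a finite number, the series on the left converges exactly when the tail on the right does.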

The Divergence Test is a tool used in mathematics to show that an infinite series diverges. It states that if the terms of a series do not approach zero, then the series diverges.

To apply the Divergence Test, take the limit of the terms of the series as n approaches infinity. If the limit is nonzero or does not exist, the series diverges. If the limit equals zero, the test is inconclusive and other methods must be used to determine convergence or divergence.
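That procedure can be sketched numerically. This is a heuristic check, not a proof (sampling one large n does not establish a limit), and the helper name is illustrative:

```python
# Heuristic Divergence Test: sample a_n at a large index. If the terms are
# clearly bounded away from zero, the series sum(a_n) diverges; if they look
# like they tend to zero, the test says nothing either way.

def terms_stay_away_from_zero(a, n=10**6, tol=1e-3):
    """Crude check of whether a(n) is still far from zero for large n."""
    return abs(a(n)) > tol

# Terms of sum n/(n+1) tend to 1, so the series diverges.
print(terms_stay_away_from_zero(lambda n: n / (n + 1)))  # True -> diverges

# Terms of sum 1/n tend to 0, so the test is inconclusive
# (the harmonic series in fact diverges anyway).
print(terms_stay_away_from_zero(lambda n: 1 / n))        # False -> inconclusive
```

The harmonic series is exactly the case where the test fails to help: its terms go to zero, yet the series still diverges.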

The Divergence Test applies to series with terms of any sign: if the terms do not approach zero, the series diverges regardless of sign. For series with negative or alternating terms whose terms do tend to zero, other tools, such as a check for absolute convergence or the Alternating Series Test, are needed.

A convergent series is one in which the sum of all the terms approaches a finite value as the number of terms increases. In contrast, a divergent series is one in which the sum of the terms either approaches infinity or does not approach any value at all.

One limitation of the Divergence Test is that it can only show that a series diverges; it can never establish convergence. The test is also inconclusive whenever the terms approach zero, in which case other tests must be used to determine convergence or divergence.
