
Divergence Test for an Infinite Series (General question)

  1. Mar 18, 2014 #1
    This might seem like a rudimentary question, but when trying to prove divergence (or even convergence) of an infinite series, does the series always have to start at n = 1?

    For example, would doing a test for [itex]\sum^{∞}_{n=1}\frac{1}{n}[/itex] be any different from doing one for [itex]\sum^{∞}_{n=0}\frac{1}{n}[/itex]?
  2. Mar 18, 2014 #2

    Well, in this special case,

    [tex]\sum_{n=0}^{+\infty} \frac{1}{n}[/tex]

    is ill-defined at ##n=0## (division by zero), so we need to exclude that term; you can't start at ##0## here. But doing a test on

    [tex]\sum_{n=1}^{+\infty} \frac{1}{n}[/tex]

    or on

    [tex]\sum_{n=2}^{+\infty} \frac{1}{n}[/tex]

    is the same thing. It won't affect convergence.
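
    In fact, the partial sums of those two series differ only by the single term that was dropped: for ##N \ge 2##,

    [tex]\sum_{n=2}^{N} \frac{1}{n} = \sum_{n=1}^{N} \frac{1}{n} - 1[/tex]

    so the partial sums differ by the constant ##1##, and the two series converge or diverge together (here both diverge).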
  3. Mar 18, 2014 #3
    Oh, that was silly of me. But thanks a lot for taking the time to explain!
  4. Mar 18, 2014 #4


    Staff: Mentor

    And more generally, what happens in any finite number of terms at the beginning of a series doesn't affect whether it converges or diverges. So these two series have the same behavior:
    $$ \sum_{n = 1}^{\infty}a_n$$
    $$ \sum_{n = k}^{\infty}a_n$$
    In the latter series, k is assumed to be an integer larger than 1.
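    To see why, split off the finitely many terms at the front: for ##N \ge k##,
    $$ \sum_{n = 1}^{N}a_n = \sum_{n = 1}^{k - 1}a_n + \sum_{n = k}^{N}a_n$$
    The first sum on the right is a fixed number, so the partial sums of the two series differ only by that constant, and as ##N \to \infty## they converge or diverge together.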