
Convergence of a series

  1. May 8, 2014 #1

    ChrisVer

    User Avatar
    Gold Member

    1. The problem statement, all variables and given/known data

    For 0<q<∞, and x rational, for what x values does the series converge?
    [itex] \sum_{n=0}^{∞} q^{1/n} x^n[/itex]


    3. The attempt at a solution
    I don't know which method works best for this
     
  3. May 8, 2014 #2
    Ratio test. Then check boundaries.
     
  4. May 8, 2014 #3
    Next time, try several methods yourself first and learn from your mistakes, so you can figure out the best approach on your own. Asking people for help won't give you the intuition you need to apply these methods during exams.
     
  5. May 8, 2014 #4

    ChrisVer

    User Avatar
    Gold Member

    The ratio test doesn't seem to work:
    [itex] \frac{ q^{\frac{1}{n+1}} x^{n+1}}{ q^{\frac{1}{n}} x^{n}}[/itex]
    doesn't give any obvious result....
    [itex] \frac{q^{\frac{1}{n+1}}}{q^{\frac{1}{n}}} x[/itex]

    What happens with q^(1/n)?
    I'll have
    [itex] q^{1/n} \rightarrow 1 [/itex] as n goes to infinity... but does this help in getting a condition on x?

    (thank god I won't need to take any exams)
     
  6. May 8, 2014 #5

    ChrisVer

    User Avatar
    Gold Member

    On the other hand, [itex]q^{1/n}\rightarrow 1[/itex], since:
    [itex] \ln q^{1/n}= \frac{1}{n}\ln q \rightarrow 0[/itex]
    [itex] q^{1/n}=e^{\ln(q^{1/n})}\rightarrow e^{0}=1[/itex]

    and since:
    [itex] \sum x^{n}= \frac{1}{1-x}, \quad |x|<1[/itex]
    couldn't I say that the series converges for |x|<1? Is there such a theorem in calculus?
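    Both facts are easy to check numerically; here is a small Python sketch, with q=2 and x=0.5 picked arbitrarily:
    [code]
    # q^(1/n) -> 1, and the geometric series sums to 1/(1-x) for |x| < 1.
    q, x = 2.0, 0.5

    for n in (10, 100, 1000, 10000):
        print(n, q**(1.0/n))               # approaches 1

    partial = sum(x**n for n in range(200))
    print(partial, 1.0/(1.0 - x))          # partial sum vs closed form 1/(1-x)
    [/code]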
     
  7. May 8, 2014 #6
    ##\frac{q^\frac{1}{n+1}}{q^\frac{1}{n}}=q^\frac{-1}{n(n+1)}##. Thus, by the ratio test, we get ##\left|\frac{ q^{\frac{1}{n+1}} x^{n+1}}{ q^{\frac{1}{n}} x^{n}}\right|=|q^\frac{-1}{n(n+1)}x|\rightarrow |x|## as ##n\rightarrow \infty##, so the series converges for ##|x|<1##.
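    As a numerical sanity check (a Python sketch; q=2 and x=0.7 are arbitrary choices), the ratio of consecutive terms does approach |x|:
    [code]
    # Ratio of consecutive terms a_{n+1}/a_n for a_n = q^(1/n) * x^n.
    q, x = 2.0, 0.7

    for n in (5, 50, 500):
        direct     = (q**(1.0/(n+1)) * x**(n+1)) / (q**(1.0/n) * x**n)
        simplified = q**(-1.0/(n*(n+1))) * x
        print(n, direct, simplified)       # both tend to x = 0.7
    [/code]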
     
  8. May 8, 2014 #7

    ChrisVer

    User Avatar
    Gold Member

    So for some unknown reason, we got the same result...?
     
  9. May 8, 2014 #8
    What you did doesn't really constitute a proof. You basically just assumed that ##\sum x^n=\frac{1}{1-x}##, which you can't really assume, because you don't know for which values of x the series converges.
     
  10. May 8, 2014 #9

    ChrisVer

    User Avatar
    Gold Member

    It's a geometric series... In fact, for |x|>1 the series diverges; for it to converge you just need |x|<1...
    On the other hand, suppose that I have:
    [itex] \sum c_{n} a_{n} [/itex]
    with [itex] c_{n}= q^{1/n}[/itex]
    and [itex]a_{n}= x^{n}[/itex]
    If [itex]c_{n}\rightarrow L < ∞[/itex], can I say that the series converges whenever [itex]\sum a_{n}[/itex] converges? That's why I asked if there is a theorem about it.
     
    Last edited: May 8, 2014
  11. May 8, 2014 #10
    There isn't a named theorem, but the idea you're using follows from the ratio test. If the sequence ##c_n\rightarrow L## with ##0<L<\infty## as ##n\rightarrow \infty##, then ##|\frac{c_{n+1}}{c_{n}}|\rightarrow 1##. Hence, with ##a_n=x^n##, the ratio of consecutive terms is ##| \frac{c_{n+1}}{c_n} ||\frac{x^{n+1}}{x^n}|=| \frac{c_{n+1}}{c_n} ||x|\rightarrow |x|##. Therefore the series indeed converges when ##|x|<1##, provided ##c_n## converges to a finite nonzero limit.
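    A rough numerical illustration (a Python sketch, with q=2 arbitrary; the sum starts at n=1 to avoid the 1/0 exponent): the partial sums settle down for |x|<1 and blow up for |x|>1.
    [code]
    # Partial sums of sum_{n>=1} q^(1/n) * x^n for q = 2.
    q = 2.0

    def partial_sum(x, N):
        return sum(q**(1.0/n) * x**n for n in range(1, N + 1))

    for x in (0.5, 0.9, 1.1):
        print(x, [partial_sum(x, N) for N in (10, 100, 1000)])
    # For |x| < 1 the partial sums stabilise; for x = 1.1 they grow without bound.
    [/code]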
     
    Last edited: May 8, 2014
  12. May 9, 2014 #11

    ChrisVer

    User Avatar
    Gold Member

    So let's make things more interesting...
    Mathematica also verifies that the convergence condition is |x|<1. But I came across a problem...
    As we know, the ratio test gives no answer when the limit equals 1... which here happens for x=1...
    So in that case we need to see what happens with:
    [itex] \sum_{n} x^{n}= \sum_{n} (1)^{n} = 1+1+... = \sum_{n} 1=-\frac{1}{2}[/itex]
    (the proof can be taken from working with the residues of the zeta and gamma functions)
    What happens in that case? Is Mathematica missing that one extra x?
     
  13. May 9, 2014 #12

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper

    I hope you are joking. Of course ##\sum 1 \neq -1/2##. Such "proofs" typically involve formal manipulation of equations/identities outside their range of validity. Coming up with such "proofs" can be fun, but nobody really takes them seriously.
     
  14. May 9, 2014 #13

    ChrisVer

    User Avatar
    Gold Member

    What do you mean by that?
    These results even have physical applications (for example, the zeta-function "proofs" appear as exercises in Zwiebach's introductory string theory course, since such sums show up in the Hamiltonian; that's where I had to prove it).
     
  15. May 9, 2014 #14

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper

    I mean that the result is provably false, so any "proof" must be invalid. I am fully aware that by pushing the boundaries, especially in Physics, we can get things that look wrong---but might not be wrong (especially if they lie beyond the borders of experimental verification). For example, heat can flow from negative-temperature to positive-temperature regions; and other such anomalies are found in various studies and models. However, the result you cite is not of this type---it is just plain wrong. There IS a difference.
     
  16. May 9, 2014 #15

    ChrisVer

    User Avatar
    Gold Member

    How can a mathematically proven result be wrong?
    [itex] Γ(s)= \int_{0}^{∞} t^{s-1} e^{-t}\,dt = \int_{0}^{∞} (nt)^{s-1} e^{-nt}\, d(nt)= n^{s}\int_{0}^{∞} t^{s-1} e^{-nt}\,dt[/itex]
    Then the zeta function is defined as:
    [itex]ζ(s)= \sum_{n=1} \frac{1}{n^{s}}[/itex]

    So
    [itex]Γ(s)ζ(s)= \sum_{n=1} \frac{1}{n^{s}} n^{s}\int_{0}^{∞} dt (t)^{s-1} e^{-nt}= \int_{0}^{∞} dt (t)^{s-1} \sum_{n=1}e^{-nt}[/itex]

    But we can see the geometric series now:
    [itex] \sum_{n=1}e^{-nt}= \frac{e^{-t}}{1-e^{-t}}= \frac{1}{e^{t}-1}[/itex]

    Thus:
    [itex]Γ(s)ζ(s)= \int_{0}^{∞} dt \frac{t^{s-1} }{e^{t}-1} [/itex]

    Also, expanding the exponential in the denominator as a Taylor series:
    [itex]\frac{1}{e^{t}-1}= \frac{1}{1+t+\frac{t^{2}}{2}+\frac{t^{3}}{3!}+O(t^{4})-1}=\frac{1}{t+\frac{t^{2}}{2}+\frac{t^{3}}{6}+O(t^{4})}=\frac{1}{t} \frac{1}{1+\frac{t}{2}+\frac{t^{2}}{6}+O(t^{3})}[/itex]
    The second denominator can then be expanded as:
    [itex] \frac{1}{1+x}= 1-x+x^{2}-...[/itex] with [itex]x= \frac{t}{2}+\frac{t^{2}}{6}+O(t^{3}) [/itex]

    The result is:
    [itex]\frac{1}{e^{t}-1}= \frac{1}{t}- \frac{1}{2}+\frac{t}{12}+O(t^{2})[/itex]
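    That expansion can also be checked symbolically (a sketch using SymPy, assuming it is available):
    [code]
    # Laurent expansion of 1/(exp(t) - 1) around t = 0.
    import sympy as sp

    t = sp.symbols('t')
    print(sp.series(1/(sp.exp(t) - 1), t, 0, 2))
    # Expected: 1/t - 1/2 + t/12 + O(t**2)
    [/code]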


    So
    [itex]Γ(s)ζ(s)= \int_{0}^{1} dt \frac{t^{s-1} }{e^{t}-1}+ \int_{1}^{∞} dt \frac{t^{s-1} }{e^{t}-1}[/itex]

    Of these, only the first integral can diverge, because of the behaviour of the integrand near t=0. To handle it, we subtract and add back the singular part of the expansion:
    [itex]\int_{0}^{1} \frac{t^{s-1} }{e^{t}-1}dt=\int_{0}^{1} t^{s-1} \left(\frac{1}{e^{t}-1}-\frac{1}{t}+ \frac{1}{2}-\frac{t}{12}\right)dt+ \int_{0}^{1} t^{s-1} \left(\frac{1}{t}- \frac{1}{2}+\frac{t}{12}\right)dt[/itex]
    The second integral can be evaluated explicitly:
    [itex]\int_{0}^{1} t^{s-1} \left(\frac{1}{t}- \frac{1}{2}+\frac{t}{12}\right)dt= \frac{1}{s-1}- \frac{1}{2s}+\frac{1}{12(s+1)} [/itex]


    And by that:
    [itex]Γ(s)ζ(s)= \int_{0}^{1} t^{s-1} \left(\frac{1}{e^{t}-1}-\frac{1}{t}+ \frac{1}{2}-\frac{t}{12}\right)dt + \frac{1}{s-1}- \frac{1}{2s}+\frac{1}{12(s+1)} + \int_{1}^{∞} \frac{t^{s-1} }{e^{t}-1}dt[/itex]

    The last integral converges for every value of s. The bracket in the first integral is [itex]O(t^{2})[/itex] as [itex]t\rightarrow 0[/itex], so that integrand behaves like [itex] t^{s+1} [/itex] near zero, and the integral converges for [itex]Re(s)>-2[/itex].

    Using the fact that the gamma function has a simple pole at [itex]s=0[/itex] with residue 1 and a simple pole at [itex]s=-1[/itex] with residue -1 (in general, a simple pole at [itex]s=-n[/itex], [itex]n=0,1,2,...[/itex], with residue [itex]\frac{(-1)^{n}}{n!}[/itex]), together with the fact that the above expression has simple poles at s=0 and s=-1, we get:
    [itex] Res_{s=-n}[Γ(s)ζ(s)]= Res_{s=-n}[Γ(s)]\, ζ(-n), \quad n=0,1[/itex]

    From that we can get:
    [itex]ζ(0)=-\frac{1}{2}= \sum_{n=1} 1[/itex]
    as well as:
    [itex]ζ(-1)= - \frac{1}{12}= \sum_{n=1}n[/itex]
    while the only singularity of ζ is at s=+1.

    One could object that the expansions are wrong, but that's not the case, since they contribute nothing at the singular points I make use of... they just sit inside a convergent integral...
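    The starting identity [itex]Γ(s)ζ(s)= \int_{0}^{∞} \frac{t^{s-1}}{e^{t}-1}dt[/itex] can at least be checked numerically where everything converges, i.e. for Re(s)>1 (a Python sketch using mpmath, assuming it is installed; expm1 is used only to avoid cancellation near t=0):
    [code]
    # Check Gamma(s)*zeta(s) = integral_0^inf t^(s-1)/(e^t - 1) dt for Re(s) > 1.
    import mpmath as mp

    for s in (2, 3, mp.mpf('2.5')):
        integral = mp.quad(lambda t: t**(s - 1) / mp.expm1(t), [0, mp.inf])
        print(s, integral, mp.gamma(s) * mp.zeta(s))   # the two columns should agree
    [/code]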
     
    Last edited: May 9, 2014
  17. May 9, 2014 #16

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper



    Several problems:
    (1) Use of function definitions outside their range of validity. The definition
    [tex] \Gamma(s) = \int_0^{\infty} t^{s-1} e^{-t} \, dt [/tex]
    applies when ##\text{Re}(s) > 0##; for other values of ##s##, ##\Gamma(s)## is defined by analytic continuation---typically by
    [tex] \Gamma(s) = \frac{\Gamma(s+n)}{s(s+1) \cdots (s+n-1)}[/tex]
    (a numerical check of this continuation is sketched after this list).
    (2) Swapping the order of integration and infinite summation without checking whether it is valid; we know from examples that it cannot always be done and can sometimes lead to incorrect results.
    (3) Integrating a series term-by-term when (at least some of) the terms are not integrable---that is, have divergent integrals.
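    Regarding (1), a quick numerical illustration of that continuation (a Python sketch using mpmath; s = -1.5 and n = 2 are arbitrary choices):
    [code]
    # Gamma(s) for Re(s) < 0 via Gamma(s) = Gamma(s+n) / (s(s+1)...(s+n-1)).
    import mpmath as mp

    s, n = mp.mpf('-1.5'), 2                      # shift by n = 2 so that Re(s + n) > 0
    continued = mp.gamma(s + n) / (s * (s + 1))   # denominator s(s+1)...(s+n-1) for n = 2
    print(continued, mp.gamma(s))                 # should agree
    [/code]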
     
  18. May 9, 2014 #17

    ChrisVer

    User Avatar
    Gold Member

    1. For Re(s)>0 the integral itself converges and the gamma function shows no poles. However, extending it by allowing complex (and negative-real-part) s is not wrong.
    2. Why wouldn't it be valid? In fact the exponential (which carries the only n-dependence) behaves nicely under summation (geometric series).
    3. I don't think I integrated the series term by term... I just stated that it converges...
     
  19. May 9, 2014 #18

    micromass

    User Avatar
    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award

    Some other problems:

    This is not the definition of the zeta function. It only coincides with the zeta function for ##\mathrm{Re}(s)>1##. For other ##s## (such as ##s=0## or ##s=-1##) we must use another definition!

    This is wrong. It is right that ##\zeta(0) = -1/2## and ##\zeta(-1) = -1/12##. But

    [tex]\zeta(-1) = \sum_{n=1}^{+\infty} n~\text{and}~\zeta(0) = \sum_{n=1}^{+\infty} 1[/tex]

    are false.

    If you ever see the above equalities in mathematical literature, then that is because they use a very different notion of convergence of series. And they usually mention what notion they're using. Your OP didn't mention such a notion, so the usual definition applies and under the usual definitions the above equalities are false.
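    To make the distinction concrete (a Python sketch using mpmath, assuming it is installed): the partial sums of ##\sum 1## and ##\sum n## just grow without bound, while the values -1/2 and -1/12 come from the analytically continued zeta function.
    [code]
    # Ordinary convergence vs. analytic continuation.
    import mpmath as mp

    for N in (10, 100, 1000):
        print(N, sum(1 for n in range(1, N + 1)), sum(n for n in range(1, N + 1)))
    # The partial sums diverge in the ordinary sense.

    print(mp.zeta(0), mp.zeta(-1))  # -0.5 and -0.0833... = -1/12, via analytic continuation
    [/code]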
     
  20. May 9, 2014 #19

    micromass

    User Avatar
    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award



    There is no problem with analytic continuation of the Gamma function. But if you do that then the Gamma function will not equal the integral anymore. Equalities only hold when the integral converges.

    We're not saying it is necessarily invalid. But you need to check it. There are many counterexamples where swapping integration and limits is invalid, so this might be such a case. You need to rigorously show that the two sides are equal.
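    A standard illustration of why the swap needs justification (a sketch with mpmath; ##f_n(t)=n e^{-nt}## is a textbook example, not the integrand from this thread):
    [code]
    # lim_n (integral of f_n) != integral of (lim_n f_n) for f_n(t) = n*exp(-n*t).
    import mpmath as mp

    for n in (1, 10, 100, 1000):
        print(n, mp.quad(lambda t: n * mp.exp(-n * t), [0, mp.inf]))   # always 1
    # Yet f_n(t) -> 0 pointwise for every t > 0, so the integral of the limit is 0, not 1.
    [/code]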
     
  21. May 9, 2014 #20

    ChrisVer

    User Avatar
    Gold Member

    Sorry for being a bit stubborn, but I am really confused...
    For example, how could someone ask us to prove something that is wrong? (see attachment)
     

    Attached Files:
