Proof of Convergence by Integral Test and/or Comparison Test

middleCmusic
Note: This is not strictly a homework problem; I'm just doing these problems for review (college is out for the semester), but I wasn't sure whether putting them on the main part of the forum would be appropriate, since they are clearly lower-level problems. (Newbie)

Homework Statement



The problem says to "Determine whether the series is convergent or divergent [using only the Comparison Test or the Integral Test]."

The first, #13, is \sum_{n=1}^{\infty }\, n e^{-n^{2}}

and the second, #14, is \sum_{n=1}^{\infty }\frac{\ln n}{n^{2}}.

Homework Equations



The Integral Test listed in Stewart says:

Suppose f is a continuous, positive, decreasing function on [1,∞) and let a_n=f(n). Then the series \sum_{n=1}^{\infty }a_n is convergent if and only if the improper integral \int_{1}^{\infty }f(x)\,dx is convergent. In other words:

(a) If \int_{1}^{\infty }f(x)\,dx is convergent, then \sum_{n=1}^{\infty }a_n is convergent.
(b) If \int_{1}^{\infty }f(x)\,dx is divergent, then \sum_{n=1}^{\infty}a_n is divergent.

The Comparison Test listed in Stewart says:

Suppose that \sum a_n and \sum b_n are series with positive terms.

(a) If \sum b_n is convergent and a_n \leq b_n for all n, then \sum a_n is also convergent.
(b) If \sum b_n is divergent and a_n \geq b_n for all n, then \sum a_n is also divergent.

The Attempt at a Solution

First, I checked whether either sequence being summed had a nonzero limit as n \rightarrow \infty, as that would show the series to be divergent, but both sequences do go to zero in the limit.

Next, I tried treating #13 as a function of x, f(x)=x e^{-x^{2}}, and finding the integral \int_{1}^{\infty } f(x)\,dx, but I had no luck evaluating it by hand. Surely I could pop it into WolframAlpha and see what it churns out, but as this is a problem in a one-variable calc textbook, I doubt that its solution demands numerical computation.

I could not find anything to compare it to, though I tried taking the natural log of the term and seeing if I could compare that to something:

\ln (n e^{-n^{2}}) = \ln n - n^{2} \ln e = \ln n - n^{2} > \ln n - \ln n^{2}

So can I say therefore that e^{\ln n - n^{2}} > e^{\ln n - \ln n^{2}}? I'm not sure if I can just exponentiate both sides of an inequality, at least not without declaring a minimum n for which it's true, but what would that n be? If I can do that, then I have n e^{-n^{2}} > e^{\ln \frac{n}{n^{2}}} = \frac{1}{n}, which would mean that the series in #13 is divergent by the Comparison Test.

For #14, I tried a similar process, starting out by exponentiating the sequence \frac{\ln n}{n^{2}}, but I had no luck.

Any ideas?
 
middleCmusic said:
Next, I tried treating #13 as a function of x, f(x)=x e^{-x^{2}}, and finding the integral \int_{1}^{\infty } f(x)\,dx, but I had no luck evaluating it by hand.

Did you consider trying the substitution u = x^{2}, so that du = 2x\,dx?
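For reference, here is a quick sketch of how that substitution plays out (standard one-variable calculus, nothing beyond the hint itself):

```latex
\int_{1}^{\infty} x e^{-x^{2}}\,dx
  = \frac{1}{2}\int_{1}^{\infty} e^{-u}\,du
  = \frac{1}{2}\Bigl[-e^{-u}\Bigr]_{1}^{\infty}
  = \frac{1}{2e}
```

Since the improper integral is finite, the series \sum_{n=1}^{\infty} n e^{-n^{2}} converges by the Integral Test.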
 
For #14, try the Comparison Test. Does there exist a suitable p such that \ln n \leq n^p for large enough n?
 
HallsofIvy said:
Did you consider trying the substitution u = x^{2}, so that du = 2x\,dx?

Wow... I can't believe I didn't see that. Thanks for the help!
 
micromass said:
For 14, try to use the comparison test. Does there exist a suitable p such that ln(n)\leq n^p for large enough n??

So, I tried p=0.9, and it seems to work. When graphed, n^{0.9} shoots upward compared to \ln n, and also
\lim_{n \rightarrow \infty } \frac{\ln n}{n^{0.9}} = \lim_{n \rightarrow \infty} \frac{1/n}{0.9 n^{-0.1}} = \lim_{n \rightarrow \infty} \frac{1}{0.9\, n^{0.9}} = 0, so I know that for sufficiently large n, \ln n \leq n^{0.9}.

I tried to prove this fact with induction, but so far can't get past the crucial step. Here's what I have so far:

Base case: n=1
\ln 1 = 0 < 1^{0.9} = 1
Assume that \ln k \leq k^{0.9}
Now we try to show that \ln (k+1) \leq (k+1)^{0.9}.
If we raise each side of the above inequality to the power 1/0.9, we get (\ln(k+1))^{1/0.9} \leq k+1.
If this were a product, then I could try to separate it somehow...but as it is, I don't see how to make use of our inductive step.

If we could show this fact with induction, then we would know that \frac{\ln n}{n^{2}} \leq \frac{n^{0.9}}{n^{2}}=\frac{1}{n^{1.1}}, and since the last term is a p-series with p = 1.1 > 1, which is convergent, our series is convergent by the Comparison Test.

Thanks for all the help guys! Anyone got an idea about how to finish that induction proof?
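As a quick numeric sanity check of the inequality above (this is not a proof, and the helper name is just for illustration), one can compare \ln n against n^{0.9} directly:

```python
import math

# Numeric sanity check (not a proof): does ln(n) <= n**0.9 hold,
# and does the ratio ln(n)/n**0.9 shrink toward 0?
def log_below_power(n, p=0.9):
    """Return True if ln(n) <= n**p (illustrative helper)."""
    return math.log(n) <= n ** p

# The inequality holds at every integer tested here.
assert all(log_below_power(n) for n in range(1, 100_000))

# The ratio decreases, consistent with the limit being 0.
ratios = [math.log(n) / n ** 0.9 for n in (10, 1_000, 100_000)]
assert ratios[0] > ratios[1] > ratios[2] > 0
```

Of course, a finite check like this only suggests the inequality; the limit argument is what actually establishes it for large n.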
 
middleCmusic said:
So, I tried p=0.9, and it seems to work. When graphed, n^{0.9} shoots upward compared to \ln n, and also
\lim_{n \rightarrow \infty } \frac{\ln n}{n^{0.9}} = \lim_{n \rightarrow \infty} \frac{1/n}{0.9 n^{-0.1}} = \lim_{n \rightarrow \infty} \frac{1}{0.9\, n^{0.9}} = 0, so I know that for sufficiently large n, \ln n \leq n^{0.9}.

Doesn't this prove it already? I don't see why you need induction. (And I also think an induction proof would not be easy.)
 
micromass said:
Doesn't this prove it already? I don't see why you need induction. (And I also think an induction proof would not be easy.)

I guess you're right. It only matters that there's some N_0 after which the relation holds.

But I realized I could also just show that the difference of the two is always increasing, i.e. that x^{0.9}- \ln x has a positive derivative for all sufficiently large x, since the difference starts out positive.

Since \frac{d}{dx}(x^{0.9}- \ln x)=0.9x^{-0.1} - \frac{1}{x}, and
0.9x^{-0.1} - \frac{1}{x} \geq 0 for all x \geq 2, we know that the difference between them can only increase for n \geq 2. Since the inequality also holds at n = 1 and n = 2, it holds for all n.

Thanks again for the help - otherwise I might have been staring at this one for a while. :P
 
Integration by parts for ln(x)/x^2.

Let u=ln(x), dv=dx/x^2.
By the way, I can look at these and know they're convergent in about five seconds, in my head. The exponential goes faster than any polynomial. Natural log goes slower than x. And anything that goes faster than 1/n converges. Be careful with this rule. At least you can guess faster. And it might have given you the idea micromass gave you.
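For reference, the integration by parts sketched above works out as follows (standard calculation, nothing assumed beyond the suggested u and dv):

```latex
u = \ln x, \quad dv = \frac{dx}{x^{2}}
\quad\Longrightarrow\quad
du = \frac{dx}{x}, \quad v = -\frac{1}{x}

\int_{1}^{\infty} \frac{\ln x}{x^{2}}\,dx
  = \left[-\frac{\ln x}{x}\right]_{1}^{\infty}
  + \int_{1}^{\infty} \frac{dx}{x^{2}}
  = 0 + 1 = 1
```

The boundary term vanishes because \ln x / x \to 0 as x \to \infty, so the integral is finite and \sum \frac{\ln n}{n^{2}} converges by the Integral Test.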
 
algebrat said:
Integration by parts for ln(x)/x^2.

Let u=ln(x), dv=dx/x^2.



By the way, I can look at these and know they're convergent in about five seconds, in my head. The exponential goes faster than any polynomial. Natural log goes slower than x. And anything that goes faster than 1/n converges. Be careful with this rule. At least you can guess faster. And it might have given you the idea micromass gave you.

Thanks for the tip! Your method is way simpler than what I was trying to come up with and I'm sure it will be useful.

Just to be sure I know what you're saying, for these two examples, we have

#13 Since e^{n^{2}} > n^{2}, we must have that \frac{n}{e^{n^{2}}} < \frac{n}{n^2} = \frac{1}{n}, and so it's basically equivalent to a convergent p-series with p>1.

#14 And for this one, \ln n < n, so \frac{\ln n}{n^{2}} < \frac{n}{n^{2}} = \frac{1}{n}, so this one also is equivalent to a convergent p-series with p>1.

Thanks again. :)
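As a final numeric sanity check (purely illustrative, not a proof; the helper names are made up for this sketch), the partial sums of both series do appear to settle down:

```python
import math

def partial_sum(term, N):
    """Sum term(n) for n = 1..N (illustrative helper)."""
    return sum(term(n) for n in range(1, N + 1))

term13 = lambda n: n * math.exp(-n ** 2)   # series #13
term14 = lambda n: math.log(n) / n ** 2    # series #14

# Doubling the number of terms barely changes either partial sum,
# consistent with convergence.
for term in (term13, term14):
    tail = partial_sum(term, 2000) - partial_sum(term, 1000)
    assert 0 <= tail < 0.01

# Series #13 is dominated by its first few terms: e^{-1} + 2e^{-4} + 3e^{-9} + ...
first_three = math.exp(-1) + 2 * math.exp(-4) + 3 * math.exp(-9)
assert abs(partial_sum(term13, 50) - first_three) < 1e-6
```

Note that a small tail is only suggestive; the comparison with 1/n^{1.1} (or the integral calculations) is what proves convergence.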
 
Mmm, careful, you are showing it is less than a divergent series. Just to clarify, this is very informal reasoning, and proofs should be formal lest they prove something false.

For instance, 1/(2n)<1/n, or 1/(n+1)<1/n. The reasoning applied carelessly would not work here. It is a trick for developing an initial conjecture with better speed. With experience, you can get good at using this "rule".

Try it on (n^2+2n)/(n^3*ln(n)+60n). Does its series converge?
 