Does the Divergence of ∑b_n Imply ∑a_n Also Diverges?

Bigworldjust

Homework Statement



Suppose that ∑a_n and ∑b_n are series with positive terms and ∑b_n is divergent. Prove that if:

$$\lim_{n\to\infty} \frac{a_n}{b_n} = \infty,$$

then ∑a_n is also divergent.

Homework Equations


The Attempt at a Solution



Well, in attempting to write a viable solution, I have deduced that since both series have positive terms, their partial sums are increasing. If ∑b_n is divergent and the limit as n approaches infinity of a_n/b_n is infinity, then ∑a_n must also be divergent. Is there any more to this, however? I think I am missing something important in the explanation, but I am not too sure what it is. Thank you!
 
I would start with your definition of divergence: what is it?

Qualitatively, hopefully you can see what is going on: the series ##\sum b_n## diverges, but for all ##n > N## (for some ##N##), each term ##a_n## is much larger than the corresponding ##b_n##, hence the sum over ##a_n## diverges.
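
One way to make that step precise (just a sketch, using the definition of the limit with the bound taken to be 1):

$$\lim_{n\to\infty}\frac{a_n}{b_n}=\infty \;\Longrightarrow\; \exists N \text{ such that } \frac{a_n}{b_n} > 1 \text{ for all } n > N \;\Longrightarrow\; a_n > b_n \text{ for all } n > N.$$

Since ##\sum b_n## diverges, its tail ##\sum_{n>N} b_n## also diverges, so by the comparison test ##\sum_{n>N} a_n## diverges, and therefore so does ##\sum a_n##.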

An example is:
$$b_n = \frac{1}{n}, \qquad a_n = \frac{1}{\sqrt{n}}.$$
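
For this pair the ratio does blow up, and both series diverge (a quick check):

$$\frac{a_n}{b_n} = \frac{1/\sqrt{n}}{1/n} = \sqrt{n} \to \infty \quad \text{as } n\to\infty,$$

with ##\sum \frac{1}{n}## the divergent harmonic series and ##\sum \frac{1}{\sqrt{n}}## a divergent ##p##-series (##p = \tfrac12 \le 1##).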
 