Proving Series Divergence Without an Explicit Formula for a_n

CrusaderSean
Given a_{n} > 0 and \sum a_{n} diverges, show that \sum \frac{a_{n}}{1+a_{n}} diverges.
Since I don't have an explicit formula for the terms, I can't apply any of the standard tests, and I'm not sure where to start on this problem. I know the criteria for convergence/divergence, namely that the tail of the series has to converge, or the Cauchy criterion, but I don't see how that helps without knowing what the series looks like. Please steer me in the right direction.
 
If a series diverges, what happens to its reciprocal?
 
I would say the reciprocal converges, but apparently that alone isn't enough to make the original series converge... I thought about this a little more, and I think I'll analyze it based on how a_{n} behaves: does it go to zero, to a constant, or to infinity as n goes to infinity? Then I'll try to bound the terms \frac{a_{n}}{1+a_{n}} from below to show the series diverges.
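As a quick sanity check (my own example, not part of the problem), take the divergent series with a_{n} = \frac{1}{n}. Then

\frac{a_n}{1+a_n} = \frac{1/n}{1+1/n} = \frac{1}{n+1},

and \sum \frac{1}{n+1} still diverges, just like the harmonic series, so the claim is at least plausible in that case.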
 
For a_n>1:

\frac{a_n}{1+a_n}>\frac{1}{2}

For a_n\leq 1:

\frac{a_n}{1+a_n}\geq\frac{a_n}{2}
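
One way to finish from these two bounds (spelling out the comparison step): if a_n > 1 for infinitely many n, then

\frac{a_n}{1+a_n} > \frac{1}{2}

for infinitely many n, so the terms do not tend to zero and \sum \frac{a_n}{1+a_n} diverges. Otherwise a_n \leq 1 for all sufficiently large n, and over those n

\sum \frac{a_n}{1+a_n} \geq \frac{1}{2}\sum a_n,

where the right-hand side diverges by assumption (dropping finitely many terms doesn't change that), so the comparison test gives divergence in this case too.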
 