Show that a series converges to 1 when another diverges

Thread starter: Jaggis
Tags: Series

Homework Statement

Let ##a_n \ge 0## when ##n \ge 1##.

Show that
$$\sum_{n=1}^{\infty}\frac{a_n}{(1+a_1)(1+a_2)\cdots(1+a_n)} = 1,$$
when ##\sum_{n=1}^{\infty} a_n## diverges.


The Attempt at a Solution

I tried to do something with partial sums, because they should approach 1 as ##n## goes to infinity. I looked at the difference and the quotient of consecutive partial sums ##S_n## and ##S_{n+1}##, but it didn't go anywhere. I also didn't find any use for the fact that the ##n##-th term of the first series should go to zero as ##n## goes to infinity.
 
The sum of the first ##n## terms has a very simple expression (did you write down the first 2 or 3 terms and simplify?). You just have to show that the denominator of that sum diverges, and that is not hard.
 
mfb said:
The sum of the first ##n## terms has a very simple expression (did you write down the first 2 or 3 terms and simplify?). You just have to show that the denominator of that sum diverges, and that is not hard.

The expression I got for a partial sum would be
$$S_n = \frac{a_1(1+ a_1)\cdots(1+a_n) + \cdots + a_{n-1}(1+a_n) + a_n}{(1+a_1)(1+a_2)\cdots(1+a_n)}.$$
If the whole series converges to 1, then as far as I know ##S_n \to 1## as ##n \to \infty## as well. I don't know how to get this result, even if the denominator of ##S_n## diverged.
 
For ##S_n## it is tricky to see how to simplify the numerator. Try ##S_2## or ##S_3## and see how many factors of the denominator you can absorb into the numerator.

The divergence of the denominator is the following step; it does not help you at this point.
 
Jaggis said:
$$S_n = \frac{a_1(1+ a_1)\cdots(1+a_n) + \cdots + a_{n-1}(1+a_n) + a_n}{(1+a_1)(1+a_2)\cdots(1+a_n)}.$$

That's not quite right. You should not have ##a_1(1+a_1)## at the start.
 
haruspex said:
That's not quite right. You should not have ##a_1(1+a_1)## at the start.

Yes, you are right. It should be ##a_1(1+a_2)##.
mfb said:
For ##S_n## it is tricky to see how to simplify the numerator. Try ##S_2## or ##S_3## and see how many factors of the denominator you can absorb into the numerator.

The divergence of the denominator is the following step; it does not help you at this point.

I'm thinking
$$\frac{a_1}{1+a_1} + \frac{a_2}{(1+a_1)(1+a_2)} = \frac{(1+a_1)(1+a_2) -1}{(1+a_1)(1+a_2)} = 1-\frac{1}{(1+a_1)(1+a_2)}.$$
And then the last term goes to zero?
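As a numerical sanity check of this identity, here is a short Python sketch; the choice ##a_n = 1/n## is just a hypothetical divergent example, not taken from the problem statement. It verifies exactly (with rational arithmetic) that the partial sum equals ##1 - 1/\prod(1+a_k)##.

```python
# Sanity check of the telescoping identity
#   S_n = sum_{k=1}^n a_k / ((1+a_1)...(1+a_k)) = 1 - 1/((1+a_1)...(1+a_n)),
# using the divergent choice a_n = 1/n (an illustrative example only).
from fractions import Fraction

def partial_sum(terms):
    """Return the exact partial sum and the running product (1+a_1)...(1+a_n)."""
    total, prod = Fraction(0), Fraction(1)
    for a in terms:
        prod *= 1 + a           # extend the denominator product
        total += a / prod       # add the next term of the series
    return total, prod

a = [Fraction(1, n) for n in range(1, 50)]   # a_n = 1/n, so sum a_n diverges
s, p = partial_sum(a)
assert s == 1 - Fraction(1) / p  # telescoping identity holds exactly
print(s)  # 49/50, already close to 1
```

With ##a_n = 1/n## the product telescopes to ##n+1##, so the partial sums are ##1 - 1/(n+1)##, visibly tending to 1.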
 
If you generalize that to the sum of the first ##n## fractions (which looks exactly as you would expect based on your result): yes.
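Spelled out (this is the generalization the reply alludes to, not a quote from the thread): each term telescopes, since
$$\frac{a_k}{(1+a_1)\cdots(1+a_k)} = \frac{(1+a_k) - 1}{(1+a_1)\cdots(1+a_k)} = \frac{1}{(1+a_1)\cdots(1+a_{k-1})} - \frac{1}{(1+a_1)\cdots(1+a_k)},$$
so the partial sums collapse to
$$S_n = \sum_{k=1}^{n}\frac{a_k}{(1+a_1)\cdots(1+a_k)} = 1 - \frac{1}{(1+a_1)\cdots(1+a_n)}.$$
It then remains to show that ##\prod_{k=1}^{n}(1+a_k) \to \infty## whenever ##\sum a_n## diverges.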
 
mfb said:
If you generalize that to the sum of the first n fractions (which looks exactly as you can expect based on your result): yes.

OK, thanks for your help. I'll come back with more questions on the subject if I run into trouble with something.
 