# Infinite Series

$$\sum_{n=1}^{\infty}\frac{1}{n(n+k)}$$

Show that the series is convergent and find its sum.

Now I'm a bit confused... I can show it's convergent for k=1,
and I attempted the same thing for general k by breaking the term into partial fractions. But that leaves me with a harmonic series, which is divergent, minus another divergent series... how can this be convergent?

Well, let's look at the series $$\sum\frac{k}{n(n+k)}$$ instead. You are probably aware that you can split this into partial fractions:

$$\frac{k}{n(n+k)}=\frac{1}{n}-\frac{1}{n+k}$$
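As a quick sanity check (my own addition, not part of the exercise), this identity can be verified exactly with rational arithmetic; the value k = 3 here is an arbitrary choice:

```python
from fractions import Fraction

k = 3  # any positive integer works; 3 is chosen arbitrarily
for n in range(1, 20):
    lhs = Fraction(k, n * (n + k))       # k / (n(n+k))
    rhs = Fraction(1, n) - Fraction(1, n + k)
    assert lhs == rhs  # the partial fraction decomposition holds exactly
```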

Now it's not immediately clear what happens if you sum the above series. Try taking k=2 and write 10 terms of the above series. You will see that a lot of terms vanish. This will give you an idea for the general proof...
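Following that suggestion, here is a small Python sketch (my own addition, with illustrative variable names) that writes out the first ten split terms for k = 2 and checks the partial sum against its telescoped form:

```python
from fractions import Fraction

k = 2
N = 10
# Sum the split terms 1/n - 1/(n+k) for n = 1..N.
partial = Fraction(0)
for n in range(1, N + 1):
    term = Fraction(1, n) - Fraction(1, n + k)
    partial += term
    print(n, term, partial)

# After cancellation, only the first k positive and last k negative
# pieces survive; for k = 2 that is 1 + 1/2 - 1/(N+1) - 1/(N+2).
telescoped = Fraction(1, 1) + Fraction(1, 2) - Fraction(1, N + 1) - Fraction(1, N + 2)
print(partial == telescoped)  # prints True
```

As N grows, the two tail terms vanish, so the partial sums approach 1 + 1/2 = 3/2, which is why the series converges even though each half of the split diverges on its own.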

Thank you micromass, I'll try expanding that out as soon as I get home. I'm sure it'll telescope out... there is one thing driving me nuts, however.

$$\sum 1/n$$ is the harmonic series... which is divergent.

I'm not sure about $$\sum 1/(n+k)$$

since series have the property that $$\sum (a_n-b_n)=\sum a_n - \sum b_n$$

how is it that the difference between a divergent series and another series (convergent or divergent) can result in a convergent series?

Well, for one thing, the series $$\sum{\frac{1}{n+k}}$$ is divergent: it is just the harmonic series with its first k terms removed.

But that aside, you state the equality

$$\sum(a_n-b_n)=\sum a_n - \sum b_n$$

This statement is INCORRECT. It is only correct if both series are convergent, so the equality is not applicable in this case.
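To see why that hypothesis matters, here is a quick counterexample (my own illustration, not from the exercise): take $$a_n = b_n = \frac{1}{n}$$. Then

$$\sum (a_n - b_n) = \sum 0 = 0$$

converges, but $$\sum a_n$$ and $$\sum b_n$$ are both the divergent harmonic series, so the right-hand side $$\sum a_n - \sum b_n$$ is not even defined.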

Thank you!