# Sum of series involving natural log

1. Jul 30, 2009

### icosane

1. The problem statement, all variables and given/known data

Find the sum of the series, if it converges, of

n=1 to infinity of ln(n/(n+1))

3. The attempt at a solution

My intuition tells me the series converges because as n goes to infinity the terms approach the log of 1, which is 0. How to sum it, though, I have no clue. How should I go about solving this?

2. Jul 30, 2009

### Office_Shredder

Staff Emeritus
Rewrite ln(n/(n+1)) as ln(n)-ln(n+1)

3. Jul 30, 2009

### Dick

Use Office Shredder's advice to find the partial sums. Then ask yourself again whether it converges.

4. Jul 30, 2009

### icosane

I always forget about useful log manipulations, thanks guys. So it diverges because I can't say infinity - infinity = 0, correct? This really goes against my intuition on series. As n goes to infinity how does the natural log of a number infinitesimally larger than 1 not go to 0? Is there a way to think about this to make it make sense, or should I shrug it off as a strange property of the mysterious natural logarithm?

5. Jul 30, 2009

### icosane

I also see that it's a telescoping series, with a leftover term -ln(n+1) that goes to -infinity... but my question remains.

6. Jul 30, 2009

### Office_Shredder

Staff Emeritus

ln(n/(n+1)) does go to zero as n goes to infinity. But so does 1/n, and the harmonic series diverges (I imagine you've seen that example). The terms of a series going to zero is a necessary condition for convergence, not a sufficient one. There's nothing about ln that makes this mysterious.

When you telescope the series, since you're starting at n=1, your partial sums come out to

ln(1) - ln(k+1) for the kth partial sum. You should be able to see this clearly by experimentation... there's no infinity-minus-infinity term involved.
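That experiment is easy to run. Here is an illustrative Python sketch (not part of the original exchange) comparing the brute-force partial sums against the telescoped closed form ln(1) - ln(k+1) = -ln(k+1):

```python
import math

def partial_sum(k):
    # k-th partial sum of sum_{n=1}^{k} ln(n/(n+1))
    return sum(math.log(n / (n + 1)) for n in range(1, k + 1))

for k in (10, 100, 1000):
    # the telescoped closed form is ln(1) - ln(k+1) = -ln(k+1)
    print(k, partial_sum(k), -math.log(k + 1))
```

The individual terms shrink toward 0, but the partial sums track -ln(k+1) and head off to -infinity.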

7. Jul 30, 2009

### slider142

Back to the definition of convergence. What are the partial sums? Can you write them as a function of n?

8. Jul 30, 2009

### snipez90

If the partial sums go off to infinity, then it diverges. This shouldn't be too hard to grasp; if the sum converges, then it's really just a number, after all. I'm not sure what you mean by the natural log of a number infinitesimally greater than 1... when we take the partial sums it telescopes nicely, so we can treat the partial sums as a sequence that, as you have shown, diverges.

9. Jul 30, 2009

### icosane

ln(1) - ln(2) + ln(2) - ln(3) + ... + ln(n) - ln(n+1)

When it's written like this, it's obvious that it diverges.

But when looking at it as an infinite sum of ln(n/(n+1)), intuitively I just look at the limit as n goes to infinity, so it seems like eventually the partial sums would become ln(1) + ln(1) + ln(1), except with each argument a number slightly smaller than 1. So if I were to write out each term individually and ended up just adding 0 + 0 + 0 after a long enough time, I don't see why it doesn't converge.

10. Jul 30, 2009

### Dick

You aren't paying enough attention to the real evidence in front of your face that it doesn't converge. 'Seems like' isn't evidence for convergence.

11. Jul 30, 2009

### slider142

The important part is that each argument is slightly smaller than 1, so the logarithm is emphatically not 0; it is a negative term. You are actually adding quite a lot (an infinite number) of negative numbers that, while close to 0, are still non-zero. I.e., adding infinitely many copies of 1/(a googol) still diverges; it is not the same as adding a bunch of 0's, even though each term may be indistinguishable from 0 on a number line.

12. Jul 30, 2009

### icosane

Okay, I guess my question can now be generalized: why doesn't a series converge whenever its terms converge to zero? I know that the sum of 1/n diverges, but only because my professor and textbook say so. I know it's a p-series, and I know how to apply the integral test... but I want to know why a bunch of finite positive numbers that eventually go to essentially 0 + 0 + 0 don't add up to something finite. It just doesn't make sense to me.
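The p-series mentioned here can be probed numerically. This illustrative Python sketch (an editor's aside, not from the thread) compares partial sums of 1/n against 1/n^2, both of which have terms going to 0:

```python
import math

def psum(p, k):
    # k-th partial sum of the p-series: 1/1^p + 1/2^p + ... + 1/k^p
    return sum(1.0 / n**p for n in range(1, k + 1))

for k in (100, 10000, 100000):
    # p = 1 grows like ln(k) without bound; p = 2 levels off near pi^2/6
    print(k, psum(1, k), psum(2, k))
```

The terms going to 0 looks the same in both columns, but only one sequence of partial sums settles down.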

13. Jul 30, 2009

### slider142

The partial sums, the defining element of a series converging or not converging, never approach any number; the sequence of partial sums is unbounded. Boundedness is the key property for an infinite process of this kind: it is fairly easy to show that there is no number M that is an upper bound for the partial sums of 1/n, and without any upper bound there is no least upper bound to assign as the value of the series (if there is at least one upper bound, then there is a smallest one).
There are tests that use the actual terms of the infinite series to ascertain whether it converges or not, but these are all theorems that rest on the original definition of convergence, which compares the infinite series to the behavior of its finite partial sums, i.e., asks whether the sums are bounded and approach a limit. Don't confuse the tests with the definition of convergence. Also remember the 1/googol example. Just because the terms "get very small", that tells you nothing when you are considering infinitely many of them: infinity means you eventually reach googol/googol = 1, then 2·googol/googol = 2, and beyond.
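The "no upper bound M" claim can be tested directly: for any proposed bound, the harmonic partial sums eventually pass it. A small illustrative Python sketch (editor's aside; the growth is only logarithmic, so M must stay modest here):

```python
def first_k_exceeding(M):
    # smallest k with 1 + 1/2 + ... + 1/k > M
    s, k = 0.0, 0
    while s <= M:
        k += 1
        s += 1.0 / k
    return k

for M in (1, 2, 5, 10):
    # roughly e^M terms are needed, but some k always works: no M bounds the sums
    print(M, first_k_exceeding(M))
```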

Last edited: Jul 30, 2009
14. Jul 30, 2009

### icosane

I see the evidence and I'm not trying to make an argument for convergence. I'm just looking for an intuitive explanation, Dick.

15. Jul 30, 2009

### snipez90

Here is another way to think about it. ln(1) + ln(1) + ln(1) + ... is certainly 0; nothing could be simpler. This is of course because e^0 = 1, so ln(1) = 0.

But ln(1/2) + ln(2/3) + ln(3/4) + ... + ln(99/100) is just a bunch of exponents, each term being the exponent to which you need to raise e to get 1/2, 2/3, etc. These terms are small, but they are not zero. The property ln(a) + ln(b) = ln(ab) for real a, b > 0 simply says that when we multiply two powers with the same base, we add their exponents. So the exponent you need to raise e to in order to get (1/2)(2/3)(3/4)...(99/100) = 1/100 is the sum of all the original exponents, and it is smaller (more negative) than any single one of them. That exponent is exactly the 99th partial sum.
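This bookkeeping checks out numerically. An illustrative Python sketch (editor's aside) verifying that the 99 small exponents add up to the single exponent ln(1/100):

```python
import math

# the 99 exponents: ln(1/2), ln(2/3), ..., ln(99/100)
exponents = [math.log(n / (n + 1)) for n in range(1, 100)]
total = sum(exponents)

# adding exponents multiplies the powers: e^total = (1/2)(2/3)...(99/100) = 1/100
print(total, math.log(1 / 100))
```

Each term is close to 0, yet the 99th partial sum is already about -4.6, and it keeps sinking as more terms are added.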

16. Jul 30, 2009

### icosane

Okay, its starting to sink in now the more I think about it. Thanks.

17. Jul 30, 2009

### Dick

Pay attention to the evidence. Apparently a whole lot of really small numbers can add up to a really large number. That shouldn't seem too crazy.

18. Jul 30, 2009

### icosane

Thanks for the reply. I can see now that the series of a log function has some interesting properties.

19. Jul 30, 2009

### icosane

But .1 + .01 + .001 + ... doesn't add up to a really large number. How fast does each successive term need to decrease in order for the series to converge?

20. Jul 30, 2009

### Dick

That's exactly what convergence tests were created to determine, and there are a good number of them. Some series' terms decrease fast enough to give a finite sum, and some don't. ln((n+1)/n) doesn't, as you've proved; (1/10)^n does.
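The contrast is easy to tabulate. An illustrative Python sketch (editor's aside) showing the geometric series settling at 1/9 while the log series climbs without bound:

```python
import math

def geom(k):
    # k-th partial sum of (1/10) + (1/10)^2 + ...; the limit is 1/9
    return sum(0.1**n for n in range(1, k + 1))

def logsum(k):
    # k-th partial sum of ln(2/1) + ln(3/2) + ...; telescopes to ln(k+1)
    return sum(math.log((n + 1) / n) for n in range(1, k + 1))

for k in (10, 100, 1000):
    print(k, geom(k), logsum(k))
```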

Last edited: Jul 30, 2009