Limit of a sequence from Knopp

1. Aug 28, 2010

mikepol

Hi,

I've been skimming through Knopp's book "Theory and Application of Infinite Series", mostly to get some practice with sequences and series. The problems there are pretty hard, and I've been trying to do this one without much success. It is Chapter 2, Problem 15(b): show that the following sequence converges to 1/2

$$\log\left(1+\frac{1}{n^2}\right) + \log\left(1+\frac{2}{n^2}\right) + \ldots + \log\left(1+\frac{n}{n^2}\right)$$

Does anyone have an idea how to do this?
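A quick numerical sanity check (a Python sketch, not part of the original thread) suggests the sum does approach 1/2, with the error shrinking roughly like 1/n:

```python
import math

def knopp_sum(n):
    # Sum of log(1 + k/n^2) for k = 1..n, the sequence from Knopp 15(b)
    return sum(math.log(1 + k / n**2) for k in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    print(n, knopp_sum(n))  # values creep toward 0.5
```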

2. Aug 28, 2010

mikepol

Oh! I had spent quite a lot of time on this problem, and right after I posted this I got a solution, but it's ugly; I'm not sure Knopp would have liked it :)
Basically, I expressed the sum as a product and divided the factors into d groups, where d is an integer held fixed for now, so each group has n/d factors. I bounded the product above and below by replacing every factor in a group with the group's largest and smallest factor, and then took the limit in n of each group. As d is taken larger and larger, the two bounds tighten and approach exp(1/2). So if P_n is the product obtained when all the logs are combined, then for fixed d:

$$\left(1+\frac{1}{n^2}\right)^{n/d} \left(1+\frac{1}{dn}\right)^{n/d} \ldots \left(1+\frac{(d-1)}{dn}\right)^{n/d} < P_n < \left(1+\frac{1}{dn}\right)^{n/d} \ldots \left(1+\frac{d}{dn}\right)^{n/d}$$

Then take the limit in n with d held fixed, so the limit of each factor can be taken individually. Since P_n is only squeezed between the two products, the inequalities become non-strict bounds on its liminf and limsup:

$$e^{1/d^2} e^{2/d^2} \ldots e^{(d-1)/d^2} \leq \liminf_n P_n \leq \limsup_n P_n \leq e^{1/d^2} \ldots e^{d/d^2}$$

$$\exp\left(\frac{d-1}{2d}\right) \leq \liminf_n P_n \leq \limsup_n P_n \leq \exp\left(\frac{d+1}{2d}\right)$$

So as d is taken larger and larger, both bounds approach exp(1/2), hence P_n converges to exp(1/2), and taking the log gives 1/2.
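The squeeze can be illustrated numerically (my own sketch, not from the thread): for each fixed d, the two limiting bounds exp((d-1)/(2d)) and exp((d+1)/(2d)) straddle exp(1/2) and tighten as d grows.

```python
import math

def bounds(d):
    # Limiting lower/upper bounds on P_n for a fixed number of groups d
    lower = math.exp((d - 1) / (2 * d))
    upper = math.exp((d + 1) / (2 * d))
    return lower, upper

target = math.exp(0.5)  # e^(1/2) ~ 1.6487
for d in (2, 10, 100):
    lo, hi = bounds(d)
    print(d, lo, hi)  # the gap shrinks roughly like e^(1/2)/d
```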

Can anyone come up with a nicer solution?

3. Aug 28, 2010

Petr Mugver

Define

$$g(k,n)=\log\left[1+\frac{k}{n^2}\right]-\frac{k}{n^2}\qquad k=1,\dots,n$$

It is easy to show that

$$|g(k,n)|\leq |g(n,n)|\qquad k=1,\dots,n$$

and that

$$|ng(n,n)|\rightarrow 0\qquad\textrm{for}\quad n\rightarrow\infty$$

So

$$\left|\sum_{k=1}^n\log\left[1+\frac{k}{n^2}\right]-\frac{1}{2}\right|=\left|\sum_{k=1}^n\left(\frac{k}{n^2}+g(k,n)\right)-\frac{1}{2}\right|\leq\frac{n+1}{2n}-\frac{1}{2}+|ng(n,n)|\rightarrow 0\qquad\textrm{for}\quad n\rightarrow\infty$$
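Both ingredients of this argument are easy to check numerically (a Python sketch, not part of the post): |g(k,n)| is increasing in k so the k = n term dominates, and n·g(n,n) = n(log(1+1/n) - 1/n) behaves like -1/(2n), which vanishes.

```python
import math

def g(k, n):
    # Taylor remainder: g(k, n) = log(1 + k/n^2) - k/n^2 (always <= 0)
    return math.log(1 + k / n**2) - k / n**2

# |g(k, n)| grows with k, so |g(k, n)| <= |g(n, n)| for k = 1..n
n = 500
assert all(abs(g(k, n)) <= abs(g(n, n)) for k in range(1, n + 1))

# n * g(n, n) -> 0, roughly like -1/(2n)
for m in (10, 100, 1000):
    print(m, m * g(m, m))
```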

4. Aug 29, 2010

mikepol

Hi Petr,

Wow! This is the type of solution I was looking for but couldn't get myself. I don't think I could have come up with this idea of subtracting k/n^2, so that g(k,n) can be bounded by g(n,n) while still being o(1/n), making the total contribution go to zero. What sort of argument led you to consider this? I think I tried to apply everything from the second chapter and nothing worked.

Thanks a lot for your help.

5. Aug 29, 2010

Petr Mugver

Well, the first thing you think of when you see a log is the first-order Taylor expansion $$\log(1+x)=x+o(x)$$, so it's quite natural to try writing (I'm a physicist, so I don't care about rigour)

$$\sum_{k=1}^{n}\log\left(1+\frac{k}{n^2}\right)=\sum_{k=1}^{n}\left[\frac{k}{n^2}+o\left(\frac{1}{n^2}\right)\right]=\frac{n+1}{2n}\,+\,n\,o\left(\frac{1}{n^2}\right)=\frac{n+1}{2n}\,+\,o\left(\frac{1}{n}\right)\rightarrow \frac{1}{2}\qquad\textrm{for}\qquad n\rightarrow\infty$$

After you have written this (which is not entirely correct, since the remainders are really o(k/n^2) rather than a single uniform o(1/n^2)), you have to translate it into rigorous mathematics, and you end up more or less with what I wrote in the other post.
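The heuristic leading term is easy to check directly (again a sketch, not from the post): the relative error (log(1+x) - x)/x vanishes as x goes to 0, which is exactly the o(x) statement.

```python
import math

def rel_error(x):
    # (log(1+x) - x) / x, which tends to 0 as x -> 0 (roughly like -x/2)
    return (math.log(1 + x) - x) / x

for x in (1e-1, 1e-3, 1e-6):
    print(x, rel_error(x))
```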
