I Prove Ramanujan Identity: 3 = \sqrt{1+2\sqrt{1+3\sqrt{1+4\sqrt{1+...}}}}

jostpuur
How do you prove the identity

3 = \sqrt{1 + 2\sqrt{1 + 3\sqrt{1+4\sqrt{1 + \cdots}}}}

with a real proof that actually establishes the convergence? I know there are "proofs" that "prove" the identity with some trickery that ignores all the convergence issues, and I'm not interested in those tricks.
 
The "trickery" method can be used to prove convergence.

We can generalize Ramanujan's result:

\sqrt{1+x\sqrt{1+(x+1)\sqrt{1+(x+2)\sqrt{1+ ...}}}}= x+1

The original result is the special case x=2. How to prove the general case?

Let's define a function f(x) = x+1. Since (x+1)^2 = x^2 + 2x + 1 = 1 + x(x+2) = 1 + x f(x+1), the function satisfies:

f(x) = \sqrt{1 + x f(x+1)}

We can similarly expand f(x+1):

f(x+1) = \sqrt{1+(x+1) f(x+2)}

So we can keep expanding to get, for n > 0:

f(x) = \sqrt{1+x \sqrt{1+(x+1) \sqrt{1 + (x+2) \sqrt{1 + \cdots + \sqrt{1 + (x+n-1) f(x+n)}}}}}

Now, let's define a sequence of functions f_n(x):

f_n(x) = \sqrt{1+x \sqrt{1+(x+1) \sqrt{1 + (x+2) \sqrt{1 + ... + \sqrt{1+ (x+n-1)}}}}}

It's obvious that f_n(x) < f(x), and also that f_{n+1}(x) > f_n(x). So for fixed x we have an increasing sequence that is bounded from above by f(x) = x+1. Every increasing bounded sequence converges, so f_n(x) converges.

How do you show that its limit is f(x)? Well, if the sequence converges to some function f_{\infty}(x), then since f_n(x) = \sqrt{1+x f_{n-1}(x+1)} and the square root is continuous, the limit satisfies:

f_{\infty}(x) = \sqrt{1+ x f_{\infty}(x+1)}

A solution to that is clearly f_{\infty}(x) = x+1. Is it the only solution? I don't know, but at this point, the problem isn't convergence, because you know that it converges.
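The convergence of the truncated radicals is easy to see numerically. Here is a minimal sketch (my own illustration, function name is mine) that evaluates f_n(2) from the innermost radical outward:

```python
import math

def f_n(x, n):
    """Truncated radical with n square roots:
    f_n(x) = sqrt(1 + x sqrt(1 + (x+1) ... sqrt(1 + (x+n-1))))."""
    acc = math.sqrt(1 + (x + n - 1))      # innermost radical
    for k in range(n - 2, -1, -1):        # wrap outward with factors x+k
        acc = math.sqrt(1 + (x + k) * acc)
    return acc

for n in (1, 2, 5, 10, 20):
    print(n, f_n(2, n))    # increasing, approaching 3 from below
```

The printed values increase monotonically toward 3, exactly as the bounded-increasing-sequence argument predicts.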
 
stevendaryl said:
A solution to that is clearly f_{\infty}(x) = x+1. Is it the only solution? I don't know

I think we've pinpointed the last step needed to complete the proof, because that does seem to be the only acceptable solution after all. The functional equation

g(x+1) = \frac{(g(x))^2 - 1}{x}

obviously has infinitely many solutions, but it seems to have exactly one solution that satisfies the condition 0\leq g(x)\leq x+1.

If you set g(2) = 2.9, this happens:

g(3) = 3.705
g(4) = 4.2423...
g(5) = 4.2493...
g(6) = 3.4114...
g(7) = 1.7729...
g(8) = 0.3062...
g(9) = -0.1132...

If you set g(2) = 2.99, this happens:

g(3) = 3.9700...
g(4) = 4.9204...
g(5) = 5.8026...
g(6) = 6.5341...
g(7) = 6.9492...
g(8) = 6.7560...
g(9) = 5.5804...
g(10) = 3.3490...
g(11) = 1.0216...
g(12) = 0.0039...
g(13) = -0.0833...

And so on. So it seems that solutions of this type are not going to satisfy g(x)=\sqrt{1 + xg(x+1)}, because a square root cannot produce the negative values that appear.

If you set g(2) > 3, then obviously the constraint g(x)\leq x+1 gets violated. Solutions of that type would satisfy g(x)= \sqrt{1 + xg(x+1)}, but we know that the function sequence f_1,f_2,f_3,\ldots does not converge to them.
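The two numerical experiments above are easy to reproduce. A short sketch (function name is my own) iterating g(n+1) = (g(n)^2 - 1)/n:

```python
def iterate_g(g2, steps):
    """Iterate g(n+1) = (g(n)**2 - 1)/n starting from g(2) = g2.
    Returns the list [g(2), g(3), ..., g(2+steps)]."""
    g, n = g2, 2
    values = [g]
    for _ in range(steps):
        g = (g * g - 1) / n
        n += 1
        values.append(g)
    return values

print(iterate_g(2.9, 7))    # g(2)..g(9): the last value is negative
print(iterate_g(2.99, 11))  # g(2)..g(13): the last value is negative
```

Starting even slightly below 3, the sequence eventually dips below zero, matching the tables above.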
 
The numerical result I gave in the previous post does not look like an easy thing to prove, so the question of my opening post is still open.

In order to complete the proof in the way envisioned above, the following result should be proven: assume that 0 < g(2) < 3, and that the sequence g(3),g(4),\ldots is generated by the formula
g(n+1) = \frac{g(n)^2 - 1}{n},\quad\quad n=2,3,\ldots
Then g(n) < 0 for some n.

Here are some remarks concerning the most obvious proof attempts. Suppose
g(n) < n + 1 - \epsilon
holds for some \epsilon > 0. Then
g(n+1) < \frac{(n+1-\epsilon)^2 - 1}{n} < n + 2 - 2\epsilon + \frac{\epsilon^2}{n}
Here I dropped one term, which simplifies the inequality a little while also weakening it. If we assume that \epsilon is so small that its square can be ignored, we get
g(n+1) \lesssim n + 2 - 2\epsilon.

Treating 2\epsilon as a "new epsilon" we get
g(n+2) \lesssim n + 3 - 2^2\epsilon
g(n+3) \lesssim n+4 - 2^3\epsilon
and so on.

Unfortunately the presence of the term -2^k\epsilon in the bound for g(n+k) will not prove that the sequence goes below zero, because the assumption about the smallness of \epsilon eventually becomes contradictory. We learned that the sequence deviates from n+1 roughly exponentially, but only as long as the deviation is very small. Once the deviation is large enough that the square of \epsilon can no longer be ignored, the chain of inequalities stops working.

The effect of the term \frac{\epsilon^2}{n} is that it tends to push the sequence upwards, so it doesn't look like that would be an easy problem.
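The rough doubling of the deviation, and the fact that it persists a while before the approximation breaks down, can be seen numerically. A small sketch (my own illustration) tracking \epsilon_n = (n+1) - g(n) for the start value g(2) = 2.99:

```python
def deviations(g2, steps):
    """Track eps_n = (n+1) - g(n), the deviation of the iterates
    g(n+1) = (g(n)**2 - 1)/n from the exact solution g(n) = n+1."""
    g, n = g2, 2
    out = []
    for _ in range(steps):
        out.append((n, (n + 1) - g))
        g = (g * g - 1) / n
        n += 1
    return out

for n, eps in deviations(2.99, 8):
    print(n, eps)   # eps more than doubles at each step while it stays small
```

In fact one can check from the recursion that \epsilon_{n+1} = \frac{\epsilon_n}{n}\left(2(n+1)-\epsilon_n\right), so the growth factor exceeds 2 exactly as long as \epsilon_n < 2.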
 
I managed to prove the missing step. The trick was to approach the exponential behaviour with a slightly different touch. If g(n) < \alpha(n+1) holds with some constant 0 < \alpha < 1, then also g(n+1) < \alpha^2(n+2) holds, because
g(n+1) = \frac{g(n)^2 - 1}{n} < \frac{\alpha^2(n+1)^2 - 1}{n} \leq \alpha^2(n+2),
where the last inequality is equivalent to \alpha^2\leq 1. This implies that if we start with the condition g(2) < 3\alpha, inductively we get
g(n) < \alpha^{2^{n-2}}(n+1)
for all n\in\{2,3,4,\ldots\}.

This alone does not prove the sought claim that g(n) < 0 holds for some n, but the exponential domination does prove that g(n) < 1 will hold from some point on. On the other hand, the condition 0\leq g(n) < 1 implies that g(n+1)\leq g(n)-\frac{1}{n}, so in other words, if 0 < g(n_A) < 1 holds for some n_A, then from that point on
g(n) \leq g(n_A) - \sum_{k=n_A}^{n-1}\frac{1}{k}
will hold for all n\in\{n_A+1,n_A+2,\ldots, n_B\}, where n_B is the inevitable point at which g(n_B) < 0 becomes true; such an n_B exists because the harmonic series diverges.
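A quick numerical sanity check of the bound g(n) < \alpha^{2^{n-2}}(n+1). This is a sketch with an arbitrarily chosen start value, not a proof:

```python
# Choose alpha = 0.9 and a start value g(2) = 2.69 < 3*alpha, then check the
# bound g(n) < alpha**(2**(n-2)) * (n+1) at every step until g goes negative.
alpha = 0.9
g, n = 2.69, 2
while g >= 0:
    bound = alpha ** (2 ** (n - 2)) * (n + 1)
    assert g < bound, (n, g, bound)
    g = (g * g - 1) / n
    n += 1
print("went negative at n =", n, "with g =", g)
```

The doubly exponential factor \alpha^{2^{n-2}} crushes the linear factor (n+1) almost immediately, which is why g drops below 1, and then below 0, after only a few steps.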
 
I just realized that my proof contained an unnecessary silly complication, and it could have been simpler. Since nobody pointed it out, I can conclude that my proof was not read very carefully by anyone :rolleyes:
 