Can recursive square roots reveal the secrets of the golden ratio?

Bob3141592
I have a question. You might be familiar with the following expression for the golden ratio:

X = \sqrt{1 + \sqrt{1 + \sqrt{1 + \sqrt{1 + \sqrt{1 + \cdots}}}}}

For those who haven't seen this before, to solve this we just square both sides (which removes the outermost square root sign from the right hand side), subtract one from both sides of the equals sign, and then substitute X for what remains on the right (since it's identical to the original equation). This gives the simple quadratic equation

X^2 - 1 = X \quad \text{or} \quad X^2 - X - 1 = 0

which is easily solved to give X = \frac{1+\sqrt{5}}{2}
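As a quick numerical sanity check, here is a minimal Python sketch (starting at 1 and running 40 iterations are arbitrary choices):

from math import sqrt

s = 1.0                      # arbitrary starting value for the innermost radical
for _ in range(40):
    s = sqrt(1.0 + s)        # wrap one more sqrt(1 + ...) around the current value
print(s)                     # 1.618033988749895
print((1 + sqrt(5)) / 2)     # the closed-form answer, for comparison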

Playing around, I generalized this by using "n" instead of "1" in the original equation, so that any (non-negative) value could appear under the root sign, just to see what I'd get. This gave me X^2 - X - n = 0 or

X = \frac{1 + \sqrt{1 + 4n}}{2}

This gives the expected result for n = 1 and the interesting integer result X = 2 for n = 2. Additional integer solutions occur for n = 6, 12, 20, etc.
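Those values are exactly n = k(k+1) for k = 1, 2, 3, \ldots, since then

1 + 4k(k+1) = 4k^2 + 4k + 1 = (2k+1)^2

and X = \frac{1 + (2k+1)}{2} = k + 1.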

But what I find most curious is when we set n equal to zero! Substituting into the general solution we find that X = 1, which gives us the remarkable equation that

1 = \sqrt{0 + \sqrt{0 + \sqrt{0 + \sqrt{0 + \sqrt{0 + \cdots}}}}}

Whoa! Obviously something isn't quite kosher here. I do note that taking the negative sign of the square root from the generalized form does give us the expected result. But I'd like to know if there is some fundamental principle I can use to justify excluding the positive root in this case. Hopefully, something more revealing than "because it just doesn't work" might be offered. But I don't see what it might be.

Can anyone help? Thanks.

Bob
 
By squaring both sides of an equation you can introduce extra solutions which do not solve the actual equation.

For example, if you have the equation x=1 then you clearly have only one solution. But if you square both sides then you get x^2=1 which has two solutions.

If you square an equation on both sides and then re-arrange it to get a set of solutions, you've only actually produced a set of possible solutions. You still have to go through all your answers and check if they are actual solutions.
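To make that concrete, here's a minimal Python sketch of the checking step for the golden-ratio radical (the 1e-12 tolerance is an arbitrary choice):

from math import sqrt

# The two roots of the squared equation X^2 - X - 1 = 0:
candidates = [(1 + sqrt(5)) / 2, (1 - sqrt(5)) / 2]

for x in candidates:
    solves_squared = abs(x*x - x - 1) < 1e-12                   # satisfies the squared form
    solves_original = x >= 0 and abs(sqrt(1 + x) - x) < 1e-12   # satisfies X = sqrt(1 + X)
    print(f"{x:+.6f}  squared form: {solves_squared}  original equation: {solves_original}")

# The negative root solves the squared equation but not the original one;
# it's exactly the kind of extra solution squaring introduces.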
 
Specifically, here one can show that x^2 = x, thus x = 0 or x = 1, and clearly only one is valid. The same thing happens in the original equation, where the OP excludes, without justification (if his final conclusion is to be pointful), the other root of x^2 - x - 1 = 0 from being equal to the continued root he writes down.
 
X = \sqrt{n + \sqrt{n + \sqrt{n + \sqrt{n + \sqrt{n + \cdots}}}}}
so
(\pm X)^2 = n + \sqrt{n + \sqrt{n + \cdots}} = n + X
so
X^2 - X - n = 0

X = \frac{1 \pm \sqrt{1 + 4n}}{2}

When you try n = 0 you get
\frac{1 \pm 1}{2}
which yields X = 0 or X = 1.
 
This is an interesting sort of problem because there is a hidden ambiguity.

A lot of times, when we write a limiting process with an ellipsis, such as

1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \ldots

the notation specifies the starting point of the iteration.


With these infinitely nested radicals, however, the initial point is not specified!


To write these more properly, we would do something like

\begin{array}{l}
s_{k+1} = \sqrt{n + s_k} \\
s = \lim_{k \rightarrow \infty} s_k
\end{array}

But the missing piece, as mentioned, is where do we start; what value do we give for s_0? Depending on what we choose for s_0 (and also depending on n), the convergence of this limiting process can vary wildly.
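To illustrate, a minimal Python sketch (the choices of n, starting values, and 200 iterations are arbitrary):

from math import sqrt

n = 0.0
for s0 in (0.0, 0.01, 1.0, 100.0):
    s = s0
    for _ in range(200):
        s = sqrt(n + s)      # s_{k+1} = sqrt(n + s_k)
    print(f"s_0 = {s0}: limit is about {s}")

# With n = 0, the choice s_0 = 0 stays at 0 forever,
# while every positive s_0 is pulled toward 1.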
 
Originally posted by Hurkyl
But the missing piece, as mentioned, is where do we start; what value do we give for s_0? Depending on what we choose for s_0 (and also depending on n), the convergence of this limiting process can vary wildly.

That's an interesting observation, Hurkyl, and might be closer to the reason why I find these kinds of things so fascinating. While I agree with the posts about why the infinitely nested roots of zero don't produce one, I also note that if we let n approach zero, the equation tends to one and not zero. That matches the predictions of the generalized form for all n except zero, where there is a discontinuity. It's true the generalized solution still works if we take the negative root, but why switch to that root only when n is zero and for no other values of n? It's because of the discontinuity! Right? So perhaps my real question is "why is there a discontinuity in this series?" Perhaps your observation provides the needed clue (though I'm not at all sure--I'll need to think about this, and I don't know if my thinker is up to it!)

Discontinuities always catch me by surprise, and often leave me feeling unsure and unsatisfied. That is, if the discontinuity is really a key. Maybe it isn't. Also, why can't we start the series from the left, much as you had it? As in
\begin{array}{l}
s_0 = n \\
s_1 = n + \sqrt{s_0} \\
s = \sqrt{\lim_{k \rightarrow \infty} \left( n + \sqrt{s_k} \right)}
\end{array}

Isn't that unambiguously defined? It'd be easy enough to program it that way into a computer to look for convergence. Is there any reason that's not an acceptable form for the problem? Then again, it does look awkward, doing things after going to infinity.

So maybe we could just skip the final troublesome post-limit root and accept that the limit gives us s + n. At least that eliminates the "post-infinity" confusion.
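Here is a minimal Python sketch of that left-anchored recursion for n = 1 (the 60-step cutoff is an arbitrary choice):

from math import sqrt

n = 1.0
t = n                        # s_0 = n
for _ in range(60):
    t = n + sqrt(t)          # s_{k+1} = n + sqrt(s_k)

print(t)                     # ~2.618..., the limit, which is indeed s + n
print(sqrt(t))               # ~1.618..., taking the one leftover root gives the golden ratio s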

Another thought I had was how would this system work in the complex domain? It'd be fascinating to find out, but I don't have much formal training in this area. I'd be willing to put some independent study into it, if you or anyone else could recommend a book that might address this topic (intermediate level at most, since I don't have anyone to go to with questions if I get lost in an advanced text). But maybe it wouldn't be relevant; I don't know.

In any case, thanks to you and all the others who responded to my post. I hope there will be more of them.
 
Yes, your version of the recursion is unambiguous. (So is my version, if you include an s_0 term.)


There's some work to be done if you wanted to try to extend to the complex domain; the first of which is you have to specify in some way which of the square roots you take at every step.


There are several reasons to expect 0 to be special. For instance, 0 is the smallest real number that has a square root. n=0 is the largest real number for which the formula for X has two nonnegative roots.

I -think- you can prove that if n is negative, then s_k will eventually be negative, and thus continued iteration becomes undefined.

Also, notice that if you let s_0 be any positive number when n = 0, the sequence will converge to 1, not 0!


One thing to inspect in this type of problem is the formula for the distance between iterates and the answer. For example, find a formula for |s - s_{k+1}| in terms of |s - s_k|, and see if you can find a criterion for this sequence to be decreasing.
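Carrying that suggestion out for s_{k+1} = \sqrt{n + s_k}, using a - b = \frac{a^2 - b^2}{a + b}:

|s - s_{k+1}| = \left| \sqrt{n + s} - \sqrt{n + s_k} \right| = \frac{|s - s_k|}{\sqrt{n + s} + \sqrt{n + s_k}}

so each step shrinks the distance exactly when \sqrt{n + s} + \sqrt{n + s_k} > 1. Since \sqrt{n + s} = s at the limit, near convergence this says roughly 2s > 1, i.e. s > \frac{1}{2}.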
 
Originally posted by Hurkyl
There's some work to be done if you wanted to try to extend to the complex domain; the first of which is you have to specify in some way which of the square roots you take at every step.

Perhaps using a Taylor/Maclaurin series for square root?


Originally posted by Hurkyl
There are several reasons to expect 0 to be special. For instance, 0 is the smallest real number that has a square root. n=0 is the largest real number for which the formula for X has two nonnegative roots.

I -think- you can prove that if n is negative, then s_k will eventually be negative, and thus continued iteration becomes undefined.

Have you tried s_0=\frac{1}{4}, n=-\frac{1}{4}? The limit is really easy to find ;).

The limiting condition appears to be that n \geq -\frac{1}{4}

Originally posted by Hurkyl
Also, notice that if you let s_0 be any positive number when n = 0, the sequence will converge to 1, not 0!

For n \neq 0 the limit appears to be n+\frac{1\pm \sqrt{4n+1}}{2}
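A quick numerical check of that (a minimal Python sketch; the sample values of n, the starting point, and 500 iterations are arbitrary choices):

from math import sqrt

for n in (2.0, 0.5, -0.2):
    s = 1.0                  # arbitrary positive starting value
    for _ in range(500):
        s = n + sqrt(s)      # s_{k+1} = n + sqrt(s_k)
    predicted = n + (1 + sqrt(4*n + 1)) / 2
    print(f"n = {n:+}: iterated {s:.9f}, formula with + sign {predicted:.9f}")

# For these starting points the iteration lands on the + sign value in each case.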
 
Bah you're right; I forgot for a moment that √ makes numbers in (0, 1) bigger.

It's nice (if you can prove it) that it works out for n \geq -\frac{1}{4} (with an appropriate s_0), because that's the domain of definition of the formula you gave.
 
Originally posted by Hurkyl
Bah you're right; I forgot for a moment that √ makes numbers in (0, 1) bigger.

It's nice (if you can prove it) that it works out for n \geq -\frac{1}{4} (with an appropriate s_0), because that's the domain of definition of the formula you gave.

Consider the following:

s_1=n + \sqrt{s_0}
Now, if s_i is constant, then s_1 = s_0, so we have s_0 = n+\sqrt{s_0}, which solves to give s_0 = n+\frac{1\pm \sqrt{4n+1}}{2}. For n > 0, n+\frac{1+ \sqrt{4n+1}}{2} is the only genuine fixed point (the minus sign corresponds to a negative value of \sqrt{s_0} and so is extraneous), and for -\frac{1}{4} < n < 0 both values are fixed points, but only the larger one, n+\frac{1+ \sqrt{4n+1}}{2}, is stable.

In the n=0 case s_0=0 and s_0=1 both lead to stable sequences.

To show that these are the only limit points, consider that 0 = f(x) = n+x^{\frac{1}{2}}-x at any limit point, and that the limit is stable if f'(x) < 0 in some neighborhood of x.
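Spelling that criterion out: f'(x) = \frac{1}{2\sqrt{x}} - 1, so

f'(x) < 0 \iff x > \frac{1}{4}

The larger fixed point n+\frac{1+\sqrt{4n+1}}{2} always lies above \frac{1}{4} (it equals u^2 with u = \frac{1+\sqrt{4n+1}}{2} > \frac{1}{2}), while the smaller root, when it is a genuine fixed point, lies below \frac{1}{4}.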

Clearly s_0 > -n|n| is required for the sequence to be defined in the real domain.
 
In the n = 0 case, there isn't a neighborhood around x = 0 such that f'(x) < 0 (√x grows faster than x near 0), so 0 is an unstable fixed point.
 
I think I see an issue with this that could simply explain the discontinuity. Hidden in the unwritten portion off to the right of the original formula is a term that is essentially
n^\frac{1}{2^i}
as i increases without bound. Although n^0 = 1 and 0^i = 0, 0^0 is undefined. When we substitute n = 0 into the generalized form, hidden in the ellipses we have a term which could only be written as 0^\frac{1}{\infty}, which is effectively 0^0, right?
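For what it's worth, this can be made precise without ever forming 0^0. For any fixed n > 0 the innermost term tends to one,

\lim_{i \rightarrow \infty} n^{\frac{1}{2^i}} = n^0 = 1

while for n = 0 it is exactly 0 at every finite depth i. So the depth limit i \rightarrow \infty and the limit n \rightarrow 0 cannot be interchanged, and the discontinuity at n = 0 is a failure of those two limits to commute.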

Is that an explicit axiom of arithmetic, that 0^\frac{1}{\infty} is undefined? If not, does it need to be? Could I name it after me, as “Cairone’s axiom of infinitely small exponents”? Granted, it’s trivial, but it might be my only shot at mathematical fame!

Well, in any case, if this is actually a proper objection, it satisfies the sense of unease I’ve had without diminishing the interest in the system. And I certainly appreciate the opportunity to talk these things out in this forum, instead of working in a vacuum. Since I’m not a mathematician and my casual reading of mathematical books gets strange looks from the other members of my family, there aren’t many places I can ask questions or talk about such things. This area is excellent.
 
No, because \infty is not part of arithmetic!

But I know what you mean, and it is a fact from calculus that 0^{1/\infty} is an indeterminate form.
 