# Another convergence/divergence question

1. Mar 29, 2005

### philosophking

Another convergence/divergence question :)

Hi, thank you in advance for your help. This was a practice problem for the New Jersey Undergraduate Mathematics Competition last year.

Prove whether or not the following converges:

$$\sqrt{1+\sqrt{2+\sqrt{3+...}}}$$

Forgive my LaTeX, I'm still learning :). Suggestions on that would be appreciated as well, haha.

For this problem, I was thinking of reducing it to the following:

$$\sqrt{\sqrt\ ... \sqrt{n}}$$

which reduces to $$n^{1/n}$$,

which I would then try to show diverges; I think it does, but I'm just not sure.

Thanks for the help.

Last edited: Mar 29, 2005
2. Mar 29, 2005

### joeboo

Ok, here's what I think (if you see where it's going halfway through, stop reading, since I'm posting a full solution):

$$A, B \geq 1 \longrightarrow 0 < \frac{1}{A} + \frac{1}{B} \leq 2$$

This gives:
$$2 \leq A+B \leq 2A \cdot B$$

Since $\ln(x)$ is increasing on $(0, +\infty )$, this gives:

$$\ln(2) \leq \ln(A+B) \leq \ln (2A \cdot B) = \ln(2) + \ln(A) + \ln(B)$$

Now define:
$$S(n) = \sqrt{1+\sqrt{2+\sqrt{3 + \dots +\sqrt{n}}}}$$

Now, I claim that by taking the logarithm of $S(n)$ and repeatedly applying the above inequality, we obtain:
$$0 \leq \ln(S(n)) \leq -\frac{\ln(2)}{2^n} + \sum_{k=1}^n \frac{\ln(2) + \ln(k)}{2^k}$$
or:
$$1 \leq S(n) \leq e^{-\frac{\ln(2)}{2^n} + \sum_{k=1}^n \frac{\ln(2) + \ln(k)}{2^k}}$$

Letting $n \rightarrow \infty$ gives the desired result ( noting that the series in the exponent will converge. )
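The bound is easy to sanity-check numerically. Here's a minimal Python sketch (function names are mine, added for illustration) that builds $S(n)$ from the inside out and compares it against the claimed upper bound:

```python
import math

def S(n):
    """Nested radical sqrt(1 + sqrt(2 + ... + sqrt(n))), built inside out."""
    val = 0.0
    for k in range(n, 0, -1):
        val = math.sqrt(k + val)
    return val

def upper_bound(n):
    """exp of the claimed exponent: -ln(2)/2^n + sum_{k=1}^n (ln(2) + ln(k))/2^k."""
    expo = -math.log(2) / 2 ** n
    for k in range(1, n + 1):
        expo += (math.log(2) + math.log(k)) / 2 ** k
    return math.exp(expo)

for n in (1, 5, 10, 20):
    print(n, round(S(n), 6), round(upper_bound(n), 6))
```

For every $n$ the computed $S(n)$ sits between 1 and the bound, consistent with the squeeze argument.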

( hope that's right, hehe )

-joeboo

edit: I just wanted to add 1 thing that may prevent some small confusion-
the $-\frac{\ln(2)}{2^n}$ term appears because the innermost term never has the inequality applied to it, so it picks up no "$\ln(2)$" (e.g., in the expansion of $S(4)$, the $\ln(\sqrt{4})$ term is left alone, so there is no accompanying $\ln(2)$ term). I hope that helps.

Last edited: Mar 29, 2005
3. Mar 30, 2005

### philosophking

I'm sorry, this seems very complicated. Is there any way you could elaborate?

What do you think of my idea of taking a slightly smaller expression ($$n^{1/n}$$), showing that its limit is not zero, and concluding that the bigger expression does not converge either? Taking that limit seems easy enough, but I just don't know how to do it.

Thanks again for your help.

4. Mar 30, 2005

### joeboo

What I'm saying in my previous post is that it converges.

5. Mar 30, 2005

### philosophking

I know you are.

I was just wondering if you could either elucidate your argument or criticize my argument?

Thank you again for the help.

6. Mar 30, 2005

### joeboo

Sorry.

I'll try and elaborate.
You're trying to determine whether
$$S = \sqrt{1+\sqrt{2+\sqrt{3+\dots }}}$$
exists or not.

All I'm doing is establishing a new sequence,

$$S(n)=\sqrt{1+\sqrt{2+\sqrt{3+\dots +\sqrt{n}}}}$$

such that:
$$\lim_{n \rightarrow \infty} S(n) = S$$

Clearly, $S(n)$ is increasing. Then I show that it is bounded below by $1$ and bounded above by:

$$e^{-\frac{\ln(2)}{2^n} + \sum_{k=1}^n \frac{\ln(2) + \ln(k)}{2^k}}$$

I arrive at that bound by repeatedly applying the logarithm inequality above to $S(n)$.
Now, the exponent is composed of two terms:

$$-\frac{\ln(2)}{2^n}$$

and:
$$\sum_{k=1}^n \frac{\ln(2) + \ln(k)}{2^k}$$

the first term goes to $0$ as $n \rightarrow \infty$, and the series in the second term converges as well (the root test and ratio test both work well).
Therefore the sequence $S(n)$ is convergent: it is monotonic and bounded between two convergent sequences.

As far as critiquing your argument, I'm not entirely sure I understand it.
I'm confused here because in the original statement of the problem, there is no infinite sum. It's a "continued radical" ( if there is such a term ).
If you are suggesting the use of:
$$\sqrt{n^\frac{1}{n} +\sqrt{n^\frac{1}{n} +\sqrt{n^\frac{1}{n} +\dots}}}$$
in some fashion I would suggest you consider the following:
$$\sqrt{1+\sqrt{1+\sqrt{1+\dots}}} = \frac{1+\sqrt{5}}{2}$$
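That identity is easy to check numerically: iterating $x \mapsto \sqrt{1+x}$ converges to the positive root of $x^2 = 1 + x$, which is the golden ratio. A quick sketch, added for illustration:

```python
import math

# Iterate x <- sqrt(1 + x); a fixed point satisfies x^2 = 1 + x,
# whose positive root is the golden ratio (1 + sqrt(5)) / 2.
x = 1.0
for _ in range(60):
    x = math.sqrt(1 + x)

golden = (1 + math.sqrt(5)) / 2
print(x, golden)  # both approximately 1.6180339887
```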

Sorry if I'm making this seem too complicated. If you want to be more specific with your method ( as I'm not sure what you're suggesting ), I'd happily go into more detail.

-joeboo

7. Mar 30, 2005

### t!m

Well, $$\lim_{n\rightarrow \infty} n^{1/n}$$ does equal 1, if that helps at all. This would imply divergence, assuming the original reduction is correct, which I'm not sure it is.

Let $$y=n^{1/n}$$
$$\ln y=\frac{1}{n} \ln n=\frac{\ln n}{n}$$
Then, by L'Hôpital's rule,
$$\lim_{n\rightarrow \infty} \ln y=\lim_{n\rightarrow \infty} \frac{\ln n}{n}=\lim_{n\rightarrow \infty}\frac{1}{n}=0$$
$$\lim_{n\rightarrow \infty} y=e^0=1$$
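A quick numerical illustration of that limit, added for concreteness:

```python
# Numerical illustration that n**(1/n) -> 1 as n grows.
for n in (10, 100, 10_000, 1_000_000):
    print(n, n ** (1 / n))
```

The printed values decrease toward 1, matching the L'Hôpital computation.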

8. Mar 30, 2005

### t!m

Yeah, a quick check with a calculator will in fact show that this converges to about 1.758ish.

9. Mar 31, 2005

### philosophking

Thank you joe, I apologize for having you type most of your stuff out again, but that definitely helps, and I see where you obtained your expressions. I see how at the end you're showing that since the original expression is between two things that converge, it also converges, and that's how you come to your conclusion.

I guess I just have to figure out where I went wrong. This is how I see the problem:

$$\sqrt{1+\sqrt{2+\sqrt{3+... \sqrt{n}}}}\geq \sqrt{\sqrt{\sqrt{...\sqrt{n}}}}$$

And the RHS is basically $n$ nested square roots, i.e., the RHS is $$n^{1/n}$$. Now, a theorem states that if a series converges, then the limit of its terms is zero; i.e., if that limit is not zero, the series diverges. Also, if one series dominates another term by term and the smaller one diverges, the larger one diverges as well.

Hence, if we can show that the limit of the RHS as $n\rightarrow\infty$ is not zero, we could conclude that it diverges, which would imply the original diverges.

Where do I go wrong? Is it in my assumption that it is an "infinite sum" or whatever, as you said?

10. Mar 31, 2005

### Data

does it make sense to say that since

$$\lim_{x \rightarrow \infty} 1 = 1 \neq 0$$

that the sequence $\{1\}$ diverges?

Indeed, the theorem you refer to deals with series, not sequences.

If we let

$$x_n=\sqrt{1+\sqrt{2+\sqrt{3+... \sqrt{n}}}}$$

then you are trying to find

$$\lim_{n\rightarrow \infty} x_n$$

ie. the limit of a sequence, not a series.

Your inequality just tells you that if the sequence converges, then it converges to something $\geq 1$.

11. Mar 31, 2005

### Alkatran

$$S1(4) = \sqrt{1+\sqrt{2+\sqrt{3+\sqrt{4}}}}$$
$$S2(4) = \sqrt{1+\sqrt{2+\sqrt{3+4}}}$$

$$\sqrt{1+\sqrt{2+\sqrt{3+\sqrt{4}}}} \leq \sqrt{1+\sqrt{2+\sqrt{3+4}}}$$
$$S1 \leq S2$$

$$\sqrt{1+\sqrt{2+\sqrt{3+4}}} < \sqrt{1+\sqrt{2+3}}$$
This behavior continues as $n$ increases, with each term slightly smaller. This is because $x < \sqrt{2x+1}$ exactly when $x$ is below the positive root of:
$$x = \sqrt{2x+1}$$
$$x^2 = 2x + 1$$
$$x^2 - 2x - 1 = 0$$
$$x = 1+\sqrt{2} \approx 2.4142$$
So the value must get smaller as $n$ increases, once $n$ is greater than or equal to 3.
$$S2(n) > S2(n+1)$$

$$\sqrt{1+\sqrt{2+\sqrt{3+\sqrt{4}}}} \geq \sqrt{1+\sqrt{2+\sqrt{3}}}$$
This is obviously true, since we're adding more. Each term will be slightly larger.
$$S1(n) < S1(n+1)$$

So we have
$$S1(n+1) > S1(n)$$
$$S2(n+1) < S2(n)$$
$$S1 \leq S2$$

S1 can't go past S2 and it must increase. S2 can't go past S1 and it must decrease. Looks like convergence to me!
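The squeeze above can be checked numerically. A small Python sketch (the names S1/S2 follow the post; added for illustration):

```python
import math

def S1(n):
    # sqrt(1 + sqrt(2 + ... + sqrt(n))): the actual partial radical
    val = 0.0
    for k in range(n, 0, -1):
        val = math.sqrt(k + val)
    return val

def S2(n):
    # Same tower, but with the innermost sqrt(n) replaced by n itself:
    # sqrt(1 + sqrt(2 + ... + sqrt((n-1) + n)))
    val = float(n)
    for k in range(n - 1, 0, -1):
        val = math.sqrt(k + val)
    return val

for n in range(3, 9):
    print(n, S1(n), S2(n))
```

The printout shows S1 creeping up, S2 creeping down, and S1 staying below S2 throughout, exactly the two-sided trap described above.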

Last edited: Mar 31, 2005
12. Mar 31, 2005

### shmoe

The RHS is not $$n^{1/n}$$, it's $$(((n^{1/2})^{1/2})\ldots)^{1/2}=n^{1/2^n}$$.

In any case, $$n^{1/n}$$ goes to 1 as n goes to infinity, so the best you could say is that if your original sequence converges, then it converges to something greater than or equal to 1.

You can also bound your original sequence by 2 without too much trouble. Take $$x_{1}=2$$ and define $$x_{i+1}=x_{i}^2-i$$ for all i>0. (note that we have $$x_{i}=\sqrt{i+x_{i+1}}$$ as well). Show by induction that $$x_{i}^2> 2i$$ for all i. Now to show S(n) (using the same notation as joeboo) is less than 2 consider $$x_{n}$$ and work backwards:

$$\sqrt{n}\leq x_{n}$$

$$\sqrt{(n-1)+\sqrt{n}}\leq\sqrt{(n-1)+x_{n}}=x_{n-1}$$

... (use a finite induction here)

$$S(n)=\sqrt{1+\sqrt{2+\sqrt{3+\dots +\sqrt{n}}}}\leq x_{1}=2$$
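The induction and the final bound check out numerically. A small Python sketch (variable names mine, added for illustration):

```python
import math

# x_1 = 2 and x_{i+1} = x_i^2 - i, so that x_i = sqrt(i + x_{i+1}).
x = {1: 2}
for i in range(1, 12):
    x[i + 1] = x[i] ** 2 - i

# Induction claim: x_i^2 > 2i for every i ...
assert all(x[i] ** 2 > 2 * i for i in range(1, 13))
# ... which gives sqrt(i) <= x_i; unwinding the tower then yields S(n) <= x_1 = 2.
assert all(math.sqrt(i) <= x[i] for i in range(1, 13))

def S(n):
    # sqrt(1 + sqrt(2 + ... + sqrt(n))), built inside out
    val = 0.0
    for k in range(n, 0, -1):
        val = math.sqrt(k + val)
    return val

assert all(S(n) <= 2 for n in range(1, 30))
print("S(25) =", S(25))  # about 1.7579, comfortably below 2
```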

13. Mar 31, 2005

### omagdon7

Its convergence is pretty simple to show: N^(1/N) = 1/(N^N); if you take the limit as N goes to infinity, it is 0, and if the limit of a sequence as N goes to infinity is some number, the sequence converges.

14. Mar 31, 2005

### Data

$$n^{\frac{1}{n}} \neq n^{-n}$$

.....

15. Apr 1, 2005

### saltydog

Alright, I'm pretty convinced. I wrote a Mathematica program to plot the trend up to n=25 (you know, nested 25 times and all the ones before it). It seems to jump pretty quickly to a value around 1.75793. See the attached plot. Can anyone prove this?

Ok, I see T!M already showed this. A plot is Ok too though.

#### Attached Files:

• nestedroot.JPG (4.1 KB)
Last edited: Apr 1, 2005
16. Apr 1, 2005

### saltydog

According to Herschfeld's Convergence Theorem (which I just learned):

$$\sqrt{x_1+\sqrt{x_2+\sqrt{x_3+...}}}$$

converges iff the sequence:

$$\{(x_n)^{1/{2^n}}\}$$

is bounded.

So, for positive terms, the sequence is automatically bounded below by 0, and for:

$$\sqrt{1+\sqrt{2+\sqrt{3+...}}}$$

we have:

$$(x_n)^{1/2^n}=n^{1/2^n}<(2^n)^{1/2^n}=2^{n/2^n}$$

and:

$$\lim_{n\rightarrow\infty}2^{n/{2^n}}=2^0=1$$

Thus the sequence is bounded and therefore the nested square root converges.
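Under this criterion, the boundedness of $n^{1/2^n}$ is also easy to see numerically (a quick sketch, added for illustration):

```python
# For x_n = n, Herschfeld's criterion asks whether n**(1/2**n) stays bounded.
vals = [n ** (1 / 2 ** n) for n in range(1, 30)]
print(max(vals))  # maximum occurs at n = 2: 2**0.25, about 1.1892
```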

17. Apr 1, 2005

### philosophking

Very interesting! Nice insight. What class did you learn that in, analysis?

Thanks for your help.

18. Apr 1, 2005

### saltydog

Well, I just reviewed Herschfeld's theorem this morning via MathWorld and then applied it to the problem. I believe my analysis is correct, but I leave it to you and others reviewing my post to verify it or point out flaws.

Also, reviewing the web for nested radicals led me to believe that determining the value of the limit analytically is not easily done and in fact may not be possible, although I yield to anyone who proves me wrong.