# Hard Problem

1. Dec 20, 2007

### sobolev

Can any of you solve this?

Firstly, some notation:

Let $$\Pi(x) = \Gamma(x+1)$$ where $$\Gamma(x)$$ is the usual gamma function i.e. an extension of the factorial to the complex numbers.

Let $$log^{n} (x) = log( log( \cdots log( x ) ) )$$ where $$log$$ is applied n times to x e.g. $$log^{4} (x) = log( log( log( log( x ) ) ) )$$.

Similarly, let $${\Pi}^{n} (x) = \Pi( \Pi( \cdots \Pi( x ) ) )$$ where $$\Pi$$ is applied n times to x.

THE QUESTION:

Let $$a_{n} = log^{n} ( {\Pi}^{n} (3) )$$.

Evaluate, using a computer or otherwise, $$\lim_{n \rightarrow \infty}{ a_{n} }$$ to five decimal places.

(If you are feeling especially clever try to derive a closed form expression for $$\lim_{n \rightarrow \infty}{ a_{n} }$$).

2. Dec 20, 2007

### sobolev

Even just a discussion on the sort of approaches that people would take to solve such a problem would be interesting.

3. Dec 20, 2007

### EnumaElish

With a computer you will soon run into overflow error (e.g. for any n > 3).
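For example, in Python (a minimal sketch; the loop is just an illustration): $$\Pi^{3}(3) = 720! \approx 10^{1746}$$ is already far past the largest double (about 1.8e308), so naive floating-point iteration dies at the third step.

```python
import math

# Naive iteration of PI(x) = Gamma(x+1) in double precision.
# PI^1(3) = 6 and PI^2(3) = 720 are fine, but PI^3(3) = 720! ~ 10^1746
# exceeds the largest double (~1.8e308), so math.gamma overflows.
x = 3.0
for n in range(1, 4):
    try:
        x = math.gamma(x + 1)
        print(n, x)
    except OverflowError:
        print(n, "overflow")
        break
```

Exact integer arithmetic only postpones the problem one step: 720! is a 1747-digit integer a computer handles easily, but $$(720!)!$$ is hopeless to evaluate directly.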

4. Dec 20, 2007

Did you try to throw Stirling's Approximation at it?

5. Dec 20, 2007

### Gib Z

I did, but I couldn't be bothered to work out the general approximation for iterating it n times. It's the OP's question, so he might have to bother.

6. Dec 20, 2007

### sobolev

Of course you will get an overflow error if you compute it the naive way. The idea is to rewrite it in a different form so that you can calculate it without overflow.

I have already solved the problem. Just interested in what others have to say about it.

7. Dec 21, 2007

### EnumaElish

I'd really like to see a demonstration that the limit is anything other than infinity.

8. Dec 21, 2007

Yes, I was trying to figure that out too. The only thing I have figured out so far is that it's monotonically increasing:

$$log^{(n)}(\Pi^{(n)}(3)) = log^{(n-1)}(log(\Pi^{(n)}(3))) \geq log^{(n-1)}(\Pi^{(n-1)}(3))$$
$$\Leftrightarrow log(\Pi^{(n)}(3)) = log(\Pi^{(n-1)}(3)!) \geq \Pi^{(n-1)}(3)$$
$$\Leftrightarrow \Pi^{(n-1)}(3)! \geq exp(\Pi^{(n-1)}(3))$$
$$\Leftrightarrow \Pi^{(n-1)}(3) \geq 6$$ (since $$N! > exp(N)$$ for $$N \geq 6$$)
$$\Leftrightarrow n \geq 2$$.

It follows that $$a_n \geq a_{n-1}$$ for n = 2,3,...
Still seems like there's a long way to go.

The following webpage was interesting, but does not seem easy to apply here, if it is applicable at all: http://www.rskey.org/gamma.htm

Last edited: Dec 21, 2007
9. Dec 21, 2007

### EnumaElish

Actually, it should be monotonically decreasing, which means that the limit may in fact be finite.

"Loosely," the first PI results in 4!. The first Log application converts that into the sum Log[1] + Log[2] + Log[3] + Log[4] = x < 4 (x is about 3). The second nested PI is then x!. The second nested Log turns this into Log[1] + ... + Log[x] = y < x. (y is about 2.)

So the first three terms (including n = 0) are 4 > x (approx. 3) > y (approx. 2).

10. Dec 21, 2007

My calculations are:
a_1 = log 3! = 1.79175946922806
a_2 = log log 3!! = log log 720 = 1.88392094129494

The javascript calculator at http://www.univie.ac.at/future.media/moe/JavaCalc/jcintro.html has a "loggamma" function, which I used to compute the first few terms:
a_1 = loggamma(3+1) = 1.7917594692280545
a_2 = log(loggamma(3!+1)) = 1.8839209412949373
a_3 = log(log(loggamma(3!!+1))) = 2.1161775528289275
a_4 = log(log(log(loggamma(3!!!+1)))) = NaN

But it seems you get stuck no matter what if you try to calculate using
$$log N!! = log 2 + log 3 + \cdots + log N!$$: the sum has $$N! - 1$$ terms, and the calculator already gives NaN for the final term log N!.
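The log-space idea can be pushed one step further in Python (my own sketch, not necessarily how sobolev did it): math.lgamma returns $$log \Gamma$$ directly, and for a_4 one can replace $$log(M!)$$ by Stirling's $$M log M - M$$ with $$M = 720!$$, since the dropped correction term is utterly negligible after two more logs.

```python
import math

# a_n = log^(n)(PI^(n)(3)); stay in log-space via lgamma(x+1) = log(x!).
a1 = math.lgamma(3 + 1)                          # log(3!)        = log 6
a2 = math.log(math.lgamma(6 + 1))                # log(log(6!)),  6 = 3!
a3 = math.log(math.log(math.lgamma(720 + 1)))    # log^3(720!), 720 = 6!

# For a_4, let M = 720!.  Stirling gives log(M!) ~ M*log(M) - M, hence
#   log^2(M!) ~ log(M) + log(log(M) - 1),   with log(M) = lgamma(721).
# The dropped 0.5*log(2*pi*M) term is ~2000 against M*(log(M)-1) ~ e^4021,
# so it cannot move the result after two further logs.
logM = math.lgamma(720 + 1)
a4 = math.log(math.log(logM + math.log(logM - 1)))

print(a1, a2, a3, a4)
```

This reproduces the three values above and gives a_4 close to 2.11643.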

Another approach to calculating it is to set

$$s_1 = a_1$$
$$s_{n+1} = a_{n+1}-a_n$$, and consider
$$a_n = \sum_{i = 1}^n s_i$$.

So you can add term by term if you can figure out a way to calculate
$$s_n = a_n - a_{n-1} = log^{(n)}(\Pi^{(n)}(3)) - log^{(n-1)}(\Pi^{(n-1)}(3))$$
$$\ \ \ = log( \frac{ log^{(n-1)}(\Pi^{(n-1)}(3)!) }{ log^{(n-2)}(\Pi^{(n-2)}(3)!) } )$$.

Also from this expression it seems plausible that s_n -> 0 as n -> infinity, or perhaps
$$|\frac{s_{n+1}}{s_n}| \rightarrow L < 1$$ as n -> $$\infty$$, which by the ratio test would show that $$a_n = \sum s_i$$ converges.

But for some reason it seems to me that the answer won't be that simple. Actually, the root test may be better.

11. Dec 21, 2007

I think he's saying, for positive integers k,
$$\Pi(k) = \Gamma(k+1) = k!$$, so
$${\Pi}^{n} (3) = 3!!...!$$, with n !'s.
$$log^{(n)}(\Pi^{(n)}(3)) = log^{(n)}(3!!...!)$$.

12. Dec 21, 2007

### EnumaElish

Thanks. I started out by assuming Gamma[x] = x!, which should have been (x-1)!.

I also have been applying the (alternating) Log[PI[Log[PI...]]], which is different from Log[...Log[PI[...PI]]].

BTW, the alternating function $$\rightarrow 0$$.

Last edited: Dec 21, 2007
13. Dec 22, 2007

### sobolev

I don't want to tell you how I solved the problem for a good reason. I found it to be quite interesting and I want to see other creative and unique ideas for solving it. If I tell you how I did it then I probably will not see other creative approaches.

Here is some carrot: a_4 = 2.1164259

Last edited: Dec 22, 2007
14. Dec 22, 2007

### Gib Z

Thanks for that carrot, now I can give you the answer! That is, provided I get 100 guesses.

15. Dec 23, 2007

### sobolev

Okay some more carrot: a_5 = 2.1164259

Anybody solve it on their own?

16. Dec 23, 2007

### EnumaElish

The Grinch in this problem is the memory overflow error (numbers too large for a desktop computer to handle).

I am thinking of a statistical sampling approach.

$$\Gamma(x)$$ is the expected value of $$g(t;x) = t^{x-1}$$ where t is an exponential random variable with mean 1 (t ~ Exp(1)).

Therefore $$\Gamma(x+1) = E[g(t;x+1)] = E[t^x]$$.

To prevent a memory overflow error, I'd look for a way to scale t and g(t) so that $$E[g(h(t);x)] = E[k(x) g(t;x)] = k(x) E[g(t;x)]$$ for some functions h and k.

E.g. let k be a scaling constant and let h(t) = t/k. Then $$E[g(h(t);x+1)] = E[g(t/k;x+1)] = E[t^x/k^x] = E[t^x]/k^x$$. Conversely, if $$E[t^x/k^x] = \mu$$ then $$E[t^x] = k^x \mu$$.
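A minimal sketch of the sampling idea (my own illustration; the function name and sample count are made up): draw t ~ Exp(1) and average $$t^x$$ to estimate $$\Gamma(x+1) = E[t^x]$$. It checks the identity for small x, though the variance grows far too fast for this to reach anything near 720! without the rescaling above.

```python
import random

# Monte Carlo estimate of Gamma(x+1) = E[t^x] with t ~ Exp(1).
# Illustrative only: Var(t^x) = Gamma(2x+1) - Gamma(x+1)^2 grows
# roughly like (2x)!, so large x needs the scaling trick above.
def mc_gamma_plus_one(x, samples=200_000, seed=1):
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0) ** x for _ in range(samples)) / samples

print(mc_gamma_plus_one(3))   # should land near Gamma(4) = 3! = 6
```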

Last edited: Dec 23, 2007
17. Dec 23, 2007

### sobolev

Sounds interesting.

Also I would like to see how you guys would go about proving convergence of the sequence.

The one thing I couldn't do was come up with a closed form expression for the value. The person who set the problem couldn't either so I didn't waste too much time on this.

18. Dec 23, 2007

I may have more on this soon (or may not)... Something I was able to show that might help (or might not) is:

$$\frac{log \ log \ N!}{log \ N} \rightarrow 1$$ as $$N \rightarrow \infty$$

proof:

Use the fact that $$N \cdot log(N) - N \leq log(N!) \leq N \cdot log(N)$$,
which can be proved by showing
$$N log(N) - N + 1 = \int_1^N log(t) \, dt \leq log(N!) \leq \int_1^N (log(t) + 1) \, dt = N log(N)$$.

You then also have to use the fact that $$lim_{M \rightarrow \infty} \frac{log(M)}{M} = 0$$ to get the limit of the log of the left side and the log of the right side:
$$\frac{log(Nlog(N) - N)}{log(N)} = \frac{log(N) + log(log(N) - 1)}{log(N)} = 1 + \frac{log(log(N) - 1)}{log(N)} \rightarrow 1 + 0 = 1$$,
$$\frac{log(Nlog(N))}{log(N)} = \frac{log(N) + log(log(N))}{log(N)} = 1 + \frac{log(log(N))}{log(N)} \rightarrow 1$$.

The claim follows by the squeeze theorem.
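A quick numerical sanity check of the lemma (my own sketch), using lgamma for log N!. Note how slowly the ratio approaches 1: the excess is roughly $$log(log(N))/log(N)$$, so it is still well above 1 even at $$N = 10^8$$.

```python
import math

# Check  log(log(N!)) / log(N) -> 1  numerically.
# lgamma(N+1) = log(N!) avoids computing N! itself.
for N in (10, 100, 10_000, 10**8):
    ratio = math.log(math.lgamma(N + 1)) / math.log(N)
    print(N, round(ratio, 4))
```

(The ratio is not monotone for small N; it rises before drifting back down toward 1.)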

It seems you might be able to apply this to proving convergence.

19. Dec 23, 2007

Another kind of striking observation...
Let $$exp^n(x) = exp(exp(...exp(x)...))$$ (similar to log^n(x)).

For fixed k >= 1 define a sequence
$$b_n^k = exp^k(a_{n+k}) = log^n(\Pi^{n+k}(3))$$.

Notice that for any sequence c_n and any k >= 1, c_n converges to a finite limit if and only if $$exp^k(c_n)$$ does (both $$exp^k$$ and its inverse $$log^k$$ are continuous), and likewise $$c_n \rightarrow \infty$$ if and only if $$exp^k(c_n) \rightarrow \infty$$.

It follows that our sequence a_n converges if and only if $$b_n^k$$ converges for all k. In other words we can instead look at a sequence whose starting value $$\Pi^{k+1}(3)$$ is as large as we want. Still not sure if this is going to lead to a solution.

20. Dec 23, 2007

$$\frac{log^{k+1}(N!)}{log^k(N)} \rightarrow 1$$ as $$N \rightarrow \infty$$