
Hard Problem

  1. Dec 20, 2007 #1
    Can any of you solve this? :smile:

    Firstly, some notation:

    Let [tex]\Pi(x) = \Gamma(x+1)[/tex] where [tex]\Gamma(x)[/tex] is the usual gamma function i.e. an extension of the factorial to the complex numbers.

    Let [tex]log^{n} (x) = log( log( \cdots log( x ) ) )[/tex] where [tex]log[/tex] is applied n times to x e.g. [tex]log^{4} (x) = log( log( log( log( x ) ) ) )[/tex].

    Similarly, let [tex]{\Pi}^{n} (x) = \Pi( \Pi( \cdots \Pi( x ) ) )[/tex] where [tex]\Pi[/tex] is applied n times to x.


    Let [tex]a_{n} = log^{n} ( {\Pi}^{n} (3) )[/tex].

    Evaluate, using a computer or otherwise, [tex]\lim_{n \rightarrow \infty}{ a_{n} }[/tex] to five decimal places.

    (If you are feeling especially clever try to derive a closed form expression for [tex]\lim_{n \rightarrow \infty}{ a_{n} }[/tex]).
  3. Dec 20, 2007 #2
    Even just a discussion on the sort of approaches that people would take to solve such a problem would be interesting.
  4. Dec 20, 2007 #3


    Science Advisor
    Homework Helper

    With a computer you will soon run into overflow error (e.g. for any n > 3).
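    To make the overflow concrete (a quick Python illustration, not from the thread): [tex]\Pi^{3}(3) = 720![/tex] is still fine as an exact integer, but it already exceeds the range of a 64-bit double, so any floating-point route through it dies here.

```python
import math

x3 = math.factorial(720)   # Pi^3(3) = 720!, exact as a Python bignum
print(len(str(x3)))        # about 1747 decimal digits

try:
    float(x3)              # but it no longer fits in a double
except OverflowError as err:
    print("overflow:", err)
```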
  5. Dec 20, 2007 #4
    Did you try to throw Stirling's Approximation at it?
  6. Dec 20, 2007 #5

    Gib Z

    Homework Helper

    I did, but then I couldn't be bothered to find the general approximation for iterating it n times. It's the OP's question, so he might have to bother.
  7. Dec 20, 2007 #6
    Of course you will get overflow error if you compute it the naive way. The idea is to write it in a different form so you can calculate it without overflow.

    I have already solved the problem. Just interested in what others have to say about it.
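    One such reformulation (my own sketch, not necessarily the method the poster used): never form [tex]\Pi^{n}(3)[/tex] itself; carry [tex]log(\Pi^{n}(3))[/tex] via lgamma while it fits in a double, and take one Stirling step entirely in log space when it does not.

```python
import math

def log_logs(x, k):
    """Apply log k times."""
    for _ in range(k):
        x = math.log(x)
    return x

# a_n = log^n(Pi^n(3)), where Pi(x) = Gamma(x+1)
a1 = math.lgamma(4)            # log(3!) = log 6
a2 = math.log(math.lgamma(7))  # log(log(6!)) = log(log 720)
L = math.lgamma(721)           # log(720!) ~ 4021.3; 720! itself would overflow
a3 = log_logs(L, 2)            # a_3 = log^2(log(720!))

# Pi^4(3) = Gamma(720! + 1). Stirling's leading term gives
# log(Pi^4(3)) ~ x*(log x - 1) with x = 720!, hence
# log^2(Pi^4(3)) = L + log(L - 1); the dropped terms are O(log(x)/x),
# utterly below double precision.
L2 = L + math.log(L - 1.0)
a4 = log_logs(L2, 2)           # a_4 = log^2(log^2(Pi^4(3)))

print(a1, a2, a3, a4)
```

    A further Stirling step shifts the deep logs by less than machine epsilon, which is consistent with the a_4 and a_5 values quoted later in the thread agreeing to every printed digit.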
  8. Dec 21, 2007 #7


    Science Advisor
    Homework Helper

    I'd really like to see a demonstration that the limit is anything other than infinity.
  9. Dec 21, 2007 #8
    Yes, I was trying to figure that out too. The only thing I have figured out so far is that it's monotonically increasing:

    [tex]log^{(n)}(\Pi^{(n)}(3)) = log^{(n-1)}(log(\Pi^{(n)}(3))) \geq log^{(n-1)}(\Pi^{(n-1)}(3))[/tex]
    [tex]\Leftrightarrow log(\Pi^{(n)}(3)) = log(\Pi^{(n-1)}(3)!) \geq \Pi^{(n-1)}(3)[/tex] (since [tex]log^{(n-1)}[/tex] is increasing)
    [tex]\Leftrightarrow \Pi^{(n-1)}(3)! \geq exp(\Pi^{(n-1)}(3))[/tex]
    [tex]\Leftrightarrow \Pi^{(n-1)}(3) \geq 6[/tex] (since [tex]m! \geq exp(m)[/tex] exactly when [tex]m \geq 6[/tex], for integers m)
    [tex]\Leftrightarrow n \geq 2[/tex] (as [tex]\Pi^{(1)}(3) = 6[/tex]).

    It follows that [tex]a_n \geq a_{n-1}[/tex] for n = 2,3,...
    Still seems like there's a long way to go.
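    The factorial-versus-exponential inequality used in the second-to-last step is easy to spot-check numerically (a throwaway snippet, not part of the argument):

```python
import math

# check where n! >= exp(n) first holds: False up to n = 5, True from n = 6 on
for n in range(2, 10):
    print(n, math.factorial(n) >= math.exp(n))
```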

    The following webpage was interesting, but does not seem easy to apply here, if at all possible: http://www.rskey.org/gamma.htm
    Last edited: Dec 21, 2007
  10. Dec 21, 2007 #9


    Science Advisor
    Homework Helper

    Actually, it should be monotonically decreasing. Which means that the limit may in fact be finite.

    "Loosely," the first PI results in 4!. The first Log application converts that into the sum Log[1] + Log[2] + Log[3] + Log[4] = x < 4 (x is about 3). The second nested PI is then x!. The second nested Log turns this into Log[1] + ... + Log[x] = y < x. (y is about 2.)

    So the first three terms (including n = 0) are 4 > x (approx. 3) > y (approx. 2).
  11. Dec 21, 2007 #10
    My calculations are:
    a_1 = log 3! = 1.79175946922806
    a_2 = log log 3!! = log log 720 = 1.88392094129494

    The javascript calculator at http://www.univie.ac.at/future.media/moe/JavaCalc/jcintro.html has a "loggamma" function, I used it to compute the first three terms:
    a_1 = loggamma(3+1) = 1.7917594692280545
    a_2 = log(loggamma(3!+1)) = 1.8839209412949373
    a_3 = log(log(loggamma(3!!+1))) = 2.1161775528289275
    a_4 = log(log(log(loggamma(3!!!+1)))) = NaN

    But it seems you get stuck no matter what if you try to calculate using
    log N!! = log 2 + ... + log N!, because the last term log N! already gives NaN (N! overflows).

    Another approach to calculating it is to set

    [tex]s_1 = a_1[/tex]
    [tex]s_{n+1} = a_{n+1}-a_n[/tex], and consider
    [tex]a_n = \sum_{i = 1}^n s_i[/tex].

    So you can add term by term if you can figure out a way to calculate
    [tex]s_n = a_n - a_{n-1} = log^{(n)}(\Pi^{(n)}(3)) - log^{(n-1)}(\Pi^{(n-1)}(3))[/tex]
    [tex]\ \ \ = log( \frac{ log^{(n-1)}(\Pi^{(n-1)}(3)!) }{ log^{(n-2)}(\Pi^{(n-2)}(3)!) } ) [/tex].

    Also from this expression it seems plausible that s_n -> 0 as n -> infinity, or perhaps
    [tex]|\frac{s_{n+1}}{s_n}| \rightarrow L < 1[/tex] as n -> [tex]\infty[/tex], which would show a_n = sum s_n converges.

    But for some reason it seems to me like the answer won't be that simple.. Actually, the root test may be better..
  12. Dec 21, 2007 #11
    I think he's saying, for positive integers k,
    [tex]\Pi(k) = \Gamma(k+1) = k![/tex], so
    [tex]{\Pi}^{n} (3) = 3!!...![/tex], with n !'s.
    [tex]log^{(n)}(\Pi^{(n)}(3)) = log^{(n)}(3!!...!)[/tex].
  13. Dec 21, 2007 #12


    Science Advisor
    Homework Helper

    Thanks. I started out by assuming Gamma[x] = x!, which should have been (x-1)!.

    I also have been applying the (alternating) Log[PI[Log[PI...]]], which is different than Log[...Log[PI[...PI]]].

    BTW, the alternating function ---> 0.
    Last edited: Dec 21, 2007
  14. Dec 22, 2007 #13
    I don't want to tell you how I solved the problem for a good reason. I found it to be quite interesting and I want to see other creative and unique ideas for solving it. If I tell you how I did it then I probably will not see other creative approaches.

    Here is some carrot: a_4 = 2.1164259
    Last edited: Dec 22, 2007
  15. Dec 22, 2007 #14

    Gib Z

    Homework Helper

    Thanks for that carrot, now I can give you the answer! That is, provided I get 100 guesses.
  16. Dec 23, 2007 #15
    Okay some more carrot: a_5 = 2.1164259

    Anybody solve it on their own?
  17. Dec 23, 2007 #16


    Science Advisor
    Homework Helper

    The Grinch in this problem is the memory overflow error (numbers too large for a desktop computer to handle).

    I am thinking of a statistical sampling approach.

    Gamma(x) is the expected value of g(t;x) = [itex]t^{x-1}[/itex] where t is an exponential random variable with mean = 1 (t ~ exp(1)).

    Therefore Gamma(x+1) = E[g(t;x+1)] = E[[itex]t^{x}[/itex]].

    To prevent a memory overflow error, I'd look for a way to scale t and g(t) so that E[g(h(t);x)] = E[k(x)g(t;x)] = k(x)E[g(t;x)] for some functions h and k.

    E.g. let k be a scaling constant and let h(t) = t/k. Then E[g(h(t);x)] = E[g(t/k;x)] = E[[itex]t^{x}/k^{x}[/itex]] = E[[itex]t^{x}[/itex]]/[itex]k^{x}[/itex]. Conversely, if E[[itex]t^{x}/k^{x}[/itex]] = [itex]\mu[/itex] then E[[itex]t^{x}[/itex]] = [itex]k^{x}\mu[/itex].
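    A minimal toy version of that sampling idea (my own sketch, with an assumed sample size; it dodges overflow by working in log space with a log-sum-exp rather than the h/k rescaling described above):

```python
import math
import random

def mc_log_gamma_plus1(x, n=200_000, seed=1):
    """Monte Carlo estimate of log Gamma(x+1) = log E[t^x], t ~ Exp(1).

    Averages t^x in log space (log-sum-exp), so t**x is never formed."""
    random.seed(seed)
    logs = [x * math.log(random.expovariate(1.0)) for _ in range(n)]
    m = max(logs)
    return m + math.log(sum(math.exp(v - m) for v in logs) / n)

print(mc_log_gamma_plus1(6.0))  # noisy estimate of log 720 = 6.5792...
print(math.lgamma(7.0))         # exact: 6.5792512...
```

    The catch is variance: the second moment ratio E[t^{2x}]/E[t^x]^2 = Gamma(2x+1)/Gamma(x+1)^2 grows roughly like 4^x, so plain sampling degrades quickly for large x; it is a proof of concept, not a route to a_4.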
    Last edited: Dec 23, 2007
  18. Dec 23, 2007 #17
    Sounds interesting.

    Also I would like to see how you guys would go about proving convergence of the sequence.

    The one thing I couldn't do was come up with a closed form expression for the value. The person who set the problem couldn't either so I didn't waste too much time on this.
  19. Dec 23, 2007 #18
    I may have more on this soon (or may not)... Something I was able to show that might help (or might not) is:

    [tex]\frac{log \ log \ N!}{log \ N} \rightarrow 1[/tex] as [tex]N \rightarrow \infty[/tex]


    Use the fact that [tex]N \cdot log(N) - N \leq log(N!) \leq N \cdot log(N)[/tex],
    which can be proved by showing
    [tex]N log(N) - N + 1 = \int_1^N log(t) \, dt \leq log(N!) \leq \int_1^N (log(t) + 1) \, dt = N log(N)[/tex].

    You then also have to use the fact that [tex]\lim_{M \rightarrow \infty} \frac{log(M)}{M} = 0[/tex] to get the limit of the log of each side, divided by log(N):
    [tex]\frac{log(N log(N) - N)}{log(N)} = \frac{log(N) + log(log(N) - 1)}{log(N)} = 1 + \frac{log(log(N) - 1)}{log(N)} \rightarrow 1 + 0 = 1[/tex],
    [tex]\frac{log(N log(N))}{log(N)} = \frac{log(N) + log(log(N))}{log(N)} = 1 + \frac{log(log(N))}{log(N)} \rightarrow 1[/tex].

    The claim follows by the squeeze theorem.
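    Those bounds, and how slowly the ratio creeps toward 1, can be sanity-checked numerically (a throwaway check; lgamma is used so that N! is never formed):

```python
import math

# check the sandwich N log N - N <= log N! <= N log N,
# and watch log(log N!) / log N drift toward 1
for N in (10, 100, 1000, 10**6):
    log_fact = math.lgamma(N + 1)   # log N! without computing N!
    assert N * math.log(N) - N <= log_fact <= N * math.log(N)
    print(N, math.log(log_fact) / math.log(N))
```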

    It seems you might be able to apply this to proving convergence.

  20. Dec 23, 2007 #19
    Another kind of striking observation...
    Let [tex]exp^n(x) = exp(exp(...exp(x)...))[/tex] (similar to log^n(x)).

    For fixed k >= 1 define a sequence
    [tex]b_n^k = exp^k(a_{n+k}) = log^n(\Pi^{n+k}(3))[/tex].

    Notice that for any sequence c_n and any k >= 1 you have
    [tex]c_n[/tex] converges if and only if [tex]exp^k(c_n)[/tex] converges (and likewise [tex]c_n \rightarrow \infty[/tex] if and only if [tex]exp^k(c_n) \rightarrow \infty[/tex]), since [tex]exp^k[/tex] and [tex]log^k[/tex] are continuous and increasing.

    It follows that our sequence a_n converges if and only if [tex]b_n^k[/tex] converges for all k. In other words we can instead look at a sequence with starting value [tex]\Pi^{k+1}(3)[/tex] as large as we want. Still not sure if this is going to lead to a solution.
  21. Dec 23, 2007 #20
    Now I have been able to show for k >= 1
    [tex]\frac{log^{k+1}(N!)}{log^k(N)} \rightarrow 1[/tex] as [tex]N \rightarrow \infty[/tex]

    So it seems that's an inch away from proving convergence.