Black hole entropy and the log of the golden mean


marcus

Science Advisor
Gold Member
Dearly Missed
there is a curious 1996 paper by Rovelli that gets the
black hole entropy/area formula by a simple counting method.

I say simple advisedly---a lot of combinatorics and counting is
not really simple at all but IMHO difficult---but in this amazing little
5-page paper the counting, which is really just counting partitions of a number, really is simple

and like so much other combinatorics and elementary arithmetical jazz, the golden mean shows up---so he winds up finding the BH entropy using the logarithm of the golden mean. Got a chuckle out of that for some reason

well, here's the link if you want a look
http://arxiv.org/abs/gr-qc/9603063

notice that he was just looking at a Schwarzschild hole---looking at the simplest thing he could, and oversimplifying, indeed IMO not even paying attention, like riding a bicycle with your eyes shut---and the amazing thing is that his was one of the very first such papers (in either string or loop) and he came within a factor on the order of the Immirzi parameter (which he was ignoring). such things are governed by a kind of graceful luck, I believe.

For comparison here is a 1996 string paper which is the first
instance of a string explanation.
http://arxiv.org/abs/hep-th/9601029
It is by Strominger and Vafa and applies to 5-dimensional "extremal" holes by counting the degeneracy of "BPS soliton"
bound states.


Marcus
_______________

A foxhunt by British gentry has been described as "the unspeakable in pursuit of the inedible"
 

marcus

the counting

It turns out that all Rovelli has to do to find the entropy of the BH is to get a grip on something he calls N(M) which is

the number of sequences of positive integers p_1, p_2, ..., p_n, of any finite length,
which "add up" to a large number M in a certain unusual way:

Σ_i sqrt(p_i(p_i + 2)) = M


This is almost like finding the number of PARTITIONS of the number M----the finite sequences of whole numbers that
add up to M in the usual sense
-------------------
Why is it that easy? Because for a given spin network, or a given state of the geometry (a given giant polymer representing the excitation of the geometry that confers area and volume on things),
the area of the BH is

A = 8 pi hbar G Σ_i sqrt(p_i(p_i + 2))

that is just the renowned and revered quantum gravity area formula, where the p's are numbers 1, 2, ... called colors, attached to segments of the polymer which happen to pass through the surface----the edges of the graph confer area on the surfaces they pass through and the vertices confer volume on the regions they lie within.
The graph is just a way of summarizing a lot of geometrical info like that in a way that is highly adapted to being quantized and makes for a good theory.
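Just to make the area formula concrete, here is a minimal numerical sketch of my own (the function name is mine, not Rovelli's), working in units where the prefactor 8 pi hbar G is set to 1:

```python
import math

def area(colors, unit=1.0):
    """Area of a surface punctured by polymer edges carrying the given colors.

    colors: the positive integers p_i on the edges crossing the surface.
    unit:   stands in for the prefactor 8*pi*hbar*G (set to 1 here).
    """
    return unit * sum(math.sqrt(p * (p + 2)) for p in colors)

# a single color-1 puncture contributes sqrt(1*3) = sqrt(3) in these units
a = area([1])
```

So the area spectrum is discrete: each puncture contributes a fixed quantum sqrt(p(p+2)) depending only on its color.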

Because of background independence or so called covariance all that matters is the number of punctures and the colors on them---location is factored out as a "gauge" or physically irrelevant thing.
So all that matters is the p's and all he has to do is count them
------------------------------

And it's not too different from counting arithmetical partitions of integers.

To keep his backpack light, Rovelli defines

M = A/(8 pi hbar G)

So now counting the p's that give him A

A = 8 pi hbar G Σ_i sqrt(p_i(p_i + 2))

is the same as counting the p's that give him M

M = Σ_i sqrt(p_i(p_i + 2))

And N(M)----the number of microstates----is how many p-sequences do that, as I mentioned at the top.
And the entropy is going to be the log of the number of (independent, i.e. really different) microstates that give the same outward appearance, namely the same area.

So he is going after the logarithm of N(M)

In fact Rovelli gets a grip on N(M) by clamping it between upper and lower bounds which do count ordinary-type partitions and which he denotes for convenience by N+(M) for the upper and by N-(M) for the lower



His N+(M) is just the number of finite sequences that satisfy a slightly less stringent condition, namely

Σ_i sqrt(p_i · p_i) = M

It is the same as before but with the 2 removed, and it boils down to simply

Σ_i p_i = M

The number of those things (the ordered partitions, or compositions, of M) is 2^(M-1), so there is no trouble seeing

ln N(M) < M ln 2
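As a sanity check (my own sketch, not from the paper), a brute-force enumeration confirms that the ordered sequences of positive integers summing to M number exactly 2^(M-1):

```python
def compositions(M, min_part=1):
    """Yield every finite sequence of integers >= min_part that sums to M."""
    if M == 0:
        yield ()
        return
    for first in range(min_part, M + 1):
        for rest in compositions(M - first, min_part):
            yield (first,) + rest

# for M = 10 there are 2**9 = 512 such sequences
n = sum(1 for _ in compositions(10))
```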

To get his lower bound N-(M) he needs a slightly more stringent condition, and he recalls the old grade school thing
that A^2 - B^2 = (A + B)(A - B) and says

sqrt(p_i(p_i + 2)) = sqrt((p_i + 1)^2 - 1), which is about (p_i + 1)

So his N-(M) is just the number of finite sequences that satisfy a slightly more stringent condition, namely

Σ_i (p_i + 1) = M

this means counting the compositions of M into chunks that are each at least 2 in size, which is where the Golden Mean comes in---those counts are the Fibonacci numbers---and he finds

ln N-(M) ≈ M ln((1 + sqrt 5)/2) = M ln(G.M.)
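Again a brute-force check of my own, not from the paper: counting the sequences whose parts are each at least 2 and sum to M produces the Fibonacci numbers, so the ratio of successive counts tends to the golden mean:

```python
import math

def count_min2(M):
    """Number of finite sequences of integers >= 2 summing to M."""
    if M == 0:
        return 1
    return sum(count_min2(M - p) for p in range(2, M + 1))

counts = [count_min2(M) for M in range(2, 13)]
# counts come out 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89 -- the Fibonacci
# numbers, so ln N-(M) grows like M times the log of the golden mean
golden = (1 + math.sqrt(5)) / 2
ratio = counts[-1] / counts[-2]
```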

Now he has some number D which is estimated to be between
ln (G.M.) and ln 2 and

ln N(M) = D M = D A/(8 pi hbar G) = (D/(8 pi hbar G)) A
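As a rough numerical illustration of my own (Rovelli does not do this in the paper), one can count all finite color sequences whose weighted sum stays at or below a budget M; as M grows, ln N(M)/M approaches the number D sandwiched between ln(G.M.) and ln 2:

```python
import math

def count_states(budget):
    """Count finite sequences of colors p >= 1 (empty sequence included)
    whose total weight sum(sqrt(p*(p+2))) does not exceed the budget."""
    total = 1  # the empty sequence
    p = 1
    while math.sqrt(p * (p + 2)) <= budget:
        total += count_states(budget - math.sqrt(p * (p + 2)))
        p += 1
    return total

M = 12
N = count_states(M)
# ln(N)/M estimates D; for large M it settles between
# ln((1 + sqrt 5)/2) ~ 0.481 and ln 2 ~ 0.693
D_estimate = math.log(N) / M
```

This is a plain depth-first enumeration, so it is only practical for modest M, but that is enough to see the exponential growth in the number of microstates.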

the rest is mopping up.
According to the official def of thermodynamic entropy, he is looking for Boltzmann's k times the log of the number of states corresponding to a certain area. N(M) and N(A) are the same state-count, so

S = k ln N(A) = kDM = [kD/(8 pi hbar G)] A

That means he has derived Bekenstein-Hawking, except that the number in front is D/(8 pi) instead of 1/4. But it's 1996 and that will be taken care of in time----what matters is the proportionality to the area (it could have been some totally screwed-up function of the area, or no clear function of it at all, but it isn't---it is some small number times the area). So it seems like a good place to stop.
 
