Steven Weinberg offers a way to explain inflation

  • #91
If we assume an analogy with gases at the critical point, I'd expect the following situation: suppose one could somehow draw a critical line, representing where a horizon forms, on a graph of density vs. radius.

Above the critical line -> outside horizon
Below the critical line -> inside horizon

At the critical point both situations meet; it is located at the Planck density and Planck size, which is by the way the most extreme point. Matter at that point would tunnel to the lower side. The only way not to violate the extreme is to expand. Perhaps as a Planck photon.

Note that in this reasoning the only way for matter to enter a black hole is to be disassembled down to the Planck scale, as if being digested, and to enter through that extreme point. I think it is likely that the energy difference between the inside and outside states will force a fast enough "digestion".

It is interesting to ask: what if, instead of a Planck photon, one got the whole universe in there, or a lot of mass? What would one get? A baby universe? A bouncing universe? Because that critical point also corresponds to the (Λ, G) point, the IR limit could follow a trajectory that would cause matter to big-bang. Or maybe it would just collapse to the inside of a black hole, until it self-digests.

You see, although the region that allows the passage of matter is short, and greatly favors the flux outward or inward, there is a small probability that matter tunnels back through it.
 
  • #92
marcus said:
We have no evidence that physics up near Planck scale is basically the physics of black holes. The idea is not current. We just had an important conference on Planck Scale in July! One of the most influential string theorists in Europe, Hermann Nicolai, gave a talk which I would recommend anybody to watch. Did he talk about "A.D."? No.
Some thirty-odd people gave papers about their ideas of the physics from here up to Planck. No discussion of A.D. at least that I'm aware of.

Steve Giddings talked about it.
http://www.ift.uni.wroc.pl/~planckscale/movie/index5.html
http://www.ift.uni.wroc.pl/~planckscale/lectures/5-Friday/1-Giddings.pdf
 
  • #93
atyy said:

Giddings talked about "asymptotic darkness"?
I looked through his slides and found no reference to it.
http://www.ift.uni.wroc.pl/~planckscale/lectures/5-Friday/1-Giddings.pdf

My impression is he was calling everything into question---including the idea that classical general relativity applies, and by implication the idea that black holes can form at Planck scale. I listened to some of his talk and it impressed me as an "outsider's" talk, he was saying all our present ideas are probably inadequate. He argued that we will need radically new ideas to understand physics at that scale. It was something of a "lone opposition" voice.

Since Giddings' talk was so disconnected from the rest, I would be reluctant to watch it again. But I did review his slides and saw nothing about "asym. darkness" and no reference to Tom Banks.
If you can point me to a slide, which I somehow missed, please do. Or to some point in the talk where he actually claims or assumes that high-energy physics is dominated by black holes. It would be in sharp contrast to the rest of what I've heard, and interesting to pinpoint.

BTW turns out there were some 46 speakers. I had the number wrong earlier, so I corrected it.
 
  • #94
marcus said:
Giddings talked about "asymptotic darkness"?
I looked through his slides and found no reference to it.
http://www.ift.uni.wroc.pl/~planckscale/lectures/5-Friday/1-Giddings.pdf

My impression is he was calling everything into question---including the idea that classical general relativity applies, and by implication the idea that black holes can form at Planck scale. I listened to some of his talk and it impressed me as an "outsider's" talk, he was saying all our present ideas are probably inadequate. He argued that we will need radically new ideas to understand physics at that scale. It was something of a "lone opposition" voice.

Since Giddings' talk was so disconnected from the rest, I would be reluctant to watch it again. But I did review his slides and saw nothing about "asym. darkness" and no reference to Tom Banks.
If you can point me to a slide, which I somehow missed, please do. Or to some point in the talk where he actually claims or assumes that high-energy physics is dominated by black holes. It would be in sharp contrast to the rest of what I've heard, and interesting to pinpoint.

BTW turns out there were some 46 speakers. I had the number wrong earlier, so I corrected it.

I was thinking of slide 6 of http://www.ift.uni.wroc.pl/~planckscale/lectures/5-Friday/1-Giddings.pdf , i.e. transplanckian collisions produce black holes.

Yes, maybe a lone ranger - but not as much as Markopoulou - I think her work about emergent locality is inspired by similar considerations.
 
  • #95
atyy said:
Yes, maybe a lone ranger - but not as much as Markopoulou - I think her work about emergent locality is inspired by similar considerations.

I take the first bit back - he's an alpinist!
 
  • #96
BTW, I'm not sure if I have this right, but I don't think Asymptotic Darkness was Tom Banks's idea - only the name was his.

The idea that transplanckian collisions produce black holes can be found eg.

http://arxiv.org/abs/gr-qc/9510063
Structural Issues in Quantum Gravity
Chris Isham
"This has been emphasised recently by several people and goes back to an old remark of Bekenstein: any attempt to place a quantity of energy E in a spatial region with boundary area A—and such that E > √A—will cause a black hole to form, and this puts a natural upper bound on the value of the energy in the region (the argument is summarised nicely in a recent paper by Smolin)."

http://arxiv.org/abs/gr-qc/9508064
The Bekenstein Bound, Topological Quantum Field Theory and Pluralistic Quantum Field Theory
Lee Smolin
"This suggests that, ultimately, a quantum theory of gravity will not be formulated most simply as a theory of fields on a differential manifold representing the idealized-and apparently nonexistent-“points” of space and time. To put this another way, the space of fields-the basic configuration space of classical field theory-has been replaced in the quantum theory by abstract Hilbert spaces. At the same time, ordinary space, in these formulations, remains classical, as it remains the label space for the field observables. This perpetuates the idealization of arbitrarily resolvable space-time points, that the results of string theory, non-perturbative quantum gravity and semiclassical quantum gravity (through the Bekenstein bound) suggest we must give up."

And more recently
http://www.ift.uni.wroc.pl/~planckscale/lectures/5-Friday/1-Giddings.pdf
http://www.damtp.cam.ac.uk/user/tong/string.html

Of course, this is handwavy, and AS is a direction suggested by Wilsonian renormalization, so we shall have to wait and see.
 
  • #97
atyy said:
I was thinking of slide 6 of http://www.ift.uni.wroc.pl/~planckscale/lectures/5-Friday/1-Giddings.pdf , i.e. transplanckian collisions produce black holes.
...

I see now what you were identifying with "asymptotic darkness." But as I see it, he's not proposing a scenario of what will happen at very high energies. He is certainly not claiming to know. I'd say he is trying to impress the other conferees with how little we know, how inadequate our ideas are.

I don't agree with Giddings and I was disappointed---he could have contributed more to the conference. But at least in any case he wasn't trying to sell them Tom Banks' old scenario of asymptotic darkness.
Notice in his picture he puts "BH" in quotes.

One can justifiably be skeptical of any scenario about what would happen in a hypothetical collision between an electron and positron each with E >> Planck (much larger than the Planck energy). Does anybody nowadays claim to know?

The Planck energy is enough to run an ordinary automobile well over 100 miles. Roughly equivalent to the energy in half a tank of gasoline, if I remember right. You probably know the exact figure, something like 2 billion joules? He's imagining you give each particle that much energy and have the two collide. He says that a successful theory of qg would be able to say what happens, either explain how a collision would be avoided---explain that collision is theoretically impossible---or describe the collision. His message is we don't know, and we don't even have a clue to the right concepts. He sets what I think is an impractically high bar and in effect discourages people from even trying. But I don't think the others paid much attention.
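For what it's worth, the "2 billion joules" figure is easy to check numerically. A short Python sketch (the gasoline energy density used below is a rough assumed value, roughly 34 MJ per liter):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2

# Planck energy: E_P = sqrt(hbar * c^5 / G)
E_planck = math.sqrt(hbar * c**5 / G)
print(f"Planck energy: {E_planck:.3e} J")  # about 1.96e9 J

# Rough comparison, assuming ~34 MJ of chemical energy per liter of gasoline
liters_equivalent = E_planck / 34e6
print(f"roughly {liters_equivalent:.0f} liters of gasoline")
```

So "something like 2 billion joules" is right; whether that counts as half a tank or a full tank depends on the car.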

Did you watch the first discussion session? It was led by Nicolai and was generally about quantum theories of gravity and matter---or Planck scale physics. Why needed? What history? Comparison of various research directions and current status. Then audience discussion. Robert Helling was the guy in the audience who made an angry speech where he kept dropping something onto the desk in front of him.
 
  • #98
marcus said:
I don't agree with Giddings and I was disappointed---he could have contributed more to the conference. But at least in any case he wasn't trying to sell them Tom Banks' old scenario of asymptotic darkness.

marcus said:
Did you watch the first discussion session? It was led by Nicolai and was generally about quantum theories of gravity and matter---or Planck scale physics. Why needed? What history? Comparison of various research directions and current status. Then audience discussion. Robert Helling was the guy in the audience who made an angry speech where he kept dropping something onto the desk in front of him.

I don't think the idea originated with Banks - it seems to go back to Bekenstein, and you can find it in papers by e.g. Isham, Smolin etc. I've put more detail in post #96.
 
  • #99
Atyy, thanks for pointing to these papers. I will try to comment.
atyy said:
http://arxiv.org/abs/gr-qc/9510063
Structural Issues in Quantum Gravity
Chris Isham
"This has been emphasised recently by several people and goes back to an old remark of Bekenstein: any attempt to place a quantity of energy E in a spatial region with boundary area A—and such that E > √A—will cause a black hole to form, and this puts a natural upper bound on the value of the energy in the region (the argument is summarised nicely in a recent paper by Smolin)."

http://arxiv.org/abs/gr-qc/9508064
The Bekenstein Bound, Topological Quantum Field Theory and Pluralistic Quantum Field Theory
Lee Smolin
"This suggests that, ultimately, a quantum theory of gravity will not be formulated most simply as a theory of fields on a differential manifold representing the idealized-and apparently nonexistent-“points” of space and time. To put this another way, the space of fields-the basic configuration space of classical field theory-has been replaced in the quantum theory by abstract Hilbert spaces. At the same time, ordinary space, in these formulations, remains classical, as it remains the label space for the field observables. This perpetuates the idealization of arbitrarily resolvable space-time points, that the results of string theory, non-perturbative quantum gravity and semiclassical quantum gravity (through the Bekenstein bound) suggest we must give up."

The Bekenstein bound is discussed here
http://www.scholarpedia.org/article/Bekenstein_bound
Happily enough Bekenstein himself is the curator of the Scholarpedia article about his bound.

The bound is independent of Newton's G. It relates the entropy S in a region to the energy E in the region and to the radius R of a ball containing the region.
S ≤ 2π R E.
Let's imagine we have adjusted units so hbar=c=1 and omit them, though the pedia article puts them in.

We also have a bound on the amount of energy you can pack into a region with radius R without getting a black hole. This is a well-known consequence of the Schwarzschild radius formula which goes back to the work of Karl Schwarzschild in 1916.
Going by what Wikipedia says, it took years for the idea of a black hole to become accepted. There were papers by Oppenheimer (1939) and Finkelstein (1958). Then a 1967 public lecture by Wheeler gave the term "black hole" wide currency.

This bound on the energy inside a finite region does not have an "official" name as far as I know. We could call it the Schwarzschild bound---and this DOES depend on the value of Newton's G. It is a bound on the amount of energy you can pack into a region with radius R, and is just a disguised form of the 1916 Schwarzschild radius formula which each of us must have seen countless times.

R_Schw = 2GM/c² or, in terms of the equivalent energy,
R_Schw = 2GE/c⁴ and then, since we set c = 1,
R_Schw = 2GE

I'm ignoring any effects of spin and charge, to keep things simple. So here is a bound on the amount of energy you can stuff into a ball with radius R without forming a Schwarzschild black hole. This bound on the energy is:
E ≤ R/(2G)

But the area of a ball is A = 4πR², so that R is proportional to √A.
Forgetting some constants like 2 and π, we can simply substitute √A for the radius R and write this as Isham does:
E ≤ √A.

So far that doesn't seem very interesting. Bekenstein and Isham and Smolin and the others are talking about something more subtle, involving entropy and the dimensionality of the Hilbert space of quantum states. Intuitively because the energy in a bounded region is bounded, so also are things like the entropy and information and state space dimensionality bounded as well.
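To make the bounds above concrete, here is a short Python sketch (SI constants restored; the consistency checks simply restate the formulas above, and the solar-mass example is just a familiar benchmark):

```python
import math

# SI constants
G = 6.67430e-11   # Newton's constant, m^3 kg^-1 s^-2
c = 2.99792458e8  # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """R_Schw = 2GM/c^2, ignoring spin and charge as in the text."""
    return 2 * G * mass_kg / c**2

def max_energy_in_ball(radius_m):
    """'Schwarzschild bound': the most energy (in joules) a ball of
    radius R can hold before its Schwarzschild radius reaches R,
    i.e. E = R c^4 / (2G)."""
    return radius_m * c**4 / (2 * G)

M_sun = 1.989e30  # kg
R_s = schwarzschild_radius(M_sun)
print(f"Schwarzschild radius of the Sun: {R_s:.0f} m")  # about 2950 m

# Consistency: packing exactly the bound's energy into a ball of
# radius R gives a mass whose Schwarzschild radius is R itself.
R = 1.0  # meters
E_max = max_energy_in_ball(R)
assert abs(schwarzschild_radius(E_max / c**2) - R) < 1e-9

# R is proportional to sqrt(A), since A = 4 pi R^2
A = 4 * math.pi * R**2
assert abs(math.sqrt(A / (4 * math.pi)) - R) < 1e-12
```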

=============
The above is kind of preamble. Maybe now we are getting to something more interesting.

What does this have to do with renormalization of gravity+matter, and in particular with the running of G and Lambda?

Well intuitively, as the cutoff k -> infty we get that G becomes negligible and Lambda gets large. This could actually prevent a black hole from forming!
Remember that the "Schwarzschild bound" on the energy in a given finite region depends on G. So if G is running---or more correctly, it is the dimensionless number G(k)k² which runs, converging to a finite fixed-point number G*---this could interfere with the bound in some very high energy regime.
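A toy numerical illustration of this running (purely illustrative: the logistic beta function and the fixed-point value g* = 2 below are assumed for the sketch, not Reuter's computed ones):

```python
# Toy beta function for the dimensionless coupling g(k) = G(k) k^2:
#   dg/dt = 2 g (1 - g / g_star),  with t = ln(k / k0).
# This is only an illustrative logistic flow, not the actual
# Einstein-Hilbert truncation result.
g_star = 2.0
g = 0.1          # weak coupling in the IR
dt = 1e-3
for _ in range(20000):  # Euler-integrate up to t = 20
    g += dt * 2 * g * (1 - g / g_star)

print(f"g at high k: {g:.4f}")  # approaches the fixed point g* = 2
# Since g = G(k) k^2 stays finite, G(k) itself falls off like g*/k^2,
# which is the sense in which G "becomes negligible" in the UV.
```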

Bonanno seems to be discussing this kind of thing in his most recent paper.
I should apologize if I've been grouchy earlier. I didn't think the avid discussion of "darkness" had much relevance to the main topic (Weinberg's recent talks and work on renormalization of gravity as a way to explain inflation.) But I now see that there is something interesting to discuss here.

Over the years I've seen many physics arguments that depend on this "Schwarzschild bound" on the energy (or mass) inside a finite region if collapse is to be avoided. What if that presumed "bound" is weakened? Which arguments are at risk of being compromised?
I'll try to get back to this later.
 
  • #100
marcus said:
I didn't think the avid discussion of "darkness" had much relevance to the main topic (Weinberg's recent talks and work on renormalization of gravity as a way to explain inflation.) But I now see that there is something interesting to discuss here.

Over the years I've seen many physics arguments that depend on this "Schwarzschild bound" on the energy (or mass) inside a finite region if collapse is to be avoided. What if that presumed "bound" is weakened? Which arguments are at risk of being compromised?
I'll try to get back to this later.

OK, here's a real diversion from condensed matter :biggrin::
http://arxiv.org/abs/0704.3906
Area laws in quantum systems: mutual information and correlations
M.M. Wolf, F. Verstraete, M.B. Hastings, J.I. Cirac

A more serious question:
Weinberg starts with the most general generally covariant action. But Krasnov has an even more general one. What is the difference? I might guess Weinberg has the most general generally covariant *local* action, but is Krasnov's non-local?
 
  • #101
atyy said:
A more serious question:
Weinberg starts with the most general generally covariant action. But Krasnov has an even more general one. What is the difference? I might guess Weinberg has the most general generally covariant *local* action, but is Krasnov's non-local?

Now you're talking! This paper of Weinberg's is awesome. I'm trying to focus on it.
There seem to be two parts, or two "stages", to the paper.
In the first he considers the "completely general generally covariant action" (middle of page 3)
He uses the symbol Lambda for the scale (not for cosmological constant). And he has an infinite series of couplings which run with Lambda. And he assumes he knows the beta functions for all these couplings and that they make the couplings converge to the fixed point. To actually calculate he would need to truncate, but he doesn't want to calculate, he wants to set up the formalism.

Then he goes to the second stage where he assumes the usual uniformity (homog and isotropy) associated with the classic Friedman model. It isn't clear to me at a level of detail how he gets from stage one to stage two---from the full theory to the symmetry-reduced theory called FRW model that is normally used in cosmology. But he makes that transition, and then he can start talking about inflation. That part begins on page 5.

It also isn't clear how you define scale in a background independent manner. I think he says at the top of page 5 that Lambda can be defined in either of two ways and it makes no difference which.

Either define the scale Lambda by limiting the loop diagrams at a certain level of complexity (akin to Rivasseau's idea of scale=complexity)
Or else define the scale as a momentum cutoff, which he notes is often denoted by letter k,
"a regulator term added to the action, or a sliding renormalization scale."

How exactly, if there is no background metric, does one define the scale?
I suspect this is just a minor problem, I may be the only one puzzled by it.

General covariance is a synonym for diffeomorphism invariance (as other parts of the community call it). Maybe someone can help us understand how the scale Lambda is defined in a diffeo invariant context.
 
  • #102
MTd2 said:
If we assume an analogy to the case of the gases at critical point, ...

MTd2, I would urge you to contact Raoul Abramo about Weinberg's paper. Just ask questions. Don't interject your ideas. I think this paper of Weinberg is important and hot right now. The people at USP will want to be discussing its implications. Especially Abramo will want to discuss this. Or so I guess. If you can, get him to explain the significance, as he sees it.

They may have a journal club at the IFT that meets every week to discuss new papers. A discussion may be coming up about this paper. They would probably allow you as an interested outsider to sit in at the informal discussion meeting.

I could be mistaken, but I think this is the thing to focus one's attention on right now.
 
  • #103
atyy said:
OK, I'm very confused. Is AS really incompatible with Asymptotic Darkness? AD means if you collide two things at high enough energy, you will form a big black hole, so the horizon will be pretty flat and semiclassical. I understand that AS seems to say that black holes will evaporate to a remnant (http://arxiv.org/abs/hep-th/0602159), whereas string theory seems to say black holes will evaporate completely (http://arxiv.org/abs/hep-th/0601001). But isn't that a different issue from AD?

The question of whether or not a black hole decays to a remnant or something like that is beside the point; why are we even talking about this? I just read the Bonanno-Reuter paper, and it says nothing about high energy quantum collisions, only that when the mass of the black hole is very low (after the Hawking radiation evaporates away most of the mass of the hole) it turns off. Quite the contrary, it seems to agree with the relatively pedestrian notion that a black hole forms when the mass M is large enough (and you can arrange for collisions of arbitrarily high energy in this little thought experiment, making the shock waves as big as you want, even making an astrophysical-sized one if you want) and that indeed it remains more or less classical in that regime.

And there the scaling argument comes into play, b/c a local conformal quantum field theory cannot satisfy an area law.

If, on the other hand, black holes do not form in the AS scenario at high energies (which I think none of the AS authors claim), then that point is indeed beyond the paper, and you are back to trying to show which of the generic assumptions fail in the AD arguments. For instance, why the 1-graviton exchange eikonal regime ceases to be well described semiclassically and why it doesn't dominate the density of states.
 
  • #104
Haelfix said:
The question of whether or not a black hole decays to a remnant or something like that is beside the point; why are we even talking about this? I just read the Bonanno-Reuter paper, and it says nothing about high energy quantum collisions, only that when the mass of the black hole is very low (after the Hawking radiation evaporates away most of the mass of the hole) it turns off. Quite the contrary, it seems to agree with the relatively pedestrian notion that a black hole forms when the mass M is large enough (and you can arrange for collisions of arbitrarily high energy in this little thought experiment, making the shock waves as big as you want, even making an astrophysical-sized one if you want) and that indeed it remains more or less classical in that regime.

And there the scaling argument comes into play, b/c a local conformal quantum field theory cannot satisfy an area law.

If, on the other hand, black holes do not form in the AS scenario at high energies (which I think none of the AS authors claim), then that point is indeed beyond the paper, and you are back to trying to show which of the generic assumptions fail in the AD arguments. For instance, why the 1-graviton exchange eikonal regime ceases to be well described semiclassically and why it doesn't dominate the density of states.

Yes, I agree. Let me just paraphrase to see if I've got what you're saying right: AD is a general argument going back to Bekenstein which suggests that if AS works, then something interesting is happening, maybe with the dimensionality or with asymptotically dS space. The Bonanno and Reuter papers don't address AD and are about something else.
 
  • #105
Haelfix said:
The question of whether or not a black hole decays to a remnant or something like that is beside the point; why are we even talking about this? I just read the Bonanno-Reuter paper, and it says nothing about high energy quantum collisions, only that when the mass of the black hole is very low (after the Hawking radiation evaporates away most of the mass of the hole) it turns off. Quite the contrary, it seems to agree with the relatively pedestrian notion that a black hole forms when the mass M is large enough (and you can arrange for collisions of arbitrarily high energy in this little thought experiment, making the shock waves as big as you want, even making an astrophysical-sized one if you want) and that indeed it remains more or less classical in that regime.

And there the scaling argument comes into play, b/c a local conformal quantum field theory cannot satisfy an area law.

If, on the other hand, black holes do not form in the AS scenario at high energies (which I think none of the AS authors claim), then that point is indeed beyond the paper, and you are back to trying to show which of the generic assumptions fail in the AD arguments. For instance, why the 1-graviton exchange eikonal regime ceases to be well described semiclassically and why it doesn't dominate the density of states.

Let me try and explain the situation for high energy scattering and black holes in AS.

Classically, when I have an energy E >> M_p located in a region of radius R < 2GE, a black hole will form, where M_p is the Planck mass and G is Newton's constant. But as E >> M_p we also have R_s >> l_p, the Planck length, where R_s is the radius of the black hole. So here we can neglect quantum gravity effects at the horizon and throughout most of the spacetime, apart from at the singularity. So the semi-classical approximation is still valid.

The black hole will then evaporate, and the semi-classical approximation will break down once the energy E of the black hole falls to the Planck scale, E ~ M_p. Here AS predicts that a remnant forms, which stops the black hole from evaporating further.

On the other hand, if we begin with an energy E ~ M_p in a region R < 2GE, where the curvature will be Planckian, we already cannot trust classical physics, and AS predicts a black hole will not form.

I think a key point here is when we have to worry about QG effects. Note that it is not when E >> M_p but when the density ~ E/R^3 is high; this follows from the Einstein equations, which relate the strength of the gravitational field to the energy density. If R ~ 2GE then density ~ 1/E^2, so the smaller the black hole mass, the more we need to worry about QG effects.

Another consequence of density ~ 1/E^2 is that it is indeed very "easy" to create black holes with a large energy whose formation can be described with classical physics.
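The scaling argument can be checked directly. A sketch in units with c = ħ = 1 and an assumed G = 1 (Planck units), taking the mean density of energy E inside its own Schwarzschild radius:

```python
import math

G = 1.0  # Planck units (c = hbar = 1)

def horizon_density(E):
    """Mean density of energy E packed inside its own Schwarzschild
    radius R_s = 2GE: rho = E / ((4/3) pi R_s^3), which scales as 1/E^2."""
    R_s = 2 * G * E
    return E / ((4.0 / 3.0) * math.pi * R_s**3)

# Doubling the energy cuts the horizon-scale density by a factor of 4:
ratio = horizon_density(1.0) / horizon_density(2.0)
print(f"rho(E) / rho(2E) = {ratio:.1f}")  # 4.0, confirming rho ~ 1/E^2
```

So the bigger the collision energy, the lower the density at horizon formation, which is why very large black holes form in an entirely classical regime.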
 
  • #106
The discussion has not been limited to black holes forming remnants. Bonanno's recent paper argues that BH simply do not form below a certain critical mass. This does not have to do with evaporation. But evaporation and remnants are also discussed in the same paper.
marcus said:
Right. Did you already cite Bonanno's recent paper? It's a good readable review and it mentions the 2000 result of Bonanno and Reuter to that effect.

http://arxiv.org/abs/0911.2727
Astrophysical implications of the Asymptotic Safety Scenario in Quantum Gravity
Alfio Bonanno
(Submitted on 13 Nov 2009)
"In recent years it has emerged that the high energy behavior of gravity could be governed by an ultraviolet non-Gaussian fixed point of the (dimensionless) Newton's constant, whose behavior at high energy is thus antiscreened. This phenomenon has several astrophysical implications. In particular in this article recent works on renormalization group improved cosmologies based upon a renormalization group trajectory of Quantum Einstein Gravity with realistic parameter values will be reviewed. It will be argued that quantum effects can account for the entire entropy of the present Universe in the massless sector and give rise to a phase of inflationary expansion. Moreover the prediction for the final state of the black hole evaporation is a Planck size remnant which is formed in an infinite time."
Comments: 28 pages, 6 figures. Invited talk at Workshop on Continuum and Lattice Approaches to Quantum Gravity. Sept. 2008, Brighton UK. To appear in the Proceedings

The point you were making is around the top of page 18. If the mass is below critical, no horizon exists.
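As a toy illustration of that claim (the running G(r) below is a hypothetical simplified form chosen for tractability, not the actual Bonanno-Reuter expression, and the parameters are set to 1 in Planck units): with G(r) = r²/(r² + 1), the RG-improved lapse f(r) = 1 − 2G(r)M/r never reaches zero below a critical mass.

```python
# Planck units, with assumed parameters G0 = omega = 1.
def f(r, M):
    """RG-improved lapse with a toy running Newton constant
    G(r) = r^2 / (r^2 + 1); the classical case would be G(r) = 1."""
    G_r = r**2 / (r**2 + 1.0)
    return 1.0 - 2.0 * G_r * M / r

def has_horizon(M, r_max=10.0, n=10000):
    """A horizon exists iff f(r) <= 0 somewhere outside the origin."""
    return any(f(r_max * (i + 1) / n, M) <= 0 for i in range(n))

# Analytically f_min = 1 - M (attained at r = 1), so M_crit = 1 here:
assert not has_horizon(0.9)  # sub-critical mass: no horizon forms
assert has_horizon(1.1)      # super-critical mass: horizon exists
print("critical mass ~ 1 Planck mass in this toy model")
```

The qualitative behavior (no horizon below roughly a Planck mass) is what Bonanno argues; the precise critical value depends on the real running of G.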

I'm skeptical when I hear talk of imparting transplanckian energies to two particles and having them collide and form a black hole. It's speculative and has no clear connection with Weinberg's paper.
 
  • #107
Finbar said:
Classically when I have a energy E>>M_p located in a region of radius R<2GE a black hole will form. Where M_p is the Planck mass G is Newtons constant. But as E>>M_p we also have R_s>>l_p the Planck length where R_s is the radius of the black hole. So here we can neglect quantum gravity effects at the horizon and throughout most of the spacetime apart from at the singularity. So the semi-classical approximation is still valid.

The Black hole will then evaporate and the semi-classical approximation will break down once the energy E of the black hole falls to the Planck scale E~M_p. Here AS predicts that a remnant forms which stops the black hole from evaporating further.

On the other hand if we take if we begin with an energy E~M_p in a region R<2GE, where the curvature will be Planckian, we already cannot trust classical physics and AS predicts a black hole will not form.

Isn't AD limited to the case where E>>M_p? For example, Tong's notes say "Firstly, there is a key difference between Fermi’s theory of the weak interaction and gravity. Fermi’s theory was unable to provide predictions for any scattering process at energies above sqrt(1/GF). In contrast, if we scatter two objects at extremely high energies in gravity — say, at energies E ≫ Mpl — then we know exactly what will happen: we form a big black hole. We don’t need quantum gravity to tell us this. Classical general relativity is sufficient. If we restrict attention to scattering, the crisis of non-renormalizability is not problematic at ultra-high energies. It’s troublesome only within a window of energies around the Planck scale." http://www.damtp.cam.ac.uk/user/tong/string/string.pdf

So it's that case which leads to the information paradox and the suggestion that maybe gravity cannot be a local quantum field theory unless something interesting happens.
 
  • #108
atyy said:
Isn't AD limited to the case where E>>M_p? For example, Tong's notes say "Firstly, there is a key difference between Fermi’s theory of the weak interaction and gravity. Fermi’s theory was unable to provide predictions for any scattering process at energies above sqrt(1/GF). In contrast, if we scatter two objects at extremely high energies in gravity — say, at energies E ≫ Mpl — then we know exactly what will happen: we form a big black hole. We don’t need quantum gravity to tell us this. Classical general relativity is sufficient. If we restrict attention to scattering, the crisis of non-renormalizability is not problematic at ultra-high energies. It’s troublesome only within a window of energies around the Planck scale." http://www.damtp.cam.ac.uk/user/tong/string/string.pdf

So it's that case which leads to the information paradox and the suggestion that maybe gravity cannot be a local quantum field theory unless something interesting happens.

This is exactly my point: "...the crisis of non-renormalizability is not problematic at ultra-high energies." When E >> Mpl the black holes are large and described by gravity in the IR. "It's troublesome only within a window of energies around the Planck scale."

AD is the assumption that gravity is not AS and hence gravity is not sufficiently strong to disallow black holes with a radius r<<lpl.


The information paradox is a different problem and AS still needs to deal with it. Personally I don't think the remnant picture is good enough if one assumes all the information is stored in the remnant and doesn't get out somehow.
 
  • #109
atyy said:
... It’s troublesome only within a window of energies around the Planck scale." http://www.damtp.cam.ac.uk/user/tong/string/string.pdf
...

I strongly agree. If there are any problems that are ready for us to confront they are on the way to Planck scale. This is the perspective that Nicolai adopted at the Planck scale conference. At Planck scale some new physics is expected to take over, his program is, if possible, to get all the way to Planck scale with minimal new machinery and have the theory testable.

And this range E < E_Planck is exactly where Bonanno's assertion applies. It is also where Roy Maartens and Martin Bojowald found, in 2005, that black holes could not form (given the Loop context).

We may in fact not have a problem. The sheer existence of black holes of less than Planck mass is questionable. There is no evidence that they exist, and there are analytical results to the contrary.

Finbar said:
This is exactly my point "...the crisis of non-renormalizability is not problematic at ultra-energies" when E>>Mpl gravity the black holes are large and described by gravity in the IR. "It's troublesome only within a window of energies around the Planck scale".
...

I agree strongly again. I'm glad you made these points.
 
  • #110
OK, looks like we all agree on the physics heuristics but maybe not the names of various hypotheses.
 
  • #111
marcus said:
How exactly, if there is no background metric, does one define the scale?
I suspect this is just a minor problem, I may be the only one puzzled by it.

General covariance is a synonym for diffeomorphism invariance (as other parts of the community call it). Maybe someone can help us understand how the scale Lambda is defined in a diffeo invariant context.

They use a particle physicist thing called the "background field method". You pick a background, but the background is arbitrary. Take a look at http://arxiv.org/abs/0910.5167's discussion beginning before Eq 56 "We can write g=background+h. It is not implied that h is small." up to Eq 59 "Also the cutoff term is written in terms of the background metric ... where is some differential operator constructed with the background metric."
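To see why the split is exact and the background arbitrary, here is a toy sketch of my own (plain Python, diagonal 4x4 metrics as lists; not code from the paper):

```python
# Exact split g = g_bar + h ("background field method"): pick ANY background
# g_bar and define the fluctuation as h = g - g_bar. Nothing requires h small.
g = [-1.0, 2.0, 2.0, 2.0]                    # diagonal of the "full" metric (toy numbers)

for g_bar in ([-1.0, 1.0, 1.0, 1.0],         # flat background
              [-2.0, 3.0, 3.0, 3.0]):        # a different, equally valid choice
    h = [gi - bi for gi, bi in zip(g, g_bar)]  # fluctuation, possibly order one
    assert all(bi + hi == gi for gi, bi, hi in zip(g, g_bar, h))  # split reproduces g exactly
    print(max(abs(hi) for hi in h))          # h is not small: prints 1.0 both times
```

The point of the toy: any choice of background gives back the same full metric exactly, which is the sense in which the method does not privilege one background.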

AS is basically not very rigorous (Rivasseau complained about this in a footnote in his GFT renormalization paper) and kinda hopeful, but my impression is that it's often that way in condensed matter. For example in Kardar's exposition at some point he says (I'm doing very free paraphrase) well, how do we know there are no non-perturbative fixed points - we don't, but luckily we can do experiments and they even more luckily match our perturbative calculations! He also says there are several different coarse-graining schemes which actually no one has proven are mathematically equivalent, but they all seem to match experiment, so we live in blissful ignorance! In condensed matter the predictions are "universal": for example the critical temperature is different for all sorts of materials and the theory cannot predict the temperature - what it gets right is the critical exponent, which seems to be independent of material and dependent only on symmetries and dimensionality. So I guess Weinberg and co are hoping for some such generic predictions.
 
Last edited:
  • #112
Just a note on possible confusion. When one says "high energy" in gravity it can be confused for "low energy" and vice versa. The reason is the following: Newton's constant is dimensionful. It has mass dimension [G]=-2 such that when I write GM this is a length or an inverse mass [GM]=[G]+[M] =-2+1=-1.

One consequence of this is the strange property of black holes that when I increase their mass their temperature drops, T=1/(8 pi G M), i.e. they have a negative specific heat.

Other consequences of [G]=-2 are that the entropy of a black hole goes as S=area/(4G), since G (in units hbar=c=1) is the Planck area, and the infamous power-counting non-renormalizability of general relativity.
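The two consequences above are easy to verify numerically. A sketch of my own, in Planck units (hbar = c = k_B = G = 1):

```python
import math

G = 1.0  # Newton's constant in Planck units; mass dimension [G] = -2

def hawking_temperature(M):
    """T = 1/(8*pi*G*M): temperature falls as mass grows."""
    return 1.0 / (8.0 * math.pi * G * M)

def bekenstein_hawking_entropy(M):
    """S = A/(4G), with horizon area A = 4*pi*r_s^2 and r_s = 2*G*M."""
    r_s = 2.0 * G * M
    return 4.0 * math.pi * r_s**2 / (4.0 * G)

# Doubling the mass halves the temperature: negative specific heat.
assert hawking_temperature(2.0) < hawking_temperature(1.0)
# Entropy grows like M^2, i.e. like the horizon area, not like the volume.
assert math.isclose(bekenstein_hawking_entropy(2.0),
                    4.0 * bekenstein_hawking_entropy(1.0))
```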
 
  • #113
Does AS really need a fixed point? Could it live with, say, a limit cycle?
 
  • #114
atyy said:
They use a particle physicist thing called the "background field method". You pick a background, but the background is arbitrary. Take a look at http://arxiv.org/abs/0910.5167's discussion beginning before Eq 56 "We can write g=background+h. It is not implied that h is small." up to Eq 59 "Also the cutoff term is written in terms of the background metric ... where is some differential operator constructed with the background metric."

AS is basically not very rigorous (Rivasseau complained about this in a footnote in his GFT renormalization paper) and kinda hopeful, but my impression is that it's often that way in condensed matter. For example in Kardar's exposition at some point he says (I'm doing very free paraphrase) well, how do we know there are no non-perturbative fixed points - we don't, but luckily we can do experiments and they even more luckily match our perturbative calculations! He also says there are several different coarse-graining schemes which actually no one has proven are mathematically equivalent, but they all seem to match experiment, so we live in blissful ignorance! In condensed matter the predictions are "universal": for example the critical temperature is different for all sorts of materials and the theory cannot predict the temperature - what it gets right is the critical exponent, which seems to be independent of material and dependent only on symmetries and dimensionality. So I guess Weinberg and co are hoping for some such generic predictions.

If you use the background field method rigorously then (slightly paradoxically) you actually ensure background independence. In a sense you are quantizing the fields on all backgrounds at the same time. Until recently, however, it had not been done rigorously enough.

The relevant paper is
http://arxiv.org/pdf/0907.2617

Also checkout

Frank Saueressig's talk at Perimeter.
 
  • #115
Finbar said:
Let me try and explain the situation for high energy scattering and black holes in AS.

Classically, when I have an energy E>>M_p located in a region of radius R<2GE, a black hole will form, where M_p is the Planck mass and G is Newton's constant. But as E>>M_p we also have R_s>>l_p, the Planck length, where R_s is the radius of the black hole. So here we can neglect quantum gravity effects at the horizon and throughout most of the spacetime, apart from at the singularity. So the semi-classical approximation is still valid.

The black hole will then evaporate, and the semi-classical approximation will break down once the energy E of the black hole falls to the Planck scale, E~M_p. Here AS predicts that a remnant forms, which stops the black hole from evaporating further.

On the other hand, if we begin with an energy E~M_p in a region R<2GE, where the curvature will be Planckian, we already cannot trust classical physics, and AS predicts a black hole will not form.

I think a key point here is when we have to worry about QG effects. Note that it is not when E>>M_p but when the density ~ E/R^3 is high; this follows from the Einstein equations, which relate the strength of the gravitational field to the energy density. If R~2GE then density ~ 1/E^2, so the smaller the black hole mass the more we need to worry about QG effects.

Another consequence of density~1/E^2 is that it is indeed very "easy" to create black holes with a large energy whose formation can be described with classical physics.

I agree with most of what you just said (some technical quibbles aside), which is why I'm now very confused about what we are arguing about. B/c that's exactly what asymptotic darkness says. At transplanckian center of mass energy densities, as you go further and further into the UV you expect larger and larger black holes to form, which by the above argument implies that you are getting closer and closer to classical GR and QG becomes less and less relevant. It's immaterial what happens at the Planck scale (or say within an order or two thereof). No one knows exactly what goes on there; it's only at much smaller energies, or conversely at much larger energies, where we enter regimes that we can actually calculate in.
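For concreteness, the density scaling in the quoted argument can be checked numerically; this is my own sketch in Planck units (G = 1), not code from any reference:

```python
G = 1.0  # Planck units

def mean_density(E):
    """Energy E packed inside its own Schwarzschild radius r_s = 2*G*E:
    mean density ~ E / r_s^3 = 1/(8*G^3*E^2), falling as 1/E^2."""
    r_s = 2.0 * G * E
    return E / r_s**3

# The bigger the black hole, the LOWER the density at formation,
# hence the more classical the physics describing it.
densities = [mean_density(E) for E in (1.0, 10.0, 100.0)]
print(densities)  # prints [0.125, 0.00125, 1.25e-05]
assert densities[0] > densities[1] > densities[2]
```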
 
  • #116
Agreement about "on the way" heuristics

marcus said:
I strongly agree. If there are any problems that are ready for us to confront they are on the way to Planck scale. This is the perspective that Nicolai adopted at the Planck scale conference. At Planck scale some new physics is expected to take over, his program is, if possible, to get all the way to Planck scale with minimal new machinery and have the theory testable.

And this range E < EPlanck is exactly where Bonanno's assertion applies. It is also where Roy Maartens and Martin Bojowald found, in 2005, that black holes could not form (given the Loop context).

We may in fact not have a problem. The sheer existence of black holes of less than Planck mass is questionable. There is no evidence that they exist, and there are analytical results to the contrary.
...

atyy said:
OK, looks like we all agree on the physics heuristics but maybe not the names of various hypotheses.

I think that's a good way to put it. IMO the reason for strong interest in the research community in what physics might be like in the range from say 10^9 TeV up to 10^16 TeV, is because of interest in high-energy astrophysics and the early universe.

The paradigm of colliding two particles at higher and higher energy, and equating that with physics, has become less interesting. It's a mental rut (almost an obsession) left over from the accelerator era. For example Weinberg was talking about inflation, which is a different business.

Different concepts, and different sources of data, come into play.

You could say that the range 10^9 TeV up to 10^16 TeV is the range from just over "cosmic ray" energy up to "early universe" energy.

A billion TeV is a kind of approximate upper bound on cosmic ray energies. It's quite rare to detect cosmic rays above that level. And 10^16 TeV is the Planck energy.
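For what it's worth, the 10^16 TeV figure for the Planck energy can be checked directly from the SI constants; a quick sketch of my own:

```python
import math

# Standard SI values (CODATA-style)
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m / s
G = 6.67430e-11          # m^3 / (kg s^2)
eV = 1.602176634e-19     # J per eV

E_planck_J = math.sqrt(hbar * c**5 / G)   # Planck energy in joules, ~1.96e9 J
E_planck_TeV = E_planck_J / eV / 1e12     # convert J -> eV -> TeV

print(E_planck_TeV)  # ~1.22e16 TeV, i.e. about 10^16 TeV as stated above
```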

I would say this is a new erogenous zone for theoretical physics. The putative "GUT" scale, of a trillion-plus TeV, comes in there. But it impressed me that in Nicolai's new model there is no new physics at GUT scale. What Nicolai and Meissner have done is project a model which

*is falsifiable by LHC (once it gets going) and
*is conceptually economical, even minimalistic---based on existing standard model concepts,
*pushes the breakdown/blow-up points out past Planck scale, so it
*delays the need for fundamentally new physics until Planck scale is reached.

Whether Nicolai and Meissner's model is correct is not the issue here. What this example suggests is that this kind of conservative unflamboyant goal, this kind of unBaroque proposed solution, will IMO likely become fashionable among theorists. You could think of it as a reaction to past excesses, or a corrective swing of the pendulum.

This same economical or conservative spirit is the essence of what Weinberg is doing.
The new paper of his that we are discussing simply carries through on what he was talking about in his 6 July CERN lecture, where he said he didn't want to discourage anyone from continuing string research, but string theory might not be needed, might not be how the world is. How the world is, he said, might be described by (asymptotic safe) gravity and "good old" quantum field theory.

I assume that means describing the world pragmatically out to Planck scale (10^16 TeV) so you cover the early universe. An important part of the world! :biggrin: And not worrying about whatever new physics might then kick in, if any does.
It's a modest and practical agenda, just getting that far, compared with worrying about putative seamonsters and dragons out beyond Planck energy. But of course that's fun and all to the good as well. :biggrin:

================================
In case anyone new is reading this thread, here is a link to video of Weinberg's 6 July CERN talk:
http://cdsweb.cern.ch/record/1188567/
It gives an intelligent overview of what this paper is about, where it fits into the big picture, and what motivates the Asymptotic Safe QG program (which he describes in the last 12 minutes of the video).

As a leading example of extending known and testable physics out to Planck scale, here is Nicolai's June 2009 talk:
http://www.ift.uni.wroc.pl/~rdurka/planckscale/index-video.php?plik=http://panoramix.ift.uni.wroc.pl/~planckscale/video/Day1/1-3.flv&tytul=1.3%20Nicolai
Here's the index to all the videos from the Planck Scale conference
http://www.ift.uni.wroc.pl/~rdurka/planckscale/index-video.php
 
Last edited by a moderator:
  • #117
Haelfix said:
I agree with most of what you just said (some technical quibbles aside), which is why I'm now very confused about what we are arguing about. B/c that's exactly what asymptotic darkness says. At transplanckian center of mass energy densities, as you go further and further into the UV you expect larger and larger black holes to form, which by the above argument implies that you are getting closer and closer to classical GR and QG becomes less and less relevant. It's immaterial what happens at the Planck scale (or say within an order or two thereof). No one knows exactly what goes on there; it's only at much smaller energies, or conversely at much larger energies, where we enter regimes that we can actually calculate in.

Ok so we're getting somewhere. The problem is exactly the one I was pointing out in my post yesterday...

"Just a note on possible confusion. When one says "high energy" in gravity it can be confused for "low energy" and vice versa. The reason is the following: Newton's constant is dimensionful. It has mass dimension [G]=-2 such that when I write GM this is a length or an inverse mass [GM]=[G]+[M] =-2+1=-1. "

So for the argument about the non-renormalizability of gravity, based on its scaling in the UV, to be valid, the "Asymptotic" in Asymptotic darkness needs to be the same as the "Asymptotic" in Asymptotic safety. The argument is false because they are not, for exactly the reason above.

If I have a large-mass black hole M>>Mpl then r=2GM is large, r>>lpl. This is what the "Asymptotic" in AD refers to, and as you say you get closer and closer to classical GR. But the "Asymptotic" in AS refers to exactly the opposite limit, namely when k>>Mpl, where k=1/r; this is where we are very far from classical GR and hence where we need a full theory of QG to answer any questions appropriately.

This is exactly the point David Tong is making

""Firstly, there is a key difference between Fermi’s theory of the weak interaction and gravity. Fermi’s theory was unable to provide predictions for any scattering process at energies above sqrt(1/GF). In contrast, if we scatter two objects at extremely high energies in gravity — say, at energies E ≫ Mpl — then we know exactly what will happen: we form a big black hole. We don’t need quantum gravity to tell us this. Classical general relativity is sufficient. If we restrict attention to scattering, the crisis of non-renormalizability is not problematic at ultra-high energies. It’s troublesome only within a window of energies around the Planck scale.""

So you see it's not the AD scenario that I'm arguing about. It's whether AD (an IR property of classical gravity) has any bearing on AS/renormalizability (which is a UV problem of quantum gravity).
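To make the two opposite "asymptotic" limits concrete, here is a small sketch of my own in Planck units (G = 1, so l_pl = M_pl = 1):

```python
G = 1.0  # Planck units: l_pl = M_pl = 1

def schwarzschild_radius(M):
    return 2.0 * G * M   # r_s = 2GM, grows with mass

def momentum_scale(r):
    return 1.0 / r       # k = 1/r, the scale probed at distance r

M = 1.0e6                    # heavy black hole, M >> M_pl
r = schwarzschild_radius(M)  # r >> l_pl: the "asymptotic" of AD (IR, classical GR)
k = momentum_scale(r)        # k << M_pl: far BELOW the UV fixed-point regime of AS

# Large mass probes the IR, not the UV where the AS fixed point would act.
assert r > 1.0 and k < 1.0
```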
 
  • #118
Finbar said:
...
So you see its not the AD scenario that I'm arguing about. Its that AD(an IR property of classical gravity) has any bearing on AS/renormalizablity(which is a UV problem of quantum gravity).

I was surprised anyone would bring up AD in this context. It seems like a red herring. Just distracts from considering the main burden of what Weinberg is doing.

Could it be that some people want to deny or dismiss the significance of AS suddenly coming to the forefront? It seems to me when something like this happens----greatly increased research, first ever AS conference, possible alliance with CDT and even Horava, connection with cosmology revealed---that the appropriate thing to do is to pay attention, and focus on it, not try to dismiss (especially not by handwaving about transplanckian black holes :biggrin:)

Haelfix, could you have been misled by someone with a vested interest that felt threatened by Weinberg's CERN talk, or recent paper, and is grasping at straws? or just blowing smoke? Be careful, maybe a bit more skeptical?
 
Last edited:
  • #119
Finbar said:
So you see it's not the AD scenario that I'm arguing about. It's whether AD (an IR property of classical gravity) has any bearing on AS/renormalizability (which is a UV problem of quantum gravity).

If AD suggests that gravity cannot be described by a "normal" local quantum field theory even in the IR, then it suggests that AS may be wrong - only suggests, since Wilsonian renormalization indicates AS is a logical possibility - in which case an interesting issue is in what way AS is not a "normal" local quantum field theory, even though the heuristic behind AS is that it is a "normal" local quantum field theory.

One thing I don't understand is that Weinberg's paper (the one being discussed in this thread) starts with the most general generally covariant Lagrangian (http://arxiv.org/abs/0911.3165) - but Krasnov has recently proposed an even more general generally covariant Lagrangian (http://arxiv.org/abs/0910.4028 ) - so presumably Weinberg's is less general - is that because Weinberg admits only local terms, while Krasnov's contains non-local terms? Usually renormalization flows don't generate non-local terms, I think, and naively I would expect the same for AS, but is that true?

Edit: Krasnov says his terms are all local - so what is the difference between his stuff and AS?

Litim's http://arxiv.org/abs/0810.3675 says "A Wilsonian effective action for gravity should contain ... possibly, non-local operators in the metric field." So I guess non-local terms can come about through coarse-graining, which is not intuitive to me - can someone explain? Also what are these terms, and did Weinberg include these?

Edit: As far as I can tell, Weinberg, as well as Codello et al, only included local (or quasilocal) terms. So what are these non-local terms Litim is talking about, and why would they arise?
 
Last edited by a moderator:
  • #120
http://relativity.livingreviews.org/Articles/lrr-2006-5/
"a canonical formulation is anyhow disfavored by the asymptotic safety scenario"

What!?
 
Last edited by a moderator:
