# Grav. + GUT (Gravity from a Particle Physicist's perspective)

1. Oct 27, 2009

### marcus

http://arxiv.org/abs/0910.5167
Gravity from a Particle Physicist's perspective
R. Percacci
Lectures given at the Fifth International School on Field Theory and Gravitation, Cuiaba, Brazil April 20-24 2009. To appear in Proceedings of Science
(Submitted on 27 Oct 2009)
"In these lectures I review the status of gravity from the point of view of the gauge principle and renormalization, the main tools in the toolbox of theoretical particle physics. In the first lecture I start from the old question "in what sense is gravity a gauge theory?" I will reformulate the theory of gravity in a general kinematical setting which highlights the presence of two Goldstone boson-like fields, and the occurrence of a gravitational Higgs phenomenon. The fact that in General Relativity the connection is a derived quantity appears to be a low energy consequence of this Higgs phenomenon. From here it is simple to see how to embed the group of local frame transformations and a Yang Mills group into a larger unifying group, and how the distinction between these groups, and the corresponding interactions, derives from the VEV of an order parameter. I will describe in some detail the fermionic sector of a realistic "GraviGUT" with $$SO(3,1)\times SO(10) \subset SO(3,11)$$. In the second lecture I will discuss the possibility that the renormalization group flow of gravity has a fixed point with a finite number of attractive directions. This would make the theory well behaved in the ultraviolet, and predictive, in spite of being perturbatively nonrenormalizable. There is by now a significant amount of evidence that this may be the case. There are thus reasons to believe that quantum field theory may eventually prove sufficient to explain the mysteries of gravity."

Last edited: Oct 27, 2009
2. Oct 27, 2009

### marcus

Percacci is the chief organizer of the conference on Asymptotic Safety being held at Perimeter Institute a little over a week from now.
http://www.perimeterinstitute.ca/en/Events/Asymptotic_Safety/Asymptotic_Safety_-_30_Years_Later/ [Broken]

He has an AsymSafe FAQ at his website, and wrote the chapter on AsymSafe QG that appeared in Oriti's book Approaches to Quantum Gravity: Towards a New Understanding of Space, Time, and Matter, published by Cambridge U. P.

Percacci appears to be at the focus of current efforts to unify gravity with particle physics without inventing extra dimensions or extra degrees of freedom---simply using quantized general relativity in four dimensions and quantum field theory (more or less standard QFT) again on 4D.

We have been following this with increased attention ever since 6 July when Steven Weinberg gave a talk at CERN announcing his current participation in this line of research. I will get links for Weinberg's talk, for Percacci's FAQ, and for the upcoming conference at Perimeter.

Here is the video of Steven Weinberg's 6 July CERN talk:
http://cdsweb.cern.ch/record/1188567/
To save time jump to minute 58, the last 12 minutes---that is where he starts talking about his current research focus on AsymSafe as a possible avenue to unification also with applications to cosmology---offering a natural explanation for inflation.
Here's a condensed version of the Perimeter conference program listing speakers and talks:
Here is the AsymSafe FAQ:
http://www.percacci.it/roberto/physics/as/faq.html
This has a bibliography on AsymSafe with papers by Percacci and others, including his survey chapter in Oriti's book:
http://www.percacci.it/roberto/physics/as/
Here are the slides of a June 2009 talk on AsymSafe QG which Percacci gave at a school sponsored by Renate Loll's network:
http://th-www.if.uj.edu.pl/school/2009/lectures/percacci.pdf
Percacci is normally at the International School for Advanced Studies (SISSA) outside Trieste. However he is currently on leave from there; he spent all or part of the past academic year at Utrecht, and is spending the present semester at Perimeter Institute.

Last edited by a moderator: May 4, 2017
3. Oct 27, 2009

### RUTA

Great post, marcus. It would seem the particle physicists are throwing in the towel. They can't unify the Standard Model and GR with their approach to physics so they're starting to convince themselves that no real merger is possible :-)

4. Oct 27, 2009

### marcus

"GraviGUT"

RUTA I can't react so quickly. I need time to appreciate just what is going on. I don't see Percacci or Weinberg throwing in the towel (slang for "giving up").
I think maybe it wasn't too bright of the other particle physicists to try for so long to establish gravity on a fixed rigid background. It was careless of them to think of gravity as a force and to imagine that formulating gravitons on flat space was all that's needed, as if they could treat gravity like just another ordinary particle. Maybe that narrow-minded vision is now dying and that narrow program is being abandoned.

But the smart particle physicists like Percacci are not trapped in that narrow program. They see that gravity is dynamic geometry of a 4D continuum and that quantum gravity must be quantum dynamic geometry, again of 4D.

So the natural thing for particle physicists to do (the ones that "get it") is take a quantum version of Gen Rel and build QFT on that dynamic 4D geometry.

They are "not giving up", on the contrary they may be the winners of the game, because they have the know-how to re-build QFT on the new basis.

Percacci and Weinberg are not the only particle physicists who have boarded this train. There is also Daniel Litim. Various others. It is not just relativists now---new players have arrived. I think Arkady Tseytlin was formerly a string theorist--I guess he would count as a particle physicist. Benjamin Ward is a particle physicist. These are people invited to give papers at the Perimeter conference taking place in a few days. I have a feeling that if I checked MOST of them would turn out to be particle physicists. Yes a small number compared with the huge mass of particle theorists (string and other) but it is always the small active minority that starts change and does the real stuff.

Last edited: Oct 27, 2009
5. Oct 27, 2009

### atyy

Asymptotic safety has long been a logical route to investigate from the "particle physics" or Wilsonian viewpoint. In fact, from the "condensed matter" viewpoint, 1976 was already late to the game---conceptually, not calculationally: to this day, asymptotic safety has been neither proved nor disproved on firm mathematical ground. If you read Polchinski's string text, you will find asymptotic safety respectfully mentioned. Wilsonian renormalization has roots in work done by particle physicists including Stueckelberg and Petermann, and Gell-Mann and Low. Wilson himself was a particle physicist, who did his most famous work on critical phenomena in condensed matter, an area in which Kadanoff and Fisher are key names too. The clarity of the Wilsonian framework was so powerful that Weinberg was led to suggest asymptotic safety when he was trying to teach himself what his "statistical brethren" had achieved: http://ccdb4fs.kek.jp/cgi-bin/img/allpdf?197610218 [Broken]

This Wilsonian framework is nowadays part of standard coursework. Take Kardar's lectures for example http://ocw.mit.edu/OcwWeb/Physics/8-334Spring-2008/LectureNotes/index.htm [Broken].

In L7, he writes "the RG procedure is sometimes referred to as a semi-group. The term applies to the action of RG on the space of configurations: each magnetization profile is mapped uniquely to one at larger scale, but the inverse process is non-unique as some short scale information is lost in the coarse graining. (There is in fact no problem with inverting the transformation in the space of the parameters of the Hamiltonian.)" The "semi-group" comment is a reference to emergence; the parenthetical comment is a reference to possible asymptotic safety.
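The semi-group point can be illustrated with a toy block-spin map (my own minimal sketch, not taken from Kardar's notes): a majority-rule coarse-graining sends each 2x2 block of spins to a single spin, and two distinct microscopic configurations can land on the same coarse one, so the map on configurations has no inverse.

```python
import numpy as np

def coarse_grain(spins):
    """Majority-rule block-spin map: each 2x2 block -> one spin.
    Ties are broken in favor of +1 (an arbitrary convention)."""
    n = spins.shape[0] // 2
    out = np.empty((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            block_sum = spins[2*i:2*i+2, 2*j:2*j+2].sum()
            out[i, j] = 1 if block_sum >= 0 else -1
    return out

# Two distinct microscopic configurations...
a = np.array([[1,  1], [ 1, -1]])
b = np.array([[1,  1], [-1,  1]])

# ...map to the same coarse configuration: short-scale information
# is lost, so the RG acting on configurations is only a semi-group.
print(coarse_grain(a), coarse_grain(b))
```

This is exactly the non-invertibility Kardar refers to: the map on configurations is many-to-one, even though the induced map on the Hamiltonian's parameters can be inverted.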

In L12, non-perturbative fixed points are considered, and it is noted that by luck, perturbative theory is sufficient to calculate in the particular physical case being discussed: "The uniqueness of the critical exponents observed so far for each universality class, and their proximity to the values calculated from the epsilon–expansion, suggests that postulating such non-perturbative fixed points is unnecessary."

Last edited by a moderator: May 4, 2017
6. Oct 28, 2009

### marcus

Thanks for the encouragement, RUTA! I don't know how familiar you are with the history of the A.S. idea. Several of the usual papers go over it---by Reuter, by Percacci, and recently by Weinberg looking back.

You might find it entertaining to see what our discussion of it at Physicsforums was like back in 2007. Then people were less aware and it was more difficult to get attention focused on it. I started a thread called What if Weinberg had succeeded in 1979?

The point is he thought of A.S. in 1976, and gave lectures on it---e.g. at Erice. And he tried to make it work but didn't have the math tools to tackle the 4D case. He wrote about it in 1979, in a chapter of a book edited by Hawking, celebrating the Einstein centennial.
Then he gave up on it.

Martin Reuter revived the approach in 1998, using some new mathematical techniques. But Weinberg still stayed on the sidelines, until according to him, he saw a 2006 paper by Percacci, which convinced him it had enough chance of being right to be worth pursuing.

I tried to imagine how things might have gone if Weinberg had gotten the result he wanted in 1979 (and the field didn't have to wait 20 years for Reuter to revive it). It's a bit of an odd way to approach it. But back in 2007 it was harder to get a conversation started about Asymptotic Safety.

In any case the subject has had quite an interesting history.

I'm interested in how you imagine a "real merger". To me Asymptotic Safe QG does seem to offer the possibility of a real merger, within the context of a quantum field theory. But I may be wrong, or missing something important. I'd like to hear your take on it. (I'm still trying to assimilate this most recent paper of Percacci's---it may take me a while.)
=========================

In case anyone else is reading: newcomers may be confused by what Percacci says in the abstract "in spite of being perturbatively nonrenormalizable." This should not be taken to mean that the theory is nonrenormalizable in general---only when the wrong methods are used. The moral is, don't use perturbative techniques on gravity---they won't work---but other methods will.
Asymptotic safety has been described (by Percacci, Reuter and others) as nonperturbative renormalizability. An A.S. theory becomes predictive to arbitrarily high energy once a finite number of parameters have been determined by experiment--which is the practical consequence of renormalizability, whatever the context and the methodology being applied.

Last edited: Oct 28, 2009
7. Oct 28, 2009

### arivero

8. Oct 28, 2009

### apeiron

Marcus - can I ask for a simple explanation of how asymptotic safety works? And I did read the FAQ!

Is it simply that Reuter found within flexi-GR 4D geometry a circumstance in which 4D breaks down into fractal 2D approaching the Planck scale---and this would then reduce the available directions for quantum gravity self-action at this scale?

If so, what was the reason for this crumbling into 2D?

With CDT, though this was uncertain, it seemed to me that it must be something coming from the quantum side of the model so to speak. At smallest scale, direction becomes a confused issue and so you only have 2D actions (a vector against the backdrop of a foamy context rather than a vector going in one direction, and so also quite definitely not in the other two). Bit like the grin left behind by the disappearing Cheshire Cat.

The same idea would seem to fit Reuter's approach. Or have I got completely the wrong end of the stick here?

9. Oct 28, 2009

### RUTA

But you have to admit, AS is nowhere near as ambitious as the unification of gravity with GUTs to get SUTs. I don't see any reason for taking gravity out of the mix given their paradigm of particles and forces. It's a fallback position.

Last edited: Oct 28, 2009
10. Oct 28, 2009

### humanino

How do you know Nature's ambition ? What matters is that we, as a community, have different groups pursuing all different logical possibilities (that we are aware of).

11. Oct 28, 2009

### marcus

I'm glad you read the FAQ! I think Percacci did a great job with it. Hope you agree. It has 40 Q&A items, several with links to more technical explanation. Even though he is careful to keep the answers at a basic simple level, there is still a lot to think about. I have not read the whole FAQ myself---some parts read, some I've only skimmed.

When you ask "how it works" and then try out some mental imagery of geometry at extremely small scale, then what I believe you are asking is how should we picture the microstructure of geometric relationships?
How should we imagine the microstructure so that it would behave as A.S. says it should behave?

For example, an issue you raise, is how to picture microstructure so that it would have spontaneous dimensional reduction.

Steve Carlip has a recent paper discussing how spon. dim. red. arises in several different types of QG. He weaves these separate occurrences of it in separate theories into one picture and then describes a heuristic classical GR reason for it. Not to stress this too much, but if you are curious about how Carlip addresses this question, and haven't seen his paper, here it is: http://arxiv.org/abs/0909.3329

I liked your mental imagery suggesting how spon.dim.red. might happen. I actually felt some physical intuition, as a kind of electricity, in those verbal images. I also liked Steve Carlip's suggestive classical GR analysis, which is surprising. I don't think I have anything better to offer---but I could try (and have tried in the past) to come up with some explanatory visions of microgeometry.

==================

I think what the Percacci FAQ is saying---just to focus attention on that---is that to understand fundamental microgeometry you have to give up the idea of it being metric. To the extent that it is describable or representable by a metric, you must be prepared to have the metric be energy dependent---to run with scale (down near the Planck level).

You may remember down around questions #32 or 33 in the FAQ where he talks about this.

To me this seems related to the nonmetric approach to QG that Kirill Krasnov has set in motion. Having many metrics, but no one particular metric, and having the metric able to run, to depend on scale---perhaps even making the basic item something else besides a metric: a differential form subject perhaps not to the original Einstein equation but to a variant of the Plebanski action. Krasnov says that one of his motivations is to enable spinfoam QG to come to terms with renormalization.

12. Oct 28, 2009

### RUTA

Sorry, I fail to see the relevance of your post to mine.

13. Oct 28, 2009

### humanino

Sure, sorry.

14. Oct 28, 2009

### apeiron

Thanks Marcus, but is this bit correct? I really am struggling with the jargon in the FAQ.

The other thing that interested me is that Percacci seems to have both Newton's constant and the cosmological constant running to hit a fixed point. So two parameters that must intersect.

Does the cosmo constant actually run - have QM self-interactions?

There does seem a logic in a connection between the two constants as g sort of represents a spatial parameter - spatial curvature - and the cosmo constant a time-like parameter, expansion or growth of space.

15. Oct 28, 2009

### marcus

I think that's right. When I have needed to paraphrase it I've said much the same thing as you did.
I would advise reading Steve Carlip's recent paper to get a classical GR version of how that might happen, and a comparison of how spontaneous dimensional reduction happens in the various QG models (Reuter, Loop, Loll etc.). His classical GR discussion of it is the most graphic---although it has to be merely heuristic, since classical GR would not really apply.

Here is a halfbaked analogy to think about (with a grain of salt). Take a 2D sheet of paper.

Crumple it into a ball. As you crumple, it gradually turns into a 3D object.

If you had 4D hands and lived in 4D space, you could continue to crumple it and it would gradually become a 4D ball, but we don't live in 4 spatial dimensions, so let's not think about that.

Let's think of the 3D ball, the wad of crumpled paper. Let's do an X-ray CAT scan. Let's do tomography. Let's examine the internal structure by imaging.

If our imager is low-resolution---if our CAT scan is blurry---then we will look inside and determine that it is 3D, just as it looks on the outside.

But now let's zoom in. Let's gradually increase the resolution. After a while we can begin to see that this 3D ball is really a foam made of 2D surface.

At a macro scale of a centimeter---the mass of material within a given radius of a point varies as the CUBE of the radius.
The density behaves like an ordinary 3D density.

But at less than a millimeter scale---the mass of material within a given radius typically varies as the SQUARE of the radius---or as some power between square and cube, because of occasionally including the paper of some nearby wall when walls are close together.

We can measure dimensionality by seeing how volume relates to radius. So dimensionality in this wad of paper can be empirically determined, and it depends on scale.

So since this happens even with ordinary crumpled paper, it shouldn't be surprising if it happens with the geometry of space. Empirically measured dimensionality must depend on scale and probably depends fairly continuously---getting larger with larger scale and smaller with smaller scale.

Yes this is a dumb simple example---not really how it works etc etc. But you can read Carlip for more sophisticated discussion.
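The mass-versus-radius test described above can be sketched numerically (my own illustration with synthetic point clouds, not a simulation of crumpled paper): count the points within a radius r of the center and extract the dimension d from N(r) ~ r^d via a log-log slope.

```python
import numpy as np

rng = np.random.default_rng(0)

def mass_dimension(points, r_small, r_large):
    """Estimate the dimension d from N(r) ~ r^d: count points within
    two radii of the centroid and take the log-log slope."""
    center = points.mean(axis=0)
    dist = np.linalg.norm(points - center, axis=1)
    n_small = (dist < r_small).sum()
    n_large = (dist < r_large).sum()
    return np.log(n_large / n_small) / np.log(r_large / r_small)

# A flat 2D sheet embedded in 3D (z = 0)...
sheet = np.c_[rng.uniform(-1, 1, (100000, 2)), np.zeros(100000)]
# ...versus a filled 3D region with the same number of points.
ball = rng.uniform(-1, 1, (100000, 3))

print(mass_dimension(sheet, 0.2, 0.8))  # close to 2
print(mass_dimension(ball, 0.2, 0.8))   # close to 3
```

The same estimator applied at different radii on a crumpled sheet would interpolate between the two answers---dimension as a scale-dependent, empirically measured quantity, which is the point of the analogy.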

Last edited: Oct 28, 2009
16. Oct 28, 2009

### marcus

Apeiron this line of questioning is gold. I really like this post. Don't have time to fully respond.

However if you read Percacci carefully (or any of Reuter's papers) you see explicitly stated that only dimensionless constants run. G is a physical quantity, not a number. So what they have to study is what Percacci calls G-tilde, the dimensionless version of Newton's G. Remember k is the cutoff, an energy. We will take k to infinity.

˜G = G k². Here both k and G are varying; I should write G(k) instead of plain G, to show this.

This ˜G is what goes to a UV limit. And also ˜Lambda = Lambda/k²

k, being an energy, is the reciprocal of a length. But the cosmo constant Lambda is the reciprocal of an area. So dividing Lambda by k² gets you a pure number.
Percacci tells you in the paper what limit ˜Lambda converges to as k-> infinity.

And dimensionally speaking, Newton's G is a length divided by an energy. And k² is an energy divided by a length. So multiplying G by k² again gets you a pure number. And Percacci tells you what number that converges to.
These are absolute universal numbers which do not depend on the system of units.

Now he also talks about the physical quantities G(k). The value of Newton G at various scales k.
Before, I should have written ˜G = G(k) k²

but you understood I meant that. Because both k and G(k) are changing. It is the dimensionless pure number ˜G that goes to a limit as k-> infinity.

The behavior of ˜G we know. G(k) is a constant physical quantity over a long, long range of small and moderate energies k. Newton told us this already. So that means ˜G must be increasing like k². But then when k gets up near the Planck scale the behavior changes and ˜G starts to converge to a finite number. That means that G(k) has to decrease!

So the G(k) relevant to the big bang, or big bounce as some people model it, would be a much smaller physical quantity than what we are used to. G(here and now) >> G(bang).
To me it is not clear that the comparison is even meaningful because conditions were so different. So I don't put much weight on that comparison.

However, by the same flimsy uncertain reasoning, Lambda(k) would be constant over a long, long range of k, but then as k gets "Plancky" and Lambda(k)/k² starts to converge, it must be true that Lambda(k) gets very, very big! This would only happen when k is very near the Planck scale. It offers a possible explanation of a brief episode of inflation.
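The running of ˜G toward a fixed point can be caricatured with a toy beta function (a sketch of my own, NOT the actual Einstein-Hilbert truncation result; the coefficient `a` below is a placeholder, and the real fixed-point values come from truncated ERGE computations): the canonical term 2˜G drives the growth ˜G ~ k² at low energies, and a quantum correction −a˜G² makes it saturate.

```python
# Toy RG flow for the dimensionless Newton coupling gt = G(k) * k².
# beta(gt) = 2*gt - a*gt**2 has a non-Gaussian fixed point at gt* = 2/a.
# The constant a = 1.0 is a made-up placeholder, not a computed value.
a = 1.0

def beta(gt):
    return 2.0 * gt - a * gt**2

# Integrate d(gt)/d(ln k) = beta(gt) upward from a tiny IR value
# with a simple Euler step, over 20 e-folds of scale.
gt = 1e-6
dt = 1e-3
for _ in range(20000):
    gt += dt * beta(gt)

print(gt)  # flows to the fixed point gt* = 2/a = 2.0
```

Reading the result back in marcus's terms: while gt is tiny it grows like k² with G(k) essentially constant (the Newtonian regime), and once gt saturates at gt*, the physical G(k) = gt/k² must fall like 1/k².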

17. Oct 28, 2009

### MTd2

I guess I am finally understanding a bit why this is called nonperturbative renormalization. Suppose I could magically write down infinitely many counterterms to get away with the divergences. Although there were infinitely many coupling terms, they would all magically conspire to slide to a stable value within a finite-dimensional surface, made of observable eigenvectors.

I will give a humble opinion of mine. I have issues with calling this non-perturbative because perturbative methods are used in all aspects of this idea. So, I'd rather call this either:

*Collective Renormalization
*Dynamical Self-Renormalization
*Collective Dynamical Self-Renormalization
*Orbit Renormalization
*Attractor Renormalization
*BEC Renormalization (referring to the emergence of collective structures in low temperature materials).
*BES Renormalization (Bose Einstein Surface, to correct a misleading idea of the above item)

But never non-perturbative renormalization. This is quite a confusing and misleading name... at least for me

18. Oct 29, 2009

### Fra

question on motivation

I apologize for this ignorant question, but I have not so far looked too deeply into these programs, due to rejection of some of the starting points. Still, I definitely see some large potential in trying to make more sense out of the renormalization ideas, which I can connect to at a deeper level.

The "space of actions" that we are talking about, does in my view sort of correspond to the space of observer (or inference systems). When you change the observational scale, that certainly means the actual observing context changes.

So the deeper idea here is merely a special case of the general idea of connecting the laws of physics (as, say, encoded in action functionals) between two observers. This would, IMHO, suggest the renormalization scheme itself (including the ERGE and the space of actions) is part of the real physics and not just a mathematical tool, because a bit more abstractly one can imagine that the "renormalization" is automatically done by nature all the time. In other words, the renormalization rules become on par with the normal physical laws, and thus there is an "action" also in the "action space", which conceptually one would EXPECT (at least I do) to be unified when this is fully understood.

So I seek the inside view of this, and then it seems a key question is certainly how to CONSTRAIN the mathematically infinite, fantasized space of actions to a more "physical inside view" of DISTINGUISHABLE possible actions?

As it seems Reuter has done something like this; he somehow truncates the picture here. But my question is whether anyone can point me to where this is motivated. I.e., does he do this simply because it's the only way to make real computations (which is certainly a rational reason), or does he motivate it more deeply, in the sense that this "computability" is actually rooted in the constraints of nature itself, in particular the complexity of observers?

If one would be able to go this route, I see plenty of possibilities, including complete TOE-style unification also of matter.

I'm sorry if this is a stupid question to the AS experts, but I never really went into depth on this. So I wonder if there are some more promising ideas (like the one I seek) hidden somewhere in the current research, but that aren't obvious from the basic premises and introductions to these research programs?

/Fredrik

19. Oct 29, 2009

### Fra

Re: question on motivation

In particular, I would expect even a connection to evolving law, where a physical view of the renormalization flow could relate to a flow of evolution of law, and also, by connecting the constraining context to observers/matter, to evolution and emergence of matter? So matter and law emerge together, in the sense that the more "non-trivial" matter systems emerge to play the role of inside observers, the larger the distinguishable "space of actions" becomes?

Then the truncation could be given a physical motivation, as constraints coming from the context of being encoded in emergent matter?

Then the stable actions, would similarly correspond to stable matter, since the stable actions are then "preferred images" implicit in the observing system?

Anyone making similar associations to AS topic?

/Fredrik

20. Oct 29, 2009

### MTd2

Truncation is just an approximate correction to the full perturbed action. In this case of AS it just shows that higher orders of the truncated action correction do not add anything qualitatively, once one gets enough terms to find the safe surface. The lowest order suffices, thus the name "non perturbative renormalization".

There is nothing that is straightforwardly deep in this method, in the way you imagine. The ERGE and the flow are indeed physical in this case, even more so, in certain ways, than in the case of Yang-Mills theory, because it is not just physicists trying to dig something out of diagrams. It is the couplings of the theory dynamically cooperating and organizing somehow among themselves to find a stable point on a surface, all of which ends up causing the renormalization of the theory.