Aug2-07, 02:03 PM
As I see it, and I may be missing the point entirely, this is a moment of opportunity and change for LQG and related research.
In brief, the story is that for several years Reuter has been saying that Newton's G and the cosmological constant Lambda RUN with increasing k. He and others have been showing how this happens, but until now the QG consequences haven't been adequately studied.
The k parameter is an index of proximity and energy of interaction---its dimension is reciprocal length. As k --> infinity, says Reuter, G and Lambda converge to their BARE values (and the corresponding action converges to what he calls the bare action).
If LQG or any other non-string QG is to be considered a FUNDAMENTAL theory of the microscopic degrees of freedom where geometry and matter interact, then presumably it should be based on the bare Einstein-Hilbert action determined by the bare G and Lambda.
The bare G is less than the everyday value (something Reuter calls "anti-screening") and the bare Lambda is much larger than the measured cosmological constant.
The bare E-H action is what Reuter et al. have found to be the FIXED POINT toward which the renormalization flow converges. They seemingly do not have to put it in by hand: they specify what the theory is about and what its symmetries are, and they claim they get the bare E-H action out. In other words, it is not an assumption but a prediction of the theory.
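For concreteness, here is a toy numerical sketch of the fixed-point idea. This is my own illustration, NOT Reuter's actual Einstein-Hilbert truncation: the beta function beta_g = 2g - b*g^2 and the constant b are purely illustrative assumptions, chosen only because they reproduce the two qualitative features described above (canonical scaling at small coupling, saturation at a non-Gaussian fixed point).

```python
# Toy RG flow for a dimensionless Newton coupling g(t) = G(k) * k^2,
# with RG "time" t = ln k. The beta function below is schematic and
# assumed for illustration only; it is not the one Reuter derives.

def beta_g(g, b=1.0):
    """Schematic beta function: canonical scaling term 2g minus an
    assumed quantum 'anti-screening' term b*g^2 (b > 0)."""
    return 2.0 * g - b * g * g

def flow(g0, t_max=20.0, dt=0.01, b=1.0):
    """Integrate dg/dt = beta_g(g) with a simple Euler step,
    starting from a small infrared value g0."""
    g = g0
    for _ in range(int(t_max / dt)):
        g += dt * beta_g(g, b)
    return g

# The flow is driven toward the non-Gaussian fixed point g* = 2/b:
g_star = 2.0
g_uv = flow(g0=0.01)
print(abs(g_uv - g_star) < 1e-3)  # -> True

# Since g(k) -> g* as k -> infinity, the dimensionful coupling
# G(k) = g(k)/k^2 falls off at high k: the bare G comes out
# smaller than the everyday value, i.e. "anti-screening".
```

The point of the sketch is only that the fixed-point value is an output of the flow, not an input: whatever small g0 one starts from, the trajectory ends up at g*.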
If one takes seriously what Reuter says, then it seems as if LQG theorists should be using NOT the everyday (low-k) values of G and Lambda, which we can measure by macroscopic observation in our low-energy world, but rather the bare (high-k) values.
Until around June 27 I had been thinking that Reuter's work was very interesting---and I was glad that the LQG community included him as a plenary speaker at Loops 05 and the Zakopane QG school---but I wasn't thinking much about the consequences of having constants run. Reuter's June 27 talk at Loops 07 triggered something.
One event that rang a bell, or sounded an alert, was that, just a month after Reuter's talk, Ted Jacobson posted his Renormalization and Black Hole paper. It takes seriously the idea that G runs and that the bare value is different from the everyday one, and it seems to urge people to take stock of the consequences. And, whether it turns out to be significant or not, in this recent paper Jacobson cites a Reuter paper. He doesn't make a big deal of it, but he includes that gesture.
Curiously enough, this is not even new, in a sense. Reading LQG papers even from back in the 1990s, I recall having encountered references to the bare value of G---the awareness that G may vary with scale has always been there! But I don't think people back then had as clear a picture as they do now of HOW it runs. Reuter has plotted trajectories for both G and Lambda, so the whole business has a less abstract, more down-to-earth feel. Jacobson's paper exemplifies this---he shows how it can make a difference when one considers black hole entropy.
Jacobson's paper: http://arxiv.org/abs/0707.4026
Reuter's 27 June talk at Loops 07 (audio MP3):