Time Dependence in Quantum Gravity: Recent Bojowald Paper

In summary, the thread discusses the difference between classical and quantum theories with respect to time. In the canonical approach, where the Hamiltonian is a constraint, time appears as a gauge degree of freedom; yet after roughly 100 Planck steps a smooth semiclassical time emerges as an approximation that works very well.
  • #1
marcus
Science Advisor
Gold Member
Dearly Missed
want to read together this recent Bojowald paper?
Time Dependence in Quantum Gravity
http://arxiv.org/abs/gr-qc/0408094

Bojo is at the Albert Einstein Institute (AEI-Golm), one of
the premier places for QG. It looks not too hard
and it clarifies things about time in QG. So it could be
useful to assimilate.

Besides Bojowald the other two authors are
Parampreet Singh (IGPG-Penn State) and
Aureliano Skirzewski (AEI-Golm).

Meteor probably posted the link for this when it came out
but I only just now realized the special value of it.
 
  • #2
marcus said:
want to read together this recent Bojowald paper?
Time Dependence in Quantum Gravity
http://arxiv.org/abs/gr-qc/0408094

Bojo is at the Albert Einstein Institute (AEI-Golm), one of
the premier places for QG. It looks not too hard
and it clarifies things about time in QG. So it could be
useful to assimilate.

Besides Bojowald the other two authors are
Parampreet Singh (IGPG-Penn State) and
Aureliano Skirzewski (AEI-Golm).

Meteor probably posted the link for this when it came out
but I only just now realized the special value of it.

OK, I have saved the paper and read the intro and the first section on Marolf's group averaging. I worked through the examples. Do we have any discussion points on this part of the material? Group averaging was also used by Thiemann in his string quantization, and we talked about Marolf's paper then.
 
  • #3
selfAdjoint said:
OK, I have saved the paper and read the intro and the first section...

thanks! I thought already in the introduction there were some things people might want to discuss. I'd like to make it as inclusive as possible---minimal prerequisites.

You may already be too advanced to be interested in what i have in mind, selfAdjoint :smile:

Frankly I found the fourth paragraph of the introduction already interesting, where it says "A particularly striking difference between the classical and the quantum theory is the issue of time..."

this is at the bottom of page 2, which is really the first page of the article because page 1 is just a cover-page containing the abstract and listing the authors.
 
  • #4
Yes, I find it interesting that he brings in relational time, and then (it seems to me regretfully) rejects it because it can't be fitted into the math. What do you think about his next statements, that in the canonical approach where the Hamiltonian is a second class constraint, time is just a gauge degree of freedom? The implication is that if you fix the gauge, as one does, you get no time.
 
  • #5
selfAdjoint said:
Yes, I find it interesting that he brings in relational time, and then (it seems to me regretfully) rejects it because it can't be fitted into the math. What do you think about his next statements, that in the canonical approach where the Hamiltonian is a second class constraint, time is just a gauge degree of freedom? The implication is that if you fix the gauge, as one does, you get no time.

I'm glad to have someone to think about these things with. I think we are at a place where intuition can originate or be cultivated.

Relational time works, as he says. And it is used for calculating the jumps from geometry to geometry at Planck scale. I remember how surprised I was when i read Bojo's 2001 paper about no-bigbang-singularity and the followup one by Bojo, Ashtekar, Lewandowski that did the same thing with more detail and rigor.

the hamiltonian constraint H Psi = 0 was a difference equation that showed the evolution of the universe geometry in little steps, Planck step by Planck step. And there was no time. They were using the size of the universe as the evolution parameter to correlate other stuff to.
The size of the universe was their clock.
At a fundamental level you have to have something real to be your clock and you correlate to that.

However Time kicks in after about 100 Planck steps. Amazingly the approximation to the semiclassical, the differential equation, the familiar time-evolution, is very very good after on the order of 100 jerky geometric jumps. And it is smooth sailing.

So Time is an abstraction, an approximation, that works if you don't try to parse evolution finer than about 10^-40 second, or 10^-41 second, that is, finer than about 100 Planck time units.

If you try to be fundamental you do not see a physically meaningful, MEASURABLE, time function----you have to make an arbitrary choice of something measurable, like the scale parameter of the universe, and use it to keep track of the other measurable things. You arbitrarily choose something to be your clock. there is no ultimate criterion of steady running!
this is hard to swallow (or so i found)

but if you can wait you get a very good approximate time, idealization, not fundamentally physical, but practically approximated by possible-to-construct realistic clocks. It is only at the fundamental scale that you can't have this nice time to which we are so accustomed.
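
The convergence being described can be seen in a toy model. The following is my own illustration, not the paper's actual equations: a generic second-order difference equation (standing in for the jerky Planck-step evolution) tracked against the smooth differential equation it approximates. The step size `eps` and frequency `k` are arbitrary stand-in values.

```python
import numpy as np

# Toy stand-in for the discrete evolution: the recurrence
#   psi_{n+1} - 2 psi_n + psi_{n-1} = -(k*eps)**2 * psi_n
# is a step-by-step version of the smooth equation psi'' = -k**2 psi.
def discrete_evolution(n_steps, eps, k):
    psi = np.empty(n_steps)
    psi[0] = 0.0
    psi[1] = np.sin(k * eps)      # match the smooth solution's start
    for n in range(1, n_steps - 1):
        psi[n + 1] = 2.0 * psi[n] - psi[n - 1] - (k * eps) ** 2 * psi[n]
    return psi

eps, k, steps = 0.01, 1.0, 2000   # arbitrary illustrative values
psi = discrete_evolution(steps, eps, k)
smooth = np.sin(k * eps * np.arange(steps))  # the "semiclassical" answer

err = np.max(np.abs(psi - smooth))
print(f"worst mismatch over {steps} jumps: {err:.1e}")  # stays tiny
```

The only point mirrored here is the qualitative one from the thread: a jerky step-by-step evolution and its smooth idealization become practically indistinguishable once many steps have accumulated.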

I am talking too much, sorry.

Also if you try to see things at a fundamental level you do not see a unitary time-evolution operator, a one-parameter group of nice probability-conserving operators. you just see the damn constraint equation which says H Psi = 0. it is saying that the physical states are distinguished by being solutions of the H = 0 equation. And Bojowald derives his difference equation from this, showing how the geometry progresses in microscopic jumps.

So at fundamental level there is no unitary time-evolution, but again if you are patient and can wait 10^-40 second then the semiclassical approximation kicks in and things begin to look very normal.

Now what i am saying is extremely impressionistic. Mostly comes from
bojowald papers and the basic ABL gr-qc/0304074 (Asht., Bojo, Lewand.)
But also this host of follower-research that has been repeating the same thing with variations since 2001. It is what i am used to seeing (impressionistically speaking) from each new paper as it comes along.
they talk about bounce, and inflation, and vary the assumptions, and
generalize the conditions and add in this or that, but the basic picture stays the same. bojowald has a lot of bibliography on pages 30-33.

Now you have pointed out a special nontrivial detail which is the
Don Marolf paper gr-qc/9508015 which is bojowald's reference [23]
and which has "group averaging".
he talks about that on page 3 (midpage) and then again on page 5 (top of page) and he does a sample calculation which can be interpreted as an instance of it. My feeling is that he is not seriously invoking anything. Everything here is kind of rudimentary so he does not have to invoke theorems or powerful technical tools. he is just doing something modest that happens to be a simple case of a celebrated technique. but he might have done it anyway. But that is just my take, and I am far from being the expert here. You are much more knowledgeable about Marolf and "refined algebraic quantization" and solving constraints, or so I think.
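
Since group averaging came up, here is a minimal numerical sketch of the idea for a finite-dimensional toy constraint (my own construction, not Marolf's nor Bojowald's example): averaging exp(iλH) over the group parameter λ kills every nonzero eigenvalue of H and leaves the projector onto the solutions of H ψ = 0, i.e. the physical states.

```python
import numpy as np

# Toy Hermitian "constraint operator"; eigenvalues are -1, 0, 1,
# so the physical states (solutions of H psi = 0) are spanned by e3.
H = np.array([[0., 1., 0.],
              [1., 0., 0.],
              [0., 0., 0.]])
E, V = np.linalg.eigh(H)
kernel = np.abs(E) < 1e-12

def group_average(L):
    """(1/(2L)) * integral_{-L}^{L} exp(i*lam*H) dlam, in the eigenbasis.
    An eigenvalue e contributes sin(L*e)/(L*e): 1 if e == 0, -> 0 otherwise."""
    denom = np.where(kernel, 1.0, L * E)   # avoid 0/0 on the kernel
    d = np.where(kernel, 1.0, np.sin(L * E) / denom)
    return (V * d) @ V.conj().T

# Exact projector onto the constraint's kernel, for comparison
P_exact = V[:, kernel] @ V[:, kernel].conj().T

print(np.linalg.norm(group_average(1e4) - P_exact))  # small, shrinks with L
```

The diagonal trick (`V * d`) just builds V diag(d) V† without forming the matrix exponential explicitly; as the averaging range L grows, the result converges to the projector onto physical states.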

Well that's my unedited view of the matter for the time being. I will leave it un-concise and garrulous and hope you can find something useful in it.
 
  • #6
BTW it turns out Aureliano Skirzewski is from Venezuela.

he was doing string theory or something at a university in Venezuela and
got a gig at Trieste for a year or so, and I guess Nicolai spotted him and brought him to AEI.

I wouldn't have guessed from the name Skirzewski that he was from Venezuela (but the name Aureliano is right out of García Márquez's great Colombian epic novel One Hundred Years of Solitude)

Looks like Thomas Thiemann is settled back at AEI now.
I am just guessing this from the list of participants and papers at the Perimeter Institute 19-31 October conference. Junior people at AEI who work with Thiemann (Brunnemann, Dittrich) will be reporting in Canada on research that Thiemann is involved in. If I were going to write him email I believe I'd use his AEI address (but of course i could be mistaken)
 
  • #7
hope some others have looked at the Bojo paper Time in QG and have some thoughts about time in QG they want to share,

but for now it's only selfAdjoint who I know is reading the paper---read at least the introduction, y'all, it is easy enough for sure---and plenty of PF people could get interested in the rest as well.

I am beginning to think that the view of space and time coming out of QG is right----time is not infinitely divisible. If you look closely enough you don't see time.

only a bunch of things changing together in a correlated way and you have to pick one as the evolution parameter--and physically measurable things change in a jerky jumpy fashion

so time just goes away when you look closely. I think this, I don't know it.

And I think space is not infinitely divisible either. Physically there is no such thing as a "diffeomorphism". When you look close the manifold goes away. the continuum goes away. I think that's basic (but again can't be certain)
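
A crude way to see "pick one observable as the evolution parameter" in action. This is a classical toy of my own devising (the linear "scale factor" and cosine "field" are arbitrary choices, not anything from the papers): two observables are generated by a hidden parameter, and the parameter is then discarded in favor of their correlation.

```python
import numpy as np

# Hidden evolution parameter -- never observable, used only to generate data
t_hidden = np.linspace(0.0, 10.0, 500)

a = 1.0 + 0.3 * t_hidden      # a monotonic observable: usable as a clock
phi = np.cos(2.0 * t_hidden)  # another observable, correlated with it

# Relational question: "what is phi when the clock reads a = 2.5?"
# Answered from the (a, phi) correlation alone, t_hidden nowhere in sight:
phi_at = np.interp(2.5, a, phi)

# Cross-check against the hidden parameter (a = 2.5 corresponds to t = 5.0)
print(phi_at, np.cos(2.0 * 5.0))  # agree to interpolation accuracy
```

The design point: `a` works as a clock only because it happens to be monotonic, which echoes the thread's remark that the choice of clock is arbitrary and there is no ultimate criterion of steady running.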

the beauty is how the discrete evolution quantum regime can be calculated and merges into the semiclassical and ultimately the classical----how rapidly the approximation gets good. People are beginning to do QG computer modeling and it is really helpful that the convergence to the semiclassical or classical limit is fast. I will try to find a "Recent Progress" paper discussing this. IIRC it is Bojowald's gr-qc/0402053.
Certainly Ashtekar discussed it on page 23 of his recent
http://arxiv.org/abs/gr-qc/0410054 "Gravity and the Quantum" but he just gave a bird's-eye view.

---quote Ashtekar page 23---
...The detailed calculations have revealed another surprising feature. The fact that the quantum effects become prominent near the big bang, completely invalidating the classical predictions, is pleasing but not unexpected. However, prior to these calculations, it was not clear how soon after the big-bang one can start trusting semi-classical notions and calculations. It would not have been surprising if we had to wait till the radius of the universe became, say, a few billion times the Planck length. These calculations strongly suggest that a few tens of Planck lengths should suffice. This is fortunate because it is now feasible to develop quantum numerical relativity; with computational resources commonly available, grids with (10^9)^3 points are hopelessly large but one with (100)^3 points could be manageable.
---end quote---
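
The grid sizes Ashtekar mentions are easy to make concrete with back-of-envelope arithmetic. The 8 bytes per grid point below is my own arbitrary assumption (one double-precision number per point), purely for illustration.

```python
# Storage for a cubic grid with n points per side, at an assumed
# 8 bytes (one double) per point -- back-of-envelope only.
def grid_bytes(n_per_side, bytes_per_point=8):
    return n_per_side ** 3 * bytes_per_point

print(grid_bytes(10 ** 9))  # (10^9)^3 points: 8e27 bytes, hopeless
print(grid_bytes(100))      # (100)^3 points: 8e6 bytes, trivially manageable
```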
 
  • #8
hi guys, I just wanted to say that relational time not just works, it is the "only" way to construct a consistent, realistic physical theory! You may want to take a look at these papers (I am sure Marcus will like them :))

gr-qc/0302064
Consistent discrete gravity solution of the problem of time: a model
Authors: Rodolfo Gambini, Rafael A. Porto and Jorge Pullin
*
quant-ph/0209044
Title: A physical distinction between a covariant and non covariant reduction process in relativistic quantum theories
Authors: Rodolfo Gambini, Rafael A. Porto
New J.Phys. 5 (2003) 105
*
quant-ph/0205027
Title: Relational Description of the Measurement Process in Quantum Field Theory
Authors: Rodolfo Gambini, Rafael A. Porto
New J.Phys. 4 (2002) 58
*
quant-ph/0105146
Title: Relational Reality in Relativistic Quantum Mechanics
Authors: Rodolfo Gambini, Rafael A. Porto
Phys.Lett. A294 (2002) 129-133
*
gr-qc/0101057
Relational time in generally covariant quantum systems: four models
Authors: Rodolfo Gambini, Rafael A. Porto
Phys.Rev. D63 (2001) 105014
*
 
  • #9
Edgar, thanks for posting these articles.
I am not entirely clear about minor differences between
Gambini/Pullin approach to quantum gravity and, say, that of
Ashtekar. Whatever the differences of approach, I see a move in the direction of relational time which is more or less inclusive of everybody. Please correct me if I have the wrong impression.

On another matter, did you happen to see these articles?

Gambini Porto Pullin
Realistic clocks, universal decoherence and the black hole information paradox
http://arxiv.org/abs/hep-th/0406260

Gambini Porto Pullin
No black hole information puzzle in a relational universe
http://arxiv.org/abs/hep-th/0405183

I was intrigued by their limit (derived by thought-experiment)
on the possible lifetime and precision of real clocks. Time, even macroscopically, has no meaning apart from measuring it. Operationally one cannot speak of time unless one has a real clock. If one tries to make a clock more and more long-lived and more and more precise then one is compelled to make it so massive that it collapses forming a black hole. Then if one tries to use black holes as one's clocks, this turns out to be the best possible sort of clock but unfortunately they evaporate. And so there is a theoretical limit on how good it can get. I thought this was nice.
If you read them how did they seem to you?
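
For a feel of the numbers behind that limit: the scaling usually quoted for the best possible clock is δt ~ t_P^(2/3) T^(1/3), a Salecker-Wigner/Ng-van Dam type bound of the kind these papers build on. I am taking the exponents on trust and ignoring all prefactors, so treat this strictly as an order-of-magnitude sketch.

```python
T_PLANCK = 5.39e-44  # Planck time in seconds

def best_clock_uncertainty(T):
    """Rough minimum uncertainty (s) of any clock run for total time T (s),
    using the delta_t ~ t_P**(2/3) * T**(1/3) scaling; prefactors dropped."""
    return T_PLANCK ** (2.0 / 3.0) * T ** (1.0 / 3.0)

age_of_universe = 4.35e17  # seconds (~13.8 billion years)
print(best_clock_uncertainty(age_of_universe))  # ~1e-23 s: tiny, but not zero
```

Tiny as it is, a nonzero floor like this is exactly what feeds the universal-decoherence argument discussed above.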
 
  • #10
selfAdjoint said:
Yes, I find it interesting that he brings in relational time, and then (it seems to me regretfully) rejects it because it can't be fitted into the math. What do you think about his next statements, that in the canonical approach where the Hamiltonian is a second class constraint, time is just a gauge degree of freedom? The implication is that if you fix the gauge, as one does, you get no time.
Ouch. The Hamiltonian is a second class constraint? That pretty much blows me right out of the water. To follow that up with the concept of time as a gauge field is truly painful. I had a hard enough time understanding the whole spacetime concept as it was. I admit, I ate pizza back when I thought I understood the concept. And I played hearts on weekends, but I never played with the cards face down while eating pizza. God may not play dice, but surely eats pizza and plays cards on Sundays.
 
  • #11
Marcus said:
Edgar, thanks for posting these articles.
I am not entirely clear about minor differences between
Gambini/Pullin approach to quantum gravity and, say, that of
Ashtekar. Whatever the differences of approach, I see a move in the direction of relational time which is more or less inclusive of everybody. Please correct me if I have the wrong impression.

What I didn't see in any of those Pullin-Porto paper titles was the word quantum. It looks like they are doing all their work in classical theory.

BTW Marcus, back a few posts, on the 19th, you wondered if you were talking too much. Absolutely not! I missed that post before and just now went back to read it, and it cleared a lot of things up for me, because I haven't really been following relational time, or really the cosmology angle of LQG at all. Thank you.

I wonder if we can parcel out the study here. You have a clear view of relational time in LQG cosmology, and Edgar1813 can tell us some detail about the Pullin-Porto approach, and I'll reread the part on group averaging in Bojo's paper and bring something or other from Marolf to the party. And what happens then, happens. How about it?
 
  • #12
Hey,

Yes Marcus, I'm aware of those papers too. With respect to the differences between consistent discrete quantum gravity (the Gambini-Pullin approach) and LQG, I would say that the main difference is the disappearance of the space-time constraints (internal constraints, like the Gauss law, are expected to survive, as well as the loop transformation), which allows a cleaner quantization and a consistent relational description, where all observables including time are turned into quantum operators. You may want to take a look at this paper:

Dirac-like approach for consistent discretizations of classical constrained theories. http://xxx.lanl.gov/abs/gr-qc/0405131

for an introduction in the classical realm. With respect to the word "quantum", selfAdjoint might want to take a closer look at the papers that I think include all the ingredients, including quantization:

A relational solution to the problem of time in quantum mechanics and quantum gravity: a fundamental mechanism for quantum decoherence
http://xxx.lanl.gov/abs/gr-qc/0402118

and

Consistent discrete gravity solution of the problem of time: a model
http://xxx.lanl.gov/abs/gr-qc/0302064

Some of the papers Marcus mentioned are also worth reading.

The idea of decoherence seems to be related to treating time as a quantum physical object, and its fundamental character, as Marcus pointed out, is associated with the fundamental limits nature puts on how to build an accurate clock.
What seems to happen is that intrinsic Heisenberg uncertainties limit the mass of the clock from below and black hole collapse limits it from above. Combining the two there is a fundamental inequality that turns out to be an equality for black holes, promoting them as the best available clocks in nature!
All these observations are quantum in nature, as is the modified non-unitary evolution that seems to be implied by the non-existence of ideal time.
All this implies that nature has an intrinsic non-unitary behavior in "real time" that, although small, is big enough to dissolve the black hole information puzzle and turn it into no paradox at all!
Instead of looking for a modification of gravity, what is being said here is: "Look, QM is intrinsically non-unitary once time is taken to be a quantum real observable, so why should we bother about the BH puzzle in the first place, since the BH will lose coherence anyway at the same speed as it Hawking-radiates."

Best
 
  • #13
selfAdjoint said:
What I didn't see in any of those Pullin-Porto paper titles was the word quantum. It looks like they are doing all their work in classical theory.
...

The way I see it, Gambini and Pullin are a quantum gravity team of comparatively long standing. Porto is a junior member who has appeared more recently.

The question to ask might be how to describe the Gambini group's approach (or approaches) to Quantum Gravity? And what do they mean by "discrete" quantum gravity?

For what it's worth, here are the most recent 4 papers from this group:

1. Rodolfo Gambini, Jorge Pullin Consistent discretization and loop quantum geometry
gr-qc/0409057

2. R. Gambini, S. Jay Olson, J. Pullin Unified model of loop quantum gravity and matter
gr-qc/0409045

3. Rodolfo Gambini, Rafael Porto, Jorge Pullin Fundamental decoherence from relational time in discrete quantum gravity: Galilean covariance
gr-qc/0408050

4. Rodolfo Gambini, Jorge Pullin Consistent discretizations and quantum gravity
gr-qc/0408025

I have sometimes seen Gambini's approach described as the "Southern School" of LQG, to distinguish it from the versions developed by Ashtekar and Rovelli. If there are two schools, then the two cite each other a lot. Gambini et al will cite Ashtekar, Thiemann etc. and they in turn cite Gambini. Distinct, at least on the surface, but somehow not conflicting.

Some Gambini papers I've looked at don't resemble LQG at all, to my limited perception at least! I'm afraid I don't know enough to sort it out. They go to the same conferences. The same family. Like brothers: the same but different.

============
Notice that I am saying "Gambini et al." instead of, as you said, "Pullin Porto" papers.
It may seem pointless and ridiculous of me to try to get clear on who's-who details like this, but let's take time to sort things out. I will use the arxiv counts to form an idea of the Gambini bunch.

Gambini has 64 papers, many with Pullin, a few with Porto, and lots with other people as well.
Pullin has 25 papers, most are with Gambini. Indeed, aside from his MoG newsletter and 5 others, all are with Gambini
Porto has 11 papers, all with Gambini. The 7 most recent are also with Pullin. So far, anything that is Pullin-Porto has also been Gambini-Pullin-Porto.
============
Just to get a sample of their interests, by glancing at titles, here are the 11 papers you get on arxiv for Rafael Porto:

1. Fundamental decoherence from relational time in discrete quantum gravity: Galilean covariance

2. Realistic clocks, universal decoherence and the black hole information paradox

3. No black hole information puzzle in a relational universe

4. Dirac-like approach for consistent discretizations of classical constrained theories

5. A relational solution to the problem of time in quantum mechanics and quantum gravity induces a fundamental mechanism for quantum decoherence

6. Loss of coherence from discrete quantum gravity

7. Consistent discrete gravity solution of the problem of time: a model

8. A physical distinction between a covariant and non covariant reduction process in relativistic quantum theories

9. Relational Description of the Measurement Process in Quantum Field Theory

10. Relational Reality in Relativistic Quantum Mechanics

11. Relational time in generally covariant quantum systems: four models


for more detail, see
http://arxiv.org/find/grp_physics/1/au:+Porto_R/0/1/0/all/0/1

for a full list of Gambini preprints (which include these 11) see
http://arxiv.org/find/grp_physics/1/au:+Gambini_R/0/1/0/all/0/1


=====

Does a clear pattern emerge from this small sample? Not for me anyway!
But there it is, in case you want to draw your own conclusions. The titles certainly do not all say "discrete QG" on them---nevertheless I suspect that is what Gambini and co-workers would call the main focus of their effort.
or maybe they would call it a version of LQG.

Nonunitary probably knows, and could explain.
 
  • #14
Edgar, I just now saw your post. I like your summary of this argument!
edgar1813 said:
...
The idea of decoherence seems to be related to treating time as a quantum physical object, and its fundamental character, as Marcus pointed out, is associated with the fundamental limits nature puts on how to build an accurate clock.
What seems to happen is that intrinsic Heisenberg uncertainties limit the mass of the clock from below and black hole collapse limits it from above. Combining the two there is a fundamental inequality that turns out to be an equality for black holes, promoting them as the best available clocks in nature!
All these observations are quantum in nature, as is the modified non-unitary evolution that seems to be implied by the non-existence of ideal time.
All this implies that nature has an intrinsic non-unitary behavior in "real time" that, although small, is big enough to dissolve the black hole information puzzle and turn it into no paradox at all!
Instead of looking for a modification of gravity, what is being said here is: "Look, QM is intrinsically non-unitary once time is taken to be a quantum real observable, so why should we bother about the BH puzzle in the first place, since the BH will lose coherence anyway at the same speed as it Hawking-radiates."

I have bolded the punchline, for emphasis. This is a fine logical argument. But I want to be cautious. I was listening to a talk by Ashtekar on "Black Hole Evaporation" that he gave 20 September. In his treatment, if someone can wait a very long time, the final state is pure.
There is ultimately no decoherence. It is as if Ashtekar were unaware of
the Gambini et al argument. So I am trying to keep these two conflicting possibilities in mind---not to let one win out over the other, and hope that ultimately something will resolve the contradiction.

Edgar, did you happen to listen to the audio of Ashtekar's talk?
http://www.phys.psu.edu/events/index.html?event_id=934;event_type_ids=0;span=2004-08-20.2004-12-25
 
  • #15
Hey,

Just a remark: Gambini and Pullin each have more than 110 papers. They have contributed to classical GR as much as to QG.
Gambini indeed was amongst the first to introduce loop space in gauge theories in the early '80s, and he has made important contributions to lattice YM theories as well. Pullin also made seminal contributions to black hole collisions, gravitational waves and numerical GR. Rodolfo has recently been awarded the TWAS (Third World Academy of Sciences) prize in the ICTP Trieste for his work and leadership in his native country: Uruguay.

http://www.ictp.trieste.it/~twas/publ/TWAS_12Dec03_Uru.html

Jorge also received the Edward Bouchet award of the American Physical Society.

http://www.aps.org/praw/bouchet/01winner.cfm

As far as I know they have been working together since 1990 and have published more than 50 papers in collaboration.
Rafael A. Porto was ('is') Rodolfo's student and they have been working together since 2000. He is now at Carnegie Mellon U. getting a PhD and working as well on gravitational radiation and effective field theories.
As you can see from their papers, their main motivation has been trying to construct a consistent, purely quantum relational description of covariant theories and to understand its physical implications, from the philosophical view to experimental observation. They have made contributions to the interpretation of relativistic QM and QFT as well, building upon relational ideas, and also proposed a potential physical distinction between the covariant-realistic approach they developed and the non-covariant instrumentalist one. Incidentally, their first paper was the first to show
that relational time could be consistently applied in several models considered crucial tests.

GPP have been working together as a team since 2002 and they have shown that, remarkably, the Gambini-Pullin quantization approach seems to allow for the introduction, for the first time, of a consistent purely quantum description, therefore solving the so-called 'problem of time', one of the long-standing fundamental problems of quantum gravity. The potential empirical evidence is now in the form of a fundamental decoherence effect which renders the BH paradox unobservable and might be tested in the near future in macroscopic quantum superpositions like Bose-Einstein condensates. Furthermore, the theory has already been successfully used to treat a variety of systems, including cosmological models, Yang-Mills theories, BF-theory and general relativity on the lattice. The formal structure has since been emerging.

I believe there is by now compelling evidence that these ideas are on the right track. However, as a friend of mine used to repeat: "Prediction is very difficult, especially if it's about the future." I think it was Bohr, although he claimed it was a famous baseball player :)

best.
 
  • #16
I have been reading (but not really deeply) a couple of the papers you kindly gave links to. I have a question that is probably easy to answer but which is bothering me. The authors discuss the importance of their n, the counter of the steps in their consistent discretization. They say they couldn't develop their relational time in the discretization approach without n serving as an ordering parameter, and they point out that n has some of the mathematical properties (orthogonal to physical observables, etc.) that time has in classical theory. So my question is, have they perhaps smuggled time into their theory in the disguise of this n?

This is not intended to be a knock at their work, or at the implications of relational time. It's just a request for clarification.
 
  • #17
selfAdjoint said:
I have been reading (but not really deeply) a couple of the papers you kindly gave links to. I have a question that is probably easy to answer but which is bothering me. The authors discuss the importance of their n, the counter of the steps in their consistent discretization. They say they couldn't develop their relational time in the discretization approach without n serving as an ordering parameter, and they point out that n has some of the mathematical properties (orthogonal to physical observables, etc.) that time has in classical theory. So my question is, have they perhaps smuggled time into their theory in the disguise of this n?

This is not intended to be a knock at their work, or at the implications of relational time. It's just a request for clarification.

Hi selfAdjoint, by coincidence I also have been reading (probably) a similar couple of papers, and wondering what, if anything, I could say.
To be specific, this morning I've been reading
http://arxiv.org/abs/gr-qc/0408025
Consistent discretizations and quantum gravity
http://arxiv.org/abs/gr-qc/0409057
Consistent discretizations and loop quantum geometry

I see that in his most recent review paper ("Gravity and the Quantum") Ashtekar cites the first of these papers. the second was probably too new at the time of writing for him to have included.
I hope that you and Edgar can make some progress with these questions. I would like to participate, but I have a rehearsal this morning.

My understanding about the discrete steps being a kind of time probably parallels what you are saying. In cosmology, Bojowald chose a size index as the "clock", something that increases in discrete steps, and correlated the other observables to that. he does not use the word "relational" but he does pick some real process (in this case the expansion of the universe) as a clock and he does the analysis with discrete timesteps. In Planck-scale cosmology (as Bojowald does it, and maybe other areas as well) there is no external criterion for saying that the discrete steps are "equal".

the idea of a steady uniform time quickly emerges as an approximation.
but the more fundamental Planck-scale analysis seems to call for
some real process advancing in discrete steps to which one can correlate the rest.

this is a bit hasty, I wanted to contribute at least something to this conversation before having to get ready to leave for the morning.
 
  • #18
Hey,

The meaning of time by itself is a rather subtle issue that would take a lot of time to discuss. What I can say is: think of $n$ as the coordinates in GR, since it is actually just a discrete counterpart of them. In classical GR, as Einstein pointed out, coordinates are just labels and physics is built upon what he called coincidences (you can read about the hole argument in Rovelli's papers as well). The relationship between coordinates and observers is somehow subtle (Rovelli has a set of wonderful papers about these issues that you might want to take a look at:

1) Quantum Reference Systems

2) What is Observable in Classical and Quantum Gravity?

By Carlo Rovelli, Class. Quant. Grav. 8:297-332, 1991)

either we fix a gauge and attach to it some "physical meaning" through the study of the solution and its geometrical properties or we introduce the idea of gauge invariant correlations between what Rovelli calls "partial observables", namely those objects with ontological meaning as clocks and rods. In the latter coordinates are just used as a device to construct relational observers in pretty much the same fashion that one use any parameter to describe the world line of a particle given the same physics for x(x^0).
At the classical level both procedures give the same "physics" and that is perhaps why people doesn't really bother much with the difference. The introducction of $n$ and the lost of constraints may resemble the gauge fixing process at the classical level although it is not quite the same since we still have all degrees of freedom in the theory and one doesn't add further constraints.
At the quantum level the story detour completely. One can still fix a gauge and define a classical time which will be kept classical througout the evolution with a sort of Schrodinger like equation. The constraints now are second class and are impossed as strong relationships. This approach lacks the fundamental ingredient that time itself is a physical observable subject to quantum fluctuations and in principle in equal foot to all other physical variables. in other word, the classical choice might not represent a realistic clock or rods (Think in a rotating coordinate system, the "edge" of the axis will exceed the speed of light!)
On the contrary, in the relational spirit one quantizes the whole space and then introduces correlations between quantum partial observables, choosing one, or several, of them as the clock. The quantization itself rests on formal grounds: build a set of n-Hilbert spaces, a canonical algebra of well-defined self-adjoint operators, and a unitary map between these n equivalent Hilbert spaces.
One could have done this in the continuum theory as well, using the coordinates as auxiliary elements to construct the physical measurable objects, like the correlations, and using the unitary map given by the Schroedinger equation in coordinate time $t$.
In that sense $n$ doesn't play a different role than the time coordinate in GR. Furthermore, the decoherence effect obtained from the relational approach could also be derived from the continuum description in a similar fashion, if we succeeded in building a consistent theory (I haven't looked at it, but I imagine it can also be obtained from Bojowald's approach).
There are subtleties I won't discuss here, but it could be done in principle. The idea of discrete quantum gravity departs from usual QG since the phase space has more degrees of freedom, due to the lack of constraints; it recovers GR, though, since the symmetries are still encoded in the solutions. It provides us with a consistent quantization and a relational evolution with all variables, time included, promoted to the quantum realm.
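To make the unitary map between the n-Hilbert spaces a bit more concrete, here is a toy two-level example (my own sketch, nothing from the papers; the Hamiltonian and step size are arbitrary): a fixed unitary U advances the state one discrete step n -> n+1, probability is conserved at every tick, and many small ticks reproduce the continuum Schroedinger evolution.

```python
import numpy as np

# Toy illustration (not from the papers): a fixed unitary advances the
# state one discrete step n -> n+1, mimicking the map between the
# n-labelled Hilbert spaces.

hbar = 1.0
H = np.array([[1.0, 0.5], [0.5, -1.0]])   # arbitrary self-adjoint "Hamiltonian"
dt = 0.01                                  # one discrete step

# One-step unitary U = exp(-i H dt / hbar), built from the spectral
# decomposition of H so that U is unitary to machine precision.
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * dt / hbar)) @ evecs.conj().T

psi = np.array([1.0, 0.0], dtype=complex)  # state at n = 0
for n in range(100):                       # 100 discrete "ticks"
    psi = U @ psi

# Norm (total probability) is preserved at every step:
print(abs(np.vdot(psi, psi)))              # 1.0 up to rounding

# 100 discrete steps agree with continuous evolution for time t = 100*dt:
U_cont = evecs @ np.diag(np.exp(-1j * evals * 1.0 / hbar)) @ evecs.conj().T
print(np.allclose(psi, U_cont @ np.array([1.0, 0.0])))   # True
```

The point of the toy is only that nothing about unitarity or probability conservation requires a continuous $t$; a discrete label $n$ does just as well.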
Asking for the meaning of $n$ is like asking for the meaning of the metric in GR, or of the coordinates of spacetime itself. Somehow physics has been built upon objects that do not seem to have a direct observable character, but whose symmetries or general properties allow a cleaner treatment.
As in any physical theory, there is an underlying ontological meaning for certain objects that is put into the theory by us. That is why this is a human construction, and, as Einstein used to say, "I can't add my brain into the theory". There is always something "outside" the theory although inside the universe, and depending on the type of question asked, nature will answer relationally. Different questions will give different, perhaps even non-comparable, answers.
You can think of the universe as a block sequence of snapshots in $n$ where internal correlations are measured. For a perfect clock we can correlate this god-time with what we call time, and have the illusion of "evolution" and a well-defined local theory. In general, probabilities would be associated to the whole story of the universe, and the formalism, although still applicable in non-Schroedinger-like regimes, would be rather different from what we experience today.
Most of the papers of GPP and GP have discussed these issues; I hope I have made it clearer.

best
 
  • #19
Thank you for that clear explanation. With that, and reading "Consistent Discretizations and Loop Quantum Gravity", I think I understand the approach better. In the latter case we have a time in the classical theory, and they discretize it away, resulting in a Hamiltonian system without constraints (because what would have been a constraint turns out to relate the system at different n's and thus vanishes identically). Then they are free to choose any one of the observables after quantization to play the role of time, using conditional probability.
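Here is how I picture the conditional-probability idea in a toy model (my own sketch, not taken from the papers): build one "timeless" state on clock (x) system, and ordinary evolution reappears when you condition on the clock reading n.

```python
import numpy as np

# Page-Wootters-style toy (my construction): a single global state
# |Psi> = (1/sqrt(N)) sum_n |n>_clock (x) U^n |psi0>_system
# contains the whole history; conditioning on the clock value n
# recovers the state evolved for n ticks.

N = 8                                       # clock readings n = 0..N-1
H = np.array([[0.0, 1.0], [1.0, 0.0]])      # arbitrary system Hamiltonian
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * 0.3)) @ evecs.conj().T  # one tick

psi0 = np.array([1.0, 0.0], dtype=complex)

Psi = np.zeros((N, 2), dtype=complex)       # row n = clock-n component
state = psi0.copy()
for n in range(N):
    Psi[n] = state / np.sqrt(N)
    state = U @ state

# Conditioning on clock = n returns the state evolved for n ticks:
n = 5
conditional = Psi[n] / np.linalg.norm(Psi[n])
expected = np.linalg.matrix_power(U, n) @ psi0
# equal up to a global phase:
print(np.isclose(abs(np.vdot(conditional, expected)), 1.0))   # True
```

No external time parameter appears anywhere; the "evolution" is entirely a correlation between the clock register and the system.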
 
  • #20
thanks Edgar, it does make it clearer.
If I remember correctly the concept of "partial observable" was discussed in Rovelli's textbook Quantum Gravity, in chapter 3 but also in some sections near the end of chapter 2. I will see if I can find page references for some discussion.

IIRC a partial observable is PARTIAL because to make a full meaningful observation involves measuring the correlation between it and something
else.

a partial observable is like saying "Three O'clock!"
or "25 meters North from my nose!"

It does not mean anything unless associated with something else: an event, a motion, a prediction.

At some point rovelli points out that one can observe a P.O. ("in a vacuum" so to speak) but one cannot predict it will have some value as long as there is nothing on which to base the prediction.

Anyway, there is an idea of incompleteness about it. Some other measurement must be made simultaneously or in connection with it, to give the whole thing meaning. The real thing of interest is the correlation of other measurements with the P.O.

rovelli also warns, in chapter 3, that he sometimes calls a partial observable by the alternative name relativistic observable, and
also, if the context is understood, he simply says "observable".

To me this simpler usage makes sense----I do not expect quantum observables to have some absolute meaning apart from correlations between different ones----so I do not need to hear the qualifier "partial".

Well then, in this context, the measurement of time, by looking at some clock or by observing the expansion of the universe or whatever----looking in my glass to see how much beer is left----this is just another observation, which I can correlate with other observations. There is no One Absolute Clock, nor any Perfect Steady Time; there are just a lot of observables, all on the same footing.

I see that i am just repeating what Edgar and selfAdjoint already said or understood. this comes from attending choral rehearsal all day. this is very interesting but I will go have a nap.
 
  • #21
In English the word "partial" has the connotation that something is imperfect or incomplete. It is not all it should be. A partial success, for example, is not all it should be.

I think rovelli thinks in Italian and what he wanted was a term for
COMPONENT observable.

this is like one's HI-FI components---the speaker, the amp, the CD player.
the idea is that each one is perfect and all it should be, but that
the different components work as an ensemble----when plugged together so they "correlate".

Well, that is just a guess. I have no way to know. But still when i read in the textbook where it describes partial observable I will interpret that to mean "component" because i like the english connotation better.
 
  • #22
Today is election day in my native country (Uruguay) and I am quite excited about the possibility that, for the first time in 174 years, the left party will win the election. I am in the US these days, and it's been quite disappointing not to hear a single word from either Kerry or Bush about South America.
It doesn't surprise me, although I was expecting more...as usual.
It is also sad that our scientists are usually disregarded and their work often goes unmentioned. I made the remark about Jorge and Rodolfo to show how two brilliant physicists have not received all the credit they deserve, because they don't belong to the US mainstream.
It is indeed fairly common to see similar ideas repeated elsewhere without any reference to our work. Carlo is in fact one of the most frequent offenders. I admire his work, which gave me a lot of inspiration and motivation. You can try to find GaPoPu, or GaPu, GaPo, in his papers and see what I mean.
I should say, though, that Ashtekar, and especially Smolin, have seriously considered the ideas and cited the work.
It was nice to see, in Lee's response to Susskind, the GPP solution proposed for the BH paradox :). However, no string people have actually made any comment about it, or perhaps even read it. Susskind himself didn't mention anything about it, and I don't think, as Marcus claimed, that Hawking had read the paper either. Actually, I read on Lubos Motl's web page his criticism of Carlo's book (really harsh indeed), and he seems to believe that unitarity is essential for conserving probabilities in QM. He clearly hasn't read your posts, Marcus, otherwise he would have noticed that the GPP mechanism works fine, conserving both energy and probability.

Anyway, coming back to physics, which is what brought me here: with respect to partial observables, I guess Carlo's motivation is to promote an ontological meaning for the phase space of a given theory. This is, however, also constrained by the usual meaning of "observable" in our labs. The vector potential in EM at a given spacetime point is not directly an "observable" quantity; nevertheless it gives rise to "real" entities like the so-called electric and magnetic fields and, in the quantum realm, photons. One could also argue that spots on a screen, or needles on voltmeters, are the ultimate observables, and that E, B are no different from A in that respect.

Let's admit here that we call observables those quantities which are unambiguously determined for all observers within a frame of reference. Notice that E, B are in principle subject to Lorentz "rotations", so they are not unambiguously determined for all observers; however, we have rules to relate them, which allow us to define the invariant object as well. We could ultimately talk about rods and clocks, and actually just the distinguishability of here and there, as the only "observable" things in nature. We could also use Carlo's phrasing that anything that can give rise to a meaningful expression deserves to be called a partial observable. In that sense I agree that the word "partial" is not a happy choice. It is partial, however, in the sense that we can't predict anything about it, since prediction amounts to correlating that value with another meaningful one.

Relational observables, or relational probabilities, namely correlations, are then the predictive power of the theory: given a set of values obtained from nature, taken as an input, we can predict another set as an output. We disagree, however, on the notion of order and on how the input is considered in the theory, namely what is an input and what an output. He discusses this more in his paper with Mike (who incidentally is in Uruguay now)

http://arxiv.org/abs/gr-qc/0111016

Here they build upon the group-averaging idea to construct amplitudes that give rise to a spacetime description. It is worth reading, and also comparing with

http://xxx.lanl.gov/abs/gr-qc/0402118

where a different approach is promoted.

It is too late now; I am still awake, too excited about what's going to happen soon at home, and missing family and friends.
It's been very nice to discuss with you two. I will be rather busy in the next month, so I might not be around for a while; I will read the posts though, so keep thinking! :)

best regards,



"I will, therefore, take occasion to assert that the higher powers of the reflective intellect are more decidedly and more usefully tasked by the unostentatious game of draughts than by all the elaborate frivolity of chess."

Edgar A. Poe
 
  • #23
I am very uncomfortable with removing time from any equation. I'm not suggesting time is somehow sacred, or absolute, but answers that only appear in a timeless reference frame just rub me wrong. Without a time line, causality breaks down. I need a lot of observational evidence to get past that hurdle.
 
  • #24
edgar1813 said:
Notice that E, B are in principle subject to Lorentz "rotations", so they are not unambiguously determined for all observers,

And the situation is even worse in general relativity where we only ever know any value up to an equivalence class of diffeomorphisms.


Afterthought: How would "loose observable" do as a replacement for Carlo's "partial". Loose in the sense of not pinned down, like an improper integral, or something subject to a gauge symmetry.
 
Last edited:
  • #25
Hi Chronos,

your comment brings us back to the opening post on this thread:

marcus said:
want to read together this recent Bojowald paper?
Time Dependence in Quantum Gravity
http://arxiv.org/gr-qc/0408094 ...

Here is the abstract:
---quote Bojo et al---
The intuitive classical space-time picture breaks down in quantum gravity, which makes a comparison and the development of semiclassical techniques quite complicated. By a variation of the group averaging method to solve constraints one can nevertheless introduce a classical coordinate time into the quantum theory, and use it to investigate the way a semiclassical continuous description emerges from discrete quantum evolution. Applying this technique to test effective classical equations of loop cosmology and their implications for inflation and bounces, we show that the effective semiclassical theory is in good agreement with the quantum description even at short scales.
---end quote---

Hi selfAdjoint, the visual image is helpful----of a loose or un-pinned-down thing---a "free observable". rovelli also says "relativistic observable" or just leaves off the qualifier and lets context take care of the distinction.

I appreciate Edgar taking the time (from graduate school, I guess, or something equally absorbing) to post with us! I hope that once he gets the high-priority stuff done that he will be back.

It is interesting that Michael Reisenberger is currently with Gambini in Uruguay. Till now he has mainly written solo or most recently several with rovelli. If and when Carlo and Rodolfo collaborate on a paper, that would be very interesting.

there is more in Edgar's last post than I can respond to right now. I will have more coffee and set the clocks back to standard time.
 
Last edited by a moderator:
  • #26
Chronos said:
I am very uncomfortable with removing time from any equation. I'm not suggesting time is somehow sacred, or absolute, but answers that only appear in a timeless reference frame just rub me wrong. Without a time line, causality breaks down. I need a lot of observational evidence to get past that hurdle.

Hi Chronos, your name means time doesn't it? :smile:
Please do not be uncomfortable! You and time are quite safe!

the problem is with the poverty of our language----there are many implementations of the notion of causal ordering and temporal evolution, but we have very few word-tags for them.

One way is to have time advance in little roughly-planck-scale steps.
In Bojowald analysis the "clock" is linked to the size of the universe.
The volume is increasing in tiny quantum jumps---volume is like the energy levels of the hydrogen atom, with a discrete spectrum, so it cannot increase except in a discontinuous way. Since this is the clock, the clock is advancing in tiny Planck-scale ticks. After roughly 100 steps one can switch over to a more familiar time, because the approximation is so good.

the ordering is still there, which is what one basically needs for causality, but the continuity has been given up at the very microscopic level--------and the continuity comes back at a less microscopic level, as a limit: continuity is an approximate feature that emerges from the more fundamental picture.
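here is a crude numerical analogy I tried (my own toy, not Bojowald's actual difference equation): something that grows by a small fixed fraction each tick is, after a hundred or so ticks, practically indistinguishable from the smooth continuum solution.

```python
import numpy as np

# Crude analogy (mine, not Bojowald's difference equation): a quantity that
# grows by a small fixed fraction h per discrete "tick" versus the smooth
# solution of the continuum equation da/dn = h*a, i.e. a(n) = e^{h n}.

h = 0.01                       # one small discrete step
a_discrete = 1.0
for n in range(100):           # 100 ticks
    a_discrete *= (1.0 + h)    # difference equation a_{n+1} = (1 + h) a_n

a_continuum = np.exp(h * 100)  # continuum solution after the same "duration"

rel_err = abs(a_discrete - a_continuum) / a_continuum
print(rel_err < 0.01)          # True: under 1% disagreement after 100 steps
```

so after a few hundred steps you simply cannot tell, within reasonable accuracy, whether the underlying evolution was discrete or continuous---which is the sense in which continuity "emerges".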

However the fundamental picture is still unfinished! they are still working on it. So none of this is definite or certain! there are problems, like suppose I use a different clock and slice things differently, what then?
What a headache.

Chronos you might like to look at these recent Gamb. et al. papers.
they advertise really nice time evolution by unitary operators even! But it is not continuous. There is a unitary "matrix" that gets you from instant 6
to instant 7. But you do not notice that it is discrete because the universe uses such small steps---she is discreet about being discrete. I will remind us of the Gamb. et al. links.

http://arxiv.org/gr-qc/0408025
Consistent discretizations and quantum gravity
(this one has an alternative title: Canonical Quantum Gravity and Consistent Discretizations)
http://arxiv.org/gr-qc/0409057
Consistent discretizations and loop quantum geometry
 
Last edited by a moderator:
  • #27
BTW there is a philosophical point to remember.
If evolution is proceeding in tiny discrete steps then ONE CANNOT SAY HOW BIG THE STEPS ARE (or so I think)
because there is no continuous scale at the fundamental level


there is no parallel timescale against which to compare the tiny discrete steps

it is only AFTER THE FACT, when one has waited a few hundred steps and successfully made the transition to some conventional clock time (to an approximation of the real world) that one can look back and
guess-estimate how long each of those fundamental timesteps should be considered to be, in one's chosen conventional clock-time terms. But that retrospective judgement of the size of the steps is (to me) tenuous and un-rigorous, based on an arbitrary assignment of similitude. It depends on what one chooses to watch as an indicator of evolution.

I think that even though the continuity of time may not be there at the fundamental scale, at least one can say mathematically what continuity is, so continuity is at least half OK. On the other hand, the "steadiness" of time is even less well founded: there is no rigorous criterion that does not involve arbitrary choices and approximation, IMHO. For a long time we humans thought that the rotation of the Earth was steady, and we defined the second of time as 1/86400 of the rotation period. Hah! Now we think of Cesium microwaves as steady. We like to believe in things; we always must have something which we believe is stable. The rate that clocks go is constantly being affected by the ever-changing gravitational field, which is different if you are somewhere else in the galaxy, or in the solar system, or even at a different terrestrial elevation. Well then, how about the rate at which the universe is expanding? No, too bad, even that is changing.

No fundamental criterion of steadiness. But that is only a philosophical point. Pragmatically we all know what a good clock is, right?
 
Last edited:
  • #28
You've almost got me converted, Marcus. The consistent discretization approach does preserve the causal ordering, but with n arbitrary. This really does seem to go well with LQG because, of course, all the LQG math (the Hilbert space/cylinder function/metric stuff) is done in a spatial slice, so you can do successive slices with the CD steps orthogonal.

Whether this is deeply ontological, or more like doing QCD on the lattice, I am not sure yet, but it sure looks like a good way to go. I am cool with having no unitary time evolution in relational time, but I still worry about conservation of probability in operator actions on Hilbert space. Yes, I know what the LQG authors say, but I haven't yet seen a refutation of Larsson's general argument. Discussion on s.p.r did succeed in splitting the probability concern off from the time-evolution one.
 
  • #29
selfAdjoint said:
You've almost got me converted, Marcus...

My gracious. Stop right here! I am not in the business of converting anybody, and especially not you. Differences in perspective among friends are too valuable (if only for the sake of discussion).

but more seriously. I don't understand the consistent discretizations (abbr. CD) approach. It would be a great help to me personally if it interested you enough so you would continue looking and commenting---which might shed some light

BTW in the CD approach they say they do have unitarity of a very small step evolution operator. I will try to find the page reference. Yes, top of page 3 of http://arxiv.org/gr-qc/0409057
 
Last edited by a moderator:
  • #30
Hey,

This is probably my last post for a while, but I want to say something before checking what is going on at home (I just took a looooong nap :) ). I'd like to recommend that anybody take a deep breath and read the papers carefully. I am not saying that you, marcus and selfAdjoint, didn't actually spend a lot of time on them, and I understand the excitement of new ideas!, but most of the questions you are raising now are largely covered in them, and the authors have made a huge effort to make them as self-contained and clear as possible.

CD is not an easy subject, and we don't completely understand it, but some issues, like unitarity and conservation laws, are 'easier' to tackle. By construction, probabilities are 'conserved', since they add up to one for any consistent time choice and for any slice, which doesn't mean a given $n$, since in principle the physical $t$ and $n$ are not correlated (that is why predictions involve the whole story of the universe!). Incidentally, the formalism allows one to study the time-of-arrival problem in a better setting.

The semiclassical regime, where one can construct a local well-defined theory and classical clocks emerge, strongly correlating $t$ and $n$, has conservation of probability manifest even though unitarity in physical time $t$ is lost. This is due to the double-commutator structure of the term added to the usual evolution equation. 'Energy', namely the local Hamiltonian operator borrowed from $n$ in the semiclassical regime that serves as 'generator' in $t$ (it is not quite a generator, due to fundamental, unavoidable quantum fluctuations; that is why unitarity in $t$ is lost---most of the papers discuss this deeply), is also conserved, and furthermore entropy always increases or stays the same, obeying the 2nd law. This is the Lindblad-type structure that people have also been studying in environmental decoherence, where similar features arise. I don't know about Larsson's theorem; perhaps you can give me some references to it.
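To be explicit about the double-commutator structure (I am writing this from memory of the GPP papers, so take the precise coefficients with a grain of salt), the modified evolution is of the Lindblad type,

```latex
% Lindblad-type evolution with the Hamiltonian itself as the (self-adjoint)
% Lindblad operator; sigma(t) >= 0 encodes the fundamental decoherence rate.
\frac{\partial \rho}{\partial t}
  \;=\; -\,\frac{i}{\hbar}\,[H,\rho]\;-\;\sigma(t)\,[H,[H,\rho]]
```

Since the trace of any commutator vanishes, $\mathrm{Tr}\,\dot\rho = 0$, so total probability is conserved; by cyclicity of the trace, $\mathrm{Tr}(H[H,[H,\rho]]) = 0$ as well, so $\mathrm{Tr}(H\rho)$, the 'energy', is conserved too. The double-commutator term only damps off-diagonal elements in the energy basis, which is why entropy can only grow.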
Unitarity in $n$ just reflects the fact that canonical transformations in classical physics are mapped into unitary equivalence classes in QM.
Once the classical theory is understood, quantization follows standard canonical rules. Indeed, Dirac was the first to introduce these ideas, getting close to the path-integral formalism. In order to retain the symmetries of the classical theory one trades Dirac brackets for commutators, and one can think of the classical relationships between $n$ and $n+1$ as a generalized Heisenberg 'evolution' between operators. The map that moves you in $n$ happens to be unitary (in general it is an isometry, indeed, and that's all we need), which is a consequence of the formalism once the phase space is promoted to self-adjoint operators in a set of, so far equivalent, n-Hilbert spaces. Some constraints of the classical theory might still be present in the discrete counterpart, and a further classification of 'observables' would be needed. In those cases the self-adjointness condition for the first-class objects will restrict the physical Hilbert space, leading to structures like the Gauss law in LQG. That is why CD isn't a 'forgetting' of all the structure, and loops will still play an essential role. You can take a look at the original paper and the Maxwell case,

http://xxx.lanl.gov/abs/gr-qc/0205123

Cayetano Di Bartolo, Rodolfo Gambini, Jorge Pullin

or the examples in,

http://xxx.lanl.gov/abs/gr-qc/0405131

Cayetano Di Bartolo, Rodolfo Gambini, Rafael Porto, Jorge Pullin

You can also take a look at the first GP paper on relational time and the issue of consistency and unitarity,

http://xxx.lanl.gov/abs/gr-qc/0101057

Unitarity in the $n \to n+1$ map in CD is a consequence, a consistency check indeed, rather than something put in by hand. This is like Heisenberg's matrix theory, where self-adjoint 'time-dependent' operators give rise to a spectral decomposition which naturally accounts for unitary 'evolution' (this is studied in the GP paper). The $n$ motion is not the 'physical' one, and once conditional probabilities of partial observables (self-adjoint operators in the n-Hilbert spaces) are consistently introduced, 'real time' evolution shows up, in general in the form of history correlations, and in the semiclassical regime as the usual Schroedinger evolution plus a fundamental decoherence effect, due to first principles and the relational meaning of time as 'real clocks'.
CD has successfully opened the road to introducing a consistent relational description where conditional probabilities, the Page-Wootters ideas, can be implemented without the traditional problems due to the presence of constraints, which are gone in CD. Kuchar's paper on the problem of time in QG (which you can get from Jorge's website; the reference is in one of the GPP papers, I don't remember which) discusses these limitations in depth.
I strongly recommend papers where simple models are used to explain the ideas, and that happens in almost all GPP papers :)

see u around,

"I didn't do it, nobody saw me...you can't prove anything!"

Bart Simpson
 
  • #31
So, this is basically to reply to Marcus's request that I study a paper and report some nitty-gritty on consistent discretization, and the paper http://xxx.lanl.gov/abs/gr-qc/0205123 that edgar links to seems to be a good place to start. It's early (May 2002), and that would ordinarily put me off, but I don't think they have much modified their approach in applying it to different systems. In addition to the EM case edgar mentioned, they also do BF theory in this paper. Since BF theory was the "demo system" on which they showed their discretization approach in LQG, I thought that would be a good case to consider. So tomorrow morning I will print it off and get down to work on it. Any comments?
 
  • #32
selfAdjoint said:
So, this is basically to reply to Marcus's request that I study a paper and report some nitty-gritty on consistent discretization, and the paper http://xxx.lanl.gov/abs/gr-qc/0205123 that edgar links to seems to be a good place to start. It's early (May 2002), and that would ordinarily put me off, but I don't think they have much modified their approach in applying it to different systems. In addition to the EM case edgar mentioned, they also do BF theory in this paper. Since BF theory was the "demo system" on which they showed their discretization approach in LQG, I thought that would be a good case to consider. So tomorrow morning I will print it off and get down to work on it. Any comments?

I will try to follow suit. I have been comparing these papers and so far I think the paper about consistent discretizations that seems most accessible to me is what I believe was the first one:
http://arxiv.org/abs/gr-qc/0206055
Canonical quantization of general relativity in discrete space-times

I will print that one out and see if I can summarize or comment tomorrow.


I hope it's not a burden to be drawn in to help on this, sA---busy as you are with other responsibilities. Also, you have actually helped quite a lot already in getting this thread going:
just by our being here, there was a nucleus. then edgar1813 appeared, and he gave a pretty good introduction to the subject!
Don't feel any pressure of expectations on this, and proceed only as much as you feel inclined, out of personal interest!
 
Last edited:
  • #33
Thanks for the thoughts, Marcus. I'm doing a little business this morning, but I want to get to the paper this afternoon. My printer's old and cranky, so it'll take a long time to print it off (many paper jams - rollers are too smooth). Nevertheless this has become quite interesting to me and I want to get up to speed.

In bed last night I had this thought, perhaps due to my previous comment about lattice QCD. Suppose you did this discretization for QCD, and then had the freedom to pick any suitable observable for your clock. Wouldn't it pay you to pick an observable that maximized your transparent understanding on whatever particular phenomenon you were studying? For example nonperturbative chiral physics is hot in lattice QCD; couldn't you pick some clock that ticks chirally? I just barely know enough to ask the question, but it seems a no-brainer that if you had this freedom it would pay to exploit it.
 
  • #34
selfAdjoint said:
Wouldn't it pay you to pick an observable that maximized your transparent understanding on whatever particular phenomenon you were studying?

It looks that way to me also. I think that something approximating our familiar concept of time could be reconstructed using real measuring devices---in conjunction with the simplest forms of physical law (whether Newton's laws of motion or something in advanced particle physics). Call these real physical contrivances clocks---as soon after the big bang as macroscopic creatures like us could exist, they ought to be able to construct clocks and establish both time and physical law. I worry about whether this is too philosophical. Why shouldn't we consider ideal time fundamental? Have to get back to it after lunch.
 
  • #35
I am finding this (Gambini, CD---consistent discretizations) business not easy. I am discovering that I have to read the other paper too (the longer one gr-qc/0205123) along with the easier-looking one that I chose.
It will take time to get acquainted with what the Gambini-group does.
I was puzzling over what becomes of diffeomorphism invariance.

Some reassuring words were in gr-qc/0409057
middle of page 1: "Remarkably, the generator of diffeomorphisms of the continuum theory is preserved by discrete evolution..."

I believe, however, that this is just referring to spatial diffeomorphisms.
They elaborate at the top of page 3-----the "conserved quantity...is...the diffeomorphism constraint of the continuum theory."

I think I mentioned these quotes earlier. But the whole business of discretizing inspires me with caution and appears hazardous to invariance: diffeo or any other kind.

I am trying to read these papers on their own terms. It is, i reckon, possible that their recent (2002) proposal of CD really does break a kind of logjam in LQG and bypass the hamiltonian snag and solve the problem of time-evolution. It also seems to drag LQG more in the direction of numerical relativity. Nice if quantum gravity became more computery.
 
