Gambini & Pullin on measuring time in quantum physics

  1. May 13, 2009 #1
    So we had a thread about the FQXi essay contest a couple weeks back, and when I first saw the winners list this one jumped out at me, both because the inclusion of the word "undecidability" indicated the paper might actually touch on matters (i.e. formal logic) I feel qualified to comment on; and also because I am instinctively filled with uncontrollable rage whenever I see "free will" appear in the same sentence as the word "undecidability". I decided to give the paper a look and write it up for the thread, but since it took me a while to get around to this I'm just posting it in its own thread now. The paper turns out to be only a little bit about "free will" and "undecidability" at the end, and mostly about the question of how to meaningfully measure time in a quantum system. Here's what I got out of it:

    The paper starts by reviewing one of the old problems with reconciling General Relativity with quantum physics: General Relativity has no absolute time, no universal clocks, only relative distances between events in spacetime; but evolution in quantum physics is formulated specifically in terms of an absolute time variable, and if you try to reformulate the theory in relational terms you lose the ability to compute things.

    They focus on a specific 1983 proposal for relativizing quantum physics, which they call "Page–Wootters" (this sounds to me like a very respectable sports bar), where instead of thinking in terms of a universal t you pick some specific physical quantity which you define as your "clock"; you then formulate all other observables in terms of "how does this variable evolve as a function of the clock variable?". Put more accurately, you calculate the conditional probabilities of your observable having value A given the clock variable having value B. They explain this proposal fell apart because in a GR world there turn out not to be any measurable things which could suitably serve as the "clock quantity". They then claim to have now solved this problem by applying ideas from a 1991 Carlo Rovelli paper; apparently in this paper Rovelli introduced an idea of "evolving constants", which Gambini et al describe as a sort of artificial observable meant to behave like a time "parameter". What Gambini et al claim to have done is to find a way to set up calculations such that you start out defining events relative to Rovelli's artificial "evolving constants" quantities; but then in the end the "evolving constants" cancel out entirely, and you're left with only the conditional probabilities of one-event-happening-given-another-event that Page–Wootters was meant to have provided in the first place. They work out the technical details of this in a separate paper, and claim to have yet another paper in which they use principles like this to formulate practical quantum notions of the "clocks and rods" that GR depends on so heavily. Well, okay.
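    To make the Page–Wootters recipe slightly more concrete-- and this is my own schematic reconstruction, not notation taken from their paper-- the basic object is a conditional probability of the form

        P(A = a | T = τ) = Tr[ (P_a ⊗ P_τ) ρ ] / Tr[ (I ⊗ P_τ) ρ ]

    where ρ is the joint state of the system together with whatever physical quantity you've declared to be the "clock", P_τ projects the clock subsystem onto the reading τ, and P_a projects the system onto the outcome a. No external time parameter appears anywhere; "the clock reads τ" is itself just another measurement outcome you condition on.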

    Once they start calculating the dynamics of some quantum system relative to these quantum clocks and rods, various unusual things happen. For example, in normal quantum physics, non-unitary changes-- in other words, information loss-- occur only when a measurement is performed. But relative to their Wootters-ish clocks and rods, unitary evolution no longer occurs at all and information loss happens continuously. They seem to be suggesting that this can be viewed as analogous to the clock mechanism undergoing quantum decoherence, which (if I'm understanding them correctly) from the perspective of the clock mechanism looks like the rest of the universe losing information. This bit-- the idea of using progressive information inaccessibility to model quantum evolution in a way that "looks" nonunitary-- was extremely interesting to me, but unfortunately they don't dwell on it.
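    (For what it's worth, in the technical companion papers the evolution relative to a realistic clock is written-- if I'm transcribing the form correctly; treat this as my paraphrase rather than a quotation-- as a Lindblad-type equation in the clock reading T,

        ∂ρ/∂T = -i[H, ρ] - σ(T) [H, [H, ρ]]

    where σ(T) measures how fast the physical clock's own uncertainty spreads. The extra double-commutator term is what continuously damps the off-diagonal, interference-carrying entries of ρ in the energy basis, which is the "continuous information loss" I just described.)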

    Instead at this point the paper shifts gears, and they start talking about what their Wootters-ish construction teaches us about the philosophy-of-science issues behind decoherence. Because I am not entirely sure I understand decoherence, I am not sure I entirely understand this part of the paper either. Let's stop for a moment and see if I can get this right: As I understand it, "decoherence" is an interpretation of quantum mechanics (or a feature certain interpretations of quantum mechanics adopt) where "wavefunction collapse" is an actual physical phenomenon that emerges when unitary systems become deeply entangled with each other very quickly. As Roger Penrose puts it in "Road to Reality", traditional quantum physics looks at the world as having two operations, a "U" operation ("Unitary", reversible) and an "R" operation ("Reduction", irreversible)-- which, since reduction is exactly what decoherence is supposed to account for, I'll just call the "D" operation ("Decohere") here; when we choose an interpretation of quantum mechanics one of the things we're picking is what we choose to interpret the "D" operation as meaning (the wavefunction collapses, the universe splits, the pilot wave alters shape). If however we decide to take decoherence seriously, the distinction between the U and D operations goes away completely; instead the "D operation" is just a specific bunch of U operations strung together, such that the results can present the illusion of something like the "D operation" having occurred.

    So, getting back to the paper, Gambini et al claim that the decoherence picture makes a lot more sense when you look at it in combination with their Wootters-ish construction. Specifically they bring up what they say are two traditional major objections to the idea that decoherence is sufficient to explain the measurement problem, and argue that both of these objections can be circumvented using their construction.

    The first of these objections against decoherence is that if you look at the "D operation" as being constructed out of U operations, then this means the "D operation" is in fact reversible-- because it's just a chain of [reversible] U operations. This is bad because, as near as we can gather from looking at the real world, quantum measurement really does do something irreversible, something where information is lost in an irrecoverable way. This makes it seem like decoherence isn't the mysterious "D operation" after all. Gambini et al however point out that when you apply their Woottersy analysis, you can show that decoherence as an operation does in fact lose information, and so is irreversible and free of the risk of "revivals", relative to any given measuring device. In other words, they seem to have found a way to model quantum physics where the unitary picture that's supposed to be underlying quantum physics is everywhere preserved, but any given experiment will produce results as if state reduction occurs when a measurement is performed-- and all of this happens in a quantifiable way. That actually sounds really good-- if it actually works, it sounds like exactly what one would need to do in order to say one has solved the measurement problem.

    Depending, of course, on what exactly you consider "the measurement problem" to mean. This leads to the second objection against Decoherence the paper tries to rebut, which has to do with people focusing on the idea that a "measurement problem" solution should explain how it is we go from a quantum superposition of states to one single state. Decoherence analyses, again, tend to solve this by saying we don't go to one single state, we just enter a more complicated entanglement picture: whereas the Copenhagen interpretation would have the classical measuring apparatus imposing classicalness on a quantum system, the decoherence picture has the opposite happening, with a quantum system infecting an initially-classical measuring apparatus with quantumness. After this happens, the measuring apparatus is itself in a superposition of states-- such that each of those superimposed states individually sees the world as if the measured system were in a single state, but from the perspective of the ensemble the superposition never goes away. Not good enough!, goes the objection. Getting rid of superposition is the entire point!

    At this point the paper gets a bit more complicated and undergoes yet another gearshift, and this is where they start to lose me: they get to the "undecidability" promised in the title. Basically they reiterate that their Woottersy construction describes a picture of the world where relatively speaking, on a small scale, systems are collapsing to single classical states and information is being lost; but mathematically, on the large scale, everything remains static and reversible and superimposed. But then they point out that from within the universe, you could never tell which of these two pictures, the small scale one or the large scale one, is the true one-- that is, it would be in principle impossible for you to experimentally determine whether you're in a universe where reversible operations are stacking in a way that presents the local illusion of information loss, or in a universe where it's actually just objectively the case that irreversible operations and information loss are occurring. They say it is "undecidable" which of these two things is happening.

    "Undecidability" is a word from mathematical logic and I'm not totally sure if I recognize the sense in which they use it here. In mathematical logic we say a problem is "undecidable" by a particular logical system if there is no possible way to demonstrate the idea is true or false by following the consequences of the logical system. An equivalent idea to undecidability is "independence"-- we can say a statement is "independent" of a logical system, if the statement could be either true or false without it having any bearing on the validity of the system. This is the same as saying the statement is not decidable by the system. Gambini and Pullin are in this same sense saying that the behavior of the world is "independent" of the ultimate truth about whether quantum state reduction is an objective thing or an illusion; i.e. it is "undecidable" whether when two systems interact they both go into a single classical state (as the Copenhagen interpretation says) or both go into a superposition (as the decoherence picture says). Okay, I think I agree with that.

    But then they do something squirrelly. They seem to be suggesting that because either of these two things could possibly be happening, it's possible both could be happening-- that every time two systems interact, the universe gets to make a choice as to whether it's going to superimpose everything or collapse everything, and maybe it just freely toggles between the two. Why on earth would it do this? Their observation about the undecidability of the ultimate truth of the "D operation" looks to me like a fairly convincing argument that the ultimate truth of the D operation doesn't matter, and maybe we should find something more interesting to argue about. But instead they focus on this potential idea that, because quantum systems might be randomly toggling back and forth between superimpose and don't-superimpose and we'd never be able to notice the difference-- "this freedom in the system is not even ruled by a law of probabilities for the possible outcomes"-- something terribly interesting must be happening in whatever mechanism is [might be?] deciding how the toggling occurs. They say "the availability of this choice opens the possibility of the existence of free acts" and say this has bearing on the old argument about whether determinism in physical law precludes free will in humans, as if somehow humans got to have influence over the toggling and this is what "free will" means. I can't take this suggestion seriously. Even if we get past the question of by what conceivable mechanism the human brain could be influencing the outcome of this outside-the-accessible-universe decision, they're basically suggesting that "free will" comprises a set of decisions which-- as they have just specifically proven-- has literally no bearing whatsoever on anything that happens in the universe. That sounds like a really crappy sort of free will to have, as if Wal-Mart took over the world but then gave you a free choice of what color jumpsuit to wear in their industrial prisons.

    So they spend a good bit of space on this whole choice/will idea, but then when they finally get around to explaining what this undecidability stuff has to do with the objection they originally raised all this to address-- is decoherence really that useful as a way of explaining away the messiness of superposition if even after it happens all we have is an even messier, even more superimposed system?-- it turns out to not have a lot to do with the free will stuff at all. Instead they simply suggest that people might not mind so much that an "event" in a universe described by their Woottersy construction doesn't remove superposition from the system, so long as there is at least a specific, definable way in which the universe is different before and after the "event" occurs. They suggest you can provide this by defining the "event" as occurring at the exact moment that it becomes undecidable whether information loss has occurred or not. That sounds a lot more reasonable than the free will bit-- it's at least scientific-- but is "becoming undecidable" a quantifiable thing, something whose precise onset you could pinpoint if you were theoretically simulating the system? They don't give enough information for me to feel like I can answer that question.

    Anyway, my objections about the ending aside, overall this paper was very neat. Their whole argument at the end about free will and hypothetical coinflips outside the observable universe sounds like an unnecessary distraction from the much more interesting Page–Wootters-2.0 construction they describe in the first part of the paper, but it's easy to isolate and ignore that part of the argument if one wants. And anyway, I guess it would not be an FQXi paper if it didn't veer off into philosophy somewhere. I'd like to hear more about their method of quantifying the progression of decoherence and relative information loss, and I'd be curious whether anyone has heard anything about further work or knows whether they've been able to get any useful calculations out of their construction.
     
  3. May 18, 2009 #2

    marcus

    Science Advisor
    Gold Member
    Dearly Missed

    In my inexpert judgement this work is radical yet solid. Probably important. And has not yet been refuted. In fact there are a half-dozen papers by Gambini & Pullin that develop this idea, going back to 2004, as I recall.

    An amusing detail is that G&P took on a young co-author named Rafael Porto for several of these papers, and in between times Porto has been co-author with Jacques Distler. I do not watch Distler but it seems to me likely that he is aware of the G&P line of thinking. There may have been some reaction already, or we may eventually see one, and this will be interesting.

    What I think we should do is look at the main papers in this line that have appeared 2004-2008 and see who has CITED them. Also we should look at current conferences and see if Gambini has been invited to present these ideas.
    I should point out that Pullin was chairman of the quantum gravity session at the April APS meeting in Denver. The "April meeting" is the main annual American Physical Society event for theoretical physics.

    Personally, as I say, I believe the G&P work is important and solid. But I am always wanting to find real-world objective correlations against which to check my personal viewpoint. What I want to see are objective signs of recognition that the Gambini and Pullin ideas are being taken seriously.

    You shouldn't just base this on the reception of their FQXi essay. IMO that is, with all due respect, a warmed over rehash of their 2004-2008 work with some savory seasoning to liven it up. Prepared to take advantage of the FQXi contest podium, a target of opportunity. You have to look at the main papers published in the regular channels and judge on that basis. IMHO.
     
    Last edited: May 18, 2009
  4. May 18, 2009 #3

    nrqed

    Science Advisor
    Homework Helper
    Gold Member


    It bothers me that you put so much emphasis on who has worked with whom, where people have worked, who is citing whom. I could not put my finger on why, exactly, but I knew that it bothered me a great deal.

    Then I realized why it bothers me so much. It is completely against the fundamental principles of good science. Someone who has little experience in physics and who reads your post will get the feeling that we, scientists, pay more attention to the names of people, where they have worked, who has cited them, who they have collaborated with and so on, than to their actual work when deciding what is of merit. Good science should be the exact opposite! Ideally, we should completely ignore all those irrelevant things and focus on the work itself. We should judge a paper ONLY on its content, and not at all on any of these other facts. Ideally, we should not even name the authors of a paper and should discuss only the physics content.

    It's true that it is hard to be completely objective. Of course, we pay more attention to a new paper by, say, Witten than by someone publishing their first paper. But getting into an author's whole professional life is totally irrelevant and gives the wrong idea about what good science should be, in my humble opinion. That's sociology, not physics.
     
  5. May 18, 2009 #4

    marcus

    Science Advisor
    Gold Member
    Dearly Missed

    I have already spent a lot of time discussing the physics of the Gambini and Pullin papers. The work from a purely physics perspective is obviously important, since it addresses major problems in an original way (black hole info paradox, problem of time in QG, etc. etc.)

    We have had threads about this going back to 2004-2006. Obviously some people aren't aware of this. In particular it has been a longstanding interest of mine.

    However the sociological checks are important because physicists are obviously a bit like herd or flock animals. Work can be solid and original and yet it can be ignored if it does not catch the attention of the (largely conformist) mass of the community which tends (for various reasons) to follow fashion.

    So now it is time to check to see if this work is getting some recognition.

    What we need to do is check cites for the series of papers. It will take some work. I have been asked in Private Message to make an assessment like this, so I will go ahead and do it. Or make a stab at it. We'll see what comes out of it.
    http://arxiv.org/abs/0809.4235
    http://arxiv.org/cits/0809.4235
    Conditional probabilities with Dirac observables and the problem of time in quantum gravity
    Rodolfo Gambini, Rafael Porto, Sebastian Torterolo, Jorge Pullin
    Phys.Rev.D79:041501R,2009
    (Submitted on 24 Sep 2008)
    "We combine the "evolving constants" approach to the construction of observables in canonical quantum gravity with the Page--Wootters formulation of quantum mechanics with a relational time for generally covariant systems. This overcomes the objections levied by Kuchar against the latter formalism. The construction is formulated entirely in terms of Dirac observables, avoiding in all cases the physical observation of quantities that do not belong in the physical Hilbert space. We work out explicitly the example of the parameterized particle, including the calculation of the propagator. The resulting theory also predicts a fundamental mechanism of decoherence."
    4 pages

    A point that caught my attention back around 2004 was that Gambini, Porto, and Pullin had a resolution of the black hole info paradox, based simply on using a realistic clock and some clever theoretical limits on the lifetime and accuracy of a quantum clock, which made considerably better sense than the resolution offered, with lots of publicity, by Hawking at the same time. Hawking got the attention. G&P made sense. The contrast was striking.
    I will get to that; I am gradually working back. In case you don't realize it, nrqed, this involves very interesting physics :biggrin: so don't have a fit.

    Here's another one going back. If these things are not getting active attention, it sucks. I have not checked yet to see.

    http://arxiv.org/abs/0708.2935
    http://arxiv.org/cits/0708.2935
    Loss of entanglement in quantum mechanics due to the use of realistic measuring rods
    Rodolfo Gambini, Rafael A. Porto, Jorge Pullin
    Phys.Lett.A372:1213-1218,2008
    (Submitted on 21 Aug 2007)
    "We show that the use of real measuring rods in quantum mechanics places a fundamental gravitational limit to the level of entanglement that one can ultimately achieve in quantum systems. The result can be seen as a direct consequence of the fundamental gravitational limitations in the measurements of length and time in realistic physical systems. The effect may have implications for long distance teleportation and the measurement problem in quantum mechanics."
    6 pages

    http://arxiv.org/abs/gr-qc/0611148
    Fundamental spatiotemporal decoherence: a key to solving the conceptual problems of black holes, cosmology and quantum mechanics
    Rodolfo Gambini, Rafael Porto, Jorge Pullin
    6 pages, Honorable Mention GRF 2006, published version
    Int.J.Mod.Phys.D15:2181-2186,2006

    http://arxiv.org/abs/gr-qc/0603090
    http://arxiv.org/cits/gr-qc/0603090
    Fundamental decoherence from quantum gravity: a pedagogical review
    Rodolfo Gambini, Rafael Porto, Jorge Pullin
    9 pages, dedicated to Octavio Obregon on his 60th birthday
    Gen.Rel.Grav.39:1143-1156,2007

    http://arxiv.org/abs/hep-th/0406260
    http://arxiv.org/cits/hep-th/0406260
    Realistic clocks, universal decoherence and the black hole information paradox
    Rodolfo Gambini, Rafael Porto, Jorge Pullin
    3 pages
    Phys.Rev.Lett. 93 (2004) 240401

    http://arxiv.org/abs/hep-th/0405183
    http://arxiv.org/cits/hep-th/0405183
    No black hole information puzzle in a relational universe
    Rodolfo Gambini, Rafael Porto, Jorge Pullin
    4 pages
    Int.J.Mod.Phys. D13 (2004) 2315-2320

    http://arxiv.org/abs/gr-qc/0402118
    http://arxiv.org/cits/gr-qc/0402118
    A relational solution to the problem of time in quantum mechanics and quantum gravity induces a fundamental mechanism for quantum decoherence
    Rodolfo Gambini, Rafael Porto, Jorge Pullin
    13 pages
    New J.Phys. 6 (2004) 45
     
    Last edited: May 18, 2009
  6. May 18, 2009 #5
    Hi nrqed,

    I agree with you completely.
     
  7. May 18, 2009 #6

    marcus

    Science Advisor
    Gold Member
    Dearly Missed

    Some people may not immediately "get it" that we are talking about interesting and highly original physics here. As we discussed in threads some years back, G&P found a clever bound on the duration/precision of a clock, which comes from the fact that the clock will form a black hole if you try to push its durability and accuracy past this limit.
    In finding this bound they were inspired by an earlier paper of Wigner.

    This bound on physically realizable clocks causes a slow inevitable loss of unitarity, if one uses a realistic clock. They calculate this, and find it gives a resolution of the black hole information paradox, among other things.
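    For reference, the bound in question (which goes back to Salecker and Wigner and was sharpened by Ng and van Dam, if I remember the lineage right) says roughly that a clock required to keep running over a total time T_max cannot resolve intervals better than

        δT ≳ T_Planck^(2/3) T_max^(1/3)

    because pushing the accuracy and the lifetime together eventually requires concentrating so much mass in so small a region that the clock collapses into a black hole. Feed a clock with that irreducible fuzziness into ordinary Schroedinger evolution and you get the slow universal damping of interference terms that G&P calculate. I am quoting the scaling from memory-- the exact form is in the 2004 PRL listed earlier.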

    Now I happen to have communicated several times in the past with Rafael Porto, and so I stay marginally aware of his research interests besides this one involving decoherence. So I am going to toss out these links. They are not supposed to prove anything; they are supposed to round out the picture. I find the Distler connection amusing. I expect that Distler's hep-ph/0604255 actually draws some on the Gambini-Pullin work and may even cite them. Have to check this.

    http://arxiv.org/abs/0712.0448
    The Private Higgs
    Rafael A. Porto, A. Zee
    8 pages. Version published in Phys. Lett. B
    Phys.Lett.B666:491-495,2008

    http://arxiv.org/abs/hep-ph/0604255
    Falsifying Models of New Physics Via WW Scattering
    Jacques Distler, Benjamin Grinstein, Rafael A. Porto, Ira Z. Rothstein
    4 pages, 2 figures
    Phys.Rev.Lett.98:041601,2007

    Yes, in fact. The Distler paper cites this:
    http://arxiv.org/abs/gr-qc/0402118
    A relational solution to the problem of time in quantum mechanics and quantum gravity induces a fundamental mechanism for quantum decoherence
    Rodolfo Gambini, Rafael Porto, Jorge Pullin
    13 pages
    New J.Phys. 6 (2004) 45
    If you remember what the Distler paper was about, it naturally would cite that. The issue of unitarity and departures from it was paramount in the Distler paper. That was the one where Distler was so anxious to claim falsifiability.
     
    Last edited: May 18, 2009
  8. May 18, 2009 #7

    marcus

    Science Advisor
    Gold Member
    Dearly Missed

    OK, so I listed some of the papers that are context for the one Coin mentioned. And in particular I see that this one 0402118 has gotten 34 cites.

    In particular it has been cited by Steve Giddings (twice) and Don Marolf and Abhay Ashtekar and also by Max Tegmark and in another instance by Neil Cornish (he's a prominent cosmologist). And of course also by Jacques Distler.

    Now if someone is not very alert they might think that I am using a sociological fact to prove Gambini and Pullin are good. The opposite is the case.

    I've seen people draw that not-terribly-perceptive conclusion before.
    I already know Gambini and Pullin's work is interesting and original physics. I have explained that. I know they are good for physics reasons.

    The fact that they got cited by Giddings and by Marolf (Santa Barbara KITP) doesn't show G&P are good. We already know that. It shows the extent to which the broader physics community is smart and awake. It is a hopeful sign, in other words. Giddings and Distler are considered to be string theorists (though lately Giddings has shifted research focus some.)

    Max Tegmark (MIT) directs the FQXi which had that recent essay contest where I think G&P got the second juried prize or some such thing.

    So these are hopeful signs that the theoretical physics community is waking up to G&P's gambit. And it coincides with Coin's hunch that it was interesting. He spotted a recent paper of theirs about these themes and thought it was interesting enough to start a thread about.
    It is.

    I just checked cites on another. More recent:
    http://arxiv.org/abs/0708.2935
    http://arxiv.org/cits/0708.2935
    Loss of entanglement in quantum mechanics due to the use of realistic measuring rods
    Rodolfo Gambini, Rafael A. Porto, Jorge Pullin
    Phys.Lett.A372:1213-1218,2008
    (Submitted on 21 Aug 2007)
    "We show that the use of real measuring rods in quantum mechanics places a fundamental gravitational limit to the level of entanglement that one can ultimately achieve in quantum systems. The result can be seen as a direct consequence of the fundamental gravitational limitations in the measurements of length and time in realistic physical systems. The effect may have implications for long distance teleportation and the measurement problem in quantum mechanics."
    6 pages

    This (as well as the other one) has been cited in a paper by Don Marolf, and also in a paper by Neil Cornish. Marolf and Cornish are repeat customers.
    It hasn't gotten very many cites as yet (only published last year) but it got recognition from some high quality people. So that's how it goes. Gambini and Pullin have a highly original idea and it has been in obscurity much of the time since 2004, but maybe the community is beginning to wake up to it.

    Again, let's be clear: community awareness doesn't mean their idea is right. It may eventually be refuted! Popularity does not imply validity. The string theory example should be sufficient to disabuse one of that notion! :biggrin:
     
    Last edited: May 18, 2009
  9. May 18, 2009 #8
    Hi guys, thanks for the responses.

    I think sociological data is interesting and valuable as a thing unto itself. It is important not to let it supplant physics content; however, I don't think it's a good idea to ignore the sociological data either. I look at things this way because, frankly, I do not know enough about physics to feel like I can accurately judge by myself whether ideas are solid. If however [sociologically speaking] there is a lot of activity around an idea, that tells me the idea has been tested in the sense that a lot of people are thinking about it. This goes especially for something like the vaguely revolutionary kind of ideas being put forward by Gambini et al here. If these ideas had been around a while and had had the chance to be evaluated, but there weren't examples of people citing them or developing them further or collaborating on them, that would be a hint to me that maybe there is some kind of obvious problem (scientific or practical) that I simply lack the background to see. So I find the links and background Marcus has provided here extremely useful because they seem to me to say that the Gambini et al ideas pass some sort of minimal "smell test".

    Aside from this I think understanding a physicist's history-- what they've worked on, who they've collaborated with-- is useful because it tells you a lot about the researcher's mindset. If you understand a writer's mindset you're more likely to understand what it is they're trying to communicate.

    That out of the way, nrqed, you say you'd like to discuss the physics content-- do you have any remarks on the physics content yourself? I'd be curious to hear your thoughts.
     
  10. May 18, 2009 #9

    Demystifier

    Science Advisor

    This is wrong. Decoherence is not an interpretation, but an experimental fact. It is NOT wavefunction collapse. For more details I recommend
    http://xxx.lanl.gov/abs/quant-ph/0312059
     
  11. May 18, 2009 #10

    Fra


    I'm not sure if I have read that paper; you caught my interest, so I'll try to skim it later.

    I associate this directly to the notion of establishing whether a particular "probability estimate" is true or false. IMO, this is clearly not physically possible. In reality the entire world has changed once you have acquired enough statistics to make an assessment. The exceptions are cases where the context is massive compared to the system under study, as is often the case in particle experiments.

    As I see it, it is not necessary to even talk about stuff like "predicting a probability", because the only function of the probability is that it constitutes the basis for the observer's actions. Once the action has been triggered, assessing the correctness of its past basis seems moot.

    This is why I think decidability in that loose physics sense only makes sense in an evolving context. Without evolution and change, decidability seems ambiguous. IF you picture the observer's reasoning as a logical system in a fuzzy sense, then somehow the "consequences of the logical system" ARE the observer's actions, and in that way the feedback to the observer from the environment is somehow the decision. If this is consistent with the observer's prior state, no correction is needed; if not, a correction is needed. I don't see that we need to make up imaginary probability ensembles. They aren't needed IMO.

    If you look at biological systems, it makes little sense to ponder whether a particular "rat" makes correct decisions. What matters is that the rat is apparently a very fit creature that has managed to survive and evolve for a long time. So responding to your environment's reactions to your possibly imperfect actions is the important trait, rather than always making the correct action, because the measure of this "correctness" cannot be constructed.

    Once you accept the idea that the ability to learn and adapt is more important than "being right" in some ambiguous sense, it also puts decidability in a new light.

    I think this is an interesting question.

    Suppose I make a prediction, and this prediction is the basis of my actions; then once I get feedback on this action from my environment, it's an outdated question to try to RE-assess my prior prediction in the light of new data to see if I was right or wrong. Regardless of whether I was right or wrong, the current question is how I can optimally merge the feedback with my prior state of information.

    /Fredrik
     
  12. May 19, 2009 #11

    Fra


    I didn't get time to read the entire paper last night, but I started skimming, and Gambini says on pages 7-8:

    " One of them is the “regularity theory”, many times attributed to Hume
    [23]; in it, the laws of physics are statements about uniformities or regularities of the world and therefore are just “convenient descriptions” of the world. Ernest Nagel in The Structure of Science [24] describes this position in the following terms: “Hume proposed an analysis of causal statements in terms of constant conjunctions and de facto uniformities.. —according to Hume [physical laws consist] in certain habits of expectation that have been developed as a consequence of the uniform but de facto conjunctions of [properties].” The laws of physics are dictated by our experience of a preexisting world and are a representation of our ability to describe the world but they do not exhaust the content of the physical world.
    A second point of view sometimes taken is the “necessitarian theory” [22], which states that
    laws of nature are “principles” which govern the natural phenomena, that is, the world “necessarilyobeys” the laws of nature. The laws are the cornerstone of the physical world and nothing exists without a law. The presence of the undecidability we point out suggests strongly that the “regularity theory” point of view is more satisfactory since the laws do not dictate entirely the behavior of nature."

    I like this. However I am not sure how this view is distinguished from Rovelli's in its application. I also partially agree with Rovelli's reasoning. I guess it depends on how he applies this.

    As I interpret what he says above-- and this is close to my personal view as well-- he tries to acknowledge that knowledge of "physical law" requires information, and information acquisition, just like a "physical state" does.

    Here I associate to Smolin's idea of evolving law as well. Undecidability, then, could possibly be interpreted as a particular view of physical law. This is exactly the perspective I like.

    But the question is how you go on from here. My personal expectation is that we will end up with evolving law, and no, there are not "meta laws" that govern this evolution. Rather, I think there is a hierarchy of laws, which can be seen to have originated from a condition of no represented laws.

    This origin problem becomes similar to the origin of mass. How can laws have evolved, starting from no laws?

    So in that sense, life is really like an unknown game, but without set rules. Finding out the effective rules IS part of the game itself. Once you get skilled you can even "learn" how to INFLUENCE the rules. But at no point can one make decisions about whether something is true or false. You can only place your bets, and move on from whatever feedback you get.

    I think this is why the logic we might need could be somewhat different.

    I'll read the other half of the paper tonight.

    /Fredrik
     
  13. May 19, 2009 #12

    Fra


    Regarding the physical interpretation of decidability.

    In other words, as I see it, the "logical implications" whereby decisions are usually made in logic are in physics a _physical process_, and the process of deciding something is indistinguishable from the ordinary (time) evolution, because the only means of "verification" is something like "place your bets, and revise your actions based on the feedback".

    So the decidability problems seem to be constrained to processes. Unlike normal logic, where the set of statements and implications exists mathematically, the physical implementation of this is more like "computations", which introduces time, but with the complication that there exists no universal computer hardware; instead the observer is his own hardware, so the "computations" might be constrained to the observer's own microstructure.

    The thing that makes me doubt, and which reminds me of Rovelli, is this statement:

    "It is not too surprising that in the resulting picture one does not have a unitary evolution: although the underlying theory is unitary, our clocks and rods are not accurate enough to give a depiction of evolution that appears unitary."
    -- page 3-4 in http://fqxi.org/data/essay-contest-files/Gambini_essay.pdf

    I suspect that the "underlying theory" here does represent a sort of structural realist view, or bird's view. I don't like that.

    This way of recovering unitarity in a more fundamental picture must IMO still be subject to the same constraints. A conclusion is thus that such a theory is unreachable, unless you envision it in some form of mathematical reality like Tegmark's. I fail to see the real utility of such a picture. It seems more to give intellectual comfort and an illusion of something fundamental to fall back on, although it's imaginary.

    /Fredrik
     
  14. May 20, 2009 #13

    Fra


    I got around to reading the last section last night, and Gambini's reasoning is quite interesting. I can connect to what he says, but whether the brief paper allows me to infer the best interpretation of his reasoning I don't know.

    About the free will thing: I agree that whenever that appears in a paper I am sceptical, because it's so commonly used in various crackpot stuff, but what Gambini says makes sense to me.

    Gambini has a reference to another paper in reference [8] which supposedly contains details, but given that I haven't read that, this is how I interpret it.

    To me the first observation here is that Gambini seems to partly acknowledge the limits of what I first called his "external bird's view". In this view, there is always an uncertainty whether a superposition held by a measurement device or another observer has collapsed or not. But from my perspective he is mixing the views here, which is confusing. He seems to partly resolve the confusion by noting that from the "external" point of view, it's undecidable which is the case. Then from my perspective, in which there is no "external" view, this external view can be nothing but a third observer, and in that case the conclusion is that this third observer has a "choice": to think that the imagined superposition another observer has relative to a subsystem is still intact, or not. Then this third observer bases his actions upon this. But in actuality, he would then act "as if both happens". This makes pretty good sense to me.

    I wouldn't call this "free will" but I can understand the association. The free will is then simply the player's freedom to "place his bets", and he has no choice but to face the consequences of these choices. The consequences are the environment's backreaction, which is also indeterministic.

    But I think of the observer's betting process more as a kind of unpredictable random process. The observer has an evolving die, and his only choice is to throw it or not to throw it. Both choices are risky, so there are no safe strategies.

    Maybe I should try to locate that reference [8]. Gambini seems to mix the external view here with constraints on it. I think that eventually the constraints on this external view will prove to simply be those of an internal view. Then I think I would like it even more :)

    /Fredrik
     
  15. May 27, 2009 #14

    marcus

    Science Advisor
    Gold Member
    Dearly Missed

    They have a new paper. John86 flagged it and added it to the QG bibliography thread:

    John didn't highlight the last sentence of their abstract, but I guess I will:

    It can therefore be argued that the emerging picture provides a complete resolution to the measurement problem in quantum mechanics.
     
    Last edited: May 27, 2009
  16. May 28, 2009 #15
    Here is the next paper by Gambini and Pullin.

    http://arxiv.org/abs/0905.4402
    The Montevideo interpretation of quantum mechanics: frequently asked questions
    Authors: Rodolfo Gambini, Jorge Pullin
    (Submitted on 27 May 2009)

    Abstract: In a series of recent papers we have introduced a new interpretation of quantum mechanics, which for brevity we will call the Montevideo interpretation. In it, the quantum to classical transition is achieved via a phenomenon called "undecidability" which stems from environmental decoherence supplemented with a fundamental mechanism of loss of coherence due to gravity. Due to the fact that the interpretation grew from several results that are dispersed in the literature, we put together this straightforward-to-read article addressing some of the main points that may confuse readers.


    A question for Marcus or Fra about the emergent properties of spacetime and the CC problem. I try to visualize this time and again. If you envision the emergent properties of spacetime as a macroscopical thing, is it then wrong to think about the cosmological constant as the underlying substratum, and the universe as a macroscopical bubble emergent from that substratum?
     
  17. May 29, 2009 #16

    Fra


    Thanks Marcus and John for reporting more papers from them! I have been unusually unfocused lately due to various car issues, and I haven't been able to read any new papers for a week.

    About the cosmological constant, I personally don't think of it as a "substratum" in the realist sense. So far I've associated a non-zero cosmological constant to the finite information constraint of the observer - a finite observer cannot establish a certain zero measure, and I think this residual uncertainty or undecidability is somehow related to the cosmological constant. The effective measured cosmological constant is thus somehow relative to the observer, but that's not quite consistent with how we normally see this. But then again, if GR is emergent then an at least locally objective cosmological constant might appear.

    But exactly how this fits in the large picture of emergent dimensionality and spacetime is not yet clear to me. To think of it as a material substratum sitting in space is too realist-inclined for my taste. I think it's linked to how it's measured, and thus unavoidably adds the complexity of the observer.

    I will see if I get time to read those papers this weekend.

    /Fredrik
     
  18. May 29, 2009 #17

    Fra


    "We argue that it is fundamentally impossible to recover information about quantum superpositions when a system has interacted with a sufficiently large number of degrees of freedom of the environment. This is due to the fact that gravity imposes fundamental limitations on how accurate measurements can be."
    -- http://arxiv.org/abs/0905.4222

    This SOUNDS very much to my liking. I can HOPE that their "fundamental limitations on how accurate measurements can be" can relate to what I have called "intrinsic construction of measures". The "fundamental limit" is simply the constraint imposed by the "inside view" of a finite observer. At least that's how I think of it; it remains to be seen whether this is compatible with their reasoning.

    I definitely will read this later.

    From my point of view, given this angle, the next question is: WHAT actions does a system take, given this undecidability of the "optimal action"? This is where, in my thinking, evolving observers and evolving law come in.

    My first impression was that they had thought of another escape, but it remains to be seen.

    /Fredrik
     
  19. May 29, 2009 #18
    I wanted to say that there is a popular misconception about the word 'undecidable'; even Max Tegmark was a victim of that misconception.

    Well-defined, simple, and effectively computable 'laws' of physics in some Universe can make that particular Universe 'undecidable'.

    As an example, imagine a Universe where only Turing machines exist, and nothing else. Time in that universe is integer (step number).

    A Turing machine is DETERMINISTIC and quite primitive. However, even that simple world is undecidable, because there are some undecidable statements (for example, whether a Turing machine ever stops given a particular input).

    However, all these undecidable statements have EXISTS or FOR-ALL quantifiers at the beginning of the formula (EXISTS a step N such that blah-blah); the transition from step N to step N+1, on the other hand, is always well defined and even effectively computable.

    Decidable and effectively computable laws of physics can lead to a globally undecidable universe, and I always wonder what type of undecidability ('local' or 'global') physicists are talking about.

    P.S.
    Another shocking example: the universe of the well-known Conway Game of Life is undecidable.
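    To make the point concrete, here is a toy sketch in Python (my own illustration, nothing to do with the Gambini-Pullin papers): the local transition rule of the Game of Life is trivially computable step by step, yet the question quantified over ALL future steps-- "does this pattern ever die out completely?"-- has no general decision procedure, because the Game of Life can emulate Turing machines and such a procedure would solve the halting problem.

    Code:
    from itertools import product

    def neighbors(cell):
        x, y = cell
        return {(x + dx, y + dy)
                for dx, dy in product((-1, 0, 1), repeat=2)
                if (dx, dy) != (0, 0)}

    def step(live):
        # One tick of the Game of Life; 'live' is the set of live (x, y) cells.
        # The local transition S(t) -> S(t+1) is always effectively computable.
        candidates = live | {n for c in live for n in neighbors(c)}
        return {c for c in candidates
                if len(neighbors(c) & live) == 3
                or (len(neighbors(c) & live) == 2 and c in live)}

    def dies_within(live, n_steps):
        # The BOUNDED question "does it die out within n_steps?" is decidable
        # by brute force...
        for _ in range(n_steps):
            if not live:
                return True
            live = step(live)
        return not live

    # ...but no single algorithm can answer the UNBOUNDED question
    # "does it EVER die out?" for every starting pattern.
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    print(dies_within(glider, 100))   # False -- a glider just keeps gliding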
     
    Last edited: May 29, 2009
  20. May 29, 2009 #19

    marcus

    Science Advisor
    Gold Member
    Dearly Missed

    The word "undecidable" obviously has a meaning outside the context of mathematical logic. And the word is used when something cannot be decided. Just like "indistinguishable" is used when two things cannot be distinguished.

    I know of only one popular misconception about the word "undecidable", which is the misconception that it must always mean formally undecidable, as in the title of Gödel's book:
    http://www.amazon.com/Formally-Undecidable-Propositions-Principia-Mathematica/dp/0486669807/
    That is, there are people who think the word always has the technical meaning of "formally undecidable" or "recursively undecidable" and so can only be used in the context of mathematical logic, as with Turing machines, sets of axioms, etc etc.

    A naive person might think that whenever the word is used, there must be some narrow technical context of the sort he is familiar with. That is a misconception that I have seen occur. What other "popular misconception" about it is there?

    I didn't understand the following statement, Dima:

    What is the misconception you are talking about?

    Could you give an example of it?

    In particular, about Max Tegmark, would you please point to some instance where Tegmark "was a victim of" the misconception that you have in mind?

    So far I am completely mystified by your post.
     
    Last edited: May 29, 2009
  21. May 29, 2009 #20
    Yes, here: http://arxiv.org/PS_cache/arxiv/pdf/0704/0704.0646v2.pdf
    Page 22 and his CUH:

    He does not define any sub-levels of #3, while obviously there are multiple sublevels:

    For simplicity, let's talk about a 'lattice universe' with integer time and a state S(t) for each integer t. There are 4 options:

    1. Any statement about S1 and S2 is decidable.
    2. For any given initial state S(t) we can effectively calculate the state S(t+1). In other words, we can effectively emulate the Universe.
    3. For any given pair of states S1 and S2 we can effectively calculate whether S2 can be the NEXT state (for t+1) of S1. In other words, we can effectively verify at every step whether the laws of our Universe are violated or not. Note that #3 is weaker than #2.
    4. We can't do #3 - a non-computable Universe.

    When Max Tegmark talks about computability, he does not clarify if he is talking about #1, #2 or #3 (because he sees no difference). But #1 is too strong - it is violated even in very simple systems like the Game of Life, so we can't hope that #1 is true. Probably he means #3, but I am not sure.
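    To illustrate the difference between #2 and #3 with a toy example (my own sketch -- here the 'lattice universe' is just a finite ring of bits evolving by cellular-automaton rule 110, standing in for any effectively computable law):

    Code:
    # Neighborhoods (left, center, right) that produce a live cell under rule 110.
    RULE_110_ALIVE = {(1, 1, 0), (1, 0, 1), (0, 1, 1), (0, 1, 0), (0, 0, 1)}

    def next_state(S):
        # Option #2: effectively CALCULATE the successor S(t+1) from S(t).
        n = len(S)
        return tuple(1 if (S[(i - 1) % n], S[i], S[(i + 1) % n]) in RULE_110_ALIVE
                     else 0
                     for i in range(n))

    def is_legal_transition(S1, S2):
        # Option #3: effectively VERIFY that S2 is a lawful successor of S1.
        # Having #2 automatically gives #3, which is why #3 is the weaker demand.
        return next_state(S1) == S2

    # Option #1 -- deciding ARBITRARY statements about states, e.g. "is S2 ever
    # reachable from S1?" -- is another matter: on this finite toy ring it is
    # merely expensive, but on an unbounded lattice (or in the Game of Life)
    # such questions become undecidable even though next_state stays computable.
    S0 = (0, 0, 0, 1, 0, 0, 0, 0)
    print(is_legal_transition(S0, next_state(S0)))   # True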
     