|May29-09, 05:49 AM||#18|
Gambini & Pullin on measuring time in quantum physics
I wanted to say that there is a popular misconception about the word 'undecidable'; even Max Tegmark was a victim of that misconception.
Well-defined, simple and effectively computable 'laws' of physics in some Universe can make that particular Universe 'undecidable'.
As an example, imagine a Universe where only Turing machines exist, and nothing else. Time in that universe is an integer (the step number).
A Turing machine is DETERMINISTIC and quite primitive. However, even that simple world is undecidable, because there are some undecidable statements (for example, whether a Turing machine will ever halt on a particular input).
Note, though, that all these undecidable statements have EXISTS or FOR ALL at the beginning of the formula (EXISTS a step N such that blah-blah), while the transition from step N to step N+1 is always well defined and even effectively computable.
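To make that concrete, here is a minimal sketch in Python (the transition table is a made-up toy example, not any particular machine): the single step of the 'universe' is trivially computable, while the global question of whether the machine ever halts is the classic undecidable one.

```python
# Minimal Turing-machine 'universe': one unit of time = one call to step().
# The transition table below is a made-up toy example, purely for illustration.

def step(state, tape, head, table):
    """State at t+1 from state at t -- always well defined and effectively computable."""
    symbol = tape.get(head, 0)                        # blank cells read as 0
    new_state, new_symbol, move = table[(state, symbol)]
    tape[head] = new_symbol
    return new_state, tape, head + (1 if move == 'R' else -1)

table = {('A', 0): ('B', 1, 'R'),
         ('A', 1): ('HALT', 1, 'R'),
         ('B', 0): ('A', 1, 'L'),
         ('B', 1): ('B', 0, 'R')}

state, tape, head = 'A', {}, 0
for t in range(100):                                  # emulate the universe step by step
    if state == 'HALT':
        break
    state, tape, head = step(state, tape, head, table)

# Running forward is easy; but the statement "EXISTS a step N at which state == 'HALT'",
# asked for arbitrary tables and inputs, is exactly the undecidable halting problem.
```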
Decidable and effectively computable laws of physics can lead to a globally undecidable universe, and I always wonder which type of undecidability ('local' or 'global') physicists are talking about.
Another striking example: the universe of Conway's well-known Game of Life is undecidable.
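The same sketch works for the Game of Life, and it makes the point even more vividly: the local law fits in a few lines of Python, yet a question like "does this pattern ever die out completely?" is undecidable in general.

```python
# One tick of Conway's Game of Life on a set of live cells -- effectively computable.
from itertools import product

def neighbours(cell):
    x, y = cell
    return {(x + dx, y + dy) for dx, dy in product((-1, 0, 1), repeat=2) if (dx, dy) != (0, 0)}

def life_step(live):
    candidates = live | {n for c in live for n in neighbours(c)}
    return {c for c in candidates
            if len(neighbours(c) & live) == 3
            or (c in live and len(neighbours(c) & live) == 2)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for t in range(8):
    glider = life_step(glider)           # stepping the universe forward is easy...

# ...but "EXISTS a step N at which the pattern is empty", asked for an arbitrary initial
# pattern, cannot be decided by any algorithm.
```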
|May29-09, 10:24 AM||#19|
The word "undecidable" obviously has a meaning outside the context of mathematical logic. And the word is used when something cannot be decided. Just like "indistinguishable" is used when two things cannot be distinguished.
I know of only one popular misconception about the word "undecidable", which is the misconception that it must always mean formally undecidable, as in the title of Gödel's book:
That is, there are people who think the word always has the technical meaning of "formally undecidable" or "recursively undecidable" and so can only be used in the context of mathematical logic, as with Turing machines, sets of axioms, etc etc.
A naive person might think that whenever the word is used, there must be some narrow technical context of the sort he is familiar with. That is a misconception that I have seen occur. What other "popular misconception" about it is there?
I didn't understand the following statement, Dima:
Could you give an example of it?
In particular, about Max Tegmark, would you please point to some instance where Tegmark "was a victim of" the misconception that you have in mind?
So far I am completely mystified by your post.
|May29-09, 03:00 PM||#20|
Page 22 and his CUH:
For simplicity, let's talk about a 'lattice universe' with integer time and a sequence of states S(t), where t is an integer. There are 4 options:
1. Any statement about S1 and S2 is decidable.
2. For any given initial state S(t) we can effectively calculate the state S(t+1). In other words, we can effectively emulate the Universe.
3. For any given pair of states S1 and S2 we can effectively decide whether S2 can be the NEXT state (at t+1) of S1. In other words, we can effectively verify whether the laws of our Universe are violated or not at every step. Note that #3 is weaker than #2.
4. We can't do #3 - a non-computable Universe.
When Max Tegmark talks about computability, he does not clarify whether he is talking about #1, #2 or #3 (because he sees no difference). But #1 is too strong - it is violated even in very simple systems like the Game of Life, so we can't hope that #1 is true. Probably he means #3, but I am not sure.
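To illustrate the difference between #2 and #3, here is a minimal sketch (the 'law' is an elementary cellular-automaton rule, picked purely as a toy example): option #2 gives you a successor function, option #3 only a checker. A checker can always be built from a successor function, but not necessarily the other way around.

```python
# Toy 'lattice universe': a state is a tuple of bits, the 'law' is elementary
# cellular-automaton rule 110 (chosen only as an illustration).

def next_state(s):                        # option #2: effectively emulate the Universe
    rule110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
               (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}
    n = len(s)
    return tuple(rule110[(s[(i-1) % n], s[i], s[(i+1) % n])] for i in range(n))

def is_valid_transition(s1, s2):          # option #3: verify that no law is violated
    return next_state(s1) == s2           # here #3 follows from #2; the converse need not hold

s = (0, 0, 0, 1, 0, 0, 0)
print(is_valid_transition(s, next_state(s)))   # True: this step obeys the 'laws'
```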
|May29-09, 03:44 PM||#21|
" I hypothesize that only computable and decidable (in Gödel's sense) structures exist"
Notice that he specifies a technical meaning "in Gödel's sense", because the word also can be used informally outside math logic.
So Tegmark was using "decidable" in the technical sense of mathematical logic, and may have blundered! That is interesting!
Maybe you should start a thread pointing this out. People interested in the formal mathematics of computation might want to discuss it, and give you feedback.
However, I don't think it has anything to do with Gambini and Pullin.
For them there is no formal logic context. Something is undecidable if you can't decide it.
They could as well have said "indistinguishable".
They are interested in where the quantum description comes to a point where you cannot physically distinguish between wavefunction collapse and the natural loss of information that accompanies realistic measurement (real-world quantum clocks and rods).
For them it is an important junction when you finally cannot decide by physical means whether the wavefunction has collapsed (because of some measurement interaction with environment) or instead information has worn out by the simple passage of time. Dima, you may see things differently. Please let me know if you think I have misunderstood their idea.
Let me quote from their abstract, to emphasize what I'm trying to say:
"We argue that it is fundamentally impossible to recover information about quantum superpositions when a system has interacted with a sufficiently large number of degrees of freedom of the environment. This is due to the fact that gravity imposes fundamental limitations on how accurate measurements can be. This leads to the notion of undecidability: there is no way to tell, due to fundamental limitations, if a quantum system evolved unitarily or suffered wavefunction collapse."
What they are talking about are fundamental physical limitations on the accuracy of measurements. For instance the fact that if you try to make a real clock too accurate and reliable it will turn into a black hole and you lose it. And there is, in the universe, no clock except for real clocks. So information inevitably decays---perfect unitarity is meaningless, one cannot operationally define it. And they have useful estimates of the unavoidable rate of decay---the rate at which the universe ineluctably forgets its past.
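For concreteness, the heuristic estimate usually quoted in this context (a Salecker-Wigner/Ng-van Dam type argument, which is my understanding of the kind of bound Gambini and Pullin build on) is that a real clock which must run for a total time $T$ without collapsing into a black hole cannot resolve time intervals finer than roughly

$$\delta T \gtrsim t_{\mathrm{Planck}}^{2/3}\, T^{1/3},$$

and it is this unavoidable $\delta T$ that sets their estimate of the rate at which relative-phase information is irretrievably lost.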
|May30-09, 12:37 PM||#22|
John86 and Marcus, thanks for the links! This is exciting, this looks like the sort of coherent development of their ideas that was missing from the 30,000-feet-view FQXi paper.
I haven't fully digested all this yet, but I find this detail particularly interesting from John86's initial quote:
An unrelated note: the clearer presentation in these new papers is making me curious exactly how their "interpretation" fits in with the many-worlds interpretation. (Their FAQ actually brings up many worlds, and says that they can't see any reason why they would be incompatible with many worlds, but notes that once you've accepted their interpretation the philosophy-of-science motivation for accepting MWI becomes weaker.) It seems like in their interpretation information in the wavefunction is never "lost" except from the perspective of specific observables (clocks, rods etc), and nothing ever "becomes classical" except from the perspective of those observables; that is, their interpretation seems to be all about how individual objects can experience the illusion of classical physics. So if I understand this right, what happens if you choose to work with the "Montevideo interpretation" and the many-worlds interpretation simultaneously? Is the idea that all "branches" of a quantum decision still exist within the wavefunction, but different observables will wind up stuck in a single "world" and only able to access the information in the universe's wavefunction from the perspective of that "world"?
|Jun1-09, 12:25 AM||#23|
I still haven't had time to read the papers yet, but I'd like to hang onto this discussion and get back to it later.
Coin, I understand your association with Penrose's gravity-induced collapse. I too think there is a connection between the foundations of QM and gravity, and I also didn't find Penrose's arguments sufficiently clear or convincing. He seems to want to explain quantum decoherence by assuming gravity.
While I agree that there must be a general connection, I see it the other way. I think that, in the general information view, this "information inaccessibility" or undecidability, which follows from (as I think, at least) a constrained physical inside view, somehow IS gravity. This way there might also be an interesting informational perspective on the constants of gravity, including the cosmological constant.
So we could even predict gravity, from a proper informational view. I think this is somehow perpendicular to what Penrose seems to think, but I also have a feeling that the final result could be close to his original intention, only that maybe there is another way of implementing it.
Maybe you're right that Gambini's thinking is a progression from Penrose's original idea.
It will probably be a few more days before I get around to reading the papers properly.
|Jun1-09, 06:02 AM||#24|
A question in the meantime, while I find time to read the papers, perhaps for Marcus?
Since Gambini works in LQG as far as I know, perhaps someone can characterize the differences and similarities if you compare Rovelli and Gambini on this point?
Am I right in the initial impression (from not yet having read the papers) that Gambini here attempts to look for something deeper information-wise, at a point where Rovelli makes a leap?
Anyone think that's fair?
(I really think this is interesting too, but I have been overloaded with non-physics stuff lately; hopefully in a week or so there is some light. I hope to keep this discussion alive until I get time to join in more.)
|Jun2-09, 12:21 AM||#25|
*Starting to read a little*
To start with the introduction of Gambini's fundamental limit...
"This hits a fundamental limit if one includes gravity in measurements. To make accurate measurements one need to expend energy. When one considers gravity one is limited in the amount of energy one can invest in a measurement. Too large an energy density creates a black hole and spoils the measurement. This argument has been put forward by many authors. It is heuristic and without a full theory of quantum gravity cannot be rigorously worked out."
-- Gambini, Undecidability and the problem of outcomes in quantum measurements
He raises two points here.
Too large an energy density creates a black hole and spoils the measurement.
one is limited in the amount of energy one can invest in a measurement
The first point is AFAIK the one more commonly stated in QG introductions, but the second one is actually more interesting, because it connects to the fact that there is a complexity bound originating from the home of any measurement: the observer himself. One can also imagine the extreme case where a small observer has to "invest" his entire own mass/energy in order to make an accurate measurement, but then something strange happens, because the measurement itself implies a complex feedback to the observer himself.
It's not just a matter of being limited in measurement energy because the observed system has been excited into a black hole; it's a matter of the observer having finite resources as well. This other point has IMO not been paid as much attention. To compare with a game, the player cannot place infinitely high stakes if his pockets are finite.
I think this second point is the key to the so-called intrinsic view, and to the construction of intrinsic measurements. This should add a further constraint on what measurements are possible, and thus also on what actions are possible, given that the actions are based on the observer's state of information about its environment.
This is IMO the rarer component in these reasonings that I wish to see elaborated.
I think getting the right angle on the above might be important, since it's the starting point. First of all, we do not know how a microscopic black hole would actually behave, but I really would expect it to be very different from an ordinary continuum-GR black hole just "scaled down". So the intuition that too high an energy would collapse the measurement into a black hole still needs to be reconstructed in a more general setting. And I think the discrete information view is more promising than the continuum geometric view.
Along with the first point, one might argue that at some energy the object cannot be distinguished from a black hole, since the measurement excites the object.
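For reference, the standard heuristic behind that first point (this is the usual textbook estimate, not something specific to Gambini's paper) is that probing a region of size $L$ costs an energy $E \gtrsim \hbar c / L$, while a horizon forms once $L \lesssim 2GE/c^4$; the two requirements collide at roughly the Planck length,

$$L_P = \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-35}\ \mathrm{m}.$$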
But also, more importantly even, at some point the object itself cannot distinguish a complex environment from a simple one, if it has only limited information capacity; this would also possibly constrain the complexity of its response pattern? So maybe a very "small black hole" would reduce its behaviour to a well-defined elementary interaction pattern?
This is, IMO, the other side of the coin, the one that is not so often mentioned.
|Jun2-09, 12:43 AM||#26|
|Jun3-09, 01:58 AM||#27|
It seems like Gambini is making similar observations as I have done, but IMO he does not think through the consequences of the "too large an energy density creates a black hole" statement. Exactly how large is too large? AFAIU, the only way to answer that is to introduce the observer's mass M into the equations. Ultimately, the goal of physics is to describe observations, and the result of an observation depends on the observer's mass; a light observer observes something different than does an obese observer sitting inside his black hole.
This leads to a radical corollary: any observer-independent theory, be it QFT, string theory or LQG, is necessarily wrong or incomplete, and can at best be a limiting case of a more fundamental observer-dependent theory, whose predictions depend on the observer's mass.
Right? To avoid problems with Pauli's theorem, it might be simpler to say something similar about spatial localization. If the observer has finite mass, there is a limit to how much an experiment can be localized. In formulas, the observer's position and velocity do not commute, but
[q, v] = i hbar/M
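A one-line way to see where that commutator comes from (assuming the observer is treated as a nonrelativistic particle of mass M, which is my reading of the setup):

$$[q, v] = \left[q, \tfrac{p}{M}\right] = \tfrac{1}{M}[q, p] = \tfrac{i\hbar}{M},$$

so the heavier the observer, the closer q and v come to commuting, and in the limit $M \to \infty$ one recovers a sharply localized, effectively classical reference frame.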
|Jun3-09, 03:27 AM||#28|
|Jun3-09, 04:00 AM||#29|
Just a quick response on one thing; I still haven't read all the papers.
I noted in a previous thread that I share part of Thomas's idea of introducing the observer's mass, in some way, but I do think that I see this differently than you, which is illustrated a bit by the below.
The problem is that if you consider an uncertainty relation between the observer and its environment, you are implicitly defining another observer that observes the first observer, some kind of god's or bird's view; that's exactly my point: that isn't valid. I suspect we differ in our views there. But for sure I agree that the observer's mass (or some suitable complexity measure) is missing in current models. I share your view fully on that point.
So what I mean by the limited energy for measurement is more abstract. I refer to a limited complexity that constrains all observable things, including physical law. It is not allowed to "explain this" by means of another external view, because that view is not at hand.
So I agree fully that "a light observer observes something different than does an obese observer sitting inside his black hole" but there exists no universal external description of this relation; this description is itself constrained by the same limits.
This is what I associate most closely to undecidable.
The difference here is between, say, something "undecidably undecidable" and something "decidably undecidable". My view is that we cannot find hard universal constraints on the undecidability; instead it's evolving.
|Jun3-09, 04:07 AM||#30|
I guess what I'm objecting to is the separation of physics and science. The scientific context, the home of physical law, is subject to the same physical constraints as physical interactions and physical measurements, as I see it.
A lot of physicists don't seem to care about this.
If you ask what physical laws and "explanations" a tiny physical observer can distinguish, then that is different from just asking how a larger observer can partially explain the actions of the small observer due to its small size. I think we can explain much more, including the actions - i.e. the form of Lagrangians and Hamiltonians etc. - of nature if we take this even more seriously.
If the Lagrangians and Hamiltonians are not universally fixed, but are themselves merely evolving views of physical law, of which all observers have a slightly shifted view, then the explanatory power would, I think, be far greater.
This is an even larger reconstruction indeed, but I think it's what we need, and it is what I personally expect.
|Jun4-09, 12:26 AM||#31|
In essence, QM is indeterministic, BUT it is still a form of "deterministic indeterminism", because the indeterminism is encapsulated in the probability abstractions, and the idea is that the probability is deterministically predictable.
This can not be valid IMO if you think bigger.
So this again relates back to the physical basis of the probability abstractions. And like I said in my defense before, the problem is not a mathematical one. The question is rather whether the mathematical abstractions and axioms of classical probability can be justified enough to encapsulate ALL indeterminism of the world, so that we end up with a "deterministic probability theory". I think not.
I rather think that a proper construction of what we have will give a form of indeterministic indeterminism, which takes more the form of a game. All anyone has at any point is an opinion of how to play. There are also expectations of how the game proceeds, but no one can predict it 100%. That is exactly why I like to think in terms of "games"; that's exactly why it is a "game". Here it should also become intuitive that a massive player is more dominant and can make more viable predictions than a small observer.
Given that we accept this gaming idea, there is no sense in deciding something or establishing absolute truths, because all we have are expectations, from which we form our actions. Our actions are still not deterministically determined by the expectations, but they are "constraints". And once the game has proceeded, the situation repeats itself, and it has no survival utility to ask whether a prediction you made a year ago was right or false, because that is somehow already erased and replaced by a new prediction to make.
This is analogous to the human brain. A healthy human brain focuses forward. A lot of research suggests that even our memories of the past are mainly used in order to help us in the future. To accurately "recall" the past as it actually happened, as per some actual time history, is not really the purpose of our memory. This is why the brain sometimes remembers the past in a way it actually didn't happen; the brain transforms and reinterprets the record and instead remembers a form that it thinks is more useful for the expected future. There are people who can very accurately recall in high detail what actually happened, and have amazing photographic memory, but these are usually disordered brains, like some savants. They sometimes have excellent, indeed almost supernatural, memory, but then they have an impaired ability to predict the future - figure out which is the more important trait.
I see this quite analogous to the way I envision physical interactions, and evolving physical law.
|Jun4-09, 01:12 AM||#32|
hbar/M is ridiculously small in two limits:
1. hbar -> 0. This is classical physics.
2. M -> infinity. I presume that this is what you meant. This amounts to the assumption that all relevant energy scales are much smaller than M. This may seem obviously true in practical cases. However, note that *all* scales must be much smaller than M, including the energy of virtual quanta. Hence we cannot integrate over virtual momenta with energy higher than M, so M effectively becomes a cut-off.
In the absence of gravity, you can take the cut-off to infinity after you have computed things in the regularized theory. But when gravity is present, you again run into the problem that M is not only the inertial mass but also the gravitational (heavy) mass.
This argument is of course a variant of the standard argument that renormalization hides new physics above the cut-off scale. What is unusual is that the new physics here is not another term in the Lagrangian, but fuzziness in the observer's location.
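Just to put numbers on how "ridiculously small" this is (a back-of-the-envelope sketch; the masses are arbitrary illustrative choices):

```python
# Size of the position-velocity commutator hbar/M for observers of different mass.
hbar = 1.054571817e-34                         # J*s
for label, M in [("electron", 9.109e-31),      # masses in kg
                 ("dust grain", 1e-15),
                 ("lab apparatus", 1.0)]:
    print(f"{label:>14}: hbar/M = {hbar / M:.3e} m^2/s")

# electron      : ~1e-4  m^2/s  -- far from negligible
# lab apparatus : ~1e-34 m^2/s  -- utterly invisible, i.e. the M -> infinity limit
```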
|Jun4-09, 01:33 AM||#33|
Basically, you are saying that results of measurements must depend on the mass of the measuring apparatus. I have no problem with that. The gravitational field of the apparatus has some effect on trajectories of observed particles, energy levels, etc. However, there are a lot of other (usually unaccounted-for) effects which are many orders of magnitude stronger than the gravity field of the observer. For example, electromagnetic fields produced by a nearby radio station, or ground vibrations from automobiles on the street... Are you going to discuss them as well? What is the practical point of discussing observers so heavy that they are buried in their own black hole? This is so bizarre.
|Jun4-09, 02:13 AM||#34|
As I see it, the point of this discussion is that the interactions between systems are fundamentally dependent on their relative complexity (and mass), in a way that is not acknowledged in current formalisms and renormalisation procedures.
The focus is not at the 100'th decimal position in a standard experiment, the focus is on the fundaments of theory and physical law. A focus that if kept, might help us solve problems of unification.
The more interesting question here is what happens when the observer is NOT massive. How do the actions of a small system scale when its complexity drops? Is there a possible inference to be made here?
Ordinary renormalisation is simply a reduction. It can reduce degrees of freedom, but that isn't a realistic inside scenario, because it implies a bird's view - degrees of freedom to be decidably shaved away. I think there is a better way to do this, one that will be less deterministic and more undecidable (to keep using that word here), BUT that might come with additional deeper insights into emergent constraints and physical law.