# Gambini & Pullin on measuring time in quantum physics

One can also imagine the extreme case where a small observer has to "invest" his entire own mass/energy in order to make an accurate measurement...
That would be a great idea for a weight-loss program... if not for those pesky little black holes created in the process.

It seems like Gambini is making similar observations as I have done, but
IMO he does not think through the consequences.

Fra said:
Too large an energy density creates a black hole and spoils the measurement.
Since physics is a quantitative science, one would like to quantify this
statement. Exactly how large is too large? AFAIU, the only way to do so is
to introduce the observer's mass M into the equations. Ultimately, the goal
of physics is to describe observations, and the result of an observation
depends on the observer's mass; a light observer observes something
different than does an obese observer sitting inside his black hole.
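To make the "how large is too large" question concrete, here is a back-of-the-envelope sketch (my own numerical illustration, not taken from the papers): the measurement spoils roughly when the invested mass/energy is squeezed inside its own Schwarzschild radius, and that radius meets the observer's Compton wavelength at about the Planck mass.

```python
# Back-of-the-envelope: at what scale does an observer's invested
# mass/energy collapse into a black hole?  (Illustrative only.)

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
HBAR = 1.055e-34   # reduced Planck constant, J s

def schwarzschild_radius(mass_kg):
    """Schwarzschild radius r_s = 2GM/c^2 of a mass M."""
    return 2 * G * mass_kg / c**2

# A 70 kg observer investing all of his mass/energy:
r_s = schwarzschild_radius(70.0)
print(f"r_s for 70 kg: {r_s:.2e} m")   # ~1.0e-25 m

# The Compton wavelength hbar/(M c) is the best localization the
# observer's own quantum nature allows.  "Too large" is roughly where
# the two scales meet, i.e. near the Planck mass.
m_planck = (HBAR * c / (2 * G)) ** 0.5
print(f"mass where r_s ~ Compton wavelength: {m_planck:.2e} kg")
```

So for everyday observers the two scales are separated by many orders of magnitude, which is why the effect never shows up in practice; the point of the thread is what it does to the foundations.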

Any theory with an observer-independent formulation, be it QFT, string
theory or LQG, is necessarily wrong or incomplete, and can at best be a
limiting case of a more fundamental observer-dependent theory, whose
predictions depend on the observer's mass.

Fra said:
one is limited in the amount of energy one can invest in a measurement
This is essentially the uncertainty relation applied to the observer,
right? To avoid problems with Pauli's theorem, it might be simpler to say
something similar about spatial localization. If the observer has finite
mass, there is a limit to how much an experiment can be localized.
In formulas, the observer's position and velocity do not commute, but

[q, v] = i hbar/M

an obese observer sitting inside his black hole.
Priceless!!

In formulas, the observer's position and velocity do not commute, but

[q, v] = i hbar/M
But this is a ridiculously small uncertainty, which can (and should) be ignored for all practical purposes.
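To put numbers on "ridiculously small" (my own illustration, not from the thread): the commutator [q, v] = i hbar/M implies an uncertainty bound dq*dv >= hbar/(2M), which is negligible for a human-sized observer but large for a light one.

```python
# Scale of the observer's position-velocity uncertainty from
# [q, v] = i*hbar/M, which gives dq * dv >= hbar/(2M).
HBAR = 1.055e-34   # reduced Planck constant, J s

for label, mass_kg in [("70 kg human", 70.0),
                       ("1 g dust grain", 1e-3),
                       ("electron", 9.109e-31)]:
    bound = HBAR / (2 * mass_kg)   # lower bound on dq*dv, in m^2/s
    print(f"{label:15s}: dq*dv >= {bound:.2e} m^2/s")
```

The bound spans more than 30 orders of magnitude between these cases, which is exactly why the effect is FAPP-invisible for heavy observers yet dominant for very light ones.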

Fra
Just a quick response on one thing; I still haven't read all the papers.

I noted in a previous thread that I share part of Thomas' ideas about introducing the observer's mass, in some way, but I do think that I see this differently than you, which is illustrated a bit by the below.

This is essentially the uncertainty relation applied to the observer,
right?
I would say sort of but not quite. I definitely see it as much more involved than that.

The problem is that if you consider an uncertainty relation between the observer and its environment, you are implicitly defining another observer that observes the first observer, some kind of god's or bird's-eye view; that's exactly my point, that that isn't valid. I suspect we differ in our views there. But for sure I agree that the observer's mass, or some suitable complexity measure, is missing in current models. I share your view fully on that point.

So what I mean by the limited energy for measurement is more abstract. I refer to a limited complexity that constrains all observable things, including physical law. It is not allowed to "explain this" by means of another external view, because that view is not at hand.

So I agree fully that "a light observer observes something different than does an obese observer sitting inside his black hole" but there exists no universal external description of this relation; this description is itself constrained by the same limits.

This is what I associate most closely with undecidability.

The difference here is between saying something is "undecidably undecidable" or "decidably undecidable". My view is that we cannot find hard universal constraints on the undecidability; instead it is evolving.

/Fredrik

Fra
I guess what I'm objecting to is the separation of physics and science. The scientific context, the home of physical law, is subject to the same physical constraints as physical interactions and physical measurements, as I see it.

If you ask what physical laws and "explanations" a tiny physical observer can distinguish, then that is different from just asking how a larger observer can partially explain the actions of the small observer due to its small size. I think we can explain much more, including the actions of nature - i.e. the form of Lagrangians and Hamiltonians, etc. - if we take this even more seriously.

If the Lagrangians and Hamiltonians are not universally fixed, but are themselves merely evolving views of physical law, of which all observers have slightly shifted views, then the explanatory power would, I think, be far greater.

This is an even larger reconstruction indeed, but I think it's what we need, and it is what I personally expect.

/Fredrik

Fra
This is what I associate most closely with undecidability.

The difference here is between saying something is "undecidably undecidable" or "decidably undecidable". My view is that we cannot find hard universal constraints on the undecidability; instead it is evolving.
This also relates a little bit to the original construction of QM, by Dirac etc.

In essence, QM is indeterministic, BUT it is still a form of "deterministic indeterminism", because the indeterminism is encapsulated in the probability abstractions, and the idea is that the probability is deterministically predictable.

This can not be valid IMO if you think bigger.

So this again relates back to the physical basis of the probability abstractions. And like I said in my defense before, the problem is not a mathematical one. The question is rather whether the mathematical abstractions and axioms of classical probability can be justified well enough to encapsulate ALL indeterminism of the world, so that we end up with a "deterministic probability theory". I think not.

I rather think that a proper construction of what we have will give a form of indeterministic indeterminism, which takes more the form of a game. All anyone has at any point is an opinion of how to play. There are also expectations of how the game proceeds, but no one can predict it 100%. That is exactly why I like to think in terms of "games"; that is exactly why it is a "game". Here it should also become intuitive that a massive player is more dominant and can make more viable predictions than a small observer.

Given that we accept this gaming idea, there is no sense in deciding things or establishing absolute truths, because all we have are expectations, from which we form our actions. Our actions are still not deterministically determined by the expectations, but the expectations are "constraints". And once the game has proceeded, the situation repeats itself, and it has no survival utility to ask whether a prediction you made a year ago was right or false. Because that is somehow already erased, and replaced by a new prediction to make.

This is analogous to the human brain. A healthy human brain focuses forward. A lot of research suggests that even our memories of the past are mainly used in order to help us in the future. To accurately "recall" the past as it actually happened, as per some actual time history, is not really the purpose of our memory. This is why the brain sometimes remembers the past in a way it actually didn't happen: the brain transforms and reinterprets the record, and instead remembers a form that it thinks is more useful for the expected future. There are people who can very accurately recall in high detail what actually happened, and have amazing photographic memory, but these are usually disordered brains, like some savants. They sometimes have excellent, indeed almost supernatural, memory, but then they have an impaired ability to predict the future - consider which is the more important trait.

I see this quite analogous to the way I envision physical interactions, and evolving physical law.

/Fredrik

Meopemuk:

hbar/M is ridiculously small in two limits:

1. hbar -> 0. This is classical physics.

2. M -> infinity. I presume that this is what you meant. This amounts to the assumption that all relevant energy scales are much smaller than M. This may seem obviously true in practical cases. However, note that *all* scales must be much smaller than M, including the energy of virtual quanta. Hence we cannot integrate over virtual momenta with energy higher than M, so M effectively becomes a cut-off.

In the absence of gravity, you can take the cut-off to infinity after you have computed things in the regularized theory. But when gravity is present, you again run into the problem that M is not only the inert mass but also the heavy mass.

This argument is of course a variant of the standard argument that renormalization hides new physics above the cut-off scale. What is unusual is that the new physics here is not another term in the Lagrangian, but fuzziness in the observer's location.
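The "take the cut-off to infinity after regularizing" step can be made concrete with a toy example (my own illustration, not from the thread): a logarithmically divergent "loop integral" I(Lambda, m) = ln((Lambda + m)/m) grows without bound with the cut-off, but a renormalized difference of two such integrals forgets the cut-off entirely. It is exactly this cut-off-removal step that fails if gravity pins the cut-off at the observer's mass M.

```python
# Toy cut-off regularization: I(Lambda, m) = \int_0^Lambda dk/(k + m),
# which has the closed form ln((Lambda + m)/m).  The bare integral
# diverges as Lambda -> infinity, but a subtracted (renormalized)
# combination becomes cut-off independent, so the cut-off can be removed.
import math

def loop_integral(cutoff, m):
    """Closed form of \int_0^cutoff dk/(k + m)."""
    return math.log((cutoff + m) / m)

for cutoff in (1e2, 1e6, 1e10):
    bare = loop_integral(cutoff, 1.0)
    renorm = loop_integral(cutoff, 1.0) - loop_integral(cutoff, 2.0)
    print(f"Lambda={cutoff:6.0e}: bare={bare:7.3f}  renormalized={renorm:.5f}")
# The bare value keeps growing with Lambda; the renormalized difference
# approaches ln(2) ~ 0.69315 and forgets the cut-off.
```

In the gravitational case the argument in the post says Lambda cannot be pushed past M, so the "bare" cut-off dependence never fully cancels.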

Basically, you are saying that results of measurements must depend on the mass of the measuring apparatus. I have no problem with that. The gravitational field of the apparatus has some effect on the trajectories of observed particles, energy levels, etc. However, there are a lot of other (usually unaccounted-for) effects which are many orders of magnitude stronger than the gravity field of the observer. For example, electromagnetic fields produced by a nearby radio station, or earth vibrations from automobiles on the street... Are you going to discuss them as well? What is the practical point of discussing observers so heavy that they are buried in their own black hole? This is so bizarre.

Fra
As I see it, the point of this discussion is that the interactions between systems are fundamentally dependent on their relative complexity (and mass), in a way that is not acknowledged in current formalisms and renormalisation procedures.

The focus is not on the 100th decimal position in a standard experiment; the focus is on the foundations of theory and physical law. A focus that, if kept, might help us solve problems of unification.

The more interesting question here is what happens when the observer is NOT massive. How do the actions of a small system scale when its complexity drops? Is there a possible inference to be made here?

Ordinary renormalisation is simply a reduction. It can reduce degrees of freedom, but that isn't a realistic inside scenario, because it implies a bird's-eye view - to be decidably shaved. I think there is a better way to do this, one that will be less deterministic and more undecidable (to keep using that word here), BUT that might come with additional deeper insights into emergent constraints and physical law.

/Fredrik

Meopemuk,

Yes, in principle the result of an electromagnetic experiment depends on the observer's charge e as well as his mass M. We are usually only interested in the aspects of the experiment which are independent of e and M, i.e. we are only interested in the double limit e -> 0, M -> infinity. This limit is presumably well described by renormalized QFT.

The special thing about gravity is that e = M; heavy mass equals inert mass. Therefore, for gravity and gravity alone, the QFT limit e -> 0 and M -> infinity does not make sense, and the observer's physical properties can not be ignored.

the observer's physical properties can not be ignored.
If you are talking about extremely obese observers, this possibly makes sense. But for us normal-size people this looks like an irrelevant speculation about "100th decimal position".

Fra
I have now read both the Undecidability paper and the Montevideo paper.

I found that the papers do not contain the details I was hoping for. Perhaps the other related papers contain more, or more is on its way.

If I interpret them right then...

- What I like about their reasoning is that they tend to relax the determinism in QM, and that the unitary evolution is more like an expectation; this is exactly how I like to see it as well.

- Their view also contains a new view of physical law, where laws correspond to observers' expectations, rather than hard forcing constraints acting from a god's perspective.

However they do not present a reconstruction of the interactions in a deeper informational way - which is what I personally expect to come along with this. I also expect an informational description of the evolutionary process.

A key problem they attack is whether an observer can decide whether ANOTHER observer has performed a measurement or not. To me it's clear that this can never be decided 100%. In that respect I agree. The question I ask is how to act, given that you don't know. I.e., what is the logic of guessing, and of playing this undecidable game?

I was hoping for more development there, but it seems they are not there yet?

Or is this in another paper?

/Fredrik

Meopemuk,

The purpose of a Gedanken experiment is not that it be realistic (then you could do the real experiment instead), but to explore the limits of existing theories, and find the cracks where a better theory is needed. That is why I find it exciting that QFT assumes incompatible limits for the observer's heavy and inert masses.

Note also that major progress in the past has been connected to a more observer-dependent view of nature:
* SR tells us that the separation of spacetime into space and time is observer-dependent.
* QM tells us that position and momentum cannot both be observed sharply.

I suspect that QG requires us to take observer dependence to its logical conclusion: that the observer has quantum dynamics and affects the system just as much as the system affects the observer. After all, that is how all physical observers behave. To turn this philosophy into a concrete formalism will of course take a lot of effort. My attempts in this direction can be found on hep-th.

Fra
I'll jump right to a particular point here

I suspect that QG requires us to take observer dependence to its logical conclusion: that the observer has quantum dynamics and affects the system just as much as the system affects the observer. After all, that is how all physical observers behave. To turn this philosophy into a concrete formalism will of course take a lot of effort.
I think I fully share this vision. But from various readings on this, I have distinguished at least a couple of quite different ways to handle observer dependence. I'm curious about Thomas' view here:

1) Either you invent a theory that represents a bird's-eye view, that "explains" how different observer views relate, and how different observers/systems interact. This is still a form of realism, a sort of realist view of physical law that nevertheless acknowledges the observer dependence.

In this view, physical law is a fact of nature, in a realist sense. I.e. the rules for relating different systems are system independent. This is a mixed relational view, with a realist view of physical law.

Those who think like this consider some mathematical reality that exists independently of observation, but in a way that does explain how different real observations relate. And this mathematical reality accumulates truths about nature as we learn about it.

2) Or you take the observer dependence to also apply to knowledge - not only of physical degrees of freedom such as position, momentum, charge etc. in your environment, but also of physical law, or the regularity that is de facto realised by nature. In this view, the entire notion of physical law is no longer an element of mathematical reality or of some realist bird's-eye view, but is itself constrained by the relational complexity and capacity of physical systems.

I adhere to this second view, and from what I can see, case 1 is more common. Smolin seems to be one of the few who seriously consider 2.

Current physics, as in the standard models etc., is definitely (1). The question is whether solutions to the current fundamental problems are possible without leaving the view of law represented by (1).

I don't think so.

This is why I was originally attracted to Rovelli's reasoning. But the more I read, the more I saw that he is stuck in (1).

As I see it, Smolin is not; he is open to (2).

No doubt (2) is more strange and weird, but it is also more consistent from the point of view of reasoning and the philosophy of science.

This is why I think that the observer dependence is much deeper than the HUP applied to the observer.

If this even makes sense to anyone, I'm curious to hear Thomas' comment on my decomposition into these two principal ways.

/Fredrik