# Nature of the quantum theory of gravity

• arroy_0205
I think it would be a really fascinating and beautiful process to try and get to the very heart of what physical laws are.

#### arroy_0205

Can anybody explain why the theory describing quantum gravity is expected to be discrete (rather than a continuum theory), nonlocal (rather than a local theory) and Lorentz violating (rather than a Lorentz invariant theory)?

Regarding discreteness: I don't think that's a must; there are theories using discreteness as input (CDT), others that derive discreteness (LQG); in string theory discreteness does not show up. Discreteness is a well-known way to cure UV / short-distance problems and (possibly) singularities; that's why many people expect something like that.

Regarding non-locality: why do you think that there must be non-locality? In LQG you can speculate about non-locality, but again, it's not a must; string theory allows a purely local description, as far as I know.

Regarding Lorentz-violation: string theory respects Lorentz-invariance in some sense (it does, if the chosen background does); I would say that Lorentz-invariance is not a fundamental symmetry of nature, it is only manifest in certain solutions.

arroy_0205 said:
Can anybody explain why the theory describing quantum gravity is expected to be discrete (rather than a continuum theory)

IMHO, the continuum is naturally understood as a limiting case of discrete systems.

Real numbers are usually constructed as the completion of the rational numbers via limits. But even the axiomatic approach makes use of limits.

So I would say that even from a pure mathematical point of view, the continuum is less "fundamental" than integer number theory.

From my point of view, this is particularly clear when you consider that part of the point of quantum theory is that it is an information theory. And information certainly raises the question of how you count/measure information.

A continuum measure is far more nontrivial than a discrete one. So any proper definition of continuum models, IMO, needs to be introduced via some procedure, limiting or completion, from more primitive constructs.

And if you believe that a finite physical system cannot encode infinite information, then it seems plausible enough to me to think that the continuum must, in any information theory, be emergent in the limit of large systems. In particular, this suggests that if you skip this process and jump right into the continuum world, certain tracking is completely lost.

Even if most of what we know is well described by continuum models, I personally think that information-wise, a discrete starting point is the most natural thing. If for no other reason than to understand the continuum's mathematical redundancy better and abstract from it the physical degrees of freedom.

This argument is however quite different from that of LQG etc. To each his own :)

/Fredrik

Just to note: CDT does not claim that spacetime is discrete. It uses triangulations in the same way a lattice does for QCD; one can still take the continuum limit in principle. CDT is just a method of computing the path integral. Discreteness in LQG, otoh, comes from the actual quantization of spacetime.

Whether spacetime is discrete or not is IMO a more "specific" question, and it can still have somewhat different meanings, depending on whether you talk about discrete spacetime event sets or discrete geometries (that are themselves continuous).

I see this also from another angle. Prior to even defining spacetime in a new framework, I personally take the observer and observation processes to be among the more fundamental aspects, and here I see that the very physical microstructure distinguishable to the observer (which would also be the container for any kind of state vector) is discrete and bounded. The complexity bound for each observer could be an important parameter, because it constrains all state vectors and provides a hard limit for the normalisations. So this should also guarantee that infinities don't show up, because no matter what the information represents (space, time, mass, energy or anything else), the representation structure provides a sort of cutoff intrinsic to each observer.

In current QM, this is ignored. As I see it, the relation between QM and GR is partly that QM is a theory of measurement, given communication channels, but it doesn't properly analyse what happens at the nodes, and how the communication channels themselves may be affected by the process.

GR, otoh, relates matter/energy and spacetime, which can be pictured as a special case of relating nodes and channels, BUT it's not a measurement theory in the proper sense.

So from the abstract ambition of constructing a theory of measurement that also models the observers (in particular, one that explicitly notes how the complexity of an observer constrains what measurements are POSSIBLE), I personally see that current models are flawed, but that QM and GR both have good partial insights.

This is why I believe in going back and trying to reconstruct, at a very fundamental level, a theory of measurement. I even think that gravity will come out by consistency.

The idea of actually taking classical GR, reformulating it as per some choice of variables, and trying to literally "quantize it" as per the same old QM, is the wrong way to go.

Maybe we need to step back even before the foundations of QM and GR, and rethink. And maybe, rather than trying to take them both as is and patch them together, reconstruct them both, but this time without ignoring some of the obvious points that should matter to a measurement theory, such as saturation, not of communication CHANNELS, but of communication NODES (these are different things).

It's clear why, in a particle experiment, the laboratory frame is not easily saturated. However, picture the inside view, where the observer is on par with the object it observes.

Edit: the predictions I envision here are of at least two types.

1. First, there are the general limits on the observation process, where current QM is possibly just emergent in a "massive observer" approximation. In the general case, the theory of measurement is even more indeterministic than QM; in particular, I expect less determinism in the time evolution of the state vector.

2. The other aspect that I think will provide the key to a possible TOE unification model is that EVEN in the normal case of the massive-observer approximation (like we have in the particle-labframe scenario), if we can gain insight into the "inside view" of the subatomic world, we might understand WHY we have the interactions we have, and probably explain several of the parameters in the particle standard model. The unification idea here would be that as you "scale down" the observer's complexity, the observed, distinguishable laws become simpler and simpler, and transitions are definitely to be expected along the scaling where previously distinguishable interactions become indistinguishable, until at some point there is only one. I think the key to that is the inside view. In this view, the mental question would be like "what would I do, or what would I be CONSTRAINED to do, if I were a quark?" Maybe there really isn't much of a choice. From the outside view, however, the choices are plenty! Which is, I think, why we have the fine-tuning problems etc. This observer scaling would be a completely new approach that would replace, or at least change, the way we currently do and understand renormalisation.

/Fredrik

I think we should ask: what are the reasons that the different communities (s.t., LQG, ...) don't come together? What are the fundamental differences: a philosophical question of how to do science, or simply different approaches and a lack of success?

How will QG change the way of doing science? Regardless of who is right (and perhaps all existing programs are misguided and plainly wrong!) QG is a new challenge to science. I think it's the first time that we try to develop a new theory w/o a single experimental hint. Even worse, QG tells us that these new hints may simply not be there, except in regimes and processes that are not experimentally tractable.

So the only guideline is to develop a new theory that
1) makes the same predictions as QFT and GR in the regimes where they are applicable
2) resolves problems in the QG regime where a naïve combination of both theories fails

Of course it could be that new QG effects will show up, but up to now these effects have not been seen. There are some ideas like non-trivial vacuum dispersion relations, but as far as I know the experts are not really sure whether these predictions are based on solid grounds.

Today s.t. is accused of not producing predictions that can be proved or disproved by experiment. But I think this is a fundamental problem in all QG approaches; it applies to LQG and CDT as well (the prediction that spacetime becomes two-dimensional in the UV is nice, but no better testable than 10 or 11 dimensions in s.t.).

tom.stoer said:
how to do science? or simply different approaches and lack of success?
...
How will QG change the way of doing science? Regardless who is right (and perhaps all existing programs are misguided and plainly wrong!) QG is a new challenge to science.

I agree that we should put the current problem of fundamental physics in the context of science. It is no longer possible to separate the ontology and state of scientifically acquired knowledge from the ontology of the scientific method. This is why I have personally come to take an evolutionary view of science.

The natural link between scientific knowledge and scientific methods on one hand, and physical states and physical interaction processes on the other, really does become very clear (IMHO at least) when you ponder a fundamental physical theory as a form of measurement theory. The boundary between SCIENCE (scientists interacting with their environment) and PHYSICS (physical systems interacting with their environment) itself becomes thin indeed. That's how I see it.

I think that overall we do not understand the depth of this, not even conceptually. But when we do, it could possibly be the basis for the next revolution we need.

/Fredrik

arroy_0205 said:
Can anybody explain why the theory describing quantum gravity is expected to be discrete (rather than a continuum theory), nonlocal (rather than a local theory) and Lorentz violating (rather that a Lorentz invariant theory)?

No, I cannot explain this, since I expect quantum gravity to be a continuous, Lorentz invariant and local theory myself.

However, I also expect QG to be observer dependent, for the following reason. Every physical experiment is an interaction between a system and an observer, and the outcome depends on the physical properties of both. In particular, the result depends on the mass and charge of the observer. Alas, predictions of QFT do not depend on these quantities, which means that some tacit assumption is made. Clearly, the assumption is that the observer's charge is zero (so the observer does not perturb the fields) and that the observer's mass is infinite (so the observer follows a well-defined, classical trajectory in spacetime; in particular, the observer's position and velocity commute at equal times). This assumption is consistent except in the presence of gravity, where charge and mass are the same; gravitational mass equals inertial mass. Hence QFT breaks down specifically for gravity.

Hello Thomas!

However, I also expect QG to be observer dependent, for the following reason. Every physical experiment is an interaction between a system and an observer, and the outcome depends on the physical properties of both.

I fully agree.

But to try to pinpoint how you mean this, if I may ask: what is your opinion about the idea of a bird's-eye view, viewed in a realist sense, that explains the observations made by each observer?

Can such a bird-level view be observer independent in your view (i.e. in the realist sense; if you DO take the realist view, it exists independently of observation, whatever that means), or would you say that makes no sense?

/Fredrik

No, I can not explain this, since I expect quantum gravity to be a continuous, Lorentz invariant and local theory myself.

However, I also expect QG to be observer dependent, for the following reason. Every physical experiment is an interaction between a system and an observer, and the outcome depends on the physical properties of both. In particular, the result depends on the mass and charge of the observer. Alas, predictions of QFT do not depend on these quantities, which means that some tacit assumption is made. Clearly, the assumption is that the observer's charge is zero (so the observer does not perturb the fields) and that the observer's mass is infinite (so the observer follows a well-defined, classical trajectory in spacetime; in particular, the observer's position and velocity commute at equal times). This assumption is consistent except in the presence of gravity, where charge and mass are the same; heavy mass equals inert mass. Hence QFT breaks down specifically for gravity.

Very interesting post actually. Not sure whether I agree. I think that nature is subjective; that is, there is only subjective reality, and objective reality is a useful creation we use to describe the world and give it meaning. Or possibly it is a subtle mix of objectivity and subjectivity. Clearly QM and GR bring objective reality into question in their own ways, but both have limits in which the objective world is returned to us. Possibly the combination of QM and GR will shatter objective reality.

But I also have ideas about how things become better defined (more objective) in physics if we consider larger and larger systems. That is, if we write down the theory of a single atom, we are implicitly assuming this atom is in a universe all on its own. This description has many uncertainties associated with it; it is also an incomplete description of an actual physical atom, since we neglect its environment. My idea is that when we describe its environment to greater and greater accuracy, and thus describe the atom better, these uncertainties are diminished; our description becomes more complete. My guiding principle here is that we humans need to break things down into concepts in order to understand them, but the reality is that nature works as one: the only true description of reality is an infinite one, and hence any finite description will always be incomplete.

Yeah, not sure where I am going with that, but nm. I think one should not be concerned with "the observer" of an experiment as much as with the whole environment (maybe just a choice of language). Hence any experiment is subject to its environment in an essentially non-local way, in the EPR sense and also in the Wheeler delayed-choice sense (i.e. non-local in a temporal sense as well as a spatial one).

Also, I think the 2nd law of thermodynamics is insightful in such matters. If one thinks of entropy as information, then it makes sense that when one makes a measurement one gains some information about the world, and as the observer is a physical system too, his, and hence the universe's, entropy should increase. In the spirit of the 'statistical time hypothesis', I think the increase of information (or entropy) is a meaningful definition of "time". Time as a coordinate, otoh, is essentially meaningless.

BTW, I don't necessarily believe all the ideas in this post; they are just my thoughts.

Finbar said:
Very interesting post actually. Not sure whether I agree. I think that nature is subjective; that is, there is only subjective reality, and objective reality is a useful creation we use to describe the world and give it meaning. Or possibly it is a subtle mix of objectivity and subjectivity. Clearly QM and GR bring objective reality into question in their own ways, but both have limits in which the objective world is returned to us. Possibly the combination of QM and GR will shatter objective reality. But I also have ideas about how things become better defined (more objective) in physics if we consider larger and larger systems.

I don't know Thomas well enough to know his entire vision, but I at least partially agree with him on several points, judged from his post above.

Maybe I misinterpret you, but regarding what you say here: I understand your point and it's a common theme, but from my perspective it has a problem, and it's related to what I call the inside view:

The point is that any consideration always takes place from the point of view of an observer, and this observer can (IMO) hold only finite information. This means that no matter how you measure size, at some point there is a complexity limit to how large a system this one observer can relate to. This is, IMHO, why the bird's-eye view fails.

Otherwise your idea could give a bird's-eye view, if we consider "the entire universe" to be one observer. Although that is perfectly fine in a way, it does not help other observers. I think this complexity constraint implies an extra uncertainty.

Sometimes, as with an atom vs. an Earth-based laboratory, the asymmetry in complexity is so large that your idea works fine. But I think it doesn't hold in the general case.

An alternative to your suggested route, as I see it, is the evolving-law and emergent-objectivity view. But the FULL emergence of complete objectivity still runs up against the complexity constraint in my view. So given a finite observer, there is a limit to the objectivity. And this limit has predictable impacts on this observer's actions. This is IMO the key to the unification picture as well.

If you don't know, you don't know; but the complexity constraint metaphorically means that the box from which you pull your choices gets smaller and smaller, until the point where no more than a few choices possibly fit in the box (a shrinking state space).

To make sense of this idea, one also needs to explain the ORIGIN of complexity, which I see as closely related to the origin of mass. This too I picture in a game-theoretic way: life is a game, with stakes. The winner gains stability, and ultimately control and complexity. Knowing your environment, you can ultimately also control your environment and consume it.

/Fredrik

Let me explain what I mean by observer dependence at somewhat more length. The main reference is http://arxiv.org/abs/0811.0900 . There is also a sequel with one number higher, but I am rather unhappy with that paper and will write a better one once I manage to assemble enough motivation.

The introduction of observer dependence amounts to little more than making a Taylor expansion of all fields. The point is that a Taylor series

f(x) = sum_m 1/m! f_m (x-q)^m

depends not only on the Taylor coefficients f_m, but also on the expansion point q, i.e. the observer's position. To each function f(x) we can associate many Taylor series, parametrized by q, whereas the function defined by the Taylor series is unique in good cases. The Taylor coefficients for different base points are of course related, but to make a Taylor expansion we must commit to some definite base point. It then makes sense to talk about *the* observer carrying *the* clock, etc.
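A minimal numerical sketch of this point: the same function admits many Taylor series, one per base point q, and in good cases each of them reconstructs the same f(x). The choice of sin and the helper names below are my own illustration, not anything from the paper.

```python
import math

def taylor_eval(deriv, q, x, order):
    """Evaluate the Taylor series sum_m (1/m!) f_m (x - q)^m,
    where f_m = deriv(m, q) is the m-th derivative at the base point q."""
    return sum(deriv(m, q) * (x - q) ** m / math.factorial(m)
               for m in range(order + 1))

def sin_deriv(m, q):
    # Derivatives of sin cycle with period 4: sin, cos, -sin, -cos.
    return [math.sin, math.cos,
            lambda t: -math.sin(t), lambda t: -math.cos(t)][m % 4](q)

x = 0.5
for q in (0.0, 1.0):  # two different base points ("observer positions")
    print(f"q = {q}: series = {taylor_eval(sin_deriv, q, x, 12):.10f}, "
          f"sin(x) = {math.sin(x):.10f}")
```

Both expansions agree with sin(0.5) to the printed precision even though their coefficients f_m differ: the base point q is extra data that the function f itself does not carry.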

To quantize a Taylor series we introduce some dynamics for the observer's position. This adds some terms to the Lagrangian, and it is here that the observer's charge e enters. In particular, the observer decouples from the field dynamics if e = 0. The crucial novelty is that the Taylor coefficients f_m are the components of the field measured relative to the observer's position, which is subject to quantum fluctuations. If

p = Mv

relates the (non-relativistic) observer's momentum, mass and velocity, Heisenberg tells us that

[q, v] = i hbar/M.

If the observer measures her position at some instant, she does not have a clue where she is at the next instant, because her velocity must be completely unknown.
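A back-of-the-envelope sketch of why a large M makes the observer effectively classical: the commutator implies the uncertainty relation Δq·Δv ≥ ħ/(2M), so for a fixed position resolution the minimum velocity spread falls off as 1/M. The sample masses and the angstrom-scale localization below are my own illustrative choices.

```python
hbar = 1.054571817e-34  # reduced Planck constant, J*s

def min_velocity_spread(delta_q, M):
    """Minimum velocity uncertainty implied by [q, v] = i*hbar/M,
    i.e. delta_q * delta_v >= hbar / (2*M)."""
    return hbar / (2 * M * delta_q)

delta_q = 1e-10  # localize the observer to about one angstrom
for label, M in [("electron-mass observer", 9.109e-31),
                 ("1 kg apparatus", 1.0)]:
    print(f"{label}: delta_v >= {min_velocity_spread(delta_q, M):.2e} m/s")
```

For an electron-mass observer the spread is hundreds of km/s, so "where she is at the next instant" really is anyone's guess; for a kilogram-scale apparatus it is around 10^-25 m/s, which is the M = infinity limit of QFT in all but name.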

There are two ways to avoid this quantum uncertainty:
1. Set hbar = 0. This gives classical physics, e.g. GR.
2. Set M = infinity. This gives QFT.

If we keep M nonzero and finite, and consider GR in this setup, there are again two limits:
1. hbar = 0. This describes GR coupled to a point particle with mass M.
2. Newton's constant G = 0. This should describe QFT with M as the cut-off scale. Because the observer's mass is infinite in QFT, all relevant energies must be much smaller than M, including the energy of virtual quanta. Hence M acts as a cut-off.

This makes it very clear why GR and QFT are mutually incompatible: if we let the cut-off scale M go to infinity, the observer will interact with gravity and collapse into a black hole. Not good! To my knowledge, this is by far the most intuitive argument for why QFT has big problems specifically with GR.

There is a web of interconnections between three crucial concepts in quantum gravity: locality, diff anomalies, and observer dependence.

1. Locality.
Classical GR is a local theory, and so is QFT (in an appropriate sense). What is happening here and now should be best described in terms of local data, and not e.g. in terms of data living on some holographic screen outside our visible dS universe. However, theorems (LOST and others) state that nontrivial correlation functions are incompatible with the space-time diffeomorphism symmetry of GR. This no-go theorem can be evaded by quantum mechanical breaking of this symmetry, i.e. by

2. Diff anomalies.
Contrary to popular belief, a gauge anomaly does not automatically render a theory inconsistent. What it does is turn a classical gauge symmetry into a quantum global symmetry, which acts on the Hilbert space rather than reducing it. This may or may not be consistent, depending on whether this action is unitary. The diff anomalies relevant to quantum gravity generate a higher-dimensional generalization of the Virasoro algebra, which was discovered almost 20 years ago. It is not possible to construct representations of this algebra using the fields themselves, because such attempts lead to nonsensical infinities.

3. Observer dependence.
Instead, off-shell representations of the multi-dimensional Virasoro algebra can be built from space-time trajectories in the space of Taylor series. The relevant extensions are functionals of the observer's trajectory (the time evolution of the observer's position), and can hence not arise in QFT, where the observer is never introduced. This is in accordance with a theorem that asserts that there are no diff anomalies in 4D in QFT.

Hence we see that locality, diff anomalies, and observer dependence are closely related. You cannot have one in quantum gravity without buying the whole package.

arroy_0205 said:
Can anybody explain why the theory describing quantum gravity is expected to be discrete (rather than a continuum theory)?

I find it very difficult to understand what people mean by "discrete space-time". By definition, any physical measurement must involve a physical system made of real particles. By another definition, empty space does not contain any particles. So, logically, you cannot measure anything in empty space, all your measuring devices and detectors are supposed to stay silent. Space (or space-time) is not a physical system that can be observed, so it is impossible to verify in experiment whether it is continuous or discrete. If this question is beyond experiment, then it should be beyond physics.

Let me explain what I mean by observer dependence at somewhat more length. The main reference is http://arxiv.org/abs/0811.0900 .

Thanks! I'll try to skim that paper when I get a chance; I noticed it was quite long, so I won't be able to do it now.

The first impression is that your motivation and reasoning are different from mine, but I seem to share some of the conclusions regarding the impact of the observer's mass, which makes it interesting for me to look into. Although mass and spacetime are reconstructed in my view, my starting abstraction is a complexity number, or information capacity/inertia. I expect this to be more or less related to mass, but exactly how the observed mass as we know it relates to this is still unclear.

I need to read it more carefully to see the whole picture and identify your starting points.

/Fredrik

I think all theories talking about discrete spacetime admit at the same time that this discreteness is not necessarily subject to experiment directly. E.g. in LQG the area operator is not a Dirac observable.

But there are indirect effects of this discreteness. Some time ago in LQG the vacuum dispersion relation was derived in some semiclassical context. Because of this fundamental discreteness, c = const. changed to a frequency-dependent c.

I don't know how serious this was, because in LQG you still have problems in finding the correct semiclassical limit. The Hamiltonian is still not defined rigorously, and the results often depend on the details of the trial states.

I don't know the current status, but this topic seems to be very interesting to me.

tom.stoer said:
I think all theories talking about discrete spacetime admit at the same time that this discreteness is not necessarily subject to experiments directly. E.g.in LQG the area operator is no dirac observable.

I think introduction of theoretical ingredients that are not directly observable is very dangerous. Before long you can find yourself counting angels on the head of a pin.

:-)

What about Hilbert spaces, gauge fields, metric, energy-momentum-density, ...

All not directly observable in the strict sense

tom.stoer said:
:-)

What about Hilbert spaces, gauge fields, metric, energy-momentum-density, ...

All not directly observable in the strict sense

Excellent point :)

However, I do join meopemuk in the ambition of sticking to observables. But I think if you take that seriously, Hilbert spaces, to mention one thing, ARE subject to the same critique. This is where I seem to differ from meopemuk.

This is in line with my view, where Hilbert spaces must be evolving.

Relating to my comment in this discussion https://www.physicsforums.com/showthread.php?t=312921&page=2

/Fredrik

tom.stoer said:
:-)

What about Hilbert spaces, gauge fields, metric, energy-momentum-density, ...

All not directly observable in the strict sense

I agree about gauge fields and energy-momentum density. They are not observable, but they are not fundamental either. They are just formal abstract objects. Theory can be formulated without them. This is best explained in Weinberg's "The quantum theory of fields" vol. 1. His idea is that the fundamental quantities in quantum theory are interacting operators of energy and boost. Quantum fields are just parts of a mathematical trick that allows us to build these interactions without losing the Poincare invariance.

I agree that Hilbert space, wave functions, Hermitian operators are highly technical things. However, they are deeply rooted in experiment. I would like to draw your attention to the approach called "quantum logic". It basically says that quantum mechanics is simply an analog of the classical probability theory in which certain observations cannot be performed simultaneously. According to this approach, quantum theory can be formulated in terms of observable quantities only, i.e., probabilities. The mathematical language for doing that is the theory of orthomodular lattices. However, this language is unfamiliar to most physicists and difficult to use. So, using Hilbert spaces and (non-observable) wave functions is the price we pay for mathematical convenience.
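The failure of classical (Boolean) logic that the orthomodular-lattice language captures can be seen concretely with spin-1/2 projectors: the distributive law fails. The `join`/`meet` helpers below are my own sketch of the lattice operations (the projector onto the span, respectively the intersection, of the ranges), not a standard API.

```python
import numpy as np

def join(P, Q):
    """Lattice OR: projector onto span(range P + range Q), via SVD."""
    U, s, _ = np.linalg.svd(np.hstack([P, Q]))
    B = U[:, : np.sum(s > 1e-10)]  # orthonormal basis of the joint range
    return B @ B.T

def meet(P, Q):
    """Lattice AND: projector onto the intersection of the ranges,
    built by De Morgan duality in the ortholattice of projectors."""
    I = np.eye(P.shape[0])
    return I - join(I - P, I - Q)

Pz = np.array([[1.0, 0.0], [0.0, 0.0]])             # spin-z up
Px_up = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])    # spin-x up
Px_dn = 0.5 * np.array([[1.0, -1.0], [-1.0, 1.0]])  # spin-x down

lhs = meet(Pz, join(Px_up, Px_dn))            # z-up AND (x-up OR x-down)
rhs = join(meet(Pz, Px_up), meet(Pz, Px_dn))  # (z-up AND x-up) OR (z-up AND x-down)
print(np.allclose(lhs, Pz), np.allclose(rhs, 0))  # prints: True True
```

The left-hand side is Pz itself (since x-up OR x-down is the identity), while the right-hand side is the zero projector (Pz shares no subspace with either x-projector), so "A and (B or C)" differs from "(A and B) or (A and C)": exactly the situation of observations that cannot be performed simultaneously.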

meopemuk said:
I agree that Hilbert space, wave functions, Hermitian operators are highly technical things. However, they are deeply rooted in experiment.

Yes, I think that is the best description from the human point of view. And in this view, no one can, I think, deny that our abstractions are de facto evolving, because human science and experimentation have evolved in the past and continue to do so, and with them our worldview.

To understand the emergence of Hilbert spaces at the human level amounts to understanding the EVOLUTION of human science and the development of physics, which has a history of theory evolving under feedback from experiment.

(This is a similar point Smolin makes in arguing in favour of his evolving law.)

*But* unless one thinks that this is only a human abstraction, and that it's not a problem for the physicist, one would keep questioning and ask whether the abstractions and state spaces, as described by anything but a human, like a physical inside observer, would also be evolving. Because after all, non-human physical systems behave AS IF they actually somehow also work with similar state spaces.

Can we describe this process physically? What do we get if we treat "experiments" and "measurements" on a similar footing? Could "experimentally rooted" simply be thought of as "evolutionarily favoured or selected"? I think so.

Then it seems the missing link here is to understand our own history, and how our present and our future relate.

A scientist has reasons, rooted in his experimental history, for holding certain abstractions. We seem to agree there.

Either you think an analysis of that takes us into the psychology of scientists' brains (like Popper did) and is thus not relevant to science.

Or you think, like I do, that this is exactly the kind of thing we need to understand in depth.

If you take the view of QM that it is about what an observer can measure, it doesn't seem to have much to do with humans. Instead, shouldn't we ask how the Hilbert spaces of the environment observed by a piece of matter are encoded in the matter itself? And how these respond to each other?

meopemuk said:
I would like to draw your attention to the approach called "quantum logic". It basically says that quantum mechanics is simply an analog of the classical probability theory in which certain observations cannot be performed simultaneously.

As far as I know, the usual quantum logic solves none of the mentioned problems. It is more or less an equivalent way to introduce QM (the differences are technical more so than fundamental, IMO). The same question applies to this logic: what is the origin of this logic?

I think there is an answer to that (to be nailed down). And the answer to that is the same as to how Hilbert spaces emerge.

I expect that in an evolutionary perspective, quantum logic would probably prove to be more fit than classical logic, and that observers evolve quantum-logic behaviour since it makes them more fit. The question I ask is how to describe and understand in detail this abstract evolving logic.

/Fredrik

Fra said:
As far as I know the usual quantum logic solves none of the mentioned problems. It is more or less and equivalent way to introduce QM (the differences are technical more so than fundamental IMO) The same question applies to this logic, what is the origin of this logic?

I agree that quantum logic is an equivalent formulation of QM. The reason I've mentioned this approach is that in it, states are described by probability distributions (rather than by vectors in a Hilbert space or wave functions). So the benefit of quantum logic is that all theoretical ingredients have direct connections to observable things. The downside is that the mathematical formalism is unfamiliar and less advanced than the formalism of Hilbert spaces. My conclusion is that quantum mechanics actually manipulates directly observable objects. Our introduction of (non-observable) wave functions in QM is just a small price we pay for using easier math.

@arroy_0205: is this discussion interesting for you?

Let me explain what I mean by observer dependence at somewhat more length. The main reference is http://arxiv.org/abs/0811.0900
I've been extra busy lately and haven't had a connected time slot long enough to analyse your reasoning in detail, but I'll get back to it. I do notice a few key points I like and where we are on the same page.
http://arxiv.org/abs/0811.0900 said:
Of course, we can only make predictions, or check that our assumed dynamics is correct, provided that we know the state of the system. Hence we must first measure the partial observables {A, t} sufficiently many times to determine the state. Once that is done,
the outcome of further observations is predicted by the theory.

...

However, there are subtle physical problems with using the complete observables
(φ, x). The first problem is that we need to know the state of the
system in order to make predictions, and infinitely many observations are
required to determine the state uniquely. Typically, we must determine the
values of the field throughout an equal-time surface, say x0 = 0. Rovelli suggests
that one should avoid this problem by making additional assumptions
about the state [8], something which I find unattractive.
I agree. Not only is it unattractive, from my perspective it makes no sense at all. It's an idealisation that ignores how information collected by a real, limited observer actually is stored. Such an idealisation or assumption is unacceptable in any attempt, at this level, to look for a deeper fundamental understanding of these things. This is exactly what begs to be resolved, and Rovelli and others ignore it.

This is another way of phrasing the objection I have raised in several threads about Rovelli's unwillingness to question the physical basis of probability. That's more or less exactly the same issue you raise here, but you put it differently than I did, and probably resolve it differently (I will know when I get around to looking through the rest).

/Fredrik

Last edited by a moderator:
I think physics with external observers makes no sense at all. The reason is that a fundamental property of quantum mechanics is non-locality, and therefore there is simply no external environment. What I mean to say is that the properties of the fundamental building blocks of the subsystem and the environment are the same.

Like I noted, I share parts of your points, but you still picture it very differently than I envision it.

From my point of view, there are a few things that you still seem not to resolve. Some questions:

1) Do you consider linking (somehow) the observer's MASS to the observer's information capacity, instead of just appealing to the usual classical mechanics dynamics?

2) You replace the QFT state space with the Taylor coefficients of the relative field, but do you consider "how much information, and mass" is required for a real observer to hold/possess all this information?

Edit: I ask because one of the problems of infinitely many measurements is the array of detectors you mention in QFT (i.e. filling space with detectors). But regardless of that problem, no matter how the information is acquired, we have the representation problem: how is a finite observer able to represent all this information in his state? Do you address this?

3) It seems to me you still maintain a sort of bird's view, which is the view in which the detector is described and predicted. Do you distinguish between the detector and the observer? If not, then where is the observer that takes the role of describing the detector position etc?

And what is the "mass" of THAT observer? I think it must be infinite too?

Please explain if I'm wrong, but it seems that you try to solve the observer problem by describing the observer and the environment by a uniform abstraction, yet this is done by just replacing the fixed, infinitely massive observer with ANOTHER fixed reference? That reference is the implicit observer in your reasoning, from which the fields and manifold themselves are defined.

4) Also, about the continuum: which observer is able to distinguish the continuum you use? It seems again that this is a sign of an external, infinitely massive god observer.

/Fredrik

From my point of view, one of the main points, and one where I'm curious how you handle it, is this:

You can incorporate the observer and the system by introducing a third observer, but part of the criticism of the first setup still applies when referring to the third observer, because the third observer is not an external observer (unless you are a realist, in which case there isn't much to discuss); it must also be a physical observer, subject to measurement interactions etc.

So the hierarchy of observers observing systems of other observers will inflate the observer larger and larger, or, if you don't want that, at some point the finite information capacity of the observer has to be faced from the inside view. I.e. the incompleteness must be described in a way that does not refer to an external (complete) picture, where the incompleteness is explained by some reductionist argument.

I think your Taylor coefficients are a form of data compression, but maybe a more complex state space is needed. Would the Taylor expansion be truncated, so that the observer would at some point be unable to resolve higher coefficients (thus they are not visible)?

If so, the problem of the uniqueness of the Taylor expansion appears. One could probably picture (at least in continuum models) an infinite number of possible expansions that, once truncated, would make a difference.
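To make that non-uniqueness concrete, here is a small sympy sketch (my own illustration, not from the thread): two fields that differ globally but share exactly the same jet once it is truncated at order 5, so an observer holding only the truncated coefficients cannot tell them apart.

```python
import sympy as sp

x = sp.symbols('x')
N = 5  # truncation order of the jet the observer can resolve

f = sp.sin(x)
g = sp.sin(x) + x**6 / 720  # differs from f only at order >= 6

# The truncated jet: derivatives of order 0..N evaluated at the origin
def jet(h):
    return [sp.diff(h, x, k).subs(x, 0) for k in range(N + 1)]

print(jet(f))                # [0, 1, 0, -1, 0, 1]
print(jet(g))                # identical truncated jet
print(sp.simplify(f - g))    # yet the fields are not the same: -x**6/720
```

Any number of distinct fields can be piled onto the same truncated jet this way, which is the uniqueness problem in miniature: the finite data underdetermines the field.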

In my view, I don't picture expansions like that; I picture, abstractly, compression algorithms which are responsible for the "representation", and this algorithm is subject to evolution, so there is no fixed or universally preferred representation decomposition. Or rather, the preference is observer dependent and always subject to evolution.

I also want to connect this to the action, rather than bring in all the classical mechanics action and Lagrangian baggage, which I find to be too much, because this baggage has an information content that itself begs to be explained. Or what do you think?

/Fredrik

Hi Fra

I haven't thought anything about information, so I can't say anything about it. Regarding hidden assumptions about several observers, I think I avoid them. As the name QJT indicates, I regard the jet formulation to be fundamental, and the fields as secondary (but convenient) constructs.

In the jet formulation, we only have a single observer, equipped with a detector and a GPS receiver. The detector is encoded in the Taylor coefficients, which contain the local information about the field in the detector. The reading of the GPS receiver gives us time and position.

For the GPS receiver to work properly, there must of course be GPS satellites sending out a stream of photons with the position information. However, once the photons have left the satellites, we can forget about them, and regard the GPS receiver as a black box which tells us time and location by a local experiment. Indeed, this is how most people use a GPS receiver :-) So even if the satellites are heavy, their mass does not really enter.

Hmm.. we could of course still perform the same local experiments even if the GPS satellites are malfunctioning. This would still give us partial observables phi_m and q, but it would no longer be useful to regard q as a spacetime coordinate.

Since I regard the fields, both absolute and relative, as secondary constructs defined by the Taylor series, I think that their interpretation is not so important. But I will nevertheless attempt one. One needs an observer with a GPS receiver, and a separate detector equipped with its own GPS receiver. Then phi is the reading of the detector, x the reading of the detector's GPS receiver, and q the reading of the observer's GPS receiver. To measure phi(x) for all x, we would thus need to fill space with detectors; there is AFAIU no way around that. But in the Taylor formulation we get away with a single observer.
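The single-observer picture above can be caricatured in a few lines of sympy (my own sketch, not part of QJT itself): the observer at position q records a truncated jet of the field, and reconstructing phi(x) from that jet is exact at q but degrades away from the observer, which is why filling space with detectors would otherwise be needed.

```python
import sympy as sp

x, q = sp.symbols('x q')
phi = sp.sin(x)   # a stand-in "absolute" field, purely for illustration
N = 4             # order of the jet the observer records

# Taylor (jet) reconstruction of phi around the observer's position q
jet_coeffs = [sp.diff(phi, x, k).subs(x, q) for k in range(N + 1)]
phi_jet = sum(c * (x - q)**k / sp.factorial(k)
              for k, c in enumerate(jet_coeffs))

# The reconstruction is accurate only near the observer
err = sp.lambdify((x, q), phi - phi_jet)
print(abs(err(0.1, 0.0)))   # tiny close to q = 0
print(abs(err(3.0, 0.0)))   # large far from the observer
```

So the jet data is genuinely local information about the field in the detector, exactly as described: one observer suffices, but only for the field in a neighbourhood of q.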