# Logic of the E-H action, Ricci scalar, cosmological constant?

Fra

This crazy thread is meant to stimulate some reflections on the logic of Einstein's equations. It would be interesting if those who have any ideas could join. Maybe it could be enlightening?

The common way of thinking about GR is that we have the geodesic equation describing what "straight lines" are and what the trajectories of test particles are - this has a simple geometric interpretation, i.e. the idea that gravity is curved spacetime. This is the simple part.

And we have Einstein's equations describing the feedback of progression into the geometry itself. The trickier part to interpret is the dynamics of the geometry itself. If anyone has any favourite interpretations of this, let them out.

I am trying to see, from my point of view, the logic of reasoning implicit in Einstein's equations, and the E-H action seems like one simple place to start.

The EH action in vacuum, in compact form, is
$$S_{EH} = k \int R dV - 2k \int \Lambda dV$$

Einstein expected the cosmological constant to be zero, but now we expect that it's almost zero but not quite. S is an arbitrary measure chosen as $-\ln(P/P_{max})$, which measures the probability of our future expectations being right (in a special sense).

Now there is a simple comparison between this and a choice of measure of the probability of a probability, applied to a simple case where the retained history defines a probability distribution over the future. The question asked here concerns the coupling between history and expectations of the future. There is a simple but interesting analogy to the EH action here.

$$\textbf{S} = M S_{KL} - \ln(w/P_{max})$$

Here $S_{KL}$ is the Kullback-Leibler information divergence, and M is the "sample size" of the history, corresponding to the observer's memory capacity. $P_{max}$ is a maximum expected "fuzzy probability" of correctly predicting the future. w is a weight factor that approaches 1 as M -> infinity.

Note here the logic by which it is a correct "expectation" that $\ln(w/P_{max}) \to 0$ in the continuum limit, but incompleteness suggests that it's not exactly zero.
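To make the dominant term in this analogy concrete, here is a minimal sketch (my own illustration with made-up distributions, not part of the formalism above beyond the definitions) of the Kullback-Leibler divergence and how the term $M S_{KL}$ scales with the sample size M:

```python
import math

def kl_divergence(q, p):
    """Kullback-Leibler divergence S_KL = sum_i q_i ln(q_i / p_i), in nats."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Hypothetical example: the retained history gives relative frequencies q,
# while the prior expectation is the distribution p.
p = [0.5, 0.3, 0.2]   # expected distribution
q = [0.4, 0.4, 0.2]   # observed (historical) frequencies

s_kl = kl_divergence(q, p)
for M in (10, 100, 1000):       # M = sample size / memory capacity
    print(M, M * s_kl)          # the leading term M * S_KL grows linearly in M
```

The divergence vanishes exactly when the expectation matches the history, so the leading term penalises futures that diverge from retained experience, in proportion to the memory capacity M.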

See https://www.physicsforums.com/showthread.php?t=238501 for some more notes.

Since these are reflections only, the details need refinement another time, but the interesting reflection I make here is that a certain type of statistical reasoning suggests that:

1) the Ricci scalar can be interpreted as a sort of "density of information divergence"

2) The cosmological constant seems to be associated with the fact that we have discrete information! In the continuum limit, with an infinitely massive observer, one would expect a zero constant, but a finite observer can never conclude a zero value. Rather one would expect it to be "almost zero".

3) In a certain sense (to be clarified) the EH action can be thought of as the "action of the action", and this is immediately recognized to be nothing but an induction step.

This is very fuzzy and needs refinement. I'm currently trying to analyse the meaning of this. It does seem that Einstein's equations could be interpreted (a lot of details missing though) as an "expectation", almost in line with how we "expect" the entropy of the universe to always increase. But the fact that we don't know with infinite confidence suggests, first of all, the non-zero but close to zero lambda, and even worse, the whole equation might possibly be just one step in a certain direction (guided by a certain logic of reasoning).

The comparison here is with the statistical interpretation of the second law. I.e. the total entropy (whatever that is - another question) does not have to increase; rather there is a high probability that it will increase (by construction, but as above the details and notions of probability are unclear - this is the basis of the discretisation of it I argued for in the other thread).

I think that no one who has been thinking about this could avoid having some kind of personal associations or "interpretations" of Einstein's equations.

What other possible logic do you see behind GR?

I hope this thread isn't too fuzzy for this section. The purpose is not to discuss within the context of GR; it's to discuss a possible larger context in which GR is seen to emerge as natural. I feel a need to gain a deeper conceptual understanding of the bits and pieces we are trying to merge in QG, and maybe if others share the same need, we can elaborate the conceptual basis for GR. Of course, if someone has all the answers explicitly worked out in a stringent formalism already, even better. But since I suspect no one has that, the reflections are bound to be somewhat fuzzy :)

/Fredrik

Fra
Hello dx, yes that's the perspective I'm talking about! But has anyone consistently induced R from such a spirit (yet)? I asked about this before but didn't get much response, except from someone who knew Ariel Caticha.

I mean, not just a reformulation in terms of information rather than geometry, but actually using the logic of reasoning and "evolving information" to more or less derive GR, in a more modern spirit (i.e. without the illusions of classical realism that are clearly present in classical GR)?

I know Ariel Caticha (http://www.albany.edu/physics/ariel_caticha.htm) has expressed this ambition, and I have read several of his papers. I like this way of thinking, but he has not, as far as I know, completed his goal. I really like plenty of this work.

If anyone knows of any more up to date papers on similar approaches, it would be very interesting.

The vision I have is that we should not have to postulate the EH action. I think there should be a way to see why the EH action is right, arguing from simpler assumptions, OR, if it is not, then we should see what the corrections are. I of course suspect the latter.

Are Einstein's equations truly fundamental, or should they be understood as the result of constrained reasoning, or as a kind of equilibrium condition? If so, what is the line of reasoning behind them, and how can we follow it?

/Fredrik

Fra
Maybe it can be interesting to loosely associate to this thread https://www.physicsforums.com/showthread.php?p=1755562 where marcus highlighted

"We propose a theory of quantum gravity which formulates the quantum theory as a nonperturbative path integral, where each spacetime history appears with a weight given by the exponentiated Einstein-Hilbert action of the corresponding causal geometry"

"This emergence is of an entropic, self-organizing nature, with the weight of the Einstein-Hilbert action playing a minor role."

-- "The Self-Organized de Sitter Universe", http://arxiv.org/abs/0806.0397

Now they have made a leap in the reasoning (relative to my simple ponderings here) and already use complex amplitudes, but to take a step back, the association to the simple stuff above is that the historic probability for a particular configuration with a certain information divergence, as judged relative to a part of retained history (truncated historic probabilities), is directly given an explicit form:

$$P = w\, e^{-M S_{KL}}$$

$$w = M! \, \frac{ \prod_{i=1}^{k} \rho_{i}^{M\rho_{i}} }{ \prod_{i=1}^{k} (M\rho_{i})! }$$

where w -> 1 as M -> inf.
P is interpreted as a kind of "probability" of a particular future, constructed on a constrained part of history. M is the sample size.
$S_{KL}$ is the relative entropy of the discrete induced relative-frequency probability space of a history of M samples. In the formula there is also the notion of k distinguishable states. It remains to be figured out whether the number of distinguishable states is associated with an event horizon area, or with the "volume" inside. I have suspected the latter but haven't been able to figure out how that connects to the big picture.
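For what it's worth, the relation $P = w\,e^{-M S_{KL}}$ with the $w$ above is an exact rewriting of the multinomial distribution evaluated at the observed counts $n_i = M\rho_i$, not merely an asymptotic one. A small sanity check, with made-up numbers of my own:

```python
import math

def multinomial_pmf(counts, p):
    """Exact multinomial probability of the given counts under probabilities p."""
    M = sum(counts)
    coef = math.factorial(M)
    for n in counts:
        coef //= math.factorial(n)
    prob = float(coef)
    for n, pi in zip(counts, p):
        prob *= pi ** n
    return prob

def kl(q, p):
    """S_KL = sum_i q_i ln(q_i / p_i)."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def weight_w(counts):
    """w = M! * prod q_i^{n_i} / prod n_i!, with frequencies q_i = n_i / M."""
    M = sum(counts)
    w = float(math.factorial(M))
    for n in counts:
        w /= math.factorial(n)
        w *= (n / M) ** n
    return w

p = [0.5, 0.3, 0.2]     # prior expectation
counts = [4, 4, 2]      # M = 10 observed samples
M = sum(counts)
q = [n / M for n in counts]

lhs = multinomial_pmf(counts, p)
rhs = weight_w(counts) * math.exp(-M * kl(q, p))
print(lhs, rhs)   # agree to floating-point precision
```

This also shows concretely where the weight w sits: it is the combinatorial prefactor left over after factoring the exponential of the information divergence out of the multinomial probability.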

This is a diffusion equation, but not in time - rather just in relative change. Time must be pulled out by an arbitrary choice of internal parametrisation, and becomes a parametrisation of evolution. However, I don't see a sensible distinction between past and future. All there is is possible change. And there is only one direction, and that's forward. There's no such thing as negative change. The only sense I see in "the past" is the part of history that is implicit in the observer's memory and evolved logic.

These are fuzzy associations, and as I see it there are two issues missing: the emergence of dimensionality and structure (decomposing the degrees of freedom into dimensions) and introducing relations; and explaining the complex amplitude that distinguishes simple classical diffusion from the non-trivial dynamics of QM. I suspect the two are related.

The simplest possible related spaces are probabilities of probabilities. That leads to a hierarchical structure, which is more or less driven by simple diffusion-like dynamics. But once you add spaces related by more complicated relations (these relations are selected by evolution, for fitness), then most probably more complex expectations will emerge, and possibly also the complex formalism, and hopefully there will be a clean line of reasoning for why this is so.

Could this be due to the choice of reasoning implicit in the selected internal structure of the observer's microstructure? That the history is not just a simple storage of historic relative frequencies, but rather a system of related such structures, and here the superposition principle must emerge? A plausible explanation here is if we can show that there is a selection for such emergence. And thus we will have, by the same token as with the rest of this, an incomplete probabilistic argument that there is more reason to suspect that this will emerge than there is reason to suspect it won't. All in line with a self-assembling and self-learning observer. And considering that the universe is composed of a system of interacting parts (i.e. principal observers), this might be able to create a complete self-organising universe.

/Fredrik

Fra
Correction: it should be $$P = w\, e^{-M S_{KL}}$$

/Fredrik

Homework Helper
Gold Member
Hi Fra, I'm interested in knowing whether anyone has tried to use the insights gained from Bayesian probability in rethinking the meaning of probability in quantum mechanics. Do you know of any work in this direction?

Fra
Yes, that connection is definitely in the minds of many, and it's well in line with the way I argue here.

I don't know your perspective, but IMO, to a certain extent, the Bayesian interpretation is close to the relational interpretation of QM, which also comes in slightly different forms, and not everyone explicitly uses Bayesian terminology, but IMO it's related.

For example

"Relational Quantum Mechanics", Carlo Rovelli
-- http://arxiv.org/abs/quant-ph/9609002

Rovelli IMO makes many excellent points and observations, but, as I see it, it's a good start and not the complete answer. I.e. Bayesian probability alone does not solve all problems. The next complication is that of constrained complexity and the choice of logic.

"Bayesian Probability Theory and Quantum Mechanics"
-- http://math.ucr.edu/home/baez/bayes.html

But beyond this, there are also Bayesian approaches to reasoning in another way, sometimes related to various max-ent principles and entropy dynamics.

See for example this paper, which contains IMHO many clever reflections:

"Entropic Dynamics", Ariel Caticha
-- http://arxiv.org/PS_cache/gr-qc/pdf/0109/0109068v1.pdf

Abstract
"I explore the possibility that the laws of physics might be laws of
inference rather than laws of nature. What sort of dynamics can one derive
from well-established rules of inference? Specifically, I ask: Given relevant
information codified in the initial and the final states, what trajectory is
the system expected to follow? The answer follows from a principle of
inference, the principle of maximum entropy, and not from a principle of
physics..."

Now, I don't quite agree with all his reasoning in detail, but that's not the point. It's good enough for getting on track. So if you read it, look for the spirit, not the details, because the details are, I believe, under development.

I'm sure there are many other papers out there. My thinking is somewhat along this spirit but, as noted, there are different flavours of thinking within this spirit that are not the same. But they still form a group distinct from many other approaches, in that they take the line of reasoning, and the logic of reasoning, more seriously than many other, more "technically minded" approaches. That's not to say that this won't get technical; it refers more to the strategy of research and way of reasoning.

I prefer to look for sound (nice looking:) reasoning, rather than nice looking mathematical symmetries etc. But that's just me.

/Fredrik

Fra
Hi Fra, I'm interested in knowing whether anyone has tried to use the insights gained from Bayesian probability in rethinking the meaning of probability in quantum mechanics. Do you know of any work in this direction?

If you know of anything else on this please elaborate.

IMO, there are not that many papers I know of that treat this from the angle I prefer: an angle which is a mix of the information perspective, logic and philosophical aspects.

In the abstract view, I would ask these questions:

1) what is the logic of guessing?
2) what is the logic of correction?
3) The synthesis: the logic of corrective guessing?

For (1) one easily imagines a kind of measure - for example probability, entropy or action. It is basically based on the current information and rates the possible next steps. And given enough, say, "equilibration" or statistics, the principle of minimum action or maximum transition probability is plausible.

But what about when the guess is wrong? That is the next question: how can our guess be updated based on new information? Here there are clearly some constraints. Information capacity constraints may prevent us from just adding ALL observations to a grand history. So we may need to make a choice about what is more important to remember, which again involves a choice that needs rating. What part of our information is "more likely" to be important to keep?

Finally, taken together, our "guessing strategy" will be an evolving and self-organising one.
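As a toy illustration of points (1)-(3) - entirely my own sketch, not a formalism from any of the cited papers - here is a "corrective guesser" with bounded memory. It forgets old evidence geometrically, which is one crude way to implement the choice of what to remember:

```python
class BoundedMemoryGuesser:
    """Toy model: guess a distribution over outcomes from decayed counts."""

    def __init__(self, n_outcomes, decay=0.99):
        self.counts = [1.0] * n_outcomes   # uniform prior pseudo-counts
        self.decay = decay                 # < 1 gives a finite effective memory

    def guess(self):
        # the "guess" rates the possible next steps by relative frequency
        total = sum(self.counts)
        return [c / total for c in self.counts]

    def correct(self, outcome):
        # forget a little of the old history, then add the new observation
        self.counts = [c * self.decay for c in self.counts]
        self.counts[outcome] += 1.0

g = BoundedMemoryGuesser(2)
for _ in range(500):
    g.correct(0)        # a long run of outcome 0 ...
for _ in range(20):
    g.correct(1)        # ... then a sudden shift to outcome 1
print(g.guess())        # the guess has already moved noticeably toward outcome 1
```

With decay 0.99 the effective sample size is of order 100, so the strategy cannot hoard all observations; recent corrections outweigh old history, which is the trade-off the capacity constraint forces.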

A lot of these things are not analysed to satisfaction in the standard formalisms. And I think such a first-principles analysis is what I personally have rated to be the best way to make progress, based on my limited knowledge.

Often various actions are pulled out of nowhere and put into Feynman's integral, and then one sees what it gives and tries to renormalize away the nonsense. I find that to be a superficial method. I desire to find a deeper understanding of the formalisms from the start. Then, maybe, the choice of actions will also be much easier, simply because we would know what it really means beyond the "it usually works" level.

/Fredrik

Homework Helper
Gold Member
Hi Fra,

I share many of your views on what the next stage of understanding quantum theory will be like. But I'm not in a position to really think about these things yet, so I can't say anything specific. I think the logic/information perspective is already present in rudimentary form in the way we talk about current quantum theory, especially the operator formalism. As Rovelli has pointed out, there is no absolute state of a system independent of observers, and the state may be different for different observers. If we move further along these lines, I think we must at some stage give up also the notion that electrons are absolutely electrons, protons are absolutely protons etc, just like we have given up the idea that objects have absolute properties like position and momentum. For example, the president doesn't have any absolute intrinsic presidentness. That property is only a kind of statistical property having to do with the collective knowledge of other people (particles).

Regarding your reflections on the logic of guessing etc., have you read any of E. T. Jaynes' books/papers? I strongly recommend his book "Probability Theory: The Logic of Science".

Fra
Hello dx,
I think the logic/information perspective is already present in rudimentary form in the way we talk about current quantum theory, especially the operator formalism.

Yes, that's right, but the "information view" in current QM is IMHO not perfect; it's an idealised information view, which doesn't acknowledge things like limited information capacity and processing time.

But from my point of view this has come in several steps.

When you are first familiar with classical mechanics and classical logic, so to speak, the first exposure to QM is weird. On the other side, I came out with a somewhat fuzzy but still effectively working understanding of the quantum world based on information exchange. This is nicely encapsulated in Bohr's quote that "It is wrong to think that the task of physics is to find out how Nature is. Physics concerns what we can say about Nature." That in essence takes an information perspective.

Dirac also argues in a fairly plausible manner, why probabilities are introduced, as to quantify our imperfect information.

However, once you accept that classical realism and logic are not the most effective, one can still observe in several ways how the QM formalism doesn't quite form a completely consistent logic IMO. See below.

As Rovelli has pointed out, there is no absolute state of a system independent of observers, and the state may be different for different observers. If we move further along these lines, I think we must at some stage give up also the notion that electrons are absolutely electrons, protons are absolutely protons etc, just like we have given up the idea that objects have absolute properties like position and momentum. For example, the president doesn't have any absolute intrinsic presidentness. That property is only a kind of statistical property having to do with the collective knowledge of other people (particles).

I like many Rovelli's points, but not all of them. It was some time ago since I read his paper now but from what I recall.

The notion that there is no absolute state of a system independent of observers is I think good.

But Rovelli also holds the position that QM is complete and doesn't need revision. Now, the way he chooses to argue in favour of this reveals that he is mainly arguing against a fictive "hidden variable" opponent. But I have another objection - that QM is TOO complete still.

But instead of suggesting that QM needs revision, Rovelli argues like this (clearly influenced by geometric logic).

Rovelli argues that instead of the states of the system, we should consider RELATIONS. I.e. the notion of a "relative state" is a relation. Note here the analogy whereby one can argue that there is no such thing as absolute probabilities; there are only conditional ones.

Now the obvious objection here, that Rovelli also points out, is that there can be no "absolute relations" either! Good.

But then his reasoning becomes foggy to me. He argues that the relations are to be treated "quantum mechanically". This is not satisfactory IMO.

So if I understand him right his reasoning goes something like this

- there are no absolute states, only relational states (ie relations)
- there are no absolute relations, only relational relations (ie relations)

The basic observation here is that this is an induction, and we are basically ending up with a sequence of iterating relations of relations of relations.

Clearly this infinite reasoning must be tamed. Either you just truncate it and consider a certain level of absolute relations, but then the logic of reasoning is broken.

Or you must find a dynamical principle that tames these relations. This is what I think we need. The limited information capacity puts a bound on the iterative scheme. But I am not clear on exactly how Rovelli takes this on. On first reading I got the impression that he (in line with GR logic) truncates. But I could be wrong.

I took a break from his reasoning and read some other stuff, but I'll get back to his logic later.

Regarding your reflections on the logic of guessing etc, have you read any of E. T. Jaynes' books/papers? I strongly recommended his book "Probability Theory : The Logic of Science".

Yes, Caticha refers to him repeatedly in this work too. I read the incomplete part of his book that is on the net. It seems he died before the book was finished, and some of his friends finished and published it. I don't have the full book, but I partly like a lot of his writings. I looked into this last year and don't remember the details, but like most papers it contains good ideas to be added to others to get a satisfactory picture. He does, as far as I remember, not explicitly deal with information capacity limits on the logic itself - unless it's done later in the book.

/Fredrik

Fra
Some more highly personal reflections on Rovelli's reasoning.

There are several things in Rovelli's reasoning that don't appear plausible to me; they rather appear to be a constructed reasoning designed to suggest the desired result - the QM formalism.

For example

C. Rovelli plausibly argues:

" Therefore there is no meaning in the “absolute” relation between the views of different observers. In particular, there is no way of deducing the view of one from the view of the other."

but then he expands:

"Does this mean that there is no relation whatsoever between views of different observers? Certainly not; it means that the relation itself must be understood quantum mechanically rather than classically."
:uhh:

later he says:

"Suppose a physical quantity q has value with respect to you, as well as with respect to me. Can we compare these values? Yes we can, by communicating among us. But communication is a physical interaction and therefore is quantum mechanical. In particular, it is intrinsically probabilistic."

The hidden structure here, that I find breaks the relational scheme is the notion of probability space.

He keeps discussing and elaborating this, and later in the paper Rovelli escapes this by adding a footnote.

"I do not wish to enter here the debate on the meaning of probability in quantum mechanics. I think that the shift of perspective I am suggesting is meaningful in the framework of an objective definition of probability, tied to the notion of repeated measurements, as well as in the context of subjective probability, or any variant of this, if one does not accept Jayne’s criticisms of the last."

I feel that avoiding the physical meaning of probability is avoiding one of the main issues. I think that part of the logic of reasoning is implicit in the probability formalism.

Here is where I find Rovelli's reasoning not entirely consistent. He argues that there are no absolute states, only relations, and even that there are no absolute relations.

But then he seems to add assumptions that suggest that the set of relations still has a given structure, according to his constructed logic. Here we have another dependence, which I think of as an implicit relation that has been fixed.

As I see it, his construction is relative to his choice of strategy which I don't quite share.

I think there should be implemented a feedback between strategy and interaction, where the logic of correction would provide the selective pressure that guides the evolution of strategies. Here I think the meaning of "probability" needs to be addressed.

To me the abstraction of probability is that of a measure, constructed (even if imperfectly), that points out the way forward. In this construction I suspect time will emerge as well.

As I understand it, Rovelli never attempts to construct this measure; he just assumes that it makes sense and that "everybody knows what probability is". That is what I really don't like. I think that trying to construct it, being constrained by the observer's complexity, will suggest additional constraints on the measure itself.

This logic seems to be consistent with Rovelli's LQG, which in a certain sense, as I understand it, attempts to find the right relational angle on GR in order to apply standard QM. Maybe it is an improvement, but I can't help thinking, from my limited view, that this isn't radical enough.

/Fredrik

Fra
the manifold

To get back to the intention of the thread.

How can one understand the meaning and logic of the spacetime manifold?

In line with my first simple associations above, I would be tempted to associate the spacetime manifold with the microstructure of distinguishable states. But then it seems the process of indexing the states is a process that is constrained by the observer's "memory". Geometry and distances are induced per some choice of information measure - an information distance between states, à la "two points are close if they are likely to be mixed up". This is the way Ariel Caticha argues for the information-geometry construction of spacetime. Instead of thinking that two points are unlikely to be mixed up because they are far apart, he turns it around and says that since they apparently ARE rarely mixed up, they are by construction far apart.
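The "distance from distinguishability" idea can be made concrete in a one-parameter toy model (my own illustration, with an assumed Gaussian blur of width sigma standing in for the observer's resolution): the KL divergence between two equal-width Gaussians grows quadratically in their separation, so points much closer than sigma are "likely to be mixed up" while distant ones are sharply distinguished.

```python
import math

def kl_gauss(mu1, mu2, sigma):
    """KL divergence between N(mu1, sigma^2) and N(mu2, sigma^2):
    equals (mu1 - mu2)^2 / (2 sigma^2). Small KL = easily mixed up."""
    return (mu1 - mu2) ** 2 / (2.0 * sigma ** 2)

sigma = 1.0                       # assumed resolution of the observer
for dx in (0.1, 1.0, 10.0):       # separations of the two "points"
    d = math.sqrt(2.0 * kl_gauss(0.0, dx, sigma))
    print(dx, kl_gauss(0.0, dx, sigma), d)
# the induced "distance" sqrt(2 * KL) = dx / sigma measures separation
# in units of distinguishability, in the spirit of information geometry
```

The design point is that nothing metric is assumed up front: the "distance" is read off entirely from how hard the two states are to confuse, which is the inversion of logic described above.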

Spacetime coordinates would then be something like any choice of indexing of these states. The observer's memory capacity would limit the size of the "visible manifold". I see no meaning in the notion of a manifold beyond what the observer can relate to.

An apparent problem with this analogy is that it does not distinguish between matter and space. But OTOH I don't really think that is a problem. As I see it, the distinction could lie mainly in the notions of inertia and confidence. It's simply unlikely to see sudden microscopic transitions of macroscopic space; therefore it effectively serves as a background. But in reality the distinction could be just the relative stability of communicating structures.

The same reasoning seems to suggest that dimensionality emerges out of self-organisation. This is in line with many papers that have been posted here.

So, dimensionality and the matter-vs-space distinction seem, conceptually, to be a possible result of self-organisation in the observer's microstructure. An "image" is forming inside the observer's microstructure, and this image ideally reflects what is going on on the outside.

This is how one would also expect that classical logic, or QM superposition, could be a simple matter of point of view - the observer's microstructure, and the image in it, is evolving towards a consistent view. So there should be a clear logic to be found to explain the emergence of quantum logic from classical logic, in the sense of transforming the point of view. And these transformations could perhaps be given a gaming perspective.

If we could find formalisations of this, wouldn't it seem likely that the emergent general structures would have emergent general properties that we might be able to identify with standard notions?

Like, what is the meaning of mass, energy, entropy, particles, time, from the point of view of a self-organising observer built from almost "nothing"? Can these concepts be shown to emerge unavoidably, as a consequence of the induction "logic of the logic" we seem to be circling around?

/Fredrik

Fra

The hidden structure here, that I find breaks the relational scheme is the notion of probability space.

He keeps discussing and elaborating this, and later in the paper Rovelli escapes this by adding a footnote.

"I do not wish to enter here the debate on the meaning of probability in quantum mechanics. I think that the shift of perspective I am suggesting is meaningful in the framework of an objective definition of probability, tied to the notion of repeated measurements, as well as in the context of subjective probability, or any variant of this, if one does not accept Jayne’s criticisms of the last."

I feel that avoiding the physical meaning of probability is avoiding one of the main issues. I think that part of the logic of reasoning is implicit in the probability formalism.

Here is where I find Rovelli's reasoning not entirely consistent. He argues that there are no absolute states, only relations, and even that there are no absolute relations.

I noticed some very practical mp3s of Rovelli's intro talk at "The First Quantum Geometry and Quantum Gravity School" (http://www.fuw.edu.pl/~kostecki/school.html). I just put them on my phone and listened to them, scattered throughout today, and after listening to his talk my critical view from reading his book is strengthened.

The way he avoids the fundamental logic of quantum mechanics, and considers it a separate and maybe harder problem, is explicitly admitted in this talk in response to the Q&A session (second file).

This is the point at which his initially plausible reasoning, as seen in his books and papers, is IMO broken.

As far as I am aware, 't Hooft and Penrose have at least tried to address this in a deeper way; this was also mentioned by Rovelli. So far, from my limited understanding, Penrose's ideas aren't to my liking. So what about 't Hooft?

Now, my question in this post is: does anyone know where to find 't Hooft's most recent contribution to these issues? And I mean the fundamental issues.

As I see it, the main issue is how the constraint from the observer's complexity really does constrain, say, the choice of states of gravity. How does it constrain the selection process, and the selection itself? I think it must, and it's disturbing how Rovelli can manage to not consider this as affecting the story.

A paper or a book where 't Hooft lays out his reasoning and his tentative strategy?
Or anyone else for that matter?

/Fredrik

Fra

I am trying to see, from my point of view, the logic of reasoning implicit in Einstein's equations, and the E-H action seems like one simple place to start.

The EH action in vacuum, in compact form, is
$$S_{EH} = k \int R dV - 2k \int \Lambda dV$$

Einstein expected the cosmological constant to be zero, but now we expect that it's almost zero but not quite. S is an arbitrary measure chosen as $-\ln(P/P_{max})$, which measures the probability of our future expectations being right (in a special sense).

Now there is a simple comparison between this and a choice of measure of the probability of a probability, applied to a simple case where the retained history defines a probability distribution over the future. The question asked here concerns the coupling between history and expectations of the future. There is a simple but interesting analogy to the EH action here.

$$\textbf{S} = M S_{KL} - \ln(w/P_{max})$$

Here $S_{KL}$ is the Kullback-Leibler information divergence, and M is the "sample size" of the history, corresponding to the observer's memory capacity. $P_{max}$ is a maximum expected "fuzzy probability" of correctly predicting the future. w is a weight factor that approaches 1 as M -> infinity.

Note here the logic by which it is a correct "expectation" that $\ln(w/P_{max}) \to 0$ in the continuum limit, but incompleteness suggests that it's not exactly zero.

...

1) the Ricci scalar can be interpreted as a sort of "density of information divergence"

2) The cosmological constant seems to be associated with the fact that we have discrete information! In the continuum limit, with an infinitely massive observer, one would expect a zero constant, but a finite observer can never conclude a zero value. Rather one would expect it to be "almost zero".

3) In a certain sense (to be clarified) the EH action can be thought of as the "action of the action", and this is immediately recognized to be nothing but an induction step.

This is very fuzzy and needs refinement. I'm currently trying to analyse the meaning of this. It does seem that Einstein's equations could be interpreted (a lot of details missing though) as an "expectation", almost in line with how we "expect" the entropy of the universe to always increase. But the fact that we don't know with infinite confidence suggests, first of all, the non-zero but close to zero lambda, and even worse, the whole equation might possibly be just one step in a certain direction (guided by a certain logic of reasoning).

There were not many responses to this thread, I suspect, because I am mainly in the process of finding a new logic; things are admittedly lacking explicit rigour and it's mainly an appeal to intuition.

But I would like to add another suggestive reflection to this.

I was recently reflecting on the holographic principle while reading Raphael Bousso's review of the holographic principle from 2002 (hep-th/0203101), plus some other papers for inspiration, and again there are several analogies relating the abstractions above to the holographic ideas and the entropy bounds.

I noted that from my point of view, there are reasons to suspect a connection between the cosmological constant and the discreteness of information. The interesting part is that there are independent ways of reasoning towards such a conjecture.

For de Sitter spacetimes the paper suggests an entropy bound that is inversely proportional to the cosmological constant, linking in general an entropy bound with a non-zero cosmological constant, which is pretty much the essence of the association I made above.

$$S_{global} \leq \frac{3\pi}{\Lambda}$$

It seems one might still expect that a yet unknown, more general and deeper formulation of a holographic principle remains to be found. What could it be?
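To get a feel for the numbers in the de Sitter bound: a quick back-of-the-envelope evaluation, assuming (my assumption, not from the thread) that the bound is written in Planck units and taking the commonly quoted observed value of the cosmological constant.

```python
import math

# Hedged numerical illustration of S_global <= 3*pi/Lambda (Planck units).
# The input numbers are approximate assumptions, not results from the thread.
l_planck = 1.616e-35                       # Planck length, m
Lambda_SI = 1.1e-52                        # observed cosmological constant, m^-2 (approx.)
Lambda_planck = Lambda_SI * l_planck**2    # dimensionless, in Planck units

S_bound = 3 * math.pi / Lambda_planck
print(f"Lambda (Planck units) ~ {Lambda_planck:.2e}")
print(f"de Sitter entropy bound ~ {S_bound:.2e}")  # enormous, but finite
```

The bound comes out at roughly 10^122: a huge but finite number, which is exactly the flavour of "discrete, bounded information for a finite Lambda" that the association above relies on.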

If we look at the Bekenstein bound

$$S_{matter} \leq 2\pi E R$$

it has the form of a bound of energy × scale.

Now consider

$$\textbf{S} = M S_{KL} - \ln(w/P_{max}) \leq M S_{KL}$$

This has a similarly suggestive form: here M corresponds to the "mass" of the microstructure, or the "inertia" of a statement, and the information divergence is a kind of "scale" in the sense that it defines a kind of "distance" between two states. So the scale could be interpreted as the "extent of divergence".

There seems to be a possibility here for abstractions where the entropy bounds are simply a consequence of the way measures are constructed. Each measure has the basic properties of "mass or inertia", and it has "size" or "scale", and this gives a relation between the measured value, its mass and its scale.

This might also suggest how dimensionality can emerge by extending the dimensions one by one, not too unlike the excitations of structure in string theory, except that here there are discrete structures, each excitation is like an extension of the microstructure, and there should be a holographic relation between different dimensions that somehow constrains the construction of systems.

Systems that grow a lot in scale lose stability, and therefore there might be an explanation for why certain dimensionalities are the optimal ones. Perhaps there is a way to argue for this from very deep first principles of combinatorics and a reconstruction of continuum models. The Einstein equation is loosely suggestive, and it doesn't seem unreasonable to expect that a deeper first principle, involving some deeper holographic principle, should be able to pretty much derive the essence of GR.

Does anyone know of any state-of-the-art papers in the field of the holographic principle + information theory + the observer, that treat this from a fundamental point of view rather than within a specific framework (such as semiclassical ones or string frameworks)?

I am getting more and more convinced that part of what I am looking for will have a strong relation to some version of a holographic principle, and that it will be the key to resolving the observer problem in QM.

/Fredrik

Fra

I apologize if this makes no sense, but perhaps someone can make an association. A key point to add, to make any sense out of this, is that in my view the measures of entropy (of which only the relative ones make sense) and the measure of action are really to be seen as two special cases of the same generalisation.

That's why I wildly mix entropy and action above.

/Fredrik

Gold Member

I am getting more and more convinced that part of what I am looking for will have a strong relation to some version of a holographic principle, and that it will be the key to resolving the observer problem in QM.
I reached the same conclusion several years ago, but I can't find anyone to think it through with. One thing that I always thought, and which was recently proven correct, is that you just need the light-sheet entropy to count all the entropy, including the matter entropy.

That is very useful, as we will see later, because there is a fundamental flaw in characterizing an observer classically.

1. An observer cannot be localized, because of quantum mechanics,

2. because of that, its nature cannot be measured locally or microscopically, as you can see in the Feynman diagrams,

3. and, considering at least the whole Standard Model, it keeps changing among an infinite quantity of states.

4. Worse, it can decay, so no observer is essentially unique.

So, you don't have an observer in quantum mechanics, you merely have a measure.

But, since you can measure the entropy at a given point, what I propose to do is to find the quantum state of a light sheet at a given point. According to Bousso, the light sheet is projected onto a holographic screen. What I propose is that the total holographic screen should give you the interaction with the other light sheets, that is, you would find how this light sheet interacts with the others. Note that the geometry of a light sheet is fixed, not dynamical, and it should represent the geometry of the most general probability wave at a given measured point. Note that the dynamics happens only on the holographic screen, and the geometry of the total space emerges just by summing the light sheets.

Fra

4. Worse, it can decay, so no observer is essentially unique.

So, you don't have an observer in quantum mechanics, you merely have a measure.

I am not sure I followed in what context you said this.

I agree that in ordinary QM we consider "measurements" but not how these measures "live" (if I may use that term). And this is exactly the problem. You can read any of the standard introductions, like Dirac's text, and it is clear how he introduces the abstraction of measurements, but he fails to address the *physical construction* of the measures, and whether there is any limiting relation between the number of measures, the resolution of the measures and the uncertainty in the measures that makes sense at a given time. I think such constraints exist, and that ignoring this is responsible for some of the apparently redundant amount of information existing in the standard quantum formalism. I think there is a solution to this.

Without the observer, there is no reference for the measurements. The measurements are somehow assumed to be objective and observer-independent. That makes no sense, except of course in many special cases.

It is true that there are "many observers", but here is how i picture that:

Not all "possible" observers are equally fit! This means that some particular observers will distinguish themselves. In my vision, part of the trick is to understand the evolution of these observers in relation to the evolution of the universe. In a sense they are the same thing, since the evolution of the only speakable universe is the IMAGE that lives, or manifests itself, as distinguished observers.

Thus I start by considering a "screen" or "communication interface", which I consider to present a number of expected distinguishable events. This number can be seen as an "area" of the communication channel. Then, as communication takes place, this area maps out a higher-dimensional object. This object is constrained by the structure of the observers, as I see it.

So I picture an observer as a measure-complex, which defines not only the possible measurement outcomes but also the resolution of and confidence in these measures. All this is bounded by what I think is associated with the observer's mass, or complexity.

This measure-complex can then be "tuned" or evolved to survive against perturbations, and the emergent structures are those that are consistent with the environment. Ideally these are our observers, for example elementary particles. So the predictive power of this evolutionary scheme should ultimately lie in what the most probable stable structures to be observed are.

So there is no clear-cut distinction between observers evolving as per the laws of physics in their environment, and the laws of physics being influenced by observers; I think it's rather a relative evolution, just like the one in GR between dynamics in spacetime and dynamics of spacetime. I think the same relation exists between observers and the environment in general. In the differential sense, the observer acts upon his OPINION or INFORMATION about the laws of nature, but he ALSO immediately responds to any constructive or destructive feedback as to whether this INFORMATION needs to change.

So in a sense I picture the "internal structure" of observers as an IMAGE of the laws of nature in its environment. So what I think of as the logic of guessing and the logic of revision is the same as the quest for the logic of physical interactions, because I think that is ultimately how the world is constructed. This is a vague suggestion of the deeper holographic principle that I'm seeking.

The main advantage I see, is that this whole "picture" provides an excellent intuitive handle on the world, although very unlike the traditional realist pictures.

I still find that little has been done in the specific angle I look for. I feel that a lot of this is very general stuff that should find an abstract, model-independent realisation. And by model-independent I also mean not assuming a spacetime manifold. This principle should rather, perhaps, together with an evolutionary feedback, explain the FORM of the current laws of physics, including GR and QM.

So the "problem" of the arbitrary observer might be resolved by the idea that an arbitrarily chosen observer (measure-complex) should spontaneously and very quickly destabilise into some expected stable configuration. This explains why, in spite of the apparent lack of fundamental rules, we see order.

I'm always looking for other papers on this, since my own progress is very slow for obvious reasons.

/Fredrik

Gold Member

I will try to come up with another way to show that an observer is a classical or emergent entity. An observer assumes a diffeomorphism of the world lines or world volumes of the measuring equipment.

You can see that more clearly in high-energy experiments. The fundamental observers here are the scattered particles, which actually probed the interaction; you get the information from the measured particles when they are infinitely far away from the interaction, that is, in the detectors (the S-matrix at infinity). The detectors themselves are also microscopically "pulsating" in a sea of particles, but since they are so big and massive, you can ignore the quantum behavior, and thus you can actually set up a reference frame, scientists, a computer, memory, in which you can store tables with averaged numbers, that is, the eigenvalues of the particles. So, in reality, when you set up an observer, you are also assuming that a part of the system is classically stable, and therefore follows an averaged quantum path which is so precise that you can count and take measures.

So, a quantum system is:

the non-measured space, which is however the only place where interactions happen, + the measurement (there is no sense in talking about reference frames here, but see right below)

A classical system is:

the place where the measurement is taken, where no interaction happens, + where the reference frame is set up and measurements are mapped into one of the possible allowed states.

So, the incoming particle has only quantum behavior when it is not seen. When it is detected, it becomes a probe, an extension of the detector into the interaction.

Gold Member

The idea is then to get rid of this ambiguity by dividing the quantum and the classical world into dual pictures, where the quantum world is the holographic reflection of the world. The classical world is the light sheet, and that is the only place where GR applies. This is like a dead end, unless you consider the classical world as imposing restricting parameters on that quantum world: do not violate causality, light speed(s?), maximum entropy.

Thinking like this, I may argue and propose that one should view this holographic screen as a lattice, in the same way Wilczek proposes. The maximum entropy means the lowest possible coherent state; above that, everything becomes localized around the "atoms", and the space becomes broken. In the light-sheet view, according to Bousso, it means a collapse into a black hole. I can think of more nice things, but I am uneasy about all this.

This does look like string theory, in which a string, viewed in the base space, is really a quantum lattice, and when you impose the equivalent of "classical conditions" on it, you get the target space, which is what I called "classical space" here. Some of the restrictions are: no central charge anomaly, and several boundary restrictions (closed, open, compactifications), so that you end up finding the "real world", with a number of dimensions, particles, an S-matrix at infinity, etc. Note that string theory is NOT a holographic model; I am just pointing out the semantic similarities.

Fra

I will try to come up with another way to show that an observer is a classical or emergent entity.

This is difficult to discuss. In one sense, the observers are emergent in my way of thinking too, but emergent and classical aren't necessarily the same thing as I see it. I can equally well picture an emergent superposition. I don't think I clearly understand your suggestion.

I do not consider the covariant entropy bound to be the ultimate, deepest formulation of some holographic principle. I guess one problem is to define exactly what the holographic principle means; many seem to agree that there is something to it, even though it's hard to pinpoint.

An observer assumes a diffeomorphism of the world lines or world volumes of the measuring equipment.

I try to distinguish the intrinsic view from the extrinsic view. There is a difference between one observer describing a "measurement apparatus" interacting with a system, and an observer "living" the interaction with its own environment. It's the latter view I think is the proper one. In a certain sense, there is only one observer. The fact that I can communicate with parts of my environment which in a certain sense are also considered "observers" is a different story.

As I see it, the observer's entire life is not known or even defined in advance. I think the focus lies on the current state, and on how we expect it to evolve into the future. So I tend to see the entropy bounds in this perspective, as differential bounds. The future is expected relative to the present, but I see no bound on the violations of the expectations. I rather see it so that, regardless of this, actions based on the expectations are the only rational actions. The observer bets his "life", or existence, on them. The wrong actions will not preserve that behaviour.

About the S-matrix: Bousso in his review article also notes that in the general case this abstraction is not well defined, for various reasons, because there might not be any unique initial and final states, and sometimes the lifespan of the observer is finite. It makes sense for particle-accelerator setups, but clearly that is a highly special case, and arguments from there can't IMO be extrapolated to general cases.

Also, the entire use of abstractions such as various boundary conditions of the universe is highly suspect IMO. I think a fundamental theory must not make use of such suspect assumptions.

By trying to picture these issues from the intrinsic view, that of an observer actually living his interactions rather than some superobserver "observing without interaction" a second observer who is interacting with a system, the more profound problems become more apparent.

To me, from the intrinsic point of view, the observer is constantly evolving, learning and adapting. And the entire view of the universe is constrained to LIVE within the degrees of freedom under the observer's control. By learning to predict the environment, the observer can grow and acquire more degrees of freedom. But the opposite is also possible.

It's in this intrinsic perspective that I'm trying to attach a holographic principle. The light sheets somehow represent causal histories of distinguishable degrees of freedom, and these I picture as related to the memory structures in the observer's information storage.

I don't think in terms of classical and quantum. I suspect that a quantum superposition can be explained as an evolved internal microstructure in the observer that provides maximum fitness: to actually base one's actions on all possibilities has a utility. I have not proved this, but I have some ideas on how to actually sketch a mathematical argument for emergent quantum behaviour as a result of self-organisation in the observer's knowledge. But there are other problems I need to solve first.

/Fredrik

Gold Member

I, like you, don't consider the covariant bound to be the deepest possible insight into things. Instead, I think it is another way to try to dig into the directly unobservable nature of quantum mechanics, pretty much in the same way we can't detect objects like probability waves, virtual particles, or ghost particles. But they do help us solve and influence things happening in the real world, so it begs us to believe in them, despite our not being able to see them.

For instance, GR requires an observer that follows a smooth world line, using a ruler and a clock. The problem is that nothing exists that follows a smooth world line, unless you are considering an averaged path. But we are looking at a micro world, not a macro world. The problem with a ruler and a clock is that nothing exists that, in a microscopic sense, fundamentally has a regular or constant length, or that can beat at a regular pace like a clock. I would rather get rid of the concept of observer and look for something else. One thing that I know happens is the existence of measurements.

Thus, I want to measure something that, when measured on average, gives you an observer, a clock, a ruler and GR.

Gold Member

My desire would be to combine my favorite salad:

1. Emergent Loll spacetime
2. exotic 4-dimensional manifolds
3. on Bousso's holography.

Fra

I would rather get rid of the concept of observer and look for something else. One thing that I know happens is the existence of measurements.

Thus, I want to measure something that, when measured on average, gives you an observer, a clock, a ruler and GR.

Ok, I think I start to see your vision now. The problem of the observer you wish to solve by getting rid of it, i.e. you try to consider the concept of measurement without considering an observer.

To me this is not too far off the standard formulation, though, which implicitly uses some kind of superobserver? Here I have a different idea: I take the opposite direction, because I think a superobserver view is never a consistent intrinsic view.

However, in my vision it will still be true that, when one observer observes a second observer holding a stick and a ruler, this second observer is "only" emergent with respect to the first one. This seems in line with what you imagine too. But I hold the opinion that the entire "description of this" cannot live all by itself; it only makes sense in the context of an observer. I.e. it takes an observer to express any statement of any kind.

I see no way of getting rid of an observer. Ultimately of course, if we consider that *I* am the observer, then I agree with you that all other observers will be emergent. But I try to abstract the picture and ask, for example: how does, say, an electron see the world, i.e. from its intrinsic point of view? That should probably make its interactions more rational.

So as to the problem of arriving at stable sticks and rulers, I think these measures are emergent also within the intrinsic observer. I think the starting point does not contain a 4D manifold. The 4D structure will be emergent through the observer's development. So in a sense, I would say that the 4D structure is emergent when the observer has evolved to the point where he has formed an understanding of his environment which suggests a 4D structure. So I think the higher measures, such as rulers and sticks, can be built from patterns of more elementary measures.

Some interactions are simply not present in the domain of low complexity, since those measures are constructed from combinations of the simpler ones that aren't distinguishable until more complex (massive) observers have evolved.

In this sense I think the first part of evolution is the simplest, if you see it from the intrinsic point of view. Since there were only simple observers around, there were severe constraints on how complex the interactions they could participate in could be.

The emergence of dimensions I picture as excitations of lower dimensions; here I can associate to string theory if you consider the excitations one dimension at a time, except that I picture discrete states, not real numbers. A point is, due to uncertainty, excited into a probability distribution (a string) that could be stable, or, if even this is not stable, it can further be excited into a 2D distribution, where the previous measures provide the index into the higher spaces. In all this, I picture the data as living in the observer, and the information capacity of an observer bounds the possible structures that it can see.

/Fredrik

Fra

My desire would be to combine my favorite salad:

1. Emergent Loll spacetime

Your salad might be interesting :) I also believe in emergent spacetime, but I do not think Loll is radical enough.

I want emergent spacetime, but I also want emergent actions. Because in my abstraction there is no clear logic in fixing some information, and varying some. I think there should be a deeper emergence where even the GR action is emergent.

That might be asking for too much, but I think it's possible, and hence I lack motivation for investigating approaches that don't meet that ambition.

/Fredrik

Gold Member

The electron, like any particle, is an entity that exists, in principle, only in the moment it is measured. In principle, particles propagate in a sea of infinite possibilities; it's all stochastic, and the only sense we can make out of this is the weight, or probability amplitude, that each of these possibilities carries. So, what matters for what the 'electron sees' is the set of all possible quantities that can, in principle, be measured at the moment of measurement.

A holographic entropy bound gives a saturation limit on the quantity of all possible measurements that can be made in a physical system, by mapping that saturation limit to a spatial quantity that is actually the boundary of that same system. The reason why a spatial quantity appears is that there must be a correlation between different parts of a system, so that the probability of a probability that something is measured is damped. Because of that, the microstates of a system must now be counted not as integers, but as real numbers, that is, corrected by a probability function based on the saturation.

Gold Member

Whenever a measurement is done, there will be a kind of lossy signal transmission, since there will be a probability that something won't be measured. According to Shannon, the maximization of the fidelity of the signal will be best in certain dimensions, and that is related to the sphere-packing problem. I know that in 8 dimensions the best packing density is achieved by the E8 lattice.

Fra

I think we can agree that we see things differently. If I had made more progress I might be able to argue more constructively. But you raise one point which I see as another key point.

The reason why a spatial quantity appears is that there must be a correlation between different parts of a system, so that the probability of a probability that something is measured is damped. Because of that, the microstates of a system must now be counted not as integers, but as real numbers, that is, corrected by a probability function based on the saturation.

You seem to argue, like many others, that a degree of plausibility or probability is necessarily represented by real numbers. Here I disagree. I ask myself: to what extent are different plausibilities distinguishable to a real observer? Here the continuum seems to be pulled out of thin air.

By this I mean that, of course, it's trivially so that all rational numbers are real numbers, but the illusion is that you get the impression of a PHYSICAL continuum when you consider the state space to be the real numbers. The argument is that almost everything we do is based on calculations with real numbers. However, I disagree, because any actual, explicit numerical calculation is made with finite precision, and in computers real numbers are usually represented by 32- or 64-bit strings. So I would say that the argument that the choice of real numbers is obvious does not hold.
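A quick illustration of this point (standard IEEE 754 facts, not anything specific to the thread): a 64-bit machine "real" is really a discrete set of values with a strictly positive gap between neighbours, and 53 bits of mantissa.

```python
import sys

# A 64-bit float distinguishes only finitely many values: the spacing
# ("ulp") between adjacent representable numbers is strictly positive.
x = 1.0
gap = sys.float_info.epsilon      # spacing just above 1.0, about 2.2e-16
assert x + gap / 4 == x           # increments smaller than the gap are lost

# 0.1 is itself only a finite-precision approximation, hence the classic:
print(0.1 + 0.2 == 0.3)           # False
print(sys.float_info.mant_dig)    # 53 bits of mantissa
```

So even "continuum" numerics is, at the level of physical representation, a finite discrete value space, which is exactly the argument being made above.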

In part of the reconstruction I picture, there will be a reconstructed and modified logic, and a modified probability. IMO, the relational picture suggests that not only the event space but also the probability values are generally discrete.

What is done in probability theory is that we take a lot of limits. That is fine as long as we talk about mathematics, but when you start to think about what this means in terms of real physical representation and real computation, these limits don't really correspond to reality as far as I can see.

If you are going to assign any sensible physical meaning to something like a probability, then it seems clear that the resolution of the value space also requires information capacity.

What bugs me is that this is tactically avoided by many otherwise great thinkers. Rovelli explicitly avoids discussing this in his book; he doesn't want to enter discussions on "the meaning of probability". Of course the question isn't the meaning of probability in the axiomatic mathematical sense, but rather how this abstraction maps to reality. I take this point to heart and think that it's a deep mistake to step over it. This is IMO part of the origin of the entire continuum issue.

But it's hard to convince anyone by talking. I believe in these ideas because I think that I will actually be able to construct something that has predictive power. So my ultimate argument is still in progress.

/Fredrik

Gold Member

If you are going to assign any sensible physical meaning to something like a probability, then it seems clear that the resolution of the value space also requires information capacity. /Fredrik

I am not talking about a continuum of values. And you are right, I am assigning an information capacity to each state. It's a real number, but not a continuous one. Let's say the capacity of a system is 5 states and you have 6 states in your system; then you could count the states of the system as, for example, 0.9*1 + 0.8*1 + 0.9*1 + 0.9*1 + 0.5*1 + 1*1, instead of 1 + 1 + 1 + 1 + 1 + 1. At least one state would necessarily not be counted in a given measure; it would be randomly excluded, necessarily lost in the noise of the system.

You could lose everything! But that's unlikely.
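A toy sketch of this fractional counting, as I read it (my own illustration, not a standard formalism): each state carries a detection weight of at most 1, the effective count is the sum of the weights rather than the integer count, and any single measurement randomly loses some states to the noise.

```python
import random

# Six states, each with a detection weight <= 1 (example numbers chosen so
# that the weights sum to the assumed capacity of 5).
weights = [0.9, 0.8, 0.9, 0.9, 0.5, 1.0]

effective = sum(weights)          # about 5 "effective" states out of 6
print(round(effective, 6))

# One random realisation of a measurement: each state is either registered
# or lost in the noise, with its own probability.
rng = random.Random(1)
seen = [rng.random() < w for w in weights]
print(sum(seen), "of", len(weights), "states registered this time")
```

On any given run some states drop out at random, and in principle all of them could (if every draw fails), but that is very unlikely, matching the remark above.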

Fra

I need to think about what you suggest, but are you suggesting that a 6-state system means 6 real numbers? How much information is required to specify a real number?

Or how are the real numbers generated, unless they can fill the continuum [0,1]?

/Fredrik

Gold Member

When not measured, there is infinite space for information. But the measured numbers are just approximations to something fundamentally random, so after some decimal places what is shown in the measurement is just fundamental random noise. So those numbers might, in principle, even be infinite, but it won't really matter, because you won't get rid of that noise. You will never measure that state with infinite precision.

Fra

When not measured, there is infinite space for information.

Ok, I see. In my way of thinking, though, I have problems with this notion. I don't have any problems with uncertainty and lack of determinism, but I have problems with trying to force structure onto the uncertainty.

I choose to only talk about the information at hand. And regarding conjectured information about possible futures, I see problems if one has to average over an infinite set of possibilities, and I see divergence issues. I think that even in between measurements the state of possibilities is bounded, but the bound is dynamical, not fixed. At each instant, the action is evaluated only upon the distinguishable possibilities, which I think of as finite (or at least countable). In addition to this there are unpredictable elements, but instead of labelling them as random variables in an infinite information space, I simply consider them "unexpected"; the difference, to me, is that actions are based only upon expected possibilities. But when unexpected events occur, the space of possibilities evolves.

I guess it's the fitness of the final model that counts, and we might choose different ways.

If I may ask, what is your general expectation of the answers to some of these fundamental questions? Are you closer to some of the big programs, strings?

/Fredrik

Gold Member

one has to average over an infinite set of possibilities
Why is that a problem?

the bound is dynamical, not fixed

Sure it is dynamical. The quantum differential equations are solved before the states are measured, so there is room to change everything all the time.

Fra

Why is that a problem?

I guess I need to say it doesn't "have to be a problem"; it's mainly a problem in my view. I would assume that in your way of attacking the problem you have another way of seeing it, so that perhaps it's not a problem.

But in my abstraction it is, and it has to do with my view of time and expected evolutions. If there are infinite, and in particular uncountable, possibilities for the future, then I wonder how stability in the expectations is ensured.

To me the bounded set of possibilities, as a basis for actions (weighting all possibilities), is a key to constructing the concept of inertia. How do you compute with an uncountable set of possibilities? There is a way I can see to take the limits of finite sets, of course, but then the exact way of taking the limit is important. This is mainly why I can't just start with a continuum. My only way of keeping track of what I am doing is to consider the limiting procedure. I guess the only possible alternative to my "finite view" is to consider continuum models where you somehow consider the limits to be taken at different rates. But that gets twisted, and I think it doesn't help the clarity of reasoning.

I am trying to exchange ideas at this point; I am still struggling with a lot of the formalisms. My clearest guide is the conceptual reasoning.

I picture expected time evolution as somewhat like a generalized diffusion process, where time parameterizes the diffusion path. But in reality it's not "something diffusing"; it's more like a random walk that on average follows a diffusion. The diffusion corresponds to a "geodesic": it's the evolution given no correcting feedback. But in reality correcting feedback tends to be present, and this corrects the microstructure that represents the possibilities. So I picture a weight or mass attached to each possibility microstructure, which plays the role of inertia. This is why my entire abstraction gets tossed on its back if all of a sudden I am dealing with infinite sets, which would correspond to "infinite inertia".
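The random-walk-averaging-to-diffusion picture can be sketched in a few lines (a textbook fact, offered only as an illustration of the picture above): for a simple +-1 walk, the mean squared displacement of an ensemble grows linearly in the number of steps, which is the diffusive "geodesic" behaviour of the average.

```python
import random

rng = random.Random(0)

def msd(steps, walkers=2000):
    """Mean squared displacement of an ensemble of simple +-1 random walks."""
    total = 0
    for _ in range(walkers):
        x = 0
        for _ in range(steps):
            x += rng.choice((-1, 1))
        total += x * x
    return total / walkers

for t in (10, 40, 160):
    print(t, round(msd(t), 1))  # grows roughly linearly in t (diffusion)
```

Any individual walk wanders erratically; only the ensemble average follows the smooth diffusive law, which is the sense in which the "geodesic" here is the feedback-free expectation rather than any actual trajectory.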

/Fredrik

Gold Member

But in reality it's not "something diffusing"; it's more like a random walk that on average follows a diffusion. The diffusion corresponds to a "geodesic": it's the evolution given no correcting feedback

Well, that's not much different from what I am thinking. It's just that the diffusion becomes larger and more scattered near high-entropy places.

If there are infinite and in particularly uncountable possibilities for the future then I wonder how stability in the expectations are ensured.
I guess the only possible way, alternative to my "finite view", is to consider continuum models but where you somehow consider that the limits are taken at different rates.

The infinity here would just be the dimension of the state vector in the Hilbert space, which would yield a unique projection when the measurement is taken. The only difference here is that you have a function that gives you the probability of that projection not taking place. Note that the information state would not be destroyed, and no information would be lost.

So, let's see why this is similar to what you think, and here are some of the consequences. A gedanken experiment would go like this:

You put a cesium clock near a strong gravitational field. A strong gravitational field means, in general, that you have a more saturated holographic bound, so when you approach the stronger parts of the field, the probability that the cesium would emit a photon in time to beat the clock would be lower. So, as you get close to a black hole, the cesium clock would beat slightly slower in your frame of reference.

But if you get too close to the event horizon, the probability of any particle interaction happening inside your body or its vicinity would be vanishingly small, because any boson (the carriers of force) in your body would have a really small chance of interacting with anything. That means that approaching a horizon cannot be as harmless as in GR, but instead reduces you to a gas of free particles.

I picture a weight or mass, to each possibility microstructure, that has the purpose of inertia. This is why my entire abstraction gets tossed on it's back if all of a sudden I am dealing with infinite sets, which correspond to "infinite inertia".

Seeing that some of our ideas converge, I would like you to explain this point better.

THANK YOU VERY MUCH :D

Fra

If you want a proper, rigorous exposition of what I described conceptually, it's still in progress. I am working on some mathematical models, but I think it benefits no one to communicate details that are immature; it would not serve a purpose, since understanding of the coming math starts with the conceptual part anyway. I do it the other way around: I start with a conceptual abstraction and ponder how the mathematics must look.

Anyway, I think a key is how I treat probability. I am considering something like "logical probabilities", which are encoded in the observer's makeup. In my abstraction the mathematical structure of an observer is a system of communicating microstructures, and the CHOICE of structure and the STATE of the structure encode the information the observer has, and thus its image of the environment. This means that logical probabilities do not have to coincide with an actual future distribution. They are an expectation on the future, which is the basis for actions. But of course "in equilibrium" the expectations will meet the actual outcome, since there is nothing more to learn.

My models are combinatorial systems, which are related. The complexion numbers relate to inertia, and I consider something like "counting evidence". So an opinion has inertia in the sense that it takes a certain *amount of* ANY contradicting opinion to bully or change a given one.
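A hypothetical toy version of this "counting evidence" idea (my own illustration, not Fredrik's actual construction): treat an opinion as a frequency estimate backed by M past counts. Folding in one contradicting observation then shifts the estimate by roughly 1/M, so a larger evidence count M behaves like larger inertia.

```python
def update(p, M, outcome):
    """Counting update: fold one new observation (0 or 1) into M old counts."""
    return (p * M + outcome) / (M + 1)

p0 = 0.9  # current opinion: estimated probability of outcome 1

# The same single contradicting observation (outcome 0), against
# different amounts of accumulated evidence:
shift_light = p0 - update(p0, 10, 0)     # small memory M=10:   large shift
shift_heavy = p0 - update(p0, 1000, 0)   # large memory M=1000: tiny shift

# Algebraically the shift is p0 / (M + 1), so the "inertia" of the
# opinion scales directly with the evidence count M.
print(shift_light, shift_heavy)
```

The heavily backed opinion barely moves per contradicting datum, which is one simple way a "weight" on a microstructure could act like inertia.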

I do not start with spacetime. My "space" is the space of microstates, which is like a statistical manifold, except a discrete version. Each manifold has a natural direction of time, but an observer can be a system of microstructures; this is how I imagine non-trivial dynamics emerging out of this simple picture. It's not just diffusion-type dynamics. Similarly, there is a representation of a superposition, as a relation between two related microstructures.

But this is in progress. I do not want to enter into details until I have, at minimum, convinced myself beyond my own reason. I'm not quite there yet :)

/Fredrik