AlexCaledin said:
But how can it be coherent? What happens in every local measurement is a choice between variants of the whole universe, while QM only talks about a limited experimental system.
Posting past midnight isn't a good thing, but here is a simplified view of what I think of as an "inference interpretation", which is a highly twisted version of Peter Donis's (1) version of the interpretation.
Coherence requires unifying unitary evolution with information updates, in the sense that the unitary description by O3 of [O1 observing O2] must have a Hamiltonian describing the interactions of the O1-O2 system which, as per the inside view, are information updates. The problem is that if O1 is not a classical observer, the current theory does not apply. This is conceptually incoherent.
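To make the two views concrete (this is just the standard Wigner's-friend setup in textbook notation, not the new formalism I'm after): from O3's outside view, O1 measuring O2 is nothing but entangling unitary evolution,

$$ e^{-iH_{12}t/\hbar}\Big(\sum_k c_k\,|k\rangle_{O_2}\Big)\otimes|\text{ready}\rangle_{O_1} \;=\; \sum_k c_k\,|k\rangle_{O_2}\otimes|\text{saw } k\rangle_{O_1}, $$

while from O1's inside view the same process is an information update, a jump to a single term k. Only the outside description comes with a Hamiltonian $H_{12}$, and only the inside description comes with an update rule; current theory never lets both apply to the same observer.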
1) "Observer equivalence"
A coherent theory of physical inference must somehow apply to any observer's inferences about its environment, not only to classical observers, because the difference is simply a matter of complexity scale (mass?). Current theory provides almost NO insight into the inferences of non-classical observers.(*)
2) "Inferrability"
An inference itself contains premises and some rule of inference. This rule can be a deductive rule, such as Hamiltonian evolution, or it can be a random walk (both are sketched just below). The other premises are typically initial conditions or previously prepared states. Now, from the point of view of requiring that only inferrable arguments enter the inference, we end up with the conclusion that we must treat information about initial conditions no differently than information about the rules. I.e., a coherent theory should unify state and law.
=> The inference system itself is inferred, and thus evolves. We naturally reach a paradigm of evolving physical law.
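To sketch the two kinds of rules (standard textbook forms, used here purely for contrast): a deductive rule carries the premises to a unique conclusion,

$$ |\psi(t)\rangle = e^{-iHt/\hbar}\,|\psi(0)\rangle, $$

while a random-walk rule only fixes transition probabilities,

$$ p_i(t+1) = \sum_j T_{ij}\,p_j(t), \qquad \sum_i T_{ij} = 1. $$

In both cases the "law" ($H$ or $T_{ij}$) enters the inference on exactly the same footing as the initial state $|\psi(0)\rangle$ or $p(0)$, which is why treating only one of them as inferrable information is arbitrary.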
(*) This ultimately relates to unifying the interactions. To unify the forces, and to understand what the Hamiltonian or Lagrangian of the unified interactions looks like, is the same problem as understanding how all physical interactions in the standard model can be seen as small non-classical observers making inferences and measurements on each other.
Once this is "clear", the task is to "reinvent" the mathematical models we need:
My mathematical grip on this is that I have started a reconstruction of an algorithmic style of inference, implemented as random processes guided by evolving constraints. Physical interactions will be modeled a bit like "interacting computers", where the computer hardware is associated with the structure of matter. Even the computers themselves evolve, and if this is consistent, one should get predictions of stable coexisting structures that match the standard model. All in a conceptually coherent way.
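As a toy illustration only (the class, the states, and the update rule below are my hypothetical choices for this post, not the actual reconstruction), here is roughly what "random processes guided by evolving constraints" can look like in code: two observers whose single data structure serves as both state and rule, each acting randomly under its constraint and revising that constraint from what it observes of the other.

```python
import random

# Toy sketch only: two "observers" taking constrained random walks and
# updating each other's constraints. Everything here is a hypothetical
# illustration, not the model under construction.

STATES = [0, 1, 2]

class Observer:
    def __init__(self):
        # The "constraint": one weight per state. It is both the observer's
        # current state of information and its rule for acting.
        self.weights = {s: 1.0 for s in STATES}

    def act(self):
        # Random process guided by the current constraint.
        return random.choices(STATES, weights=[self.weights[s] for s in STATES])[0]

    def update(self, observed, rate=0.1):
        # Inference step: the constraint itself evolves from what was
        # observed, so "state" and "law" live in the same structure.
        self.weights[observed] += rate

a, b = Observer(), Observer()
for _ in range(1000):
    # Each observer "measures" the other's action and revises its constraint.
    a.update(b.act())
    b.update(a.act())

print(a.weights)
print(b.weights)
```

The design point is that there is no fixed law outside the weights: the same structure that biases the random walk is the thing being revised, so evolution of the rules and evolution of the state are one process.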
Conventional models based on continuum mathematics should also correspond to steady states. In particular, certain deductive logical systems are understood as emergent "stable rules" in an a priori crazy game. In this sense we will see ALL interactions as emergent.
The big problem here is that the complexity is so large that no computer simulation can simulate the real thing, because computational time actually relates to time evolution and there is simply no way to "speed up time". But there is one exploit that has given me hope, and that is to look at the simplest possible observers: it would probably be doable to simulate parts of the first fractions of the big bang, for one reason - the rules from the INSIDE are expected to be almost trivially simple at the unification scale. They just LOOK complicated from the low-energy perspective. The task is huge indeed, but it's not just a philosophical mess at all! It's rather a huge task of trying to reconstruct all spacetime and matter properties from this picture.
/Fredrik