I'm always uncertain what is discussed here. Some people discuss the formalisation of current QM, some discuss existential philosophical issues, and some (like me) discuss seeking a new framework (which means this should probably go in the BTSM section) - so I apologize if I'm drifting the topic.
Fredrik said:
It's not a measurement if the result isn't recorded in some part of the environment that for all practical purposes can be described classically.
Interesting statement.
I can appreciate that this is somehow the perspective in which current QM is confirmed. It means there has to exist a sufficiently complex controlled environment, effectively classical, in which the entire quantum theory and Hilbert space is encoded.
BUT, I think such a view of what a measurement is, is limiting, and likely to be inadequate for solving open issues like unification and QG.
I think a realistic analysis suggests that the above is clearly an idealisation: FAPP true in the normal laboratory physics domain, but one that badly breaks down if one considers cosmological models, or models where the context in which the THEORY itself is encoded is inside the system, and where it's impossible to record/store/hold all information. Alternatively one can see it as an open system. This is why I think that to understand "observer" and "observe the observer" we need to find a new framework that is a learning model. A deductive theory for a non-closed system, if correct, can not reasonably be encoded in a subsystem of the system under observation. It has to evolve, which means it needs to be an adaptive inference model; not something based on static Hilbert spaces.
The picture of a static Hilbert space is IMHO only sensible in the approximation mentioned - where we have a sufficiently complex "superobserver" encoding the theory to the extent that the recordings and statistics are essentially classical. This is what takes place in laboratory experiments.
But we still have no unification! Why?
In the picture I suggest (the learning/inference view) there is a suggested unification between the rational choices a superobserver makes and the ACTION of the system. As we know, the action of the SYSTEM (ie. matter, fields etc) is usually just PULLED from classical models (then quantized etc); obviously this is deeply unsatisfactory, ugly and incoherent. We do it because it's the only way we know, and it partially works. BUT the inference view conjectures that the ACTION of matter (let's call it the "naked action") must take the same inference form as the "rational choice" action of the superobserver - and THIS we can understand from decision theory, and it's essentially entropic in nature. The rational decision is made by counting evidence and weighting it. What is then needed on top of that is to "renormalize" this naked choice to the scale of observation of the superobserver.
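Just to make the "counting evidence and weighting it" idea concrete: one standard way decision theory formalizes an entropic choice rule (this is my illustrative sketch, not something from the post - the function name and the beta parameter are my own assumptions) is to assign each option a weight proportional to the exponential of its evidence count, i.e. a maximum-entropy (softmax) distribution:

```python
import math

def entropic_weights(evidence_counts, beta=1.0):
    """Toy 'rational choice' rule: weight each option by its counted
    evidence via a maximum-entropy (softmax) distribution.
    beta is a free sharpness parameter (hypothetical; one could read
    it as an inverse 'observation scale')."""
    scores = [beta * c for c in evidence_counts]
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# Options with more counted evidence get proportionally more weight;
# the weights form a normalized probability distribution.
print(entropic_weights([10, 7, 1]))
```

This is of course only a cartoon of one entropic weighting scheme, not a claim about what the conjectured "naked action" actually looks like.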
To work this picture out as a possible route to unification, my personal conviction is that we also need to rework quantum theory so that it takes the form of general inference, applicable also to open systems. In this view the entire fixed Hilbert space can not be the right starting point as I see it, as it assumes that the structure of the observer does not change so much that it deforms the theory indirectly by "scaling" the platform on which the theory is encoded. So it must also incorporate a new view of RG.
The new QM is entangled with the problem of theory scaling and the generation of mass (understood as the statistical mass of a theory, ie how MUCH data there is supporting it, giving it confidence).
Of course this is just my personal opinion, which determines my own investments, but it is in this light that I think the work of trying to formalize a structure that is likely to be inadequate seems a somewhat misdirected use of resources.
/Fredrik