vanesch
Staff Emeritus
Science Advisor
Gold Member
haroldjrbw said: So the question is what or WHO caused the probability wave of the universe to collapse and appear to us in this fashion. I know some try to eliminate the role of the observer through no-collapse theories like many worlds, but Copenhagen, Bell's Theorem and delayed choice show the observer affects which measurement occurs. I think this among other things points to a metaphysical origin.
Let us try to put things in order.
Quantum theory describes the state of a system as something that evolves unitarily and reversibly as a function of time. But that state is experimentally inaccessible. When we "decide to make a measurement" we have to stop that time evolution, and from the state at hand (as a mathematically calculated entity) we can compute the probabilities of the possible results of the measurement. According to the observed outcome, the state then jumps into an eigenstate corresponding to that result, and evolves unitarily from there on.
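In symbols (a standard textbook summary added here for clarity, not specific to this thread, for an observable A with eigenstates |a_i>): between measurements the state follows the unitary Schrödinger evolution, while a measurement yields outcome a_i with the Born-rule probability and projects the state onto the corresponding eigenstate:
$$|\psi(t)\rangle = U(t)\,|\psi(0)\rangle,\qquad U(t)=e^{-iHt/\hbar}\quad\text{(deterministic, reversible)}$$
$$P(a_i)=|\langle a_i|\psi\rangle|^2,\qquad |\psi\rangle \;\longrightarrow\; |a_i\rangle\quad\text{(probabilistic, irreversible)}$$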
For all practical purposes, this algorithm for calculating the probabilities of measurement outcomes WORKS. It was von Neumann who formalized it.
And now comes the crux of the Measurement Problem:
how can the evolution of an "unobserved" state be so radically different from an "observation"? The two are mathematically incompatible: the "observation" operation is a probabilistically chosen projector, while the "evolution" operation is a unitary operator.
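To make that incompatibility explicit (a standard linearity argument, added here for illustration with a hypothetical two-outcome observable): if a unitary interaction correlates each eigenstate with a definite pointer state of the apparatus,
$$U\big(|a_i\rangle\otimes|\text{ready}\rangle\big)=|a_i\rangle\otimes|\text{pointer}_i\rangle,$$
then linearity forces
$$U\big((\alpha|a_1\rangle+\beta|a_2\rangle)\otimes|\text{ready}\rangle\big)=\alpha\,|a_1\rangle\otimes|\text{pointer}_1\rangle+\beta\,|a_2\rangle\otimes|\text{pointer}_2\rangle,$$
a superposition of both pointer readings, never a single projected outcome occurring with probability $|\alpha|^2$ or $|\beta|^2$.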
If we think of "observation" as a physical phenomenon like any other, we have a problem. This problem still stands. Modern theories such as string theory or quantum gravity do not add much to it, because they fundamentally still work within that framework.
You could think that the question is open to scientific inquiry, by asking "what is performing the measurement?": it ought to make some difference whether I consider the apparatus that performs the measurement to be part of the system (with _I_ being the observer), or whether the apparatus itself is performing the measurement. However, decoherence theory (an application of QM) indicates that from the moment we need a macroscopic system (many degrees of freedom, coupled to a thermal bath) as a measurement apparatus, it makes no difference to the outcome whether we include it in the system or not!
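Schematically (a sketch of the decoherence argument, written here for clarity and not part of the original post): if the system entangles with the apparatus/environment as $|\Psi\rangle=\sum_i c_i\,|a_i\rangle\otimes|E_i\rangle$, tracing out the environment gives
$$\rho_S=\mathrm{Tr}_E\,|\Psi\rangle\langle\Psi|=\sum_i |c_i|^2\,|a_i\rangle\langle a_i| \;+\; \sum_{i\neq j} c_i c_j^{*}\,\langle E_j|E_i\rangle\,|a_i\rangle\langle a_j|,$$
and for a macroscopic apparatus the overlaps $\langle E_j|E_i\rangle$ are essentially zero, so the interference terms vanish and the predicted probabilities $|c_i|^2$ come out the same wherever you place the cut.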
So on the one hand decoherence is a blessing, because it tells you that the theory of QM is self-consistent: you can choose at what point you decide that the measurement is made without changing the predictions; on the other hand it makes experimental inquiry into the measurement problem very hard.
There are ways to tackle the problem, but all of them are very strange. One is that, in the end, consciousness is what performs the measurement. Another is that nothing performs the measurement (many worlds); but in that case, our subjective experience *chooses* a worldline.
Finally, a more down-to-earth approach is to try to slightly modify quantum theory, such that state evolution is not quite unitary, and can give rise to collapse (by introducing nonlinearities). But this approach has the difficulty that whatever you twiddle in QM, you seem to change the very accurate predictions which have been verified up to now.
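As an example of that family (added here purely for illustration: a GRW-style spontaneous-localization model, written schematically and from memory): the usual Hamiltonian evolution is supplemented by rare random "localization hits" occurring at a rate $\lambda$,
$$\frac{d\rho}{dt}=-\frac{i}{\hbar}[H,\rho]\;-\;\lambda\!\left(\rho-\int dx\,L_x\,\rho\,L_x\right),\qquad L_x=\left(\frac{\alpha}{\pi}\right)^{1/4} e^{-\alpha(\hat q-x)^2/2},$$
with $\lambda$ and $\alpha$ chosen so small that microscopic systems are essentially unaffected while macroscopic superpositions collapse almost immediately; at the level of individual state vectors these hits act as a nonlinear, stochastic collapse, and tuning the parameters without spoiling the verified predictions is exactly the difficulty mentioned above.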
cheers,
Patrick.