Cthugha
Science Advisor
vanhees71 said: There's nothing jumplike in the measurement problem. The only problem with some Copenhagen flavors of "interpretation" is the introduction of the collapse, which is neither needed nor physically consistent. That's why I follow the minimal statistical interpretation which, with a grain of salt, is just Copenhagen without collapse. Although I cannot be sure about this, because of Bohr's very murky style of writing intermingling always unsharp philosophy with science, this seems to be more or less Bohr's point of view.
Well, I do not disagree - at least I think so. (And for the record: as an experimentalist I try to avoid discussing interpretations unless they advance to the point where they stop being mere interpretations and make predictions that can be tested experimentally, suggest a mathematical formalism that is easier to handle, or result in a computational speed-up.)
In this field of physics, people are usually interested in experiments involving conditional probabilities based on the outcomes of photon detection events, so there is some need to take measurements into account explicitly. If your take is that one should evaluate this using unitary evolution of the system, determine the probabilities for the outcomes of the first measurement, let the system evolve unitarily again from each of the possible eigenstates, determine the probabilities for the outcomes of the second measurement, and so on and so forth: yes, this works. If you additionally assume that the measurement process (or decoherence, or whatever you may call it) is essentially a low-probability game - meaning that a single photon does not interact with a single absorber in a way that drives the absorption probability amplitude up to 1, but instead interacts with thousands of absorbers, each of which is driven to an absorption probability of, say, 0.03, until one of them finally "clicks" - then this is still a fast but continuous process, and you can still get the correct probabilities by following all the subensembles microscopically.
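To make the bookkeeping concrete, here is a minimal numerical sketch of exactly that recipe: evolve unitarily, apply the Born rule for the first measurement, restart from each possible eigenstate, evolve again, and read off the joint and conditional probabilities for the second measurement. This is not from the post itself; the two-level Hamiltonian, evolution times, and measurement basis are arbitrary illustrative choices.

```python
import numpy as np

def unitary(H, t):
    """exp(-i H t) via eigendecomposition (hbar = 1)."""
    evals, evecs = np.linalg.eigh(H)
    return evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

# Two-level toy system: sigma_x drives oscillations between |0> and |1>.
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # measurement basis

psi0 = np.array([1.0, 0.0], dtype=complex)
t1, t2 = 0.3, 0.7          # evolution times before the two measurements
U1, U2 = unitary(H, t1), unitary(H, t2)

# First measurement: Born-rule probabilities after evolving to t1.
psi1 = U1 @ psi0
p_first = [abs(b.conj() @ psi1) ** 2 for b in basis]

# Second measurement: for each first outcome, restart from that eigenstate,
# evolve again, and accumulate the joint probabilities.
joint = np.zeros((2, 2))
for i, b in enumerate(basis):
    psi2 = U2 @ b.astype(complex)
    for j, c in enumerate(basis):
        joint[i, j] = p_first[i] * abs(c.conj() @ psi2) ** 2

print("P(first outcome):", p_first)
print("P(first, second):\n", joint)
print("P(second | first = outcome 0):", joint[0] / joint[0].sum())
```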
However, in terms of actual modeling, this approach is quite cumbersome. For open systems with a huge environment, I think it is only natural that people try to treat the environment in a more effective way, and the quantum jump formalism is a natural choice - propagating a wavefunction instead of the full density matrix saves a lot of time. Many people consider the "quantum jump" to be a Bayesian update of our information about the system rather than something inherent to it. I have always thought that within an open-systems scenario, where one does not have access to the full information about the system, this is the closest thing to the bare minimal interpretation you can get.
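As a rough illustration of that formalism, here is a hedged sketch of the standard Monte Carlo wave-function (quantum jump) unravelling for the simplest case, a freely decaying two-level emitter. Nothing here is taken from the post; the decay rate, time step, and trajectory count are arbitrary choices, and a first-order Euler step is used only for brevity. Each trajectory is propagated with a non-Hermitian effective Hamiltonian and interrupted by stochastic jumps; averaging over trajectories reproduces the density-matrix result, here the exponential decay of the excited-state population.

```python
import numpy as np

rng = np.random.default_rng(1)

gamma = 1.0                        # spontaneous decay rate
dt, n_steps, n_traj = 0.01, 500, 2000

sm = np.array([[0.0, 1.0],         # sigma_minus: |e> -> |g>, basis (|g>, |e>)
               [0.0, 0.0]])
sp = sm.T
n_op = sp @ sm                     # excited-state projector
H_eff = -0.5j * gamma * n_op       # H = 0; non-Hermitian effective Hamiltonian

excited = np.zeros(n_steps)
for _ in range(n_traj):
    psi = np.array([0.0, 1.0], dtype=complex)       # start in |e>
    for k in range(n_steps):
        excited[k] += np.real(psi.conj() @ n_op @ psi)
        # jump probability in this time step
        dp = gamma * dt * np.real(psi.conj() @ n_op @ psi)
        if rng.random() < dp:
            psi = sm @ psi                           # quantum jump: photon emitted
        else:
            psi = psi - 1j * dt * (H_eff @ psi)      # no-jump evolution, 1st order
        psi = psi / np.linalg.norm(psi)              # renormalize

excited /= n_traj
print("trajectory average vs exact exp(-gamma t) at t = 2:",
      excited[int(2.0 / dt)], np.exp(-2.0))
```

The same average could of course be obtained by integrating the Lindblad master equation for the density matrix directly; the point of the trajectory picture is that for large Hilbert spaces one only ever stores a state vector, and each individual record of jumps can be read as one possible detection history.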