Is the collapse of the particle wave function?

  • #51
kith said:
What do you mean by outcome? I do get a state of definite polarization (resp. position) immediately after the filter (resp. slit). Regarding the time of the measurement, the situation is analogous to the measurement at the screen.

A definite measurement outcome is not a quantum state; it is a macroscopic "classical" event, like the registration of a particle's position on the screen. A definite outcome is the kind of event to which the Born rule (without state reduction) assigns probabilities.
 
  • #52
atyy said:
A definite measurement outcome is not a quantum state; it is a macroscopic "classical" event, like the registration of a particle's position on the screen. A definite outcome is the kind of event to which the Born rule (without state reduction) assigns probabilities.
I see this as an argument for using the term "measurement" more restrictively, but what are sequential measurements then? Textbook examples like multiple polarization filters, SG experiments, etc. don't produce intermediate outcomes.
 
  • #53
vanhees71 said:
It exists due to the preparation procedure, because after the 1st SG apparatus ##\sigma_z## and position are entangled. There's a 100% correlation between the value of ##\sigma_z## and the partial beams coming out of the 1st SG apparatus!

This is different from what I am talking about. In order to discuss collapse, there have to be two measurements; if one avoids making two measurements, it isn't clear that collapse is needed.
 
  • #54
kith said:
I see this as an argument for using the term "measurement" more restrictively, but what are sequential measurements then? Textbook examples like multiple polarization filters, SG experiments, etc. don't produce intermediate outcomes.

The polarization filter is an interesting case I don't understand well. Regardless, the textbook treatment of a polarizer is indeed very similar to a measurement followed by collapse since the Born rule is applied to the quantum state for describing the action of a polarizer. My guess is that there is a deterministic unitary description of a polarizer, but it's not immediately obvious to me.

However, to make things easy to discuss, one can follow the polarizer with a detection with a definite outcome and time stamp, as is done in Bell tests.
 
  • #55
A polarizer is usually effectively described as a projection operator, and as such you can argue that this is not a description in terms of a unitary time evolution. Of course this description is an effective and idealized one. A full microscopic theory of the interaction of the (quantized) electromagnetic field with the material of the polarizer is described by a unitary time evolution, and you don't need any collapse anymore. Collapse is a FAPP description of a preparation procedure in terms of filter measurements. As such I buy it, but not the claim that there's instantaneous collapse over the entire universe by a local measurement/interaction procedure. It's also not clear where, in principle, the cut between quantum and classical descriptions should be made. Of course, FAPP the (semi-)classical description of macroscopic objects is often precise enough, as in this example of a polarizer.

The same holds true, by the way, for other optical elements like quarter-wave plates or lenses, which are described by unitary operators. They are also effective descriptions of the much more complicated microscopic theory. You get them by coarse graining over the irrelevant microscopic degrees of freedom, where "irrelevant" is judged relative to the macroscopic scale whose resolution is sufficient to describe the phenomena.
 
  • #56
vanhees71 said:
A full microscopic theory of the interaction of the (quantized) electromagnetic field with the material of the polarizer is described by a unitary time evolution, and you don't need any collapse anymore. Collapse is a FAPP description of a preparation procedure in terms of filter measurements. As such I buy it, but not the claim that there's instantaneous collapse over the entire universe by a local measurement/interaction procedure. It's also not clear where, in principle, the cut between quantum and classical descriptions should be made.

There is no claim that collapse is not FAPP, but that is because in the orthodox Copenhagen interpretation, the wave function and its time evolution are all FAPP.

Anyway, the polarizer is tricky. And yes, in discussing Ballentine's 9.28 we do need two measurements, since that is what he is discussing. Eq 9.28 is conditioned on the outcome of a previous measurement, so there must be a measurement after the first SG device that produces a definite outcome. Eq 9.28 is exactly the collapse, and as far as I can tell, there is no known way to derive it from unitary evolution and the Born rule.

To be clear, the situation in Eq 9.28 is that there is measurement ##a## followed by measurement ##b##. A measurement is something which produces a definite macroscopic outcome. The probability ##P(A)## of the first outcome is given by the Born rule, and the probability ##P(B)## of the second outcome is given by the Born rule. However, experimentally there are also the joint probability ##P(A,B)## and the conditional probability ##P(B|A)##, neither of which is given by the Born rule and unitary evolution, because the Born rule only applies to measurements at a single time. So when Eq 9.28 is used to calculate ##P(B|A)##, it is a postulate beyond unitary evolution and the Born rule.
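To make this concrete, here is a minimal numerical sketch (my own toy example, assuming ideal projective measurements) of how Eq 9.28 enters the calculation: ##P(A)## comes from the Born rule alone, but ##P(B|A)## is computed on the reduced state.

```python
# Toy sketch: sequential sigma_z then sigma_x measurements on a spin-1/2.
# P(A) is pure Born rule; P(B|A) needs the reduced state of Eq 9.28.
import numpy as np

up_z = np.array([1, 0], dtype=complex)
up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)

psi = np.array([0.6, 0.8], dtype=complex)            # initial spin state

P_A = abs(np.vdot(up_z, psi))**2                     # Born rule: P(A = +z)
psi_A = up_z * np.vdot(up_z, psi) / np.sqrt(P_A)     # Eq 9.28: reduced state
P_B_given_A = abs(np.vdot(up_x, psi_A))**2           # Born rule on reduced state

print(P_A, P_B_given_A, P_A * P_B_given_A)           # 0.36, 0.5, 0.18
```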
 
  • #57
The fact that the beam after the first SG apparatus is split into two partial beams of definite ##\sigma_z## is already a measurement. The particles are macroscopically separated into particles with definite ##\sigma_z##. This can be verified by placing a screen to detect these two partial beams, but then you cannot make further experiments. So you take away the screen and take it as a fact that this separation is objective, i.e., not due to the presence of the detector but due to the SG apparatus. Then you can do experiments with the partial beams, running them through another SG apparatus measuring the same spin component (no further splitting of beams) or a different one (further splitting of beams with certain probabilities depending on the relative orientation of the magnetic field). As far as I know, all SG experiments have verified the predictions of standard QT, including Eq. (9.28). Whether you take it as an additional postulate or try to derive it (which in my opinion can be done in the simple case of the SG apparatus but of course not in all cases, as the example of the polarizer shows) is not so important. It's just a pretty well verified piece of QT, while the collapse hypothesis already has theoretical problems, let alone its empirical verification. E.g., I have never heard of an experiment demonstrating action at a distance in contradiction to the hypotheses of local relativistic QFT. All experiments so far are explained by local relativistic QFT. So there's no reason to believe in a collapse in clear contradiction to this very successful paradigm.
 
  • #58
vanhees71 said:
The fact that the beam after the first SG apparatus is split into two partial beams of definite ##\sigma_z## is already a measurement. The particles are macroscopically separated into particles with definite ##\sigma_z##. This can be verified by placing a screen to detect these two partial beams, but then you cannot make further experiments. So you take away the screen and take it as a fact that this separation is objective, i.e., not due to the presence of the detector but due to the SG apparatus. Then you can do experiments with the partial beams, running them through another SG apparatus measuring the same spin component (no further splitting of beams) or a different one (further splitting of beams with certain probabilities depending on the relative orientation of the magnetic field). As far as I know, all SG experiments have verified the predictions of standard QT, including Eq. (9.28). Whether you take it as an additional postulate or try to derive it (which in my opinion can be done in the simple case of the SG apparatus but of course not in all cases, as the example of the polarizer shows) is not so important. It's just a pretty well verified piece of QT, while the collapse hypothesis already has theoretical problems, let alone its empirical verification. E.g., I have never heard of an experiment demonstrating action at a distance in contradiction to the hypotheses of local relativistic QFT. All experiments so far are explained by local relativistic QFT. So there's no reason to believe in a collapse in clear contradiction to this very successful paradigm.

What you call the splitting of the beam into two is not a measurement. One must add a screen or an ancilla to get a measurement outcome to which the Born rule applies. The screen will destroy the particle, preventing a second measurement, but in principle quantum theory allows the coupling of an ancilla, followed by a measurement on the ancilla, leaving the system available for a second measurement. If there is no first measurement, then the conditional probability ##P(B|A)## and the joint probability ##P(A,B)##, where both ##A## and ##B## are measurement outcomes, are meaningless from the point of view of the orthodox interpretation, since quantum theory is only a tool to calculate the probabilities of measurement outcomes.
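For illustration, here is a toy numpy model (my own simplification) of such an ancilla coupling: a CNOT-type unitary copies the system's ##\sigma_z## value into an ancilla, and the Born rule applied to the ancilla alone reproduces the statistics of a direct measurement while the system survives for a second measurement.

```python
# Toy premeasurement: a CNOT correlates an ancilla with the system's
# sigma_z value; the Born rule is then applied to the ancilla only.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)   # control: system, target: ancilla

psi_sys = np.array([0.6, 0.8], dtype=complex)
anc0 = np.array([1, 0], dtype=complex)
state = CNOT @ np.kron(psi_sys, anc0)            # entangled system + ancilla

rho = np.outer(state, state.conj())
P0 = np.kron(np.eye(2), np.diag([1, 0]))         # projector: ancilla reads "0"
print(np.trace(P0 @ rho).real)                   # 0.36, same as measuring directly
```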
 
  • #59
Then you could never do these SG experiments measuring first ##\sigma_z## and then ##\sigma_x## in sequence, which brings me to the question of whether this gedankenexperiment has ever been done in practice.
 
  • #60
vanhees71 said:
Then you could never do these SG experiments measuring first ##\sigma_z## and then ##\sigma_x## in sequence, which brings me to the question of whether this gedankenexperiment has ever been done in practice.

As far as I know, this thought experiment has not been done. The closest I can think of are the Bell tests, which are sequential measurements in some frames of reference. But the important point is that in the orthodox Copenhagen interpretation, a measurement is something that produces a definite macroscopic outcome, also called a classical outcome. It is when one has sequential measurements with sequential outcomes and one needs to calculate P(B|A) or P(A,B) that collapse is postulated. It is not necessary to postulate collapse; for example, a formula for P(A,B) can be postulated directly without any collapse (in which case collapse can be derived). Another alternative is to do what bhobba does and postulate the equivalence of proper and improper mixtures following decoherence (again, collapse can be derived). But as far as I know, there has to be something beyond unitary evolution and the Born rule.
 
  • #61
kith said:
I see this as an argument for using the term "measurement" more restrictively, but what are sequential measurements then? Textbook examples like multiple polarization filters, SG experiments, etc. don't produce intermediate outcomes.
atyy said:
The polarization filter is an interesting case I don't understand well. Regardless, the textbook treatment of a polarizer is indeed very similar to a measurement followed by collapse since the Born rule is applied to the quantum state for describing the action of a polarizer. My guess is that there is a deterministic unitary description of a polarizer, but it's not immediately obvious to me.

The polarizing beam splitter does in fact have a unitary description given in https://vcq.quantum.at/fileadmin/Publications/2001-13.pdf (Fig. 1.9). In the Copenhagen interpretation, the collapse is only needed for calculating the joint distribution of sequential measurements P(A,B). If one doesn't calculate that, for example, by doing the measurement and ignoring the definite outcomes, in all cases I know of there is a unitary description that is sufficient, which is why I guessed that a polarizer has a unitary description.

The most common example in which state reduction can be used, but isn't necessary, is decoherence followed by a measurement with definite outcomes in which the results are ignored. That can be modeled simply by unitary evolution and a partial trace yielding decoherence of observables on a subsystem. State reduction and decoherence are consistent as long as the experimenter cannot undo the decoherence, which is a condition for using state reduction: the appearance of a macroscopic outcome which is "definite" or "irreversible" to the observer.

Incidentally, this is agreed on by Peres https://books.google.com/books?id=IjCNzbJYONIC&source=gbs_navlinks_s (p376), who, like Ballentine, is not very keen on collapse, except that Peres's book is nicely written.
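As a toy illustration of that last point (my own minimal model, treating the PBS abstractly as a unitary that routes H and V polarization into different output paths, as in the reference's Fig. 1.9), unitary evolution plus a partial trace over the path reproduces the loss of polarization coherence without any reduction postulate:

```python
# PBS modeled as a CNOT-like unitary: polarization is copied into the
# path degree of freedom; tracing out the path decoheres the polarization.
import numpy as np

psi_pol = np.array([1, 1], dtype=complex) / np.sqrt(2)   # diagonal polarization
path0 = np.array([1, 0], dtype=complex)

U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=complex)   # |H>|p0> -> |H>|p0>, |V>|p0> -> |V>|p1>

state = U @ np.kron(psi_pol, path0)
rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)
rho_pol = np.einsum('ikjk->ij', rho)                     # trace out the path
print(rho_pol.real)   # diagonal: off-diagonal coherence is gone (improper mixture)
```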
 
  • #62
atyy said:
As far as I know, this thought experiment has not been done. The closest I can think of are the Bell tests, which are sequential measurements in some frames of reference. But the important point is that in the orthodox Copenhagen interpretation, a measurement is something that produces a definite macroscopic outcome, also called a classical outcome.
Just to make sure I understand you correctly: you are saying that collapse is not needed in single measurements but is needed in sequential measurements. Yet no such sequential measurement has been performed yet (with the possible exception of Bell tests)? So excluding these, you are saying that collapse isn't needed to explain any experiment which has actually been performed?
 
  • #63
kith said:
Just to make sure I understand you correctly: you are saying that collapse is not needed in single measurements but is needed in sequential measurements. Yet no such sequential measurement has been performed yet (with the possible exception of Bell tests)? So excluding these, you are saying that collapse isn't needed to explain any experiment which has actually been performed?

Yes, collapse (or another postulate) is not needed in single measurements, but in sequential measurements to define the joint probability or conditional probability. The Bell test is such an experiment. But yes, other than that, I don't know of a specific experiment. If I were to look, I might try implementations of quantum computing (but there one often uses the principle of deferred measurement in which a final simultaneous measurement is computationally equivalent to sequential measurements).
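As an aside, here is a toy numerical check of that deferred-measurement equivalence (my own example, not taken from any reference): measuring qubit A and then conditionally flipping qubit B gives the same joint statistics as applying a CNOT and measuring both qubits once at the end.

```python
# Deferred measurement, toy check: measure-then-control vs control-then-measure.
import numpy as np

alpha, beta = 0.6, 0.8
psi_A = np.array([alpha, beta], dtype=complex)

# Sequential: Born rule on A, collapse, classically controlled X on B.
P_seq = {}
for a, amp in enumerate(psi_A):
    b = a                          # X flips B from |0> to |1> exactly when a = 1
    P_seq[(a, b)] = abs(amp) ** 2

# Deferred: CNOT(A -> B), then one final measurement of both qubits.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ np.kron(psi_A, np.array([1, 0], dtype=complex))
P_def = {(i // 2, i % 2): abs(state[i]) ** 2 for i in range(4)}

print(P_seq)   # {(0, 0): 0.36, (1, 1): 0.64}
print(P_def)   # same probabilities on the same outcomes
```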

Anyway, apart from experiments, the question is whether Ballentine is right to reject collapse and present his Eq 9.30 as a derivation. Ballentine's Eq 9.30 has exactly the same status regarding experimental tests as collapse. So the difference is not whether it has been tested, but rather whether Eq 9.28 (which is collapse, when taken with Ballentine's stipulation that it is conditional on the previous measurement outcome) can be derived from unitary evolution and the Born rule alone, en route to Eq 9.30. I would have no problem if he had said he prefers not to postulate collapse but to directly postulate Eq 9.30, and then derive collapse as a result. (In principle, Ballentine is already wrong, since among his many fundamental errors he rejects collapse and accepts it via Eq 9.28, which is a contradiction.)

To be clear (and to restate a point from post #21), Eq 9.28 can be defined using only unitary evolution and the Born rule as a conditional state for simultaneous measurements. Where the extra assumption enters is in the assignment of a rule for time evolution to the conditional state, so that the conditional state holds for sequential as well as simultaneous measurements.
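In symbols, with state ##\rho## and projector ##\Pi_a## for outcome ##a##: the Born rule alone gives ##P(a) = \mathrm{Tr}(\Pi_a \rho)##, while Eq 9.28 is the additional map ##\rho \to \rho_a = \Pi_a \rho \Pi_a / \mathrm{Tr}(\Pi_a \rho)##. The extra assumption is that this ##\rho_a##, evolved unitarily to the time of the second measurement, is what the Born rule then acts on: ##P(b|a) = \mathrm{Tr}(\Pi_b\, U \rho_a U^\dagger)##, so that ##P(a,b) = P(a)\,P(b|a)##.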
 
  • #64
I'm not entirely sure whether it is necessary here, but another place where state reduction is used is the formalism of continuous observation, which is used in obtaining the classical limit. The continuous tracks in a cloud chamber are naturally treated by this formalism. The linked paper also treats imaging the resonance fluorescence from a single atom as a concrete example of a continuous position measurement.

http://arxiv.org/abs/quant-ph/0611067
A Straightforward Introduction to Continuous Quantum Measurement
Kurt Jacobs, Daniel A. Steck
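To give a feel for the formalism, here is a rough toy integrator (my own sketch: Euler scheme, arbitrary parameters; see the paper for the actual equations) for a qubit whose ##\sigma_z## is monitored continuously. The state diffuses and gradually settles into one ##\sigma_z## eigenstate, i.e. a noise-driven, gradual state reduction.

```python
# Toy diffusive stochastic master equation for continuous sigma_z monitoring:
# drho = k D[sz]rho dt + sqrt(k) H[sz]rho dW, integrated with a crude Euler step.
import numpy as np

rng = np.random.default_rng(0)
sz = np.diag([1.0, -1.0]).astype(complex)
rho = np.full((2, 2), 0.5, dtype=complex)      # start in (|0> + |1>)/sqrt(2)
k, dt = 1.0, 1e-3

for _ in range(20000):
    ez = np.trace(sz @ rho).real               # <sigma_z>
    dW = rng.normal(0.0, np.sqrt(dt))
    drho = k * (sz @ rho @ sz - rho) * dt \
         + np.sqrt(k) * (sz @ rho + rho @ sz - 2 * ez * rho) * dW
    rho = rho + drho
    rho = rho / np.trace(rho).real             # guard against numerical drift

print(np.diag(rho).real)                       # close to [1, 0] or [0, 1]
```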
 
  • #65
To add to the long list of texts (including Dirac, Landau & Lifshitz, Weinberg, Nielsen & Chuang, Holevo, Haag, Haroche & Raimond) that contradict Ballentine's claim of unitary evolution without state reduction, there is also Jonathan Dimock's "Quantum Mechanics and Quantum Field Theory: A Mathematical Primer".

https://books.google.com/books?id=Y4m8V7-83swC&source=gbs_navlinks_s (p41)
 
  • #66
atyy said:
Yes, collapse (or another postulate) is not needed in single measurements, but in sequential measurements to define the joint probability or conditional probability. The Bell test is such an experiment. But yes, other than that, I don't know of a specific experiment. If I were to look, I might try implementations of quantum computing (but there one often uses the principle of deferred measurement in which a final simultaneous measurement is computationally equivalent to sequential measurements).
This brings the dilemma with the collapse assumption to utmost clarity! You claim that in the Bell test (or the quantum eraser example) the measurement of one of the particles/photons would have an influence on the measurement of the other even if the measurement events (the "click of the detector") are space-like separated. This contradicts Einstein causality.

I prefer to claim that there is in fact no influence of the "first" measurement on the "second" (from the point of view of a reference frame where the space-like separated click events are not simultaneous), and ergo no collapse. This is consistent with the very foundations of local and microcausal quantum field theories. You assume by construction that local operators representing observables (like the energy density of the em. field (vulgo "photons"), which is the observable measured with photo detectors!) commute at space-like separation. This makes the whole theory consistent with relativistic causality, and it excludes an instantaneous collapse in any reference frame. If this were not the case, you would have really curious effects in your physical interpretation: by simply changing the reference frame you can flip the time order of the clicks, which would imply that the collapse is a completely different "event" depending on the reference frame. No such asymmetries (violations of Poincare invariance) have been observed so far. Only the (I still claim totally unnecessary) assumption of an instantaneous collapse implies such self-contradictory statements!
 
  • #67
vanhees71 said:
This brings the dilemma with the collapse assumption to utmost clarity! You claim that in the Bell test (or the quantum eraser example) the measurement of one of the particles/photons would have an influence on the measurement of the other even if the measurement events (the "click of the detector") are space-like separated. This contradicts Einstein causality.

That is the claim you make, not the orthodox interpretation. In the orthodox interpretation, the wave function is not necessarily real, but simply a way to calculate the outcomes of experiments. Collapse of the wave function is consistent with all experiments to date, including the requirement of special relativity that no classical information is transmitted faster than light. If by "Einstein causality" one means that there is no superluminal transfer of classical information, collapse is consistent with that requirement. If by "Einstein causality" one means that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events, then Bell's theorem excludes such an explanation.
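To make this concrete, here is a small numerical illustration (standard singlet-state formulas, ideal detectors assumed): the quantum predictions violate the CHSH bound of 2, yet each side's marginal statistics are independent of the distant setting, so no classical information is transmitted.

```python
# Singlet-state CHSH check: correlations beat the classical bound of 2,
# but the local marginals do not depend on the far-away setting.
import numpy as np

def P(s, t, a, b):
    """Joint probability of outcomes s, t = +/-1 at angles a, b (singlet)."""
    return 0.25 * (1 - s * t * np.cos(a - b))

def E(a, b):
    return sum(s * t * P(s, t, a, b) for s in (1, -1) for t in (1, -1))

a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))                                   # 2*sqrt(2) > 2: CHSH violated
print(sum(P(1, t, a1, b1) for t in (1, -1)),    # marginal P(s = +1) = 0.5 ...
      sum(P(1, t, a1, b2) for t in (1, -1)))    # ... independent of the setting b
```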
 
  • #68
atyy said:
If by "Einstein causality" one means that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events, then Bell's theorem excludes such an explanation.
Aren't those quantum correlations naturally explained by the commutation relations of quantum observables, just as correlations obeying Bell's inequalities would be explained by the commutation relations of classical observables, without any need to resort to local/nonlocal debates, if the world happened to be classical? The experiments that violate the classical inequalities then simply show the world is not classical.
 
  • #69
TrickyDicky said:
Aren't those quantum correlations naturally explained by the commutation relations of quantum observables, just as correlations obeying Bell's inequalities would be explained by the commutation relations of classical observables, without any need to resort to local/nonlocal debates, if the world happened to be classical? The experiments that violate the classical inequalities then simply show the world is not classical.

In using the quantum formalism as an explanation, one has to use the observables and the quantum state. When used as part of an explanation for nonlocal correlations, the quantum state itself is nonlocal, because a local classical preparation procedure at a particular time is assigned a quantum state in Hilbert space, i.e., the quantum state is associated with the entire spacelike surface of simultaneity.
 
  • #70
atyy said:
In using the quantum formalism as an explanation, one has to use the observables and the quantum state.
With one caveat: the observables cannot be obviated in empirical (physical) theories; they are necessarily part of any such theory. The quantum states are abstractions that need not be part of reality, but can be just operational tools.
When used as part of an explanation for nonlocal correlations, the quantum state itself is nonlocal, because a local classical preparation procedure is assigned a quantum state in Hilbert space, i.e., the quantum state is associated with the entire spacelike surface of simultaneity.
Besides applying here the above caveat, which allows one not to take that association seriously until one is sure the state is not just epistemic, I would insist on always swapping the term "nonlocal" for the less esoteric "nonclassical", since it is admitted that the concept of collapse need not be physical, and nonlocality in QM is simply refuting, through confirmed predictions, certain classical intuitive notions.
 
  • #71
TrickyDicky said:
With one caveat: the observables cannot be obviated in empirical (physical) theories; they are necessarily part of any such theory. The quantum states are abstractions that need not be part of reality, but can be just operational tools.

Besides applying here the above caveat, which allows one not to take that association seriously until one is sure the state is not just epistemic, I would insist on always swapping the term "nonlocal" for the less esoteric "nonclassical", since it is admitted that the concept of collapse need not be physical, and nonlocality in QM is simply refuting, through confirmed predictions, certain classical intuitive notions.

Yes, the state can be considered just a tool, but in that case it is not an "explanation", since an "explanation" by definition refers to something real. So if the quantum formalism is to be an explanation, then the state is considered real by definition. So either the quantum formalism does not provide an explanation, or it provides a nonlocal explanation.
 
  • #72
atyy said:
Yes, the state can be considered just a tool, but in that case it is not an "explanation", since an "explanation" by definition refers to something real. So if the quantum formalism is to be an explanation, then the state is considered real by definition. So either the quantum formalism does not provide an explanation, or it provides a nonlocal explanation.
Yes, it provides a nonclassical explanation that doesn't need to mention states, although it affects how they are measured: the quantum commutation relations.
 
  • #73
atyy said:
That is the claim you make, not the orthodox interpretation. In the orthodox interpretation, the wave function is not necessarily real, but simply a way to calculate the outcomes of experiments. Collapse of the wave function is consistent with all experiments to date, including the requirement of special relativity that no classical information is transmitted faster than light. If by "Einstein causality" one means that there is no superluminal transfer of classical information, collapse is consistent with that requirement. If by "Einstein causality" one means that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events, then Bell's theorem excludes such an explanation.
No! Relativistic QFT, including the possibility of entanglement over long spacelike distances, does not contradict Einstein causality. It's by construction that interactions are local. The correlations are due to the preparation in an entangled state, and this preparation event is timelike separated from, and objectively precedes, the measurements demonstrating the violation of Bell's inequality. There's no causal influence between spacelike separated measurement events in this "minimal statistical interpretation" of the quantum state. That's a very orthodox interpretation without any need for instantaneous collapses or anything like it.

You also don't say that something collapses when the weekly Lotto numbers are drawn and become definite values, of which I previously had only probabilistic knowledge.
 
  • #74
vanhees71 said:
No! Relativistic QFT, including the possibility of entanglement over long spacelike distances, does not contradict Einstein causality. It's by construction that interactions are local. The correlations are due to the preparation in an entangled state, and this preparation event is timelike separated from, and objectively precedes, the measurements demonstrating the violation of Bell's inequality. There's no causal influence between spacelike separated measurement events in this "minimal statistical interpretation" of the quantum state. That's a very orthodox interpretation without any need for instantaneous collapses or anything like it.

What do you mean by "Einstein causality"? Do you mean (A) no faster than light transmission of classical information, or (B) that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events?

vanhees71 said:
You also don't say that something collapses when the weekly Lotto numbers are drawn and become definite values, of which I previously had only probabilistic knowledge.

Are you just objecting to terminology? Ballentine's Eq 9.28 is "collapse" or "state reduction" or "a change from an improper to a proper mixture after decoherence" (which is bhobba's preferred way to state the collapse postulate).

For example, Matteo Paris calls Postulate II.4 on p9 of http://arxiv.org/abs/1110.6815 "state reduction", and that is essentially Ballentine's Eq 9.28.

It is important to note that although collapse or state reduction is related to the updating of one's knowledge after a Lotto drawing, it is not the same thing, which is the point that bhobba makes about the need for a postulate about the change from an improper to a proper mixture after decoherence.
 
  • #75
It is probably relevant here to be clear about the differences between relativistic and non-relativistic QM. It doesn't seem fair to demand that NRQM be Lorentz invariant when it is explicitly Galilean invariant, and therefore all the discussions about causality, nonlocality, the weirdness of entangled correlations and FTL, and atyy's use of Bell tests to try to endorse instantaneous collapse, bringing in relativistic concepts like light cones or spacelike surfaces of simultaneity, sound like reproaching nonrelativistic QM for being nonrelativistic and are out of place.
Yes, the commutation relations between momentum and position operators are not relativistic, but the important thing here, where their predictive power in quantum microscopic experiments resides, is that they are also nonclassical by their quantum nature.

It is obvious something has to change in the commutation relations when going to QFT, and that is basically that position is no longer an observable operator; that is enough to avoid not only any FTL causal influences but also the need for any collapse problem or conundrum when claiming to be relativistic.
Collapse in the NRQM case is just a redundant reminder that we are using a nonrelativistic approximation, so it is not something that needs to be "solved".
 
  • #76
TrickyDicky said:
atyy's use of Bell tests to try to endorse instantaneous collapse

It is important to note that the key point is not that there is any FTL communication. The point is not FTL, but sequential measurements, and whether unitarity and the Born rule are enough to reach Ballentine's Eq 9.30. The Bell test is simply the best example I know of a sequential measurement.

I would like to point out that bhobba's ensemble interpretation, which seems correct to me, also does not agree that unitarity and the Born rule are enough, which is why he postulates the change from an improper to a proper mixture, which is equivalent to collapse.
 
  • #77
atyy said:
(B) that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events?
Again, in QFT both position and time are not observables, just parameters, and as such the notions of "spacelike separation" or "past light cone" can't be used to reject or put forth an explanation of nonclassical correlations; these are orthogonal to such concepts.
 
  • #78
atyy said:
It is important to note that the key point is not that there is any FTL communication. The point is not FTL, but sequential measurements, and whether unitarity and the Born rule are enough to reach Ballentine's Eq 9.30. The Bell test is simply the best example I know of a sequential measurement.

I would like to point out that bhobba's ensemble interpretation, which seems correct to me, also does not agree that unitarity and the Born rule are enough, which is why he postulates the change from an improper to a proper mixture, which is equivalent to collapse.
It is well known that unitarity + Born are not enough, but to me collapse is not where we have to look for answers; this is the preferred basis issue. One implicitly uses a preferred temporal frame to filter or reduce sequentially, either through measurement or preparation of states.
 
  • #79
TrickyDicky said:
It is well known that unitarity + Born are not enough, but to me collapse is not where we have to look for answers; this is the preferred basis issue. One implicitly uses a preferred temporal frame to filter or reduce sequentially, either through measurement or preparation of states.

It is more than the preferred basis issue. For simplicity in the present discussion, if one assumes decoherence is perfect, the preferred basis can be specified. However, that still does not explain why there is a proper mixture, which is why bhobba also postulates that there is a change from an improper mixture to a proper mixture.
 
  • #80
Of course, I don't blame non-relativistic QT for not being relativistically covariant. It should be Galilei covariant, but it's of course not Poincare covariant. Why should it be? Without relativity you can invoke a collapse with much less headache than in relativistic QT. There's no problem with action at a distance in non-relativistic physics, as, e.g., in Newton's law of gravity. The only problem is that nature doesn't realize the Galilean space-time structure but the relativistic one.

Einstein causality means that there's no causal connection between space-like separated events. The correlations in entangled properties of far-distant parts of a quantum system are there from the beginning of their preparation and are not caused by measuring an observable locally on a part of the system. This is so in local relativistic QFTs, which by construction do not lead to faster-than-light causal signals. That restriction is also compatible with the properties the S-matrix must have to make physical sense (unitarity and Poincare covariance).
 
  • #81
atyy said:
It is more than the preferred basis issue. For simplicity in the present discussion, if one assumes decoherence is perfect, the preferred basis can be specified. However, that still does not explain why there is a proper mixture, which is why bhobba also postulates that there is a change from an improper mixture to a proper mixture.
It doesn't need to explain it, because by definition of the epistemic notion of state one must admit an ignorance (statistical) interpretation for all mixed states, so one can avoid the "improper vs. proper" mixed-state distinction. In other words, one doesn't need to explain it any more than one has to explain that standard QM is nonrelativistic (and in that sense incomplete, as long as we consider a relativistic world as something more accurate when describing the physics) and that therefore states cannot be any more than an epistemic (incomplete) approximation tool.
 
  • #82
TrickyDicky said:
It doesn't need to explain it, because by definition of the epistemic notion of state one must admit an ignorance (statistical) interpretation for all mixed states, so one can avoid the "improper vs. proper" mixed-state distinction. In other words, one doesn't need to explain it any more than one has to explain that standard QM is nonrelativistic (and in that sense incomplete, as long as we consider a relativistic world as something more accurate when describing the physics) and that therefore states cannot be any more than an epistemic (incomplete) approximation tool.

Perhaps "explain" is not the right word. Let me explain what I mean again. It is not enough to state that decoherence defines a preferred basis - one must say what the significance of the preferred basis is. The significance of the preferred basis is that it defines two sub-ensembles and a link between them: sub-ensembles of measurement outcomes and sub-ensembles of the resulting quantum state. This is beyond unitary evolution and the Born rule because these sub-ensembles were not defined within the theory prior to measurement or to perfect decoherence.
 
  • #83
vanhees71 said:
Of course, I don't blame non-relativistic QT for not being relativistically covariant. It should be Galilei covariant, but it's of course not Poincare covariant. Why should it be? Without relativity you can invoke a collapse with much less headache than in relativistic QT. There's no problem with action at a distance in non-relativistic physics, as, e.g., in Newton's law of gravity. The only problem is that nature doesn't realize the Galilean space-time structure but the relativistic one.

Again, one has to stress that collapse causes no problems with relativity, because collapse does not permit faster-than-light communication of classical information. Collapse is part of the structure of relativistic quantum field theory. In part, it can be argued that if one adds relativity to the postulates of quantum theory, collapse need not be postulated, but can be derived for spacelike separated observations. The only problem with collapse is whether a Schroedinger picture exists in relativistic quantum field theory, but at least at the non-rigorous level at which the standard model is formulated, the Schroedinger picture is usually assumed to exist.

vanhees71 said:
Einstein causality means that there's no causal connection between space-like separated events. The correlations in entangled properties of far-distant parts of a quantum system are there from the beginning of their preparation and are not caused by measuring an observable locally on a part of the system. This is so in local relativistic QFTs, which by construction do not lead to faster-than-light causal signals. That restriction is also compatible with the properties the S-matrix must have to make physical sense (unitarity and Poincare covariance).

By "The correlations in entangled properties of far-distant parts of a quantum system are there from the beginning of their preparation and not caused by measuring an observable locally on a part of the system." do you mean that the only cause of the measurement outcome on anyone side is the preparation procedure and the measurement setting on that side, because both are in the past light cone of the measurement outcome?
 
  • #84
atyy said:
It is not enough to state that decoherence defines a preferred basis - one must say what the significance of the preferred basis is. The significance of the preferred basis is that it defines two sub-ensembles and a link between them: sub-ensembles of measurement outcomes and sub-ensembles of the resulting quantum state. This is beyond unitary evolution and the Born rule because these sub-ensembles were not defined within the theory prior to measurement or to perfect decoherence.
I have not stated anything about decoherence, and I agree that decoherence is not enough to solve the preferred basis conundrum, and therefore also agree that this goes beyond unitary evolution + Born rule. But I'm sure you can see that a purely statistical ensemble interpretation of QM avoids/ignores the preferred basis problem, because it takes Born seriously: all there is behind the quantum states are probabilities. And you have admitted in previous posts that in two-state cases like the polarizer or Stern-Gerlach, unitarity is enough. There are clear reasons to think that in the remaining cases, especially the continuous-eigenvalue case, there is not a collapse in the way it is usually postulated in Copenhagen, due to problems with the very concept of a state vector.
 
  • #85
Then please don't call it collapse! Collapse implies instantaneous, i.e., faster-than-light signal propagation (whether classical or quantum I can't say, because I don't know what's meant by a classical signal)! What's meant by "collapse" is simply the acknowledgment of the outcome of a measurement, caused by local interactions of the quantum system with the measurement apparatus. "Acknowledgment" doesn't imply a conscious observer but can be just the storage of the outcome of the measurement on a photo plate, a digital storage device, or whatever.

The correlations described by entanglement are there because of the preparation procedure of the system in an entangled state. All measurement events are in the future light cone of the "preparation event". So there is no violation of Einstein causality (by construction of the theory!) as long as you describe the process within a local microcausal relativistic QFT.

Also, where is there a problem with a "preferred basis"? Isn't it simply the measurement device which decides what I measure? A state by itself has no preferred basis. You can express it in any basis you like, and the physical observable outcomes described by it are independent of this choice of basis (you can even formulate everything in terms of abstract Hilbert-space objects without reference to any particular basis). When calculating probabilities according to the Born rule, you have of course to define what's measured and then take the corresponding eigenvectors of the measured observable. In some sense you can say that this choice of which observable you measure is the choice of a preferred basis, but why say it in such a complicated way?

E.g., in the "teleportation experiment", what's measured are the single-photon energy-density distributions at Alice's and Bob's place, which can be interpreted as the detection probability position distributions around these places. If you take A's and B's measurement protocols with sufficiently accurate detection-time "stamps" you can also say that you measure the joint two-photon detection probability position distributions, and this reveals the correlations described by entanglement. There's no need for some instantaneous collapse nor the assumption of a "preferred basis", except the choice of the basis made in detecting photons (with a certain polarization at the two places in this case).
 
  • #86
vanhees71 said:
Also, where is there a problem with a "preferred basis"? Isn't it simply the measurement device which decides what I measure? A state by itself has no preferred basis. You can express it in any basis you like, and the physical observable outcomes described by it are independent of this choice of basis (you can even formulate everything in terms of abstract Hilbert-space objects without reference to any particular basis). When calculating probabilities according to the Born rule, you have of course to define what's measured and then take the corresponding eigenvectors of the measured observable. In some sense you can say that this choice of which observable you measure is the choice of a preferred basis, but why say it in such a complicated way?
Whether one sees it as a problem is quite subjective; clearly for you it is not. But it's simply a defining feature of quantum theory, being based on the introduction of a minimum quantum scale (h). No one can deny that it has fostered an industry of interpretational debates, because when mixed with a basis-independent mathematical formalism abstracted from measurement, it gives room to all sorts of philosophical ambiguities with little practical interest.
E.g., in the "teleportation experiment", what's measured are the single-photon energy-density distributions at Alice's and Bob's place, which can be interpreted as the detection probability position distributions around these places. If you take A's and B's measurement protocols with sufficiently accurate detection-time "stamps" you can also say that you measure the joint two-photon detection probability position distributions, and this reveals the correlations described by entanglement. There's no need for some instantaneous collapse
Right.
nor the assumption of a "preferred basis", except the choice of the basis made in detecting photons (with a certain polarization at the two places in this case).
This is conceding what you are rejecting all in the same sentence.
 
  • #87
TrickyDicky said:
Whether one sees it as a problem is quite subjective; clearly for you it is not. But it's simply a defining feature of quantum theory, being based on the introduction of a minimum quantum scale (h). No one can deny that it has fostered an industry of interpretational debates, because when mixed with a basis-independent mathematical formalism abstracted from measurement, it gives room to all sorts of philosophical ambiguities with little practical interest.
Right.
Well, I don't know whether I should take this as a "bug or a feature" of the idea of a "problem of a preferred basis". I tend to the former conclusion ;-).
 
  • #88
vanhees71 said:
Well, I don't know whether I should take this as a "bug or a feature" of the idea of a "problem of a preferred basis". I tend to the former conclusion ;-).
So do I.
 
  • #89
vanhees71 said:
Then please don't call it collapse! Collapse implies instantaneous, i.e., faster-than-light signal propagation (whether classical or quantum I can't say, because I don't know what's meant by a classical signal)! What's meant by "collapse" is simply the acknowledgment of the outcome of a measurement, caused by local interactions of the quantum system with the measurement apparatus. "Acknowledgment" doesn't imply a conscious observer but can be just the storage of the outcome of the measurement on a photo plate, a digital storage device, or whatever.

Well, then it is just a matter of terminology. As I have said many times throughout this thread and elsewhere, "collapse" is just another word for "state reduction" or "conditional state" or "a posteriori state" or "quantum jump", and is essentially Ballentine's Eq 9.28. The understanding you are arguing against is not part of the orthodox Copenhagen-style interpretation given in Landau and Lifshitz or in Cohen-Tannoudji, Diu and Laloe. The quantum state or wave function is not necessarily real in the orthodox interpretation, so the interpretation is silent on whether there is any "real" faster than light "influence".

However the minimal interpretation does have a classical/quantum cut, so there are classical signals. Together with the incorrect criticism of state reduction in Ballentine, the lack of an explicit statement of the classical/quantum cut is another fundamental misconception in Ballentine. Because of a failure to state the classical/quantum cut, Ballentine does not have a proper interpretation of the term "measurement", which requires a classical outcome. The lack of a proper conception of measurement shows when Ballentine claims that the orthodox interpretation is wrong and suggests that it fails the experimental test of the spin recombination experiment. The misconceptions concerning the classical/quantum cut, the physical meaning of "measurement", and the collapse postulate lead Ballentine further to dispute a standard explanation of the quantum Zeno effect, and to claim that the orthodox interpretation is inconsistent with data from continuous observations.

There are interpretations that do not require a classical/quantum cut such as the Bohmian interpretation, and the Many-Worlds approach. However, Ballentine does not explicitly state hidden variables, and the only hidden variable theory associated with his work is the erroneous one in his 1970 review. If Ballentine is implicitly assuming Many-Worlds, then his point of view is possibly defensible.
 
  • #90
Nick V said:
Is the collapse of the wave function of the electron in the double slit experiment based purely on the act of observation? Or could it be that the way the instrument used to measure the electron caused it to collapse by how it physically interacted with the electron? Keep in mind the delayed choice experiment.

You might not realize it but I believe the question you are asking is actually a very important one. The lack of particularly helpful responses is largely because "we don't know". That is to say, the common quantum physicist doesn't know, though they'd never admit it.

Your question pertains to a well known problem with what is properly called the "von Neumann chain". As you correctly intuit, there is some ambiguity in the Copenhagen interpretation of QM with regards to what exactly qualifies as an "observation". Does the detector actually "observe" the particle, or is the detector just another part of the system?

Taking the double slit experiment as an example, when the electron or photon leaves the gun, it ceases to be localised precisely in spacetime; it becomes describable only in terms of probabilities. You can think of it as existing only in a sort of suspended state; it really takes on a superposition of possible states. To simplify the problem we assume there are only two possible outcomes: either the electron goes through the left slit or the right one. So for the period of time between firing and detecting the particle, it actually exists as a superposition of both states. It is in limbo, so to speak.

Now, suppose you set up a detector of some kind on the other side of the screen to check which slit the particle went through. But now suppose you leave the room... Lock the door and let it all happen. The electron will acquire the superposition, travel (as a wave) through the slits, the two waves will interfere and ultimately reach the detector, but then what happens? A decision needs to be made. Does the detector record the electron or not? There is a very important question here. Has the system "collapsed"? Or is the detector now part of the system? Is the detector in limbo just like the electron was? The laws of quantum mechanics do not pick and choose. The principle states that everything in the universe is fundamentally a quantum system, and so, logically, the answer is yes, the detector does pick up the quantum dichotomy. The detector has become a part of the system and has not collapsed; it is now a superposition of two states, one in which the electron was detected, and one in which it wasn't.
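To see this concretely, here is a toy simulation (a deliberately simplified sketch, assuming ideal interactions): each "detector" qubit gets correlated with the system by a CNOT, and unitary evolution alone leaves the whole chain in a single entangled superposition; it never picks an outcome.

```python
# Von Neumann chain, toy version: system qubit plus a chain of detector
# qubits, each correlated with the system by a CNOT. The global state
# stays a pure superposition no matter how many detectors are added.
import numpy as np

def cnot(n, control, target):
    """Dense CNOT on n qubits (qubit 0 is the most significant bit)."""
    dim = 2 ** n
    U = np.zeros((dim, dim), dtype=complex)
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1
    return U

n = 4                                            # system + 3 detectors
state = np.zeros(2 ** n, dtype=complex)
state[0b0000] = state[0b1000] = 1 / np.sqrt(2)   # system in superposition
for d in range(1, n):
    state = cnot(n, 0, d) @ state                # detector d "measures" the system

nz = np.flatnonzero(np.abs(state) > 1e-12)
print([format(i, f'0{n}b') for i in nz])         # ['0000', '1111']: still superposed
```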

Now, think about this. You could extend this chain of detectors detecting detectors for as long as you wanted and surely, each subsequent detector would simply pick up the superposition. This chain of interactions is called the von Neumann chain, and the million dollar question in all this is, "Where does it end?". We know it has to end somewhere because well... what happens when you walk in the room and look at the detector? Do you pick up the dichotomy as well? Maybe you do; there are theories to suggest that you split in two, as it were, but you are only capable of experiencing one outcome per universe. Whatever the case, something special seems to happen whenever a "sentient being" observes the system. The chain is clearly broken; there is something fundamentally important about us in this way. Perhaps this is what it means to be "aware".

But this is not the place for philosophy, I merely bring this idea to your knowledge as I believe this may be exactly what you are looking for.

For further reading on the subject I suggest you acquire a copy of "The Self-Aware Universe" by Dr. Amit Goswami PhD.
 
  • #91
James White said:
You could extend this chain of detectors detecting detectors for as long as you wanted and surely, each subsequent detector would simply pick up the superposition. This chain of interactions is called the von Neumann chain, and the million dollar question in all this is, "Where does it end?".
I have asked this question on PF in the past. From those discussions, I was under the impression that decoherence was believed to prevent the "Von Neumann Chain" or the similar "Wigner's Friend" dilemmas from being an issue.
Was I mistaken in that understanding?
 
  • #92
Feeble Wonk said:
I have asked this question on PF in the past. From those discussions, I was under the impression that decoherence was believed to prevent the "Von Neumann Chain" or the similar "Wigner's Friend" dilemmas from being an issue.
Was I mistaken in that understanding?

You are correct to be under that impression, as this is the current explanation, insofar as it can be called an "explanation". It's more of a cheeky way out, like so many things in quantum physics, unfortunately. This is entirely my opinion, but I've been studying QM for some time, and it seems to me that the majority of physicists nowadays are so thoroughly confused by it that they are burying the problem in necessarily complex mathematics, and many of the analytic solutions are "hacks" at best.

If you want a purely "scientific" answer (that is, the kind of answer that would not be ridiculed among mainstream physicists) then the answer is essentially, "We don't really know"; QM is still very poorly understood (although there are things we understand quite well, such as QED). But if you're anything like me, you want more than just a "We don't know", and if that's the case then I cannot urge you enough to dare to venture slightly beyond the confines of accepted theory. Read "The Self-Aware Universe" by Dr. Amit Goswami; he is a well respected quantum physicist, you might even have heard of him. He struggled with questions just like this for decades before stumbling onto the answer; it's still largely a work in progress, but I'd be very surprised if it didn't change your entire interpretation of QM.

Remember, major breakthroughs in science are always ignored and ridiculed before being accepted. The trick is to not take what anyone tells you as gospel.
 
  • #93
It could definitely be caused by the first. I don't know about the second.
 
  • #94
I've been enjoying the exchange between Atyy and Vanhees71, and I don't want to distract from that debate... or worse, get the thread terminated for an overly philosophical tenor. Yet, I'd like to follow up on this.
James White said:
Read "The self-aware universe" by Dr. Amit Goswami PhD, he is a well respected quantum physicist, you might even have heard of him.
I actually read Goswami's book several years ago, and was very intrigued by his perspective regarding QT's implications. However, I believe it was published in 1995, and my understanding is that the process of decoherence has become significantly more established since that time. Is there still a significant school of thought within the physics community that posits the requirement for a "conscious" observation of measurement to reduce (>_< trying to avoid the "collapse" term) the quantum state?
 
  • #95
Feeble Wonk said:
However, I believe it was published in 1995, and my understanding is that the process of decoherence has become significantly more established since that time. Is there still a significant school of thought within the physics community that posits the requirement for a "conscious" observation of measurement to reduce (>_< trying to avoid the "collapse" term) the quantum state?

If you look at bhobba's post #18, that is, I believe, the correct and consensus position. Decoherence alone is not enough to remove the need for a measuring apparatus or an observer to recognize a definite outcome, upon which the wave function collapses or an improper mixture is converted to a proper mixture. A "conscious" observer is not really any better defined than an "observer" or a "measuring apparatus", but if hidden variables or many-worlds are not postulated, there has to be something more that is able to register definite measurement outcomes.

In addition to the link in bhobba's post #18, other references agreeing with this point are
https://www.amazon.com/dp/0198509146/?tag=pfamazon01-20 p82
https://www.amazon.com/dp/3540610499/?tag=pfamazon01-20 p301

So yes, the pioneers of quantum mechanics did their job well. There were some errors, such as von Neumann's flawed proof of the impossibility of hidden variables, and the projection postulate is not the most general rule of state reduction, since it runs into trouble with continuous variables. However, although some textbooks like Bohm's wrongly stated the impossibility of hidden variables (Bohm corrected himself in later papers), others like Messiah's explicitly stated that hidden variables cannot be ruled out, but that the Copenhagen interpretation will be used for simplicity since no experiments up to that time distinguished the interpretations. The state reduction rule has been generalized to continuous variables, e.g., by Davies and Lewis, and by Ozawa. So overall, the orthodox Copenhagen-style interpretation deserves its status, matching all experimental data to date, and yet recognizing its problems via the classical/quantum cut.
 
  • #96
Feeble Wonk said:
I have asked this question on PF in the past. From those discussions, I was under the impression that decoherence was believed to prevent the "Von Neumann Chain" or the similar "Wigner's Friend" dilemmas from being an issue.
Was I mistaken in that understanding?

Decoherence shows the most natural place to put the von Neumann cut is right after decoherence. When you do that, all these issues disappear.

Thanks
Bill
 
  • #97
James White said:
You could extend this chain of detectors detecting detectors for as long as you wanted and surely, each subsequent detector would simply pick up the superposition.

That isn't exactly what the issue is; it's not really an issue with a chain of detectors, because the cut can be placed before the detector, or even between detectors. However, there is no need to go into the details here; start a new thread on the von Neumann regress if you are interested.

Von Neumann's analysis is out of date.

It has been superseded by our new knowledge of decoherence. What worried von Neumann was that you can put the classical/quantum cut pretty much anywhere before the consciousness of a human observer, and since no place is different, he chose to put it at the consciousness of the human observer. However, such a view leads to all sorts of difficulties, even in von Neumann's time, but especially now in the computer age. For example, in the double slit you can write the output to computer memory, copy that memory endlessly, and then 100 years later view the output. Is that when the observation occurred? Did all the copies suddenly get observed then? If you promulgated such a view in a computer science class they would likely leave thinking you a bit loopy and try to contact some men in white.

It didn't catch on that much, but it did with one very influential mathematical physicist: Wigner. Von Neumann died young, but Wigner lived to see the more modern work on decoherence. When he found out about some of the early work of Zurek, he realized that von Neumann's analysis was no longer valid: there is a place that's different, and that is just after decoherence. If you place the cut there, all these types of issues disappear. Some do remain, the most notable being the so-called problem of outcomes, which colloquially is why we get any outcomes at all, or, technically, how an improper mixed state becomes a proper one. Wigner did a complete 180-degree about-face and advocated objective collapse models.

Thanks
Bill
 