Is the collapse of the particle wave function?

Summary
The discussion centers on the nature of wave function collapse in quantum mechanics, particularly in the context of the double slit experiment. Participants debate whether the collapse is solely due to observation or if it results from the physical interaction of measuring instruments with electrons. The Copenhagen interpretation suggests that time evolution is deterministic between measurements, while randomness occurs at the moment of measurement, but this raises questions about when measurements happen. Alternative interpretations like Bohmian mechanics and Many-Worlds offer different perspectives, with Bohmian mechanics asserting deterministic evolution without collapse, and Many-Worlds proposing that all outcomes occur. The conversation highlights ongoing debates about the interpretations of quantum mechanics and the implications for understanding wave-particle duality.
  • #61
kith said:
I see this as an argument for using the term "measurement" more restrictively but what are sequential measurements then? Textbook examples like multiple polarization filters, SG experiments, etc. don't produce intermediate outcomes.
atyy said:
The polarization filter is an interesting case I don't understand well. Regardless, the textbook treatment of a polarizer is indeed very similar to a measurement followed by collapse since the Born rule is applied to the quantum state for describing the action of a polarizer. My guess is that there is a deterministic unitary description of a polarizer, but it's not immediately obvious to me.

The polarizing beam splitter does in fact have a unitary description, given in https://vcq.quantum.at/fileadmin/Publications/2001-13.pdf (Fig. 1.9). In the Copenhagen interpretation, the collapse is only needed for calculating the joint distribution of sequential measurements P(A,B). If one doesn't calculate that, for example by doing the measurement and ignoring the definite outcomes, then in all cases I know of there is a unitary description that is sufficient, which is why I guessed that a polarizer has a unitary description. The most common example in which state reduction can be used but isn't necessary is decoherence followed by a measurement with definite outcomes whose results are ignored. That can be modeled simply by unitary evolution and a partial trace, yielding decoherence for observables on a subsystem. State reduction and decoherence are consistent as long as the experimenter cannot undo the decoherence, which is a condition for using state reduction: the appearance of a macroscopic outcome which is "definite" or "irreversible" to the observer. Incidentally, this is agreed on by Peres https://books.google.com/books?id=IjCNzbJYONIC&source=gbs_navlinks_s (p376), who, like Ballentine, is not very keen on collapse, except that Peres's book is nicely written.
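As an aside for readers following along, the two points above (a polarizing beam splitter acts unitarily, and ignoring outcomes can be modeled by a partial trace yielding decoherence) can be illustrated with a small numpy toy calculation. The basis ordering and the amplitudes below are my own illustrative choices, not taken from the linked paper:

```python
import numpy as np

# Basis ordering: |H,a>, |H,b>, |V,a>, |V,b>  (polarization ⊗ output path)
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z2 = np.zeros((2, 2))

# A polarizing beam splitter transmits H into path a and reflects V into
# path b: identity on the path qubit for H, a swap (X) for V.
# The block structure makes unitarity manifest.
U_pbs = np.block([[I2, Z2], [Z2, X]])
assert np.allclose(U_pbs @ U_pbs.conj().T, np.eye(4))  # unitary

# Photon with polarization alpha|H> + beta|V>, entering in path a:
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
psi_in = np.kron(np.array([alpha, beta]), np.array([1.0, 0.0]))
psi_out = U_pbs @ psi_in   # = alpha|H,a> + beta|V,b>, pol-path entangled

# Tracing out the path ("doing the measurement and ignoring the outcome")
# leaves a decohered, diagonal polarization state -- unitary evolution
# plus a partial trace, with no state reduction invoked.
rho = np.outer(psi_out, psi_out.conj()).reshape(2, 2, 2, 2)
rho_pol = np.einsum('ipjp->ij', rho)   # diag(0.3, 0.7), off-diagonals 0
```

The off-diagonal terms of `rho_pol` vanish because each polarization component is now tagged by a distinct path state, which is the decoherence mechanism referred to above.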
 
  • #62
atyy said:
As far as I know, this thought experiment has not been done. The closest I can think of are the Bell tests, which are sequential measurements in some frames of reference. But the important point is that in the orthodox Copenhagen interpretation, a measurement is something that produces a definite macroscopic outcome, also called a classical outcome.
Just to get you right: you are saying that collapse is not needed in single measurements but it is needed in sequential measurements. Yet, no such sequential measurement has been performed yet (with the possible exception of Bell tests)? So excluding these, you are saying that collapse isn't needed to explain any experiment which has been actually performed?
 
  • #63
kith said:
Just to get you right: you are saying that collapse is not needed in single measurements but it is needed in sequential measurements. Yet, no such sequential measurement has been performed yet (with the possible exception of Bell tests)? So excluding these, you are saying that collapse isn't needed to explain any experiment which has been actually performed?

Yes, collapse (or another postulate) is not needed in single measurements, but in sequential measurements to define the joint probability or conditional probability. The Bell test is such an experiment. But yes, other than that, I don't know of a specific experiment. If I were to look, I might try implementations of quantum computing (but there one often uses the principle of deferred measurement in which a final simultaneous measurement is computationally equivalent to sequential measurements).
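The deferred-measurement equivalence mentioned above is easy to check in a two-qubit toy model. This is a generic illustration of the principle (measure-then-feed-forward versus a coherent controlled gate with one final measurement), not any specific experiment from the thread:

```python
import numpy as np

# Control qubit in (|0>+|1>)/sqrt(2), target in |0>; basis |00>,|01>,|10>,|11>
psi = np.kron(np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, 0.0]))

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]   # computational projectors

# Sequential protocol: measure the control, apply state reduction, then
# apply X to the target classically conditioned on the outcome.
probs_seq = {}
for c in (0, 1):
    branch = np.kron(P[c], I2) @ psi
    p_c = branch @ branch                  # Born rule for the outcome
    branch = branch / np.sqrt(p_c)         # state reduction
    if c == 1:
        branch = np.kron(I2, X) @ branch   # classical feed-forward
    for t in (0, 1):
        probs_seq[(c, t)] = p_c * (branch @ np.kron(I2, P[t]) @ branch)

# Deferred protocol: replace measurement + feed-forward by a coherent CNOT
# and measure both qubits once at the end.
CNOT = np.block([[I2, np.zeros((2, 2))], [np.zeros((2, 2)), X]])
phi = CNOT @ psi
probs_def = {(c, t): phi[2 * c + t] ** 2 for c in (0, 1) for t in (0, 1)}

# Identical joint statistics: only (0,0) and (1,1) occur, each with p=1/2.
assert all(np.isclose(probs_seq[k], probs_def[k]) for k in probs_seq)
```

This is exactly why implementations of quantum computing often give no leverage on the question: the sequential and simultaneous descriptions are computationally equivalent.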

Anyway, apart from experiments, the question is whether Ballentine is right to reject collapse, and reach his Eq 9.30 as a derivation. Ballentine's Eq 9.30 has exactly the same status regarding experimental tests as collapse. So the difference is not whether it has been tested, but rather whether Eq 9.28 (which is collapse, when taken with Ballentine's stipulation that it is conditional on the previous measurement outcome) can be derived from unitary evolution and the Born rule alone, en route to Eq 9.30. I have no problem if he had said he prefers not to postulate collapse, but to directly postulate Eq 9.30, and then derive collapse as a result. (In principle, Ballentine is already wrong, since among his many fundamental errors he rejects collapse and accepts it via Eq 9.28, which is a contradiction.)

To be clear (and restate a point from post #21), Eq 9.28 can be defined using only unitary evolution and the Born rule as a conditional state for simultaneous measurements. Where the extra assumption enters is the assignment of a rule for time evolution to the conditional state so that the conditional state holds for simultaneous and sequential measurements.
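To make the simultaneous case concrete, here is a small numpy sketch: the conditional probability computed from the joint Born rule alone coincides with the Born probability evaluated in the projected-and-renormalized ("collapsed") state. The singlet state and the measurement angles are arbitrary choices of mine for illustration:

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2), basis |00>,|01>,|10>,|11>
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
I2 = np.eye(2)

def proj_up(theta):
    """Projector onto 'spin up' along an axis at angle theta (real qubit)."""
    v = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return np.outer(v, v)

PA, PB = proj_up(0.0), proj_up(np.pi / 3)

# Conditional probability from the joint Born rule alone:
p_ab = psi @ np.kron(PA, PB) @ psi     # joint P(A=up, B=up)
p_a = psi @ np.kron(PA, I2) @ psi      # marginal P(A=up)
cond_from_joint = p_ab / p_a

# The same number from the conditional ("collapsed") state, Eq 9.28 style:
psi_a = np.kron(PA, I2) @ psi
psi_a = psi_a / np.linalg.norm(psi_a)  # project and renormalize
cond_from_reduction = psi_a @ np.kron(I2, PB) @ psi_a

assert np.isclose(cond_from_joint, cond_from_reduction)
```

For simultaneous measurements the two calculations agree identically; the extra assumption described above is assigning this conditional state a time evolution so the same recipe applies to sequential measurements.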
 
  • #64
I'm not entirely sure whether it is necessary here, but another place where state reduction is used is the formalism of continuous observation, which is used in obtaining the classical limit. The continuous tracks in a cloud chamber are naturally treated by this formalism. The linked paper also treats imaging the resonance fluorescence from a single atom as a concrete example of a continuous position measurement.

http://arxiv.org/abs/quant-ph/0611067
A Straightforward Introduction to Continuous Quantum Measurement
Kurt Jacobs, Daniel A. Steck
 
  • #65
To add to the long list of texts (including Dirac, Landau & Lifshitz, Weinberg, Nielsen & Chuang, Holevo, Haag, Haroche & Raimond) that contradict Ballentine's claim of unitary evolution without state reduction, there is also Jonathan Dimock's "Quantum Mechanics and Quantum Field Theory: A Mathematical Primer".

https://books.google.com/books?id=Y4m8V7-83swC&source=gbs_navlinks_s (p41)
 
  • #66
atyy said:
Yes, collapse (or another postulate) is not needed in single measurements, but in sequential measurements to define the joint probability or conditional probability. The Bell test is such an experiment. But yes, other than that, I don't know of a specific experiment. If I were to look, I might try implementations of quantum computing (but there one often uses the principle of deferred measurement in which a final simultaneous measurement is computationally equivalent to sequential measurements).
This brings the dilemma with the collapse assumption to utmost clarity! You claim that in the Bell test (or the quantum eraser example) the measurement of one of the particles/photons would have an influence on the measurement of the other even if the measurement events (the "click of the detector") are space-like separated. This contradicts Einstein causality.

I prefer to claim that there is in fact no influence of the "first" measurement on the "second" (from the point of view of a reference frame where the space-like separated click events are not simultaneous), and ergo no collapse. This is consistent with the very foundations of local and microcausal quantum field theories. One assumes by construction that local operators representing observables (like the energy density of the em. field (vulgo "photons"), which is the observable measured with photo detectors!) always commute at space-like separation. This makes the whole theory consistent with relativistic causality, and it excludes an instantaneous collapse in any reference frame. If this were not the case, you would have really curious effects in your physical interpretation: by simply changing the reference frame you can flip the time order of the clicks, which would imply that the collapse is a completely different "event" depending on the reference frame. No such asymmetries (violations of Poincare invariance) have been observed so far. Only the (I still claim totally unnecessary) assumption of an instantaneous collapse implies such self-contradictory statements!
 
  • #67
vanhees71 said:
This brings the dilemma with the collapse assumption to utmost clarity! You claim that in the Bell test (or the quantum eraser example) the measurement of one of the particles/photons would have an influence on the measurement of the other even if the measurement events (the "click of the detector") are space-like separated. This contradicts Einstein causality.

That is the claim you make, not the orthodox interpretation. In the orthodox interpretation, the wave function is not necessarily real, but simply a way to calculate the outcomes of experiments. Collapse of the wave function is consistent with all experiments to date, including the requirement of special relativity that no classical information is transmitted faster than light. If by "Einstein causality" one means that there is no superluminal transfer of classical information, collapse is consistent with that requirement. If by "Einstein causality" one means that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events, then Bell's theorem excludes such an explanation.
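Both halves of this can be checked numerically for a singlet pair: the CHSH combination of correlations exceeds the classical bound of 2, while Bob's marginal statistics are independent of Alice's setting, so no classical information is transmitted. This is a standard textbook calculation, sketched here in numpy with the usual optimal angles:

```python
import numpy as np

psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)   # singlet state

def proj(theta, s):
    """Projector for outcome s = +1/-1 along angle theta (real qubit)."""
    v = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    P = np.outer(v, v)
    return P if s == +1 else np.eye(2) - P

def p_joint(a, b, s, t):
    """Joint probability of outcomes (s, t) for settings (a, b)."""
    return psi @ np.kron(proj(a, s), proj(b, t)) @ psi

def E(a, b):
    """Correlation <A B>; for the singlet this equals -cos(a - b)."""
    return sum(s * t * p_joint(a, b, s, t) for s in (1, -1) for t in (1, -1))

# CHSH with the standard angles: |S| = 2*sqrt(2) > 2
a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)

# No-signalling: Bob's marginal is the same whatever Alice chooses.
marg_0 = p_joint(a0, b0, +1, +1) + p_joint(a0, b0, -1, +1)
marg_1 = p_joint(a1, b0, +1, +1) + p_joint(a1, b0, -1, +1)
assert np.isclose(marg_0, marg_1)      # 1/2 either way
```

The violation of the bound 2 is what Bell's theorem is about; the equality of the marginals is why collapse is compatible with the no-superluminal-signalling requirement.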
 
  • #68
atyy said:
If by "Einstein causality" one means that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events, then Bell's theorem excludes such an explanation.
Aren't those quantum correlations naturally explained by the commutation relations of quantum observables, just like correlations obeying Bell's inequalities would be explained by the commutation relations of classical observables, without any need to resort to local-vs-nonlocal debates, if the world happened to be classical? The experiments that violate the classical inequalities then simply show the world is not classical.
 
  • #69
TrickyDicky said:
Aren't those quantum correlations naturally explained by the commutation relations of quantum observables, just like correlations obeying Bell's inequalities would be explained by the commutation relations of classical observables, without any need to resort to local-vs-nonlocal debates, if the world happened to be classical? The experiments that violate the classical inequalities then simply show the world is not classical.

In using the quantum formalism as an explanation, one has to use the observables and the quantum state. When used as part of an explanation for nonlocal correlations, the quantum state itself is nonlocal, because a local classical preparation procedure at a particular time is assigned a quantum state in Hilbert space, ie. the quantum state is associated with the entire spacelike surface of simultaneity.
 
  • #70
atyy said:
In using the quantum formalism as an explanation, one has to use the observables and the quantum state.
With one caveat: the observables cannot be dispensed with in empirical (physical) theories; they are necessarily part of any such theory. The quantum states are abstractions that need not be part of reality, but can be just operational tools.
When used as part of an explanation for nonlocal correlations, the quantum state itself is nonlocal, because a local classical preparation procedure is assigned a quantum state in Hilbert space, ie. the quantum state is associated with the entire spacelike surface of simultaneity.
Besides applying the above caveat, which allows one not to take that association seriously until one is sure the state is not just epistemic, I would insist on always switching the term "nonlocal" for the less esoteric "nonclassical", since it is admitted that the concept of collapse need not be physical, and nonlocality in QM simply refutes, through confirmed predictions, certain intuitive classical notions.
 
  • #71
TrickyDicky said:
With one caveat: the observables cannot be dispensed with in empirical (physical) theories; they are necessarily part of any such theory. The quantum states are abstractions that need not be part of reality, but can be just operational tools.

Besides applying the above caveat, which allows one not to take that association seriously until one is sure the state is not just epistemic, I would insist on always switching the term "nonlocal" for the less esoteric "nonclassical", since it is admitted that the concept of collapse need not be physical, and nonlocality in QM simply refutes, through confirmed predictions, certain intuitive classical notions.

Yes, the state can be considered just a tool - but in which case it is not an "explanation", since an "explanation" is by definition real. So if the quantum formalism is to be an explanation, then the state is considered real by definition. So either the quantum formalism does not provide an explanation, or it provides a nonlocal explanation.
 
  • #72
atyy said:
Yes, the state can be considered just a tool - but in which case it is not an "explanation", since an "explanation" is by definition real. So if the quantum formalism is to be an explanation, then the state is considered real by definition. So either the quantum formalism does not provide an explanation, or it provides a nonlocal explanation.
Yes, it provides a nonclassical explanation that doesn't need to mention states (although it affects their measure): the quantum commutation relations.
 
  • #73
atyy said:
That is the claim you make, not the orthodox interpretation. In the orthodox interpretation, the wave function is not necessarily real, but simply a way to calculate the outcomes of experiments. Collapse of the wave function is consistent with all experiments to date, including the requirement of special relativity that no classical information is transmitted faster than light. If by "Einstein causality" one means that there is no superluminal transfer of classical information, collapse is consistent with that requirement. If by "Einstein causality" one means that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events, then Bell's theorem excludes such an explanation.
No! Relativistic QFT, including the possibility of entanglement over long spacelike distances, does not contradict Einstein causality. It's by construction that interactions are local. The correlations are due to the preparation in an entangled state, and this preparation event is timelike separated from, and thus objectively before, the measurements demonstrating the violation of Bell's inequality. There's no causal influence between spacelike separated measurement events in this "minimal statistical interpretation" of the quantum state. That's a very orthodox interpretation, without any need for instantaneous collapses or anything like it.

You also don't say that something collapses when the weekly Lotto numbers are drawn and become definite values, of which I previously had only probabilistic knowledge.
 
  • #74
vanhees71 said:
No! Relativistic QFT, including the possibility of entanglement over long spacelike distances, does not contradict Einstein causality. It's by construction that interactions are local. The correlations are due to the preparation in an entangled state, and this preparation event is timelike separated from, and thus objectively before, the measurements demonstrating the violation of Bell's inequality. There's no causal influence between spacelike separated measurement events in this "minimal statistical interpretation" of the quantum state. That's a very orthodox interpretation, without any need for instantaneous collapses or anything like it.

What do you mean by "Einstein causality"? Do you mean (A) no faster than light transmission of classical information, or (B) that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events?

vanhees71 said:
You also don't say that something collapses when the weekly Lotto numbers are drawn and become definite values, of which I previously had only probabilistic knowledge.

Are you just objecting to terminology? Ballentine's Eq 9.28 is "collapse" or "state reduction" or "a change from an improper to proper mixture after decoherence" (which is bhobba's preferred way to state the collapse postulate).

For example, Matteo Paris calls Postulate II.4 on p9 of http://arxiv.org/abs/1110.6815 "state reduction", and that is essentially Ballentine's Eq 9.28.

It is important to note that although collapse or state reduction is related to the updating of one's knowledge after a Lotto drawing, it is not the same thing, which is the point bhobba makes about the need for a postulate about the change from an improper to a proper mixture after decoherence.
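One way to see that state reduction differs from a Lotto-style knowledge update: performing a measurement and then ignoring its outcome still changes the statistics of a subsequent incompatible measurement, which no update of classical ignorance does. A minimal qubit sketch (my own illustrative example) in numpy:

```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)            # |+x> state
P_plus_x = np.outer(plus, plus)                     # projector onto |+x>
Pz = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]     # Z-basis projectors

# (i) No intermediate measurement: a final X measurement gives +1 for sure.
p_direct = plus @ P_plus_x @ plus                    # = 1.0

# (ii) Z is measured but the outcome is ignored: apply state reduction on
# each branch and average over outcomes (equivalently, decohere in Z).
rho = sum(P @ np.outer(plus, plus) @ P for P in Pz)
p_after = np.trace(rho @ P_plus_x)                   # = 0.5
```

Merely learning a pre-existing Lotto number leaves all other statistics untouched; here the unread intermediate measurement halves the X-outcome probability, which is the physical content behind the improper-to-proper-mixture postulate.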
 
  • #75
It is probably relevant here to be clear about the differences between relativistic and non-relativistic QM. It doesn't seem fair to demand that NRQM be Lorentz invariant when it is explicitly Galilean invariant. Therefore all the discussions about causality, nonlocality, the weirdness of entangled correlations and FTL, and atyy's use of Bell tests to endorse instantaneous collapse, bring in relativistic concepts like light cones or spacelike surfaces of simultaneity; this sounds like reproaching nonrelativistic QM for being nonrelativistic, and such arguments are out of place.
Yes, the commutation relations between momentum and position operators are not relativistic, but the important thing here, and where their predictive power in quantum microscopic experiments resides, is that they are also nonclassical by their quantum nature.

It is obvious something has to change in the commutation relations when going to QFT, and that is basically that position is no longer an observable operator; that is enough to avoid not only any FTL causal influences but also the need for any collapse problem or conundrum when claiming to be relativistic.
Collapse in the NRQM case is just a redundant reminder that we are using a nonrelativistic approximation, so it is not something that needs to be "solved".
 
  • #76
TrickyDicky said:
atyy's use of Bell tests to try to endorse instantaneous collapse

It is important to note that the key point is not that there is any FTL communication. The point is not FTL, but sequential measurements, and whether unitarity and the Born rule are enough to reach Ballentine's Eq 9.30. The Bell test is simply the best example I know of a sequential measurement.

I would like to point out that bhobba's ensemble interpretation, which seems correct to me, also does not agree that unitarity and the Born rule are enough, which is why he postulates the change from an improper to a proper mixture, which is equivalent to collapse.
 
  • #77
atyy said:
(B) that the nonlocal correlations that violate the Bell inequalities at spacelike separation can be explained by events only in the past light cone of local events?
Again, in QFT both position and time are not observables, just parameters and as such the notions of "spacelike separation" or "past light cone" can't be used to reject or put forth an explanation of nonclassical correlations, these are orthogonal to such concepts.
 
  • #78
atyy said:
It is important to note that the key point is not that there is any FTL communication. The point is not FTL, but sequential measurements, and whether unitarity and the Born rule are enough to reach Ballentine's Eq 9.30. The Bell test is simply the best example I know of a sequential measurement.

I would like to point out that bhobba's ensemble interpretation, which seems correct to me, also does not agree that unitarity and the Born rule are enough, which is why he postulates the change from an improper to a proper mixture, which is equivalent to collapse.
It is well known that unitarity + Born are not enough, but to me collapse is not where we have to look for answers; the issue is the preferred basis. One implicitly uses a preferred temporal frame to filter or reduce sequentially, either through measurement or preparation of states.
 
  • #79
TrickyDicky said:
It is well known that unitarity + Born are not enough, but to me collapse is not where we have to look for answers; the issue is the preferred basis. One implicitly uses a preferred temporal frame to filter or reduce sequentially, either through measurement or preparation of states.

It is more than the preferred basis issue. For simplicity in the present discussion, if one assumes decoherence is perfect, the preferred basis can be specified. However, that still does not explain why there is a proper mixture, which is why bhobba also postulates that there is a change from an improper mixture to a proper mixture.
 
  • #80
Of course, I don't blame non-relativistic QT for not being relativistically covariant. It should be Galilei covariant, and it's of course not Poincare covariant. Why should it be? Without relativity you can invoke a collapse with far fewer headaches than in relativistic QT. There's no problem with action at a distance in non-relativistic physics, as, e.g., in Newton's law of gravity. The only problem is that nature doesn't realize the Galilean space-time structure but the relativistic one.

Einstein causality means that there's no causal connection between space-like separated events. The correlations in entangled properties of far-distant parts of a quantum system are there from the beginning of their preparation and not caused by measuring an observable locally on a part of the system. This is so in local relativistic QFTs, which by construction do not lead to faster-than-light causal signals. That restriction is also compatible with the properties of the S-matrix to make physical sense (unitarity and Poincare covariance).
 
  • #81
atyy said:
It is more than the preferred basis issue. For simplicity in the present discussion, if one assumes decoherence is perfect, the preferred basis can be specified. However, that still does not explain why there is a proper mixture, which is why bhobba also postulates that there is a change from an improper mixture to a proper mixture.
It doesn't need to explain it, because on the epistemic notion of state one must admit an ignorance (statistical) interpretation for all mixed states, so one can avoid the "improper vs. proper" mixed-state distinction. In other words, one doesn't need to explain it any more than one has to explain that standard QM is nonrelativistic (and in that sense incomplete, as long as we consider a relativistic world the more accurate description of the physics); therefore states cannot be any more than an epistemic (incomplete) approximation tool.
 
  • #82
TrickyDicky said:
It doesn't need to explain it, because on the epistemic notion of state one must admit an ignorance (statistical) interpretation for all mixed states, so one can avoid the "improper vs. proper" mixed-state distinction. In other words, one doesn't need to explain it any more than one has to explain that standard QM is nonrelativistic (and in that sense incomplete, as long as we consider a relativistic world the more accurate description of the physics); therefore states cannot be any more than an epistemic (incomplete) approximation tool.

Perhaps "explain" is not the right word. Let me explain what I mean again. It is not enough to state that decoherence defines a preferred basis - one must say what the significance of the preferred basis is. The significance of the preferred basis is that it defines two sub-ensembles and a link between them: sub-ensembles of measurement outcomes and sub-ensembles of the resulting quantum state. This is beyond unitary evolution and the Born rule because these sub-ensembles were not defined within the theory prior to measurement or to perfect decoherence.
 
  • #83
vanhees71 said:
Of course, I don't blame non-relativistic QT for not being relativistically covariant. It should be Galilei covariant, and it's of course not Poincare covariant. Why should it be? Without relativity you can invoke a collapse with far fewer headaches than in relativistic QT. There's no problem with action at a distance in non-relativistic physics, as, e.g., in Newton's law of gravity. The only problem is that nature doesn't realize the Galilean space-time structure but the relativistic one.

Again, one has to stress that collapse causes no problems with relativity, because collapse does not permit faster-than-light communication of classical information. Collapse is part of the structure of relativistic quantum field theory. In part, it can be argued that if one adds relativity to the postulates of quantum theory, collapse need not be postulated, but can be derived for spacelike separated observations. The only problem with collapse is whether a Schroedinger picture exists in relativistic quantum field theory, but at least at the non-rigorous level at which the standard model is formulated, the Schroedinger picture is usually assumed to exist.
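The frame-order independence underlying this point can be illustrated with commuting local projectors: because Alice's and Bob's operators act on different tensor factors, applying the two state reductions in either order gives the same joint probability, so there is no frame-dependent fact about "which collapse happened first". A generic random-state sketch (my own construction) in numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random (complex) two-qubit state:
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi = psi / np.linalg.norm(psi)

def local_proj(v, side):
    """Rank-1 projector acting on one side of the pair only."""
    P = np.outer(v, v.conj())
    return np.kron(P, np.eye(2)) if side == 'A' else np.kron(np.eye(2), P)

a = rng.normal(size=2); a = a / np.linalg.norm(a)
b = rng.normal(size=2); b = b / np.linalg.norm(b)
PA, PB = local_proj(a, 'A'), local_proj(b, 'B')

# Spacelike separation <-> commuting local observables:
assert np.allclose(PA @ PB, PB @ PA)

# So the order of the two state reductions is irrelevant:
p_A_first = np.linalg.norm(PB @ (PA @ psi)) ** 2
p_B_first = np.linalg.norm(PA @ (PB @ psi)) ** 2
assert np.isclose(p_A_first, p_B_first)
```

This is the operational content of microcausality in this discussion: observers in frames that disagree about the time order of the clicks still compute identical joint statistics.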

vanhees71 said:
Einstein causality means that there's no causal connection between space-like separated events. The correlations in entangled properties of far-distant parts of a quantum system are there from the beginning of their preparation and not caused by measuring an observable locally on a part of the system. This is so in local relativistic QFTs, which by construction do not lead to faster-than-light causal signals. That restriction is also compatible with the properties of the S-matrix to make physical sense (unitarity and Poincare covariance).

By "The correlations in entangled properties of far-distant parts of a quantum system are there from the beginning of their preparation and not caused by measuring an observable locally on a part of the system." do you mean that the only cause of the measurement outcome on anyone side is the preparation procedure and the measurement setting on that side, because both are in the past light cone of the measurement outcome?
 
  • #84
atyy said:
It is not enough to state that decoherence defines a preferred basis - one must say what the significance of the preferred basis is. The significance of the preferred basis is that it defines two sub-ensembles and a link between them: sub-ensembles of measurement outcomes and sub-ensembles of the resulting quantum state. This is beyond unitary evolution and the Born rule because these sub-ensembles were not defined within the theory prior to measurement or to perfect decoherence.
I have not stated anything about decoherence, and I agree that decoherence is not enough to solve the preferred basis conundrum, and therefore also agree that this goes beyond unitary evolution + Born rule. But I'm sure you can see that a purely statistical ensemble interpretation of QM avoids/ignores the preferred basis problem, because it takes Born seriously: all there is behind the quantum states are probabilities. You have admitted in previous posts that in two-outcome cases like the polarizer or Stern-Gerlach, unitarity is enough. There are clear reasons to think that in the rest of the cases, especially the continuous-eigenvalue case, there is not a collapse in the way it is usually postulated in Copenhagen, due to problems with the very concept of a vector state.
 
  • #85
Then please don't call it collapse! Collapse implies instantaneous, i.e., faster-than-light signal propagation (whether classical or quantum, I can't say, because I don't know what's meant by a classical signal)! What's meant by "collapse" is simply the acknowledgment of the outcome of a measurement, caused by local interactions of the quantum system with the measurement apparatus. "Acknowledgment" doesn't imply a conscious observer, but can be just the storage of the outcome of the measurement on a photographic plate, a digital storage device, or whatever.

The correlations described by entanglement are there because of the preparation procedure of the system in an entangled state. All measurement events are in the future light cone of the "preparation event". So there is no violation of Einstein causality (by construction of the theory!) as long as you describe the process within a local microcausal relativistic QFT.

Also, where is there a problem with a "preferred basis"? Isn't it simply the measurement device which decides what I measure? A state by itself has no preferred basis. You can express it in any basis you like, and the physical observable outcomes described by it are independent of this choice of basis (you can even formulate everything in terms of abstract Hilbert-space objects without reference to any particular basis). When calculating the probabilities according to the Born rule, you have of course to define what's measured and then take the corresponding eigenvectors of the measured observable. In some sense you can say that this choice of which observable you measure is a choice of preferred basis, but why say it in such a complicated way?

E.g., in the "teleportation experiment", what's measured are the single-photon energy-density distributions at Alice's and Bob's place, which can be interpreted as the detection probability position distributions around these places. If you take A's and B's measurement protocols with sufficiently accurate detection-time "stamps" you can also say that you measure the joint two-photon detection probability position distributions, and this reveals the correlations described by entanglement. There's no need for some instantaneous collapse nor the assumption of a "preferred basis", except the choice of the basis made in detecting photons (with a certain polarization at the two places in this case).
 
  • #86
vanhees71 said:
Also, where is there a problem with a "preferred basis"? Isn't it simply the measurement device which decides what I measure? A state by itself has no preferred basis. You can express it in any basis you like, and the physical observable outcomes described by it are independent of this choice of basis (you can even formulate everything in terms of abstract Hilbert-space objects without reference to any particular basis). When calculating the probabilities according to the Born rule, you have of course to define what's measured and then take the corresponding eigenvectors of the measured observable. In some sense you can say that this choice of which observable you measure is a choice of preferred basis, but why say it in such a complicated way?
Whether one sees it as a problem is quite subjective; clearly for you it is not. But it's simply a defining feature of quantum theory, being based on the introduction of a minimum quantum scale (h). No one can deny that it has fostered an industry of interpretational debates, because when mixed with a basis-independent mathematical formalism abstracted from measurement, it gives room to all sorts of philosophical ambiguities with little practical interest.
E.g., in the "teleportation experiment", what's measured are the single-photon energy-density distributions at Alice's and Bob's place, which can be interpreted as the detection probability position distributions around these places. If you take A's and B's measurement protocols with sufficiently accurate detection-time "stamps" you can also say that you measure the joint two-photon detection probability position distributions, and this reveals the correlations described by entanglement. There's no need for some instantaneous collapse
Right.
nor for the assumption of a "preferred basis", beyond the choice of basis made in detecting the photons (here, with a certain polarization at the two places).
This concedes, in the same sentence, exactly what you are rejecting.
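The point both posts circle around, that a state by itself has no preferred basis, while the Born rule probabilities are fixed by the eigenbasis of whatever observable the device measures, can be sketched numerically. This is a minimal illustration, not from the thread; the state and angle are arbitrary choices.

```python
import numpy as np

# An arbitrary single-qubit state |psi> = cos(t)|0> + sin(t)|1> (t is illustrative)
t = np.pi / 8
psi = np.array([np.cos(t), np.sin(t)])

# Born rule: P(a) = |<a|psi>|^2 for the eigenvectors |a> of the measured observable.

# Measuring S_z: eigenbasis {|0>, |1>}
p_z = np.abs(psi) ** 2

# Measuring S_x: eigenbasis {|+>, |->} = {(|0>+|1>)/sqrt(2), (|0>-|1>)/sqrt(2)}
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)
p_x = np.array([np.abs(plus @ psi) ** 2, np.abs(minus @ psi) ** 2])

# The state itself is basis-free: expanded in either basis, the
# probabilities are different, but each set is normalized to 1.
print(p_z, p_z.sum())
print(p_x, p_x.sum())
```

The "preferred basis" here is not a property of `psi` at all; it enters only through which eigenbasis the Born rule is evaluated in, which is exactly the point about the measurement device deciding what is measured.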
 
  • #87
TrickyDicky said:
Whether one sees it as a problem is quite subjective; clearly for you it is not. But it is simply a defining feature of quantum theory, which is based on the introduction of a minimal quantum of action (h). No one can deny that it has fostered an industry of interpretational debates, because when combined with a basis-independent mathematical formalism abstracted away from measurement, it leaves room for all sorts of philosophical ambiguities of little practical interest.
Right.
Well, I don't know whether I should take this as a "bug or a feature" of the idea of a "problem of a preferred basis". I tend toward the former conclusion ;-).
 
  • #88
vanhees71 said:
Well, I don't know whether I should take this as a "bug or a feature" of the idea of a "problem of a preferred basis". I tend toward the former conclusion ;-).
So do I.
 
  • #89
vanhees71 said:
Then please don't call it collapse! Collapse implies instantaneous, i.e., faster-than-light signal propagation (whether classical or quantum, I can't say, because I don't know what's meant by a classical signal)! What's meant by "collapse" is simply the acknowledgment of the outcome of a measurement, caused by local interactions of the quantum system with the measurement apparatus. "Acknowledgment" doesn't imply a conscious observer; it can be just the storage of the measurement outcome on a photographic plate, a digital storage device, or whatever.

Well, then it is just a matter of terminology. As I have said many times in this thread and elsewhere, "collapse" is just another word for "state reduction", "conditional state", "a posteriori state", or "quantum jump", and is essentially Ballentine's Eq. 9.28. The understanding you are arguing against is not part of the orthodox Copenhagen-style interpretation given in Landau and Lifshitz or in Cohen-Tannoudji, Diu and Laloë. In the orthodox interpretation the quantum state or wave function is not necessarily real, so the interpretation is silent on whether there is any "real" faster-than-light "influence".
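The role of state reduction as a conditional ("a posteriori") state, used only to compute the joint distribution P(A, B) of two sequential projective measurements, can be sketched as follows. This is an illustrative toy calculation, not code from any of the cited texts; the state and the two measurement bases are arbitrary choices.

```python
import numpy as np

psi = np.array([0.6, 0.8])  # an arbitrary normalized qubit state

# First measurement in the z basis, second in the x basis (illustrative choices)
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

joint = np.zeros((2, 2))
for a, ket_a in enumerate(z_basis):
    p_a = abs(ket_a @ psi) ** 2      # Born rule for the first outcome
    psi_a = ket_a                    # reduced ("collapsed") conditional state
    for b, ket_b in enumerate(x_basis):
        p_b_given_a = abs(ket_b @ psi_a) ** 2
        joint[a, b] = p_a * p_b_given_a   # P(A=a, B=b)

print(joint)        # each row sums to the first-measurement marginal
print(joint.sum())  # total probability is 1
```

The reduction step (`psi_a = ket_a`) is the only place "collapse" enters: it supplies the conditional state for the second Born-rule application. If the first outcome is ignored, summing over `a` recovers the statistics a purely unitary-plus-partial-trace description would give, consistent with the point made in the thread.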

However, the minimal interpretation does have a classical/quantum cut, so there are classical signals. Together with the incorrect criticism of state reduction, the lack of an explicit statement of the classical/quantum cut is another fundamental misconception in Ballentine. Without a stated cut, Ballentine lacks a proper interpretation of the term "measurement", which requires a classical outcome. This shows when he claims that the orthodox interpretation is wrong and suggests that it fails the experimental test of the spin-recombination experiment. The same misconceptions about the classical/quantum cut, the physical meaning of "measurement", and the collapse postulate lead Ballentine further to dispute a standard explanation of the quantum Zeno effect, and to claim that the orthodox interpretation is inconsistent with data from continuous observations.

There are interpretations that do not require a classical/quantum cut, such as the Bohmian interpretation and the Many-Worlds approach. However, Ballentine does not explicitly state hidden variables, and the only hidden-variable theory associated with his work is the erroneous one in his 1970 review. If Ballentine is implicitly assuming Many-Worlds, his point of view is possibly defensible.
 
  • #90
Nick V said:
Is the collapse of the wave function of the electron in the double slit experiment based purely on the act of observation? Or could it be that the way the instrument used to measure the electron caused it to collapse by how it physically interacted with the electron? Keep in mind the delayed choice experiment.

You might not realize it, but I believe the question you are asking is actually a very important one. The lack of particularly helpful responses is largely because "we don't know". That is to say, the typical quantum physicist doesn't know, though they'd never admit it.

Your question pertains to a well-known problem with what is properly called the "von Neumann chain". As you correctly intuit, there is some ambiguity in the Copenhagen interpretation of QM with regard to what exactly qualifies as an "observation". Does the detector actually "observe" the particle, or is the detector just another part of the system?

Taking the double-slit experiment as an example: when the electron or photon leaves the gun, it ceases to be localised precisely in spacetime and becomes describable only in terms of probabilities. You can think of it as existing in a sort of suspended state; it takes on a superposition of possible states. To simplify the problem, we assume there are only two possible outcomes: either the electron goes through the left slit or through the right one. So for the period between firing and detection, the particle exists as a superposition of both states. It is in limbo, so to speak.

Now, suppose you set up a detector of some kind on the other side of the screen to check which slit the particle went through. But now suppose you leave the room, lock the door, and let it all happen. The electron will enter the superposition, travel (as a wave) through the slits, the two waves will interfere, and the result will ultimately reach the detector. But then what happens? Has the system "collapsed"? Or is the detector now part of the system, in limbo just as the electron was? The laws of quantum mechanics do not pick and choose: everything in the universe is fundamentally a quantum system, and so, by this logic, the detector does pick up the quantum dichotomy. The detector becomes part of the system and is not collapsed; it is now a superposition of two states, one in which the electron was detected and one in which it wasn't.
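The operational difference between the two situations described above can be sketched numerically: without a which-path record, the two slit amplitudes add and interfere; once a detector becomes entangled with the path, the cross term vanishes and the probabilities, not the amplitudes, add. This is a toy model with illustrative parameters, not a calculation from the thread.

```python
import numpy as np

x = np.linspace(-10, 10, 201)   # screen positions (arbitrary units)
k, d = 1.0, 2.0                 # wavenumber and effective path-difference scale

amp_L = np.exp(1j * k * d * x) / np.sqrt(2)    # amplitude via the left slit
amp_R = np.exp(-1j * k * d * x) / np.sqrt(2)   # amplitude via the right slit

# No which-path detector: add amplitudes, then square -> interference fringes
p_coherent = np.abs(amp_L + amp_R) ** 2        # equals 1 + cos(2*k*d*x)

# Which-path detector entangled with the path: the detector's two pointer
# states are orthogonal, so the cross term drops out (decoherence) and
# the single-slit probabilities simply add -> no fringes
p_decohered = np.abs(amp_L) ** 2 + np.abs(amp_R) ** 2   # equals 1 everywhere
```

Note what this sketch does and does not show: it reproduces the loss of interference once the detector is entangled with the path, but it is silent on whether or when any "collapse" has occurred, which is exactly the question the von Neumann chain raises.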

Now, think about this. You could extend this chain of detectors detecting detectors for as long as you wanted, and surely each subsequent detector would simply pick up the superposition. This chain of interactions is called the von Neumann chain, and the million-dollar question is: where does it end? We know it has to end somewhere, because what happens when you walk into the room and look at the detector? Do you pick up the dichotomy as well? Maybe you do; there are interpretations suggesting that you split in two, as it were, but are only capable of experiencing one outcome per universe. Whatever the case, something special seems to happen whenever a "sentient being" observes the system. The chain is clearly broken; on this view, there is something fundamentally important about us. Perhaps this is what it means to be "aware".

But this is not the place for philosophy, I merely bring this idea to your knowledge as I believe this may be exactly what you are looking for.

For further reading on the subject, I suggest "The Self-Aware Universe" by Amit Goswami.
 
