How measurement in the double-slit experiment affects the result

In summary: More advanced variations on the double-slit experiment make clear that retaining or destroying "which-way" information on one half of an entangled pair affects the result obtained for the other half: the detection device shows either wave-like or particle-like behavior. Including the measuring apparatus in the analysis is essential for obtaining accurate results.
  • #36
atyy said:
However, in such a case, one cannot call z the sole cause of B, because A is also potentially a cause or correlated with a cause that is independent of z.
Right, that is what post-processing means.

Also, Einstein Causality usually means classical relativistic causality so it includes FP.
Nope, Einstein Causality only implies FP if there is no post-processing. Even in classical physics, FP is wrong when we have post-processing. See for example the very simple Bernoulli urn example in Jaynes' paper. FP fails woefully there, even though it is obviously Einstein Causal. Einstein Causality is fully consistent with a rejection of FP while post-processing, so there is no conflict and no need to modify the definition of Einstein Causality.
 
  • #37
billschnieder said:
Nope, Einstein Causality only implies FP if there is no post-processing. Even in classical physics, FP is wrong when we have post-processing. See for example the very simple Bernoulli urn example in Jaynes' paper. FP fails woefully there, even though it is obviously Einstein Causal. Einstein Causality is fully consistent with a rejection of FP while post-processing, so there is no conflict and no need to modify the definition of Einstein Causality.

Yes, post-processing is needed for A or B to find the Bell inequality violation, and the time at which the post-processing is carried out means that no superluminal communication of classical information is needed for A or B to discover the violation. The question is whether A or B add an inference step to the post-processing and conclude that the violation occurred at spacelike separation. Usually "reality" is accepted, and the violation is inferred to have occurred at spacelike separation. By Einstein Causality, one usually means that this inference step is added to the post-processing, so that Einstein Causality is ruled out by a Bell inequality violation. If the inference is not made, and only the post-processing is performed, then one can have violation of the Bell inequalities and a modified form of Einstein Causality, because although there is a violation of the Bell inequalities, the violation did not occur at spacelike separation. This is uncontroversial and agreed on by Bell himself, and indeed by EPR themselves. However, because EPR favoured doing the post-processing and adding the inference about spacelike separation, the usual definition of Einstein Causality includes the additional inference. Again, if all you are saying is that the post-processing is performed, but not the inference, that is not controversial, except for what one might mean by Einstein Causality.
 
  • #38
But the inequalities assume FP, i.e., no post-processing. You cannot then use their violation in the presence of post-processing to reject or modify Einstein causality, which is fully consistent with FP when no post-processing is done, and also fully consistent with the rejection of FP when post-processing is done. For that, you need a violation in the absence of post-processing. All experiments to date have used post-processing.

Remember that EPR was always about the "real physical situation", which is unaffected by "inference".
 
  • #39
billschnieder said:
But the inequalities assume FP, i.e., no post-processing. You cannot then use their violation in the presence of post-processing to reject or modify Einstein causality, which is fully consistent with FP when no post-processing is done, and also fully consistent with the rejection of FP when post-processing is done. For that, you need a violation in the absence of post-processing. All experiments to date have used post-processing.

Remember that EPR was always about the "real physical situation", which is unaffected by "inference".

FP allows post-processing, but assumes that the post-processing does not prevent you from making the inference about the real physical situation, in particular that the correlations did occur at spacelike separation.

Actual experiments are a different matter, and you can certainly find loopholes. However, what is being discussed is the prediction of quantum mechanics that if the measurement choices and outcomes are real, then a violation of a Bell inequality at spacelike separation is the real physical situation.
 
  • #40
atyy said:
FP allows post-processing, but assumes that the post-processing does not prevent you from making the inference about the real physical situation
I have to disagree. In classical probability, the chain rule P(A,B|z) = P(A|z)P(B|Az) is always accurate, whereas the factorization P(A,B|z) = P(A|z)P(B|z) is accurate if and only if P(B|z) = P(B|Az) (or equivalently P(A|z) = P(A|Bz)). Therefore if you obtain P(B|z) =/= P(B|Az), that tells you that P(A,B|z) =/= P(A|z)P(B|z), and therefore FP must be rejected. It tells you nothing about Einstein Causality.

The type of post-processing we are talking about here is specifically the type in which the results of one side are used to filter the results of the other. If the post-processing did not change the results, post-processing would not be necessary; in other words, if P(A|z) = P(A|Bz), then why post-process in the first place? We know from the delayed-choice experiments that post-processing does indeed change the results, i.e., P(A|z) =/= P(A|Bz), and FP must be rejected, with zero impact on Einstein Causality. This is very difficult to accept if you believe P(B|z) =/= P(B|Az) necessarily implies physical influence; that is why Jaynes' Bernoulli urn example is so important. It removes all doubt that such a view is wrong.
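A quick numerical check of these identities (the joint distribution below is invented purely for illustration):

```python
# Toy check: the chain rule P(A,B|z) = P(A|z) P(B|Az) always holds,
# while the factorization P(A,B|z) = P(A|z) P(B|z) holds only when
# P(B|Az) = P(B|z). The joint distribution below is arbitrary.

# Joint distribution P(A,B|z) for one fixed preparation z, with A, B in {0,1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

pA = {a: sum(p for (x, y), p in joint.items() if x == a) for a in (0, 1)}
pB = {b: sum(p for (x, y), p in joint.items() if y == b) for b in (0, 1)}

for (a, b), p in sorted(joint.items()):
    pB_given_Az = p / pA[a]            # P(B|Az)
    chain = pA[a] * pB_given_Az        # equals the joint, always
    factored = pA[a] * pB[b]           # FP; fails for this distribution
    print(f"A={a} B={b}: joint={p:.2f} chain={chain:.2f} factored={factored:.2f} "
          f"P(B|Az)={pB_given_Az:.2f} P(B|z)={pB[b]:.2f}")
```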

However, what is being discussed is the prediction of quantum mechanics that if the measurement choices and outcomes are real, then a violation of a Bell inequality at spacelike separation is the real physical situation.
Yes, QM simply predicts certain correlations, which have been confirmed experimentally, but all the experiments involve post-processing. Bell used FP (without post-processing) to derive inequalities, which were then violated by experiments involving post-processing. The real physical situation produced the clicks on the detectors; then you added post-processing to the results (after the fact, long after the real physical situation had done its thing) and obtained the correlations. If the post-processing is irrelevant to the results, then you can simply not do it and show that the violation is still there. You do not need a new experiment or new data; why not use the existing data without post-processing? Unless the post-processing is such an important component of the definition of the problem that it is impossible to avoid?
 
  • #41
vanhees71 said:
Ok, then give the description of the quantum-erasure experiment within classical electromagnetism. I don't think that this is possible.
Give me some time to write the paper, then I'll return to this. It sounds like it's worth that effort.
 
  • #42
Ken G said:
Give me some time to write the paper, then I'll return to this. It sounds like it's worth that effort.

But didn't you say that the usual quantum eraser does not violate a Bell inequality? If so, then it's not that surprising if it has a local classical explanation?

Maybe you can try this version, where they violate a Bell inequality (but not sure about spacelike separation): http://arxiv.org/abs/1205.4926 :D Bonus: they start off with wave-particle duality o0)

That paper references http://arxiv.org/abs/1206.4348 which also has a Bell inequality violation.
 
  • #43
atyy said:
But didn't you say that the usual quantum eraser does not violate a Bell inequality? If so, then it's not that surprising if it has a local classical explanation?
I don't remember saying anything about Bell inequalities, but that is certainly an interesting tack to take.
Maybe you can try this version, where they violate a Bell inequality (but not sure about spacelike separation): http://arxiv.org/abs/1205.4926 :D Bonus: they start off with wave-particle duality o0)
Yes, they like the particle/wave language for describing the interference patterns. Personally, I think it's more insightful to recognize that waves do all that stuff, so the "duality" is that there's not really a distinction.
That paper references http://arxiv.org/abs/1206.4348 which also has a Bell inequality violation.
Thanks for those references, it will be interesting to ponder what is the appropriate classical analog to those Bell inequality violations. Note that by "classical analog" I certainly do not mean "locally real particle treatment" or "local hidden variable treatments", those are classical particle analogs. I'm talking about a classical wave analog.
 
  • #44
Ken G said:
I don't remember saying anything about Bell inequalities, but that is certainly an interesting tack to take.

I thought that was the point you were making in post #23 when you said "Yet the correlations are not surprising when they have a "Bertlmann's socks" flavor (which your correlations did by the way, that's a detail that needs to be fixed up but it's not essential)." But maybe you were referring to a different scenario. To be honest, I just presumed without evidence that the usual delayed choice has no Bell inequality violation, since otherwise the more recent versions wouldn't make the point of adding the violation. I suppose another consideration is that although there is no Bell inequality violation, the Wigner function may not be positive. If the Wigner function is positive, then I think one can have a noncontextual local classical model, but if the Wigner function is not positive, then I think that although a local model is not ruled out, it will have to be contextual. On the other hand, the Wigner function is not the unique function whose marginals are the quantum probabilities, and in special cases one can construct another joint distribution that is positive, even if the Wigner function is not.
 
  • #45
atyy said:
I thought that was the point you were making in post #23 when you said "Yet the correlations are not surprising when they have a "Bertlmann's socks" flavor (which your correlations did by the way, that's a detail that needs to be fixed up but it's not essential)."
Ah, I see what you mean. I was talking about the mathematical description given by vanhees71, not necessarily about quantum erasure experiments writ large. I haven't really thought about whether quantum erasure is a Bell-type violation, but those articles you cited certainly clarify that question. I didn't think the issue was essential-- I suspected the analysis by vanhees71 could be generalized to Bell-type violations, but maybe there is some important wrinkle that distinguishes the classes.
To be honest, I just presumed without evidence that the usual delayed choice has no Bell inequality violation, since otherwise the more recent versions wouldn't make the point of adding the violation.
Yes, I see what you are saying now, and indeed this might be an important difference when it comes to looking for classical analogs.
I suppose another consideration is that although there is no Bell inequality violation, the Wigner function may not be positive. If the Wigner function is positive, then I think one can have a noncontextual local classical model, but if the Wigner function is not positive, then I think that although a local model is not ruled out, it will have to be contextual.
The classical analogs I mean are not local, they are wavelike, by which I mean they can involve interferences in two-point correlations that are sensitive to the history of the entire system. It is my contention that "quantum weirdness" lives in the idea that if you have a quantum, you should have attributes "carried with" that quantum, i.e., local realism, and so any two-point correlations must be correlations between two sets of information independently associated with the different quanta. We never think that should be true of waves, so the issue might not even come up. It's food for thought, if there is a classical-wave analog to Bell inequality violations.
So I think this may be a different issue. On the other hand, the Wigner function is not the unique function whose marginals are the quantum probabilities, and in special cases one can construct another joint distribution that is positive, even if the Wigner function is not.
The importance of the Wigner function is more food for thought-- these are all potentially important issues to ponder, I cannot tell at this stage what the classical importance of these are but it should be included in the landscape of classical analogs.
 
  • #46
Sure, with such a setup you can also do Bell inequality violation tests. Isn't the famous CHSH paper about such an experiment with a set of different relative orientations between Alice's and Bob's polarizers? One should note that nothing changes concerning the interpretation within the minimal interpretation, and I still think my treatment sticks to the minimal interpretation, particularly because nowhere does one need to argue with a "collapse" as a physical process.
 
  • #47
vanhees71 said:
Ok, then give the description of the quantum-erasure experiment within classical electromagnetism. I don't think that this is possible.

The standard DCQE experiment uses entangled photons and thus cannot be explained using classical electromagnetism. However, you just need correlation, but not necessarily quantum correlation to demonstrate DCQE. An example for a classical version of DCQE is given e.g. in T. Peng, et al., "Delayed-Choice Quantum Eraser with Thermal Light" Phys. Rev. Lett. 112, 180401 (2014).
 
  • #48
Cthugha said:
The standard DCQE experiment uses entangled photons and thus cannot be explained using classical electromagnetism. However, you just need correlation, but not necessarily quantum correlation to demonstrate DCQE. An example for a classical version of DCQE is given e.g. in T. Peng, et al., "Delayed-Choice Quantum Eraser with Thermal Light" Phys. Rev. Lett. 112, 180401 (2014).
Thanks for that, you may have saved me a lot of work, though I'm disappointed it has been done! Though not surprised.
 
  • #49
Cthugha said:
The standard DCQE experiment uses entangled photons and thus cannot be explained using classical electromagnetism. However, you just need correlation, but not necessarily quantum correlation to demonstrate DCQE. An example for a classical version of DCQE is given e.g. in T. Peng, et al., "Delayed-Choice Quantum Eraser with Thermal Light" Phys. Rev. Lett. 112, 180401 (2014).
Thanks a lot for pointing us to this very exciting paper. It made a great read last night (although I lack some sleep now :-)). It's very readable. Here's the link to the paper.

http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.112.180401

Unfortunately I couldn't find a link to a free preprint; the quantum opticians seem not to send everything to the arXiv :-(.

Now comes my question. Can we let this through as a proof of the claim that delayed-choice (postselection) experiments can be explained by classical electromagnetism? My answer is no!

The modeling given in the paper very clearly uses the quantum properties of the quantized radiation field. The source is modeled as a (quasi-)thermal source via the superposition of coherent states with random-phase averaging. Then single photons are detected at D0. Looking at all photons registered there, one sees only the interference pattern of the single slit, because the two slits are separated by a distance larger than the coherence length of the source at the slits.

Now the experiment is driven further by using the clever Mach-Zehnder-like beam-splitter setup to make coincidence measurements, i.e., one looks at the two-photon correlation function for photons at D0 and one of the other detectors. Unfortunately there seem to be some typos in the text, mixing up the detectors D1,...,D4 somewhat; the labeling of the plots in Figs. 2 and 3 is, of course, correct. One measures the coincident detection of photons at D0 and ##\mathrm{D}\alpha## (##\alpha \in \{1,2,3,4 \}##), filtering so as to register only truly coincident photon events, i.e., those "picked up" by D0 and ##\mathrm{D}\alpha## from two wave trains within the coherent state coming from the same time ##t_0##, as encoded in the electric-field operators in Eqs. (5)-(7). Then the corresponding two-photon correlation function given by Eq. (8) is measured. For the QFTlers among us: that's nothing else than the appropriate two-photon Wightman function for the number operators of photons at D0 and ##\mathrm{D}\alpha##.

Now the setup is such that if you have a coincident detection of two photons at D0 and D4, you know that the photon registered by D4 must have gone through slit B (i.e., you have which-way information). From the incoherence of the wave trains emitted from slits A and B, necessarily the photon registered at D0 must then also have come from slit B, and thus, when looking only at such coincident two-photon events, the photons registered by D0 show only the single-slit diffraction pattern (Fig. 2 only shows the main maximum), as expected, since through the filtering process with the help of the photon registered at D4 we have gained which-way information concerning the photon registered at D0. In the same way one deduces that one also sees only the single-slit pattern when using the coincident two-photon appearance at detectors D0 and D3.

On the other hand, when looking for coincident two-photon events at D0 and D1, we cannot decide through which slit the photon at D1 went, and thus also not through which slit the coincidently registered photon at D0 went. As the calculation and also the experiment show, one then observes the double-slit interference pattern (of course with the single-slit pattern as envelope). Correspondingly, one also sees the double-slit interference pattern when using the coincident two-photon appearance at D0 and D2.

Thus, the final statements in the paper, where the math is discussed, are correct. There's something mixed up in the description directly following Eq. (1), which is of course a simple typo.

As this analysis shows, again this experiment detects two-photon Fock states. The difference from the quantum-eraser experiment discussed above is that no entangled photon pairs were prepared; instead, a quasi-thermal source was used with a cleverly chosen geometrical setup concerning the coherence length and the distance of the double slit. Nevertheless, the "delayed erasure" [1] of the which-way information and the observed appearance of the double-slit interference pattern for the appropriately chosen subensembles is only possible within the quantum picture of the radiation field. I don't see how you can describe these coincidences within classical Maxwell theory. This is closely related to the Hanbury Brown and Twiss effect (HBT) and its explanation by Glauber et al.

[1] It's a "delayed choice" through "postselection", whether you want which-way information or the double-slit interference pattern, because the photon at D0 is detected significantly before the one at ##\mathrm{D}\alpha## (a 5 ns delay, compared to the "rise time" of 1 ns of the photodiodes). Thus, the photon at D0 is absorbed significantly before we decide which photon subensemble we choose (i.e., one leading to which-way information or one "erasing" it).

Again, according to my understanding of the minimal interpretation, the corresponding observations are due to the preparation of the "pseudo-thermal" radiation field at the very beginning, i.e., by shining a laser on the "ground glass". There's no retrocausal collapse of the photon detected at D0 by registering the photon at the detector ##\mathrm{D} \alpha##. The observation or non-observation of the double-slit interference pattern in the respective choice of coincidence measurement, within this (ensemble) representation, is thus not due to retrocausal manipulation of the physical state of the observed system. Of course, also here we implicitly assume that "Einstein causality" is correct, i.e., that the photons travel with the speed of light and that the detectors are thus causally disconnected, i.e., that there is no faster-than-light "communication" between the detectors. Of course, one cannot strictly prove this by this experiment. So there seems to be still a loophole for metaphysicists who like to give up Einstein causality in favor of what they call a "realistic" (or better "ontological") interpretation of quantum states. I'm more on the conservative side and prefer to assume the space-time structure of (at least special) relativity to be valid :-).

The whole experiment is a "delayed choice" experiment because, as noted in [1], the path-length difference ensures that the photon at D0 is detected before the one at ##\mathrm{D}\alpha##.
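For orientation, here is a minimal numerical toy (a classical-intensity analog of the correlation structure only; whether the actual photon-counting coincidences can be described this way is exactly the point at issue above): the mean intensity at D0 shows no fringes, correlating D0 with a phase-sensitive "eraser" port recovers fringes, and correlating with a "which-way" port does not.

```python
import numpy as np

rng = np.random.default_rng(0)
phi = rng.uniform(0, 2 * np.pi, 200_000)  # random relative phase, slit A vs slit B
for kx in np.linspace(0, 2 * np.pi, 5):   # D0 position, in units of fringe phase
    I0 = np.abs(1 + np.exp(1j * (kx + phi))) ** 2  # D0 sees both slit fields
    I_erase = np.abs(1 + np.exp(1j * phi)) ** 2    # "eraser" port: phase-sensitive
    I_which = np.ones_like(phi)                    # "which-way" port: slit B only
    print(f"kx={kx:4.2f}  <I0>={I0.mean():.2f}  "
          f"<I0 I_erase>={(I0 * I_erase).mean():.2f}  "
          f"<I0 I_which>={(I0 * I_which).mean():.2f}")
# <I0> and <I0 I_which> are flat in kx; <I0 I_erase> goes as 4 + 2 cos(kx),
# i.e., the fringes live only in the correlations, not in the singles.
```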
 
  • #50
I think it should be made clear (if I understand both correctly) that vanhees71 and billschnieder are not saying the same thing.

billschnieder, citing Jaynes, claims the fundamental logic of the Bell inequalities is wrong, and that local realism or Einstein Causality in the usual sense is consistent with quantum mechanics.

vanhees71 is only claiming a weakened form of Einstein Causality, in which RCC is preserved but FP is given up, so that correlations do not have a common-cause explanation.
 
  • #51
I would also point out that having a "classical analog" is not making the claim that everything can be explained without quantum mechanics. It just means that it is untrue to claim some phenomenon had no classical analog, i.e., that there is no classical version of a similar phenomenon. A good example is quantum chaos, where the wavefunction takes on a different character in systems that exhibit classical chaos. We don't say the existence of classical chaos means you don't need a wavefunction to describe that system. Still, having said that, I have a slightly different classical analog in mind, and I'm not sure about the Bell ramifications. Questions about wave vs. particle behavior are a bit different from Bell violations.
 
  • #52
atyy said:
I think it should be made clear (if I understand both correctly) that vanhees71 and billschnieder are not saying the same thing.

billschnieder, citing Jaynes, claims the fundamental logic of the Bell inequalities is wrong, and that local realism or Einstein Causality in the usual sense is consistent with quantum mechanics.

vanhees71 is only claiming a weakened form of Einstein Causality, in which RCC is preserved but FP is given up, so that correlations do not have a common-cause explanation.
I think you understood me right, but I don't see where Einstein Causality is weakened. In my understanding of the statistical interpretation, there is no faster-than-light signal necessary to explain the outcome of the delayed-choice experiments in accordance with quantum mechanics, and I still hold to my opinion that the common-cause explanation for this phenomenon is the entanglement of the photons for the eraser. Also in the delayed-choice experiment by Peng et al. there is no superluminal (or even retrocausal) action at a distance. The two-photon correlations there are also due to the preparation procedure encoded in the description given in the paper.

I'd still be glad about comments on whether my understanding of the Peng paper is correct (at least on the physics level; of course there seem to be subtle or even big differences among us concerning the metaphysical implications, which however should be left out of the discussion until the physics is fully clarified).
 
  • #53
vanhees71 said:
I think you understood me right, but I don't see where Einstein Causality is weakened. In my understanding of the statistical interpretation, there is no faster-than-light signal necessary to explain the outcome of the delayed-choice experiments in accordance with quantum mechanics, and I still hold to my opinion that the common-cause explanation for this phenomenon is the entanglement of the photons for the eraser. Also in the delayed-choice experiment by Peng et al. there is no superluminal (or even retrocausal) action at a distance. The two-photon correlations there are also due to the preparation procedure encoded in the description given in the paper.

Because of the EPR paper, Einstein Causality usually means RCS + FP, so if one rejects FP, Einstein Causality is weakened (by definition).

Also, throughout classical physics, FP is the definition of common cause. For example, in biology the framework of Bayesian networks is quite common, and FP is the definition of common cause in Bayesian networks. So if FP is rejected, there is no notion of common cause, unless one uses a new definition, which would be ok. But the question is what new definition of common cause are you using? Actually, the Cavalcanti and Lal paper http://arxiv.org/abs/1311.6852 is interesting because it discusses a proposal by Leifer and Spekkens http://arxiv.org/abs/1107.5849 to define a new notion of common cause, so that there is a modified Einstein Causality in which one keeps RCS and a more general notion of common cause.
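As an illustration, here is a minimal sketch of FP as the Bayesian-network notion of common cause (a made-up three-node network z -> A, z -> B, with arbitrary numbers):

```python
# By definition of the network, the common cause z screens A off from B:
# P(A,B|z) = P(A|z) P(B|z), which is exactly FP. Marginally (z unobserved),
# A and B are nevertheless correlated.
P_z = [0.5, 0.5]            # P(z)
P_A1_given_z = [0.1, 0.8]   # P(A=1 | z)
P_B1_given_z = [0.3, 0.9]   # P(B=1 | z)

P_A1B1 = sum(P_z[z] * P_A1_given_z[z] * P_B1_given_z[z] for z in (0, 1))
P_A1 = sum(P_z[z] * P_A1_given_z[z] for z in (0, 1))
P_B1 = sum(P_z[z] * P_B1_given_z[z] for z in (0, 1))
print(P_A1B1, P_A1 * P_B1)  # 0.375 vs 0.27: correlated without conditioning on z
```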

Edit: In my previous posts I used "RCC", which is a transcription error of mine from Cavalcanti and Lal's terminology, so here I use "RCS" instead for "relativistic causal structure".
 
  • #54
I see, but I don't think that EPR is consistent with the modern findings concerning the predictions of quantum theory in connection with entanglement, which is precisely the topic of the EPR paper. So you cannot keep both assumptions, which you call RCS and FP. I don't see why one should have to keep FP for any reason. For me it contradicts the fundamental principles of quantum theory, which were tested very thoroughly with all these experiments. As bhobba likes to quote, quantum theory just introduces a way to consistently describe the probabilistic behavior of nature, which is to our best knowledge today an observed fact. This doesn't exclude the possibility that one finds another more comprehensive deterministic theory, which then most probably will be even weirder than quantum theory itself from a metaphysical point of view. But physics (particularly quantum theory) teaches us that nature behaves as she behaves and doesn't care too much about our metaphysical convenience ;-)). I'll have a look at the papers later.
 
  • #55
vanhees71 said:
I see, but I don't think that EPR is consistent with the modern findings concerning the predictions of quantum theory in connection with entanglement, which is precisely the topic of the EPR paper. So you cannot keep both assumptions, which you call RCS and FP.

Yes, I mentioned EPR only because "Einstein Causality" will usually be understood to mean what EPR advocated, which was RCS + FP, which cannot both be kept.

vanhees71 said:
I don't see why one should have to keep FP for any reason. For me it contradicts the fundamental principles of quantum theory, which were tested very thoroughly with all these experiments.

Yes, it's fine to remove FP. But since FP is the old definition of common cause, one needs a new definition if one wants to say that the entangled state is a common cause.
 
  • #56
Hm, I don't know how to formalize my statement. Just in plain words, I think the quantum mechanical states, represented by the statistical operator in the formalism (which is by definition a trace-class positive semidefinite self-adjoint operator with trace 1; a state is pure if and only if it's a projection operator), describe the known properties of the system under consideration. It's objectively related to a single system through an equivalence class of preparation procedures, but even when we have full knowledge about the quantum system, i.e., if a complete set of compatible observables is determined through the preparation procedure and thus the state is a pure state, we have only probabilistic knowledge about the outcomes of measurements of the observables possible for this system. To test the corresponding predictions about such measurements, one thus always needs a sufficiently large ensemble of independently prepared systems.

Among the many predictions of quantum theory are also the non-local correlations known as "entanglement", which are found to contradict local deterministic (hidden-variable) theories by Bell's and related theorems. What's now the "cause" for these "non-classically strong" correlations? In the above picture about states, it's the preparation procedure in such an "entangled state", and in all cases known to me (including the gedanken experiment described in the original EPR paper) these preparations are due to local manipulations on particles or photons, which then evolve for some "long" time such that subsystems can be measured at far-distant (again local) experiments. The correlations due to entanglement are there for the whole time, from the very first preparation procedure. The measurements on the photons by Alice and Bob in the eraser experiment, or the coincident-photon-pair countings in the above-discussed experiment by Peng et al., are local and can thus, due to the locality of the interaction between each of the photons and the corresponding detector, be done in space-like separated space-time regions, and so (due to the microcausality of QED) they do not causally affect the far-distant photons measured in coincidence. I'm not aware of any test of these issues where this explanation with local microcausal relativistic QFT fails, and thus I think there's no problem with causality. As demonstrated in my summaries of the two experiments above, there's no need for a collapse interpretation, and thus no need to assume a faster-than-light manipulation of a far-distant object or even a "retrocausal action at a distance". One must clearly distinguish between interactions and correlations. The former are always local by construction of the standard relativistic QFTs; the latter can refer to far-distant objects and are thus non-local. The locality of the interactions, i.e., microcausality, guarantees the consistency of the S-matrix with relativistic covariance, unitarity, and causality.
 
  • #57
vanhees71 said:
What's now the "cause" for these "non-classically strong" correlations? In the above picture about states, it's the preparation procedure in such an "entangled state", and in all cases known to me (including the gedanken experiment described in the original EPR paper) these preparations are due to local manipulations on particles or photons, which then evolve for some "long" time such that subsystems can be measured at far-distant (again local) experiments.


vanhees71 said:
The locality of the interactions, i.e., microcausality, guarantees the consistency of the S-matrix with relativistic covariance, unitarity, and causality.

I read the rest of the post too, but I think the key issues are:
(A) Do cluster decomposition and microcausality have anything to say about the entangled state as the "cause" of the nonlocal correlations?
(B) If the entangled state is a "cause" is it a local cause?

Let me try to address (B) first. I need to think more about (A).

Let's work in a frame in which Alice and Bob measure simultaneously, so the observable measured is the tensor product of two local spacelike-separated observables. Let's use the Heisenberg picture so the initial entangled state does not evolve, but the operators do. In the Heisenberg picture, the field operator has an equation of motion that has the same form as the classical relativistic equation, except that it is an operator, so there is a good argument that the field dynamics obey local causality or Einstein causality. That leaves the initial state. I'm not sure it's the only cause, but even if it is, is the initial state a local cause?

(1) First the initial state is in Hilbert space, so it is not obviously local or associated with any point in spacetime. To avoid this we can try to
(2) Associate the initial state with the location of the preparation procedure. But if we do this, the state does not evolve, so when the measurement is made, if the preparation and measurement are spacelike separated, then the measurement outcome will depend nonlocally on the state at a spacelike location. To avoid this we can try to
(3) Associate the initial state with the entire initial spacelike slice, or put a copy of the state at every location on the initial spacelike slice. But if we do this, the preparation procedure itself is nonlocal, since the local preparation procedure is affecting the entire spacelike slice.

The basic reason I don't think the state can be a local cause is that for the state to be a cause, we have to treat it as real (FAPP). But if we treat it as real, then in a frame in which the measurements are not simultaneous, the state for the later measurement will be caused by the earlier measurement and collapse, which is manifestly nonlocal. Going to a frame in which the measurements are simultaneous hides the nonlocality, but cannot make it go away.
 
  • #58
Regarding (A), here is a discussion that says cluster decomposition does not apply for entangled initial states: https://www.physicsforums.com/threads/cluster-decomposition-and-epr-correlations.409861/#post-2773606 (especially posts #4 by humanino and #7 by Demystifier).

As for microcausality, what it says is that the measurement of one local observable cannot affect the measurement of a different spacelike-separated local observable. This is a requirement for no superluminal signalling of classical information. This is about the causes of local events. It does not say anything about the cause of nonlocal correlations.
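In formula form, microcausality is the statement (mostly-minus signature, so spacelike separation means ##(x-y)^2 < 0##):

$$[\hat{O}_1(x), \hat{O}_2(y)] = 0 \quad \text{for } (x-y)^2 < 0,$$

which guarantees that the local measurement statistics at x do not depend on what is measured at y, but is silent about correlations conditioned on a shared preparation.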
 
  • #59
I've the impression we have just a different language again. So I think we have to clarify what we mean by "cause" first. Two events are causally connected if one event necessarily leads to the other event. Now the fundamental time arrow is defined via causality, i.e., it parametrizes the order of causal events, and by convention an event A can only be the cause of an event B when A happens before B.

In relativistic physics this implies that two causally connected events must be time-like or light-like relative to each other, because otherwise the statement that A causes B would become observer dependent, i.e. the causality relation, defined by time order, would not be invariant under proper orthochronous Lorentz transformations (the proper orthochronous Lorentz group is the symmetry group underlying the structure of special relativistic spacetime).

This implies that the natural laws, describing the dynamics of systems through equations of motion, should also be causal, in the sense that changes of one subsystem due to the interaction with a distant subsystem propagate causally. So far, on a fundamental level, this is realized by the field picture. In classical physics this means that the interactions between particles are described as mediated through fields, which obey equations of motion that are formulated in terms of causal laws; this excludes (at least according to the experience from model building so far) tachyonic field equations (free tachyonic fields can be interpreted in a way fulfilling the causality constraint for their propagation, but this has not been achieved for interacting tachyonic fields or tachyonic fields interacting with particles). The forces in the equations of motion of the particles are local functions of the particle's space-time coordinate and its time derivatives, and the field equations of motion have source terms due to the presence of particles, which admit causal (retarded) solutions. On the classical level, however, there is a tremendous self-consistency problem: even the simplest case, the electromagnetics of point particles, is intrinsically inconsistent, or at least only solvable in an approximate sense (see F. Rohrlich, Classical Charged Particles).
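For concreteness, the standard retarded solution of the classical field equations being described here (Lorenz gauge, Heaviside-Lorentz units with c = 1):

$$\Box A^\mu(x) = j^\mu(x), \qquad A^\mu(x) = \int \mathrm{d}^4 x' \, G_{\text{ret}}(x - x') \, j^\mu(x'), \qquad G_{\text{ret}}(t, \vec{x}) = \frac{\delta(t - |\vec{x}|)}{4 \pi |\vec{x}|},$$

so the field at a point depends only on sources on its past light cone, which is precisely the causality property at issue.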

In quantum theory these ideas are transferred to local microcausal quantum field theories, from which the causality of observable quantities can be deduced (microcausality is a sufficient condition for that; whether it's necessary I'm not sure, since so far I haven't seen a formal proof, but FAPP we can constrain our discussion to such local microcausal QFTs, because all practically relevant theories, including the Standard Model of elementary particle physics, are of this kind).

This also implies the linked-cluster principle, which states that local observations (local here means local in both space and time, of course; and so far all our observations are quite local, made with measurement apparatuses in the lab which always have finite spatial extent) are uncorrelated if the observations are space-like separated. There is no restriction on the states in which the observed (sub)systems are prepared. Even if there is entanglement, as in our quantum-eraser example, there is no way to find out about this entanglement with local observations alone. Alice and Bob make local polarization measurements on photons from a parametric-down-conversion "preparation"; they simply find unpolarized photons. There's no way for Alice (Bob) to tell that her (his) photon is part of an entangled pair. To figure this out they have to compare their measurement protocols, which they have made such that they are able to know which photons belong to one pair. That's done by properly keeping track of the time of their measurements, i.e., the ensemble of entangled photon pairs must be separated enough in time such that, within their time-measurement accuracy (limited by the dead time of their photon detectors), they can clearly resolve which photon pair comes from the common preparation as an entangled pair by parametric down-conversion. Then they find out about the correlations encoded in the formalism by the description as a two-photon state for which the polarization states of the single photons are entangled. Thus the "cause" for the correlation lies in the preparation of the two-photon state in this entangled state, i.e., in the preparation procedure at the very beginning, i.e., causally (!) before either Bob or Alice measures the polarization state of their photon.

Now to the Heisenberg picture. Of course, it's clear that the choice of the picture of time evolution is totally unimportant; the outcome is always the same in each picture. In the Heisenberg picture the state (statistical operator for pure or mixed states) is time-independent, but then the operators describing observables, and thus their corresponding (generalized) eigenstates, carry the full time evolution with the full Hamiltonian (including the interaction part). For the Schrödinger picture it's the opposite, i.e., the statistical operator carries the full time evolution and the observables are described by time-independent operators. For the general Dirac picture you split the time dependence between the statistical operator and the operators describing observables. One changes between the pictures with unitary transformations. So the probabilities for Alice and Bob to find one of the photons in the entangled pair in a certain polarization state at their place at a certain time are of course the same in all pictures. So the answer to the question whether "the state is the cause for the correlations" cannot depend on the choice of the picture of time evolution. This is so by construction: QT is invariant under changes of the picture of time evolution (of course, here I ignore all formal problems in connection with Haag's theorem concerning the interaction picture used to define perturbative QFT).

However, it depends somewhat on the metaphysical interpretation of "state". I think defining the meaning of the notion of "state" in QT is the key to all these debates and the misunderstandings in such debates. Of course, physics-wise a state is not "a self-adjoint normalized trace-class operator in Hilbert space"; this is its mathematical description within the theory. Physical notions must be defined as statements about objectively observable facts. That's why, in my understanding, the physical definition of "state" is that it is an equivalence class of preparation procedures. E.g., in a Stern-Gerlach experiment a silver atom is in the state with spin up in the ##z## direction in the sense that I've let it go through an inhomogeneous magnetic field such that I can (with practically arbitrary accuracy) be sure that silver atoms are separated according to the two different spin states and that I can filter out the unwanted spin-down beam. For Alice and Bob in the quantum-eraser experiment the state is given by the parametric-down-conversion setup: you shine a laser beam on the appropriate birefringent crystal and just consider the polarization-entangled photons. This ensures, as proven empirically (with practically arbitrarily high accuracy), that you have photon pairs showing the correlations described by the corresponding statistical operators (or rays in Hilbert space for this case of a pure state). In this sense, I say the "quantum correlations are caused by the preparation in this particular state".
 
  • #60
vanhees71 said:
I've the impression we have just a different language again. So I think we have to clarify what we mean by "cause" first. Two events are causally connected if one event necessarily leads to the other event.

Let's try to define this mathematically. Let's suppose that there are only 3 events u,v,w. If u is the cause of w, then P(w|u,v) = P(w|u), otherwise u does not necessarily lead to w, since changing v can influence the distribution of w.

Let's consider a simple case without spatially separated measurements, where A is Alice's outcome, a is Alice's measurement setting, and z is the preparation procedure. In general, P(A|a,z) will not be P(A|z), so z is not the cause of A. The preparation procedure is not the cause of the measurement outcome, because the outcome also depends on the measurement setting. Using this definition of cause, the preparation procedure and measurement setting together are the cause of the outcome.
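A minimal single-qubit illustration of that statement (standard textbook quantum mechanics, not tied to any experiment in this thread): prepare spin-up along z and measure spin along an axis tilted by angle a in the x-z plane; the outcome distribution depends on a, so the preparation alone does not fix it.

```python
import numpy as np

up = np.array([1.0, 0.0])  # preparation z: spin-up along the z-axis

def p_plus(a, state=up):
    """P(A=+1 | a, z): 'up' along an axis tilted by angle a in the x-z plane."""
    plus_a = np.array([np.cos(a / 2), np.sin(a / 2)])  # +1 eigenvector
    return float(abs(plus_a @ state) ** 2)

for a in (0.0, np.pi / 4, np.pi / 2, np.pi):
    print(f"a = {a:.2f} rad: P(A=+1|a,z) = {p_plus(a):.3f}")  # cos^2(a/2)
```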
 
  • #61
To begin with, I don't understand this formal definition of "causality". It would mean that no matter what v is, it doesn't influence w, and that u alone leads to the same probability outcomes for w; but this is not necessarily true even for classical probability experiments, since of course there may also be causal influences additional to u, which change the probability distribution.

The rest is even more obscure to me. What else than the preparation procedure in an entangled state could be the cause for the correlations described by the entanglement? For me that's almost a tautology.
 
  • #62
vanhees71 said:
To begin with, I don't understand this formal definition of "causality". It would mean that no matter what v is, it doesn't influence w, and that u alone leads to the same probability outcomes for w; but this is not necessarily true even for classical probability experiments, since of course there may also be causal influences additional to u, which change the probability distribution.

Yes, in reality there may be other causal influences additional to u, which change the probability distribution. The point is that causal influences, when manipulated, affect the probability distribution, and non-causes, when manipulated, do not affect the probability distribution. So if there are only two events u and v that precede w, we say that u is a causal influence and v is not a causal influence if P(w|u,v) = P(w|u).

vanhees71 said:
The rest is even more obscure to me. What else than the preparation procedure in an entangled state could be the cause for the correlations described by the entanglement? For me that's almost a tautology.

But that is clearly not the case in a frame in which the measurements are not simultaneous. In the formalism, the state collapses after the first measurement. If you say that the state reflects a preparation procedure (which is correct), why don't you count the state after collapse as being prepared by the first measurement?
 
  • #63
But if you assume causality, then the collapse assumption leads to a contradiction! Assume that A registers her photon polarization before B in your new frame. Assuming that the collapse causes Bob's measurement result would mean that you affect an event (B's measurement of his photon's polarization) that's spacelike separated in this new frame. In addition, you can find another reference frame where B's measurement process is before A's. Then your collapse hypothesis would mean that in this frame B causally affects A, in contradiction to the conclusion in the other frame, where A's measurement causally affects B's measurement. That's the whole point of EPR's criticism.

For me, from a modern point of view, it's not a criticism of QT in the minimal interpretation, but only of a QT where you make this collapse hypothesis, and this hypothesis is superfluous for predicting the outcome of experiments by QT. That's why I abandon the collapse hypothesis and conclude that the cause of the quantum correlations is for sure not the measurement of the polarization of one of the photons but the preparation procedure in a polarization-entangled two-photon state at the very beginning.
 
  • #64
vanhees71 said:
But if you assume causality, then the collapse assumption leads to a contradiction! Assume that A registers her photon polarization before B in your new frame. Assuming that the collapse causes Bob's measurement result would mean that you affect an event (B's measurement of his photon's polarization) that's spacelike separated in this new frame. In addition, you can find another reference frame where B's measurement process is before A's. Then your collapse hypothesis would mean that in this frame B causally affects A, in contradiction to the conclusion in the other frame, where A's measurement causally affects B's measurement. That's the whole point of EPR's criticism.

For me, from a modern point of view, it's not a criticism of QT in the minimal interpretation, but only of a QT where you make this collapse hypothesis, and this hypothesis is superfluous for predicting the outcome of experiments by QT. That's why I abandon the collapse hypothesis and conclude that the cause of the quantum correlations is for sure not the measurement of the polarization of one of the photons but the preparation procedure in a polarization-entangled two-photon state at the very beginning.

My point is that I don't understand why your interpretation is minimal. In a minimal interpretation, the state is not real, so it is not even a cause, and we don't care if we have a cause, do we? Here RCS is not rejected, but the correlations do not have a cause within the theory (option A).

To me it seems that if you treat the state as a cause, then you are treating the state as physical, which is fine within a minimal interpretation FAPP. But then you should do so consistently, so that the state after collapse is also a cause. Here RCS is rejected, and the correlations have a cause (option B).

Copenhagen allows both A and B, because they are on different levels. But can one say on the same level that RCS is kept and the correlations have a cause? That seems very problematic.

Edit: Strike this [Anyway, let me say a bit more why my definition of a cause makes sense (it's not my definition, most biologists use it). Let's consider EPR again. Let A be Alice's outcome, a be Alice's measurement setting, B be Bob's outcome, b be Bob's measurement setting, and z be the preparation procedure. By considering Alice's reduced density matrix, we know that P(A|B,a,b,z)=P(A|a,z). So Alice's outcome has the preparation procedure and her measurement settings as causes, but Bob's outcome and measurement setting are not causes of Alice's outcome. This is consistent with relativistic causality, since Alice's measurement setting and the preparation procedure are both in her past light cone. So this is an example that shows that my definition of cause is a reasonable one.]
 
  • #65
I have to think about your Edit first. I think it's a bit more complicated than that. You have to distinguish between the statistical description of the whole ensemble and that of the subensembles after comparing the measurement protocols (postselection), which either erase the which-way information (restoring the interference pattern) or gain which-way information (leading to no interference pattern). Of course, these are two different mutually exclusive measurement protocols, and in this sense the interference pattern and the which-way information are complementary in Bohr's sense.

What's also not so clear to me, and which may be also a key to resolve our troubles with the right interpretation, is the question, whether QT is also an extension of usual probability theory or not. Some people even talk about "quantum logics", i.e., see the necessity to define quantum theory as an alteration of the very foundations of logics and set theory, which of course is also closely related to the foundation of usual probability theory (say in the axiomatic system according to Kolmogorov).

However, also a minimal interpretation must say what the meaning of the state in the physical world is, and in my understanding of a minimal interpretation this is just given by Born's rule, i.e., the probabilistic statements about measurements. On the other hand, you also must be able to associate a real-world situation with the formally defined state (statistical operator), and this is why the state is also operationally defined as an equivalence class of preparation procedures which let you prepare the real system in a way such that it is described by this state. This is a very tricky point in the whole quantum business. In my opinion it's best worked out in Asher Peres's book "Quantum Theory: Concepts and Methods".
 
  • #66
What I meant by the edit in post #65 is to ignore it - I don't think it's correct.
 
  • #67
atyy said:
Let's try to define this mathematically. Let's suppose that there are only 3 events u,v,w. If u is the cause of w, then P(w|u,v) = P(w|u), otherwise u does not necessarily lead to w, since changing v can influence the distribution of w.

atyy said:
So if there are only two events u and v that precede w, we say that u is a causal influence and v is not a causal influence if P(w|u,v) = P(w|u)

Therein lies the problem. There is a difference between the following statements:

1) If u is a cause of w but v is not a cause of w then P(w|u,v) = P(w|u)
2) u is a cause of w but v is not a cause of w if P(w|u,v) = P(w|u).

(1) is correct, and (2) is wrong (compounded syllogistic fallacy) as can be seen from Jaynes' simple urn example:

We have an urn with 1 red ball and 1 white ball, drawn blindfolded without replacement
w = "Red on first draw"
u = "Red on second draw"
v = "There is a full moon"

P(w|u,v) = P(w|u).

Would you say "u = Red was picked on the second draw" is a cause of "w = Red was picked on the first draw"?

Those who believe this may end up with the conclusion that the first pick did not have a definite result until the second pick was revealed, which then collapsed it, i.e., retro-causation. It is easy to find similar examples that can easily be misunderstood as "non-locality".
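For anyone who wants to check, the whole urn fits in a few lines of enumeration (w, u, v as defined above, with the full moon as an independent coin):

```python
# Equally likely worlds: which ball comes out FIRST, and whether there is a full moon.
worlds = [(first, moon) for first in ("R", "W") for moon in (True, False)]

def P(event, given=lambda s: True):
    sel = [s for s in worlds if given(s)]
    return sum(event(s) for s in sel) / len(sel)

w = lambda s: s[0] == "R"   # red on first draw
u = lambda s: s[0] != "R"   # red on second draw (the remaining ball)
v = lambda s: s[1]          # full moon

print(P(w))                                 # 0.5
print(P(w, given=u))                        # 0.0 -- pure inference about the past
print(P(w, given=lambda s: u(s) and v(s)))  # 0.0, so P(w|u,v) = P(w|u)
```

Conditioning on the second draw changes the probability of the first draw from 0.5 to 0, yet obviously nothing travels backwards in time; the later event is informative, not causal.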
 
  • #68
vanhees71 said:
I have to think about your Edit first. I think it's a bit more complicated than that. You have to distinguish between the statistical description of the whole ensemble and that of the subensembles after comparing the measurement protocols (postselection), which either erase the which-way information (restoring the interference pattern) or gain which-way information (leading to no interference pattern). Of course, these are two different mutually exclusive measurement protocols, and in this sense the interference pattern and the which-way information are complementary in Bohr's sense.

Let's talk about EPR, just to make things easier, and explicitly not consider retrocausation.

vanhees71 said:
What's also not so clear to me, and which may be also a key to resolve our troubles with the right interpretation, is the question, whether QT is also an extension of usual probability theory or not. Some people even talk about "quantum logics", i.e., see the necessity to define quantum theory as an alteration of the very foundations of logics and set theory, which of course is also closely related to the foundation of usual probability theory (say in the axiomatic system according to Kolmogorov).

In the usual Copenhagen interpretation in which the state is considered physical FAPP, there isn't a big change from normal probability. The only difference is that the pure states are rays in Hilbert space. There is also a common sense causality, except that it is not relativistic causality. However, there is no conflict with relativity, since relativity does not require relativistic causality, and only requires that there is no superluminal transmission of classical information. It's only if one wants to maintain relativistic causality and the usual meaning of causation that one cannot use the familiar definitions of causal explanation.

vanhees71 said:
However, also a minimal interpretation must say what the meaning of the state in the physical world is, and in my understanding of a minimal interpretation this is just given by Born's rule, i.e., the probabilistic statements about measurements. On the other hand, you also must be able to associate a real-world situation with the formally defined state (statistical operator), and this is why the state is also operationally defined as an equivalence class of preparation procedures which let you prepare the real system in a way such that it is described by this state. This is a very tricky point in the whole quantum business. In my opinion it's best worked out in Asher Peres's book "Quantum Theory: Concepts and Methods".

Yes, that is not a problem. The state is an equivalence class of preparation procedures that yield the same measurement outcome distributions. One doesn't have to go to Peres for that; it is standard Copenhagen. The question is: can one have relativistic causality and a local explanation for the correlations? For the usual definitions of causal explanation, the answer is no. It is often said that the Bell inequalities rule out local realism, which is vague, since there are several possible different meanings of realism. However, one possible trade-off between locality and realism is:

(1) Accept that the correlations have no cause (the entangled state is not real, and so cannot be a cause)
(2) Accept that the entangled state is real FAPP, and together with the measurement settings can explain the correlations, but lose locality.

Again, quantum theory is local in many senses. Letting A and B be the outcomes, a and b be the settings, and z be the preparation procedure, we can consider quantum mechanics to be local because P(A|a,b,z) = P(A|a,z), meaning that the distant measurement setting does not affect the distribution of local outcomes. There is no superluminal signalling. There is commutation of spacelike-separated observables and cluster decomposition. But these are different notions from local causality or Einstein causality.
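As a concrete check of the no-signalling statement P(A|a,b,z) = P(A|a,z), here is a small sketch for the singlet state (standard quantum mechanics; the settings are measurement angles in the x-z plane):

```python
import numpy as np

def proj(theta, outcome):
    """Projector onto the outcome (+1/-1) eigenstate of spin along angle theta."""
    v = (np.array([np.cos(theta / 2), np.sin(theta / 2)]) if outcome == +1
         else np.array([-np.sin(theta / 2), np.cos(theta / 2)]))
    return np.outer(v, v)

# Singlet state (|01> - |10>)/sqrt(2) as a density matrix.
psi = (np.kron([1, 0], [0, 1]) - np.kron([0, 1], [1, 0])) / np.sqrt(2)
rho = np.outer(psi, psi)

a = 0.7                        # Alice's setting (radians)
for b in (0.0, 1.1, 2.3):      # Bob's setting varies
    p_A_plus = sum(np.trace(rho @ np.kron(proj(a, +1), proj(b, B))).real
                   for B in (+1, -1))   # sum over Bob's outcomes
    print(f"b = {b:.1f}: P(A=+1|a,b,z) = {p_A_plus:.6f}")  # 0.5, independent of b
```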
 
  • #69
[Figure: causal-structure diagram for a Bell experiment (LocalCausality.jpg), reproduced from Figure 19 of Woods and Spekkens, linked below.]


Here is a simple way to see why Einstein causality is ruled out. A and B are the measurement outcomes, S and T are the measurement settings and ##\lambda## is the preparation procedure. The arrows indicate possible causal influences, and this is consistent with relativistic causality because the two possible causes of event A are S and ##\lambda##, both of which are in the past light cone of A; and the two possible causes of event B are T and ##\lambda## both of which are in the past light cone of B. However, the point of the Bell inequality violation is that this causal structure is inconsistent with quantum mechanics.

The diagram is Figure 19 of Woods and Spekkens http://arxiv.org/abs/1208.4119.
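To make the inconsistency concrete, here is the standard CHSH arithmetic (not taken from the Woods-Spekkens paper): for the singlet, E(a,b) = -cos(a-b), and any model with the causal structure of the diagram obeys |S| <= 2.

```python
import numpy as np

E = lambda a, b: -np.cos(a - b)  # singlet correlation

# Standard CHSH settings (radians):
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)  # -2*sqrt(2) ~ -2.83, so |S| > 2: the diagram's bound is violated
```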
 
  • #70
atyy said:
[causal-structure diagram from #69 omitted]
Here is a simple way to see why Einstein causality is ruled out. A and B are the measurement outcomes, S and T are the measurement settings and ##\lambda## is the preparation procedure. The arrows indicate possible causal influences, and this is consistent with relativistic causality because the two possible causes of event A are S and ##\lambda##, both of which are in the past light cone of A; and the two possible causes of event B are T and ##\lambda## both of which are in the past light cone of B.
This is perfectly fine. Nothing in the above is controversial or "rules out" Einstein Causality. However you then say

However, the point of the Bell inequality violation is that this causal structure is inconsistent with quantum mechanics.
And the point I've been trying to explain to you is that the logic of that conclusion is inconsistent with the classical probability treatment of post-selected experiments even when Einstein Causality demonstrably holds. So there is absolutely no difficulty with Einstein Causality.

Secondly, your diagram is incomplete. You need to draw an arrow from both A and B to a new circle labeled "POSTPROCESSING", and then two more arrows from there to two new circles labelled ##A_B## and ##B_A## (read as: A results filtered according to B results, and B results filtered using A results).
Then it is clear that the results A and B never violate Bell's inequality on their own; rather, it is ##A_B## and ##B_A## that violate it. Post-processing is an integral component of such experiments; in fact, I would argue that it is part of the "preparation procedure", though some may object that it happens afterwards. But if the state is not physical but only represents information about the real physical situation, then the "preparation procedure" of the state can include information gained well after the physical situation happened, without in any way violating Einstein Causality (which is always ontological). ##\lambda## and "POSTPROCESSING" go hand in hand, so you cannot say it is the post-processing alone that "causes" the correlation. Note again that A and B do not violate the inequalities, but ##A_B## and ##B_A## do.
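To illustrate the distinction between A and ##A_B## numerically, here is a toy local model of my own construction (not a claim about any actual experiment): each side sometimes fails to "click", and filtering on joint clicks changes the measured correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = rng.uniform(0, 2 * np.pi, 500_000)  # shared hidden variable

def side(setting, tau=0.6):
    """Local outcome (+1/-1) and a 'click' flag; depends only on lam and setting."""
    c = np.cos(lam - setting)
    return np.sign(c), np.abs(c) > tau    # clicks only when |cos| > tau

a, b = 0.0, np.pi / 8
A, clickA = side(a)
B, clickB = side(b)

E_raw = np.mean(A * B)                    # correlation over ALL pairs
coinc = clickA & clickB
E_post = np.mean(A[coinc] * B[coinc])     # correlation after coincidence filtering
print(f"E_raw = {E_raw:.3f}, E_post = {E_post:.3f}")  # the filtered value differs
```

Whether such filtering can by itself reproduce a Bell violation is a separate question (the detection loophole); the sketch only shows that the filtered statistics are not the raw statistics.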
 