Can we violate Bell inequalities by giving up CFD?

The discussion centers on the potential to violate Bell inequalities by relinquishing counterfactual definiteness (CFD) while maintaining locality. Participants argue that entanglement and quantum mechanics (QM) do not require mystical explanations; rather, they involve correlations arising from superposition. The conversation highlights the distinction between classical and quantum correlations, emphasizing that giving up CFD allows for the acceptance of qubits instead of classical bits. It is noted that while locality can be preserved by avoiding superluminal signaling, predictability must be sacrificed to align with Bell's theorem. Ultimately, the dialogue underscores the foundational principles of QM in understanding entangled systems and their correlations.
  • #61
stevendaryl said:
This distinction between factorizability and signal locality causes philosophical problems for me, no matter what interpretation of "locality" you're using. For some people, signal locality is all that's important, so they're perfectly happy with saying QM is local (or is not nonlocal, to make a fine distinction). But I have problems with that. What is special about "choices made by agents"? Why should physics particularly care about those sources of unpredictability?
I agree that it would be much less mysterious if nature just behaved classically. But unfortunately she doesn't, and at some point we just have to accept it and adopt the most reasonable explanation. After taking all possibilities into consideration, I've personally come to the conclusion that we have to accept that there is a peculiar thing like quantum probability theory that we just don't quite understand yet. The reason I think so is that it applies universally to every physical theory. If quantum probabilities weren't a thing, then why can we apply quantum theory to large effective systems without even knowing the actual details of the interactions? If there were a more fundamental theory, we wouldn't expect a simplification of it to still retain the structure of a quantum theory. But as a matter of fact, the quantum framework works nicely at all levels of complexity. I can imagine that one day we might even successfully apply it to models of economics. And economics clearly isn't a theory of physics.

(Of course, everyone is allowed to have their own opinion. I just don't accept it if people like Ilja force their personal prejudices upon others, especially if there is no evidence in favour of them.)
 
  • #62
rubi said:
I agree that one can also take the other point of view and I accept that people do so. I just wanted to explain that one doesn't need to and quantum theory can be a very satisfactory theory if one doesn't.

I think one should be clear that those who take the "other point of view" are not claiming that quantum theory is not a very satisfactory theory. In the same spirit that one can take QED to be a very satisfactory theory because of the Wilsonian effective theory point of view, one can also say that very satisfactory theories can themselves indicate their incompleteness and point towards theoretical opportunities.

rubi said:
Well, I would say that "nature" refers to nature and a theory is just a representation of some ideas about nature in the language of mathematics. We can't read off what nature is by looking at a mathematical theory.

I think your language is unusual. If you would like to just say that quantum theory is not a theory about what nature is, but about what we can say about nature, i.e. predicting the probabilities of outcomes, then that would be fine. But going on to say that quantum theory explains the observations is controversial. Usually, in the operational view, the wave function is not taken to be real, just a tool. If the wave function is taken to be an explanation, then it is taken to be real, collapse is real, and relativistic invariance is manifestly violated.

rubi said:
Pushing the measurement to the end means that you are describing the situation from the outside, i.e. you have a third observer. But as a matter of fact, Alice and Bob perform measurements at spacelike separated intervals and the results are consistent with the statistics that is predicted by the pre-collapsed state.

No, it means that Bob includes Alice as part of his classical apparatus and Alice includes Bob as part of her classical apparatus. So the measurement that is performed is the simultaneous measurement by Alice and Bob. However, using this method to avoid collapse will create a preferred frame, since it takes the frame in which Alice and Bob measure simultaneously. To avoid the preferred frame, one cannot accept the reality of measurements at spacelike intervals.

rubi said:
I believe that the majority of quantum physicists would agree that Reichenbach's criterion is too strong for application in quantum theory.

Yes, perhaps the precise statement of Reichenbach's principle might not be agreed on by everyone. However, Ilja is much closer to consensus than you are - there is no widely accepted notion of cause in which quantum theory explains the correlations.

rubi said:
Bohmian mechanics definitely needs a conspiracy to explain why the Lorentz violation cannot be observed (arXiv:1208.4119). Even the paper you quoted earlier comes to this conclusion. And introducing an ether with all its consequences when there is really no need for it is just not reasonable.

The paper makes separate comments about Valentini's version of Bohmian Mechanics.
 
  • #63
rubi said:
(Of course, everyone is allowed to have their own opinion. I just don't accept it if people like Ilja force their personal prejudices upon others, especially if there is no evidence in favour of them.)

I'm pretty sure Ilja's view is the common one, or at least the one that is closer to correct. The problem with your view is that you go beyond the view that the role of quantum theory is only to predict the correlations, which is the operational view of Bohr and Heisenberg, and all who believe there is a measurement problem also agree the operational view is very satisfactory. But to go beyond that and say that quantum theory "explains" or is about "causes" and can maintain relativistic invariance is very controversial.
 
  • #64
atyy said:
Yes, that's a slightly more general version of what I said. In either case, there is no violation of the Bell inequalities at spacelike separation, so no implication of nonlocality via the Bell inequalities.
I disagree. Alice and Bob are spacelike separated. The violation occurs. They are both spacelike separated from Charles when the photons are detected. The violation occurs.

I suspect that what you mean is that the observation (to fix the results) has to wait until further down Charles' world-line, where he can receive their results. But whilst that may save locality, it forces us to assume that Charles' classical observation of Alice and Bob's classical data is what collapses their wavefunction(s). So Alice and Bob's lives are put on hold until their data intersect. Good job photons are pretty nifty so it's all over in a few microseconds, but I wonder how this would work with cold electrons where Alice and Bob remain in Schrödinger cat states for half an hour? Perhaps we should ask them what it was like... oh I forgot, their memories get wiped at the same time as the forbidden data.
 
  • #65
Derek Potter said:
I disagree. Alice and Bob are spacelike separated. The violation occurs. They are both spacelike separated from Charles when the photons are detected. The violation occurs.

To be clear, here I always use Copenhagen, so measurement is something which produces a classical result.

If Alice and Bob measure at spacelike separation, one can choose a frame in which their measurements are not simultaneous. In which case, one will have collapse.

To get rid of collapse, one has to use the frame in which Alice and Bob measure simultaneously. However, that means that there is a preferred frame.

To get rid of collapse and to get rid of the preferred frame, one has to say that there is no reality to Alice's measurement at spacelike separation.
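One computational point worth keeping in mind here: whatever one says about the reality of the collapse, the joint statistics do not depend on which measurement a given frame says happened first, because Alice's and Bob's projectors act on different tensor factors and therefore commute. A minimal numpy sketch (the singlet state and the angles are purely illustrative choices):

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2) in the |Alice, Bob> product basis
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def proj(theta, outcome):
    """Projector onto the +1/-1 outcome of spin along angle theta in the x-z plane."""
    n = np.cos(theta) * np.array([[1.0, 0.0], [0.0, -1.0]]) \
      + np.sin(theta) * np.array([[0.0, 1.0], [1.0, 0.0]])
    return (np.eye(2) + outcome * n) / 2

a, b = 0.3, 1.1  # arbitrary (illustrative) measurement angles

P_A = np.kron(proj(a, +1), np.eye(2))  # Alice's projector, first factor only
P_B = np.kron(np.eye(2), proj(b, +1))  # Bob's projector, second factor only

# Collapse applied in either temporal order, as different frames would describe it
p_alice_first = np.linalg.norm(P_B @ (P_A @ psi)) ** 2
p_bob_first = np.linalg.norm(P_A @ (P_B @ psi)) ** 2

# The two orders agree because [P_A, P_B] = 0
print(np.isclose(p_alice_first, p_bob_first))  # True
```

This of course only shows that the predictions are frame-independent; it does not by itself settle the interpretational question of whether the collapse itself is real.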

Derek Potter said:
I suspect that what you mean is that the observation (to fix the results) has to wait until further down Charles' world-line, where he can receive their results. But whilst that may save locality, it forces us to assume that Charles' classical observation of Alice and Bob's classical data is what collapses their wavefunction(s). So Alice and Bob's lives are put on hold until their data intersect. Good job photons are pretty nifty so it's all over in a few microseconds, but I wonder how this would work with cold electrons where Alice and Bob remain in Schrödinger cat states for half an hour? Perhaps we should ask them what it was like... oh I forgot, their memories get wiped at the same time as the forbidden data.

Almost, except that Alice and Bob have no classical data. They don't really exist and are just ghostly things in the wave function, which is not real. When Charles measures them, he observes the classical result that Alice and Bob report, namely that they violated the Bell inequality at spacelike separation. However, only the report received by Charles is real; Alice and Bob and their experiments are not.
 
  • #66
atyy said:
Almost, except that Alice and Bob have no classical data. They don't really exist and are just ghostly things in the wave function, which is not real. When Charles measures them, he observes the classical result that Alice and Bob report, namely that they violated the Bell inequality at spacelike separation. However, only the report received by Charles is real; Alice and Bob and their experiments are not.
That will be news to Alice and Bob. And I thought the LSD-dropping hippies were weird.
 
  • #67
atyy said:
To be clear, here I always use Copenhagen, so measurement is something which produces a classical result.
If Alice and Bob measure at spacelike separation, one can choose a frame in which their measurements are not simultaneous. In which case, one will have collapse.
To get rid of collapse, one has to use the frame in which Alice and Bob measure simultaneously. However, that means that there is a preferred frame.
I don't think simultaneity solves anything except making it much harder to think about. The wavefunction collapses under two observations: in stages if Alice and Bob stagger their observations, in one step if they are simultaneous.
atyy said:
To get rid of collapse and to get rid of the preferred frame, one has to say that there is no reality to Alice's measurement at spacelike separation.
No *classical* reality. But we know this anyway even without a preferred frame. And we are not trying to get rid of collapse because we are in love with MWI; we need to postpone it, otherwise Bob's detector is being affected by an event at Alice which, in some frames, hasn't even happened yet.
 
  • #68
atyy said:
I think your language is unusual. If you would like to just say that quantum theory is not a theory about what nature is, but about what we can say about nature, i.e. predicting the probabilities of outcomes, then that would be fine. But going on to say that quantum theory explains the observations is controversial. Usually, in the operational view, the wave function is not taken to be real, just a tool. If the wave function is taken to be an explanation, then it is taken to be real, collapse is real, and relativistic invariance is manifestly violated.
I pretty much completely agree with the operational view. I might just have a different standard for what I consider a possible explanation. For me, a theory that describes every aspect of a phenomenon accurately, is already a possible explanation. You seem to additionally require an explanation to be philosophically pleasing. I also prefer philosophically pleasing models, but for me it is not a necessary condition for an explanation.

No, it means that Bob includes Alice as part of his classical apparatus and Alice includes Bob as part of her classical apparatus. So the measurement that is performed is the simultaneous measurement by Alice and Bob. However, using this method to avoid collapse will create a preferred frame, since it takes the frame in which Alice and Bob measure simultaneously. To avoid the preferred frame, one cannot accept the reality of measurements at spacelike intervals.
I'm pretty sure that if you are not going to collapse the state anyway, i.e. you are just using it as a tool that encodes available information, you can just apply a Lorentz transform to it to get an equivalent description in any other inertial frame. The unitarity of the transformation ensures that all predictions must be equivalent.
But my point really wasn't about a no-collapse interpretation. What I'm saying is that even in plain Copenhagen with collapse, the probabilities that lead to a Bell inequality violation are calculated using only the pre-collapsed state, so it is really the entanglement and not the collapse, which causes the violation.
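rubi's claim here, that the Bell-violating probabilities come out of the pre-collapsed entangled state alone, can be checked directly. A minimal numpy sketch of the CHSH value for a singlet state follows; the settings are the standard ones giving the maximal quantum value, and no collapse or projection appears anywhere in the calculation:

```python
import numpy as np

# Pauli matrices
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

# Pre-collapsed singlet state |psi> = (|01> - |10>)/sqrt(2); nothing below collapses it
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin(theta):
    """Spin observable along angle theta in the x-z plane (eigenvalues +1/-1)."""
    return np.cos(theta) * sz + np.sin(theta) * sx

def E(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi>, computed from the entangled state alone."""
    return np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi)

# Standard CHSH settings giving the maximal quantum value
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2), above the classical CHSH bound of 2
```

For the singlet, E(a, b) = -cos(a - b), so the four terms each contribute sqrt(2)/2 and |S| reaches Tsirelson's bound 2*sqrt(2).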

Yes, perhaps the precise statement of Reichenbach's principle might not be agreed on by everyone. However, Ilja is much closer to consensus than you are - there is no widely accepted notion of cause in which quantum theory explains the correlations.
This is not what I meant to imply. I agree that it is uncommon to regard the preparation procedure as the cause of the correlations. What I'm saying is that I'm fairly sure that the majority of physicists don't know Reichenbach's principle and will reject it as soon as you tell them that it implies Lorentz violation, the exception being the rather negligible group of Bohmians.

The paper makes separate comments about Valentini's version of Bohmian Mechanics.
Ok, but Valentini's version seems to be a version that actually dares to make experimental predictions that contradict conventional quantum mechanics. I happily encourage this kind of research, since it may actually lead to an expansion of our understanding.

atyy said:
I'm pretty sure Ilja's view is the common one, or at least the one that is closer to correct. The problem with your view is that you go beyond the view that the role of quantum theory is only to predict the correlations, which is the operational view of Bohr and Heisenberg, and all who believe there is a measurement problem also agree the operational view is very satisfactory. But to go beyond that and say that quantum theory "explains" or is about "causes" and can maintain relativistic invariance is very controversial.
I explained above what my standard for an admissible explanation is. I'm not forcing anyone to adopt the same standard. However, I don't think that it is controversial to say that relativistic quantum theories maintain Lorentz invariance.

atyy said:
Almost except that Alice and Bob have no classical data. They don't really exist, and are just ghostly things in the wave function which is not real.
I don't think that one is forced to adopt such a point of view. After all, the wave function may just be a container for information about statistics of repeated identically prepared experiments.
 
  • #69
rubi said:
Well, Reichenbach's principle needs to be rejected then, if it forces us to give up a perfectly satisfactory theory. It's not like Reichenbach's principle is something that nature must necessarily obey. Nature can behave as she may and we have to accept that.
Real scientists will give up a theory if it can't be rescued in a reasonable way. And Reichenbach's principle is such a theory.
No. It corresponds nicely with de Broglie-Bohm theory. That you don't like a theory does not make it unreasonable.
rubi said:
Bohmian mechanics definitely needs a conspiracy to explain why the Lorentz violation cannot be observed (arXiv:1208.4119).
Big problem. Ok, I do not say that a fine-tuning problem is not a problem at all - it is an interesting problem worth considering, because its solution will probably give some additional insight, for example some symmetry.

But the problem in this paper does not have much to do with Lorentz symmetry - it is a general problem of a superposition: If one measures one part, the reduction of the wave function gives information about what has been measured as well as the result of the measurement - but this information remains invisible in the probabilities. Thus, the same problem appears also in non-relativistic QM if one applies the same technique.

And, it appears, dBB solves it - it is the "conspiracy" defined by quantum equilibrium, which creates a 1:1 correspondence between the probability distribution of the configuration and the wave function.

PS: Finishing reading the paper, I have seen that this has already been recognized in the paper itself, in the part where Valentini's variant is considered.
rubi said:
And introducing an ether with all its consequences when there is really no need for it is just not reasonable.
What would these horrible consequences be, such that it is preferable to give up essential fundamental concepts like Reichenbach's principle?

rubi said:
Religious beliefs like yours have no place in science.
...
It would really be helpful if you didn't randomly mention all these subjects that you clearly don't really understand as if it were in favour of your argument.
...
If you don't agree that this is the definition of Lorentz covariance, then I'm wasting my time here.
One should indeed wonder whether it is only a waste of time to have discussions with people who behave in such a way, so I have deleted my answers to the remaining points, leaving only those where I'm interested enough to find out whether you have some arguments or not.
 
  • #70
rubi said:
It is well known that QM doesn't satisfy Bell's locality criterion ("Einstein causality"). That doesn't mean it is not causal. The cause for the correlations is of course that the quantum system has been prepared in a state that results in those correlations. If you prepare the system in a different state, you will not see the same correlations. This is a perfect cause and effect relationship.

I don't think that that's the full story, for the same reason that Einstein, Rosen and Podolsky said. Consider an EPR-type twin-pair experiment where Alice and Bob decide ahead of time to choose the same orientation for their spin (or polarization) measurements. Furthermore, suppose that Alice performs her measurement slightly before Bob performs his. Then in the time between those measurements, Alice knows for certain what result Bob will get. So, if Alice were to describe the state of affairs near Bob's detector, then she would have to describe it by a density matrix, or probability distribution, or wave function, or whatever that gave 100% chance for certain outcomes and 0% chance for other outcomes. That's a different state than she would have used a moment before. So the state that she ascribes to Bob's detector/particle changes discontinuously.

That doesn't mean anything nonlocal is going on, if the state that Alice ascribes to Bob is subjective. But it isn't just subjective. Whether anyone else besides Alice knows it, it is certain what result Bob will get (unless you want to get many-worlds about it).
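stevendaryl's certainty claim is easy to verify numerically: conditioning the singlet on Alice's z-result leaves Bob's z-result fully determined. A small numpy sketch (the basis conventions here are illustrative):

```python
import numpy as np

# Singlet |psi> = (|01> - |10>)/sqrt(2), basis ordering |Alice, Bob>
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

# Alice measures spin-z and happens to get "up": projector |0><0| (x) I
P_alice_up = np.kron(np.diag([1.0, 0.0]), np.eye(2))

post = P_alice_up @ psi
post = post / np.linalg.norm(post)  # the updated state Alice now ascribes

# Probability that Bob's spin-z result is "down", given Alice's "up"
P_bob_down = np.kron(np.eye(2), np.diag([0.0, 1.0]))
p = np.real(post.conj() @ P_bob_down @ post)
print(p)  # 1.0 -- Bob's outcome is certain from Alice's perspective
```

The interpretational dispute is precisely over whether this discontinuous jump from the 50/50 marginal to the certain conditional state is a mere bookkeeping update or a physical change near Bob.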
 
  • #71
rubi said:
I pretty much completely agree with the operational view. I might just have a different standard for what I consider a possible explanation. For me, a theory that describes every aspect of a phenomenon accurately, is already a possible explanation. You seem to additionally require an explanation to be philosophically pleasing. I also prefer philosophically pleasing models, but for me it is not a necessary condition for an explanation.

I'm not requiring anything pleasing. In fact, I think dBB is very ugly and Copenhagen is very beautiful. What I'm saying is that in the ordinary use of the word, an explanation or a cause must be something real. So if one considers the wave function to be an explanation or a cause, then one is considering it to be real. Almost everyone agrees that if the wave function is real, then there is manifest violation of Lorentz invariance, which can be particularly clearly seen by the wave function collapse.

rubi said:
I'm pretty sure that if you are not going to collapse the state anyway, i.e. you are just using it as a tool that encodes available information, you can just apply a Lorentz transform to it to get an equivalent description in any other inertial frame. The unitarity of the transformation ensures that all predictions must be equivalent.
But my point really wasn't about a no-collapse interpretation. What I'm saying is that even in plain Copenhagen with collapse, the probabilities that lead to a Bell inequality violation are calculated using only the pre-collapsed state, so it is really the entanglement and not the collapse, which causes the violation.

Yes, I understood that. I was just making a minor side point.

rubi said:
This is not what I meant to imply. I agree that it is uncommon to regard the preparation procedure as the cause of the correlations. What I'm saying is that I'm fairly sure that the majority of physicists don't know Reichenbach's principle and will reject it as soon as you tell them that it implies Lorentz violation, the exception being the rather negligible group of Bohmians.

The more usual way to say it in physics, which I don't think is controversial, is:
(1) QM has a measurement problem (of course one can deny this, but many do not, including Landau & Lifshitz, Dirac, Weinberg, Bell, Tsirelson, all Everettians etc)
(2) The measurement problem can potentially be solved by hidden variables
(3) Bell's theorem says that any hidden variable solution of the measurement problem will be nonlocal.

rubi said:
Ok, but Valentini's version seems to be a version that actually dares to make experimental predictions that contradict conventional quantum mechanics. I happily encourage this kind of research, since it may actually lead to an expansion of our understanding.

Yes, of course, the whole point of the measurement problem is that it potentially points to new physics - Dirac explicitly says this. I'm pretty sure Ilja is thinking of Valentini's version of dBB when he says dBB, I think most people do.

rubi said:
I explained above what my standard for an admissible explanation is. I'm not forcing anyone to adopt the same standard. However, I don't think that it is controversial to say that relativistic quantum theories maintain Lorentz invariance.

It isn't controversial that the predictions of quantum theory are Lorentz invariant, i.e. at the operational level. But beyond that, looking for QM to "explain", one runs into problems with Lorentz invariance.

rubi said:
I don't think that one is forced to adopt such a point of view. After all, the wave function may just be a container for information about statistics of repeated identically prepared experiments.

Of course one is not forced to adopt such a point of view, I was just bringing up the minor side point that one can do so and save locality (EPR themselves mentioned this).
 
  • #72
rubi said:
I might just have a different standard for what I consider a possible explanation. For me, a theory that describes every aspect of a phenomenon accurately, is already a possible explanation. You seem to additionally require an explanation to be philosophically pleasing. I also prefer philosophically pleasing models, but for me it is not a necessary condition for an explanation.
In principle, ok. I would only make the methodological point that having more open scientific problems is in no way harmful. On the contrary, it is even preferable, because these are more places where we can make some progress.

Instead, "solving" problems by accepting what has been reached as satisfactory is not a good idea. If people had accepted classical thermodynamics simply as a field theory, as we accept the SM today, we might not know even today that atoms exist.
rubi said:
What I'm saying is that I'm fairly sure that the majority of physicists don't know Reichenbach's principle and will reject it as soon as you tell them that it implies Lorentz violation, the exception being the rather negligible group of Bohmians.
Here I agree. And I see this as evidence that, contrary to your claim, religious beliefs have a very big place in science. The rejection of a hidden preferred frame is something more or less acceptable, everything else being equal. But if one prefers it even when one has to give up such fundamental scientific principles as causality and realism, I'm no longer able to see rational explanations. What remains is some quasi-religious belief: the ether is simply anathema.
 
  • #73
Ilja said:
Here I agree. And I see this as evidence that, contrary to your claim, religious beliefs have a very big place in science. The rejection of a hidden preferred frame is something more or less acceptable, everything else being equal. But if one prefers it even when one has to give up such fundamental scientific principles as causality and realism, I'm no longer able to see rational explanations. What remains is some quasi-religious belief: the ether is simply anathema.

rubi made an exception for Valentini in his comments in post #68.
 
  • #74
atyy said:
rubi made an exception for Valentini in his comments in post #68.
I have seen. And the point is a quite reasonable one:

Ok, but Valentini's version seems to be a version that actually dares to make experimental predictions that contradict conventional quantum mechanics. I happily encourage this kind of research, since it may actually lead to an expansion of our understanding.
This is what I say all the time: the point of considering different interpretations is that they are starting points for different directions of development of the theory. It is quite natural that an interpretation leads to problems. And the way to solve problems is to modify the theory. Thus, the positivistic rejection of the consideration of interpretations is harmful for science, because it kills whole directions of possible theory development.

I have developed my ether theories following the same scheme. I started with an interpretation of GR. Then I solved a problem - that there was no Euler-Lagrange equation for the preferred coordinates - and the result was already a theory different from GR: http://arxiv.org/abs/gr-qc/0205035

Or take the standard model - initially all I wanted was to obtain the SM. The only way I managed to incorporate Dirac fermions was in pairs (interpreted as electroweak) together with a scalar field. Thus, the ether model http://arxiv.org/abs/0908.0591 already contains some CDM candidates.

And in http://arxiv.org/abs/1101.5774 I consider the Wallstrom objection against some interpretations of QM, in particular Nelsonian stochastics. And, it seems, a solution of this problem can also be found by a modification of the theory.
 
  • #75
Ilja said:
No. It corresponds nicely with de Broglie-Bohm theory. That you don't like a theory does not make it unreasonable.
I didn't say that it was a priori unreasonable. I'm just saying that it is a perfectly valid point of view to prefer its rejection over the alternatives. Reichenbach's principle may or may not be realized by nature. We must take the alternative seriously, especially if it leads to a simpler theory.

Big problem. Ok, I do not say that a fine-tuning problem is not a problem at all - it is an interesting problem worth considering, because its solution will probably give some additional insight, for example some symmetry.

But the problem in this paper does not have much to do with Lorentz symmetry - it is a general problem of a superposition: If one measures one part, the reduction of the wave function gives information about what has been measured as well as the result of the measurement - but this information remains invisible in the probabilities. Thus, the same problem appears also in non-relativistic QM if one applies the same technique.

And, it appears, dBB solves it - it is the "conspiracy" defined by quantum equilibrium, which creates a 1:1 correspondence between the probability distribution of the configuration and the wave function.

PS: Finishing reading the paper, I have seen that this has already been recognized in the paper itself, in the part where Valentini's variant is considered.
This is not what I meant. My point is that Bohmian mechanics predicts the existence of many additional entities and shields them from the observer in such a way that he cannot predict more about them than ordinary quantum theory can. It includes action at a distance but doesn't allow superluminal communication. It is hard to believe that nature has such a rich ontology, yet an observer cannot access any of the additional information or use the action at a distance for superluminal communication. Since there is already a theory that works without any such assumptions and is at the same time simpler to use, I find it natural to reject the Bohmian theory. I acknowledge that Valentini's theory allows for superluminal communication and deserves to be tested, but it still seems much more convoluted than ordinary QM for me to jump onto it before experiments disprove ordinary QM.

What would these horrible consequences be, such that it is preferable to give up essential fundamental concepts like Reichenbach's principle?
I don't consider Reichenbach's principle essential or fundamental. It is just one possible principle that may or may not be true, and it doesn't seem like we gain much by accepting it. On the other hand, our best theories are all relativistically covariant, and it would be a big problem to explain why all our theories are relativistically covariant when nature really isn't. I would rather accept the violation of a principle that doesn't need to be realized anyway than overthrow basically all of modern physics, especially if there is no evidence that anything can be gained by that.

One should indeed wonder whether it is only a waste of time to have discussions with people who behave in such a way, so I have deleted my answers to the remaining points, leaving only those where I'm interested enough to find out whether you have some arguments or not.
You were the one who started being disrespectful.

stevendaryl said:
I don't think that that's the full story, for the same reason that Einstein, Rosen and Podolsky said. Consider an EPR-type twin-pair experiment where Alice and Bob decide ahead of time to choose the same orientation for their spin (or polarization) measurements. Furthermore, suppose that Alice performs her measurement slightly before Bob performs his. Then in the time between those measurements, Alice knows for certain what result Bob will get. So, if Alice were to describe the state of affairs near Bob's detector, then she would have to describe it by a density matrix, or probability distribution, or wave function, or whatever that gave 100% chance for certain outcomes and 0% chance for other outcomes. That's a different state than she would have used a moment before. So the state that she ascribes to Bob's detector/particle changes discontinuously.

That doesn't mean anything nonlocal is going on, if the state that Alice ascribes to Bob is subjective. But it isn't just subjective. Whether anyone else besides Alice knows it, it is certain what result Bob will get (unless you want to get many-worlds about it).
Alice is performing a Bayesian update of her knowledge. It doesn't influence Bob's state, as he is using his own copy of the state for the description of the system, and it produces consistent statistics.

atyy said:
What I'm saying is that in the ordinary use of the word, an explanation or a cause must be something real. So if one considers the wave function to be an explanation or a cause, then one is considering it to be real. Almost everyone agrees that if the wave function is real, then there is manifest violation of Lorentz invariance, which can be particularly clearly seen by the wave function collapse.
The explanation isn't the wave function, but the preparation procedure. If I prepare a system in a state that can only produce 100% correlations (given a certain alignment of the detectors), I shouldn't be surprised if I find 100% correlations. One can still consider the wave-function as a container of information. Yes, I agree that this still violates Reichenbach's principle, but if I register 100% correlation if and only if I prepare the system in a state that can only produce 100% correlation, then it seems hard for me to deny a cause and effect relationship here, even if it is not consistent with Reichenbach's principle. It would rather give me the impression that Reichenbach's principle doesn't fully capture the notion of causality.

The more usual way to say it in physics that I don't think is controversial is
(1) QM has a measurement problem (of course one can deny this, but many do not, including Landau & Lifshitz, Dirac, Weinberg, Bell, Tsirelson, all Everettians etc)
(2) The measurement problem can potentially be solved by hidden variables
(3) Bell's theorem says that any hidden variable solution of the measurement problem will be nonlocal.
I agree that this is uncontroversial, but it cannot be used to argue against locality.

Ilja said:
In principle, OK. I would make only the methodological point that having more open scientific problems is in no way harmful. On the contrary, it is even preferable, because these are more places where we can make progress.

Instead, "solving" problems by accepting what is reached as satisfactory is not a good idea. If people would have accepted classical thermodynamics simply as a field theory, as we accept today the SM, we possibly would not know even today that atoms exist.
I have no problem with research in that direction. But if you demand that your views be respected by others, you should also respect other people's views.

Here I agree. And I see this as evidence that, contrary to your claim, religious beliefs have a very big place in science. The rejection of a hidden preferred frame is more or less acceptable, everything else being equal. But if one prefers it even at the cost of giving up such fundamental scientific principles as causality and realism, I'm no longer able to see a rational explanation. What remains is some quasi-religious belief: the ether is simply anathema.
There are perfectly good reasons to reject an ether theory if it doesn't yield any immediate gain while only making the theory more complicated. I'm sure that if you managed to describe some new experimental result with Bohmian mechanics, people would switch to your theory just as quickly as they rejected Reichenbach's principle.
 
  • #76
rubi said:
Alice is performing a Bayesian updating of her knowledge.

That's the subjective view that I was talking about, and I don't think that that makes sense. If it's just an updating of her subjective knowledge, then whatever she discovers to be true about Bob's situation by performing her measurement was also true (although she didn't know it) BEFORE her measurement.
 
  • #77
stevendaryl said:
I don't think that that's the full story, for the same reason that Einstein, Rosen and Podolsky said. Consider an EPR-type twin-pair experiment where Alice and Bob decide ahead of time to choose the same orientation for their spin (or polarization) measurements. Furthermore, suppose that Alice performs her measurement slightly before Bob performs his. Then in the time between those measurements, Alice knows for certain what result Bob will get. So, if Alice were to describe the state of affairs near Bob's detector, then she would have to describe it by a density matrix, or probability distribution, or wave function, or whatever that gave 100% chance for certain outcomes and 0% chance for other outcomes. That's a different state than she would have used a moment before. So the state that she ascribes to Bob's detector/particle changes discontinuously.

That doesn't mean anything nonlocal is going on, if the state that Alice ascribes to Bob is subjective. But it isn't just subjective. Whether anyone else besides Alice knows it, it is certain what result Bob will get (unless you want to get many-worlds about it).

The correlations at multiples of 45 degrees, which include "same orientation" of course, can be explained by local variables in a classical sort of way like the red and blue socks.
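To see why perfect anticorrelation at identical settings is classically unproblematic, here is a minimal sketch (not from the thread, and deliberately the simplest possible hidden-variable model): a single shared coin flip at the source predetermines both outcomes.

```python
import random

def sock_pair():
    """One shared 'hidden variable' (a coin flip at the source) fixes both
    outcomes in advance; they are anticorrelated by construction, like
    Bertlmann's red and blue socks."""
    s = random.choice([+1, -1])
    return s, -s  # Alice's and Bob's predetermined results

# With identical detector settings, this purely local model reproduces the
# perfect anticorrelation E = -1 that QM predicts for the singlet state.
n = 10_000
E = sum(a * b for a, b in (sock_pair() for _ in range(n))) / n
print(E)  # -1.0 on every run, since a*b = -1 for every pair
```

Of course, as the surrounding posts note, such a model only covers this special choice of settings; at generic angle combinations Bell/CHSH rules out every model of this kind.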
 
  • #78
Derek Potter said:
The correlations at multiples of 45 degrees, which include "same orientation" of course, can be explained by local variables in a classical sort of way like the red and blue socks. edit - hang on a tick, I'm just adding something :)

But Bell's inequality shows that that explanation isn't true.
 
  • #79
Since this thread seems to have morphed into a discussion of whether a principle proposed by Hans Reichenbach - a philosopher of science - is believed by most physicists, it seems it would now more appropriately belong in the philosophy sub-forum. Does physicsforums still have one?

Practising physicists can practise perfectly well with or without a belief in Reichenbach's 'common cause'. Since (1) science is not a democracy and (2) such a belief has no impact on the actual science, guessing about whether a majority believe in such a principle seems pretty irrelevant. One might as well ask whether a majority of physicists believe in God.

I'd also like to point out that people will expostulate for ages on 'causes' without ever pausing to think whether they really know what they mean by the word 'cause'. It is one of the most vague, misused and unnecessary words in the philosophical lexicon, and discussions become much clearer if we discard it altogether.

Here's an essay from Bertrand Russell pointing this out, from about 100 years ago:
http://www.jstor.org/stable/4543833?seq=1#page_scan_tab_contents
http://www.scribd.com/doc/269810250/Russell-On-the-Notion-of-Cause#scribd [alternate link]

And here's one I wrote a few years ago that made a similar point (I wasn't aware of Russell's essay at the time). It proposes a formal definition that I think is both well-defined and matches reasonably well the naive, folk notion of cause.
https://sageandonions.wordpress.com...-to-distil-clarity-from-a-very-muddy-concept/
 
  • #80
rubi said:
The explanation isn't the wave function, but the preparation procedure. If I prepare a system in a state that can only produce 100% correlations (given a certain alignment of the detectors), I shouldn't be surprised if I find 100% correlations. One can still consider the wave-function as a container of information. Yes, I agree that this still violates Reichenbach's principle, but if I register 100% correlation if and only if I prepare the system in a state that can only produce 100% correlation, then it seems hard for me to deny a cause and effect relationship here, even if it is not consistent with Reichenbach's principle. It would rather give me the impression that Reichenbach's principle doesn't fully capture the notion of causality.

But it is not true that you register 100% correlation if and only if you prepare the state in a certain way. The measurement settings of Alice and Bob are also needed to get the 100% correlation.

rubi said:
I agree that this is uncontroversial, but it cannot be used to argue against locality.

Why not, since you do agree that any hidden variables approach to solving the measurement problem must be nonlocal (or retrocausal or superdeterministic, with the other usual caveats)? Unless you are using another definition of local, i.e. no superluminal signalling? That is fine, and quantum theory is certainly local by that operational definition. But in trying to solve the measurement problem, we have to go beyond operational quantum theory, in which case it is the locality of classical relativity that is important, since that is a version of relativity that does not have a measurement problem.

My point of view is that there are two notions of locality, and Bell's theorem is important for both of them. For the operational point of view locality means no superluminal signalling, and Bell's theorem guarantees that quantum mechanics is operationally random if no one can signal superluminally.

From the point of view of the measurement problem, the notion of locality is classical relativistic causality, since that is a version of special relativity without a measurement problem. Here Bell's theorem guarantees that that is gone, so we have to solve the measurement problem by nonlocal hidden variables, retrocausation, many-worlds, superdeterminism, or something more drastic.

So quantum theory is both local and nonlocal, according to different definitions of locality.
 
  • #81
stevendaryl said:
But Bell's inequality shows that that explanation isn't true.
The CHSH inequality is
−2 ≤ E(a, b) − E(a, b′) + E(a′, b) + E(a′, b′) ≤ +2
I don't know how you can apply the CHSH inequality when you only specify a single pair of angles - the same angle at that.
On a simple level we would take a = b = a′ = b′. Alice's observation of perfect anti-correlation then gives a value of −2, which does not violate the inequality.
If you have some other way of plugging numbers into the expression, I'd like to know what it is.
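For concreteness, here is a minimal sketch (not from the thread) of the arithmetic in both cases, using the quantum singlet-state prediction E(a, b) = −cos(a − b):

```python
import math

def E(a, b):
    """QM prediction for the singlet-state correlation: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

def chsh(a, ap, b, bp):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

# Degenerate choice a = b = a' = b' (all zero): S = -2, no violation.
print(chsh(0, 0, 0, 0))        # -2.0

# Standard CHSH angles (a = 0, a' = 90 deg, b = 45 deg, b' = 135 deg):
# S = -2*sqrt(2), which violates the classical bound |S| <= 2.
d = math.pi / 4
print(chsh(0, 2 * d, d, 3 * d))  # approx. -2.828
```

With all four settings equal, the CHSH expression sits exactly at the classical bound of −2, while the standard angle choice reaches −2√2 ≈ −2.83, so the violation only appears once distinct settings are compared.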
 
  • #82
andrewkirk said:
Since this thread seems to have morphed into a discussion of whether a principle proposed by Hans Reichenbach - a philosopher of science - is believed by most physicists, it seems it would now more appropriately belong in the philosophy sub-forum.

No, that is actually incidental to the discussion. The real issue is whether the only definition of locality that matters is "no superluminal signalling". No one is arguing that that is not an important sense, nor that quantum mechanics is not local by that definition. What is being argued is that that is not the only definition of locality that matters, because quantum mechanics has a measurement problem - one is certainly entitled to say that there is no measurement problem - however, many physicists, including Landau & Lifshitz, Dirac, Weinberg, Bell, Tsirelson, and all Everettians have agreed there is a problem.
 
  • #83
Derek Potter said:
The CHSH inequality is
−2 ≤ E(a, b) − E(a, b′) + E(a′, b) + E(a′, b′) ≤ +2
I don't know how you can apply the CHSH inequality when you only specify a single pair of angles - the same angle at that.

I don't understand your point. We've already proved (or Bell already proved) that EPR violates Bell's inequality. So why do I need to prove it again?
 
  • #84
stevendaryl said:
I don't understand your point. We've already proved (or Bell already proved) that EPR violates Bell's inequality. So why do I need to prove it again?

To elaborate a little more, you're right, in the case in which Alice and Bob agree ahead of time to use the same orientation, that particular experiment has a local hidden-variables explanation for its result. But we have other experiments that have shown that there are no local hidden variables involved. It doesn't make sense to explain a result in terms of some hypothetical entities (local hidden variables) which OTHER experiments have shown do not exist.
 
  • #85
andrewkirk said:
And here's one I wrote a few years ago that made a similar point (I wasn't aware of Russell's essay at the time). It proposes a formal definition that I think is both well-defined and matches reasonably well the naive, folk notion of cause.
https://sageandonions.wordpress.com...-to-distil-clarity-from-a-very-muddy-concept/

That's an interesting essay. Do you know http://www.cs.ucla.edu/~eb/r384-lnai.pdf?
 
  • #86
bhobba said:
I gave a model that specifically rejects counter-factual definiteness and predicts the violation of Bell's inequality. Obviously your assertion is wrong.

Oh - I nearly forgot to mention - I make no claim about locality because I don't believe locality applies to correlated systems. But if you do, by a suitable definition of locality, you can reject CFD and keep locality.
My requirement was that the model has to be local. As you don't like the word "locality" (or like to associate an uncommon concept with it), let me replace "locality" with "factorizability".
 
  • #87
stevendaryl said:
To elaborate a little more, you're right, in the case in which Alice and Bob agree ahead of time to use the same orientation, that particular experiment has a local hidden-variables explanation for its result. But we have other experiments that have shown that there are no local hidden variables involved. It doesn't make sense to explain a result in terms of some hypothetical entities (local hidden variables) which OTHER experiments have shown do not exist.
OK, but the point is obscured if you illustrate non-locality with an example where non-locality is not needed :)
 
  • #88
RUTA said:
Here is an example http://www.ijqf.org/archives/2402. Also, note that it is a realist theory without CFD.
There are a number of things to say about your proposed model and the RB interpretation in general.
First, the entanglement model is not worked out. On pp. 154-155 the setup is described, and then, when it would be time to introduce a particular configuration of "spacetimesource elements" and show how one arrives at the expected result, there is some handwaving instead.
Second, there was a requirement that the model has to be local (factorizable, in case bhobba reads this). But as I understand it, the relations that are fundamental in this model are non-local, right?
Third, to me the AGC seems like a cheat (read: a non-scientific explanation). Is there some motivation for why it is reasonable to introduce the AGC?
And fourth, to me it seems that switching from worldlines to relations as the fundamental entities is such a philosophically fundamental and extremely radical change that it steps outside the domain of science.
 
  • #89
rubi said:
I didn't say that it was a priori unreasonable. I'm just saying that it is a perfectly valid point of view to prefer its rejection over the alternatives. Reichenbach's principle may or may not be realized by nature. We must take the alternative seriously, especially if it leads to a simpler theory.
If the only gain in simplicity is similar to the gain thermodynamics would have reached by rejecting the atomic hypothesis and being interpreted as a field theory, I would disagree that a "simpler theory" is an advantage. It would be better to care about predictive and explanatory power. But I would not question that preferring an alternative should be allowed in science.

rubi said:
My point is that Bohmian mechanics predicts the existence of many additional entities and shields them from the observer in such a way that he cannot predict more about them than ordinary quantum theory can. It includes action at a distance but doesn't allow superluminal communication. It is hard to believe that nature has such a rich ontology, yet an observer cannot access any of the additional information or use the action at a distance for superluminal communication. Since there is already a theory that works without any such assumptions and is at the same time simpler to use, I find it natural to reject the Bohmian theory. I acknowledge that Valentini's theory allows for superluminal communication and deserves to be tested, but it still seems much too convoluted compared with ordinary QM for me to jump onto it before experiments disprove ordinary QM.
I do not think the dBB ontology is nice. But if we combine the Copenhagen interpretation with the idea that there exists a unique theory of everything, we cannot really avoid having a configuration for the quantum part as well. We have access to this part of the structure in the classical part. What we can try to get rid of is the wave function part, which may be epistemological.

rubi said:
I don't consider Reichenbach's principle essential or fundamental. It is just one possible principle that may or may not be true and it doesn't seem like we gain much by accepting it. On the other hand, our best theories are all relativistically covariant and it would be a big problem to explain why all our theories are relativistically covariant when nature really isn't relativistically covariant.
That's a minor problem, and already solved, at least for the classical part, in http://arxiv.org/abs/gr-qc/0205035, where the EEP is derived for a non-covariant theory.

rubi said:
I would rather accept the violation of a principle that doesn't need to be realized anyway than overthrow basically all of modern physics, especially if there is no evidence that anything can be gained by that.
There is IMHO nothing important which has to be overthrown, except some metaphysical prejudices against hidden variables. The ether theory of http://arxiv.org/abs/gr-qc/0205035 has the EEP and the Einstein equations as a limit, and its most serious differences from GR disappear if one chooses Y&lt;0, which leaves only four massless dark matter fields and some arbitrarily small cosmological terms as the remaining difference.

The SM does not need to be overthrown either; http://arxiv.org/abs/0908.0591 is the only theory I know of which actually predicts the three generations of SM fermions, the SM gauge group, and its action on the fermions, which, I think, is a gain. But, of course, you may find the string theory landscape more attractive.

So, what are the things which have to be overthrown?
rubi said:
You were the one who started being disrespectful.
I have different memories, but let's forget it.
rubi said:
There are perfectly good reasons to reject an ether theory if it doesn't yield any immediate gain, while only making the theory more complicated. I'm sure if you managed to describe some new experimental result with Bohmian mechanics, people would be just as quickly switch to your theory as they rejected Reichenbach's principle.
Sorry, but the gains which can be reached with the ether are easy to see. The problems of quantizing gravity essentially disappear - we know how to quantize condensed matter theories. The problem of explaining why the SM is what it is is also solved in an IMHO satisfactory way, even if the model does not allow one to compute the masses. Bohmian mechanics does not have a measurement problem. My ether theories have been published for some years already, but are simply ignored. So, no, I no longer believe that people would switch to an ether theory if it gave some gains. (Not in a world where you need an independent income to do independent research, because you can be sure that it will be extremely hard to publish anything and you will never obtain a grant for this.)

And, of course, the question is: what was the immediate gain of the atomic hypothesis? For how long was atomic theory developed before it managed to obtain a new experimental result?
 
  • #90
zonde said:
My requirement was that the model has to be local. As you don't like the word "locality" (or like to associate an uncommon concept with it), let me replace "locality" with "factorizability".

It's compatible with the definition of locality, i.e. from the previously linked paper:
'Let us define a “local” theory as a one where the outcomes of an experiment on a system are independent of the actions performed on a different system which has no causal connection with the first. For example, the temperature of this room is independent on whether I choose to wear purple socks today. Einstein’s relativity provides a stringent condition for causal connections: if two events are outside their respective light cones, there cannot be any causal connection among them.'

I don't deny the existence of locality - I am simply saying it doesn't apply to correlated systems, because by the definition of correlation, if things are correlated, what is done in one system is related to what goes on in the other. Include it in locality if you like. The issue, however, is that if you want to keep counter-factual definiteness you must allow superluminal signalling, which specifically makes it non-local:
http://drchinese.com/David/Bell_Theorem_Easy_Math.htm
'But there was a price to pay for such this experimental setup: we must add a SECOND assumption. That assumption is: A measurement setting for one particle does not somehow affect the outcome at the other particle if those particles are space-like separated. This is needed because if there was something that affected Alice due to a measurement on Bob, the results would be skewed and the results could no longer be relied upon as being an independent measurement of a second attribute. This second assumption is called "Bell Locality" and results in a modification to our conclusion above. In this modified version, we conclude: the predictions of any LOCAL Hidden Variables theory are incompatible with the predictions of Quantum Mechanics. Q.E.D. '

Thanks
Bill
 
