Do Bell and PBR together point toward nonlocal reality?

Summary
The discussion explores the implications of the Bell and PBR theorems for the nature of reality and locality in quantum mechanics. The Bell theorem shows that locality and realism cannot both hold in quantum mechanics, while the PBR theorem asserts that quantum states are ontic (real) rather than merely states of knowledge. Participants debate whether these theorems suggest that only locality is flawed, or whether they indicate a need for a nonlocal interpretation of reality. Some argue that the definitions of "reality" in the two theorems differ significantly, complicating their relationship. Ultimately, the conversation highlights ongoing tensions between various interpretations of quantum mechanics and their philosophical implications regarding realism and locality.
  • #61
stevendaryl said:
Take an example of a single particle in some kind of potential well. The standard quantum approach is that the wave function gives a probability distribution on positions of the particle, and if you perform a measurement to find out the particle's location, then the wave function collapses to something sharply peaked at the observed location. A second measurement will have probabilities given by the collapsed wave function, not the original.

But in the Bohm model, the particle always has a definite position. So what is the relationship between the wave function and the particle's position? When you detect the particle at some location, does the wave function collapse to a sharply peaked one? If so, what is the mechanism for this? Presumably, this means that there is an interaction between the detector and the wave function, but such an interaction goes beyond ordinary quantum mechanics, it seems to me. I don't see that they are equivalent.

The relationship between wave function and configuration is that the configuration q(t) follows the guiding equation defined by the wave function.

There is an interaction between the detector and the particle, and this interaction has to be described by the dBB theory for the whole system. This may be impossible in practice but is unproblematic conceptually. There is a wave function of the combined system \Psi(q_{sys},q_{env},t), which follows a Schrödinger equation, and configurations q_{sys}(t), q_{env}(t), which follow the guiding equation. The point is that there is also an effective wave function of the system, defined simply by
\psi(q_{sys},t)=\Psi(q_{sys},q_{env}(t),t),
which follows the Schrödinger equation of the system alone whenever there is no interaction between the system and the environment. But during the interaction, the effective wave function does not follow the Schrödinger equation for the system alone. Instead, its evolution describes the collapse of the wave function. The final result of this process depends on the configuration of the measurement device q_{env}(t_{fin}), or, in other words, on the measurement result which we see.

In some sense this goes beyond QM, indeed: QM does not describe the measurement process. But everything QM tells us is recovered. The wave function collapses, and the resulting effective wave function of the system is uniquely defined by the result of the measurement. The resulting probabilities can be computed correctly, using the same QM formulas, if one assumes that the initial state of the whole system is \Psi(q_{sys},q_{env})=\psi(q_{sys})\psi_{env}(q_{env}) and that it is in quantum equilibrium.
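To make the guiding equation concrete, here is a minimal numerical sketch for a single particle in one dimension. The freely spreading Gaussian packet, the choice ħ = m = 1, and all step sizes are illustrative assumptions, not anything fixed by the discussion above.

```python
import numpy as np

# Minimal sketch of the dBB guiding equation dQ/dt = (hbar/m) Im(psi'/psi) for
# one particle in 1D, with hbar = m = 1 and a freely spreading Gaussian packet.

def psi(x, t, sigma0=1.0):
    # Analytic free Gaussian packet centered at x = 0 (the normalization
    # cancels in the velocity, but is kept for completeness).
    s = sigma0 * (1 + 1j * t / (2 * sigma0**2))
    return (2 * np.pi * s**2) ** -0.25 * np.exp(-x**2 / (4 * sigma0 * s))

def velocity(x, t, h=1e-5):
    # Guiding equation: v = Im( (dpsi/dx) / psi ), derivative by central difference.
    dpsi = (psi(x + h, t) - psi(x - h, t)) / (2 * h)
    return np.imag(dpsi / psi(x, t))

# Integrate one configuration Q(t) with a simple Euler step.
Q, dt = 0.5, 1e-3
for n in range(5000):
    Q += velocity(Q, n * dt) * dt

# Analytically Q(t) = Q(0) * sigma(t)/sigma0 = 0.5 * sqrt(1 + (t/2)^2), about 1.35 at t = 5.
print("Q(t=5) =", round(Q, 3))
```

The trajectory simply rides the spreading packet, which is the known analytic result for this wave function.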
 
  • #62
Ilja said:
There is an interaction between the detector and the particle, and this interaction has to be described by the dBB theory for the whole system. This may be impossible in practice but is unproblematic conceptually. There is a wave function of the combined system \Psi(q_{sys},q_{env},t), which follows a Schrödinger equation, and configurations q_{sys}(t), q_{env}(t), which follow the guiding equation.

I understand that, but without at least an argument showing that the interaction of the detector with the wave function will cause an apparent collapse of the (effective single-particle) wave function to a sharply peaked delta function, I don't think you can say with certainty that the Bohm approach is empirically equivalent to the usual approach.
 
  • #63
kith said:
The wave function and the potential are regarded as ontic in the PBR sense. So where does the epistemicity, which is reflected by the probabilities, come from? Or, speaking in terms of classical mechanics: we seem to have an equation for a state of knowledge ρ which describes the motion of some particles in an ontic potential V(ρ).

The state of knowledge is introduced by the notion of quantum equilibrium.

You have a bottle, which is ontic. You put some water into the bottle. It can now move in the bottle in a quite arbitrary way. But there is a somehow preferred state where the water is in "equilibrium", at rest. This equilibrium is clearly defined by the form of the bottle.

In a similar way, an epistemic probability distribution ρ(q) can be arbitrary, and the dBB equations tell us how it changes in time. But there is a special probability distribution, the quantum equilibrium \rho(q)=|\psi(q)|^2, which is preferred: once the distribution is initially in this equilibrium, it remains there.
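Here is a small numerical illustration of that equivariance property, under assumptions of my own choosing (a superposition of the two lowest infinite-square-well states, ħ = m = 1, crude Euler transport): an ensemble that starts distributed as |\psi|^2 is still distributed as |\psi(t)|^2 after being carried along by the guiding equation.

```python
import numpy as np

# Equivariance sketch: start an ensemble distributed as |psi|^2 and transport
# it with the guiding equation; it should still match |psi(t)|^2 afterwards.
# Illustrative system: superposition of the two lowest infinite-square-well
# states on [0, 1], hbar = m = 1, so E_n = (n*pi)^2 / 2.

E1, E2 = np.pi**2 / 2, (2 * np.pi)**2 / 2

def psi(x, t):
    phi1 = np.sqrt(2) * np.sin(np.pi * x) * np.exp(-1j * E1 * t)
    phi2 = np.sqrt(2) * np.sin(2 * np.pi * x) * np.exp(-1j * E2 * t)
    return (phi1 + phi2) / np.sqrt(2)

def velocity(x, t, h=1e-6):
    # Guiding equation v = Im( (dpsi/dx) / psi ).
    return np.imag((psi(x + h, t) - psi(x - h, t)) / (2 * h) / psi(x, t))

rng = np.random.default_rng(0)
x0 = rng.uniform(0, 1, 200_000)                    # rejection-sample |psi(x,0)|^2
Q = x0[rng.uniform(0, 4, x0.size) < np.abs(psi(x0, 0.0))**2]

dt, T = 1e-4, 0.3
for n in range(int(T / dt)):                        # Euler transport of the ensemble
    Q = np.clip(Q + velocity(Q, n * dt) * dt, 1e-9, 1 - 1e-9)

hist, edges = np.histogram(Q, bins=10, range=(0, 1), density=True)
centers = (edges[:-1] + edges[1:]) / 2
print(np.round(hist, 2))                            # transported ensemble ...
print(np.round(np.abs(psi(centers, T))**2, 2))      # ... tracks |psi(x,T)|^2 bin by bin
```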
 
  • #64
stevendaryl said:
No, you absolutely do not have to add the second law as a fundamental law. There is no need for it, since the time-symmetric laws of physics are overwhelmingly likely to evolve a low entropy state into a higher entropy state.
Neither the Liouville nor the von Neumann equation evolves a low entropy state into a higher entropy state. So what time-symmetric laws are you referring to?
 
  • #65
stevendaryl said:
I understand that, but without at least an argument showing that the interaction of the detector with the wave function will cause an apparent collapse of the (effective single-particle) wave function to a sharply peaked delta function, I don't think you can say with certainty that the Bohm approach is empirically equivalent to the usual approach.

Of course you will not obtain δ-functions if you consider a realistic measurement process with finite energy. But any realistic description of measurements in QM has the same problem.

What is usually done in QM is to consider measurements as interactions such that \psi_s\psi_e \to \sum_i \psi_s^i\psi_e^i. If the \psi_e^i(q_e) are interpreted as macroscopic states of the measurement device after the measurement, then this translates into the condition that the \psi_e^i(q_e) don't overlap, so that if we know q_e we can uniquely identify the corresponding value i, the measurement result, because \psi_e^j(q_e)\approx 0 for all other j. But then \sum_j \psi_s^j\psi_e^j(q_e) \approx \psi_s^i\psi_e^i(q_e).
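A toy version of this argument in numbers, assuming Gaussian pointer packets (the amplitudes, centers, and widths below are invented for illustration): once the \psi_e^i are well separated, conditioning the total state on the actual pointer configuration q_e kills every branch but one, which is exactly the effective collapse described above.

```python
import numpy as np

# After the interaction the total state is sum_i c_i psi_s^i psi_e^i(q_e),
# with well-separated Gaussian pointer packets. Conditioning on the actual
# pointer configuration q_e suppresses every branch but one.

def pointer(q, center, width=0.1):
    # Normalized Gaussian pointer packet psi_e^i(q).
    return np.exp(-(q - center)**2 / (2 * width**2)) / (np.pi * width**2)**0.25

c = np.array([0.6, 0.8])               # system amplitudes, |c1|^2 + |c2|^2 = 1
centers = np.array([-5.0, 5.0])        # macroscopically distinct pointer positions

q_env = -4.9                           # the configuration the device actually has
weights = c * pointer(q_env, centers)  # branch amplitudes at this q_env
effective = weights / np.linalg.norm(weights)
print(np.round(np.abs(effective)**2, 8))   # -> [1. 0.]: only branch 1 survives
```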
 
  • #66
Thanks, Ilja. I think I get the main idea of the analogy now.

Ilja said:
But there is a special probability distribution, the quantum equilibrium \rho(q)=|\psi(q)|^2, which is preferred: once the distribution is initially in this equilibrium, it remains there.
It seems a bit strange to me that the equilibrium probability distribution follows changes in the potential instantaneously.
 
  • #67
kith said:
Unless we can derive the second law from the time-symmetric laws, we have to add it as a fundamental (directional) law in its own right. I still haven't wrapped my head around what such a derivation could look like (Demystifier has written an interesting paper with different local arrows of time, but I haven't had the time to read it in detail).

I might point out that entropy can increase from "now" in both time directions. Obviously most lab situations are special cases in which entropy is made to be unusually lower than in the surroundings. If you sampled the entropy of a typical environment in thermal equilibrium, wouldn't you expect it to be at a local minimum? I.e., the number of states it could have evolved from and the number of states it can evolve towards are both greater than now? That would be the statistical view, I believe. In a film of that, I do not believe you could discern its direction as forward or backward in any way (in contrast to the usual idea that a film of a glass breaking makes the time direction obvious).

Or alternately, think of decoherence: entanglement disperses and decreases as you go forward in time. Does that require a fundamental time asymmetric law to describe as well?
 
  • #68
kith said:
Neither the Liouville nor the von Neumann equation evolves a low entropy state into a higher entropy state. So what time-symmetric laws are you referring to?

You have to be a little careful about what you mean by "entropy" when talking about the second law. In the case of both classical phase space and quantum wave functions, there is a notion of "entropy" that is unchanged by the evolution equations. But that is not the kind of entropy that we observe to always increase. When a quantity of gas expands rapidly to fill a vacuum, that's an irreversible process, even though the volume in phase space (which is what is preserved under the Liouville equation) remains constant. We don't observe phase-space volumes; we observe that gas expands to fill a vacuum, and it never happens that all the gas in a container spontaneously gathers into a small volume, leaving vacuum behind.

The kind of entropy that we observe to increase is coarse-grained entropy. Roughly speaking, the coarse-grained entropy of a state is the log of the number of microscopic states that "look the same" under the coarse-graining. Time-symmetric laws don't imply that this notion of entropy is constant.
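A small sketch of this distinction, with all parameters invented: free particles bouncing in a box, started in a low-entropy corner. The exact ensemble evolves reversibly, but the coarse-grained (cell-counting) entropy climbs to its maximum, just as the gas-filling-a-vacuum example suggests.

```python
import numpy as np

# Free particles bounce elastically in a box of length L, starting crowded
# into the leftmost tenth. The fine-grained distribution just shears around
# reversibly, but the coarse-grained entropy over spatial cells climbs toward
# its maximum, ln(cells). All numbers are illustrative.

rng = np.random.default_rng(1)
N, L, cells = 5000, 1.0, 20
x = rng.uniform(0, L / 10, N)          # low-entropy start: everyone on the left
v = rng.normal(0, 1, N)                # thermal-ish velocities

def coarse_entropy(pos):
    p, _ = np.histogram(pos, bins=cells, range=(0, L))
    p = p[p > 0] / N
    return -np.sum(p * np.log(p))      # Shannon entropy of the cell occupations

for t in [0.0, 0.05, 0.2, 1.0]:
    pos = np.mod(x + v * t, 2 * L)     # unfold, then reflect off the walls
    pos = np.where(pos > L, 2 * L - pos, pos)
    print(f"t = {t}: S_coarse = {coarse_entropy(pos):.3f} (max = {np.log(cells):.3f})")
```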
 
  • #69
stevendaryl said:
I think that it's important to distinguish between anti-realism in the form of solipsism--"Nothing exists other than my perceptions, and theories of physics are just ways of describing regularities in those perceptions"--and in the form of the conclusion that reality is very different from how it appears. A Many-Worlds type model is very different from the reality that we perceive, but it's not throwing away the idea of reality.
I know that one can play around a lot with different notions of realism and confuse people, especially with interpretations which are not well defined, like many worlds (it presupposes a fundamental decomposition of the universe into systems without any basis, and I have not seen a satisfactory derivation of the Born rule yet).

And I have no problem if someone tries the hard job of developing a weaker notion of realism which is nonetheless powerful enough to be comparable to common-sense realism but compatible with Einstein causality and the violation of Bell's inequality. I doubt this is possible, but who knows.

My point is that there is a simple and straightforward alternative: the quite trivial assumption that the symmetry group of the subquantum theory is different from that of quantum theory, or that of quantum gravity different from that of classical gravity. No need to change a single bit of the fundamental notions of realism and causality. This is the simple, easy way out of the violation of Bell's inequality, and of a lot of other problems too.

But it is essentially forbidden. String theory publishes thousands of articles without a single empirical prediction. With the alternative approach, you can be happy if you succeed in publishing a single paper, even if you succeed in deriving the whole particle content of the standard model from simple principles applied to a quite simple model (arXiv:0908.0591), and you can be sure that nobody even looks at it, because this horrible approach requires a preferred frame.

There is a philosophical issue, here, which is to what extent should a theory of physics be as close as possible to what we directly observe. It certainly is logically possible that there could be a huge gap between the two.
Almost everything is logically possible, so that's not the point. Of course a theory closer to what we directly observe is preferable; the question is whether the competitor has other advantages.

Logically, the argument "The lack of evidence for X implies evidence against Y" requires you to establish that if Y were true, then X would follow. In the case X = changing the past, Y = time symmetric physics, it doesn't work. If you actually look at the mechanism for causality, you will see that it is ultimately about boundary conditions, not about the directionality of the laws of physics. Causality propagates in the direction of increasing entropy.
My point was not a logical proof, but that there is strong empirical evidence that there is no time symmetry in nature. That there is such an animal as a "mechanism for causality" is new to me; as far as I know, causality is fundamental, assumed as given from the start. But, I guess, we are thinking about different things named "causality".
 
  • #70
DrChinese said:
If you sampled the entropy of a typical environment, in thermal equilibrium: wouldn't you expect it to be at a local minimum?
I'm not sure I understand your post correctly. With environment you mean something like a cold vacuum in an experimental chamber which encases the system of interest?

DrChinese said:
In a film of that, I do not believe you could discern its direction as forward or backward in any way.
So this film would show the evacuation of the chamber before the experiment and the flooding with air afterwards?

DrChinese said:
Or alternately, think of decoherence: entanglement disperses and decreases as you go forward in time. Does that require a fundamental time asymmetric law to describe as well?
Obviously, I don't doubt that entanglement gets destroyed by decoherence and that a gas expands. ;-) I just don't really understand how these processes are derived from the time-symmetric laws.

stevendaryl said:
The kind of entropy that we observe to increase is coarse-grained entropy. Roughly speaking, the coarse-grained entropy of a state is the log of the number of microscopic states that "look the same" under the coarse-graining.
I never really got this distinction. The Liouville equation leaves the Shannon entropy of the probability distribution constant. What you call the coarse-grained entropy is also called the Boltzmann entropy, and I thought it was equivalent to the Shannon entropy. Maybe that's a misconception?

Phew, I think this really leads off topic.
 
  • #71
stevendaryl said:
Could you post a concise statement of PBR, or a link to such a statement? I remember reading the paper and yawning, because it didn't seem like it said anything that I didn't already know (or suspect).

[edit]Never mind, I found a good discussion here:
http://mattleifer.info/2011/11/20/can-the-quantum-state-be-interpreted-statistically/

That is the same link I posted above. :smile:

The issue about PBR and dBB vis-à-vis that link is: can two dBB wave functions overlap as shown in the Probability Density diagram? To quote: "... the question is: should we think of it as an ontic state (more like a phase space point), an epistemic state (more like a probability distribution), or something else entirely?"

The idea being that if the dBB wave function is sharply defined (as I think Ilja is saying), there can be no overlap. But that in turn contradicts the statistical spread from our unknown initial conditions. So I think that if the dBB pilot wave is to be considered real, then there is no spread of values, there are hidden variables, there is non-local determinism, and QM is incomplete. PBR, on the other hand, would say that if there are hidden variables, there must be a spread of outcomes for a particular wave state, and there will be overlap (therefore placing the theory in Group 1 and making it prohibited).

I realize that Bohmians see PBR as either neutral or a plus for their position. But I see it as either neutral or a negative. As more and more elements of dBB are developed and declared, I think there are more and more opportunities for Bohmian-class theories to run afoul of PBR in a fashion that they would not with Bell.

In other words: I agree with you that demonstrating the equivalence of QM and Bohmian-class theories is not trivial. I think the idea that Bohmian theories *automatically* reproduce all QM predictions is unjustified. Logically, there must be a lot of ways to formulate the interaction effects of particle positions, and they can't all be equivalent (and be equivalent to QM at the same time). The very fact that there are multiple versions of dBB would imply that as well. Again, I cannot say *exactly* what is wrong with the Bohmian reasoning on this, but it certainly raises a lot of questions in my mind.
 
  • #72
DrChinese said:
I might point out that entropy can increase from "now" in both time directions. Obviously most lab situations are special cases in which entropy is made to be unusually lower than in the surroundings. If you sampled the entropy of a typical environment in thermal equilibrium, wouldn't you expect it to be at a local minimum? I.e., the number of states it could have evolved from and the number of states it can evolve towards are both greater than now? That would be the statistical view, I believe. In a film of that, I do not believe you could discern its direction as forward or backward in any way (in contrast to the usual idea that a film of a glass breaking makes the time direction obvious).

Or alternately, think of decoherence: entanglement disperses and decreases as you go forward in time. Does that require a fundamental time asymmetric law to describe as well?

There is a time-symmetric model of the second law that applies to classical physics (and I assume that it can be extended to quantum mechanics, as well).

Imagine taking a human being (okay, for ethical reasons, let it be a guinea pig instead) and putting it inside an impenetrable, eternal box. No energy or matter can go in or out. Now just wait: a billion years, a trillion years, 10^{100} years, however long it takes. After a while, the guinea pig will die, decompose, and reach some kind of uninteresting equilibrium state, and its component atoms will remain in that state for an ungodly length of time. But there will always be a certain amount of random thermal motion of the atoms. Purely by chance, if you are willing to wait forever, the atoms will eventually arrange themselves into a configuration that is arbitrarily close to the original state of the guinea pig. In other words, the guinea pig will eventually come back to life, a reversal of entropy.

But over an enormous span of time, if you plot entropy as a function of time, what you will find is that:

  1. By far, the most likely configuration is the maximal possible entropy.
  2. Very rarely, the entropy dips down to a non-maximal value.
  3. In almost all such cases, the entropy returns quickly to a higher value.

[Figure: a typical plot of entropy vs time, showing rare dips of types A, B, C below the maximum]

The picture shows a typical plot of entropy vs time. Situations of type A are vastly more likely than situations of type B, which are vastly more likely than situations of type C, etc. So whatever the entropy is, if it's not the maximal value, then you are overwhelmingly likely to have higher entropy in the future, even though the graph is completely symmetric between past and future.

So the guinea pig, looking forward in such a universe, can assume the second law of thermodynamics. He will likely age, die, and decompose just as the second law predicts.

What's weird about this thought experiment is that while the guinea pig can safely assume that he will be older and more decrepit in the future, he can't assume that he was younger and in better health in the past. In this model, the most likely past for the guinea pig is one in which he is older than now. It's overwhelmingly likely that right now the guinea pig is the youngest he has been for millennia and the youngest he will be for millennia to come.
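The same point can be checked in a toy model whose dynamics has no built-in arrow of time, for instance the Ehrenfest urn model (my choice of model and all parameters are illustrative). Conditioned on a rare low-entropy moment, the state a hundred steps earlier and a hundred steps later both typically lie closer to equilibrium, matching the symmetric dips in the plot above.

```python
import numpy as np

# Ehrenfest urn model: N balls in two urns; each step one uniformly chosen
# ball switches urns. The statistics are time-reversible. Condition on rare
# low-entropy moments (large imbalance) and look 100 steps into the past and
# the future: both typically sit closer to equilibrium.

rng = np.random.default_rng(2)
N, steps = 50, 500_000
k = N // 2                                     # balls currently in the left urn
traj = np.empty(steps, dtype=np.int64)
for t in range(steps):
    k += 1 if rng.random() >= k / N else -1    # chosen ball hops to the other urn
    traj[t] = k

imbalance = np.abs(traj - N / 2)               # low entropy <=> large imbalance
dips = np.where(imbalance >= 12)[0]
dips = dips[(dips > 100) & (dips < steps - 100)]
print("low-entropy moments found:", dips.size)
print("mean imbalance at the dip  :", imbalance[dips].mean())
print("mean imbalance 100 before  :", imbalance[dips - 100].mean())
print("mean imbalance 100 after   :", imbalance[dips + 100].mean())
# Past and future of a dip look alike: the dip is a local entropy minimum.
```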
 
  • #73
kith said:
So this film would show the evacuation of the chamber before the experiment and the flooding with air afterwards?

...

Phew, I think this really leads off topic.

Probably right about off-topic... :smile:

But as to the film: Consider a film of a volume of air in equilibrium. You cannot tell forward from backward (until you reach the boundary at which it is no longer in equilibrium).

Or: Why do we see the arrow of time as forward? Is that a requirement of (the fundamental law of) increasing entropy? Or is it just a coincidence? Or maybe it has something more to do with initial conditions.
 
  • #74
Ilja said:
That there is such an animal as a "mechanism for causality" is new to me; as far as I know, causality is fundamental, assumed as given from the start. But, I guess, we are thinking about different things named "causality".

In this case, I'm using causality to mean the propagation of effects. You drop a pebble into a pool of water, and the ripples spread out away from where you dropped it. Someone seeing those ripples will likely conclude that there must have been some disturbance at the apparent source of the outgoing circular waves. But the equations describing the propagation of waves in water are time-symmetric. So it is consistent with those equations to have converging concentric waves as well as diverging waves. So how do you explain why you always see diverging waves, and never see converging waves? It has to do with boundary conditions. The outgoing waves are the only possibility that is consistent with the boundary conditions.
 
  • #75
Ilja said:
My point was not a logical proof, but that there is strong empirical evidence that there is no time symmetry in nature. That there is such an animal as a "mechanism for causality" is new to me; as far as I know, causality is fundamental, assumed as given from the start. But, I guess, we are thinking about different things named "causality".

How is causality fundamental? Measuring a non-commuting observable on a system in a known eigenstate always produces a random value. That doesn't sound like anything empirically fundamental.

So I say that causes (or influences) from the future would appear (to us) as randomness in the present. Again, I am imagining some kind of time-symmetric formulation of QM. That doesn't seem to be more of a stretch than imagining a Bohmian formulation in which all particle positions everywhere are part of the equation. At least in the TS formulation, the set of things to consider resides in a nice Einsteinian light cone (albeit in two directions).

Of course, beauty is in the eye of the beholder. :biggrin:
 
  • #76
stevendaryl said:
What's weird about this thought experiment is that while the guinea pig can safely assume that he will be older and more decrepit in the future, he can't assume that he was younger and in better health in the past. In this model, the most likely past for the guinea pig is one in which he is older than now. It's overwhelmingly likely that right now the guinea pig is the youngest he has been for millennia and the youngest he will be for millennia to come.
Yeah. In relation to the origin of the universe, this argument says it is far more likely that the observable universe was created in a fluctuation just a moment ago than in a much lower-entropy fluctuation at the Big Bang. I think this goes back to Boltzmann, and it is quite puzzling.

DrChinese said:
Or: Why do we see the arrow of time as forward? Is that a requirement of (the fundamental law of) increasing entropy? Or is it just a coincidence? Or maybe it has something more to do with initial conditions.
I see this more clearly now, thanks. I still have a gap in my understanding regarding the relation between coarse-grained entropy and microscopic theory, though.
 
  • #77
Yakir Aharonov's time-symmetric interpretation of quantum mechanics (TSQM) offers a way to explain the EPR paradox and preserve local realism. (A TSQM-based explanation of the EPR paradox was post #18 in this discussion.) Please note that Yakir Aharonov was a student of Bohm and was very familiar with the de Broglie–Bohm theory (dBB). TSQM replaces dBB and, as noted above, provides a way to explain EPR, where dBB does not. For an introduction to TSQM, see post #23. For a few of the experiments which confirmed results TSQM had uniquely predicted, see post #24. As to the "realism" question, see posts #25 and #26.
 
  • #78
Regarding the arrow of time and the second law of thermodynamics, as a “QM freak” it's easy to forget gravity.

Clearly the initial conditions in the early universe are what give the direction and the destruction of exergy. Energy can't be destroyed, but exergy can, and exergy is the fuel that drives the universe.

My guess is that the perplexity regarding T-symmetry etc. will be gone once we get a complete theory of quantum gravity, hopefully... (The thing that tickles me is the question: what if gravity was “turned on” after “matter creation”? There you have the special initial conditions in a little box! ;)
 
  • #79
kith said:
I see this more clearly now, thanks. I still have a gap in my understanding regarding the relation between coarse-grained entropy and microscopic theory, though.

The idea of how the Liouville theorem is consistent with increasing coarse-grained entropy is illustrated by the following picture: imagine a system starting out with an uncertainty given by a certain compact volume in phase space, as shown on the left. With time, that simple shape evolves into a much more complex shape, such as the one on the right. The shape has the same actual volume as it did previously. But if you coarse-grain and ignore the details of the shape, the shape on the right appears to occupy a larger volume of phase space than the one on the left. So coarse-graining (ignoring tiny details) makes the effective volume, and with it the entropy, grow.

[Figure: a compact phase-space region (left) evolving into a finely filamented region of equal volume (right)]
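A numerical version of this picture, with an area-preserving map standing in for Liouville evolution (the Arnold cat map and all parameters are my illustrative choices): the blob's exact area never changes, but the number of coarse-grained cells it touches grows rapidly.

```python
import numpy as np

# The Arnold cat map (x, p) -> (x + p, x + 2p) mod 1 has Jacobian determinant 1,
# so the blob's true area never changes; but the blob filaments, and the number
# of coarse cells it touches, a stand-in for coarse-grained entropy, grows.

rng = np.random.default_rng(3)
pts = rng.uniform(0, 0.05, size=(100_000, 2))    # compact blob, true area 0.0025

def coarse_volume(pts, n=50):
    # Fraction of the n x n coarse grid touched by the blob.
    idx = np.floor(pts * n).astype(int)
    return np.unique(idx[:, 0] * n + idx[:, 1]).size / n**2

for step in range(6):
    print(f"step {step}: coarse volume = {coarse_volume(pts):.4f}"
          "  (fine volume stays 0.0025)")
    x, p = pts[:, 0], pts[:, 1]
    pts = np.column_stack(((x + p) % 1.0, (x + 2 * p) % 1.0))
```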
 
  • #80
Ilja said:
But it is essentially forbidden. String theory publishes thousands of articles without a single empirical prediction. With the alternative approach, you can be happy if you succeed in publishing a single paper, even if you succeed in deriving the whole particle content of the standard model from simple principles applied to a quite simple model (arXiv:0908.0591), and you can be sure that nobody even looks at it, because this horrible approach requires a preferred frame.
This effect, well known to all of us working on less popular theories in physics, has more to do with sociology and psychology than with science. I have thought a lot about it and concluded that scientists are just like all other "ordinary" people. Even if they are more intelligent than average, they are not much more rational. But let me not enter into the details, because that would be off topic...
 
  • #81
DrChinese said:
How is causality fundamental?
There is no explanation in terms of anything more fundamental.

Measuring a non-commuting observable on a system in a known eigenstate always produces a random value. That doesn't sound like anything empirically fundamental.
Don't the different values of the probability also have some cause? By the way, I'm not talking about being "empirically fundamental"; that sounds like a contradiction to me. Empirical predictions are always quite complex, derived things.

So I say that causes (or influences) from the future would appear (to us) as randomness in the present. Again, I am imagining some kind of time-symmetric formulation of QM. That doesn't seem to be more of a stretch than imagining a Bohmian formulation in which all particle positions everywhere are part of the equation. At least in the TS formulation, the set of things to consider resides in a nice Einsteinian light cone (albeit in two directions).
I don't follow. In dBB it depends on a 3-dimensional configuration, in your TS formulation on two four-dimensional ones. By the way, if the future already exists, we need no causality at all: the future simply remains as it is; it does not have to change at all.
 
  • #82
Jon_Trevathan said:
Yakir Aharonov's time-symmetric interpretation of quantum mechanics (TSQM) offers a way to explain the EPR paradox and preserve local realism. (A TSQM-based explanation of the EPR paradox was post #18 in this discussion.) Please note that Yakir Aharonov was a student of Bohm and was very familiar with the de Broglie–Bohm theory (dBB). TSQM replaces dBB and, as noted above, provides a way to explain EPR, where dBB does not. For an introduction to TSQM, see post #23. For a few of the experiments which confirmed results TSQM had uniquely predicted, see post #24. As to the "realism" question, see posts #25 and #26.
Seen them and answered in #28.

Anyway, causal influence from the future violates Einstein causality too, which allows causal influences only from the past light cone. Thus Einstein causality would be dead in your approach as well. If it is not about causal influence from the future (which is how I interpret the paper), then it is about something different and irrelevant as an explanation of the violation of Bell's inequality.

For me, causal influences from the future are mystical sci-fi nonsense not worth considering seriously. To take them seriously, one would need extremely strong empirical evidence: something completely unexplainable with classical causality as in dBB. If you think otherwise, that's your choice.
 
  • #83
stevendaryl said:
But if you know the position of a particle at all times, then you know the velocity at all times (well, if the position is a differentiable function of time). Yet position and velocity are non-commuting.
Yes, but I think that's not what DrChinese had in mind.
 
  • #84
kith said:
Thanks, that's a nice point of view. I still don't understand something: both the quantum potential and the probabilities are derived from the wave function. The wave function and the potential are regarded as ontic in the PBR sense. So where does the epistemicity, which is reflected by the probabilities, come from? Or, speaking in terms of classical mechanics: we seem to have an equation for a state of knowledge ρ which describes the motion of some particles in an ontic potential V(ρ). This is hard to reconcile for me.
Let me use a simple classical analogy. Suppose that you have lost your keys in your apartment, but you have no idea in which room you have lost them. What you know is that some rooms are bigger and others are smaller. The rooms themselves and their sizes are ontic properties. Now, do these ontic properties imply some epistemic (probabilistic) properties as well? Yes, they do. You can easily conclude that the probability of finding the keys in a given room is proportional to the size of the room. It is more likely that you will find the keys in a bigger room than in a smaller one.

In Bohmian mechanics, instead of the rooms you have the wave function, and instead of a room's size you have |\psi|^2. The bigger |\psi|^2 is at a given point, the greater the probability that you will find the particle there.
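The analogy fits in a few lines of code (room names and sizes invented):

```python
# The probability of finding the keys in a room is proportional to its size,
# just as the probability of finding the particle near q goes as |psi(q)|^2.
rooms = {"hall": 30.0, "kitchen": 10.0, "closet": 2.0}   # floor areas in m^2
total = sum(rooms.values())
print({name: round(size / total, 3) for name, size in rooms.items()})
# -> {'hall': 0.714, 'kitchen': 0.238, 'closet': 0.048}
```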
 
  • #85
Thanks, stevendaryl and Demystifier. I have learned quite a bit from this thread. :-)
 
  • #86
Ilja said:
There is no explanation in terms of anything more fundamental.

Circular reasoning. You assume that which you conclude, namely that causality rules. That would more or less force you down the Bohmian path. And voilà...
 
  • #87
Ilja said:
Seen them and answered in #28.
For me, causal influences from the future are mystical sci-fi nonsense not worth considering seriously. To take them seriously, one would need extremely strong empirical evidence: something completely unexplainable with classical causality as in dBB. If you think otherwise, that's your choice.

You need to read the papers I cited.
 
  • #88
@Jon_Trevathan: I have read (and cited) one of them; what was interesting to me has been clarified. I have no interest in interpretations which use confusing time-symmetric notions to describe a time-asymmetric world.

@DrChinese: No circular reasoning, because I have never claimed that I can somehow conclude that causality has to be fundamental. In my opinion it is, and I have never seen a meaningful approach where it was non-fundamental, derived from something different. Feel free to introduce me to such an approach.
 
  • #89
An interesting paper that kind of relates to the topic of this thread:
Theorem 16 (PBR). For any preparation independent theory that reproduces (a certain set of) quantum correlations, the wavefunction is ontic.

Motivated by this, we present a weak version of Bell's theorem [3], in which we additionally assume preparation independence. The proof here is similar to that of proposition 14 and striking for its simplicity. The theorem could also be regarded as a combination of the PBR theorem and a result closely related to the following, proved in [8].

Theorem 17. Quantum mechanics is not realisable by any preparation independent, local theory.

Proof. If quantum mechanics is realisable by a preparation independent theory then, by the PBR theorem, the wavefunction is ontic with respect to that theory. We proceed by showing that there exist quantum correlations that cannot be realized by any local model for which the wavefunction is ontic...
On the Reality of Observable Properties
http://arxiv.org/pdf/1306.3216.pdf

If I'm interpreting this correctly, this is the reason why Leifer argued that using PBR we can now "infer nonlocality directly from EPR":
As emphasized by Harrigan and Spekkens, a variant of the EPR argument favoured by Einstein shows that any psi-ontic hidden variable theory must be nonlocal. Thus, prior to Bell's theorem, the only open possibility for a local hidden variable theory was a psi-epistemic theory. Of course, Bell's theorem rules out all local hidden variable theories, regardless of the status of the quantum state within them. Nevertheless, the PBR result now gives an arguably simpler route to the same conclusion by ruling out psi-epistemic theories, allowing us to infer nonlocality directly from EPR.
PBR, EPR, and all that jazz
http://www.aps.org/units/gqi/newsletters/upload/vol6num3.pdf
 
  • #90
Ilja said:
@DrChinese: No circular reasoning, because I have never claimed that I can somehow conclude that causality has to be fundamental. In my opinion it is, and I have never seen a meaningful approach where it was non-fundamental, derived from something different. Feel free to introduce me to such an approach.

Umm, Quantum Mechanics?

Perhaps you know of *something* where indeterminism (raw chance) does not play a part. Anything, actually. How about human behavior? Ever seen the slightest indication that A causes B there? The structure of the universe: what caused the Sun to be where it is and the Earth to be where it is? Anything...?

And if you even bother to mumble something about initial conditions, you will really bring a smile to my face. :smile: In fact you already have...
 
