Superdeterminism and the hidden variable

  • #1
wittgenstein
TL;DR Summary
Bell's superdeterminism seems to imply that there is a hidden variable.
Bell's superdeterminism seems to imply that there is a hidden variable. Please correct me if I am wrong, but that seems like classical physics.
 
  • #3
Superdeterminism has no physical content.
 
  • #4
wittgenstein said:
Bell's superdetermimism...
Bell's? When did he embrace superdeterminism?
... that seems like classical physics.
Of course - that's why it was invented. Superdeterminism is one way (the only way now that the other "loopholes" are closing) of reconciling classical physics with the experimentally observed violations of Bell's inequality.

Superdeterminism has a number of problems: it is not experimentally testable (as @bobob says, "no physical content") and the mechanisms it requires seem wildly implausible to most people. It is pretty much a matter of personal taste whether retaining the classical worldview is worth accepting these problems.
 
  • #5
You wrote, "and the mechanisms it requires seem wildly implausible to most people".
What are those wildly implausible requirements? Superdeterminism seems reasonable. OK, it is not experimentally testable. It is also not experimentally possible to determine whether there are unicorns outside the light cone, but I think it is reasonable to say that there are no unicorns beyond the observable universe. Do I absolutely know that there are none? NO! Science is not based on absolute certainty. Can I say that the Earth goes around the Sun? I am "ONLY" 99.99999% sure.
 
  • Like
Likes swampwiz
  • #6

I love it when he says that when students ask how such a weird thing can be true, the answer is to give up physics. He also says that modern physicists are like a guy who tells you how to get the right results from your calculator, but your question as to how the calculator works is forbidden! I want to know about reality, not how to push the right buttons to get the answer to a math problem.
 
  • #8
wittgenstein said:
What are those wildly implausible requirements?
Suppose I design a clever automated device with a polarizing filter and a chamber into which we can insert a billet of uranium; the device sets its orientation for each measurement according to the pattern of random radioactive decay in that uranium billet. I make two copies of my design blueprints; one goes into storage on Earth and the other goes into something like the Voyager spacecraft. A few tens of millennia later the spacecraft reaches an inhabited planet, and these alien physicists build the machine according to the blueprint I sent them, including locating an ore deposit and mining and refining some uranium. Meanwhile my remote descendants are doing the same thing with the blueprints left back on Earth. After a decade or so of exchanging radio messages to confirm that both sides have set up their devices, some entangled photon pairs are generated and sent to both detectors (another few years) and then the results are shared by radio (even more years)... and it is seen that Bell’s inequality has been violated.

The superdeterminist explanation is that there is a relationship between the decay patterns of two ostensibly independent pieces of uranium mined and refined on different planets light-years apart and the BBO crystal we’re using to generate our entangled photon pairs. It’s possible - all three deterministically evolved from the same cloud of intergalactic schmutz a few billion years ago - but I feel justified in applying adjectives like “extraordinary” and “implausible” to that possibility.
 
  • #9
Bobob wrote, "Superdeterminism has no physical content."
I don't know what that means. Are you saying that there is no experiment that verifies superdeterminism? OK, so the theory that there are not 8,000,000 unicorns just beyond the observable universe has no physical content? If a theory is reasonable, such as superdeterminism, and doesn't require abandoning logic (Copenhagen violates the law of the excluded middle), I am more likely to accept it.
 
  • #10
Nugatory, I wrote that before your response. I have to go to my fiancée's house. I will print it and ponder it.
 
  • #11
Thank you! I love pondering!
 
  • #12
Nugatory said:
Bell's? When did he embrace superdeterminism? Of course - that's why it was invented. Superdeterminism is one way (the only way now that the other "loopholes" are closing) of reconciling classical physics with the experimentally observed violations of Bell's inequality.

No. Classical physics (or, better in this case, classical metaphysics) does not have any problem with violations of Bell's inequality. The straightforward classical way is to accept a preferred frame - as in the classical interpretation of special relativity following Lorentz. Non-locality is not a big problem for classical physics, given that Newtonian gravity was non-local too. It is relativity (in its fundamental interpretation) which has a problem with violations of Bell's inequality.
 
  • #13
wittgenstein said:
Summary:: Bell's superdeterminism seems to imply that there is a hidden variable.

Bell's superdeterminism seems to imply that there is a hidden variable. Please correct me if I am wrong, but that seems like classical physics.

Yes, superdeterminism is a sort of hidden variable theory. Unlike a hidden variable theory like Bohmian Mechanics, superdeterministic hidden variables can be local. However, it is difficult (impossible?) for us to come up with a useful hidden variable theory, since it is likely to depend on fine tuning of many details in the past.
 
  • Like
Likes mattt and Demystifier
  • #14
All time/CPT-symmetric formulations (with boundary conditions in the past and the future), like the least action principle or Feynman ensembles of paths/diagrams/histories of the universe, are superdeterministic.

For example, the scattering matrix uses boundary conditions in the past and the future ( https://en.wikipedia.org/wiki/S-matrix#Interaction_picture ):
##S_{fi} = \lim_{t_2\to +\infty} \lim_{t_1\to-\infty} \langle \Phi_f|U(t_2,t_1)|\Phi_i\rangle ##
 
  • Skeptical
Likes Demystifier
  • #15
Jarek 31 said:
All time/CPT-symmetric formulations (with boundary conditions in the past and the future), like the least action principle or Feynman ensembles of paths/diagrams/histories of the universe, are superdeterministic.

For example, the scattering matrix uses boundary conditions in the past and the future ( https://en.wikipedia.org/wiki/S-matrix#Interaction_picture ):
##S_{fi} = \lim_{t_2\to +\infty} \lim_{t_1\to-\infty} \langle \Phi_f|U(t_2,t_1)|\Phi_i\rangle ##
That has nothing to do with superdeterminism.
 
  • #16
The best person for this topic is probably Ken Wharton, who claims that superdeterministic explanations (now regaining popularity due to Sabine Hossenfelder) are equivalent to time-symmetric ones - some of his papers and slides are on our seminar webpage: http://th.if.uj.edu.pl/~dudaj/QMFNoT

Let me elaborate how I see it.
We usually think about determinism as evolution from past boundary conditions, as in the Euler-Lagrange or Schrödinger equation.
But in time/CPT-symmetric formulations, e.g. the least action principle or Feynman path integrals, trajectories are determined by boundary conditions in both the past and the future.
The difference is that e.g. the least action principle requires only values at both boundaries, while Euler-Lagrange requires values and derivatives at one boundary.

In superdeterminism we imagine that the state is already chosen to be compatible with all future measurements.
In time-symmetric formulations this is not a surprise, as everything depends on both past and future; we can imagine that the state contains this additional future-dependent information, e.g. in those additional derivatives in Euler-Lagrange.
 
  • #17
Closed while the moderators fetch a mop and a bucket.

... which has now been employed to clean up a long digression from the original thread topic. The thread is open again.
 
Last edited:
  • Like
  • Haha
Likes mattt, berkeman and phinds
  • #18
Here are some things on superdeterminism you might find interesting:

Sabine Hossenfelder

Gerard 't Hooft

Gerard 't Hooft

't Hooft is upset by the fact that QM only gives us averages; it does not tell us what each particle is doing, i.e., it doesn't tell us "what reality is." But, as I pointed out in this ScienceX News article regarding Bell states, that's because there is no hidden reality. Reality is not fundamentally governed by causal mechanisms, but by principles.
 
  • Like
Likes Lynch101 and DrClaude
  • #19
RUTA said:
But, as I pointed out in this ScienceX News article regarding Bell states, that’s because there is no hidden reality. Reality is not fundamentally governed by causal mechanisms, but by principles.
So you essentially propose to give up causality as a guiding principle, without anything in exchange. Here you will probably object, given that you propose to use principle explanations instead. But, sorry, nothing prevents the use of principle explanations together with causal explanations. There is no either/or between principle explanations and causal explanations. So the principle explanations are not a replacement, not a gain we obtain by throwing away causal explanations. And so your proposal means giving up causal explanations without gaining anything.

But, of course, it may happen that we are forced to give up causal explanations. How can we find out if we are in such a situation? The answer is straightforward and simple: there will be something which does not allow any causal explanation. Imaginary examples of such falsifications of causality can be found already in many myths, like the story of Oedipus, and, in fact, in almost every story where somebody learns about a prophecy not in his favor and tries everything imaginable to prevent it, without success. Of course, behind those myths are plausibly real experiences of children faced with predictions by adults of what would be the long-term consequences of their stupid behavior, predictions which have causal explanations, but the children don't know them. Similarly, if we see such an empirical "falsification" of causality, we should be aware of the possibility that there nonetheless exists a causal explanation which is yet unknown to us.

So, before rejecting causality, there should be, first, some experiment which superficially falsifies causality, that means that we actually don't have any causal explanation. But then there should be even more: Sufficiently strong evidence that no causal explanation is possible even in principle.

What is the situation for Bell's theorem? It is far away from this. We have causal explanations for the observed violations of the Bell inequalities, namely all the realist causal interpretations of QT (de Broglie-Bohm, Nelsonian stochastics, Caticha's entropic dynamics). The only objection against them is metaphysical: They require a (hidden) preferred frame in the relativistic context.

So, relativistic symmetry is emergent, not fundamental. Big deal. We know anyway that the actual field theories we have are emergent, effective theories (mainly because gravity is non-renormalizable), so the hypothesis that their symmetries are not emergent but fundamental is pure speculation based on nothing. Then, we have everyday examples of emergent Lorentz symmetry - every wave equation ##(\partial_t^2 - c_m^2 \Delta +V(u)) u(\vec{x},t) = 0##, with ##c_m## being a constant characteristic for this sort of waves, has a Lorentz symmetry with that ##c_m## instead of c: it allows one to compute a Doppler-shifted solution for every solution of that wave equation. Of course, for sound waves we know that this symmetry is only a large-scale approximation, thus, emergent.
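The emergent-symmetry claim for wave equations can be made concrete with a standard change of variables (a one-dimensional sketch, added here for illustration):

$$u'(x,t) = u\big(\gamma(x - vt),\ \gamma(t - vx/c_m^2)\big), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c_m^2}}$$

A direct computation shows that the operator ##\partial_t^2 - c_m^2 \partial_x^2## is invariant under this substitution, and the ##V(u)## term transforms along with ##u##, so ##u'## solves the wave equation whenever ##u## does: every solution has a family of "Doppler-shifted" companions, with ##c_m## playing the role of ##c##.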

Are there other objections against the realist causal interpretations? No.
 
  • #20
Sunil said:
What is the situation for Bell's theorem? It is far away from this. We have causal explanations for the observed violations of the Bell inequalities, namely all the realist causal interpretations of QT (de Broglie-Bohm, Nelsonian stochastics, Caticha's)
Could you elaborate?

I prefer Mermin's inequality as more obvious than Bell's, hence seemingly harder to violate:
For binary variables A, B, C: Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1
intuitively: "tossing 3 coins, at least 2 give the same value" - it looks completely obvious, but the QM formalism allows one to violate it (e.g. https://arxiv.org/pdf/1212.5214 ).

The derivation of this inequality doesn't use a locality assumption, only the existence of a joint probability distribution and the Kolmogorov axioms ( https://en.wikipedia.org/wiki/Probability_axioms ) - could you briefly say how these interpretations explain its violation?
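The classical side of this claim is easy to check numerically: in any single outcome of three binary variables, at least two of them coincide, so the three agreement probabilities sum to at least 1 for every joint distribution. A minimal Python sketch (illustrative, not from the thread):

```python
import itertools
import random

# All 8 outcomes (a, b, c) of three binary variables.
outcomes = list(itertools.product([0, 1], repeat=3))

def lhs(probs):
    # Pr(A=B) + Pr(A=C) + Pr(B=C) for a joint distribution over ABC.
    # By pigeonhole, (a==b) + (a==c) + (b==c) >= 1 in every outcome,
    # so the weighted sum is >= 1 for any distribution.
    return sum(p * ((a == b) + (a == c) + (b == c))
               for (a, b, c), p in zip(outcomes, probs))

random.seed(0)
for _ in range(10000):
    w = [random.random() for _ in outcomes]
    probs = [x / sum(w) for x in w]   # random normalized joint distribution
    assert lhs(probs) >= 1 - 1e-12    # the inequality is never violated
print("Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1 held for all sampled distributions")
```

Violating the inequality therefore requires dropping one of the two assumptions, not just choosing a cleverer distribution.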
 
  • #21
The point of the addendum to my post about principle explanation is in response to 't Hooft's complaint that QM only gives averages. He thinks that fact entails that something is missing from QM. Our paper shows that average-only conservation is necessarily the best way the conservation principle can be instantiated, if the QM correlations are correct (and they have been verified to 8 sigma).

Keep in mind that principle explanation is simply the mathematical consequence of some empirical fact. So, whatever constructive counterpart you want to propose will have to be in accord with the principle explanation unless you are refuting the mathematical consequence, which in this case has been tested to 8 sigma.

Whatever causal mechanism you want to propose that leads to average-only conservation is going to be very deviant from what is typically meant by a causal mechanism, e.g., "Principle theories, constructive theories, and explanation in modern physics." Wesley Van Camp. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 42(1), 23-31 (2011):
The interpretive work that must be done is less in coming up with a constructive theory and thereby explaining puzzling quantum phenomena, but more in explaining why the interpretation counts as explanatory at all given that it must sacrifice some key aspect of the traditional understanding of causal-mechanical explanation.
So, the search for a constructive account of Bell state entanglement is starting to sound a lot like the aether as a causal mechanism for SR.

The point is, we have one and the same principle explanation for the mysteries of SR and QM while both are still without consensus constructive counterparts after 115 and 85 years, respectively. This is my response to ‘t Hooft’s motivation for superdeterminism.
 
Last edited:
  • #22
Jarek 31 said:
Derivation of this inequality doesn't use locality assumption, only existence of joint probability distribution and Kolmogorov axioms ( https://en.wikipedia.org/wiki/Probability_axioms ) - could you briefly say how these interpretations can explain its violation?
Your link refers to a variant of the theorem which starts with the assumption of counterfactual existence. This is not less, but more than what Bell uses, because the first part of his proof (in the paper itself just a few lines with a verbal reference to EPR before the first formula, ignored by many readers; only later did he recognize this and emphasize this part in more detail as a very essential one) is the proof that counterfactual existence follows from Einstein causality and the EPR criterion.

Because Einstein causality does not hold in the realist interpretations, one cannot use it to prove counterfactual existence. And in itself there is no counterfactual existence: the measurement device of a general measurement also has a trajectory ##q_{dev}(t)\in Q_{dev}##, and for a general measurement the "measurement result" depends not only on the trajectory of the system, but also on the trajectory of the measurement device (and in particular on its initial value). The only exception would be the measurement of the configuration itself; so, whenever some non-configuration measurement is involved, the measurement result depends on the measurement device too. So let's follow Bell's recommendation in "Against Measurement" and rename "measurement of A" to something like "result of interaction with device A" and so on.

So only the configuration variables have counterfactual existence. Everything else does not; all other "measurements" are contextual. (This also includes measurements of other maximal sets of commuting variables: even if they contain some configuration variables, the "measurement" even of those configuration variables can then be influenced by the rest of the context.)
 
  • #23
The derivation of the Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1 inequality uses just two assumptions:
1) There exists a joint probability distribution Pr(ABC) ##(\sum_{ABC} Pr(ABC) =1)##,
2) The Kolmogorov axioms, especially the 3rd one: https://en.wikipedia.org/wiki/Probability_axioms#Third_axiom

Derivation: from 3rd axiom
Pr(A=B) = Pr(000) + Pr(001) + Pr(110) + Pr(111)
Pr(A=C) = Pr(000) + Pr(010) + Pr(101) + Pr(111)
Pr(B=C) = Pr(000) + Pr(100) + Pr(011) + Pr(111)
getting
Pr(A=B) + Pr(A=C) + Pr(B=C) = 2Pr(000) + 2Pr(111) + ##\sum_{ABC} Pr(ABC)## = 1 + 2Pr(000) + 2Pr(111) >= 1
But the QM formalism allows one to violate it (e.g. section 4.2.1 in Preskill's notes http://theory.caltech.edu/~preskill/ph229/notes/chap4_01.pdf ).
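The quantum violation can be reproduced directly from the Born rule. A minimal Python sketch: the singlet state is measured at the three settings 0°, 120°, 240°, with Bob's outcome flipped so that equal settings give perfect agreement (this standard setup is my illustrative choice, not taken from the notes above):

```python
import math

def basis(theta):
    # Eigenvectors of cos(theta)*Z + sin(theta)*X for one qubit.
    up = [math.cos(theta / 2), math.sin(theta / 2)]
    down = [-math.sin(theta / 2), math.cos(theta / 2)]
    return up, down

def kron2(v, w):
    # Tensor product of two single-qubit vectors.
    return [v[0]*w[0], v[0]*w[1], v[1]*w[0], v[1]*w[1]]

# Singlet state (|01> - |10>)/sqrt(2).
s = 1 / math.sqrt(2)
psi = [0.0, s, -s, 0.0]

def pr_equal(ta, tb):
    # Born-rule probability that Alice (angle ta) and Bob (angle tb,
    # outcome flipped) report the same value.
    ua, da = basis(ta)
    ub, db = basis(tb)
    amp = lambda va, vb: sum(x * y for x, y in zip(kron2(va, vb), psi))
    # With Bob flipped, "same" = (Alice up, Bob raw down) or the reverse.
    return amp(ua, db)**2 + amp(da, ub)**2

A, B, C = 0.0, 2 * math.pi / 3, 4 * math.pi / 3
total = pr_equal(A, B) + pr_equal(A, C) + pr_equal(B, C)
print(round(total, 6))  # 0.75 < 1: the inequality is violated
```

Each pairwise agreement probability comes out to cos²(60°) = 1/4, so the three terms sum to 3/4.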

So at least one of the two above assumptions is not satisfied by physics - which one?

I would say that QM allows one to replace the 3rd axiom with the Born rule:
- 3rd axiom: the probability of an alternative of disjoint events is the sum of their probabilities,
- Born rule: the probability of an alternative of disjoint events is proportional to the square of the sum of their amplitudes.
The problem is understanding the Born rule, but using the mathematical similarity between Feynman and Boltzmann path ensembles, we can get a Born-like rule in the Ising model, which brings nice intuitions: https://www.physicsforums.com/threa...sing-model-feynman-boltzmann-ensemble.995234/
 
  • #24
Jarek 31 said:
1) There exists joint probability distribution Pr(ABC) ##(\sum_{ABC} Pr(ABC) =1)##,
2) Kolmogorov axioms,
So at least one of the two above assumptions is not satisfied by physics - which one?
The first one. It assumes that the "measurement" results are predefined, that measuring A for the first copy does not change A, B or C for the second one. In the Bell tests, one hopes to achieve this by making both measurements spacelike separated, so that if Einstein causality holds, neither measurement can causally influence the other measurement or its result. But in the realistic interpretations, such influence explicitly exists. If Alice measures A and gets 1, then an immediate causal influence guarantees that after this, Bob measuring A will give 1 too.

RUTA said:
Keep in mind that principle explanation is simply the mathematical consequence of some empirical fact. So, whatever constructive counterpart you want to propose will have to be in accord with the principle explanation unless you are refuting the mathematical consequence, which in this case has been tested to 8 sigma.
Ok. Even if there is yet a difference between a principle and the empirical facts supporting that principle, such principles are indeed quite close to the empirical facts, and, therefore, unable to explain them. A "principle" based on observational facts cannot be an explanation of these observational facts.

The number of sigma is, in fact, quite irrelevant: even for a "principle" which holds only at one or two sigma it is worth searching for causal explanations, and even a "principle" which holds at 10 sigma can fail on a more fundamental level. That the causal explanation has to agree with the facts is clear, and for this purpose principle theories remain useful as a mathematical tool in the search for a constructive theory, which gives a causal explanation. All one needs, quite independently of the number of sigma given by the experiments, is that the equations of the principle theory can be derived from the constructive theory in some limit.
RUTA said:
"Principle theories, constructive theories, and explanation in modern physics." Wesley Van Camp. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 42(1), 23-31 (2011):
The interpretive work that must be done is less in coming up with a constructive theory and thereby explaining puzzling quantum phenomena, but more in explaining why the interpretation counts as explanatory at all given that it must sacrifice some key aspect of the traditional understanding of causal-mechanical explanation.
So, the search for a constructive account of Bell state entanglement is starting to sound a lot like the aether as a causal mechanism for SR.
The point being? That discussing interpretations of quantum theory is now allowed here, while discussing interpretations of relativity is yet anathema?
RUTA said:
The point is, we have one and the same principle explanation for the mysteries of SR and QM while both are still without consensus constructive counterparts after 115 and 85 years, respectively. This is my response to ‘t Hooft’s motivation for superdeterminism.
Both have constructive counterparts.

For SR, there was always the Lorentz ether. Today, there is more, namely for the SM of particle physics, we have
Schmelzer, I. (2009). A Condensed Matter Interpretation of SM Fermions and Gauge Fields, Found. Phys. 39(1) 73-107, resp. arxiv:0908.0591.

And for GR, we have
Schmelzer, I. (2012). A Generalization of the Lorentz Ether to Gravity with General-Relativistic Limit. Advances in Applied Clifford Algebras 22(1), 203-242, resp. arxiv:gr-qc/0205035.

There is also Jacobson's Einstein aether, but it does not really look like a constructive theory, it seems it is more a technical way to find out where one can search for preferred frame effects violating GR.

For quantum theory, we have all the realist interpretations, in particular

Bohm, D. (1952). A suggested interpretation of the quantum theory in terms of ``hidden'' variables, Phys. Rev. 85, 166-193.

Nelson, E. (1966). Derivation of the Schrödinger Equation from Newtonian Mechanics, Phys.Rev. 150, 1079-1085

Caticha, A. (2011). Entropic Dynamics, Time and Quantum Theory, J. Phys. A44:225303, arxiv:1005.2357

All of them are constructive, given that they explain the probabilities by postulating a continuous configuration space trajectory ##q(t)\in Q##, and they are essentially also different, more fundamental theories (dBB because it is defined also outside quantum equilibrium, Nelson and Caticha because of the Wallstrom objection).

That there is no consensus in the sense of no acceptance by the mainstream is the result of a definite decision of the mainstream to reject or ignore them essentially without discussion. With Caticha, the story of Bohm is actually repeated: a lot of people, starting with PBR, prove impossibility theorems for something which has already been explicitly constructed and presented. Today, ignorance is an argument, and a quite strong one.

No consensus in the sense of different proposals is what has to be expected. Principle theories restrict themselves to observable effects; constructive theories do not - they propose something more fundamental which would allow one to explain, in a nontrivial way, what we observe. Here, different possibilities may be expected.
 
  • #25
Sunil said:
in the realistic interpretations, such influence explicitly exists. If Alice measures A, and gains 1, then an immediate causal influence guarantees that after this Bob measuring A will give 1 too.

Note that, when you try to reconcile such an interpretation with relativity, more precisely with the fact that the time ordering of A's and B's measurements, since they are spacelike separated, is frame-dependent, you end up either admitting that your interpretation is not valid because it is incompatible with relativity, or claiming that Lorentz invariance is not fundamental but only emergent and so should be expected to be violated under appropriate conditions. The latter position, while it is logically consistent, has a heavy burden of proof since we have no experimental evidence whatever of Lorentz invariance being violated.
 
  • #26
Physics has largely accepted that SR is a principle theory, as you note. We're trying to build on that fact by suggesting it's time to accept QM as a principle theory as well. In fact, they're both based on the same principle. That's the point, but this discussion is not relevant to the OP, so we should abandon it here.
 
  • #27
PeterDonis said:
Note that, when you try to reconcile such an interpretation with relativity, more precisely with the fact that the time ordering of A's and B's measurements, since they are spacelike separated, is frame-dependent, you end up either admitting that your interpretation is not valid because it is incompatible with relativity, or claiming that Lorentz invariance is not fundamental but only emergent and so should be expected to be violated under appropriate conditions.
Given Bell's theorem, all realistic interpretations have to accept a hidden preferred frame. Which makes them incompatible with the spacetime interpretation of relativity, but not with the Lorentz ether interpretation which assumes a preferred frame. Ok, if one rejects the preferred frame as anathema, then the Minkowski interpretation is the only one and so one can speak about some incompatibility with relativity. I don't.

For the Lorentz ether, Lorentz invariance is, indeed, not fundamental but only emergent.
PeterDonis said:
The latter position, while it is logically consistent, has a heavy burden of proof since we have no experimental evidence whatever of Lorentz invariance being violated.
The place where one would naively expect it to be violated is the Planck length. So one would not even expect to see Lorentz violations now.
 
  • #28
These two assumptions lead to inequalities violated by the QM formalism - which one is wrong?
1) There exists joint probability distribution Pr(ABC) ,
2) Kolmogorov axioms,
Sunil said:
The first one. It is assuming that the "measurement" results are predefined, that measuring A for the first copy does not change A, B or C for the second one.
No, a probability distribution does not have to predefine results, as with a Pr(head)=Pr(tail)=1/2 coin toss.
And this is only the existence of a joint probability distribution, which allows a measurement of A to change the probability distribution of B and C, e.g. Pr(head,head,head)=Pr(tail,tail,tail)=1/2 and 0 for the remaining outcomes.

The problem is much more subtle: in QM we can replace the existence of a joint probability distribution and the 3rd axiom with the existence of amplitudes and the Born rule - and it is this essentially different square that allows violation of inequalities derived without it.

Statistical interpretations need to explain this square in the Born rule.
Hint: asking about the probability distribution inside an Ising sequence, we also get this square.
 
  • #29
Sunil said:
For the Lorentz ether, Lorentz invariance is, indeed, not fundamental but only emergent.

Please note, first, that Lorentz Ether Theory is generally not discussed here at PF, since that term as it is usually understood refers, not to any theory in which Lorentz invariance is "only emergent", but to an interpretation of Special Relativity in which there is a preferred frame, but it is unobservable and does not affect the results of any experiments.

You do not appear to be using "Lorentz ether" to refer to that theory, but to some other theory in which, it seems, there should in principle be a way to experimentally distinguish the theory from standard Special Relativity. If that is the case, you need to give a reference for this theory.

Sunil said:
The place where one would naively expect it to be violated is the Planck length.

Again, please give a reference for whatever theory (or hypothesis) you are talking about that makes this claim.
 
  • #30
PeterDonis said:
Please note, first, that Lorentz Ether Theory is generally not discussed here at PF, since that term as it is usually understood refers, not to any theory in which Lorentz invariance is "only emergent", but to an interpretation of Special Relativity in which there is a preferred frame, but it is unobservable and does not affect the results of any experiments.
We are here in the interpretation subforum. Ok, it is only for interpretations of quantum theory, but all the realist interpretations of quantum theory require a preferred frame (I exclude here MWI and similar things sometimes claimed to be realist too); thus, the preferred frame hypothesis can be considered part of them and can be discussed too. Then, incompatibility with relativity is claimed, despite the fact that there is no incompatibility with any observations supporting relativity; what is behind such claims is incompatibility with a particular interpretation of relativity, the spacetime interpretation. If one excludes discussions about interpretations of relativity here, then one should exclude such claims too.

In the light of Bell's theorem, the preferred frame hypothesis has observable consequences. It allows causal influences faster than light if they influence only the future as defined by the preferred frame. To prove Bell's theorem, one has to use, instead, Einstein causality, classical causality as defined by a preferred frame is not sufficient for this. The possibility of violations of the Bell inequalities for spacelike separated events is, therefore, a prediction of the Lorentz ether which differs from the corresponding prediction of the spacetime interpretation. (The "corresponding prediction" means that all other independent assumptions about reality, Reichenbach's common cause principle, no superdeterminism and so on are the same.)

(This is, by the way, quite typical for interpretations - if one looks more carefully, one finds one or the other differences in the interpretation.)

Then, I disagree about the identification of the Lorentz ether with the preferred frame hypothesis. I do not mean here particular things like the model of the electron considered by Lorentz, which is only of historical interest. But the Lorentz ether also contains the word "ether", which clearly refers to some medium. Even if this medium is homogeneous and incompressible, it is a medium different from space itself, and it has waves, so that homogeneity and incompressibility are plausibly only approximations, and even without the details being worked out one can assume that there will be some microscopic structure, similar to the other media we know.

If you want an explicit reference to some theory of the Lorentz ether itself, no problem, take the no gravity limit of Schmelzer's generalization of the Lorentz ether to gravity, or his ether model for the SM. But I think that my considerations here are independent of this particular approach. It is simply one particular proposal for the general idea that the SM as well as GR are only effective theories. Weinberg claims that "The present educated view of the standard model, and of general relativity, is again that these are the leading terms in effective field theories", so this is sufficiently mainstream. As an effective field theory, quantum GR works nicely, see Donoghue's papers, while fundamental GR has not yet been successfully quantized.

PeterDonis said:
You do not appear to be using "Lorentz ether" to refer to that theory, but to some other theory in which, it seems, there should in principle be a way to experimentally distinguish the theory from standard Special Relativity. If that is the case, you need to give a reference for this theory.
If GR and SM are only effective field theories, then there will be in principle such a way, namely go down to the critical distance of that theory. One can, of course, hope that the underlying high energy theory will have the same relativistic symmetry, but there is nothing which suggests this. At least those straightforward regularizations which can pretend to look like more fundamental theories, like lattices, violate relativistic symmetry.

Ref:
Weinberg, S. (1997). What is Quantum Field Theory, and What Did We Think It Is. arxiv:hep-th/9702027
Donoghue, J.F. (1994). General relativity as an effective field theory: The leading quantum corrections. Phys Rev D 50(6), 3874-3888
Donoghue, J.F. (1996). The Quantum Theory of General Relativity at Low Energies, Helv.Phys.Acta 69, 269-275, arXiv:gr-qc/9607039
Schmelzer, I. (2012). A generalization of the Lorentz ether to gravity with general-relativistic limit, Advances in Applied Clifford Algebras 22(1), 203-242, arXiv:gr-qc/0205035
Schmelzer, I. (2009). A Condensed Matter Interpretation of SM Fermions and Gauge Fields, Found. Phys. 39(1), 73-107, arXiv:0908.0591
 
  • #31
Jarek 31 said:
The two assumptions lead to inequalities violated by the QM formalism - which one is wrong?
1) There exists joint probability distribution Pr(ABC) ,
2) Kolmogorov axioms,
No, a probability distribution does not have to predefine results - e.g. Pr(head)=Pr(tail)=1/2 for a coin toss.
The assumption of a joint probability distribution Pr(ABC) presupposes a lot. First, without any presuppositions, you would have to distinguish at least six things, ##(A_A, A_B, B_A, B_B, C_A, C_B)##. And even this is far from sufficient: you also have to distinguish which experiments were done, and in which order. So,
##(A_A\to A_B, A_B \to A_A, A_A\to B_B, B_B \to A_A, \ldots, C_B \to C_A)##.

And even this is not all. If we know that ##A_A\to B_B##, that is, Alice measures A and after this Bob measures B, what are the values of the two instances of C? They have not been measured. If they had been measured, it would have been a completely different experiment.
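As a side note, the order dependence can be made concrete with a small toy model (a hypothetical sketch, not anyone's actual proposal here): a two-level system with a Born-like cos² rule, where the first measurement changes the state and therefore the statistics of the next one.

```python
import math

def p_plus(state, axis):
    # Born-like rule for a toy two-level system: the state is an angle,
    # and measuring along `axis` gives outcome +1 with probability
    # cos^2((state - axis) / 2).
    return math.cos((state - axis) / 2) ** 2

def p_both_plus(state, first, second):
    # Measure along `first`; on outcome +1 the state collapses onto
    # `first`, and only then is `second` measured.
    return p_plus(state, first) * p_plus(first, second)

state, a, b = 0.0, math.pi / 3, 2 * math.pi / 3
ab = p_both_plus(state, a, b)  # A measured first, then B: 9/16
ba = p_both_plus(state, b, a)  # B measured first, then A: 3/16
print(ab, ba)
```

The two orderings give different joint statistics (9/16 vs. 3/16 here), so measuring A before B and B before A really are distinct experiments - which is why a single order-independent Pr(ABC) is a nontrivial assumption.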
 
  • #32
A probability distribution just gives statistics, e.g. of measurement results. For example, if we toss 3 distinguishable coins, the probability of getting the result (A,B,C) is Pr(ABC), and these probabilities have to sum to 1 ... it seems really tough to escape the existence of such a joint probability distribution.
And we are talking about just "tossing 3 coins, at least 2 give the same" - an absolutely obvious instance of the Dirichlet (pigeonhole) principle.

To violate it, it is first of all crucial that we measure only 2 out of the 3 variables - if all 3 are measured, there is a joint probability distribution and the inequality has to be satisfied.
So the third variable is not only unknown but unmeasured - which is where the Born rule enters: amplitudes are added first, then squared to get the probability distribution.

So what is the difference between an unknown and an unmeasured variable?
Where does the Born rule, required to violate such inequalities, come from?
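Both sides of this argument can be checked in a short numerical sketch (illustrative, using standard textbook numbers rather than anything specific from this thread): the pigeonhole bound holds for every deterministic assignment of three coin values, while the quantum pairwise statistics for polarization-entangled photons at settings 0°/120°/240° sum below that bound.

```python
import itertools
import math

# Classical side: if the three values objectively exist, any joint
# distribution Pr(ABC) is a mixture of the 8 deterministic assignments,
# and in each of them at least two of the three values agree
# (the Dirichlet / pigeonhole principle). Hence
#   Pr(A=B) + Pr(B=C) + Pr(A=C) >= 1.
for a, b, c in itertools.product([+1, -1], repeat=3):
    assert (a == b) + (b == c) + (a == c) >= 1

# Quantum side: for polarization-entangled photons, the Born rule gives
# agreement probability cos^2(theta) at relative analyzer angle theta.
# With settings 0, 120 and 240 degrees, each pair agrees with
# probability cos^2(120 deg) = 1/4, so the three terms sum to 3/4.
p_same = math.cos(math.radians(120)) ** 2
total = 3 * p_same
print(total)  # approximately 0.75, below the classical bound of 1
```

So if the observed pairwise statistics violate the bound, no joint Pr(ABC) over three objectively existing values can reproduce them - which is exactly why measuring only 2 of the 3 variables matters.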
 
  • #33
Sunil said:
We are here in the interpretation subforum.

The quantum mechanics interpretation subforum. Not the relativity interpretation subforum. As you yourself recognize.

Sunil said:
all the realist interpretations of quantum theory require a preferred frame (I exclude here MWI and similar things sometimes claimed to be realist too), thus, the preferred frame hypothesis can be considered part of them, thus, can be discussed too.

If you provide a specific reference to a QM interpretation that requires a preferred frame and claims to be incompatible with relativity, we could potentially discuss that. Whether that discussion would be on topic for this thread is a separate question.

Sunil said:
I disagree about the identification of the Lorentz ether with the preferred frame hypothesis

I didn't introduce the term "Lorentz ether" or equate it with "preferred frame". You did. If it's the wrong term for what you were actually talking about (which it is--see below), you shouldn't have used it.

Sunil said:
If GR and SM are only effective field theories, then there will be in principle such a way, namely go down to the critical distance of that theory. One can, of course, hope that the underlying high energy theory will have the same relativistic symmetry, but there is nothing which suggests this. At least those straightforward regularizations which can pretend to look like more fundamental theories, like lattices, violate relativistic symmetry.

None of these suggestions involve "Lorentz ether", so you should not be using that term to refer to them.
 
  • #34
Sunil said:
The possibility of violations of the Bell inequalities for spacelike separated events is, therefore, a prediction of the Lorentz ether which differs from the corresponding prediction of the spacetime interpretation.

Please provide a reference for this claim.
 
  • #35
PeterDonis said:
If you provide a specific reference to a QM interpretation that requires a preferred frame and claims to be incompatible with relativity, we could potentially discuss that. Whether that discussion would be on topic for this thread is a separate question.
I have mentioned several of them - dBB, Nelsonian stochastics, Caticha's entropic dynamics - with links. They require a preferred frame if considered in a relativistic context. This is obvious because they all use the Bohmian velocity (even if, except for dBB itself, only as an average velocity).

PeterDonis said:
I didn't introduce the term "Lorentz ether" or equate it with "preferred frame". You did. If it's the wrong term for what you were actually talking about (which it is--see below), you shouldn't have used it.
Then I have somehow misinterpreted the following
Please note, first, that Lorentz Ether Theory is generally not discussed here at PF, since that term as it is usually understood refers, not to any theory in which Lorentz invariance is "only emergent", but to an interpretation of Special Relativity in which there is a preferred frame, but it is unobservable and does not affect the results of any experiments.
as equating the Lorentz ether with SR with a preferred frame.

I distinguish them, as explained, and both are necessary: the preferred frame is an actual property of the mentioned interpretations, while the Lorentz ether in its modern interpretation is an effective field theory on a background with a preferred frame, which allows one to meet arguments against such interpretations of the following type:
... or claiming that Lorentz invariance is not fundamental but only emergent and so should be expected to be violated under appropriate conditions. The latter position, while it is logically consistent, has a heavy burden of proof since we have no experimental evidence whatever of Lorentz invariance being violated.
So, yes, I think Lorentz invariance is only emergent. And once objections against this are made, I have to refer to the Lorentz ether.
I have answered an argument made here; I have given a reference to a particular variant of the Lorentz ether compatible with GR/SM where Lorentz invariance is only emergent; and I have given more general considerations (therefore not mentioning the Lorentz ether) which suggest that Lorentz symmetry is only emergent - which is necessary, given that the number of supporters of the Lorentz ether is minimal. So I don't see the point of your objections.
There was also another claim in #21 which had to be answered by an explicit reference to the Lorentz ether:
RUTA said:
The point is, we have one and the same principle explanation for the mysteries of SR and QM while both are still without consensus constructive counterparts after 115 and 85 years, respectively.
PeterDonis said:
The possibility of violations of the Bell inequalities for spacelike separated events is, therefore, a prediction of the Lorentz ether which differs from the corresponding prediction of the spacetime interpretation.
Please provide a reference for this claim.
It is unclear to me what you think requires support with a reference. That classical causality, which is the causality relevant for the Lorentz ether, is not sufficient to prove the Bell inequalities for spacelike separated events? The possibility of violations of the Bell inequalities for the Lorentz ether simply means that there is no such proof. I cannot prove the non-existence of proofs; the burden of proof is on the side of those who claim that proofs exist. So it is up to you (given that you claimed equivalence) to give a proof of Bell's theorem which works for the Lorentz ether (that is, with classical causality in the preferred frame) with all other assumptions unchanged.
Jarek 31 said:
And we are talking about just "tossing 3 coins, at least 2 give the same" - an absolutely obvious instance of the Dirichlet (pigeonhole) principle.

This is something you can trivially prove, no problem. But by itself that is not about violations of Bell inequalities.

Jarek 31 said:
To violate it, it is first of all crucial that we measure only 2 out of the 3 variables - if all 3 are measured, there is a joint probability distribution and the inequality has to be satisfied.

So the third variable is not only unknown but unmeasured - which is where the Born rule enters: amplitudes are added first, then squared to get the probability distribution.
Once your two measurements violate the Bell inequalities, you can be certain that the theory that these are three objectively existing coins is wrong.

Jarek 31 said:
So what is the difference between an unknown and an unmeasured variable?
Where does the Born rule, required to violate such inequalities, come from?
The Born rule does not violate any inequalities. You seem to believe the fairy tale that quantum theory is incompatible with classical probability theory. It is not; the realist interpretations prove this.
 
