Joy Christian, Disproof of Bell's Theorem

  • #101


Dear billschnieder,

Let me start by saying that, on the very remote possibility that you are indeed not Joy Christian, I apologize to you.

I replied earlier, but my post did not appear and unfortunately I did not save it.

Let me list the reasons why Joy's model is wrong:

Physical reasons:

- Nowhere in his model does he use the fact that the initial state is the Bell (singlet) state. Start with any other Psi and you will still get -a.b if you believe his math.
- The model does not respect the detector-swapping symmetry: swapping Alice's and Bob's detectors should give the same results. Joy uses DIFFERENT analyzers for Alice and Bob to recover the minus sign in -a.b; restoring the symmetry yields +a.b.
- Holman's argument: once MU is set, perform the EPR-B experiment along the z axis, then do a subsequent measurement on one arm of the experiment along the x axis. There are two choices: either MU does not change between measurements, or it does. If MU does not change, the x outcome must always be the same as the z outcome, whereas experiments show you get the same answer 50% of the time and the opposite answer 50% of the time. If MU does change, then you have problems explaining spin-3/2 particle experimental results.
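The standard QM numbers behind Holman's 50/50 point can be checked in a few lines. This is my own minimal numpy sketch of textbook quantum mechanics, not anything from Joy's model: once the z measurement has collapsed one arm to a z eigenstate, each x outcome has probability 1/2.

```python
import numpy as np

# Pauli operators for spin measurements along z and x (units of hbar/2).
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def projector(op, outcome):
    """Projector onto the +/-1 eigenspace of a Pauli operator."""
    vals, vecs = np.linalg.eigh(op)
    v = vecs[:, np.argmin(np.abs(vals - outcome))]
    return np.outer(v, v.conj())

# After the z measurement collapses one arm to an eigenstate of sz,
# each subsequent x outcome has probability p(m) = <psi| P_m |psi>.
psi = np.array([1, 0], dtype=complex)   # the +z eigenstate
probs = {}
for m in (+1, -1):
    p = np.real(psi.conj() @ projector(sx, m) @ psi)
    probs[m] = p
    print(f"P(x = {m:+d} | z = +1) = {p:.2f}")   # 0.50 each
```

Any spin-1/2 eigenstate measured along an orthogonal axis gives 50/50; that is the experimental fact the first branch of the argument runs into.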

Mathematical reasons:

-Incorrect Hodge duality between pseudo-vectors and bivectors in a left-handed basis. In a right-handed basis a^b = I (axb) (Joy agrees with this). In a left-handed basis Joy incorrectly claims a^b = -I (axb); in fact the sign is still +. An easy way to see this: a change of handedness comes from a mirror reflection. Under a mirror reflection, I = e1^e2^e3 changes sign because it is a PSEUDO-scalar (Joy does this correctly), but (axb) changes sign as well (Joy forgets that axb is a PSEUDO-vector and treats it like a vector).
-On the FQXi website Joy now claims something different: that he is using left and right algebras instead of left and right handedness. To debunk this I spelled out all 4 combinations: left algebra-left handedness, left algebra-right handedness, right algebra-left handedness, right algebra-right handedness. Within each algebra the Hodge duality preserves its sign, and mixing algebras is inconsistent (it is like adding kets to bras, or row vectors to column vectors: "go directly to jail, do not pass Go, do not collect $200"). All associative algebras have left and right implementations (the name comes from the matrix formalism). Only in 3D is there handedness - a property of the cross product. Handedness is the sign of the pseudo-scalar I = e1^e2^e3 = e1e2e3, not of the bivector product B1B2B3; the sign of the bivector product gives you the left or right algebra.
-Any generalization of Joy's model in the Clifford algebra formalism breaks either the -a.b correlation or the zero average in each arm of the experiment.
-Joy takes a 0/0 limit, sin(epsilon)/sin(epsilon), and claims it equals zero because the numerator goes to zero; in fact the ratio is identically 1.
-Joy incorrectly computes a rotation using a bad rotor in geometric algebra. (The last two errors are used to fight Holman's analysis.)
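The Hodge duality sign can also be checked numerically. Below is my own sketch using the standard Pauli-matrix representation of Cl(3,0) (nothing here is Joy's notation): a^b = +I (axb) holds in a right-handed basis, and the + sign survives a mirror reflection, because the pseudo-scalar I and the components of the pseudo-vector axb flip sign together.

```python
import numpy as np

# Pauli-matrix representation of Cl(3,0): basis vectors e_k -> sigma_k,
# so the pseudoscalar e1 e2 e3 = i * Id.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def vec(a, basis):
    """Embed a component triple as the Clifford vector sum(a_k * basis_k)."""
    return sum(c * m for c, m in zip(a, basis))

def wedge(A, B):
    """Wedge (antisymmetric part of the geometric) product of two vectors."""
    return (A @ B - B @ A) / 2

rng = np.random.default_rng(0)
a, b = rng.normal(size=3), rng.normal(size=3)

# Right-handed basis: the duality reads a^b = +I (a x b).
E = [sx, sy, sz]
I_R = E[0] @ E[1] @ E[2]               # pseudoscalar = i * Id
lhs = wedge(vec(a, E), vec(b, E))
rhs = I_R @ vec(np.cross(a, b), E)
assert np.allclose(lhs, rhs)           # '+' sign holds

# Mirror everything: left-handed basis f_k = -e_k.  The pseudoscalar
# flips sign, but so do the COMPONENTS of the pseudo-vector a x b
# (the same physical vectors now have components -a, -b), so the
# duality keeps the '+' sign.
F = [-sx, -sy, -sz]
I_L = F[0] @ F[1] @ F[2]               # = -I_R
a2, b2 = -a, -b
lhs2 = wedge(vec(a2, F), vec(b2, F))
rhs2 = I_L @ vec(np.cross(a2, b2), F)
assert np.allclose(lhs2, rhs2)         # still '+', not '-'
print("a^b = +I (a x b) in both orientations")
```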

Computer simulation arguments:
-By now there are two independent simulations of Joy's model, both recovering the classical limit. One of the simulations was validated by obtaining the -cos correlation on other models.
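I cannot re-run those two simulations here, but the classical limit they recover is easy to reproduce with a generic local hidden-variable sketch of my own: each wing outputs a local +/-1 given a shared random unit vector, and the correlation comes out linear in the angle, -1 + 2*theta/pi, rather than the quantum -cos(theta).

```python
import numpy as np

rng = np.random.default_rng(42)

def correlation(theta, n=200_000):
    """Local hidden-variable model: lambda is a shared uniform random
    unit vector; each wing computes its own +/-1 outcome locally."""
    lam = rng.normal(size=(n, 3))
    lam /= np.linalg.norm(lam, axis=1, keepdims=True)
    a = np.array([0.0, 0.0, 1.0])
    b = np.array([np.sin(theta), 0.0, np.cos(theta)])
    A = np.sign(lam @ a)       # Alice's outcome
    B = -np.sign(lam @ b)      # Bob's outcome
    return (A * B).mean()

for theta in (0, np.pi / 4, np.pi / 2, np.pi):
    # empirical vs classical (-1 + 2*theta/pi) vs quantum (-cos theta)
    print(theta, correlation(theta), -1 + 2 * theta / np.pi, -np.cos(theta))
```

At theta = pi/4 this model gives about -0.5, versus the quantum -0.71; that gap is exactly what the Bell inequalities quantify.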

Sociological factors:
-I have never once received a mathematical argument from Joy. Instead he has used only lies, insults, fallacious arguments, and obfuscation of simple mathematical facts.
-naming the Hodge duality after himself – a major score on Baez’s crackpot index.
-His arXiv replies use a bullying tone which scared away critics. You want proof? Sure: the +1=-1 mistake from the wrong sign of the Hodge duality was almost found by the very first critic, and the tone of Joy's reply ("rectify this pedagogical error" - as if the first critic were an idiot) scared other people away from checking his math.
Frankly, I have no explanation for his behavior and his obstinate denial of obvious elementary mistakes except that he is doing a cover-up. But a cover-up is worse than the offense. If he could say now, "look, I made a sign mistake and did not treat axb as a pseudo-vector - I am only human", that would be one thing; publishing anything else on the arXiv denying the obvious mistakes can only be achieved by making further mistakes, and after that he will lose all his mathematical credibility. I plead with him to see reason and stop this self-destructive madness.
 
  • #102
Simple refutation of Joy Christian's simple refutation of Bell's simple theorem

Posted today by Richard Gill, of the Mathematical Institute:

http://arxiv.org/abs/1203.1504

Abstract:

"I point out a simple algebraic error in Joy Christian's refutation of Bell's theorem. In substituting the result of multiplying some derived bivectors with one another by consultation of their multiplication table, he confuses the generic vectors which he used to define the table, with other specific vectors having a special role in the paper, which had been introduced earlier. The result should be expressed in terms of the derived bivectors which indeed do follow this multiplication table. When correcting this calculation, the result is not the singlet correlation any more. Moreover, curiously, his normalized correlations are independent of the number of measurements and certainly do not require letting n converge to infinity. On the other hand his unnormalized or raw correlations are identically equal to -1, independently of the number of measurements too. Correctly computed, his standardized correlations are the bivectors - a . b - a x b, and they find their origin entirely in his normalization or standardization factors; the raw product moment correlations are all -1. I conclude that his research program has been set up around an elaborately hidden but trivial mistake. "

--------------------------------------------

It is interesting to add this note, addressed to those who suggest Jaynes is the only person who properly understands how probability applies to Bell's Theorem, entanglement, etc: Gill is also an expert in statistical theory, and has done extensive research in this area (including the application of Bayes). He apparently does not see the issue Jaynes does. Gill frequently collaborates with the top scientists in the study of entanglement, so I think it is safe to say this area has been well considered and has not been overlooked somehow.
 
  • #103


DrChinese said:
I conclude that his research program has been set up around an elaborately hidden but trivial mistake.
Puh, this is definitely not something you want to read in a serious paper addressing your work. ;-)
 
  • #104


kith said:
Puh, this is definitely not something you want to read in a serious paper addressing your work. ;-)

That would sting. I would say that Gill addressing this shows that top teams take challenges to Bell quite seriously. Gill has previously brought down at least one of the Hess-Philipp stochastic models.
 
  • #105


Told you so! :mad:
Delta Kilo said:
... and no-one actually bothered to look at the half-a-page of math to see the elephants lurking therein.

Well, let's look at eq (5). ...
 
  • #106


Hehe, what's funny is that when I found this paper on the arXiv yesterday, my first thought was: wow, DrChinese will find this funny.

On another note, I think the strong language at the end of the abstract suggests that some people in the community are starting to get annoyed by Joy Christian's continuing crusade against Bell. I guess he should maybe try to adopt a somewhat more humble attitude in the future (assuming he has one :-p )
 
  • #107


Delta Kilo said:
Told you so! :mad:

Ahead of the pack is a good place to be... :smile:
 
  • #108


Zarqon said:
Hehe, what's funny is that when I found this paper on the arXiv yesterday, my first thought was: wow, DrChinese will find this funny.

On another note, I think the strong language at the end of the abstract suggests that some people in the community are starting to get annoyed by Joy Christian's continuing crusade against Bell. I guess he should maybe try to adopt a somewhat more humble attitude in the future (assuming he has one :-p )

Heh, I'm so predictable...

Yes, I think the issue is: if someone (such as Christian) really has an angle on something, why not collaborate on it, rather than trying to upend something which has been thoroughly studied (Bell)? Every entanglement test shows the same pattern of impossibly high correlations, which again should be a tip-off. Some mathematical sleight of hand is not going to do it; there will need to be something very convincing - something like a new testable prediction.
 
  • #109


DrChinese said:
Posted today by Richard Gill, of the Mathematical Institute:

http://arxiv.org/abs/1203.1504

Abstract:

"I point out a simple algebraic error in Joy Christian's refutation of Bell's theorem. In substituting the result of multiplying some derived bivectors with one another by consultation of their multiplication table, he confuses the generic vectors which he used to define the table, with other specific vectors having a special role in the paper, which had been introduced earlier. The result should be expressed in terms of the derived bivectors which indeed do follow this multiplication table. When correcting this calculation, the result is not the singlet correlation any more. Moreover, curiously, his normalized correlations are independent of the number of measurements and certainly do not require letting n converge to infinity. On the other hand his unnormalized or raw correlations are identically equal to -1, independently of the number of measurements too. Correctly computed, his standardized correlations are the bivectors - a . b - a x b, and they find their origin entirely in his normalization or standardization factors; the raw product moment correlations are all -1. I conclude that his research program has been set up around an elaborately hidden but trivial mistake. "

--------------------------------------------

It is interesting to add this note, addressed to those who suggest Jaynes is the only person who properly understands how probability applies to Bell's Theorem, entanglement, etc: Gill is also an expert in statistical theory, and has done extensive research in this area (including the application of Bayes). He apparently does not see the issue Jaynes does. Gill frequently collaborates with the top scientists in the study of entanglement, so I think it is safe to say this area has been well considered and has not been overlooked somehow.
I thought at first that Christian might be on to something, because I intuited a connection between his approach and mine. But, after further consideration, imho, his stuff is just too mathematically circuitous to be considered. I've read his papers and his replies to various discussions, and in none of it is there a clear explanation of why his stuff should be considered a local realistic model of quantum entanglement.
 
  • #110


Richard Gill's refutation is not a new critique. It is essentially the same as one of the critiques advanced by a certain Florin Moldoveanu in the fall last year to which Joy Christian has already replied (http://arxiv.org/abs/1110.5876). It originates from a misunderstanding of Joy's framework which admittedly is not very easy to understand especially for those who have blinders of one kind or another.

Gill thinks Joy is using a convoluted, more difficult method to do a calculation and prefers a different method which ultimately leads him to a different result, not realizing/understanding that the calculation method Joy used is demanded by his framework. This is hardly a serious critique, not unlike his failed critique of Hess and Philipp. He should at least have read Joy's response to Moldoveanu, which he apparently did not, since he does not cite or mention it. It has been available since October 2011, one month after Moldoveanu posted his critique.

I remember Florin came here to boast about his critique and I pointed out his misunderstanding at the time in this thread: https://www.physicsforums.com/newreply.php?do=newreply&noquote=1&p=3806400

... you are missing the point because Joy Christian is not using handedness as a convention but as the hidden variable itself.
This is the same error Gill has made. See section (II) of Joy's response to Moldoveanu.
 
Last edited by a moderator:
  • #111
  • #112


bohm2 said:
More on this from Joy Christian and I don't understand any of it:

Refutation of Richard Gill's Argument Against my Disproof of Bell's Theorem
http://lanl.arxiv.org/pdf/1203.2529.pdf

Oh-ho, here we go again. No, Joy, measurement outcomes are not bivectors from the unit sphere; they are the numbers {-1, +1}. That's how they are defined in Bell's paper and that is also how they come out of experiments. Their mean is 0 and their standard deviation is 1. Not bivectors, just numbers.

Joy Christian said:
with \sigma(A) = (−I · \textbf{a} ) and \sigma(B) = (+I · \textbf{b} ), respectively, being the standard deviations in the results A and B.
I can't be bothered anymore, but if you substitute I and \textbf{a} from definitions elsewhere in his paper, you will get \sigma(\textbf{a})=\sum a_{j}\beta_{j} where a_{j} are coefficients of unit vector \textbf{a} and \beta_{j} are "basis bivectors". Brain ruptures at this point...
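For what it's worth, here is a minimal sketch of that point (my own code, using only the textbook singlet statistics P(A = B) = sin^2(theta/2)): ordinary +/-1 outcomes already have mean ~0, standard deviation ~1, and correlation -cos(theta), with no bivectors in sight.

```python
import numpy as np

rng = np.random.default_rng(7)

def singlet_outcomes(theta, n=100_000):
    """Sample +/-1 outcome pairs with the quantum singlet statistics:
    P(A = B) = sin^2(theta/2), where theta is the angle between analyzers."""
    A = rng.choice([-1, 1], size=n)
    same = rng.random(n) < np.sin(theta / 2) ** 2
    B = np.where(same, A, -A)
    return A, B

theta = np.pi / 3
A, B = singlet_outcomes(theta)
print("mean(A) ~", A.mean())        # ~ 0
print("std(A)  ~", A.std())         # ~ 1
print("E(A*B)  ~", (A * B).mean())  # ~ -cos(theta) = -0.5
```

The correlation E(A*B) = P(same) - P(different) = sin^2(theta/2) - cos^2(theta/2) = -cos(theta): plain arithmetic on plain numbers.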
 
  • #113


billschnieder said:
Richard Gill's refutation is not a new critique. It is essentially the same as one of the critiques advanced by a certain Florin Moldoveanu in the fall last year to which Joy Christian has already replied (http://arxiv.org/abs/1110.5876). It originates from a misunderstanding of Joy's framework which admittedly is not very easy to understand especially for those who have blinders of one kind or another.

Gill thinks Joy is using a convoluted, more difficult method to do a calculation and prefers a different method which ultimately leads him to a different result, not realizing/understanding that the calculation method Joy used is demanded by his framework. This is hardly a serious critique, not unlike his failed critique of Hess and Philipp. He should at least have read Joy's response to Moldoveanu, which he apparently did not, since he does not cite or mention it. It has been available since October 2011, one month after Moldoveanu posted his critique.

I remember Florin came here to boast about his critique and I pointed out his misunderstanding at the time in this thread: https://www.physicsforums.com/newreply.php?do=newreply&noquote=1&p=3806400 This is the same error Gill has made. See section (II) of Joy's response to Moldoveanu.

It's true that Moldoveanu had earlier seen the same error, in a sense ... but Joy's definitions have not remained constant over the years, so it's a moot point whether the error in one of the earlier, long accounts is the same error as in Joy's beautiful and simple one-page paper. Florin's focus was not the one-page paper, but the whole corpus of work at that point.

Joy and Bill Schnieder may find it legitimate, when one has freedom to make an arbitrary choice of "handedness", to make different and mutually contradictory choices at different locations in the same computation, but to my mind this is just license to get any result one likes by use of poetry.

Joy's one page paper and my refutation are exercises in simple algebra. I suggest that Bill Schnieder and others work through my algebra and through Joy's algebra, themselves.

The reference to Hess and Philipp is also amusing. Not many people actually read through all the details of Hess and Philipp's "counterexample" to Bell's theorem. Somewhere in the midst of it, a variable which had three indices suddenly had only two. This happens where a joint probability distribution is being factored into a marginal and the product of two conditionals. Because of the notational slip-up, the normalization factor was wrong. All rather sad.
 
Last edited by a moderator:
  • #114


DrChinese referred to Jaynes. Jaynes (1989) thought that Bell was incorrectly performing a routine factorization of joint probabilities into marginal and conditional. Apparently Jaynes did not understand that Bell was giving physical reasons (locality, realism) why it was reasonable to argue that two random variables should be conditionally *independent* given a third. When Jaynes presented his resolution of the Bell paradox at a conference, he was stunned when someone else gave a neat little proof, using Fourier analysis, that the singlet correlations could not be reproduced by a network of classical computers whose communication possibilities "copy" those of the traditional Bell-CHSH experiments. I have written about this in quant-ph/0301059. Jaynes is reputed to have said "I am going to have to think about this, but I think it is going to take 30 years before we understand Stephen Gull's results, just as it has taken 20 years before we understood Bell's" (the decisive understanding having been contributed by E.T. Jaynes).
 
  • #115


PS, Bill Schnieder thinks that I prefer a different route to get Joy Christian's result because it gives a different answer, but this means he has not read my paper carefully. I discovered a short route, and it appeared to give Joy's answer. I showed this proudly to Joy. He pointed out that I was making a mistake, there was a missing term. I went back and looked more closely at his longer route, and discovered that they both gave the same answer. With the missing term.
 
  • #116


Just curious. Doesn't the new PBR theorem reach the same conclusion as Bell's, making Joy Christian's refutation of Bell's theorem (even if it were conceivable) a moot point, at least with respect to arguing for a local realistic model:
Thus, prior to Bell’s theorem, the only open possibility for a local hidden variable theory was a psi-epistemic theory. Of course, Bell’s theorem rules out all local hidden variable theories, regardless of the status of the quantum state within them. Nevertheless, the PBR result now gives an arguably simpler route to the same conclusion by ruling out psi-epistemic theories, allowing us to infer nonlocality directly from EPR.
Quantum Times Article on the PBR Theorem
http://mattleifer.info/2012/02/26/quantum-times-article-on-the-pbr-theorem/

The quantum state cannot be interpreted statistically
http://lanl.arxiv.org/pdf/1111.3328v1.pdf
 
  • #117


gill1109 said:
DrChinese referred to Jaynes. Jaynes (1989) thought that Bell was incorrectly performing a routine factorization of joint probabilities into marginal and conditional. Apparently Jaynes did not understand that Bell was giving physical reasons (locality, realism) why it was reasonable to argue that two random variables should be conditionally *independent* given a third. When Jaynes presented his resolution of the Bell paradox at a conference, he was stunned when someone else gave a neat little proof, using Fourier analysis, that the singlet correlations could not be reproduced by a network of classical computers whose communication possibilities "copy" those of the traditional Bell-CHSH experiments. I have written about this in quant-ph/0301059. Jaynes is reputed to have said "I am going to have to think about this, but I think it is going to take 30 years before we understand Stephen Gull's results, just as it has taken 20 years before we understood Bell's" (the decisive understanding having been contributed by E.T. Jaynes).

Thanks so much for taking time to share this story. For those interested, here is the direct link to your paper:

http://arxiv.org/abs/quant-ph/0301059

I like your example of Luigi and the computers. I would recommend this paper to anyone who is interested in understanding the pros AND cons of various local realistic positions - and this is a pretty strong roundup!
 
  • #118


Thanks, Bohm2 and thanks DrChinese.

Regarding PBR: I have to admit I have not got the point of PBR. PBR argue that the quantum state is not statistical, but real. That argument depends on the definitions of those two words, "statistical" and "real". My own opinion about quantum foundations is summarized by the statements that (1) the real world is real, and its past is now fixed; (2) the future of the real world is random; (3) the quantum state is what you need to know about the past in order to determine the probability distribution of the future (so it's just as real as the real world, if you like, since the past real world is real and the probability distribution of the future is real too). This point of view is argued in http://arxiv.org/abs/0905.2723 which is actually just an attempt to explain the ideas I got from V.P. Belavkin. But you could also say that this is just a rigorous Copenhagen approach, in which we don't talk about things we don't need to, and in which we admit the necessity of defining quantum physics on a platform of naive classical physics.
 
  • #119


gill1109 said:
DrChinese referred to Jaynes. Jaynes (1989) thought that Bell was incorrectly performing a routine factorization of joint probabilities into marginal and conditional. Apparently Jaynes did not understand that Bell was giving physical reasons (locality, realism) why it was reasonable to argue that two random variables should be conditionally *independent* given a third. When Jaynes presented his resolution of the Bell paradox at a conference, he was stunned when someone else gave a neat little proof, using Fourier analysis, that the singlet correlations could not be reproduced by a network of classical computers whose communication possibilities "copy" those of the traditional Bell-CHSH experiments. I have written about this in quant-ph/0301059. Jaynes is reputed to have said "I am going to have to think about this, but I think it is going to take 30 years before we understand Stephen Gull's results, just as it has taken 20 years before we understood Bell's" (the decisive understanding having been contributed by E.T. Jaynes).
Thanks for giving your opinion on this matter which happens to be the discussion topic of a parallel thread:
https://www.physicsforums.com/showthread.php?t=581193
I can copy your comment there, but it would be nicer if you would do it yourself. :smile:
 
  • #120


harrylin said:
Thanks for giving your opinion on this matter which happens to be the discussion topic of a parallel thread:
https://www.physicsforums.com/showthread.php?t=581193
I can copy your comment there, but it would be nicer if you would do it yourself. :smile:

I copied my comment + reference over there, which has the effect of including the above.
 
  • #121


DrChinese said:
I copied my comment + reference over there, which has the effect of including the above.
Looking at the time stamp, we had the same idea at the same time. :-p
 
  • #122


Delta Kilo said:
Oh-ho, here we go again. No, Joy, measurement outcomes are not bivectors from the unit sphere; they are the numbers {-1, +1}. That's how they are defined in Bell's paper and that is also how they come out of experiments. Their mean is 0 and their standard deviation is 1. Not bivectors, just numbers.

I can't be bothered anymore, but if you substitute I and \textbf{a} from definitions elsewhere in his paper, you will get \sigma(\textbf{a})=\sum a_{j}\beta_{j} where a_{j} are coefficients of unit vector \textbf{a} and \beta_{j} are "basis bivectors". Brain ruptures at this point...

So that pretty much destroys Joy's response to the argument against his original paper?
 
  • #123


Joy Christian has now responded to Richard Gill's purported refutation:

http://arxiv.org/abs/1203.2529

I identify a number of errors in Richard Gill's purported refutation of my disproof of Bell's theorem. In particular, I point out that his central argument is based, not only on a rather trivial misreading of my counterexample to Bell's theorem, but also on a simple oversight of a freedom of choice in the orientation of a Clifford algebra. What is innovative and original in my counterexample is thus mistaken for an error, at the expense of the professed universality and generality of Bell's theorem.
 
  • #124


Thanks, Bill Schnieder. Joy has changed his postulates to patch the error. The new postulates are mutually contradictory. So first there was a model and a mistake, now there's no mistake but no model either. Vanished in a puff of smoke.
 
  • #125


I posted that paper in this thread above, but I gave up trying to understand the debate - a very long and not-too-friendly one that can be followed more fully in this FQXi blog:

Disproofs of disproofs of disproofs of disproofs...
http://www.fqxi.org/community/forum/topic/1247
 
  • #126


bohm2 said:
Just curious. Doesn't the new PBR theorem reach the same conclusion as Bell's, making Joy Christian's refutation of Bell's theorem (even if it were conceivable) a moot point, at least with respect to arguing for a local realistic model:

Thus, prior to Bell’s theorem, the only open possibility for a local hidden variable theory was a psi-epistemic theory. Of course, Bell’s theorem rules out all local hidden variable theories, regardless of the status of the quantum state within them. Nevertheless, the PBR result now gives an arguably simpler route to the same conclusion by ruling out psi-epistemic theories, allowing us to infer nonlocality directly from EPR.

Quantum Times Article on the PBR Theorem
http://mattleifer.info/2012/02/26/quantum-times-article-on-the-pbr-theorem/

The quantum state cannot be interpreted statistically
http://lanl.arxiv.org/pdf/1111.3328v1.pdf

PBR places strong constraints on psi-epistemic interpretations rather than ruling them out.
 
Last edited:
  • #127


yoda jedi said:
PBR places strong constraints on psi-epistemic interpretations rather than ruling them out.
My question really wasn't about that point. Joy Christian's preservation of local realism relies on a refutation of Bell's theorem. Even if that could be done, my question was whether non-locality can be inferred directly via PBR, without Bell's theorem. Matt Leifer answered in a post on his blog:

Question by poster:
Hi Matt, Do you still believe that PBR directly implies non-locality, without Bell’s as I think you argued in a section of Quantum Times article?
“It (PBR) provides a simple proof of many other known theorems, and it supercharges the EPR argument, converting it into a rigorous proof of nonlocality that has the same status as Bell’s theorem. ”
Matt's reply:
Yes, but this requires the factorization assumption used by PBR. At the time of writing, I was hopeful that we could prove the PBR theorem without factorization, but now I know that this is not possible. Therefore, the standard Bell-inequality arguments are still preferable as they involve one less assumption.
Quantum Times Article on the PBR Theorem
http://mattleifer.info/2012/02/26/q...-the-pbr-theorem/comment-page-1/#comment-2877
 
  • #128
Last edited:
  • #129


yoda jedi said:
I understand - in the same manner as your question about a "loophole-free demonstration of nonlocality".

Exactly.
 
  • #130


My 2-form's worth on the subject: it's been well known since the work of Philippe Eberhard and Arthur Fine in the 70s and 80s that a model obeys Bell's inequalities if and only if it is equivalent to a "local hidden variable theory". The term "local hidden variable theory" has a precise mathematical definition: it doesn't simply mean any model with quantities unknown to the observer that determine all outcomes; the quantities have to have enough structure that probabilities can be calculated via the machinery of formal probability theory (sigma algebras, etc.). Some of the people who have commented in this topic seem to have a vague grasp of this when they describe Joy Christian's example as "unrealistic", but they are missing the point: Fine and Eberhard showed that any such example is "unrealistic" in this sense. What Christian and others show is that such "unrealistic" behaviour is not really unrealistic, and that it manifests in some fairly simple models of EPR experiments, like the one Christian manages to present in a single page.
 
  • #131


Unfortunately Christian's single page contains a glaring error in the algebra, as well as being conceptually completely misguided, see http://arxiv.org/abs/1203.1504

On the other hand, no technically advanced parts of formal probability theory are needed to derive Bell inequalities from a local hidden variables model. No sigma algebras or whatever. They follow from absolutely elementary logical reasoning, elementary arithmetic, and elementary (counting) probability; see for instance http://arxiv.org/abs/1207.5103
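Indeed, the counting argument fits in a few lines. A sketch of my own (not taken from either paper): a local deterministic strategy pre-assigns all four outcomes A(a), A(a'), B(b), B(b') in {-1,+1}, and brute force over the 16 possibilities bounds the CHSH combination by 2; averaging over hidden variables cannot do better than the deterministic extremes.

```python
from itertools import product

# CHSH: S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
# A local deterministic strategy fixes A(a), A(a'), B(b), B(b')
# in {-1,+1} in advance; there are only 2^4 = 16 such strategies,
# and hidden-variable mixtures are averages of them.
best = max(
    A1 * B1 + A1 * B2 + A2 * B1 - A2 * B2
    for A1, A2, B1, B2 in product([-1, 1], repeat=4)
)
print("max |S| over local strategies =", best)  # 2 (quantum mechanics reaches 2*sqrt(2))
```

The algebra behind the bound: S = A1(B1 + B2) + A2(B1 - B2), and one of (B1 + B2), (B1 - B2) is always 0 while the other is ±2.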
 
  • #132


I'll take a look at the links you provide. The derivation of Bell's inequality doesn't need any knowledge of sigma algebras etc., but standard probability structure is implicit when one takes the various averages that produce the inequality and assumes they are meaningful. If you look at Fine's work in the 80s, the key to what's going on is that local hidden variable theories always have well-defined joint probability distributions for pairs of variables which really don't have them according to QM. Taking Eberhard's work into consideration, local hidden variable theories have this problem because they implicitly assume counterfactual definiteness.

Although a lot of people have an issue with denying counterfactual definiteness, they shouldn't: it's something that fails even in mundane examples, such as asking what's in your fist when your palm is flat, or what's on your lap when you are standing. ("Fist contents" and "flat palm orientation" are incompatible observables and do not have well-defined joint probabilities.) In layman's terms, what goes wrong in hidden variable theories is that they insist you can talk about having something in your fist at the same time as your palm is flat, and as a result they can never produce the correct correlations.

Now maybe Joy Christian's example is erroneous (I will check the refutation), but there have been many others who have come up with correct examples of a similar kind - and all they are really doing is constructing something which, loosely speaking, amounts to the "fist-palm" example, using some sort of non-distributive lattice or tensor algebra. The reason such examples are valuable is that they show people they have to be careful about jumping to conclusions about non-locality or the denial of philosophical realism simply because a standard hidden variable theory cannot produce the QM correlations. They also show that there are mundane cases where you can't treat a pair of variables as having a well-defined joint probability distribution (or, without going into such detail, where you can't just average stuff and assume the average makes sense).

The main problem with Joy Christian's work is his attitude - the very title "Disproof of Bell's Theorem" is troublesome/cranky, because it isn't disproving Bell or finding an error in Bell's reasoning; it's merely trying to come up with an example of something we've basically known about since the 70s.
 
  • #133


At the heart of the debate over Bell, local realism, and such is an implicit assumption that events originate with emission and end with measurement. More specifically, the applicability of Bell's theorem to nature, and certain conclusions about CHSH experiments, depend critically on the assumption that entanglement involves the coincident emission of two particles that can then be identified, and thus have their entanglement tested, by a coincident detection. This assumption may not be correct.

I've shown that direct particle-to-particle interactions, or relationships, that share information at the speed of light for a non-zero duration are adequate to generate "quantum" results in CHSH experiments.

https://docs.google.com/open?id=0BxBGJRkQXyjweXR2R3ExTlEyNm8
 
Last edited:
  • #134
  • #135


Ok, I read and thoroughly enjoyed Gill's refutation paper but have still to read Christian's rebuttal. I have to say that at this point I don't actually understand Christian's calculations; my initial understanding of the notation, which seemed to confirm his result, appears to be wrong. (Another regret I'm having: back in the day I remember thinking "quaternions? I'll never need this, I can always look them up one day if I do" ... well, the day finally came :D )

Regarding the concluding discussion in Gill's paper, "There is no limitation in Bell's theorem on the space in which the hidden variables live." Well, yes, true, as long as we recognize the subsequent bit about realism and locality, in particular realism. Gill says: ""Realism", however it is defined, comes down, effectively, to the mathematical existence of the outcomes of unperformed experiments, alongside of those which were actually performed." Well, local models that do violate Bell's Inequality work precisely by encoding the lack of well-defined outcomes of unperformed experiments, thus preventing the type of averaging or counting that is needed in the derivation of Bell's Theorem. I initially assumed that Christian was trying the same sort of thing, but the algebra was over my head.

What for me is an important point is that such models are not really "non-realist" in the true philosophical sense. Although Bell's "realism" seems sensible on the surface, it isn't sensible at all, because it amounts to saying that we can talk sensibly about a particle being in a position eigenstate which we didn't measure and don't know, at the same time that we did measure and do know that it is in a particular momentum eigenstate - and that is just plain nonsense. That it is nonsense is just a consequence of how Fourier transforms work (or more generally how changes of eigenbasis work), and there is nothing philosophically non-realist about it - so it is very unfortunate that Bell and others dubbed this "realism". The failure of this kind of realism is no worse than failing to have a lap when standing up; I'm not a spooky subjective entity because my lap disappeared when I stood up, but the fact that Bell and others seem to imply that it is (or alternatively try to imply that faster-than-light signalling exists) is a bit of crankiness on their part, and it's precisely what sets off the anti-Bell cranks.
 
  • #136


BTW, the sort of stuff I'm talking about when I speak of local models that violate Bell's Theorem, I mean the sort of things that Rovelli, Omnes, Hartle etc. have come up with. They all reject non-local communication but have models consistent with ordinary QM correlations rather than with Bell's Theorem, and they manage this by failing to be "realist" in Bell's narrow sense while nevertheless remaining "realist" in a philosophical sense.
 
  • #137


Mathematech said:
BTW, the sort of stuff I'm talking about when I speak of local models that violate Bell's Theorem, I mean the sort of things that Rovelli, Omnes, Hartle etc. have come up with. They all reject non-local communication but have models consistent with ordinary QM correlations rather than with Bell's Theorem, and they manage this by failing to be "realist" in Bell's narrow sense while nevertheless remaining "realist" in a philosophical sense.

Thanks for this! Is there a good source for a precise definition of Bell's meaning of realism, as well as "consensus" definitions?
 
  • #138


Off hand I can't think of any good sources, although I have read many confusing ones. When it comes to Bell's Theorem, "realism" means counterfactual definiteness. Counterfactual definiteness is typically poorly explained in texts, often with some statement along the lines that if a different experiment had been performed (e.g. position measured instead of momentum) then it would have produced a definite result. But that isn't quite what counterfactual definiteness is about, as all interpretations of QM agree that if you do an experiment you get a definite result; even raw Copenhagen says that. What counterfactual definiteness really says is that such a counterfactual outcome is still statistically meaningful even at a time when I have performed a different, quantum mechanically incompatible, experiment.

To give a real-world non-QM analogy, assuming counterfactual definiteness in a QM situation amounts to counting how many times your fist contained some unspecified item, according to a guess, without having checked it, but instead having seen to the contrary that your palm was open and that your hand wasn't even clenched in a fist at all - such counts are obviously meaningless nonsense. Similarly, refinements of Copenhagen QM avoid Bell's Theorem without the need for non-locality by considering the calculations in the derivation of Bell's Theorem to be meaningless sums.
 
  • #139


The point is that counterfactual definiteness was never a problem in physics until QM came along. Secondly, the perfect anticorrelations predicted by the singlet state make it a very natural assumption: when you measure the two particles in the same way you get equal and opposite results - hard to imagine except by supposing the measurement outcomes for the different settings are already "fixed" for the two particles at the source (cf. the EPR argument for "elements of reality").

Finally, the fact remains that it is impossible to generate violations of Bell inequalities in a rigorously regulated experiment by non-quantum means. (Rigorously regulated means: no post-selection, no non-detections, random settings, and proper space-time separation so that Alice's measurement is finished before Bob's setting could become available and vice versa.) When the definitive experiment is done in a year or two (several experimental groups are getting very close) we'll know for sure that nature - quantum reality - is non-classical. Nature is not deterministic but irreducibly stochastic.
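The gap between local models and the quantum prediction being discussed here can be made concrete with a short simulation. This is my own illustrative sketch, not code from any actual experiment: the deterministic sign-based hidden variable model and the particular angle settings are assumptions chosen for illustration. Any such local model keeps the CHSH quantity at or below 2, while the singlet-state formula E(a, b) = -cos(a - b) reaches 2√2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative detector settings (radians); this particular choice
# maximizes the quantum CHSH value for the singlet state.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

def E_local(x, y, lam):
    # Toy deterministic local model: each side's +/-1 outcome depends
    # only on its own setting and the shared hidden angle lam; Bob's
    # side flips sign to mimic the singlet anticorrelation.
    A = np.sign(np.cos(x - lam))
    B = -np.sign(np.cos(y - lam))
    return np.mean(A * B)

def E_quantum(x, y):
    # Singlet-state prediction for the correlation: -cos(x - y)
    return -np.cos(x - y)

# Hidden variable shared at the source, uniformly distributed
lam = rng.uniform(0.0, 2 * np.pi, 200_000)

S_local = (E_local(a, b, lam) - E_local(a, b2, lam)
           + E_local(a2, b, lam) + E_local(a2, b2, lam))
S_quantum = (E_quantum(a, b) - E_quantum(a, b2)
             + E_quantum(a2, b) + E_quantum(a2, b2))

print(abs(S_local))    # stays at/below 2 (the CHSH bound), up to sampling noise
print(abs(S_quantum))  # 2*sqrt(2) ~ 2.828 (the Tsirelson bound)
```

Swapping in any other deterministic outcome functions for A and B leaves |S_local| bounded by 2; only the quantum prediction exceeds it.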
 
  • #140


Mathematech said:
Although a lot of people have an issue with denying counterfactual definiteness, they shouldn't: it's something that fails even in mundane examples, such as asking what's in your fist when your palm is flat, or what's on your lap when you are standing. ("Fist contents" and "flat palm orientation" are incompatible observables and do not have well-defined joint probabilities.)
Joint probabilities are irrelevant: "fist contents", as you've defined it, is ill-defined all on its own.
 
  • #141


The thing is, we are dealing with superpositions of correlated pairs of states, and counting outcomes of measurements on these which cannot all simultaneously be factual, due to incompatibilities of the observables involved, does not produce the correct statistics. The so-called "non-realist" approaches feel that this is sufficiently explained by the fact that there is no reason such calculations should produce meaningful values or match the results obtained by the standard Hilbert space formalism.

This to some extent relates to the logical analysis of loaded statements like "Do you still beat your wife? Write +1 if you do and -1 if you don't." What's the answer? Well, if you never beat your wife in the first place, or don't even have a wife, the answer cannot be said to be either +1 or -1. From the point of view of the so-called non-realists, the derivation of Bell's Theorem represents the answer as an unspecified but nevertheless definite quantity x and then concludes something like: even if we don't know the value of x, we do know we must have |x| = 1.
 
  • #142


"Realism" in this context should be called "idealism", because it asserts the equal realness of the outcome of the measurement with the setting which you did not use, alongside the outcome of the same measurement with the setting which you did actually use. OK, so you can scoff at this. The point is, in all of classical physics this would not have been problematic: the mathematical models of classical physics allow you to replace an actual setting with a different (counterfactual) setting and still read off an outcome. Bell shows us that quantum reality cannot be modeled in a classical way.

Most Bell-deniers deny this.

"Non-realist" interpretations of quantum mechanics don't resolve these issues; they simply refuse to discuss them. The fact that QM is intrinsically different from any physics which went before is swept under a carpet of verbiage. The fact that QM predicts real-world phenomena which are impossible under any classical-like physics is likewise hidden from view. In other words, such interpretations are merely a comfort blanket.
 
  • #143


Mathematech said:
This to some extent relates to the logical analysis of loaded statements like "Do you still beat your wife? Write +1 if you do and -1 if you don't." What's the answer?
The analogy can be used, but not the way you are setting it up.

I can set a spin-about-Z-measuring-device up in front of any particle. I can't set up a spousal-abuse-measuring-device in front of an unmarried man. The relationship between spin-about-Z and spin-about-X is of a fundamentally different type than the relationship between having a spouse and whether you abuse her.

The analogy can be used, though, in regards to "What spin about Z did you measure?" versus "Did you use a spin-about-Z-measuring device?"
 
  • #144


Indeed QM cannot be modeled in a classical way, but we need to be sure we understand what we mean when we say "classical way".

I don't think the "non-realist" approaches can be dismissed as "refusing to discuss". When your particle is in an x-axis spin eigenstate, it mathematically is not in a y-axis eigenstate; this is straightforward mathematics. Recognizing that it is meaningless to speak of the y-axis eigenstate of a particle you know to be in an x-eigenstate is not a refusal to discuss, it's being sensible. Similarly, recognizing that you don't have a fist when your palm is open is not a refusal to discuss fists. Indeed, insisting that one can talk of simultaneous x- and y-axis eigenstates is what is cranky, but ultimately this is what the so-called "realist" interpretations are doing.
 
  • #145


Mathematech said:
When your particle is in an x-axis spin eigenstate, it mathematically is not in a y-axis eigenstate; this is straightforward mathematics. Recognizing that it is meaningless to speak of the y-axis eigenstate of a particle you know to be in an x-eigenstate is not a refusal to discuss, it's being sensible.
I can feed such a particle through a spin-about-y-measuring-device, and it will give me an answer.

The realist philosophy (as used in this context) is that the measuring device is measuring some quality the particle actually has, and so there is some aspect of the "true" physical state of the particle that would determine the result of measurement -- or if we accept non-determinism, would determine a probability distribution on the outcomes.

The description of the particle as being in an x-axis eigenstate is actually sufficient for this task, as it tells us of the 50-50 distribution on the outcomes of the y measurement.
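The 50-50 claim above is easy to verify with the Born rule. Here is a minimal numpy sketch (my own illustration, not anything from the thread): build the +1 eigenstate of the Pauli x operator, then compute the probabilities of the two sigma_y outcomes.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

# |x+>: the +1 eigenstate of sigma_x.
# np.linalg.eigh returns eigenvalues in ascending order, so the
# +1 eigenvector is the last column.
_, vx = np.linalg.eigh(sx)
x_plus = vx[:, 1]

# Eigenstates of sigma_y (columns of vy)
_, vy = np.linalg.eigh(sy)

# Born rule: P(outcome) = |<y_eigenstate | x+>|^2 for each outcome
probs = np.abs(vy.conj().T @ x_plus) ** 2
print(probs)  # [0.5, 0.5]
```

So the x-eigenstate description, by itself, already fixes the statistics of a y measurement, even though it assigns the particle no definite y-spin value.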
 
  • #146


Another point I'd like to make: although my last few posts have been defending the "non-realist" explanations of why QM produces different results to a local hidden variable theory, if one delves into the philosophy there are views which do not see failure of counterfactual definiteness and non-locality as two different explanations, but which consider the possibility that the notions are intimately related, and that QM escapes Bell by virtue of both counterfactual definiteness and certain notions of locality failing - that an x-spin measurement of particle A results in a non-local influence on particle B which puts it in a superposition of y-spin states, making it meaningless to speak of its y spin if that isn't measured.
 
  • #147


One could argue, in fact, that the very tensor product Hilbert space formalism used for two entangled particles itself implies failures of both counterfactual definiteness and locality - the Hilbert space formalism alone implying failure of the former and the tensor product implying failure of the latter ... but this is a subject for volumes of books, not something that can be explained in a forum :)
 
  • #148


Back to Joy Christian's paper - I'm reading the rebuttal to Gill, but I am at a complete loss to understand what he is on about in his "A fallacy of misplaced concreteness" section when he goes on about statistical vs algebraic variables.
 
  • #149


Mathematech said:
Back to Joy Christian's paper - I'm reading the rebuttal to Gill, but I am at a complete loss to understand what he is on about in his "A fallacy of misplaced concreteness" section when he goes on about statistical vs algebraic variables.

Join the club...
 
  • #150


Realism does not insist on simultaneous x and y-axis eigenstates. Realism asks the sensible question: is the statistical nature of quantum mechanical predictions merely the reflection of statistical variation in presently unknown variables at a deeper level of physical description?

Answer: no! The statistical nature of QM is intrinsic; it's for real. In fact there's a nice theorem that violation of Bell inequalities together with locality implies that nature must be non-deterministic. It's because of intrinsic indeterminism that QM allows observable phenomena to exist which would be impossible in a classical, deterministic, locality-obeying universe.
 
