Is action at a distance possible as envisaged by the EPR Paradox?

  • #601
ThomasT said:
1. Say Alice and Bob are counter-propagating sinusoidal (light) waves that share a cloned property, e.g., they're identically polarized. Analyze this cloned property with crossed polarizers and you get entanglement correlation, Cos^2 |a-b| in the ideal. It's just optics. Not that optics isn't somewhat mysterious in its own right. But we can at least understand that the entanglement stats so produced don't have to be due to Alice and Bob communicating with each other, or that nonseparability means that Alice and Bob are the same thing in the sense that they're actually physically connected when they reach the polarizers.

2. Bell didn't address this case, because it's precluded by the EPR requirement that lhv models of entanglement be expressed in terms of parameters that determine individual results.

3. On the other hand, since a local realistic computer simulation of an entanglement preparation is not the same as a local realistic formal model (in the EPR sense), then it wouldn't be at all surprising if such a simulation could reproduce the observed experimental results, and violate a BI appropriate to the situation being simulated -- and this wouldn't contradict Bell's result, but, rather, affirm it in a way analogous to the way real experiments have affirmed Bell's result.

1. I have news for you: this is patently FALSE. If you take 2 identically polarized photons and run them through the polarizers as you describe here, you do NOT get Cos^2 |a-b| or anything close to it. You ONLY get this for ENTANGLED photons. In other words: in the case where your assumption is actually valid - and I do mean identical and identically polarized photons coming out of a PDC crystal - you do NOT get entangled state statistics. You ONLY get those when the output is in a superposition of states. (Whether you get one or the other is a decision that the experimenter can make by altering the setup slightly.) A brief numerical sketch at the end of this post illustrates the difference.

2. Bell did discuss the case where the correlations are due to anti-symmetric considerations.

3. I would like to see one (and yes, it would surprise me). This is a somewhat complex subject and I am currently working with the De Raedt team (and another independent theoretical physicist) regarding some concerns I have expressed about their model. Their model does have some very interesting features. If it were possible to suitably express such a simulation, I think it might require some additional experimental analysis. It would not affect Bell's Theorem.
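
Regarding point 1, here is a minimal Monte Carlo sketch (Python) of the two cases; the shared hidden polarization, the independent Malus-law detection rule, and the ideal cos^2 formula for the entangled case are illustrative modeling assumptions, not a description of any particular experiment:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def product_state_match(a_deg, b_deg):
    """Identically polarized pairs: both photons share one random polarization,
    and each passes its own polarizer independently with Malus probability
    cos^2(setting - polarization)."""
    lam = rng.uniform(0.0, 180.0, N)                      # shared ("cloned") polarization
    pass_a = rng.random(N) < np.cos(np.radians(a_deg - lam)) ** 2
    pass_b = rng.random(N) < np.cos(np.radians(b_deg - lam)) ** 2
    return np.mean(pass_a == pass_b)                      # fraction of matching outcomes

def entangled_match(a_deg, b_deg):
    """Ideal QM match rate for polarization-entangled pairs."""
    return np.cos(np.radians(a_deg - b_deg)) ** 2

for a, b in [(0.0, 0.0), (0.0, 22.5), (0.0, 45.0), (0.0, 90.0)]:
    print(f"|a-b| = {abs(a - b):4.1f} deg   identically polarized ~{product_state_match(a, b):.3f}"
          f"   entangled cos^2 = {entangled_match(a, b):.3f}")
```

Under these assumptions the identically polarized pairs match about 75% of the time at 0 degrees and about 68% at 22.5 degrees, well short of the cos^2 |a-b| values, which is the distinction point 1 is drawing.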
 
  • #602
DrChinese said:
It does matter IF you won't get the QM predicted values, which you won't.

All I ask is that the percentage of matches between 0/22.5 match the percentage from 22.5/45 and that the 0/45 matches are 50%. Your model does not do this for any dataset with more than about 20 items. You should acknowledge that is so. I assume you now understand why such a dataset is not possible.

And all I ask is that if you and I both have 0 self momentum, we must not have any momentum relative to each other. Even with plain old Galilean Relativity, it's not a very reasonable thing to ask, is it?

And all I ask is that if our relative velocity is 30 km/hour, and we both increase our velocity by 30 km/hour, our relative velocity must remain unchanged. Not a very reasonable thing to ask, is it?

Yet I do get QM predicted values, if I'm allowed to define ANY angle as 0, just like any inertial observer can describe their velocity as 0. If I change my definition of my velocity by X, it does not mean it changes my measurement of your velocity by X. Same with linear changes in relative polarizer angles.

So why demand even stricter linearity in measurables than what even Galilean Relativity supports? Fundamentally, all EPR correlations measure is how many of the 50% of photons a polarizer detects overlap with different polarizer settings. Yet, counterfactually, it is being presumed that the same subset of photons, with a common detection overlap, is involved with 2 different detector settings.
 
  • #603
my_wan said:
And all I ask is that if you and I both have 0 self momentum, we must not have any momentum relative to each other. Even with plain old Galilean Relativity, it's not a very reasonable thing to ask, is it?

And all I ask is that if our relative velocity is 30 km/hour, and we both increase our velocity by 30 km/hour, our relative velocity must remain unchanged. Not a very reasonable thing to ask, is it?

Yet I do get QM predicted values, if I'm allowed to define ANY angle as 0, just like any inertial observer can describe their velocity as 0. If I change my definition of my velocity by X, it does not mean it changes my measurement of your velocity by X. Same with linear changes in relative polarizer angles.

So why demand even stricter linearity in measurables than what even Galilean Relativity supports? Fundamentally, all EPR correlations measure is how many of the 50% of photons a polarizer detects overlap with different polarizer settings. Yet, counterfactually, it is being presumed that the same subset of photons, with a common detection overlap, is involved with 2 different detector settings.

I truly have no idea what you are talking about. I am discussing polarization, not velocity or relativity. If you generated a realistic dataset that works like real QM does, then simply show it. It is easy for me to request this since I KNOW you don't have it.

And why do you talk about the 50% detected? There is no 50%! They are ALL detected! The relevant issue is subensembles of the entire universe.
 
  • #604
DrChinese said:
I truly have no idea what you are talking about. I am discussing polarization, not velocity or relativity. If you generated a realistic dataset that works like real QM does, then simply show it. It is easy for me to request this since I KNOW you don't have it.

And why do you talk about the 50% detected? There is no 50%! They are ALL detected! The relevant issue is subensembles of the entire universe.

So what you are saying here is that a given polarizer setting passes all photons at that polarization? No, a given polarization setting passes 50% of a randomly polarized beam of light. A polarization setting at 90 degrees to that will pass exactly the other 50%. Thus any setting between those 2 must pass some of the photons from the 0 setting, and some from the 90 setting. This is true with or without classical mechanisms. See a visual here:
http://www.lon-capa.org/~mmp/kap24/polarizers/Polarizer.htm

The only thing to explain, per Bell's ansatz, is why the transition between 0 and 90 is not, counterfactually, linear with changes in the angle. The exact same paradox exists, in that polarizer applet above, when you add a second inline polarizer and notice that, with arbitrary offsets from the first polarizer, the percentage of the photons passing both polarizers does NOT fall off linearly between 0 and 90. EXACTLY the same non-linearity seen in EPR correlations, without ANY correlated photons. Yet EPR correlations indicate this is deterministically replicable if the photons are exactly (anti)correlated. But it is a LOCAL non-linearity producing it, exactly as seen in that applet.

Unlike the restrictions of Bell's realism, I include 'LOCAL' QM effects as valid effects to explain this non-linearity with. If nature is defined as a 'real' pure field, how can you expect properties to be linear representations of parts? If the relevant issue is "subensembles", we are likely not dealing with a finite Universe, even on the microscopic scale. But that is, in itself, not a violation of realism. Einstein did build GR as a causally connected field theory, however problematic that is for quantization.
 
  • #605
my_wan said:
So what you are saying here is that a given polarizer setting passes all photons at that polarization? No, a given polarization setting passes 50% of a randomly polarized beam of light. A polarization setting at 90 degrees to that will pass exactly the other 50%. Thus any setting between those 2 must pass some of the photons from the 0 setting, and some from the 90 setting. This is true with or without classical mechanisms. See a visual here:
http://www.lon-capa.org/~mmp/kap24/polarizers/Polarizer.htm

I keep trying to tell you that this is NOT how most real experiments are performed. Polarizing beam splitters are used. 100% of the light emerges, and it goes one way or another. That way, there is no question that there is a match.

You are creating artificial confusion by talking about the counterfactual "overlapping" or whatever it is. In fact, Alice and Bob are always counted (ideal case of course).
 
  • #606
my_wan said:
The exact same paradox exists, in that polarizer applet above, when you add a second inline polarizer and notice that, with arbitrary offsets from the first polarizer, the percentage of the photons passing both polarizers does NOT fall off linearly between 0 and 90. EXACTLY the same non-linearity seen in EPR correlations, without ANY correlated photons. Yet EPR correlations indicate this is deterministically replicable if the photons are exactly (anti)correlated. But it is a LOCAL non-linearity producing it, exactly as seen in that applet.

You know, that is an interesting similarity. But it actually has nothing directly to do with Bell test correlations. Those are obtained by a different technique, and yes, there is an underlying mathematical relationship connecting them. But that is where the connection ends.

If you can formulate a non-local connection between 2 polarizers in series, go for it. But that analogy does not apply to Bell tests. In fact, I am sure that there probably IS a connection at some deep level as you suggest. After all, the Heisenberg Uncertainty Principle is at work in both cases so that is to be expected. In my opinion, the same quantum non-locality is at work whenever the HUP is invoked. But everyone may not agree with that opinion.

However, that does not change the fact that it is the ENTANGLED connection which is of interest in Bell tests. It is that paradox which is at hand, and which is the subject of EPR.
 
  • #607
Note: Experimental constraints are too high for me to worry about anything but the ideal case.
DrChinese said:
And why do you talk about the 50% detected? There is no 50%! They are ALL detected! The relevant issue is subensembles of the entire universe.
This is a single polarizer with a photon detector:
[Attachment: 1x50.jpg]

Now when we turn a second polarizer to 22.5 degrees, relative to that one, we get:
[Attachment: 2x22.5.jpg]

Given that only 50% of the original beam hits the second polarizer, it's passing 85.36% of the polarized light hitting it. This is precisely the percentage of EPR correlations at that same angle offset. It also matches at EVERY arbitrary offset. I take this to empirically mean that 2 polarizers, with a 22.5 degree offset, will counterfactually detect 85.36% of the same individual photons.
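
For reference, the 85.36% figure is just Malus's law in the ideal case (assuming perfect polarizers):

$$
\cos^2(22.5^\circ)\approx 0.8536,\qquad 0.50\times\cos^2(22.5^\circ)\approx 0.427,
$$

i.e. about 85.36% of the light reaching the second polarizer is transmitted, which is roughly 42.7% of the original unpolarized beam.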

DrChinese said:
You know, that is an interesting similarity. But it actually has nothing directly to do with Bell test correlations. Those are obtained by a different technique, and yes, there is an underlying mathematical relationship connecting them. But that is where the connection ends.
So a point for point, angle for angle exact match is no connection? Let's look at what Bell's ansatz operationally assumed: that the correlations of any local realistic EPR mechanism must linearly transition from 50% to 100% max. Hence the 75% Bell limit on correlations at 22.5 degrees. But if a beam of polarized light does NOT transition linearly from 0 to 90 degrees, how can you possibly expect (presumed deterministic) correlations to?
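
A quick numerical check of the two numbers in that comparison (a sketch: the cos^2 line is the ideal QM match rate, and the straight line is the 100%-at-0, 50%-at-45 boundary quoted in this discussion):

```python
import math

def qm_match(delta_deg):
    """Ideal QM match rate for polarization-entangled pairs at relative angle delta."""
    return math.cos(math.radians(delta_deg)) ** 2

def linear_boundary(delta_deg):
    """Straight line from 100% at 0 degrees to 50% at 45 degrees (and 0% at 90),
    the local-realistic boundary quoted in this discussion."""
    return 1.0 - 0.5 * (delta_deg / 45.0)

for d in (0.0, 22.5, 45.0, 67.5, 90.0):
    print(f"{d:5.1f} deg   cos^2: {qm_match(d):.4f}   linear boundary: {linear_boundary(d):.4f}")
```

At 22.5 degrees this prints 0.8536 against 0.7500, which are the two figures at issue.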

DrChinese said:
If you can formulate a non-local connection between 2 polarizers in series, go for it. But that analogy does not apply to Bell tests. In fact, I am sure that there probably IS a connection at some deep level as you suggest. After all, the Heisenberg uncertainty principle is at work in both cases so that is to be expected. In my opinion, the same quantum non-locality is at work whenever the HUP is invoked. But everyone may not agree with that opinion.

The first sentence is kind of interesting, but my only point was that if the mechanism that induces the non-linearity in uncorrelated photons is a QM property of the way a photon interacts with a polarizer, and this interaction is fundamentally deterministic, it cannot be used as an ansatz to define a non-local mechanism. There is no doubt whatsoever that the HUP is empirically valid, but that doesn't rule out a local deterministic underpinning, with or without finite parts (subensembles). Bell's ansatz is contingent upon countable subensembles with 'absolute' (measurable) properties.

DrChinese said:
However, that does not change the fact that it is the ENTANGLED connection which is of interest in Bell tests. It is that paradox which is at hand, and which is the subject of EPR.
Yes, but the locality claims about the meaning of the ENTANGLED connection are predicated on a linearity that is trivially violated in general, even in Newtonian physics, and specifically in polarizer/photon interactions without EPR correlations. Yes, the entangled states are interesting, but the non-linearity across relative detector settings does not represent a test of locality, except under the rawest assumption that all observables have perfectly linear relationships with things.

You keep asking for a dataset, but you'll just hang on to the notion that you must be able to plug in 22.5/45 and get the same answer as 0/22.5. To that I have 1 question: if predefining a common coordinate system such that 22.5/45 has a relative difference of 22.5 is not a FTL cheat, why then is predefining ONLY the relative difference a FTL cheat? Coordinate systems are by definition non-physical; only the symmetries on them are.
 

Attachments: 1x50.jpg, 2x22.5.jpg
  • #608
ThomasT said:
You asked if the mathematical legitimacy of Bell's theorem is irrefutable. The mathematical form of Bell's theorem is the Bell inequalities, and they are irrefutable. Their physical meaning, however, is debatable.

In order to determine the physical meaning of the inequalities we look at where they come from, Bell's locality condition, P(AB|H) = P(A|H)P(B|H).

Then we can ask what you asked and we see that:
1. A and B are correlated in EPR settings.
2. Bell uses P(AB|H) = P(A|H)P(B|H)
3. P(AB|H) = P(A|H)P(B|H) is invalid when A and B are correlated.

Conclusion: The form, P(AB|H) = P(A|H)P(B|H), cannot possibly model the experimental situation. This is the immediate cause of violation of BIs based on limitations imposed by this form.

What does this mean?

P(AB|H) = P(A|H)P(B|H) is the purported locality condition. Yet it is first the definition of statistical independence. The experiments are prepared to produce statistical dependence via the measurement of a relationship between two disturbances by a joint or global measurement parameter in accordance with local causality.

Bell inequalities are violated because an experiment prepared to produce statistical dependence is being modeled as an experiment prepared to produce statistical independence.

Bell's theorem says that the statistical predictions of qm are incompatible with separable predetermination. Which, according to certain attempts (including mine) at disambiguation, means that joint experimental situations which produce (and for which qm correctly predicts) entanglement stats can't be viably modeled in terms of the variable or variables which determine individual results.

Yet, per EPR elements of reality, the joint, entangled, situation must be modeled using the same variables which determine individual results. So, Bell rendered the lhv ansatz in the only form that it could be rendered in and remain consistent with the EPR meaning of local hidden variable.

Therefore, Bell's theorem, as stated above by Bell, and disambiguated, holds.

Does it imply nonlocality -- no.

DrChinese said:
This is not correct because it is not what Bell says. You are mixing up his separability formula (Bell's 2), which has a different meaning. Bell is simply saying that there are 2 separate probability functions which are evaluated independently. They can be correlated, there is no restriction there, and in fact Bell states immediately following that "This should equal the Quantum mechanical expectation value..." which is 1 when the a and b settings are the same. (This being the fully correlated case.)

DrC, ThomasT.

You both appear to agree that Bell uses P(AB|H) = P(A|H).P(B|H) in his work.

I cannot see how EPR studies using that formula could be serious. If H includes a hidden variable for each particle, that formula gives P(AB|H) = P(A|H).P(B|H) = (1/2).(1/2) = 1/4.

Can you direct me to an example where Bell uses P(AB|H) = P(A|H).P(B|H) in his work, please?
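
(For reference, the expression being asked about appears in Bell's 1964 paper as his equation (2), with the factorization taken at a fixed value of the hidden variable:

$$
P(\vec a,\vec b)=\int d\lambda\,\rho(\lambda)\,A(\vec a,\lambda)\,B(\vec b,\lambda),\qquad A,B=\pm 1 .
$$

The product form P(AB|H) = P(A|H)P(B|H) corresponds to this factorization conditional on the hidden variable; the unconditional joint probabilities are not assumed to factor, so the (1/2).(1/2) = 1/4 computation above does not arise in Bell's derivation.)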

[Apologies for possible hi-jack; I will add this under Understanding Bell's mathematics.]
 
  • #609
my_wan said:
...

You are wandering all over the place. When you want to tackle a point, I will be glad to discuss. I have asked you before to stop your meandering. Listen to what I am saying, and re-read your responses. You are just flailing.

I told you to look at PBSs not polarizers. I know how polarizers work, you don't need to provide a diagram. They have nothing to do with the discussion. We are talking about Bell's theorem and Bell tests.

I know you have a lot of pet ideas. So what? We are NOT here to discuss your pet ideas. The point is to discuss the science of EPR and Bell. I know you are "supremely" confident of your ideas, but you have yet to demonstrate a single cogent idea. I refuse to continue if you won't be a well-behaved participant.
 
  • #610
JenniT said:
DrC, ThomasT.

You both appear to agree that Bell uses P(AB|H) = P(A|H).P(B|H) in his work.

I cannot see how EPR studies using that formula could be serious. If H includes a hidden variable for each particle, that formula gives P(AB|H) = P(A|H).P(B|H) = (1/2).(1/2) = 1/4.

Can you direct me to an example where Bell uses P(AB|H) = P(A|H).P(B|H) in his work, please?

[Apologies for possible hi-jack; I will add this under Understanding Bell's mathematics.]

Will continue that part of the discussion in that thread...
 
  • #611
I was reading over the rebuttals, and it seems I often misinterpreted your claim of 100% of photons emerging from a polarizer. I argued the polarizer effect by narrowing attention to a particular subsystem of the experiment. I do need to include the polarizing beam splitter, if for no other reason than to avoid some confusion.

Yes it's true that a PBS effectively detects ~100% of the light. Yet this still represents a single detection axis. So let's see what looking at both outputs of a PBS entails in the argument I posed. Consider a PBS in front of a randomly polarized beam of light. ~50% will be diverted to 1 detector, while the other ~50% is diverted to another. By the argument I proposed, if you rotate that PBS 22.5 degrees, ~15% of the light that would have been diverted 1 way is now diverted the other way.

Now consider a pair of PBS/detectors at each end of an EPR experiment. With both PBS's set on the same axis we get ~100% correlations. We offset 1 PBS by 22.5 degrees. Each photon has a certain tolerance for how far the PBS axis can be from the photon's default polarization before it's diverted the other way by the PBS. When that tolerance is exceeded, then, in spite of the photon being (anti)correlated with its partner, the difference between the PBS detection axes exceeds the tolerance, so the pair reads as uncorrelated.
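
A minimal sketch of the 'tolerance' picture just described (Python; the shared hidden polarization and the 45-degree threshold rule are illustrative assumptions, not a claim about what a real PBS does), so its coincidence curve can be compared directly against the cos^2 values discussed in this thread:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

def same_port_rate(offset_deg):
    """Deterministic 'tolerance' model: each photon of a pair carries the same
    hidden polarization lam (the perfectly correlated case); a PBS at angle theta
    sends it to the H port iff lam lies within 45 degrees of theta (mod 180)."""
    lam = rng.uniform(0.0, 180.0, N)                       # shared hidden polarization
    def port(theta):
        d = np.abs((lam - theta + 90.0) % 180.0 - 90.0)    # angular distance to PBS axis
        return d < 45.0                                     # True = H port, False = V port
    return np.mean(port(0.0) == port(offset_deg))

for off in (0.0, 22.5, 45.0, 67.5, 90.0):
    qm = np.cos(np.radians(off)) ** 2
    print(f"offset {off:5.1f} deg   tolerance model ~{same_port_rate(off):.3f}   QM cos^2 = {qm:.3f}")
```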

Bell's ansatz assumes a locally realistic mechanism must take a form that linearly transitions with the change in angle. What we have is a transition that goes as the square of the cosine of the angle. Yet this empirical fact is ubiquitous. The same rules apply to polarizers, the efficiency loss in aerial antennas offset from the ideal setting, etc. This empirical fact may or may not have a realistic basis. But the fact that EPR correlations exhibit the same detection profile says nothing about locality when the same effect occurs without any correlations involved. EPR correlations, in this view, would only indicate the mechanism is deterministically replicable.

By the way, if the wavefunction is assumed to be real, with particles being a projection from a Hilbert space construct, it's reasonable that the square of the angle defines the observables, even if only a subset of the 'possibilities' formally defined in Hilbert space represents an actual state. Self-interaction still seems to require an ensemble (possibly infinite) of micro-states.
 
  • #612
my_wan said:
1. I was reading over the rebuttals, and it seems I often misinterpreted your claim of 100% of photons emerging from a polarizer. I argued the polarizer effect by narrowing attention to a particular subsystem of the experiment. I do need to include the polarizing beam splitter, if for no other reason than to avoid some confusion.

Yes it's true that a PBS effectively detects ~100% of the light. Yet this still represents a single detection axis. So let's see what looking at both outputs of a PBS entails in the argument I posed. Consider a PBS in front of a randomly polarized beam of light. ~50% will be diverted to 1 detector, while the other ~50% is diverted to another. By the argument I proposed, if you rotate that PBS 22.5 degrees, ~15% of the light that would have been diverted 1 way is now diverted the other way.

Now consider a pair of PBS/detectors at each end of an EPR experiment. With both PBS's set on the same axis we get ~100% correlations. We offset 1 PBS by 22.5 degrees. Each photon has a certain tolerance for how far the PBS axis can be from the photon's default polarization before it's diverted the other way by the PBS. When that tolerance is exceeded, then, in spite of the photon being (anti)correlated with its partner, the difference between the PBS detection axes exceeds the tolerance, so the pair reads as uncorrelated.

2. Bell's ansatz assumes a locally realistic mechanism must take a form that linearly transitions with the change in angle...

1. This is correct, you end up with subensembles where you have HH, VV, HV and VH. These are experimentally verifiable. What is counterfactual is the realistic case where there are 3 settings, and you get 8 permutations: HHH, HHV, ... , VVV. (See the enumeration sketch at the end of this post.)

2. Bell does not say this. He says that the local realistic formula ideally should reproduce the quantum expectation value. That is, if there is to be agreement between local realism and QM. So then you notice that it more or less requires the function to have a second derivative of zero (i.e. stationary) so that the realism requirement works. Now, this is not an absolute requirement per se. But you can see that he is setting things up to hint strongly that there will be a contradiction. And he is sharing some of his thoughts about how he arrives at his proof.
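
Regarding point 1, a short brute-force sketch of the counterfactual case (the three angles and the cos^2 targets are the match rates discussed earlier in this thread; the 'predetermined outcomes' assumption is exactly the realism premise under test):

```python
import math
from itertools import product

# Realism premise: each pair carries predetermined +1/-1 outcomes for all three
# settings a, b, c, whether or not all three are ever measured together.
patterns = list(product((+1, -1), repeat=3))          # the 8 permutations HHH ... VVV

# For every single pattern, "a differs from c" implies "a differs from b" or
# "b differs from c", so the same bound holds for any statistical mixture of patterns.
for A, B, C in patterns:
    assert int(A != C) <= int(A != B) + int(B != C)

# QM mismatch rates for settings 0, 22.5, 45 degrees:
mismatch = lambda d: 1.0 - math.cos(math.radians(d)) ** 2
lhs, rhs = mismatch(45.0), mismatch(22.5) + mismatch(22.5)
print(f"QM: {lhs:.3f} <= {rhs:.3f} ?", lhs <= rhs)    # False: no mixture of the 8 reproduces QM
```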
 
  • #613
1. Yes, but hidden variables may themselves be subassemblies of those measurables, which define the measurables, rather than just a hidden appendage to them.

2. Wrt: "Bell does not say this."
So, his ansatz, which assumes a maximum classical correlation 0=100%, 22.5=75%, 45=50%, 67.5=25%, and 90=0%, is not a requirement that max correlation statistics must linearly transition with the angle?

Here is an approach that takes a generally similar tack to my argument, with the a priori known probability distribution, but in the context of classical nonlinear filtering in a stochastic system.
http://arxiv.org/abs/0907.2327
Abstract: A model is developed to describe state reduction in an EPR experiment as a continuous, relativistically-invariant, dynamical process. The system under consideration consists of two entangled isospin particles each of which undergo isospin measurements at spacelike separated locations. The equations of motion take the form of stochastic differential equations. These equations are solved explicitly in terms of random variables with a priori known probability distribution in the physical probability measure. In the course of solving these equations a correspondence is made between the state reduction process and the problem of classical nonlinear filtering. It is shown that the solution is covariant, violates Bell inequalities, and does not permit superluminal signaling. It is demonstrated that the model is not governed by the Free Will Theorem and it is argued that the claims of Conway and Kochen, that there can be no relativistic theory providing a mechanism for state reduction, are false.
 
  • #614
my_wan said:
1. Yes, but hidden variables may themselves be subassemblies of those measurables, which define the measurables, rather than just a hidden appendage to them.

2. Wrt: "Bell does not say this."
So, his ansatz, which assumes a maximum classical correlation 0=100%, 22.5=75%, 45=50%, 67.5=25%, and 90=0%, is not a requirement that max correlation statistics must linearly transition with the angle?

3. Here is an approach that takes a generally similar tack to my argument, with the a priori known probability distribution, but in the context of classical nonlinear filtering in a stochastic system...

1. They can only go as deep as A and B. There is no C, hence no realism.

2. I think you mean the boundary point of a Bell inequality. Bell does not require that boundary to be the actual expectation function. Rather, that QM and LR are on different sides of it.

3. Again, another author who does not feel the need to provide for realism in their "realistic" solution. Hey, Joy Christian just came up with yet another "disproof of Bell" this week! Same thing, proof of hidden variables for A and B but not C. So what is the point of touting realism when no realistic dataset is forthcoming? A single counterexample should do it!
 
  • #615
DrChinese said:
1. They can only go as deep as A and B. There is no C, hence no realism.
The 3rd variable is counterfactual in Bell's EPR argument, so the realism is suspect in that case.

DrChinese said:
2. I think you mean the boundary point of a Bell inequality. Bell does not require that boundary to be the actual expectation function. Rather, that QM and LR are on different sides of it.
Well naturally the linear assumption is a boundary rather than a prediction. Yet it remains that Bell's ansatz assumes a classical mechanism can not exceed this linear boundary.

DrChinese said:
3. Again, another author who does not feel the need to provide for realism in their "realistic" solution. Hey, Joy Christian just came up with yet another "disproof of Bell" this week! Same thing, proof of hidden variables for A and B but not C. So what is the point of touting realism when no realistic dataset is forthcoming? A single counterexample should do it!
Actually, wrt the authors mentioned, I have to agree.. :blushing: They tend to overstate the significance of what they provided. Such attempts do remain important though.

The thing is, the claims about what violations of Bell's inequalities actually mean tend to be overstated on both sides of the fence. We are both arguing on the grounds of what we don't know: the nature of a connection between spacelike separated correlations. The argument from ignorance is inherent in the whole debate. I appreciate you making me think though.

Wrt a dataset, you're not going to be happy with the floating 0 angle to maintain relative detector data locally. Neither am I really, but the physical significance of a choice of coordinate labels, distinct from the symmetries, is also dubious. My modeling attempts are to articulate the issues in my mind. They involve generating a list of thousands of random virtual photons and looping through them with a set of virtual detectors. I'm still trying some new, likely dubious, ideas. If we are dealing with transfinite subensembles it may not be possible with or without FTL. But the objective is to learn the issues in as much detail as possible. Adding both sides of the PBS output is actually quite useful.
 
  • #616
my_wan said:
The 3rd variable is counterfactual in Bell's EPR argument, so the realism is suspect in that case.

That is the definition of realism. If there is no simultaneous C, there is nothing to discuss in a hidden variable theory. It simply isn't a hidden variable theory.

Because Bell slips this requirement in such a subtle manner, it doesn't jump out to many folks. But there it is, right after his (14), and it is quite evident: a, b and c are all together in one equation.

So it is simple: if you reject this assumption as meaningful, then the Bell result is not meaningful.

But you will be part of a small minority. Hey, some people don't like the Beatles either.
 
  • #617
If we take fully generalized thermodynamic models and/or Hilbert space seriously, we could also be looking at a version of Hilbert's paradox of the Grand Hotel. Of course that begs the question of why QM is normalizable. Yet that's a bit of a soft spot from a foundational perspective anyway. Yet, again, if a unit vector is a sum over an infinite number of local "hotel rooms", infinitesimals momentarily occupying a finite subset of those rooms, it still doesn't require FTL as a mechanism.

Would you consider 'actual infinities' a violation of realism? Even the linked paper by Bedingham, using a stochastic model, appears to be stuffing an arbitrary number of possible states into a singular ensemble. Same for the thermodynamic model, with statistically complete variables, linked a few pages back. Hilbert space, with its required metric completeness, appears to require the same thing, if it's taken to be physically real in some sense.

This also appears to be a required property for Quantum Computers to work as expected. Who was it that offered Quantum Computers as proof of MWI, due to not enough particles in the Universe to mimic them? The Axiom of Choice also appears to be related in some sense.

So what is your view wrt realism if it's defined in terms of 'actual infinities'?
 
  • #618
my_wan said:
If we take fully generalized thermodynamic models and/or Hilbert space seriously, we could also be looking at a version of Hilbert's paradox of the Grand Hotel. Of course that begs the question of why QM is normalizable. Yet that's a bit of a soft spot from a foundational perspective anyway. Yet, again, if a unit vector is a sum over an infinite number of local "hotel rooms", infinitesimals momentarily occupying a finite subset of those rooms, it still doesn't require FTL as a mechanism.

Would you consider 'actual infinities' a violation of realism?... So what is your view wrt realism if it's defined in terms of 'actual infinities'?

There is no dividing line between QM and realism on this subject. I don't see how the problem of infinities relates to realism. I guess you are saying that infinities cannot exist, and that somehow that means that counterfactuals don't have to exist. But I am not asserting counterfactuals exist, you are. Or at least you are if you are a realist.
 
  • #619
DrChinese said:
There is no dividing line between QM and realism on this subject. I don't see how the problem of infinities relates to realism. I guess you are saying that infinities cannot exist, and that somehow that means that counterfactuals don't have to exist. But I am not asserting counterfactuals exist, you are. Or at least you are if you are a realist.
You're reading way too much into my words, apparently based on a 'perception' of my position. In fact I said "actual infinities" may indeed exist, a sentiment that I have stated several ways before. Here I suggested that perhaps the incongruence in counterfactual measures might be a real 'physical' result of Hilbert's paradox of the Grand Hotel.

I'm not holding nature to a conception of my choice. I am debating points for which I lack certainty, in the hope of learning something that increases or decreases that certainty. The highly limited few things I have a fair degree of certainty about are not even included in my arguments. I've asserted how I think it's possible for counterfactuals to be interpreted within a 'particular' contextual construct, but mostly dropped it for lack of clarity. But I can't a priori reject reasonable arguments, even if they lack the conclusiveness their authors wish them to have.

When you objected with: "another author who does not feel the need to provide for realism in their "realistic" solution", I had to agree that, in spite of some reasonable content, your objection was essentially valid. I don't see any solid justification on either side. The non-realist seems to say, we don't see it so it must not exist. The realist seems happy to suggest mechanisms without actually stating what's real. So I began thinking about how Bedingham and others smooth over Bell's violations, where they hide the inequalities, and why it's not sufficient for some to define realism.

I'm not asking you to accept Hilbert's hotel paradox as an actual explanation, only that it might in principle be one. My question was far more limited: to get a better picture of what you would 'in principle' accept as a realistic model. Because a repeat of Bell's realism really leaves me with a lot of questions about the range of what can and can't qualify as realism in your view. It seems the definitions used by various authors are incongruent, even when based on the same words, like the definition used by Bell.

Note: 'Actual infinities' is a distinct concept from infinities in general. 'Actual infinities' are existential, so by definition they relate to realism. And it seemed to me the approach Bedingham et al used implicitly stuffed extra occupants into Hilbert's hotel, and even provided some justification in terms of Hilbert space, quantum computers, etc. I remain at a loss for how you define the constraints of what qualifies as realism. You've rejected my characterization as a linear part-->measurable property, and continually quote my text that says one thing and characterize it as saying another. My desire for more concrete definitions is hampered by assumptions about my positions, opposite to what I stated, on the very questions I ask in order to articulate those definitions. So I can only guess what your answer might have been.
 
  • #620
ThomasT said:
1. Say Alice and Bob are counter-propagating sinusoidal (light) waves that share a cloned property, e.g., they're identically polarized. Analyze this cloned property with crossed polarizers and you get entanglement correlation, Cos^2 |a-b| in the ideal. It's just optics. Not that optics isn't somewhat mysterious in its own right. But we can at least understand that the entanglement stats so produced don't have to be due to Alice and Bob communicating with each other, or that nonseparability means that Alice and Bob are the same thing in the sense that they're actually physically connected when they reach the polarizers.

DrChinese said:
1. I have news for you: this is patently FALSE. If you take 2 identically polarized photons and run them through the polarizers as you describe here, you do NOT get Cos^2 |a-b| or anything close to it.
In the cases you're talking about, the explanation is that the photons (while sometimes very closely polarized) aren't identically polarized. They're not 'clones' of each other. How do we know that? Precisely because when you run them through the polarizers you don't get cos^2 |a-b| entanglement stats (but you do get a range of approximations of essentially the same sinusoidal angular dependency -- which suggests to me that 'entanglement' is simply a special case involving the same underlying physical principles, which include, but aren't limited to, (1) the principle of locality and (2) the cos^2 theta rule).

DrChinese said:
You ONLY get this for ENTANGLED photons.
I agree. They (or a common property that's being jointly measured) are clones of each other. Which means that they're, eg., identically polarized. Which is deduced via the production of entanglement stats.

DrChinese said:
In other words: in the case where your assumption is actually valid - and I do mean identical and identically polarized photons coming out of a PDC crystal - you do NOT get entangled state statistics.
Then, as I said above, these photons aren't cloned (ie., entangled) wrt polarization. In this case, we can assume that |L1 - L2| > 0 (ie., we can assume that they weren't identically polarized), where L1 and L2 denote the optical vectors of the photons.

------------------------

ThomasT said:
2. Bell didn't address this case, because it's precluded by the EPR requirement that lhv models of entanglement be expressed in terms of parameters that determine individual results.
DrChinese said:
2. Bell did discuss the case where the correlations are due to anti-symmetric considerations.
That's not what I'm talking about -- which is that if Bell had modeled the joint situation in the global (i.e., nonseparable) terms that it actually required (involving some modification in the representation of the 'beables' involved), then he might have presented a local realistic model which would have reproduced the qm correlation. The point of departure for viable local realistic models is that an experimental situation measuring a joint microphysical parameter via a joint measurement parameter requires a 'nonseparable' representation. Such models have been produced, they work, and they remain unrefuted.

(Wrt my statement 2. above, I've come to think that EPR's definition of reality doesn't require that LR models of entanglement be expressed in terms of parameters that determine individual results. That is, there can be a common, underlying parameter that determines joint results while not determining individual results, and this realistic conception isn't contradicted by EPR's conception of reality and definition thereof via elements of reality.)

-----------------------------------------------

ThomasT said:
3. On the other hand, since a local realistic computer simulation of an entanglement preparation is not the same as a local realistic formal model (in the EPR sense), then it wouldn't be at all surprising if such a simulation could reproduce the observed experimental results, and violate a BI appropriate to the situation being simulated -- and this wouldn't contradict Bell's result, but, rather, affirm it in a way analogous to the way real experiments have affirmed Bell's result.

DrChinese said:
3. I would like to see one (and yes, it would surprise me). This is a somewhat complex subject and I am currently working with the De Raedt team (and another independent theoretical physicist) regarding some concerns I have expressed about their model. Their model does have some very interesting features. If it were possible to suitably express such a simulation, I think it might require some additional experimental analysis. It would not affect Bell's Theorem.
Not the math itself, no, but it would affect the physical interpretation of BI violations wrt locality and determinism -- rendering them irrelevant wrt those considerations.

---------------------------------------------------

From the thread: "Why the De Raedt Local Realistic Computer Simulations are wrong", you stated:

DrChinese said:
In trying to show that there "could" be an exception to Bell, please consider the following to add to your list of tests for your candidate LHV theory:
... snip ...
DrChinese said:
b) The formula for the underlying relationship will be different than the QM predictions, and must respect the Bell Inequality curve. I.e. usually that means the boundary condition which is a straight line, although there are solutions which yield more radical results.
If you're requiring that an LR model of entanglement not agree with qm predictions or experimental results, then I now see the point of your 'LR dataset' requirement. Well, yes, I certainly agree that one way to rule out qm compatible and viable LR accounts of entanglement is to simply require them to be incompatible with qm and inaccurate. But that would be inane. So I must be misunderstanding what you mean.
 
  • #621
ThomasT said:
1. Then, as I said above, these photons aren't cloned (ie., entangled) wrt polarization. In this case, we can assume that |L1 - L2| > 0 (ie., we can assume that they weren't identically polarized), where L1 and L2 denote the optical vectors of the photons.

2. If you're requiring that an LR model of entanglement not agree with qm predictions or experimental results, then I now see the point of your 'LR dataset' requirement. Well, yes, I certainly agree that one way to rule out qm compatible and viable LR accounts of entanglement is to simply require them to be incompatible with qm and inaccurate. But that would be inane.

1. Again, this is patently false. They most certainly ARE polarization clones of each other. And they are entangled. But they are not polarization entangled, which is quite different. If we accept your physical assumption of "counter-propagating influences", then these should produce the same statistics as entangled particles. But they don't.

Now why are these particles acting different? Because they are NOT in a superposition of polarization states. This is meaningful within QM but has no counterpart in a local realistic theory - in which there is no such thing as a superposition (by definition). Take a look at how these photon pairs are produced and you will see how ridiculous your assertion is. A reference:

Theory of two-photon entanglement in type-II optical parametric down-conversion
M. Rubin, D. Klyshko, Y. Shih, A. Sergienko
Physical Review A, December 1994
http://sws.bu.edu/alexserg/PRA_50_5122.pdf

"Using Eq. (41), it is easy to see that |Phi'> is a product state when Psi=pi/8; otherwise it is in an entangled state. It is an EPR state if Psi=0 or pi/4 and is a linear superposition of two EPR states for all other Psi's..."

What this means is that the only difference in producing the entangled state versus the product state is a small rotation of a wave plate. Perhaps you could explain how that separates these streams using a local realistic viewpoint. (P.S. this is a trick question because any accurate answer would show where to find the physical source of entanglement, and there isn't one.) Similarly there are other ways to break polarization entanglement and all of them rely on gaining knowledge of "which path" and therefore do not produce a superposition.

Again, I keep calling you out on this subject and you are operating in denial. The fact is that entangled particles have attributes that do not follow a local realistic explanation. You are simply trying to claim your ideas are equivalent to QM and they are not. If you are going to make an assumption with physical implications, then you lay yourself open to seeing that disproved. Which it has been, over and over.

2. Talk to Bell about this. Or God. I did not create our universe, so it is not my requirement. Next you will be complaining about the 4 color map theorem as being "inane".
 
  • #622
Here is an interesting paper by Michael Seevinck, which rigorously derives a version of Bell's inequalities for correlations:
http://philpapers.org/rec/SEETQW
Found. Phys. 36, 1573-1586 (2006)

He makes a more heuristic case here:
Seevinck said:
It is possible that one thinks that the requirement of local realism is too strong a requirement for ontological robustness. However, that one cannot think of entanglement as a property which has some ontological robustness can already be seen using the following weaker requirement: anything which is ontologically robust can, without interaction, not be mixed away, nor swapped to another object, nor flowed irretrievably away into some environment. Precisely these features are possible in the case of entanglement and thus even the weaker requirement for ontological robustness does not hold.
This same case against ontological robustness made here naturally also applies to the properties in Bell's inequalities. If ontologically robust variables exist, independent of any observation of them, this tells us they can't innately contain the properties, or observables, that we associate with the realism of classical properties. These properties must be generated dynamically.

At a foundational level, any such ontologically robust variables, independent of the dynamically generated properties, must by definition be independent variables. As Schneider put so well in "Determinism Refuted", an independent variable cannot even in principle be observed. However, if they play a role in dynamically generating observables, they may still have deterministic underpinnings. Thus Schneider has not refuted determinism, nor ontologically robust variables, in principle, but merely described exactly why they can't be directly observed in experiments, whether they exist or not. Schneider's argument only holds if absolutely nothing we can't see exists. The standard QM interpretation is predicated on this notion.

The positivist can yell poppycock, but existential postulates are fundamentally no different from any mathematical postulate, so long as they're used for more than just sweeping the ontological and/or empirical difficulties of QM under the rug. So long as QM and GR remain disconnected, even failing the above criterion, it remains a legitimate, open, and worthy question. There is sound reason to consider observables synonymous with what is 'real'; it is the sole source of cogency of any theory. Yet to fail to make a distinction, in principle, between what is observed and what is ontologically real has been referred to as sleepwalking by some authors.

The point here is that ontological realism, ontologically robust variables, does not explicitly depend on any given measurable having any direct relation to those variables. Schneider's argument should make it clear that, even if realism is factual in principle, the notion that these ontologically robust variables are in themselves measurables is untenable. For a realist to assume a thing is observable without interaction amounts to ESP, at which point a self-referential interaction is observed, not the thing. From this perspective, the very notion of classical realism, used by Bell, Einstein, etc., is fatally flawed at the foundational level. Yet the realism may yet persist, or not.
 
  • #623
my_wan said:
At a foundational level, any such ontologically robust variables, independent of the dynamically generated properties, must by definition be independent variables. As Schneider put so well in "Determinism Refuted", an independent variable cannot even in principle be observed. However, if they play a role in dynamically generating observables, they may still have deterministic underpinnings. Thus Schneider has not refuted determinism, nor ontologically robust variables, in principle, but merely described exactly why they can't be directly observed in experiments, whether they exist or not. Schneider's argument only holds if absolutely nothing we can't see exists. The standard QM interpretation is predicated on this notion.

That Schneider guy makes some good points, thanks for pointing this out.

:smile:
 
  • #624
DrChinese said:
That Schneider guy makes some good points, thanks for pointing this out.

:smile:

:wink:
 
  • #625
Here is a more rigorous treatment of the idea that, if QM holds locally, then it indicates a violation of Bell's inequalities with no-signaling:
http://arxiv.org/abs/0910.3952
Phys. Rev. Lett. 104, 140401 (2010)

This paper also uses an argument I previously attempted here wrt classical variables:
http://arxiv.org/abs/0804.0884

It still seems to me, based on my modeling, that in order to define EPR in terms of variables, each offset in detector settings has to be defined by a separate (probably relativistically related) probability space as defined by Hess et al. Unless of course I'm allowed to define 1 of the detector settings as 0, and simply rotate the whole coordinate system to change its settings. Otherwise the number of variables required grows excessively large for arbitrary settings, perhaps even diverges. Quantum computers appear to require an arbitrary number of variables also.

QM, in a sense, consists of discontinuous solutions to differential equations. Along with the Born rule and the HUP, this primarily sums up the conceptual difficulties with QM. I suspect Bell violations may be related more to a physical manifestation of the Born rule than of the HUP. As if nature's measurables really are a projection from an entirely different coordinate symmetry than we assume.
 
  • #626
my_wan said:
This paper also uses an argument I previously attempted here wrt classical variables:
http://arxiv.org/abs/0804.0884

That reference deserved to be labeled with the author's name. Hess is a persistent local realist who has attacked Bell and Bell tests from numerous angles. His work is widely rejected.

In this piece, he basically argues for the QM position by asserting that there are no classical probability spaces. He discusses the idea of incompatible measurements (i.e. >2) which is in fact the QM position. I guess if you move the bar far enough, everyone can claim victory.

The question I always ask myself for these arguments is really quite simple: what would history's greatest local realist - Einstein - think of the argument? Of course, we can only speculate but speculate I will. Einstein would have appreciated the Bell argument and would NEVER try to con his way out of it with an argument like Hess has made. Please, feel free to disagree...
 
  • #627
Ok, so Hess has his critics, but on what grounds are the counterarguments predicated? In fact this is why I chose this reference, rather than the original version: it was a response to criticisms, and thus contained references to those criticisms.

Criticizing it on the grounds that it fails to explain EPR correlations, or provide a mechanism for doing so, is a non-starter. Consider the following quote from the Hess paper, prompted by the implied content of those criticisms:

Hess (http://arxiv.org/abs/0804.0884) said:
It also should be noted that the author subscribes fully to the teachings of both quantum and Kolmogorov probability (as different and well proven probability frameworks) and to their well known relationship to actual experiments (see e.g. [16]). The author has neither any criticism for these frameworks nor for the definition of the “elements of physical reality” of the EPR paper [17] nor for the EPR-type experiments performed by Aspect [18] and others. The author criticizes exclusively the work of Bell as not being general enough to apply to general physics problems (quantum and/or classical) and the work of Bell’s followers for the same reason and for actual logical and mathematical mistakes.

So when you say widely rejected, precisely what was widely rejected? No specific claim was made that the given mechanism would even provide a realistic explanation of the inequality violations, only that Bell's argument, as posed, lacks the generality needed to support the generality often claimed in its interpretation, whether in a classical or quantum context. So his critique proceeds on the grounds that the assumed variables have a presupposed relationship to the measurables, and destroys the argument on those grounds. Well duh... So the implied meaning of "His work is widely rejected" is of little import to the questions that remain open and unanswered. Facts are not a democracy, and the claims here presuppose a generality lacking in the argument. Thus no complete proof or disproof exists at the moment.

I get a queasy feeling anytime I start trying to second-guess how someone else would view something. I suspect Einstein had his own perspective, one that didn't lack a full appreciation of the empirical validity of QM, nor of the loss he was at to explain what quanta were.
I consider it quite possible that physics cannot be based on the field concept, i.e., on continuous structures. In that case, nothing remains of my entire castle in the air, gravitation theory included, [and of] the rest of modern physics. (Albert Einstein, 1954)
Here he placed the importance of describing what actually is above his own life's work. So presupposing where Einstein would go with any given piece of empirical evidence is more than a little presumptuous.
 
  • #628
my_wan said:
1. Ok, so Hess has his critics, but on what grounds are the counterarguments predicated? In fact this is why I chose this reference, rather than the original version: it was a response to criticisms, and thus contained references to those criticisms.

So when you say widely rejected, precisely what was widely rejected? No specific claim was made that the given mechanism would even provide a realistic explanation of the inequality violations, only that Bell's argument, as posed, lacks the generality needed to support the generality often claimed in its interpretation, whether in a classical or quantum context. So his critique proceeds on the grounds that the assumed variables have a presupposed relationship to the measurables, and destroys the argument on those grounds. Well duh... So the implied meaning of "His work is widely rejected" is of little import to the questions that remain open and unanswered. Facts are not a democracy, and the claims here presuppose a generality lacking in the argument. Thus no complete proof or disproof exists at the moment.

2. I get a queasy feeling anytime I start trying to second-guess how someone else would view something. I suspect Einstein had his own perspective, one that didn't lack a full appreciation of the empirical validity of QM, nor of the loss he was at to explain what quanta were. Here he placed the importance of describing what actually is above his own life's work. So presupposing where Einstein would go with any given piece of empirical evidence is more than a little presumptuous.

1. It is normal, in this forum, to identify work which is not generally accepted (or worse, is generally rejected). Hess makes note of the fact that his position is rejected by Mermin. As to the substance of his argument: Hess is constantly trying new attacks on Bell. It is hard not to get the feeling that his position is based on emotion rather than science. When he comes up with something worth looking at in more detail, I will. In the meantime, I am waiting for a specific counterexample to discuss. He doesn't offer any.

2. Well, I presume to state that Einstein would have no part of Hess' ideas. He would have understood Bell immediately, and would never have tried to weasel out of it with anything less than something equally substantial. As you mention, Einstein would be willing to give up everything for one good argument. Fortunately, Bell only requires Einstein give up 1 thing.
 
  • #629
Again, what exactly has been rejected: the claim that there is a class of variables the Bell argument doesn't address, or the claim that no such class has been constructed to do so?

I mention this paper only because I did use a similar argument as one possibility among others. I am also dissatisfied with it, as I have noted. The 0 angle definition condition I was forced into to make it work is physically quite similar to what Hess et al proposed in making a new HV set for each possible angle. A new set for each angle gets out of the 0 angle condition I had, but creates a new problem. The variables must still define the offset, and one or the other detector, but not both, has to count off from that offset. Thus it introduces the same relative coordinate condition I was forced to impose with an arbitrarily defined 0 setting.

The thing about Mermin's counter is that he presupposes the counterfactual coincidences must have the same coincidence rates as a separate run in which they were empirically established. In fact, Mermin states he uses his red/green light toy model simply to articulate the issues.

Let's consider a pair of unfair coins. These coins are special, and have a micro-dial setting to determine how unfair they are. You set it so they have a 15% chance of landing on opposite sides. Now you take a 3rd coin, and want to set it so it has a 50% chance of landing on the same side as the 1st coin, and an 85% chance of landing on the opposite side from the second coin. Does the fact that it can't be done invalidate the reality of the coin settings? Yet separately you can do just that.
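
The impossibility in the coin example can be made explicit with a counting bound: for any three definite outcomes, each pairwise "opposite sides" rate is at most the sum of the other two. A quick check with the rates requested above:

```python
# Requested pairwise "opposite sides" rates from the coin example:
p12 = 0.15   # coin 1 vs coin 2: opposite 15% of the time
p13 = 0.50   # coin 1 vs coin 3: same side 50%, i.e. opposite 50%
p23 = 0.85   # coin 2 vs coin 3: opposite 85% of the time

# "i differs from k" implies "i differs from j" or "j differs from k",
# so each pairwise rate must not exceed the sum of the other two.
for name, lhs, rhs in [("p12", p12, p13 + p23), ("p13", p13, p12 + p23), ("p23", p23, p12 + p13)]:
    print(f"{name}: need {lhs:.2f} <= {rhs:.2f} -> {lhs <= rhs}")
```

The third check fails, which is why the three requested settings cannot all be realized at once, exactly as the paragraph says.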

Are we arbitrarily imposing a similar physical absurdity, and hiding it behind a presupposed 3-way correlation? Do the variables we suppose are carried by the photons physically preclude such 3-way correlations for perfectly valid physical reasons? In fact, in QM, the probabilities must be considered jointly, precluding probabilities greater than 1. It is only through an a priori imposition that such conditions are demanded of QM, conditions which are contrary to the rules of QM. So we are also violating the rules of QM, as well as the physical constraints of the coin analogy, with such counterfactual a priori demands.

So if the rules of QM are not being violated, show how QM predicts a probability greater than 1 without presupposing it through counterfactual choices. Otherwise Bell's inequality sneaks a QM rules violation in the back door, via a counterfactual claim. The physical constraint, like the coins, would be in physically creating the 3rd correlated particle (variables) with the specified properties, not in what the detectors read after the fact, nor in any single pairing of properties and HVs.

That may be the strongest objection yet. Like trying to define 3 coins that can all land on opposite sides, because counterfactually any 2 can.
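Here's a minimal sketch of why that fails (my own illustration, in Python, assuming nothing beyond ordinary probability): every definite outcome of three coins satisfies a Boole-type bound, so any joint distribution must too, and the three requested pairwise settings violate it.

[code]
from itertools import product

# Every definite outcome of three coins satisfies the Boole-type bound:
#   1[c2 != c3] <= 1[c1 != c2] + 1[c1 != c3]
for c1, c2, c3 in product([+1, -1], repeat=3):
    assert int(c2 != c3) <= int(c1 != c2) + int(c1 != c3)

# Hence any single joint distribution over the three coins obeys
#   P(c2 != c3) <= P(c1 != c2) + P(c1 != c3).
# The requested settings would need 0.85 <= 0.15 + 0.50, which is false.
print(0.85 <= 0.15 + 0.50)   # False: the 3-way setting cannot coexist
[/code]

Each pair of those constraints, run as a separate experiment, is trivial to build; it's only the demand that all three live in one joint distribution that is physically impossible.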
 
  • #630
my_wan said:
Again, what exactly has been rejected: the claim that there is a class of variables the Bell argument doesn't address, or the claim that no such class has been constructed to do so?

I mention this paper only because I did use a similar argument as one possibility among others. I am also dissatisfied with it, as I have noted. The 0 angle definition condition I was forced into to make it work is physically quite similar to what Hess et al proposed in making a new HV set for each possible angle. A new set for each angle gets around the 0 angle condition I had, but creates a new problem. The variables must still define the offset, and one or the other detector, but not both, has to count off from that offset. Thus it introduces the same relative coordinate condition I was forced to impose with an arbitrarily defined 0 setting.

The thing about Mermin's counter is that he presupposes the counterfactual coincidences must have the same coincidence rates as a separate run in which they were empirically established. In fact Mermin states he uses his red/green light toy model to articulate the issues involved.

Let's consider a pair of unfair coins. These coins are special, and have a micro-dial setting to determine how unfair they are. You set it so they have a 15% chance of landing on opposite sides. Now you take a 3rd coin, and want to set it so it has a 50% chance of landing on the same side as the 1st coin, and an 85% chance of landing on the opposite side as the 2nd coin. Does the fact that it can't be done invalidate the reality of the coin settings? Yet taken pairwise, each of those settings can be realized.

Are we arbitrarily imposing a similar physical absurdity, and hiding it behind a presupposed 3-way correlation? Do the variables we suppose are carried by the photons physically preclude such 3-way correlations for perfectly valid physical reasons? In fact, in QM, the probabilities must be considered jointly, precluding probabilities greater than 1. It is only through an a priori imposition that such conditions are demanded of QM, conditions which are contrary to the rules of QM. So we are also violating the rules of QM, as well as the physical constraints of the coin analogy, with such counterfactual a priori demands.

So if the rules of QM are not being violated, show how QM predicts a probability greater than 1 without presupposing it through counterfactual choices. Otherwise Bell's inequality sneaks a QM rules violation in the back door, via a counterfactual claim. The physical constraint, like the coins, would be in physically creating the 3rd correlated particle (variables) with the specified properties, not in what the detectors read after the fact, nor in any single pairing of properties and HVs.

That may be the strongest objection yet. Like trying to define 3 coins that can all land on opposite sides, because counterfactually any 2 can.

I reject the idea that a realistic theory is possible. It is really that simple. My definition of reality is the same as the EPR definition: if it can be predicted in advance, there must be an element of reality. But there cannot be 3 such elements simultaneously real. This is not a requirement of QM, and in no way is QM given a preferred status in Bell other than by way of comparison. You cannot get a realistic theory with ANY function where there is rotational invariance, as Mermin demonstrated. Hess has provided nothing for me to reject other than his conclusion. There is no realistic model. Again.

QM does not ask for counterfactuality, so your argument is backwards. It is realism that requires extra assumptions, not QM. So if you think these requirements are absurd, well, that would simply mean you reject realism. Sorry, but you cannot have your cake and eat it too.

So define realism however you like. Define it like Hess if that makes you happy (or whatever his latest absurd definition of the week happens to be). But I won't agree that day is night, that blue is red, or whatever. I will stick with Einstein's elements of reality.
 
  • #631
my_wan said:
Bell's ansatz assumes a locally realistic mechanism must take a form that transitions linearly with the change in angle. What we have is a transition that goes as the squared cosine of the angle. Yet this empirical fact is ubiquitous. The same rule applies to polarizers, to the efficiency loss in aerial antennas offset from the ideal setting, etc.
It was evaluations along these lines that suggested to me that there might be something wrong with Bell's formulation. For example, if EPR elements of reality are too restrictively represented, or if the statistical independence represented by Bell's equation (2) supersedes its representation of causal independence between A and b (B and a), then Bell's formulation isn't logically rigorous, and violations of BIs aren't physically relevant.
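To make the shape contrast concrete, here's a toy sketch of my own (not Bell's construction): pairs share a single hidden polarization lambda, and each photon passes its polarizer by a deterministic "within 45 degrees" rule. The coincidence rate then falls off linearly with |a-b|, while the QM (and observed) rate goes as cos^2|a-b|; the two agree at 0, 45 and 90 degrees but differ in between.

[code]
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
lam = rng.uniform(0.0, np.pi, N)     # shared hidden polarization, one per pair

def passes(theta, lam):
    # toy deterministic rule: pass iff the hidden polarization lies
    # within 45 degrees of the polarizer axis (mod 180 degrees)
    d = np.abs((theta - lam + np.pi / 2) % np.pi - np.pi / 2)
    return d < np.pi / 4

a = 0.0
for deg in [0.0, 22.5, 45.0, 67.5, 90.0]:
    b = np.deg2rad(deg)
    toy = np.mean(passes(a, lam) & passes(b, lam))   # linear in |a-b|
    qm = 0.5 * np.cos(a - b) ** 2                    # QM coincidence rate
    print(f"|a-b| = {deg:4.1f} deg   toy = {toy:.3f}   QM = {qm:.3f}")
[/code]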

my_wan said:
This empirical fact may or may not have a realistic basis.
What do you mean by this? That the cos^2 theta rule can't be understood realistically?

my_wan said:
But the fact that EPR correlations exhibit the same detection profile says nothing about locality when the same effect occurs without any correlations involved. EPR correlations, in this view, would only indicate the mechanism is deterministically replicable.
This is the way everyone would think about it in the absence of interpretations of Bell to the contrary. And this is why it's so important to continue to examine the assumptions underlying Bell's formulation. A couple of generations of professionals in the field going back and forth on what BI violations mean is reason enough to think that it's just possible that some subtle point which would render Bell's theorem physically irrelevant (except for its possible application as an indicator of the presence and degree of entanglement) has been glossed over.

my_wan said:
Here is an interesting paper by Michael Seevinck, which rigorously derives a version of Bell's inequalities for correlations.
The Quantum World is Not Built Up From Correlations.
Found. Phys. 36, 1573-1586 (2006)
I just briefly looked at this so far, but it would seem to support the idea that the nature of entanglement is relationships between and among things. Not things in themselves. Whether or not we see entanglement depends on how we look at things. Hence the nonseparability of the relationship between (the relationship between) the things being observed and the observational context. Which Bell doesn't quite capture in his LR ansatz.
 
  • #632
ThomasT said:
It was evaluations along these lines that suggested to me that there might be something wrong with Bell's formulation. For example, if EPR elements of reality are too restrictively represented, or if the statistical independence represented by Bell's equation (2) supersedes its representation of causal independence between A and b (B and a), then Bell's formulation isn't logically rigorous, and violations of BIs aren't physically relevant.

If you are going to make statements like this, you had better back it up.

So perhaps you can give me an example of how a) EPR elements of reality are too restrictive; b) a classical case in which Bell's (2) is violated when considering the full universe.

In other words, what is wrong with Bell's definitions (other than that you don't like the conclusions they inevitably lead to)?
 
  • #633
DrChinese said:
They most certainly ARE polarization clones of each other. And they are entangled. But they are not polarization entangled, which is quite different.
How do we know that they're polarization clones of each other if they don't produce entanglement stats?

Also, if
DrChinese said:
...the only difference in producing the entangled state versus the product state is a small rotation of a wave plate.
That wouldn't seem to indicate that they're "quite different", unless we are to assume that a "small rotation of a wave plate" somehow switches on some sort of action at a distance or ftl communication between the photons.

DrChinese said:
If we accept your physical assumption of "counter-propagating influences", then these should produce the same statistics as entangled particles. But they don't.
The fact that a wave plate rotation is required to produce polarization entanglement would seem to indicate that they weren't clones of each other to begin with. Or, maybe the wave plate rotation keeps them cloned but adjusts some other parameter which then results in entanglement. Or, maybe the wave plate rotation unclones them, and then, since they're uncloned they have to communicate via action at a distance or ftl to be 'entangled'.

DrChinese said:
Now why are these particles acting different? Because they are NOT in a superposition of polarization states. This is meaningful within QM but has no counterpart in a local realistic theory - in which there is no such thing as a superposition (by definition).
I think we both agree that (1) quantum superposition and quantum entanglement can't be understood in terms of separable (factorable) combinations of the individual systems. However, contrary to (1), (2) Bell (via a certain interpretation of the scope of EPR's definition of reality) has required LR models of entanglement to be represented in a separable form which contradicts the reality of the experimental situations to which that form is being applied. The paper that you referenced agrees with (1). So does every other paper I've read on this. I haven't found anything yet that specifically addresses (2), except for viable LR models that, in agreement with (1), encode the fact that joint detection is determined by different parameters than those which determine individual results -- but, according to you, we can't accept those because their predictions agree with qm and experiments.

By the way, thanks for the reference. My 'assertion' wrt a simplified 'realistic' view of the underlying optical disturbances is probably much too simplistic. It does seem to work for entangled photons produced by atomic cascades though. I'm just beginning a study of OPDC. So, my little simplification might turn out to be ridiculous.

Here's another paper (you've probably read it) that some viewers might be interested in. There's lots of good stuff at Sergienko's group's website.

http://people.bu.edu/alexserg/PRL3893_1993.pdf
Einstein-Podolsky-Rosen-Bohm Experiment Using Pairs of Light Quanta Produced by Type-II Parametric Down-Conversion
Authors:T.E. Kiess, Y.H. Shih, A.V. Sergienko, and C.O. Alley
Phys. Rev. Lett. v.71, pp. 3893-3897 (1993)

So far I don't find anybody saying that the correlations are due to action at a distance or ftl. Eg., in the paper referenced below, they define nonlocality rather innocuously (and in fact state that action at a distance isn't indicated). I think this might be the way that lots (most?) physicists think about it. Quantum nonlocality doesn't mean nonlocality. (The first link might time out, so I included a link to the preprint version.)

http://qopt.postech.ac.kr/publications/PhysRevA-60-p2685.pdf
http://arxiv.org/PS_cache/quant-ph/pdf/9811/9811060v1.pdf
Experimental study of a photon as a subsystem of an entangled two-photon state
Authors: Dmitry V. Strekalov, Yoon-Ho Kim, Yanhua Shih
Phys. Rev. A v.60, pp. 2685-2688 (1999)

DrChinese said:
The fact is that entangled particles have attributes that do not follow a local realistic explanation.
Only wrt Bell's LR model. Which we know doesn't fit the requirements of the experimental situation.

The small experimental differences necessary for entanglement vs. nonentanglement stats, the fact that even LR models conforming to Bell's restrictions aren't that far from qm predictions, and the fact that there are viable LR models that don't conform to Bell's restrictions, all support the idea that the 'problem of nonlocality' has to do with the way things are being talked about, and not with the existence of action at a distance or ftl anything.

DrChinese said:
Talk to Bell about this. Or God. I did not create our universe, so it is not my requirement. Next you will be complaining about the 4 color map theorem as being "inane".
The 4 color map theorem is logically rigorous. Bell's assessment of the form that an LR model of entanglement must take isn't.

Anyway, the requirement that any LR model in any form be incorrect is your requirement. You've been shown LR models whose predictions agree with those of qm and experimental results -- and your response is that you want them to produce a dataset that disagrees with qm and experimental results.

I know how you got there (there's only one way -- Bell's way), but wouldn't it make sense to at least look at them and evaluate whether they're realistic and/or local instead of dismissing them because they're quantitatively correct?

I'm going to keep my Bell talk to a minimum for the time being. You've opened up a whole new world for me with the OPDC stuff, and I feel compelled to learn as much about it as I can. (Hmmmm, maybe there is a method to your madness.) Anyway, thanks again.
 
  • #634
DrChinese said:
If you are going to make statements like this, you had better back it up.

So perhaps you can give me an example of how a) EPR elements of reality are too restrictive; b) a classical case in which Bell's (2) is violated when considering the full universe.

In other words, what is wrong with Bell's definitions (other than that you don't like the conclusions they inevitably lead to)?
Wrt a), they're in the literature, and they encode the more reasonable interpretation of EPR's conception of reality that would allow joint detection to be represented in a nonseparable form. Wrt b), I don't understand what you're asking.

Bell represented locality via the factorability (separability) of the joint situation, which is entailed by the requirement to represent joint detection in terms of individual detections. However, because this isn't the reality of the joint situation, the locality condition's relevance to locality is screened out, or superseded, by the fact that the joint situation is being modeled in terms of individual parameters which simply can't be combined in the way that Bell requires them to be combined (and also correctly model the nonseparability of the experimental situation) in his LR model.
 
  • #635
ThomasT said:
my_wan said:
This empirical fact may or may not have a realistic basis.
What do you mean by this? That the cos^2 theta rule can't be understood realistically?
No, that's not what I mean. But, at the end of the day, pragmatism trumps preconceptions. The empirical is what you take to the bank. We don't get to decide what nature is and isn't. I have an extensive list of classical analogs of QM, but in the general case it still breaks, and why it always breaks is tied with the issues in BI violations in some ways. So just because I agree with you, in principle, that there may be realistic mechanisms by some definition, I can't claim it must be so just because I can justify it in principle. It's just as unreasonable to marginalize people for trying, as it is unreasonable to claim what is true based solely on what can be heuristically justified.

%%%%%%%%%%
DrChinese said:
I reject the idea that a realistic theory is possible. It is really that simple. My definition of reality being the same as the EPR definition: if it can be predicted in advance, there must be an element of reality. But there cannot be 3 simultaneously real.
Here's a simplistic, fundamental issue I have with that: given initial conditions and momenta for a set of pool balls, I can predict in advance that the configuration of those balls will form an X. Does that make the X an 'element of reality' of those balls? If a measuring instrument simply selects a range of configurations consistent with some condition, what does that leave you with wrt realism in EPR? I'll make this even clearer wrt Mermin's work you mention below.

DrChinese said:
This is not a requirement of QM, and in no way is QM given a preferred status in Bell other than by way of comparison.
Which was exactly my point. Bell counterfactually imposed a condition on QM that QM doesn't allow.

DrChinese said:
You cannot get a realistic theory with ANY function where there is rotational invariance, as Mermin demonstrated. Hess has provided nothing for me to reject other than his conclusion. There is no realistic model. Again.
Thanks for reminding me of this! In fact this is a fully general feature of all vectors, including vectorial components of classical objects!

Consider a pair of unit vectors, P and Q, and define R by the componentwise products of their coordinates. You can even view them as a pair of balls colliding in space.
Rx = PxQx
Ry = PyQy
Rz = PzQz

Now consider the perpendicular case, say P = (1, 0, 0) and Q = (0, 1, 0):
Rx = 0
Ry = 0
Rz = 0

But if we rotate this coordinate system 45 degrees about the z axis, the same two vectors give:
Rx = 1/2
Ry = -1/2
Rz = 0

But nature doesn't care how we label our coordinate system, or that these labels give us incongruent values. The fact remains that 'reality' is the same (even the same instance of reality) no matter how we choose our coordinate basis. If we claim unreality on the grounds that 1/2 = -1/2 = 0 is false, that's our problem, not one of reality. The arbitrary-angle-versus-relative-offset requirement in BI imposes this absolutely general incongruence of values on ALL vectors, not just polarizer settings in EPR experiments, but ANY classical vector.

Based on this, following the BI version of realism, it's trivially provable that, since the vectorial components of pool balls are measurable properties, pool balls are not real.
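For anyone who wants to check the arithmetic above, here's a minimal numerical sketch (my own, assuming a rotation of the coordinate frame by 45 degrees about the z axis):

[code]
import numpy as np

P = np.array([1.0, 0.0, 0.0])   # perpendicular unit vectors
Q = np.array([0.0, 1.0, 0.0])

def rotate_frame_z(deg):
    # components of a fixed vector expressed in a frame rotated by `deg` about z
    t = np.deg2rad(deg)
    return np.array([[ np.cos(t), np.sin(t), 0.0],
                     [-np.sin(t), np.cos(t), 0.0],
                     [ 0.0,       0.0,       1.0]])

print(P * Q)                     # componentwise products: [0. 0. 0.]
R = rotate_frame_z(45.0)
print((R @ P) * (R @ Q))         # same vectors, rotated labels: [ 0.5 -0.5  0. ]
[/code]

Same two vectors, same physical situation; only the labels changed, yet the componentwise 'values' are no longer the same numbers.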

DrChinese said:
QM does not ask for counterfactuality, so your argument is backwards. It is realism that requires extra assumptions, not QM.
Again, that was exactly my point. It was imposed on QM, not asked for by QM. In fact I went further and said it's invalid under QM alone, just as it's invalid as settings for 3 unfair coins. It remains confusing how my point is continually objected to by restating my point.

DrChinese said:
So if you think these requirements are absurd, well, that would simply mean you reject realism. Sorry, but you cannot have your cake and eat it too.
So if realism requires this extra assumption, this means that, since the vectorial components of pool ball collisions are measurable, yet incongruent wrt values, pool balls can't be real.

DrChinese said:
So define realism however you like. Define it like Hess if that makes you happy (or whatever his latest absurd definition of the week happens to be). But I won't agree that day is night, that blue is red, or whatever. I will stick with Einstein's elements of reality.
I don't agree that 1/2 = -1/2 = 0 either, but if that means I'm supposed to define the pool balls as not real on these grounds, I'll pass.
 
  • #636
my_wan said:
That may be the strongest objection yet. Like trying to define 3 coins that can all land on opposite sides, because counterfactually any 2 can.
You may be interested in this post from a previous thread that discusses the issue raised by Hess, clearly showing that correct labelling of variables is paramount to understanding violations of BIs.
https://www.physicsforums.com/showpost.php?p=2707087&postcount=69

Consider a certain disease that strikes persons in different ways depending on circumstances. Assume that we deal with sets of patients born in Africa, Asia and Europe (denoted a,b,c). Assume further that doctors in three cities Lyon, Paris, and Lille (denoted 1,2,3) are assembling information about the disease. The doctors perform their investigations on randomly chosen but identical days (n) for all three, where n = 1,2,3,...,N for a total of N days. The patients are denoted Alo(n) where l is the city, o is the birthplace and n is the day. Each patient is then given a diagnosis of A = +1/-1 based on presence or absence of the disease. So if a patient from Europe examined in Lille on the 10th day of the study was negative, A3c(10) = -1.

According to the Bell-type Leggett-Garg inequality

Aa(.)Ab(.) + Aa(.)Ac(.) + Ab(.)Ac(.) >= -1

In the case under consideration, our doctors can combine their results as follows

A1a(n)A2b(n) + A1a(n)A3c(n) + A2b(n)A3c(n)

It can easily be verified that by combining any possible diagnosis results, the Leggett-Garg inequality will not be violated, as the result of the above expression will always be >= -1 so long as the cyclicity (XY+XZ+YZ) is maintained. Therefore the average result will also satisfy that inequality and we can therefore drop the indices and write the inequality based only on place of origin as follows:

<AaAb> + <AaAc> + <AbAc> >= -1

Now consider a variation of the study in which only two doctors perform the investigation. The doctor in Lille examines only patients of type (a) and (b) and the doctor in Lyon examines only patients of type (b) and (c). Note that patients of type (b) are examined twice as much. The doctors not knowing, or having any reason to suspect that the date or location of examinations has any influence decide to designate their patients only based on place of origin.

After numerous examinations they combine their results and find that

<AaAb> + <AaAc> + <AbAc> = -3

They also find that the single outcomes Aa, Ab, Ac appear randomly distributed over +1/-1, and they are completely baffled. How can single outcomes be completely random while the products are not? After lengthy discussions they conclude that there must be a superluminal influence between the two cities.

But there are more reasonable explanations. Note that by measuring in only two cities they have removed the cyclicity intended in the original inequality. It can easily be verified that the following scenario will result in what they observed:

- on even dates Aa = +1 and Ac = -1 in both cities while Ab = +1 in Lille and Ab = -1 in Lyon
- on odd days all signs are reversed

In the above case
<A1aA2b> + <A1aA2c> + <A1bA2c> = -3
which is consistent with what they saw. Note that this expression does NOT maintain the cyclicity (XY+XZ+YZ) of the original inequality for the situation in which only two cities are considered and one group of patients is measured more than once. But by dropping the indices for the cities, it gives the false impression that the cyclicity is maintained.

The reason for the discrepancy is that the data is not indexed properly in order to provide a data structure that is consistent with the inequalities as derived. Specifically, the inequalities require cyclicity in the data, and since experimenters cannot possibly know all the factors in play in order to know how to index the data to preserve the cyclicity, it is unreasonable to expect their data to match the inequalities.

For a fuller treatment of this example, see Hess et al, Possible experience: From Boole to Bell. EPL. 87, No 6, 60007(1-6) (2009)

Note that in deriving Bell's inequalities, Bell used Aa(λ), Ab(λ), Ac(λ), where the hidden variables λ are the same for all three angles. For this to correspond to the Aspect-type experimental situation, the hidden variables must be exactly the same for all the angles, which is an unreasonable assumption because each particle could have its own hidden variables, the measurement devices could each have their own hidden variables, and the time of measurement after emission is itself a hidden variable. So it is more likely than not that the hidden variables will be different for each measurement. However, in actual experiments the photons are only measured in pairs (a,b), (a,c) and (b,c). The experimenters, not knowing the exact nature of the hidden variables, cannot possibly collect the data in a way that ensures the cyclicity is preserved. Therefore, it is not possible to perform an experiment that can be compared with Bell's inequalities.
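Here is a quick numerical check of both claims (a sketch of my own, encoding the even/odd-day scheme described above):

[code]
from itertools import product

# With cyclicity, XY + XZ + YZ >= -1 for every assignment of +/-1 outcomes,
# so the averaged inequality follows for any mixture of such assignments.
assert all(x*y + x*z + y*z >= -1 for x, y, z in product([+1, -1], repeat=3))

# Two-city protocol with the cyclicity broken:
#   even days: Aa = +1, Ac = -1 in both cities; Ab = +1 in Lille, -1 in Lyon
#   odd days:  all signs reversed
def outcomes(day):
    s = +1 if day % 2 == 0 else -1
    lille = {"a": +s, "b": +s}   # Lille examines only type (a) and (b) patients
    lyon  = {"b": -s, "c": -s}   # Lyon examines only type (b) and (c) patients
    return lille, lyon

N = 1000
ab = ac = bc = 0.0
for n in range(N):
    lille, lyon = outcomes(n)
    ab += lille["a"] * lyon["b"]
    ac += lille["a"] * lyon["c"]
    bc += lille["b"] * lyon["c"]

print(ab/N + ac/N + bc/N)   # -3.0, while each single outcome averages to 0
[/code]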
 
  • #637
ThomasT said:
Wrt a), they're in the literature, and they encode the more reasonable interpretation of EPR's conception of reality that would allow joint detection to be represented in a nonseparable form. Wrt b), I don't understand what you're asking.

I am asking for a reference for these alternative reasonable definitions. I am not familiar with any others generally out there. I have yet to see actual proposed ones.

As to b): if you think there is a classical counterexample, give it! And please, nothing where we have an unfair sample of doctors or similar. A full universe. You won't be able to do it!
 
  • #638
my_wan said:
Which was exactly my point. Bell counterfactually imposed a condition on QM that QM doesn't allow.

You still have it backwards. QM has nothing to do with it. And I don't follow your pool ball example.
 
  • #639
billschnieder said:
You may be interested in this post from a previous thread that discusses the issue raised by Hess, clearly showing that correct labelling of variables is paramount to understanding violations of BIs.
https://www.physicsforums.com/showpost.php?p=2707087&postcount=69

Again, this example is an attempt to show that there is an unfair sample of a full universe. This is not Bell at all. For Bell, you use a full universe of trials.
 
  • #640
DrChinese said:
I am asking for a reference for these alternative reasonable definitions. I am not familiar with any others generally out there. I have yet to see actual proposed ones.

As to b): if you think there is a classical counterexample, give it! And please, nothing where we have an unfair sample of doctors or similar. A full universe. You won't be able to do it!
The few that I know of aren't published. Just in preprint. You're probably aware of most, if not all, of them. Since it's unlikely that any of it will get published, there's not much to discuss.

By the way, after thinking about it, I think you're right about Bell's (2) wrt EPR settings. There's nothing special about these settings except that a certain LR implementation of Bell's (2) does agree with qm for EPR settings (and |a-b| = 45°). But other than that, no. And after looking at Bell's paper more closely, it does seem that he's showing that an LR implementation of (2) is incompatible with all qm predictions, for any settings.
 
  • #641
DrChinese said:
You still have it backwards. QM has nothing to do with it. And I don't follow your pool ball example.

But QM does have something to do with it. QM says the counterfactual case is interfered with by some mechanism, often presumed FTL. Perhaps it's really interfering with the absurdity of the preconditions placed on the HVs from the start.

When you say you don't follow my pool ball example, what don't you get? Can you look at any single vectorial outcome of a classical interaction, and ask what that vectorial product looks like under different coordinate rotations? The 'values' of the 'predicted' measurables are incompatible with those 'values' obtained under an arbitrary coordinate rotation. Per the realism definition, the predictability of those values, given any particular rotation, requires us to define those vectorial products as 'elements of reality'. But rotate your coordinate system on this same physical event and the value obtained from this 'element of reality' becomes inconsistent with the prior 'value' of the same event.
 
  • #642
DrChinese said:
And please, nothing where we have an unfair sample of doctors or similar.

:smile: :smile: :smile:
 
  • #643
my_wan said:
No, that's not what I mean. But, at the end of the day, pragmatism trumps preconceptions. The empirical is what you take to the bank. We don't get to decide what nature is and isn't. I have an extensive list of classical analogs of QM, but in the general case it still breaks, and why it always breaks is tied with the issues in BI violations in some ways. So just because I agree with you, in principle, that there may be realistic mechanisms by some definition, I can't claim it must be so just because I can justify it in principle. It's just as unreasonable to marginalize people for trying, as it is unreasonable to claim what is true based solely on what can be heuristically justified.
Thanks for the reply. Some interesting considerations for future threads.
 
  • #644
DrChinese said:
Again, this example is an attempt to show that there is an unfair sample of a full universe. This is not Bell at all. For Bell, you use a full universe of trials.
No, only that it's, in principle, possible that the sampling used by Bell isn't valid. It has nothing to do with the full universe's sampling, and everything to do with how we choose to define our sampling of it.

Wrt the vector argument, there's a very simple reason vectors are not generally rotation invariant. When looking at the product of a pair of vectors, the defining vectors are indeterminate. That is, there is an arbitrarily large number of possible vector pairs that could have created that product. That one vector can even represent the result of an arbitrarily large number of products of actual vectors.

So when Bell's inequality specifies that any arbitrary theta must be modeled to avoid FTL, it could be requiring the particular vectorial instances that defined a vector to be uniquely identified after the fact. This is impossible even for the product of a single pair of vectors, given only a single resulting vector.

---------
I hoped a more reasonable objection would be posed. Like the fact that the pool ball example doesn't define a rotationally invariant function, like Mermin defined. Whereas QM is dependent on the existence of rotationally invariant functions. So how can a rotationally invariant function represent anything real, if the pool ball analogy holds?

Consider a particle with a real default polarization. The default polarization is -only- unique in that a polarizer at that setting has essentially a 100% chance of passing that particle, but offset settings can also pass it some of the time via cos^2(theta). If the default polarization is completely randomized over a large group of particles, it's physically impossible for any one polarizer setting to define a detection statistic unique to that setting. Only the particular sequence of detections can be unique -- case instances of a probability which, by definition, are not themselves probabilities -- and it is these sequences that are later used to define the coincidences. Thus the probability functions, not the case instances that result from them, are physically -required- to be rotationally invariant.

This would explain also why I could only model inequality violations, in my computer simulations, if I restricted one or the other setting to be defined at 0 angle. Even though the 0 angle could be arbitrarily chosen, and this didn't uniquely identify the other detector setting in calculating the detection sequence for later comparison to derive coincidences. Perhaps I should attempt to replace my binary bits with some form of predefined sets of vectors.
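To put a number on that (a quick sketch of my own, assuming a Malus-style cos^2 pass probability and a uniformly randomized default polarization):

[code]
import numpy as np

rng = np.random.default_rng(1)
N = 500_000
lam = rng.uniform(0.0, np.pi, N)   # fully randomized default polarizations

def singles_rate(theta):
    # probabilistic Malus-style rule: pass probability cos^2(theta - lambda)
    p = np.cos(theta - lam) ** 2
    return np.mean(rng.random(N) < p)

for deg in [0, 15, 30, 45, 60, 75, 90]:
    print(deg, round(singles_rate(np.deg2rad(deg)), 3))   # ~0.5 at every setting
[/code]

Every setting yields the same ~50% singles rate, so no single polarizer angle is statistically distinguishable on its own; only the pairing of detection sequences carries any angle dependence.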
 
  • #645
DrChinese said:
Again, this example is an attempt to show that there is an unfair sample of a full universe. This is not Bell at all. For Bell, you use a full universe of trials.

I suspect you have at the ready a citation to a published example of an experiment in which they made absolutely sure that the full universe of all possible values of all hidden variables was realized.

In any case, the point of the example is that by making similar errors in macroscopic situations, you can create FTL paradoxes at the macroscopic level too.

As my_wan already explained, the error is NOT the fact that the data is incomplete or sampled unfairly. The error is the fact that the data is not properly indexed according to the contexts. In other words, multiple contexts are mixed together in a manner not expected by the inequalities, such that the only cases for which the inequalities will be universally valid irrespective of how the data is indexed, are non-contextual variables.
 
  • #646
billschnieder said:
I suspect you have at the ready a citation to a published example of an experiment in which they made absolutely sure that the full universe of all possible values of all hidden variables was realized...

I think you are admitting that Bell is correct. There is no local realistic universe.

And we have already been through the bit about the fair sampling. There are a number of experiments that do not require the fair sampling assumption that I have already referenced. You reject all counterevidence so it is meaningless to discuss this with you.

I suspect you even like the Monkees more than the Beatles. :biggrin:
 
  • #647
my_wan said:
So when Bell's inequality specifies that any arbitrary theta must be modeled to avoid FTL, it could be requiring the particular vectorial instances that defined a vector to be uniquely identified after the fact. This is impossible even for the product of a single pair of vectors, given only a single resulting vector.

...

This would explain also why I could only model inequality violations, in my computer simulations, if I restricted one or the other setting to be defined at 0 angle. Even though the 0 angle could be arbitrarily chosen, and this didn't uniquely identify the other detector setting in calculating the detection sequence for later comparison to derive coincidences. Perhaps I should attempt to replace my binary bits with some form of predefined sets of vectors.

All it takes is one.
 
  • #648
DrChinese said:
All it takes is one.
What conditions must be met to qualify as this "one"? Does it require an explicit theory that explains exactly the mechanism by which BI violations occur? Does it simply require a toy model which mimics such violations? Are these (toy) models required to be rotationally invariant, in spite of the fact that you pointed out Mermin showed rotational invariance entails non-real variables? Are you denying that realistic classical variables exist that lack rotational invariance?

Why do I ask so many questions at once? Because I give detailed explanations and spread dozens of questions, and you never answer anyway, so I ask in irony. Your responses, or lack thereof, are tipping my scales.

Look at how you just now responded to the -RAW- sarcasm of billschnieder, by accusing him of an admission of your point of view.

DrChinese said:
I think you are admitting that Bell is correct. There is no local realistic universe.

And we have already been through the bit about the fair sampling. There are a number of experiments that do not require the fair sampling assumption that I have already referenced. You reject all counter evidence so it is meaningless to discuss this with you.

I suspect you even like the Monkees more than the Beatles. :biggrin:

The "fair sampling" accusation here is facetious as hell., and you know it. You've taken a prior issue, where the sampling accuracy of the experimental data was questioned, and pretending it's the same argument contained in a toy model describing contextually of variables. I think you know good and well that "fair sampling" is a separate issue, and if I'm wrong, why respond to explanations that its not with accusations of your opponents admitting they are wrong in the same post they are explaining this to you?

You've continually mirrored my points, as if they were a denial of my points. You've claimed ignorance repeatedly, even of something so simple as a vectorial product before and after a coordinate rotation. Failed to explain your position when asked. Failed to even explain what you claimed not to understand, even when asked. Continually, and chronically, failed to provide any context, justification, or explanation for your rebuttals. You even fail to specify what, in a post into which much effort was put, your rebuttals actually refer to.

What's your point of even debating here? To run authoritative interference of any viewpoints you don't like?

Show me I'm wrong, and answer some questions. Ask some questions if something's not clear. But to simply keep accusing people of viewpoints they are expending great effort and honest feedback to deny, with no good faith rebuttals in return, looks awfully bad to me. My apologies for the aggravation, but it appears well called for when I'm not the only one getting this treatment, and others seem to understand what you deny understanding of. Without even the courtesy of questions to indicate where the clarity was lacking, only accusations of admissions to your point of view.

The real shame is that you have a perspective I really really want to understand.
 
  • #649
RUTA said:
Changes in the setup are all you need to change the distribution of outcomes -- you don't need any 'thing' other than equipment characteristics, i.e., no reference to quantum entities, waves, etc. For example, see section 4.3 Geometrical Account of QLE starting on p. 28 of our FoP paper, http://users.etown.edu/s/stuckeym/FOP 2008.pdf. In particular notice how Eq. 31 becomes Eq. 32 on p. 29.


Thanks RUTA, for the interesting paper.
"[URL Spacetime and the Quantum:
Relational Blockworld and the Quantum Liar Paradox[/B][/URL]

W.M. Stuckey • Michael Silberstein • Michael Cifone

...
It is now generally believed that Einstein-Podolsky-Rosen (EPR) correlations, i.e., correlated space-like separated experimental outcomes which violate Bell’s inequality, force us to abandon either the separability or locality principle.
...
As we will show, the Relational Blockworld [9–11] interpretation of NRQM points to a far more intimate and unifying connection between spacetime and the quantum than most have appreciated.
...
RBW employs the spatiotemporal relations via symmetries of the entire (past, present and future) experimental configuration and is thus fundamentally kinematical. And unlike other BW inspired accounts of quantum mechanics such as BCQM, RBW is truly acausal, adynamical and atemporal.


I guess that in this field of view you are right – there is nothing 'special' about the wires 'interacting' with the wave function in the Afshar experiment – it’s just a different setup that generates a different mathematical (RBW) formula to explain what happens.

For a layman like me it’s quite easy, if one has to choose what can be considered more real than the other – and I intuitively choose the experiment and the physical wires, not the RBW mathematics, as my reality. Maybe future progress in science can physically prove me wrong – and then I have to change my mind, whether I like it or not.

Another 'objection' I have against RBW is the way you get rid of 'one problem' (separability vs. locality principle) by introducing another (to me stranger) 'phenomenon': symmetries of the entire past, present and future + something truly acausal, adynamical and atemporal...?

It doesn’t give me that 'natural' "WOW-feeling"... but maybe it’s idiocy to look/hope for some "human/natural/logical" explanation to what’s going on in the QM-world... ?:bugeye:?
 
  • #650
my_wan said:
What conditions must be met to qualify as this "one"? Does it require an explicit theory that explains exactly the mechanism by which BI violations occur? Does it simply require a toy model which mimics such violations? Are these (toy) models required to be rotationally invariant, in spite of the fact that you pointed out Mermin showed rotational invariance entails non-real variables? Are you denying that realistic classical variables exist that lack rotational invariance?

Why do I ask so many questions at once? Because I give detailed explanations and spread dozens of questions, and you never answer anyway, so I ask in irony. Your responses, or lack thereof, are tipping my scales.

Look at how you just now responded to the -RAW- sarcasm of billschnieder, by accusing him of an admission of your point of view.



The "fair sampling" accusation here is facetious as hell., and you know it. You've taken a prior issue, where the sampling accuracy of the experimental data was questioned, and pretending it's the same argument contained in a toy model describing contextually of variables. I think you know good and well that "fair sampling" is a separate issue, and if I'm wrong, why respond to explanations that its not with accusations of your opponents admitting they are wrong in the same post they are explaining this to you?

You've continually mirrored my points, as if they were a denial of my points. You've claimed ignorance repeatedly, even of something so simple as a vectorial product before and after a coordinate rotation. Failed to explain your position when asked. Failed to even explain what you claimed not to understand, even when asked. Continually, and chronically, failed to provide any context, justification, or explanation for your rebuttals. You even fail to specify what, in a post into which much effort was put, your rebuttals actually refer to.

What's your point of even debating here? To run authoritative interference of any viewpoints you don't like?

Show me I'm wrong, and answer some questions. Ask some questions if something's not clear. But to simply keep accusing people of viewpoints they are expending great effort and honest feedback to deny, with no good faith rebuttals in return, looks awfully bad to me. My apologies for the aggravation, but it appears well called for when I'm not the only one getting this treatment, and others seem to understand what you deny understanding of. Without even the courtesy of questions to indicate where the clarity was lacking, only accusations of admissions to your point of view.

The real shame is that you have a perspective I really really want to understand.

I have repeatedly indicated that it is not possible to come up with a local realistic dataset. That is all it takes to refute Bell. You tried and failed, as have others - including myself! Yes, I have tried to break Bell many times, and this has taught me where its strengths are.

I am trying to be patient, but what I actually have on my hands is people who deny a standard scientific viewpoint which has been scrutinized by thousands. Which has over a thousand papers published annually with both theoretical and experimental support. And you are treating my position - the standard one - as if it requires non-stop defense. Well, actually the burden is on you.

Yes, I know that science can be wrong, but that possibility does not make it wrong and is in NO WAY a help to your position. It should give you pause, however, in your assertions. I know that whenever I find myself speculating against the mainstream, that is usually a sure sign that I need to do some additional research. I do actively disagree with some scientific research - especially in the area of human research studies - but I temper that with the desire to present something USEFUL in its place. I have repeatedly suggested that you do the same (see the options below).

It does little good for you to say "I reject standard definitions" and "I need more proof". I can't provide that for you, and in fact no one can. Only you can do that.

-----------------------------

If you think that Bell is wrong, consider one of the following:

a) Come up with a dataset. You should have already learned the difficulty with this after developing your model.
b) Come up with a different and better definition of locality that yields the possibility of a local realistic dataset (under this revised definition). Then present it and see if others accept this as an alternative definition.
c) Come up with a different and better definition of realism that yields the possibility of a local realistic dataset (under this revised definition). Then present it and see if others accept this as an alternative definition.

As to the fair sampling assumption: billschnieder brought it up again, not me. It does not properly belong in discussions of EPR and Bell, instead is more appropriate to the evaluation of Bell tests. I take this area quite seriously and am involved in active research into such models, specifically looking at the De Raedt local realistic models. But please note: they are the ONLY team that has EVER - to my knowledge - provided a local realistic dataset to critique. So you are far off the mark, my friend. Give me something specific that addresses the meat of the subject.

Stuff like your billiard ball example - and billschnieder's African doctors - does not come close. This is the quantum world, and experimentalists are running hundreds of experiments that are flat out in contradiction to the local realistic world.
 
