Arguments Against Superdeterminism

by ueit
Tags: arguments, superdeterminism
kote
#91
Aug14-09, 09:14 AM
P: 871
Quote Quote by ThomasT View Post
A higher order explanation isn't necessarily superfluous, though it might, in a certain sense, be considered as such if there exists a viable lower order explanation for the same phenomenon.
ThomasT,

If there is no viable lower order explanation then by definition you aren't dealing with a higher order explanation. Higher order explanations, as such, are not necessary, and unless they are reducible to first order explanations, they cannot be sufficient either.

Basically, they aren't true causes (or explanations) at all.
ThomasT
#92
Aug14-09, 04:01 PM
P: 1,414
Quote Quote by ueit View Post
What I am interested in is local determinism, as non-local determinism is clearly possible (Bohm's interpretation). There are many voices that say that it has been ruled out by EPR experiments.
Lhv formalisms of quantum entangled states are ruled out -- not the possible existence of lhv's. As things stand now, there's no conclusive argument for either locality or nonlocality in Nature. But the available physical evidence suggests that Nature behaves deterministically according to the principle of local causation.

Quote Quote by ueit View Post
I am interested in seeing if any strong arguments can be put forward against the use of SD loophole (denying the statistical independence between emitter and detector) to reestablish local deterministic theories as a reasonable research object.
You've already agreed that the method of changing the polarizer settings, and whether or not they're changed while the emissions are in flight toward the polarizers, is irrelevant to the rate of joint detection.

The reason that Bell inequalities are violated has to do with the formal requirements due to the assumption of locality. This formal requirement also entails statistical independence of the accumulated data sets at A and B. But entanglement experiments are designed and executed to produce statistical dependence via the pairing process.

There's no way around this unless you devise a model that can actually predict individual detections.

Or you could reason your way around the difficulty by noticing that the hidden variable (i.e., the specific quality of the emission that might cause enough of it to be transmitted by the polarizer to register a detection) is irrelevant wrt the rate of joint detection; the only thing that matters there is the relationship, presumably produced via simultaneous emission, between the two opposite-moving disturbances. This preserves the idea that the correlations are due to local interactions/transmissions, while at the same time modelling the joint state in a nonseparable form. Of course, then you wouldn't have an explicitly local, explicitly hidden variable model, but rather something along the lines of standard qm.
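
To put a number on that formal point, here is a minimal check (a sketch of my own, in Python; the angles are the standard CHSH choices and the correlation function is the qm prediction for polarization-entangled photons):

[code]
import numpy as np

# qm correlation for polarization-entangled photons at analyzer angles a, b (radians)
def E(a, b):
    return np.cos(2 * (a - b))

# standard CHSH angles: a = 0, a' = 45 deg, b = 22.5 deg, b' = 67.5 deg
a, a_p = 0.0, np.pi / 4
b, b_p = np.pi / 8, 3 * np.pi / 8

S = E(a, b) - E(a, b_p) + E(a_p, b) + E(a_p, b_p)
print(S)  # ~2.828 = 2*sqrt(2)
[/code]

Any model meeting the locality-plus-statistical-independence requirements is stuck at |S| <= 2; that gap is the whole issue.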
SW VandeCarr
#93
Aug14-09, 05:38 PM
P: 2,501
Quote Quote by ThomasT View Post

Yes, of course, the presumed existence of hidden variables comes with the assumption of determinism. I don't think anybody would deny that hidden variables exist.
Well, if you assume D then you must be including local hidden variables. Therefore you're rejecting both Bell's Theorem and the Heisenberg uncertainty principle. Moreover, I guess quantum fluctuations at the Planck scale could not be random either.

My reductio ad absurdum argument was based on thermodynamics, which at the theoretical level is based on probabilities. If a system can only exist in one possible state and can only transit into one other possible state, there is no Markov process. All states (past, present, and future) exist with p=1 or p=0. Under D, probabilities can only reflect our uncertainty. If you plug only 0s and 1s into the Gibbs equation, the entropy comes out to exactly zero (taking p ln p -> 0 as p -> 0). Any values in between (under D) are merely reflections of our uncertainty. Yet we can actually measure finite nonzero values of entropy in experiments (defined as Q/T, or heat/temperature). Such results cannot be only reflections of our uncertainty. Remember, there is no statistical independence under D.
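
Here is a minimal sketch of that extreme case (my own illustration, in Python, with k set to 1 and the standard convention that a p=0 term contributes nothing):

[code]
import numpy as np

def gibbs_entropy(p, k=1.0):
    # S = -k * sum(p_i * ln(p_i)), taking 0*ln(0) = 0
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]  # p = 0 terms contribute nothing
    return -k * np.sum(nz * np.log(nz))

print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0 -- deterministic, no uncertainty
print(gibbs_entropy([0.25] * 4))            # ~1.386 = ln(4) -- maximal uncertainty
[/code]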

None of this either proves or disproves D. I don't think it can be done. It seems to be essentially a metaphysical issue. However, it seems to me (I'm not a physicist) like you have to give up a lot to assume D at all scales.
ThomasT
#94
Aug14-09, 09:13 PM
P: 1,414
Quote Quote by SW VandeCarr View Post
Well, if you assume D then you must be including local hidden variables. Therefore you're rejecting both Bell's Theorem and the Heisenberg uncertainty principle. Moreover, I guess quantum fluctuations at the Planck scale could not be random either.
Yes, I'm including local hidden variables. Bell's analysis has to do with the formal requirements of lhv models of entangled states, not with what might or might not exist in an underlying quantum reality. The HUP has to do with the relationship between measurements on canonically conjugate variables: the product of the statistical spreads is equal to or greater than [tex]\hbar/2[/tex]. Quantum fluctuations come from an application of the HUP. None of this tells us whether or not there is an underlying quantum reality. I would suppose that most everybody believes there is. It also doesn't tell us whether Nature is local or nonlocal. So, the standard assumption is that it's local.
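
(As an aside, it's easy to check numerically that a Gaussian wavepacket saturates that bound. This is just a sketch of my own, in Python, with hbar set to 1 and arbitrary grid choices:)

[code]
import numpy as np

hbar = 1.0
N = 2**14
x = np.linspace(-50.0, 50.0, N, endpoint=False)
dx = x[1] - x[0]
sigma = 2.0

# normalized Gaussian wavepacket; |psi|^2 has position spread sigma
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)
sigma_x = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)

# momentum-space amplitude via FFT; grid spacing dp = 2*pi*hbar / (N*dx)
p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dx)
dp = 2 * np.pi * hbar / (N * dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi * hbar)
sigma_p = np.sqrt(np.sum(p**2 * np.abs(phi)**2) * dp)

print(sigma_x * sigma_p / hbar)  # ~0.5: the Gaussian saturates the hbar/2 bound
[/code]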


Quote Quote by SW VandeCarr View Post
None of this either proves or disproves D. I don't think it can be done.
I agree.

Quote Quote by SW VandeCarr View Post
It seems to be essentially a metaphysical issue.
I suppose so, but not entirely insofar as metaphysical constructions can be evaluated wrt our observations of Nature. And I don't think that one has to give up anything that's accepted as standard mainstream physical science to believe in a locally deterministic underlying reality.
ThomasT
#95
Aug14-09, 10:11 PM
P: 1,414
Quote Quote by kote View Post
Local causation stemming from real classical particles and waves has been falsified by experiments. EPRB type experiments are particularly illustrative of this fact.
These are formal issues, not matters of fact about what is or isn't true of an underlying reality.

Quote Quote by kote View Post
If there is evidence of deep reality being deterministic, I would like to know what it is :).
It's all around you. Order and predictability are the rule in physical science, not the exception. The deterministic nature of things is apparent on many levels, even wrt quantum experimental phenomena. Some things are impossible to predict, but, in general, things are not observed to happen independently of antecedent events. The most recent past (the present) differs only slightly from the state one second before. Take a movie of any physical process that you can visually track and look at it frame by frame.

There isn't any compelling reason to believe that there aren't any fundamental deterministic dynamics governing the evolution of our universe, or that the dynamics of waves in media are essentially different wrt any scale of behavior. In fact, quantum theory incorporates lots of classical concepts and analogs.

Quote Quote by kote View Post
As for the universe being locally deterministic, this has been proven impossible.
This is just wrong. Where did you get this from?

Anyway, maybe you should start a new thread here in the philosophy forum on induction and/or determinism. I wouldn't mind discussing it further, but I don't think we're helping ueit wrt the thread topic.
SW VandeCarr
#96
Aug15-09, 04:33 AM
P: 2,501
Quote Quote by ThomasT View Post
I suppose so, but not entirely insofar as metaphysical constructions can be evaluated wrt our observations of Nature. And I don't think that one has to give up anything that's accepted as standard mainstream physical science to believe in a locally deterministic underlying reality.
I think we may have to give up more if we want D. You didn't address my thermodynamic argument. Entropy is indeed a measure of our uncertainty regarding the state of a system. We already agreed that our uncertainty has nothing to do with nature. Yet how is it that we can measure entropy as the relation Q/T? The following shows how we can derive the direct measure of entropy from first principles (courtesy of Count Iblis):

http://en.wikipedia.org/wiki/Fundame...rst_principles


The assumption behind this derivation is that the position and momentum of each of N particles (atoms or molecules) in a gas are uncorrelated (statistically independent). However D doesn't allow for statistical independence. Under D the position and momentum of each particle at any point in time is predetermined. Therefore there is full correlation of the momenta of all the particles.

What is actually happening when the experimenter heats the gas and observes a change in the Q/T relation (entropy increases)? Under D the whole experiment is a predetermined scenario with the actions of the experimenter included. The experimenter didn't decide to heat the gas or even set up the experiment. The experimenter had no choice. She or he is an actor following the deterministic script. Everything is correlated with everything else with measure one. There really is no cause and effect. There is only the predetermined succession of states. Therefore you're going to have to give up the usual (strong) form of causality where we can perform experimental interventions to test causality (if you want D).

Causality is not defined in mathematics or logic. It's usually defined operationally: given that A is the necessary, sufficient and sole cause of B, if you remove A, then B cannot occur. Well, under D we cannot remove A unless it was predetermined that p(B)=0. At best, we can have a weak causality where we observe a succession of states that are inevitable.
ueit
#97
Aug15-09, 07:45 AM
P: 389
Quote Quote by kote View Post
ueit,

Statistics are calculated mathematically from individual measurements. They are aggregate observations about likelihoods. Determinism deals with absolute rational causation. It would be an inductive fallacy to say that statistics can tell us anything about the basic mechanisms of natural causation.
There is no fallacy here. One may ask what deterministic models could fit the statistical data. If you are lucky you may falsify some of them and find the "true" one. There is no guarantee of success but there is no fallacy either.

Linguistically, we may use statistics as 2nd order explanations in statements of cause and effect, but it is understood that statistical explanations never represent true causation. If determinism exists, there must necessarily be some independent, sufficient, underlying cause - some mechanism of causation.
I don't understand the meaning of "independent" cause. Independent from what? Most probably, the "cause" is just the state of the universe in the past.

Because of the problem of induction, no irreducible superdeterministic explanation can prove anything about the first order causes and effects that basic physics is concerned with.
No absolute proof is possible in science and I do not see any problem with that. Finding a SD mechanism behind QM could lead to new physics and I find this interesting.

The very concept of locality necessarily implies classical, billiard-ball style, momentum transfer causation. The experiments of quantum mechanics have conclusively falsified this model.
This is false. General relativity and classical electrodynamics are local theories, yet they are not based on the billiard-ball concept but on fields.
ueit
#98
Aug15-09, 08:25 AM
P: 389
Quote Quote by ThomasT View Post
Lhv formalisms of quantum entangled states are ruled out -- not the possible existence of lhv's. As things stand now, there's no conclusive argument for either locality or nonlocality in Nature. But the available physical evidence suggests that Nature behaves deterministically according to the principle of local causation.

You've already agreed that the method of changing the polarizer settings, and whether or not they're changed while the emissions are in flight toward the polarizers, is irrelevant to the rate of joint detection.

The reason that Bell inequalities are violated has to do with the formal requirements due to the assumption of locality. This formal requirement also entails statistical independence of the accumulated data sets at A and B. But entanglement experiments are designed and executed to produce statistical dependence via the pairing process.

There's no way around this unless you devise a model that can actually predict individual detections.

Or you could reason your way around the difficulty by noticing that the hidden variable (i.e., the specific quality of the emission that might cause enough of it to be transmitted by the polarizer to register a detection) is irrelevant wrt the rate of joint detection; the only thing that matters there is the relationship, presumably produced via simultaneous emission, between the two opposite-moving disturbances. This preserves the idea that the correlations are due to local interactions/transmissions, while at the same time modelling the joint state in a nonseparable form. Of course, then you wouldn't have an explicitly local, explicitly hidden variable model, but rather something along the lines of standard qm.
I think I should better explain what I think happens in an EPR experiment.

1. At source location, the field is a function of the detectors' state. Because the model is local, this information is "old". If the detectors are 1 ly away, then the source "knows" the detectors' state as it was 1 year in the past.

2. From this available information and the deterministic evolution law the source "computes" the future state of the detectors when the particles arrive there.

3. The actual spin of the particles is set at the moment of emission and does not change on flight.

4. The correlations are a direct result of the way the source "chooses" the spins of the entangled particles. It so happens that this "choice" follows Malus's law.

In conclusion, changing the detectors before detection has no bearing on the experimental results, because these changes are taken into account when the source "decides" the particles' spin. Bell's inequality is based on the assumption that the hidden variable that determines the particle spin is not related to the way the detectors are positioned. The above model denies this. Both the position of the detector and the spin of the particle are a direct result of the past field configuration.
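
To show the bookkeeping is at least consistent, here is a toy sketch (my own illustration, in Python, not a physical mechanism; a seeded pseudorandom generator stands in for the deterministic evolution law, and the source is simply handed both analyzer angles):

[code]
import numpy as np

rng = np.random.default_rng(7)  # pseudorandomness standing in for the deterministic law

def run_trials(a, b, n=200_000):
    # The source "knows" both analyzer angles a, b (radians) at emission time,
    # so it can assign outcome pairs with P(same outcome) = cos^2(a - b).
    same = rng.random(n) < np.cos(a - b) ** 2
    A = rng.choice([-1, 1], size=n)  # each wing looks locally 50/50
    B = np.where(same, A, -A)
    return A, B

for deg in (0, 22.5, 45, 67.5, 90):
    A, B = run_trials(0.0, np.radians(deg))
    print(deg, round(np.mean(A == B), 3))  # ~cos^2(a-b): 1.0, 0.854, 0.5, 0.146, 0.0
[/code]

Because the outcome assignment depends on both settings, the statistical independence assumption behind Bell's inequality fails by construction, and the correlations come out Malus-like.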
ueit
#99
Aug15-09, 08:33 AM
P: 389
Quote Quote by SW VandeCarr View Post
The assumption behind this derivation is that the position and momentum of each of N particles (atoms or molecules) in a gas are uncorrelated (statistically independent). However D doesn't allow for statistical independence. Under D the position and momentum of each particle at any point in time is predetermined. Therefore there is full correlation of the momenta of all the particles.
The trajectory of the particle is a function of the field produced by all other particles in the universe, therefore D does not require a strong correlation between the particles included in the experiment. Also, I do not see the relevance of predetermination to the issue of statistical independence. The digits of Pi are strictly determined, yet no statistical correlation is found between them. What you need for the entropy law to work is not absolute randomness but pseudorandomness.
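
As a quick empirical check of the Pi claim (a sketch of my own, in Python, assuming the mpmath library is available; the digit count and the chi-square test are arbitrary choices):

[code]
from collections import Counter
from mpmath import mp

mp.dps = 10_000          # work with roughly 10,000 decimal digits of pi
digits = str(mp.pi)[2:]  # drop the leading "3."

counts = Counter(digits)
n = len(digits)
expected = n / 10
chi2 = sum((counts[str(d)] - expected) ** 2 / expected for d in range(10))
print(chi2)  # compare against ~16.9, the 5% critical value for 9 degrees of freedom
[/code]

The digits are fully determined, yet their frequencies are statistically indistinguishable from uniform noise.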
SW VandeCarr
#100
Aug15-09, 02:54 PM
P: 2,501
Quote Quote by ueit View Post
The trajectory of the particle is a function of the field produced by all other particles in the universe, therefore D does not require a strong correlation between the particles included in the experiment. Also, I do not see the relevance of predetermination to the issue of statistical independence. The digits of Pi are strictly determined, yet no statistical correlation is found between them. What you need for the entropy law to work is not absolute randomness but pseudorandomness.
Correlation is the degree of correspondence between two random variables. There are no random variables involved in the computation of pi.

Under D, probabilities only reflect our uncertainty. They have nothing to do with nature (as distinct from ourselves). Statistical independence is an assumption based on our uncertainty. Ten fair coin tosses are assumed to be statistically independent based on our uncertainty of the outcome. We imagine there are 1024 possible outcomes. Under D there is only one possible outcome, and if we had perfect information we could know that outcome.

Under D, not only is the past invariant, but the future is also invariant. If we had perfect information, the future would be as predictable as the past is "predictable". It's widely accepted that completed events have no information value (i.e., p=1) and that information only exists under conditions of our uncertainty.

I agree that with pseudorandomness the thermodynamic laws work, but only because of our uncertainty, given that we lack the perfect information which would (in principle) be available under D.

EDIT: When the correlation ([tex]R^{2}[/tex]) is unity, the description is no longer probabilistic: no particle moves independently of any other. Under D all particle positions and momenta are predetermined. If a full description of particle/field states is in principle knowable in the past, it is knowable in the future under D.
ThomasT
#101
Aug20-09, 02:32 PM
P: 1,414
Quote Quote by SW VandeCarr View Post
What is actually happening when the experimenter heats the gas and observes a change in the Q/T relation (entropy increases)? Under D the whole experiment is a predetermined scenario with the actions of the experimenter included. The experimenter didn't decide to heat the gas or even set up the experiment. The experimenter had no choice. She or he is an actor following the deterministic script. Everything is correlated with everything else with measure one. There really is no cause and effect. There is only the predetermined succession of states. Therefore you're going to have to give up the usual (strong) form of causality where we can perform experimental interventions to test causality (if you want D).

Causality is not defined in mathematics or logic. It's usually defined operationally: given that A is the necessary, sufficient and sole cause of B, if you remove A, then B cannot occur. Well, under D we cannot remove A unless it was predetermined that p(B)=0. At best, we can have a weak causality where we observe a succession of states that are inevitable.
The assumption of determinism and the application of probabilities are independent considerations.

I wouldn't separate causality into strong and weak types. We observe invariant relationships, or predictable event chains, or, as you say, "a succession of states that are inevitable". Cause and effect are evident at the macroscopic scale.

Determinism is the assumption that there are fundamental dynamical rules governing the evolution of any physical state or spatial configuration. We already agreed that it can't be disproven.

The distinguishing characteristic of ueit's proposal isn't that it's deterministic. What sets it apart is that it involves an infinite field of nondiminishing strength centered on polarizers or other filtration/detection devices and/or device combinations, propagating info at c to emission devices and thereby determining the time and type of emission, etc. So far, it doesn't make much sense to me.

We already have a way of looking at these experiments which allows for an implicit, if not explicit, local causal view.

Anyway, his main question about arguments against the assumption of determinism has been answered, and I thought we agreed on this -- there aren't any good ones.
SW VandeCarr
#102
Aug20-09, 03:04 PM
P: 2,501
Quote Quote by ThomasT View Post
Anyway, his main question about arguments against the assumption of determinism has been answered, and I thought we agreed on this -- there aren't any good ones.
Of course you can't disprove or really even argue against metaphysical assumptions (except with other metaphysical assumptions). Nature appears effectively deterministic at macro-scales if we disregard human intervention and human activity in general. At quantum scales, it remains to be proven that hidden variables exist. (Afaik, there is no real evidence for hidden variables.) Therefore strict (as opposed to effective) determinism remains a matter of taste. In any case, to the extent that science uses probabilistic reasoning, science is not based de facto on strict determinism. Thermodynamics is based almost entirely on probabilistic reasoning. Quantum mechanics is deterministic only insofar as probabilities are determined and confirmed by experiment.

(Note: I'm using "effective determinism" in terms of what we actually observe within the limits of measurement, and "strict determinism" as a philosophical paradigm.)
ThomasT
#103
Aug20-09, 08:31 PM
P: 1,414
Quote Quote by SW VandeCarr View Post
At quantum scales, it remains to be proven that hidden variables exist. (Afaik, there is no real evidence for hidden variables).
I think everybody should believe that hidden variables exist, ie., that there are deeper levels of reality than our sensory faculties reveal to us. The evidence is electrical, magnetic, gravitational, etc., phenomena.

Whether local hidden variable mathematical formulations of certain experimental preparations are possible is another thing altogether. This was addressed by Bell.

Ueit is interested in lhv models. Bell says we're not going to have them for quantum entangled states, and so far nobody has found a way around his argument.
WaveJumper
#104
Aug21-09, 04:46 PM
P: 649
Quote Quote by ThomasT View Post
Anyway, his main question about arguments against the assumption of determinism has been answered, and I thought we agreed on this -- there aren't any good ones.

That's right, but how is it different from the idea that we are living in the Matrix? If no good arguments can be put forward against it, is that a model of the universe that you think should even be considered by science?
ThomasT
#105
Aug21-09, 08:26 PM
P: 1,414
Quote Quote by WaveJumper View Post
That's right, but how is it different from the idea that we are living in the Matrix? If no good arguments can be put forward against it, is that a model of the universe that you think should even be considered by science?
One difference is that there are some good reasons to believe in determinism. It seems that our universe is evolving in a somewhat predictable way. There are many particular examples of deterministic evolution on various scales. This suggests some pervading fundamental dynamic(s). So, physics makes that assumption.

We might be in some sort of Matrix. But there's no particular reason to think that we are. The question is, does our shared, objective reality seem more deterministic the more we learn about it?
kote
#106
Aug21-09, 08:34 PM
P: 871
Quote Quote by ThomasT View Post
The question is, does our shared, objective reality seem more deterministic the more we learn about it?
Well, I think if you're looking at it that way the answer is very clearly "no". QM had to go screw things up with the Copenhagen Interpretation, giving up on deterministic objective reality completely. No one questioned it with Newton.

Since you mentioned it, it looks more and more like the world could be discrete, suggesting a structure with limits on its basic numerical accuracy -- very Matrix-like.

I don't think science can tell us anything about the issue though. Hume covered that pretty well in my opinion. I do think the assumption of determinism is a rational extension of logic that needs to be made for the world to be intelligible for us.
ThomasT
#107
Aug21-09, 09:04 PM
P: 1,414
Quote Quote by kote View Post
Well, I think if you're looking at it that way the answer is very clearly "no". QM had to go screw things up with the Copenhagen Interpretation, giving up on deterministic objective reality completely.
The CI tells us that the quantum of action and the requirements for objective communication place limits on what we can say about Nature. This has nothing to do with the assumption of determinism, which is a rational extension of what we do observe wrt the evolution of systems on various scales.

Quote Quote by kote View Post
Since you mentioned it, it looks more and more like the world could be discrete, suggesting a structure with limits on its basic numerical accuracy - very Matrix-like.
The more I learn, the more it seems to me that the world is a fundamentally seamless complex of interacting waveforms in a hierarchy of media. The metaphysical extension of quantization isn't discreteness per se, but rather resonances, harmonics, etc.

But maybe I don't understand what you're getting at here.

Quote Quote by kote View Post
I don't think science can tell us anything about the issue though. Hume covered that pretty well in my opinion. I do think the assumption of determinism is a rational extension of logic that needs to be made for the world to be intelligible for us.
Science is how we most objectively observe the world and least ambiguously communicate those observations. It wouldn't make much sense for us to talk about the world in any way other than how it seems to us to be evolving -- which is deterministically.
ThomasT
#108
Aug21-09, 09:47 PM
P: 1,414
Quote Quote by ueit View Post
1. At source location, the field is a function of the detectors' state. Because the model is local, this information is "old". If the detectors are 1 ly away, then the source "knows" the detectors' state as it was 1 year in the past.
Ok, let's say the setup is A <1ly> E <1ly> B. The emission part of a series of runs begins and ends before the filters/detectors at A and B are even built. After all of the emissions that might possibly be detected in the experiment have been in transit for, say, 10 months, the experimenters at A and B build their ends and put the stuff in place.

If they've set things up correctly, then when the data sets at A and B are properly paired and correlated with the appropriate angular differences, you'll see something closely approximating a [tex]\cos^2\theta[/tex] dependence (Malus's law).

But the filters'/detectors' state couldn't have had anything to do with the emission values because the filters/detectors didn't even exist until all of the emissions were already more than 3/4 of the way to the filters/detectors.

Quote Quote by ueit View Post
2. From this available information and the deterministic evolution law the source "computes" the future state of the detectors when the particles arrive there.
But, in the above scenario, the source couldn't have the necessary information, even nonlocally, because there were no filters/detectors to generate a field until long after all of the emissions originated.

Yet the joint results would approximate a Malus's law dependence between angular difference and rate of coincidental detection.

