A Assumptions of the Bell theorem

  • #91
stevendaryl said:
Sure. My point is that physicists, unlike mathematicians, don’t consider inconsistencies fatal. Probably our best theories are inconsistent, but work well enough in a limited domain.

For a mathematician, his subject is defined by the theory. So if that theory is inconsistent, what are they talking about? But for a physicist, the ultimate subject matter is defined by observations. Theories are just tools. If the tools are flawed (inconsistent), then you just learn to work with their flaws.
 
  • Like
Likes Fra and Demystifier
  • #92
stevendaryl said:
Theories are just tools. If the tools are flawed (inconsistent), then you just learn to work with their flaws.
And I hope, work to improve them?
/Fredrik
 
  • #93
stevendaryl said:
Theories are just tools. If the tools are flawed (inconsistent), then you just learn to work with their flaws.
But tools for what? If they are tools for making predictions, then it's OK. But physicists also use theories for conceptual understanding. So if the theory is inconsistent, then conceptual understanding can also be inconsistent. And if conceptual understanding is inconsistent and the physicist is aware of it, then the tool does not fulfill its purpose.
 
  • Like
Likes Fra
  • #94
stevendaryl said:
I'm not sure I understand the point you're making, but I think I agree that the weirdness of quantum mechanics is not its nonlocality. The only relevance of nonlocality is that it shows that that weirdness can't easily be explained in terms of a nonweird "deeper theory".
I realize that was kind of redundant; I agree with what you write.

I just meant to say that it's the application of Reichenbach's Common Cause Principle that seems out of place. It does not follow that the probability spaces that Alice and Bob can represent via ensembles, alone and together, constitute the common probability space that is presumed in the RCC. Behind this lies the old pathological expectation of how causation "must work", which apparently is not how nature works.

/Fredrik
 
  • #95
lugita15 said:
Anyway, what would you say about counterfactual definiteness?
In the meantime, I have changed my mind on that. I think it is a vague term with at least 3 different meanings: micro realism, macro realism and determinism. Since all 3 are already included in my two lists, counterfactual definiteness should not be added to the list.
 
  • #96
Demystifier said:
Then I choose 1. Quantum theory in its minimal form is incomplete, one should add something, like objective collapse (a'la GRW) or additional variables (a'la Bohm).

But for this thread, option 2. is more interesting.
So the relevance for this thread would then be that the law of total probability should be among the necessary assumptions.
 
  • Like
Likes Demystifier
  • #97
The law of total probability assumes that the summation index / integration variable (the hidden-variable parameter space) constitutes a proper partition of the event space. If the "hidden variables" do not form a pairwise disjoint set, the law of total probability does not hold. As the law of total probability is a key construct in a "sum over paths" approach, an assumption about this partition is equivalent to an assumption about the intermediate interactions, and thus, by conceptual extension, about the Hamiltonian. It is this assumption/ansatz of Bell's theorem that seems to me hard to defend even a priori. The only vague defense is for the case where the hidden variable is not strictly isolated from the environment but merely hidden from the experimenter; then this form makes more sense. That is, nothing prevents a "hidden variable" mechanism that does not constitute a partition and thus does not obey the premise of Bell's theorem. This is because transitions in QM seem to make use of "several paths at once", not just one path at a time, simply weighted.
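As a toy illustration of the partition requirement (my own sketch, with made-up events, not part of any argument in this thread): summing P(A ∩ case) over "cases" reproduces P(A) only when the cases are pairwise disjoint and exhaustive.

```python
# Law of total probability: P(A) = sum_i P(A & B_i) holds when the B_i partition
# the sample space. With overlapping "cases" the sum over cases double-counts.
from fractions import Fraction

omega = {1, 2, 3, 4}                      # toy sample space, uniform measure
P = lambda s: Fraction(len(s), len(omega))

A = {1, 2}                                # event of interest

B = [{1, 2}, {3, 4}]                      # a proper partition
print(sum(P(A & b) for b in B), P(A))     # 1/2 1/2  -> decomposition agrees

C = [{1, 2, 3}, {2, 3, 4}]                # overlapping "cases"
print(sum(P(A & c) for c in C), P(A))     # 3/4 1/2  -> naive sum over-counts
```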

This has been my point in previous posts as well, to distinguish between a really HIDDEN variable, and simply ignorance of the experimenter.

I think that Demystifier's solipsistic HV is one example of such a possibility. As I think of it, the hidden variable is hidden simply because it is subjective to the agent. But it nevertheless rules the action of that agent, in a way that looks "weird" to an outside observer, and this contributes to the total "interaction mechanism" in QM.

/Fredrik
 
  • Like
Likes Demystifier
  • #98
Fifty years of Bell’s theorem | CERN (home.cern)

Fifty years of Bell’s theorem

A paper by John Bell published on 4 November 1964 laid the foundations for the modern field of quantum-information science

4 NOVEMBER, 2014

By Christine Sutton

On 4 November 1964, a journal called Physics received a paper written by John Bell, a theoretician from CERN. The journal was short-lived, but the paper became famous, laying the foundations for the modern field of quantum-information science.

The background to Bell’s paper goes back to the late 1930s and Albert Einstein’s dislike of quantum mechanics, which he argued required “spooky actions at a distance”. In other words, according to quantum theory, a measurement in one location can influence the state of a system in another location. Einstein believed this appeared to occur only because the quantum description was not complete.

The argument as to whether quantum mechanics is indeed a complete description of reality continued for several decades until Bell, who was on leave from CERN at the time, wrote down what has become known as Bell’s theorem. Importantly, he provided an experimentally testable relationship between measurements that would confirm or refute the disliked “action at a distance”.

In the 1970s, experiments began to test Bell’s theorem, confirming quantum theory and at the same time establishing the base for what has now become a major area of science and technology, with important applications, for example, in quantum cryptography.

Bell was born in Belfast, where he attended Queen’s University, eventually coming to CERN in 1960. For much of November the university is hosting a series of events in celebration of his work, including an exhibition, Action at a Distance: The Life and Legacy of John Stewart Bell, with photographs, objects and papers relating to Bell’s work alongside videos exploring his science and legacy. He sadly died suddenly in 1990. Now the Royal Irish Academy is calling for 4 November to be named "John Bell Day" (http://www.ria.ie/john-bell-day.aspx).

For more

The original paper: “On the Einstein Podolsky Rosen Paradox” by J S Bell [PDF]

 
  • #101
RUTA said:
Here is a good paper on Bell: https://arxiv.org/abs/1408.1826
Hmmm. I'll pass on opinions about the historical description itself, but...

"Nowadays, it is sometimes reported as ruling out, or at least calling in question, realism. But these are all mistakes. What Bell’s theorem, together with the experimental results, proves to be impossible (subject to a few caveats we will attend to) is not determinism or hidden variables or realism but locality"

Setting aside the actual historical account, this summary makes no sense to me.

"A physical theory is EPR-‐local iff according to the theory procedures carried out in one region do not immediately disturb the physical state of systems in sufficiently distant regions in any significant way"

According to this definition I would not say we have EPR-nonlocality. All that is disturbed is the local expectation about the remote physical state, nothing else. Any "actual states" are always by definition shielded by expectations in a measurement theory; there is simply no way to directly "access" any raw truth by bypassing the constraints of the inference and measurement process. I.e. you cannot gain knowledge by bypassing the scientific or inference process.

So is the problem realism? ie. that reality does not exist? What do they mean by realism or state of reality?

"If, without in any way disturbing a system, we can predict with certainty (i.e. with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity."

No, I don't see a problem with that either. I.e. there is no problem with imagining a bare actual state of matter; it is, as far as I can see, not incompatible with what we know! The problem, and the fallacy, I see is this:

The implicit assumption that it is the "raw reality" that supplies the correct variables in the causal relations of the laws of nature.

Let's just note that this is NOT the only possibility. The other possibility (which in my view is by far the more obvious and natural one, even though nonstandard) is that the laws of nature rather explain causal relations between expectations only! I.e. the "true state of matter" does NOT need to (indeed should not) appear explicitly in the laws.

For those unfamiliar with this agent-based model logic, economic interactions are an excellent example. At some point, the TRUE value of a product or currency really does not matter for the GAME itself! It's all about expectations, influencing and revising expectations. If the market has high expectations of a value, then that is what it's worth, even if the real state of matter is just fake.

So even if it is fine to think of, and imagine, actual states of matter, that is not the problem. The key insight could well be that the INTERACTIONS are nevertheless best understood in terms of causal relations between expectations, NOT causal relations between elements of reality.

This is why I think both a kind of realism and locality can be saved, but it's the "nature of causation" that we do not understand; or, what is the flaw in the Bell premise => This directly addresses the question of the nature of physical law! I.e. does nature "obey laws", or does nature just behave "law-like", and why?

/Fredrik
 
  • #102
I just say: no reality, nothing to talk about... no existence...
 
  • Like
Likes Demystifier
  • #103

John Stewart Bell​


Quick Info​

Born: 28 July 1928, Belfast, Northern Ireland
Died: 1 October 1990, Geneva, Switzerland

Summary: John Stewart Bell was an Irish mathematician who worked in quantum mechanics.


Biography​

John Bell's great achievement was that during the 1960s he was able to breathe new and exciting life into the foundations of quantum theory, a topic seemingly exhausted by the outcome of the Bohr-Einstein debate thirty years earlier, and ignored by virtually all those who used quantum theory in the intervening period. Bell was able to show that discussion of such concepts as 'realism', 'determinism' and 'locality' could be sharpened into a rigorous mathematical statement, 'Bell's inequality', which is capable of experimental test. Such tests, steadily increasing in power and precision, have been carried out over the last thirty years.

Indeed, almost wholly due to Bell's pioneering efforts, the subject of quantum foundations, experimental as well as theoretical and conceptual, has become a focus of major interest for scientists from many countries, and has taught us much of fundamental importance, not just about quantum theory, but about the nature of the physical universe.

In addition, and this could scarcely have been predicted even as recently as the mid-1990s, several years after Bell's death, many of the concepts studied by Bell and those who developed his work have formed the basis of the new subject area of quantum information theory, which includes such topics as quantum computing and quantum cryptography. Attention to quantum information theory has increased enormously over the last few years, and the subject seems certain to be one of the most important growth areas of science in the twenty-first century.

John Stewart Bell's parents had both lived in the north of Ireland for several generations. His father was also named John, so John Stewart has always been called Stewart within the family. His mother, Annie, encouraged the children to concentrate on their education, which, she felt, was the key to a fulfilling and dignified life. However, of her four children - John had an elder sister, Ruby, and two younger brothers, David and Robert - only John was able to stay on at school much over fourteen. Their family was not well-off, and at this time there was no universal secondary education, and to move from a background such as that of the Bells to university was exceptionally unusual.

Bell himself was interested in books, and particularly interested in science from an early age. He was extremely successful in his first schools, Ulsterville Avenue and Fane Street, and, at the age of eleven, passed with ease his examination to move to secondary education. Unfortunately the cost of attending one of Belfast's prestigious grammar schools was prohibitive, but enough money was found for Bell to move to the Belfast Technical High School, where a full academic curriculum which qualified him for University entrance was coupled with vocational studies.

Bell then spent a year as a technician in the Physics Department at Queen's University Belfast, where the senior members of staff in the Department, Professor Karl Emeleus and Dr Robert Sloane, were exceptionally helpful, lending Bell books and allowing him to attend the first year lectures. Bell was able to enter the Department as a student in 1945. His progress was extremely successful, and he graduated with First-Class Honours in Experimental Physics in 1948. He was able to spend one more year as a student, in that year achieving a second degree, again with First-Class Honours, this time in Mathematical Physics. In Mathematical Physics, his main teacher was Professor Peter Paul Ewald, famous as one of the founders of X-ray crystallography; Ewald was a refugee from Nazi Germany.

Bell was already thinking deeply about quantum theory, not just how to use it, but its conceptual meaning. In an interview with Jeremy Bernstein, given towards the end of his life and quoted in Bernstein's book [1], Bell reported being perplexed by the usual statement of the Heisenberg uncertainty or indeterminacy principle (##\Delta x \,\Delta p \geq \hbar##, where ##\Delta x## and ##\Delta p## are the uncertainties or indeterminacies, depending on one's philosophical position, in position and momentum respectively, and ##\hbar## is the (reduced) Planck constant).
It looked as if you could take this size and then the position is well defined, or that size and then the momentum is well defined. It sounded as if you were just free to make it what you wished. It was only slowly that I realized that it's not a question of what you wish. It's really a question of what apparatus has produced this situation. But for me it was a bit of a fight to get through to that. It was not very clearly set out in the books and courses that were available to me. I remember arguing with one of my professors, a Doctor Sloane, about that. I was getting very heated and accusing him, more or less, of dishonesty. He was getting very heated too and said, 'You're going too far'.
At the conclusion of his undergraduate studies Bell would have liked to work for a PhD. He would also have liked to study the conceptual basis of quantum theory more thoroughly. Economic considerations, though, meant that he had to forget about quantum theory, at least for the moment, and get a job, and in 1949 he joined the UK Atomic Energy Research Establishment at Harwell, though he soon moved to the accelerator design group at Malvern.

It was here that he met his future wife, Mary Ross, who came with degrees in mathematics and physics from Scotland. They married in 1954 and had a long and successful marriage. Mary was to stay in accelerator design through her career; towards the end of John's life he returned to problems in accelerator design and he and Mary wrote some papers jointly. Through his career he gained much from discussions with Mary, and when, in 1987, his papers on quantum theory were collected [21], he included the following words:
I here renew very especially my warm thanks to Mary Bell. When I look through these papers again I see her everywhere.
Accelerator design was, of course, a relatively new field, and Bell's work at Malvern consisted of tracing the paths of charged particles through accelerators. In these days before computers, this required a rigorous understanding of electromagnetism, and the insight and judgment to make the necessary mathematical simplifications required to make the problem tractable on a mechanical calculator, while retaining the essential features of the physics. Bell's work was masterly.

In 1951 Bell was offered a year's leave of absence to work with Rudolf Peierls, Professor of Physics at Birmingham University. During his time in Birmingham, Bell did work of great importance, producing his version of the celebrated CPT theorem of quantum field theory. This theorem showed that under the combined action of three operators on a physical event: P, the parity operator, which performed a reflection; C, the charge conjugation operator, which replaced particles by anti-particles; and T, which performed a time reversal, the result would be another possible physical event.
Unfortunately Gerhard Lüders and Wolfgang Pauli proved the same theorem a little ahead of Bell, and they received all the credit.

However, Bell added another piece of work and gained a PhD in 1956. He also gained the highly valuable support of Peierls, and when he returned from Birmingham he went to Harwell to join a new group set up to work on theoretical elementary particle physics. He remained at Harwell till 1960, but he and Mary gradually became concerned that Harwell was moving away from fundamental work to more applied areas of physics, and they both moved to CERN, the European Organization for Nuclear Research, in Geneva. Here they spent the remainder of their careers.

Bell published around 80 papers in the area of high-energy physics and quantum field theory. Some were fairly closely related to experimental physics programmes at CERN, but most were in general theoretical areas.

The most important work was that of 1969 leading to the Adler-Bell-Jackiw (ABJ) anomaly in quantum field theory. This resulted from joint work of Bell and Roman Jackiw, which was then clarified by Stephen Adler. They showed that the standard current algebra model contained an ambiguity. Quantisation led to a symmetry breaking of the model. This work solved an outstanding problem in particle physics; theory appeared to predict that the neutral pion could not decay into two photons, but experimentally the decay took place, as explained by ABJ. Over the subsequent thirty years, the study of such anomalies became important in many areas of particle physics. Reinhold Bertlmann, who himself did important work with Bell, has written a book titled Anomalies in Quantum Field Theory [10], and the two surviving members of ABJ, Adler and Jackiw, shared the 1988 Dirac Medal of the International Centre for Theoretical Physics in Trieste for their work.

While particle physics and quantum field theory were the work Bell was paid to do, and he made excellent contributions, his great love was quantum theory, and it is for his work here that he will be remembered. As we have seen, he was concerned about the fundamental meaning of the theory from the time he was an undergraduate, and many of his important arguments had their basis at that time.

The conceptual problems may be outlined using the spin-##\frac{1}{2}## system. We may say that when the state-vector is ##\alpha_{+}## or ##\alpha_{-}## respectively, ##s_{z}## is equal to ##\frac{1}{2}\hbar## and ##-\frac{1}{2}\hbar## respectively, but, if one restricts oneself to the Schrödinger equation, ##s_{x}## and ##s_{y}## just do not have values. All one can say is that if a measurement of ##s_{x}##, for example, is performed, the probabilities of the result being either ##\frac{1}{2}\hbar## or ##-\frac{1}{2}\hbar## are both ##\frac{1}{2}##.

If, on the other hand, the initial state-vector has the general form ##c_{+}\alpha_{+} + c_{-}\alpha_{-}##, then all we can say is that in a measurement of ##s_{z}##, the probability of obtaining the value ##\frac{1}{2}\hbar## is ##|c_{+}|^{2}##, and that of obtaining the value ##-\frac{1}{2}\hbar## is ##|c_{-}|^{2}##. Before any measurement, ##s_{z}## just does not have a value.
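As a trivial numerical companion to the statement above (my own illustration, with made-up amplitudes), the probabilities are just the squared moduli of the coefficients:

```python
# Hypothetical normalised amplitudes for the state c_plus*alpha_plus + c_minus*alpha_minus
c_plus, c_minus = 0.6, 0.8                # |c+|^2 + |c-|^2 = 1

print(abs(c_plus) ** 2)                   # 0.36 -> probability of s_z = +hbar/2
print(abs(c_minus) ** 2)                  # 0.64 -> probability of s_z = -hbar/2
```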

These statements contradict two of our basic notions. We are rejecting realism, which tells us that a quantity has a value, or, to put things more grandly, that the physical world has an existence independent of the actions of any observer. Einstein was particularly disturbed by this abandonment of realism; he insisted on the existence of an observer-free realm. We are also rejecting determinism, the belief that, if we have complete knowledge of the state of the system, we can predict exactly how it will behave. In this case, we know the state-vector of the system, but cannot predict the result of measuring ##s_{z}##.

It is clear that we could try to recover realism and determinism if we allowed the view that the Schrödinger equation, and the wave-function or state-vector, might not contain all the information that is available about the system. There might be other quantities giving extra information -- hidden variables. As a simple example, the state-vector above might apply to an ensemble of many systems, but in addition a hidden variable for each system might say what the actual value of ##s_{z}## might be. Realism and determinism would both be restored; ##s_{z}## would have a value at all times, and, with full knowledge of the state of the system, including the value of the hidden variable, we can predict the result of the measurement of ##s_{z}##.

A complete theory of hidden variables must actually be more complicated than this -- we must remember that we wish to predict the results of measuring not just ##s_{z}##, but also ##s_{x}## and ##s_{y}##, and any other component of ##s##. Nevertheless it would appear natural that the possibility of supplementing the Schrödinger equation with hidden variables would have been taken seriously. In fact, though, Niels Bohr and Werner Heisenberg were convinced that one should not aim at realism. They were therefore pleased when John von Neumann proved a theorem claiming to show rigorously that it is impossible to add hidden variables to the structure of quantum theory. This was to be very generally accepted for over thirty years.

Bohr put forward his (perhaps rather obscure) framework of complementarity, which attempted to explain why one should not expect to measure ##s_{x}## and ##s_{y}## (or ##x## and ##p##) simultaneously. This was his Copenhagen interpretation of quantum theory. Einstein however rejected this, and aimed to restore realism. Physicists almost unanimously favoured Bohr.

Einstein's strongest argument, though this did not become generally apparent for several decades, lay in the famous Einstein-Podolsky-Rosen (EPR) argument of 1935, constructed by Einstein with the assistance of his two younger co-workers, Boris Podolsky and Nathan Rosen. Here, as is usually done, we discuss a simpler version of the argument, thought up somewhat later by David Bohm.

Two spin-##\frac{1}{2}## particles are considered; they are formed from the decay of a spin-0 particle, and they move outwards from this decay in opposite directions. The combined state-vector may be written as ##\frac{1}{\sqrt{2}}(\alpha_{1+}\alpha_{2-} - \alpha_{1-}\alpha_{2+})##, where the ##\alpha_{1}##s and ##\alpha_{2}##s for particles 1 and 2 are analogous to the ##\alpha##s above. This state-vector has a strange form. The two particles do not appear in it independently; rather, either state of particle 1 is correlated with a particular state of particle 2. The state-vector is said to be entangled.

Now imagine measuring ##s_{1z}##. If we get ##+\frac{1}{2}\hbar##, we know that an immediate measurement of ##s_{2z}## is bound to yield ##-\frac{1}{2}\hbar##, and vice-versa, although, at least according to Copenhagen, before any measurement, no component of either spin has a particular value.
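A minimal numerical check of this perfect anti-correlation (my own sketch, not part of the biography), using the singlet state written above in the product basis:

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# |psi> = (|+ ->  -  |- +>)/sqrt(2) in the product basis of particles 1 and 2
psi = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

# Projectors onto "both up" and onto "particle 1 up, particle 2 down"
P_up_up = np.kron(np.outer(up, up), np.outer(up, up))
P_up_down = np.kron(np.outer(up, up), np.outer(down, down))

print(psi @ P_up_up @ psi)    # 0.0 -> equal results never occur
print(psi @ P_up_down @ psi)  # 0.5 -> opposite results, probability 1/2 each way
```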

The result of this argument is that at least one of three statements must be true:
(1) The particles must be exchanging information instantaneously i.e. faster than light;
(2) There are hidden variables, so the results of the experiments are pre-ordained; or
(3) Quantum theory is not exactly true in these rather special experiments.
The first possibility may be described as the renunciation of the principle of locality, whereby signals cannot be passed from one particle to another faster than the speed of light. This suggestion was anathema to Einstein. He therefore concluded that if quantum theory was correct, so that possibility (3) was ruled out, then (2) must be true. In Einstein's terms, quantum theory was not complete but needed to be supplemented by hidden variables.

Bell regarded himself as a follower of Einstein. He told Bernstein [1]:
I felt that Einstein's intellectual superiority over Bohr, in this instance, was enormous; a vast gulf between the man who saw clearly what was needed, and the obscurantist.
Bell thus supported realism in the form of hidden variables. He was delighted by the creation in 1952 by David Bohm of a version of quantum theory which included hidden variables, seemingly in defiance of von Neumann's result. Bell wrote [21]:
In 1952 I saw the impossible done.
In 1964, Bell made his own great contributions to quantum theory. First he constructed his own hidden variable account of a measurement of any component of spin. This had the advantage of being much simpler than Bohm's work, and thus much more difficult just to ignore. He then went much further than Bohm by demonstrating quite clearly exactly what was wrong with von Neumann's argument.

Von Neumann had illegitimately extended to his putative hidden variables a result from the variables of quantum theory: that the expectation value of ##A + B## is equal to the sum of the expectation values of ##A## and of ##B##. (The expectation value of a variable is the mean of the possible experimental results weighted by their probability of occurrence.) Once this mistake was realized, it was clear that hidden variables theories of quantum theory were possible.
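To see concretely why additivity cannot be imposed on individual (dispersion-free) values, here is a small numerical check along the lines of Bell's well-known spin-1/2 counterexample (my own sketch):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]])           # Pauli sigma_x, eigenvalues +-1
sz = np.array([[1, 0], [0, -1]])          # Pauli sigma_z, eigenvalues +-1

print(np.linalg.eigvalsh(sx))             # [-1.  1.]
print(np.linalg.eigvalsh(sz))             # [-1.  1.]
print(np.linalg.eigvalsh(sx + sz))        # [-1.414...  1.414...]

# The possible values of sigma_x + sigma_z are +-sqrt(2), which can never equal
# a sum of individual +-1 values, so dispersion-free value assignments cannot be
# additive even though the quantum expectation values are.
```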

However Bell then demonstrated certain unwelcome properties that hidden variable theories must have. Most importantly, they must be non-local. He demonstrated this by extending the EPR argument, allowing measurements in each wing of the apparatus of any component of spin, not just ##s_{z}##. He found that, even when hidden variables are allowed, in some cases the result obtained in one wing must depend on which component of spin is measured in the other; this violates locality. The solution to the EPR problem that Einstein would have liked, rejecting (1) but retaining (2), was illegitimate. Even if one retained (2), as long as one ruled out (3) one had also to retain (1).

Bell had shown rigorously that one could not have local realistic versions of quantum theory. Henry Stapp called this result [18]:
the most profound discovery of science.
The other property of hidden variables that Bell demonstrated was that they must be contextual. Except in the simplest cases, the result you obtained when measuring a variable must depend on which other quantities are measured simultaneously. Thus hidden variables cannot be thought of as saying what value a quantity 'has', only what value we will get if we measure it.

Let us return to the locality issue. So far it has been assumed that quantum theory is exactly true, but of course this can never be known. John Clauser, Richard Holt, Michael Horne and Abner Shimony adapted Bell's work to give a direct experimental test of local realism. Thus was born the famous CHSH-Bell inequality [19], often just called the Bell inequality. In EPR-type experiments, this inequality is obeyed by local hidden variables, but may be violated by other theories, including quantum theory.
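For reference, a quick numerical sketch (my own, using one standard choice of angles) of the quantum prediction that violates the CHSH bound of 2 for local hidden variables:

```python
import numpy as np

def E(a, b):
    # Singlet-state correlation of spin measurements along directions a and b
    return -np.cos(a - b)

a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S), 2 * np.sqrt(2))   # ~2.828 in both cases, exceeding the local bound of 2
```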

Bell's work thus reached what has been called experimental philosophy: results of considerable philosophical importance may be obtained from experiment. The Bell inequalities have been tested over nearly thirty years with increasing sophistication, the experimental tests actually using photons with entangled polarisations, which are mathematically equivalent to the entangled spins discussed above. While many scientists have been involved, a selection of the most important would include Clauser, Alain Aspect and Anton Zeilinger.

While at least one loophole still remains to be closed [in August 2002], it seems virtually certain that local realism is violated, and that quantum theory can predict the results of all the experiments.

For the rest of his life, Bell continued to criticize the usual theories of measurement in quantum theory. Gradually it became at least a little more acceptable to question Bohr and von Neumann, and study of the meaning of quantum theory has become a respectable activity.

Bell himself became a Fellow of the Royal Society as early as 1972, but it was much later before he obtained the awards he deserved. In the last few years of his life he was awarded the Hughes Medal of the Royal Society, the Dirac Medal of the Institute of Physics, and the Heineman Prize of the American Physical Society. Within a fortnight in July 1988 he received honorary degrees from both Queen's and Trinity College Dublin. He was nominated for a Nobel Prize; if he had lived ten years longer he would certainly have received it.

This was not to be. John Bell died suddenly from a stroke on 1st October 1990. Since that date, the amount of interest in his work, and in its application to quantum information theory has been steadily increasing.
https://mathshistory.st-andrews.ac.uk/Biographies/Bell_John/

 
  • #104
Fra said:
This is why I think both a kind of realism and locality can be saved, but it's the "nature of causation" that we do not understand; or, what is the flaw in the Bell premise => This directly addresses the question of the nature of physical law! I.e. does nature "obey laws", or does nature just behave "law-like", and why?
Sorry, I don't understand exactly what you're suggesting.

The simple intuitive reasoning for why anti-correlated spin-1/2 EPR seems to show nonlocality is this:

Pick an inertial coordinate system in which Alice and Bob are at rest, and Alice's measurement takes place slightly before Bob's. Then let's talk about three different events:
  1. ##e_1##: Immediately before Alice makes her measurement.
  2. ##e_2##: Immediately after Alice makes her measurement.
  3. ##e_3##: When Bob makes his measurement.
Alice is trying, based on information available to her, to figure out what result Bob will get at event ##e_3##. For simplicity, let's assume that Alice and Bob both agree ahead of time to measure spins along the z-axis.

At event ##e_1##, Alice is completely uncertain about Bob's result at ##e_3##. He might get spin-up. He might get spin-down. Her subjective probability for each possibility is 50%.

At event ##e_2##, Alice is completely certain about Bob's result. If she got spin-up, she knows that Bob will get spin-down, and vice-versa.

So Alice's subjective knowledge about Bob's result changed drastically between event ##e_1## and event ##e_2##. The issue, then, is what caused this change in her knowledge?

Classically, your knowledge about future events changes according to a two-step, back and forth process:
  1. You learn something about past (or present) conditions.
  2. Those facts allow you to recompute probabilities for those future events.
You can only learn something about conditions that are in your causal past (events that influence your present), and that can only give you information about the causal future of those conditions. The example I gave earlier in this thread is that Alice finds Bob's wallet, and she predicts that he will be unable to board a flight due to a lack of proper ID. Alice seeing the wallet in the present tells her something about the past: That Bob lost his wallet. That past tells you about Bob's future: that he will be unable to board the flight.

Without spending too much time on it, I think that it's pretty much true that every case of learning something about future events has this character: You go backwards in time, using deduction, from observation to facts about the past, and then you go forwards in time, again using deduction from past conditions to future events. Einstein's speed of light limitation on influences further restricts things: The causal past of your current observations can only tell you about conditions in your backwards light cone, and the causal future of those conditions can only tell you about events in the future light cone of those conditions.

In contrast, in EPR, Alice's observations now tell her something about Bob's future measurements in a way that is not (apparently) mediated by the back and forth causal influences within a light cone. That's a sense in which Alice's information is nonlocal information about Bob. That doesn't necessarily imply nonlocal influences.
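To make the ##e_1## to ##e_2## update concrete for the agreed z-axis case (which on its own is classically reproducible), here is a minimal Monte Carlo sketch of my own:

```python
import random

N = 100_000
alice, bob = [], []
for _ in range(N):
    shared = random.choice([+1, -1])   # anti-correlated pair: Alice gets +shared,
    alice.append(shared)               # Bob gets -shared
    bob.append(-shared)

p_down = sum(b == -1 for b in bob) / N
p_down_given_alice_up = (
    sum(b == -1 for a, b in zip(alice, bob) if a == +1)
    / sum(a == +1 for a in alice)
)
print(p_down)                  # ~0.5  -> Alice's knowledge of Bob's result at e_1
print(p_down_given_alice_up)   # 1.0   -> Alice's knowledge at e_2, after her result
```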
 
  • #105
stevendaryl said:
In contrast, in EPR, Alice's observations now tell her something about Bob's future measurements in a way that is not (apparently) mediated by the back and forth causal influences within a light cone. That's a sense in which Alice's information is nonlocal information about Bob. That doesn't necessarily imply nonlocal influences.
I follow and I agree with what you write. But if you label that non-local, it is a non-issue I think.

Your definition does not seem to be the same as the one in that paper?
"A physical theory is EPR-local iff according to the theory procedures carried out in one region do not immediately disturb the physical state of systems in sufficiently distant regions in any significant way"

This does not speak about information; it speaks about physical states, which I think the author means to be "elements of reality". It seems that by changing definitions we can just decide whether we want local or nonlocal :)

Edit: The only twist I see on this, which I suspect is nothing like what the author means, is that even expectations are somehow elements of reality, in that they are encoded somewhere. But this is obviously NOT what they mean, as that makes it even stranger (from their perspective).

stevendaryl said:
The issue, then, is what caused this change in her knowledge?
Anyway, this was not the question I considered worth asking. The answer to your question is, to me, simply an information update, combined with the premise implied in the pair creation. That Alice's information says something about Bob's future is not a problem per se at all.

The question I asked is not why we have a correlation, we know we do.

The question I ask is: if we have a pre-correlation, then why does the causal ansatz in Bell's theorem fail?

Or put differently: HOW COME the "physics" and "interactions" as inferred by a physicist at Bob's lab (inferring the laws of physics) are indifferent to whether Alice performed her measurement or not? Does the particle still behave as a superposition, even after Alice popped the bubble? Why? And how can we "make sense of this"? I.e. what is the logic here that can make us all say "aha"?

/Fredrik
 
  • #106
Fra said:
Your definition does not seem to be the same as the one in that paper?
"A physical theory is EPR-local iff according to the theory procedures carried out in one region do not immediately disturb the physical state of systems in sufficiently distant regions in any significant way"
Well, my definition is more agnostic about what's going on. Alice's observations tell her something about a spacelike separated event, and it isn't apparently mediated by causal influences. If you assume that learning information is always mediated by causal influences, then you have to conclude that there are nonlocal influences going on.

Fra said:
This does not speak about information; it speaks about physical states, which I think the author means to be "elements of reality". It seems that by changing definitions we can just decide whether we want local or nonlocal :)

Physical states become involved if you're trying to understand how information can propagate. If you just leave it as a mystery, then you can just shrug your shoulders and say: I don't know what the heck is going on.

Fra said:
Anyway, this was not the question I considered worth asking. The answer to your question is, to me, simply an information update, combined with the premise implied in the pair creation. That Alice's information says something about Bob's future is not a problem per se at all.
No, it's not simply an information update. If Alice is merely finding out information about Bob, then that information existed before she learned it. That's what Bell ruled out (except for the various loopholes such as FTL influences, back-in-time influences, Many-Worlds, superdeterminism, and ... I think that covers it).
 
  • #107
stevendaryl said:
If Alice is merely finding out information about Bob, then that information existed before she learned it. That's what Bell ruled out (except for the various loopholes such as FTL influences, back-in-time influences, Many-Worlds, superdeterminism, and ... I think that covers it).
I think you missed the distinction I tried to make then. I do not agree that Bell ruled this out.

I tried to make the distinction between existence of

1) "hidden information" but that does not have causal influences, because its ISOLATED. Not just isolated from the experimenter (as in ignorance) but truly isolated from ALL environment (weter this makes sense in reality to accomplish over long distances is of course a practical question). This means this information really does NOT exist in the environemnt!

2) "ignorance" where the information exists, and in principle COULD be know to the experimenter, and the experimental apparatous, but isn't because the theorist making predictions didnt know about it. This is the case bell rules out. In this case the information exists in the environment, and possible known by the expeimental device, basically known but all exceopt Bob!

My point is that (1) remains a possibility, but this type of hidden information is NOT regular ignorance. It's a fundamental form of hidden information, just as one may consider how information is hidden in a black hole, or how the truth is "screened" from all inference.

But a real theory that realizes (1) is not yet known. The point is just that it remains not only possible, but at least in my mind extremely reasonable, almost the only reasonable possibility. To understand this, I think one needs to describe things not by means of a differential equation, but as an agent-based model. This has been iterating in my head for years, and trying to reconstruct things in a new framework is not easily done. Agent-based models are quite alien to physicists, but more common in economic interaction models etc. As we know from history, some ways of describing things, such as in terms of geometry, have certain intuitive advantages that help understanding and insight. I see this just like that. Agent-based models merge nicely with various QBist kinds of interpretations, especially if you add the concept of evolution of law that Smolin talked about.

So the wet dream, informally, is that, say, our standard model is a Nash equilibrium, and the optimal strategies simply describe the equilibrium strategies of the elementary particles.

/Fredrik
 
  • #108
Fra said:
1) "hidden information" but that does not have causal influences, because its ISOLATED. Not just isolated from the experimenter (as in ignorance) but truly isolated from ALL environment (weter this makes sense in reality to accomplish over long distances is of course a practical question). This means this information really does NOT exist in the environemnt!
I am not sure what that would mean. But it sounds nonlocal to me.

But there is another issue, which is that Bell’s proof shows that you can’t really separate the information about what result Bob will get from information about which measurement Bob will perform.

That is, there can’t be a predetermined “case” statement saying if Bob performs this measurement, he will get this result, and if he performs this other measurement he will get this other result, etc. You can only meaningfully talk about his results (or the probability for different results) after you know the measurement that Bob will perform.

So I don’t see how you can have hidden information, nonlocal or not, about Bob’s result without knowledge of the measurement he will perform. Which in principle means some kind of superdeterminism.
 
  • #109
stevendaryl said:
Alice is trying, based on information available to her, to figure out what result Bob will get at event ##e_3##. [...] In contrast, in EPR, Alice's observations now tell her something about Bob's future measurements in a way that is not (apparently) mediated by the back and forth causal influences within a light cone. That's a sense in which Alice's information is nonlocal information about Bob. That doesn't necessarily imply nonlocal influences.
I think that there is an important aspect that needs to be explicitly mentioned. Alice cannot make any predictions about event ##e_3## based only on the local measurements she makes. She also needs to know what the state of the two-particle system is. That seems like nonlocal information to me. So Alice's input is the result of her measurement plus the state of the system, and the output of her prediction is the result of Bob's measurement.
 
  • #110
martinbn said:
I think that there is an important aspect that needs to be explicitly mentioned. Alice cannot make any predictions about event ##e_3## based only on the local measurements she makes. She also needs to know what the state of the two particle system is. That seems like nonlocal information to me. So Alice's input is the result of her measurement plus the state of the system, then the output of her prediction is the result of Bob's measurement.
But the state of the system is presumably deducible from the fact that it was prepared in some particular way in Alice's past. There was, for example, a particle that decayed into an anti-correlated electron/positron pair.
 
  • #111
stevendaryl said:
I am not sure what that would mean. But it sounds nonlocal to me.
No, that is not the idea. It should be local.
stevendaryl said:
But there is another issue, which is that Bell’s proof shows that you can’t really separate the information about what result Bob will get from information about which measurement Bob will perform.

That is, there can’t be a predetermined “case” statement saying if Bob performs this measurement, he will get this result, and if he performs this other measurement he will get this other result, etc. You can only meaningfully talk about his results (or the probability for different results) after you know the measurement that Bob will perform.

So I don’t see how you can have hidden information, nonlocal or not, about Bob’s result without knowledge of the measurement he will perform. Which in principle means some kind of superdeterminism.
If we for simplicity introduce the "agent" notion also for the anti-correlated pair, it makes it easier to explain:

Consider that the pair production produces two agents (particles) with correlated states (that's the premise). One heads towards Alice and one towards Bob. The idea is that the "hidden information" of each agent IS the agent's subjective expectation of its own environment. This is physically encoded in the microstate of the agent. The only way for any part of the environment (i.e. any other agent) to infer this hidden info is to interact with it (i.e. make a measurement). (*)

This hidden info cannot predict outcomes of combinations with unknown future interactions that involve delayed choices by other agents in the environment. It just explains the agent's ACTION in response to a perturbation, i.e. it explains ONE side of the interaction. The outcome of the interaction also depends on the ACTION of the other interacting part, i.e. Bob's measurement device, whose ACTION in turn depends on its expectation of its environment and of the incoming agent; and this expectation follows from the device settings and information about the state preparation prior to the pair production event.

This is a way to see how the physical interactions observed at Bob's lab are really independent of whether Alice made a measurement or not, but in a way that also explains the correlation without non-locality, and still allows for a "hidden realism" that isn't ruled out by Bell's theorem.

The key in this view is the logic of causation. The hidden info does not constitute a partition of the event space, so the law of total probability does not hold when summed over the "hidden variable indexes", and without this I think you can't prove Bell's theorem. The possible hidden states always overlap, as they can randomly make transitions. To explain why would force me into more speculation about a reconstruction of QM than what's warranted here, but in short, that's the idea.

(*) ANY agent (= interacting observer) holds hidden information. This "information" can be thought of as indexing the observer, in the sense that, as I think Zurek said, "What the observer knows is inseparable from what the observer is". One can also imagine constructing information-based metrics from this, where the "distance" between two agents does not refer to an embedding space but is related to their differing information.

It was in this sense that I see this as an information update, but perhaps better put as a "DELAYED information update". I think all information updates are in principle delayed, but if we envision an experiment where Alice and Bob are very far apart, and where it is possible to keep the pairs isolated from the environment for such a long time, the update becomes extremely delayed, which leads to the weirdness.

I suppose that to explain this better one needs to construct and look at a model for this possibility.

/Fredrik
 
  • #112
NEWS RELEASE 24-SEP-2020

A question of reality​

SPRINGER
Research News

Physicist Reinhold Bertlmann of the University of Vienna, Austria has published a review of the work of his late long-term collaborator John Stewart Bell of CERN, Geneva in EPJ H. This review, 'Real or Not Real: that is the question', explores Bell's inequalities and his concepts of reality and explains their relevance to quantum information and its applications.

John Stewart Bell's eponymous theorem and inequalities set out, mathematically, the contrast between quantum mechanical theories and local realism. They are used in quantum information, which has evolving applications in security, cryptography and quantum computing.

The distinguished quantum physicist John Stewart Bell (1928-1990) is best known for the eponymous theorem that proved current understanding of quantum mechanics to be incompatible with local hidden variable theories. Thirty years after his death, his long-standing collaborator Reinhold Bertlmann of the University of Vienna, Austria, has reviewed his thinking in a paper for EPJ H, 'Real or Not Real: That is the question'. In this historical and personal account, Bertlmann aims to introduce his readers to Bell's concepts of reality and contrast them with some of his own ideas of virtuality.

Bell spent most of his working life at CERN in Geneva, Switzerland, and Bertlmann first met him when he took up a short-term fellowship there in 1978. Bell had first presented his theorem in a seminal paper published in 1964, but this was largely neglected until the 1980s and the introduction of quantum information.

Bertlmann discusses the concept of Bell inequalities, which arise through thought experiments in which a pair of spin-½ particles propagate in opposite directions and are measured by independent observers, Alice and Bob. The Bell inequality distinguishes between local realism - the 'common sense' view in which Alice's observations do not depend on Bob's, and vice versa - and quantum mechanics, or, specifically, quantum entanglement. Two quantum particles, such as those in the Alice-Bob situation, are entangled when the state measured by one observer instantaneously influences that of the other. This theory is the basis of quantum information.

And quantum information is no longer just an abstruse theory. It is finding applications in fields as diverse as security protocols, cryptography and quantum computing. "Bell's scientific legacy can be seen in these, as well as in his contributions to quantum field theory," concludes Bertlmann. "And he will also be remembered for his critical thought, honesty, modesty and support for the underprivileged."


Reference:

R. Bertlmann (2020), Real or Not Real: that is the question, European Physical Journal H, DOI 10.1140/epjh/e2019-90071-6

https://www.eurekalert.org/pub_releases/2020-09/s-aqo092420.php
 
  • Informative
  • Sad
Likes physika and atyy
  • #113
stevendaryl said:
But the state of the system is presumably deductible from the fact that it was prepared in some particular way in Alice's past. There was, for example, a particle that decayed into an anti-correlated electron/positron pair.
It doesn't matter. The point was that it is not local information; Alice has to have it to make the prediction.
 
  • #114
martinbn said:
It doesn't matter. The point was that it is not local information; Alice has to have it to make the prediction.
I presume the idea is that the whole Alice+Bob experiment and preparation is a joint project, in the sense that Alice and Bob construct the lab and the preparation procedure together, and then go to their remote detectors. The expectations of Alice and Bob are then based on information they "locally acquired" in the past and keep "locally stored". And their expectations have no reason to change unless the original preparation side is secretly invaded by someone to sabotage the experiment: that would not change Alice's expectation, but it would show up as a disagreement between observation and expectation.

/Fredrik
 
  • #115
Fra said:
The expectations of Alice and Bob are then based on information they "locally acquired" in the past and keep "locally stored".

@Demystifier may have hoped to identify the one crucial assumption (habit of thought) that prevents us from really understanding quantum theory. But they all seem to be interrelated, and the ideas of "measurement", "observer" or "agent", and "information" are terribly tangled. Can information be strictly local?
 
  • #116
WernerQH said:
@Demystifier may have hoped to identify the one crucial assumption (habit of thought) that prevents us from really understanding quantum theory. But they all seem to be interrelated, and the ideas "measurement", "observer", or "agent" and "information" terribly tangled. Can information be strictly local?
It's true that all the terms and unclear definitions complicate the discussion, and one can question most things at different levels.

Observers and agents are sort of the same thing, except that when I say agent I usually mean it as a thinking tool, or as indicating an ambition to realize a fully interacting observer, or an inside observer (in a QBist-like interpretation), as opposed to the external, more passive observer that is part of the classical context in the more conventional CI interpretations.
WernerQH said:
Can information be strictly local?
I think one can make this very complicated. Trying to avoid that: by local information I just mean local with respect to the agents, i.e. localized to an agent (the internal structure of the agent). For example, if information is "localized to Bob", I mean the information is physically encoded and retained in Bob's microstructure.

One may ponder what it means if an agent is extended in space: does this mean information is non-locally stored? An ambitious analysis probably needs to question and analyze the construction and emergence of spacetime as well, and the relation between information divergence (a measure on the space of probability distributions) and metrics in spacetime. Perhaps one can argue that whenever there IS information divergence, space is defined. One also can't help thinking that storing information at a "point" will likely lead to problems, just as we have the problem of "point particles" with finite energy or mass. So thinking further about this requires rethinking the continuum of both information and its relations (spacetime). But for the OT I think this is not required at this point. I think one can discuss EPR and pretend spacetime and information are clear?

/Fredrik
 
  • #117
Fra said:
So thinking further about this requires rethinking the continuum of both information and its relations (spacetime). But for the OT I think this is not required at this point. I think one can discuss EPR and pretend spacetime and information are clear?
My point of view is the exact opposite. I think that Bell's theorem shows that physics is fundamentally nonlocal. Spacetime seems reasonably clear to me :-), but information is too unspecific a term. None of the pixels on the computer screen carries information by itself - it is the pattern that's important.
 
  • Like
Likes physika
  • #118
WernerQH said:
My point of view is the exact opposite. I think that Bell's theorem shows that physics is fundamentally nonlocal. Spacetime seems reasonably clear to me :-), but information is too unspecific a term. None of the pixels on the computer screen carries information by itself - it is the pattern that's important.
One of my personal insights is that no matter what concepts you consider the best "starting points" for a reconstruction, there is no solid external reference. Everything tends to reference other concepts, as you note. But this is how a relational theory works. It's IMO neither a "problem" nor a "flaw". I try to embrace it, understand how things ought to work in such a mode, and get a handle on it. This is exactly why I think an evolutionary model remains the only rationally viable framework. At least I have not been able to come up with anything more clever. And it's why the agent-based model abstraction seems "natural", and why QBism seems the best way forward from some minimal interpretation if you leave the standard theory behind and seek a better theory.

And the relational nature of this suggests to me that parts of nature hold information about other parts of nature, defined by asymmetric mutual expectations, which creates tensions and explains interactions. Evolutionary learning models are more robust with respect to initial conditions than the conventional paradigms, which typically end up in a fine-tuning trap. So the information agent A has about B is to me synonymous with "expected predictive power" in an evolutionary context, which is related to subjective probabilities. And this is exactly what evolves as the agents form expectations about each other. Emphasis on expected, because the prediction can prove wrong and still be correctly constructed. The action is assumed to follow from the correctly constructed expectation, not from the hidden truth.

/Fredrik
 
  • #119
Demystifier said:
Necessary assumptions:
- macroscopic realism (macroscopic measurement outcomes are objective, i.e. not merely a subjective experience of an agent)
So does macroscopic realism cover any many-worlds or multi-timeline trickery? I recall a demonstration of "saving locality" by having each of the two separated Bell measurements create a bubble where both possible results (like spin up and spin down) exist simultaneously. Then, when the two bubbles meet, the appropriate pairs of results connect into two seemingly non-local (parallel) universes, or one pair of results survives to become permanent and the other disappears. Obviously, anywhere the measurement-result information spreads would become part of these "bubbles", including agents, so in that sense I'm guessing these parallel experimenter-experiences would be considered subjective?
 
  • #120
eloheim said:
So does macroscopic realism cover any many-worlds or multi-timeline trickery?
That's a good question. The answer is: yes. But the Bell theorem assumes that there is only one outcome, so that should be added to the list. That being said, I have to stress that I don't think that many worlds is a way to save locality; see my https://arxiv.org/abs/1703.08341.
 
