# How do entanglement experiments benefit from QFT (over QM)?


#### A. Neumaier

Nice reference; I didn't know it before. After ridiculing the textbook form of Born's rule, Peres says, among other things:
Asher Peres said:
If you visit a real laboratory, you will never find there Hermitian operators. All you can see are emitters (lasers, ion guns, synchrotrons and the like) and detectors. The experimenter controls the emission process and observes detection events. The theorist’s problem is to predict the probability of response of this or that detector, for a given emission procedure. Quantum mechanics tells us that whatever comes from the emitter is represented by a state ρ (a positive operator, usually normalized to 1). Detectors are represented by positive operators $E_µ$, where µ is an arbitrary label whose sole role is to identify the detector. The probability that detector µ be excited is tr(ρ$E_µ$). A complete set of $E_µ$, including the possibility of no detection, sums up to the unit matrix and is called a positive operator valued measure (POVM) [6].

The various $E_µ$ do not in general commute, and therefore a detection event does not correspond to what is commonly called the “measurement of an observable.” Still, the activation of a particular detector is a macroscopic, objective phenomenon. There is no uncertainty as to which detector actually clicked. [...]
Traditional concepts such as “measuring Hermitian operators,” that were borrowed or adapted from classical physics, are not appropriate in the quantum world. In the latter, as explained above, we have
emitters and detectors, and calculations are performed by means of POVMs.
Just as in the thermal interpretation. In the introduction, he echoes another point of the thermal interpretation:
Asher Peres said:
The situation is much simpler: the pair of photons is a single, nonlocal, indivisible entity . . . It is only because we force upon the photon pair the description of two separate particles that we get the paradox [...]
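Peres's emitter–detector picture can be made concrete with a small numerical sketch (my own illustration, not from his paper): a polarization state ρ, a lossy horizontal-polarization detector, and a "no detection" element completing the POVM. The efficiency η and the 30° polarization angle are arbitrary choices for the example.

```python
import numpy as np

# Emitter: photon linearly polarized at 30 degrees (an arbitrary example state).
theta = np.deg2rad(30)
psi = np.array([np.cos(theta), np.sin(theta)])
rho = np.outer(psi, psi.conj())          # positive operator, trace 1

# Detector: an imperfect horizontal polarizer with efficiency eta, plus a
# "no detection" element so the POVM sums to the identity, as Peres requires.
eta = 0.8
E_click = eta * np.array([[1.0, 0.0], [0.0, 0.0]])   # clicks on H with prob eta
E_none = np.eye(2) - E_click                          # everything else

assert np.allclose(E_click + E_none, np.eye(2))       # POVM completeness

# Response probabilities are tr(rho E_mu), exactly as in the quoted passage.
p_click = np.trace(rho @ E_click).real
p_none = np.trace(rho @ E_none).real
print(p_click, p_none)   # the two probabilities sum to 1
```

Note that nothing here is the "measurement of a Hermitian operator": `E_click` is not a projector, and a second detector at a tilted angle would not commute with it.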


#### RUTA

Well, you have to analyze an experiment with a theory (or model) that is valid for that experiment. There's no way to analyze an experiment involving photons with non-relativistic QM alone, since photons cannot be described non-relativistically at all. What you can often describe non-relativistically is the matter involved in the experiment, since large parts of atomic, molecular, and solid state physics can be described by non-relativistic quantum mechanics or even classical mechanics.

Another point concerns fundamental issues with Einstein causality, which cannot be analyzed using non-relativistic theory at all, since the question of whether causal effects propagate faster than light is irrelevant for non-relativistic physics to begin with. Since in Newtonian physics actions at a distance are the usual way to describe interactions, you cannot expect the causality structure of relativistic spacetime to be respected. So finding violations of Einstein causality in non-relativistic approximations is no surprise; it is put in to begin with.

Of course, entanglement itself is independent of whether you use relativistic or non-relativistic QT to describe it.
To answer the OP, here is what the experimentalists say in their paper, "Entangled photons, nonlocality and Bell inequalities in the undergraduate laboratory":

Consider a quantum mechanical system consisting of two photons called, for historical reasons, the “signal” and “idler” photons.
So, to address the OP, these experimentalists consider their Bell basis state to be "quantum mechanics," not "quantum field theory" even though they created it with photons. The only place they say anything at all about "relativistic" is here:

This gives no information about the choice of alpha. It is also the probability we would find if the signal photon had not been measured. Thus quantum mechanics (in the Copenhagen interpretation) is consistent with relativistic causality. It achieves that consistency by balancing two improbable claims: the particles influence each other nonlocally, and the randomness of nature prevents us from sending messages that way. A comment by Einstein succinctly captures the oddness of this situation. In a 1947 letter to Max Born he objected that quantum mechanics entails “spooky actions at a distance.”
So, whether or not you consider the Bell basis states to be "relativistic" when created with photons, these states and standard Hilbert space formalism account for the violation of the CHSH inequality when using photons. Again, there is nothing in the formalism without additional metaphysical interpretation that resolves the mystery of the correlations.
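As an illustration of this point (my own sketch, not taken from the cited paper): the CHSH violation for a Bell basis state can be computed entirely within the ordinary two-qubit Hilbert-space formalism that the experimentalists call "quantum mechanics"; no field-theoretic machinery enters. The analyzer angles below are the standard CHSH-maximizing choices.

```python
import numpy as np

def corr(a, b, state):
    """Expectation of the joint +/-1 polarization outcome at analyzer angles a, b."""
    def obs(t):  # +1/-1 observable for linear polarization along angle t
        return np.array([[np.cos(2*t),  np.sin(2*t)],
                         [np.sin(2*t), -np.cos(2*t)]])
    O = np.kron(obs(a), obs(b))
    return float(state.conj() @ O @ state)

# Polarization-singlet Bell state (|HV> - |VH>)/sqrt(2)
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)

# Standard CHSH-maximizing settings: a=0, a'=45 deg, b=22.5 deg, b'=67.5 deg
a, ap, b, bp = 0.0, np.pi/4, np.pi/8, 3*np.pi/8
S = abs(corr(a, b, psi) - corr(a, bp, psi) + corr(ap, b, psi) + corr(ap, bp, psi))
print(S)   # ~2*sqrt(2) = 2.828..., exceeding the classical CHSH bound of 2
```

The whole calculation lives in a 4-dimensional Hilbert space; whether one chooses to call those polarization degrees of freedom "QM" or "QFT" does not change the predicted correlations.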

#### atyy

Also, you don't read my statements carefully enough. I never claimed that outcomes are independent of the settings of measurement devices. The contrary is true! Everything depends on the specific preparation of the initial state and the specific setup of the measurement devices used to probe it. Measurements are due to local interactions of (parts of) the system with measurement devices. The strong correlations found are due to the initial-state preparation in an entangled state, not due to the local measurement on one part of the system and some mysterious spooky actions at a distance on far-distant other parts of the system.
What do you mean by local interactions with measurement devices? Take a free relativistic quantum field theory. It predicts violation of the Bell inequalities. But interactions are not in the theory.

#### PeterDonis

Mentor
Take a free relativistic quantum field theory. It predicts violation of the Bell inequalities.
Does it? In a free theory different free particles cannot be entangled, since that would require them to have interacted in the past (or to be produced from some interaction).

#### A. Neumaier

What do you mean by local interactions with measurement devices? Take a free relativistic quantum field theory. It predicts violation of the Bell inequalities. But interactions are not in the theory.
A free field theory by itself predicts nothing (apart from no scattering) since there are no interactions that would allow anything to be measured.

#### A. Neumaier

In a free theory different free particles cannot be entangled, since that would require them to have interacted in the past (or to be produced from some interaction).
No. In a free theory entangled states are possible: they might have existed forever.

#### atyy

A free field theory by itself predicts nothing (apart from no scattering) since there are no interactions that would allow anything to be measured.
A free theory has observables.

#### A. Neumaier

A free theory has observables.
But it is a closed system involving all of spacetime, hence allows no measurement.

#### atyy

But it is a closed system involving all of spacetime, hence allows no measurement.
An interacting theory would not change that.

#### A. Neumaier

An interacting theory would not change that.
But it would represent the measurement process (which is what vanhees71 means) and, at least in the thermal interpretation, tell what happens.

#### vanhees71

Gold Member
To answer the OP, here is what the experimentalists say in their paper, "Entangled photons, nonlocality and Bell inequalities in the undergraduate laboratory":

So, to address the OP, these experimentalists consider their Bell basis state to be "quantum mechanics," not "quantum field theory" even though they created it with photons. The only place they say anything at all about "relativistic" is here:

So, whether or not you consider the Bell basis states to be "relativistic" when created with photons, these states and standard Hilbert space formalism account for the violation of the CHSH inequality when using photons. Again, there is nothing in the formalism without additional metaphysical interpretation that resolves the mystery of the correlations.
I don't know where you get the information that the authors of this nice paper do not mean the right thing in writing down Eq. (1). Ok, you might criticize that they don't take the full Bose structure into account, but that's fine here, since they are interested only in the polarization state and label the photons by a distinguishable property as s and i (the physical property distinguishing the photons is the momentum part of the states).

The other quote is, however, indeed unfortunate, because it makes the wrong claim that "the particles influence each other nonlocally". This is simply not true in standard QFT, where the microcausality constraint is valid. The "nonlocality" is about correlations, not about causal effects, and this, together with the correct other half of the sentence, "the randomness of nature prevents us from sending messages that way", makes the entire theory consistent; consequently there are no "spooky actions at a distance" at all. Einstein was right in criticizing the claim of such spooky actions at a distance, because this claim had been made by the Copenhageners at the time. It's a shame that, after all the decades in which we have known better through Bell's analysis together with all the beautiful experiments done in connection with it, such claims are still made, contradicting the very construction of the theory itself. As often, the math is much smarter than the gibberish sometimes grouped around it in papers and textbooks :-(.

#### vanhees71

Gold Member
What do you mean by local interactions with measurement devices? Take a free relativistic quantum field theory. It predicts violation of the Bell inequalities. But interactions are not in the theory.
Thanks for this argument! This makes it very clear that the correlations described by entanglement have nothing to do with spooky actions at a distance. Entanglement is there in the free theory, where no interactions are considered.

Of course, the free theory is empty physics-wise. You can invent as many non-interacting fields as you like; you cannot measure any consequences of their existence, because they don't interact with your measurement devices. So to discuss this issue of "spooky actions at a distance" you cannot simply ignore interactions; you must consider the interaction of the measured object with the measurement device. These interactions are of course also governed by the Standard Model, and thus are strictly local. If a photon hits an atom in a photodetector, the interaction is with this atom, and the "signal" caused by it travels at most at the speed of light and does not lead to any spooky actions at a distance, due to the very construction of the theory in terms of a local, microcausal QFT.

#### RUTA

I don't know where you get the information that the authors of this nice paper do not mean the right thing in writing down Eq. (1). Ok, you might criticize that they don't take the full Bose structure into account, but that's fine here, since they are interested only in the polarization state and label the photons by a distinguishable property as s and i (the physical property distinguishing the photons is the momentum part of the states).
The point of the quote addresses the OP -- why don't experimentalists say they're using QFT when analyzing photons? They say they're using QM. This was just another example of the semantics.

This is simply not true in standard QFT, where the microcausality constraint is valid. The "nonlocality" is about correlations not about causal effects and this together with the correct other half of the sentence: "the randomness of nature prevents us from sending messages that way", makes the entire theory consistent and indeed consequentielly there are no "spooky actions at a distance" at all.
You understand that the formalism does not supply a causal mechanism for the correlations. That's a start! Now, to understand what so many people find mysterious about entanglement, simply tell them what does explain the correlations. Keep in mind that the formalism predicts the correlations, but does not explain them, i.e., it provides no causal mechanism, as you admit. So, if you want to be done with all this "gibberish," just explain the correlations without merely invoking the formalism.

#### vanhees71

Gold Member
Of course the formalism does not supply a causal mechanism for the correlations in the sense you seem to imply (but do not explicitly mention, to keep all this a mystery ;-)), because there is no such causal mechanism. The causal mechanism is the preparation procedure. E.g., two photons in the polarization-singlet state are created in a parametric down-conversion event, where through local (sic) interaction of a laser field (coherent state) with a birefringent crystal a photon gets annihilated and two new photons are created, necessarily in accordance with conservation laws (within the limits of the uncertainty relations involved, of course). This leads to a two-photon state in which both the momenta and the polarizations of these photons are necessarily entangled. There's nothing mysterious about this. The formalism thus indeed describes, and in that sense also explains, the correlations. By "explaining" in the sense of the natural sciences you always mean that you can understand something (or maybe not) from the fundamental laws discovered so far. The fundamental laws themselves (in contemporary physics mostly expressed in terms of symmetry principles) are the result of careful empirical research and accurate measurements, the development of adequate mathematical models/theories, their testing and, if necessary, refinement.

It is impossible to explain any physics without invoking "the formalism". This is as if you forbade the use of language in communicating. It's impossible to communicate without the use of an adequate language, and the major breakthrough in humanity's attitude towards science in the modern sense was to realize, as Galileo famously put it, that the language of nature is mathematics (particularly geometry), and this is more valid in modern physics than ever.

#### Jimster41

Gold Member
How does that explain the mechanism controlling the evolution between that preparation and the actual later (and/or space-like separated) event when they get measured and “display” said correlation? You seem to imply that the later (and/or space-like separated) event is fully defined earlier, and only therefore “locally” (in an understandable preparation process). I get that goal, and in some sense I agree. No hidden variables. But then what transpires to cause delay and support separation, but also limit it? What constrains it (in addition to the human experimenter)? How does that work? I am totally unsatisfied with the answer “nothing real, nothing knowable, nothing worth trying to imagine”. That hypothetical flow is to me what space-time, and whatever conservation laws (of geometry and duration?) govern it, do microscopically. I am okay with the statement that we will never be able to “observe” the beables of that, but then we are already way down into that problem... doesn't mean we can't deduce stuff.

For example: say no experimenter, you or anyone else, decides to end that particular entanglement experiment. How far does it go? What dictates that it end?
What qualifications does some space-time location have to have to manifest the correlations you prepared? Do all entanglements therefore persist indefinitely? That seems unlikely, since we are standing firmly on classical stuff... and would not be here otherwise.

#### vanhees71

Gold Member
Well, I think the first step to understand these things is to look at what's done in the lab. You have a very concrete setup consisting of a laser and certain types of birefringent crystals which, through nonlinear optics, enable you to create entangled photon pairs. That's the preparation procedure. This is also well understood by effective QED descriptions, i.e., using some constitutive parameters to describe the down-conversion process. It's all based on phenomenological experience and then brought into an efficient formalism to understand "how it works".

Then you have other equipment to measure polarization. In the simplest case you just use some polarizing foil like "Polaroid" in a certain orientation, letting photons in one linear-polarization state through while the ones in the perpendicular polarization are absorbed. These filters are used on both sides where the photons are registered (or not registered). Then you can establish in a series of measurements that the single-photon polarization is completely undetermined. Taking accurate measurement protocols to ensure that you can check the correlations of each of the entangled photon pairs, you find a 100% correlation between polarization measurements in the same direction. The only thing that has to do with a human experimenter is that he decides what he wants to measure, and there's no subjective element in this, if that's what's behind your question.
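Both facts described above can be checked in a short sketch for the polarization-singlet pair (my own illustration, using the standard two-qubit formalism; note that for the singlet the same-angle outcomes are perfectly anticorrelated, i.e., the two photons never both pass equally oriented polarizers):

```python
import numpy as np

# Polarization-singlet two-photon state (|HV> - |VH>)/sqrt(2)
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Single-photon polarization: partial trace over the second photon.
# Result is I/2, i.e., the polarization of either photon alone is
# completely undetermined, with no preferred direction.
rho1 = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho1)

def both_pass_probability(t):
    """Probability that both photons pass polarizers set to the same angle t."""
    c, s = np.cos(t), np.sin(t)
    P = np.array([[c*c, c*s], [c*s, s*s]])     # projector onto polarization t
    return np.trace(rho @ np.kron(P, P)).real

# Perfect same-angle (anti)correlation: the both-pass probability vanishes
# for every analyzer angle, not just for a special one.
print(both_pass_probability(0.0), both_pass_probability(np.pi / 7))
```

The angle π/7 is an arbitrary choice to emphasize that the result is rotation-independent, which is exactly the correlation structure fixed by the preparation.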

The very idea that this is an interesting measurement is a prediction of the theory, but it only becomes defined once you can set up such a concrete experiment to measure it. There's no more you can expect from natural science. What in addition do you expect? Why are such questions never asked about classical mechanics or electrodynamics? You never ask why Newton's postulates describe the Newtonian world accurately (and indeed they do, i.e., within the now-known limits of applicability, Newtonian mechanics is a very good description of the corresponding phenomena observed in nature). But why don't you ask? Is it because the classical-physics description has no irreducible probability element in it? Isn't it as weird an idea to think that everything is strictly deterministic, compared to our daily experience of pretty random events?

#### Jimster41

Gold Member
That all makes sense. And I've heard others make that argument, that we can't grasp irreducible probabilistic-ness. We aren't wired to... I think that's possible. But I'm not sure I'm willing to give up on that instinct just yet. It does smack of fatalism, and I think there are too many things left un-detailed, if not un-explained.

To which I thought you were going to say, "it will propagate until it hits something that it must interact with"... so I've been trying to hone my question.

We can enumerate things that would meet that qualification and imagine how they might, in some sense, get in the way of our experimental, infinitely propagating two-photon. And that we will never know when that happens, can't ever know. Fine, but how did those things come into existence? Or were they already there? Well, they got made in the Big Bang or shortly thereafter... particles etc. Distributed by the inflationary period, condensed from... some...

what, Quantum?
Yes, it broke down, condensed, collapsed, interacted. Space started getting bigger. Whatever, hence the ever rambling Pachinko machine of the universe full of billiard balls drifting or zooming around... entropy-ing. It's all quite... linear.

But then what causes situations of negative entropy?

It's malleable, and has... random non-linear fluctuations.

What allows for malleability of the general process; why didn't it just run down completely right off? What governs that malleability? Whence malleable? Why not malleable? Why any non-linearity?

Accidents of interaction, rarities and anomalies, tunneling, non-perturbative effects. Sometimes in a Pachinko machine the balls bounce up.

Oh, can I make a Pachinko machine where the balls bounce up more? What if I make one Pachinko machine that works just right and another one and compare them - but find they are different... like two relativistic observers with identical physics. What common laws govern the designs of those two different machines?

Space-time rules.

What governs those space-time rules. What are those?

I don't know, Einstein said c.

How does c get worked out by space-time?


#### Lord Jestocost

Gold Member
2018 Award
How does that explain the mechanism controlling the evolution between that preparation and the actual later (and or space-like separated) event when they get measured and “display” said correlation.
There is no mechanism, because the “photon”, for example, is merely an encodement of a set of potentialities or possible outcomes of measurements, viz. a mental conceptual link between particular macroscopic preparation devices and a series of macroscopic measurement events.

#### Jimster41

Gold Member
There is no mechanism, because “phonons”, for example, are merely an encodement of a set of potentialities or possible outcomes of measurements, viz. a mental conceptual link between particular macroscopic preparation devices and a series of macroscopic measurement events.
Aren't they also involved in the heat capacity of materials... as in giving different materials different heat capacity. Heat capacity is pretty important to me... on a daily basis.

Sorry, did you mean phonons or photons?

#### Lord Jestocost

Gold Member
2018 Award
Sorry, I mean of course "photon".

#### Jimster41

Gold Member
You think with photons. We all do. I'm going to go ahead and call myself (and you, for that matter) and all this here as real as it gets. Don't mean to sound snippy. Cognitive dissonance.

We need to lean on the concept of photons and all these conceptual links to reality with all our weight I think. I don't see any way around it. That's why I keep coming here making an idiot out of myself.

And to try to respond to the point being made, which seems to be, kind of, that there is nothing universally interesting about our enumeration of observables: they are things we invented, so we are mesmerized by them. That just seems pretty solipsistic.

Maybe our two-photon hits a two-photon made by some experimenters we don't know. We are now connected by space-time's rules... I get that it's useless to either of us as a random number, but the connection is physical, isn't it? It has physical implications for what happens next.


#### Lord Jestocost

Gold Member
2018 Award
You think with photons.
As a working physicist, just FAPP (for all practical purposes)! From a philosophical point of view, I don't mistake the map for the territory: an instrumentalist's point of view.

#### DrChinese

Gold Member
Isn't it a as pretty weird idea to think that everything is strictly deterministic, compared to our daily experience of pretty random events?
I agree with this with no reservations. Human experience is demonstrably NOT deterministic, and yet there is obviously a strong desire to provide rules and order for everything else.

#### A. Neumaier

Human experience is demonstrably NOT deterministic, and yet there is obviously a strong desire to provide rules and order for everything else.
Casting dice is also demonstrably NOT deterministic, and yet Laplace provided rules and order for it that remained valid until the advent of quantum theory.

#### atyy

But it would represent the measurement process (which is what vanhees71 means) and, at least in the thermal interpretation, tell what happens.
But the thermal interpretation is not (yet?) a standard interpretation. I do agree that what you say would likely be true of an interpretation that solves the measurement problem (e.g., maybe something like Bohmian mechanics or the thermal interpretation, though neither is a standard interpretation at this time).
