What is the mechanism behind Quantum Entanglement?

  • #151
We state it because you claim, against empirical facts, an "average-only validity" of the conservation laws. Angular momentum, too, is conserved event by event and not only on average, and indeed this false claim has nothing to do with any mystery of entanglement. In fact, an entangled state often results from conservation laws, as in the original EPR gedanken experiment (for momentum) as well as in Bohm's version (for angular momentum).
 
  • #152
RUTA said:
Charge and energy are conserved exactly (for each trial) in these experiments. Does that bear at all on the mystery of entanglement per the Bell states? No, so why state it when doing so leads precisely to confusing statements like this one? I think I'll stick to my presentation of the empirical and mathematical facts that define "average-only" projection and "average-only" conservation (for spin angular momentum in this case) and not introduce extraneous facts. Indeed, I'll make my Posts 113 and 129 an Insight so I can just link to that concise explanation and list of the relevant facts in the future.
Do you have experimental evidence of conservation of angular momentum being violated in a single experiment?
 
  • #153
vanhees71 said:
We state it because you claim, against empirical facts, an "average-only validity" of the conservation laws. Angular momentum, too, is conserved event by event and not only on average, and indeed this false claim has nothing to do with any mystery of entanglement. In fact, an entangled state often results from conservation laws, as in the original EPR gedanken experiment (for momentum) as well as in Bohm's version (for angular momentum).
You're claiming the empirical and mathematical facts listed in Post 129 are false? If so, then you are denying standard textbook QM.

Here is the outline:
1. Some people find entanglement to be mysterious, despite the fact that the formalism of QM maps beautifully to the experiments and entanglement is being used to develop new technologies.
2. Entanglement is the key difference between quantum information processing and computing and its classical counterpart.
3. Quantum information theorists have reconstructed QM as a probability theory based on information-theoretic principles. In these reconstructions, they build the entirety of finite-dimensional QM from the indivisible fundamental unit of binary quantum information, i.e., the quantum bit (qubit).
4. The qubit differs from the classical bit for classical probability theory in one respect, i.e., continuous reversibility between pure states.
5. Facts 3 and 4 are summed up by Information Invariance & Continuity.
6. Therefore, the mystery of entanglement per quantum information theory ultimately resides in Information Invariance & Continuity. However, for those who are not practicing quantum information theorists, this is not a very transparent principle, so a physical example helps.
7. The qubit can be physically instantiated in any number of ways.
8. I (and Brukner, Zeilinger, Mueller, Dakic, etc.) find spin-1/2 particles to provide a nice visual example of the qubit (see figures in Post 113).
9. In that example, Information Invariance & Continuity manifests itself as "average-only" projection of spin angular momentum per the empirical and mathematical facts listed in Post 129. If you were using photons and polarizers instead, Information Invariance & Continuity manifests itself as "average-only" transmission through the polarizer (as explained in our published papers).
10. Extrapolating "average-only" projection to the corresponding Bell state, we have "average-only" conservation of spin angular momentum between different inertial reference frames related by spatial rotations in the plane of symmetry where the reference frames are those of the corresponding set of complementary spin measurements. These facts are listed in Post 129.
11. Facts 1-10 above are empirical and mathematical facts independent of interpretation.
Conclusion: The interpretation-independent "mechanism" responsible for entanglement and its "mysterious" (non-classical) behavior can be summed up most generally per information-theoretic reconstructions of QM by Information Invariance & Continuity. This is manifested in physical instantiations of Bell state entangled qubits as "average-only" conservation (as defined) of the relevant entangled property.

Everything you continue to request has already been posted, e.g., exact mathematical statements, example of state preparation, example of corresponding measurement, etc. Sorry if you still don't understand what has been presented, I can't think of any further simplifications. If anyone else sees how to make it simpler, please let me know!
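As one concrete illustration of items 9 and 10 above, here is a minimal numerical sketch (using numpy; the angle ##\theta = \pi/3##, the sample size, and the random seed are arbitrary illustrative choices, and nothing beyond the textbook Born rule for a spin-1/2 particle prepared in ##|z+\rangle## and measured along ##\hat{b}## is assumed): every individual outcome is ##\pm\hbar/2##, never the projection ##(\hbar/2)\cos\theta##, yet the outcomes average to ##(\hbar/2)\cos\theta##. That is what "average-only" projection refers to.

Python:
import numpy as np

hbar = 1.0
theta = np.pi / 3            # illustrative angle between preparation axis z and measurement axis b
p_up = np.cos(theta / 2)**2  # Born rule: P(+hbar/2 along b) for the state |z+>

rng = np.random.default_rng(0)
outcomes = np.where(rng.random(1_000_000) < p_up, +hbar / 2, -hbar / 2)

print(np.unique(outcomes))         # only [-0.5  0.5]: no single trial yields the projection
print(outcomes.mean())             # ~ 0.25
print(0.5 * hbar * np.cos(theta))  # (hbar/2) cos(theta) = 0.25, matched only on average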
 
  • #154
PeroK said:
Do you have experimental evidence of conservation of angular momentum being violated in a single experiment?
See Post 155.
 
  • #155
I have to sign off now and get back to writing the Insight and book on what I've been presenting here. If anyone has any suggestions for how to make this presentation easier to understand, contact me directly via a Physics Forums Conversation.
 
  • #156
RUTA said:
See Post 155.
My understanding of what you say is that:

a) If you measure spin angular momentum about different axes, then the question of conservation is indeterminate - but conservation is not manifestly violated. And, indeed, the incompatibility of spin AM measurements about different axes precludes a comprehensive measurement of AM about all axes in any experiment. In that sense, three-dimensional spin AM in QM is fundamentally indeterminate. There is nothing special about Bell states in that respect.

b) You manifestly have conservation on average about all axes.

If that's correct, then saying you can't prove conservation of AM about the z-axis if you don't measure both particles about the z-axis is a hollow statement.
 
  • #157
RUTA said:
You're claiming the empirical and mathematical facts listed in Post 129 are false? If so, then you are denying standard textbook QM.
I don't know what you want to say, so I can't say whether your claims are true or false. What is for sure wrong is the claim that the conservation laws don't hold on an event-by-event basis.
RUTA said:
Here is the outline:
1. Some people find entanglement to be mysterious, despite the fact that the formalism of QM maps beautifully to the experiments and entanglement is being used to develop new technologies.
This is irrelevant for physics.
RUTA said:
2. Entanglement is the key difference between quantum information processing and computing and its classical counterpart.
Nobody denies this.
RUTA said:
3. Quantum information theorists have reconstructed QM as a probability theory based on information-theoretic principles. In these reconstructions, they build the entirety of finite-dimensional QM from the indivisible fundamental unit of binary quantum information, i.e., the quantum bit (qubit).
This is no surprise either.
RUTA said:
4. The qubit differs from the classical bit for classical probability theory in one respect, i.e., continuous reversibility between pure states.
Also agreed.
RUTA said:
5. Facts 3 and 4 are summed up by Information Invariance & Continuity.
It's not clear to me what you mean by that.
RUTA said:
6. Therefore, the mystery of entanglement per quantum information theory ultimately resides in Information Invariance & Continuity. However, for those who are not practicing quantum information theorists, this is not a very transparent principle, so a physical example helps.
7. The qubit can be physically instantiated in any number of ways.
8. I (and Brukner, Zeilinger, Mueller, Dakic, etc.) find spin-1/2 particles to provide a nice visual example of the qubit (see figures in Post 113).
Sure.
RUTA said:
9. In that example, Information Invariance & Continuity manifests itself as "average-only" projection of spin angular momentum per the empirical and mathematical facts listed in Post 129. If you were using photons and polarizers instead, Information Invariance & Continuity manifests itself as "average-only" transmission through the polarizer (as explained in our published papers).
Please finally define what you mean by "average-only projection" with clear mathematical statements. One cannot communicate without clear mathematical definitions.
RUTA said:
10. Extrapolating "average-only" projection to the corresponding Bell state, we have "average-only" conservation of spin angular momentum between different inertial reference frames related by spatial rotations in the plane of symmetry where the reference frames are those of the corresponding set of complementary spin measurements. These facts are listed in Post 129.
11. Facts 1-10 above are empirical and mathematical facts independent of interpretation.
Conclusion: The interpretation-independent "mechanism" responsible for entanglement and its "mysterious" (non-classical) behavior can be summed up most generally per information-theoretic reconstructions of QM by Information Invariance & Continuity. This is manifested in physical instantiations of Bell state entangled qubits as "average-only" conservation (as defined) of the relevant entangled property.
You haven't made a clear mathematical statement, so it's impossible for me to understand the meaning of your vague, text-only statements.
RUTA said:
Everything you continue to request has already been posted, e.g., exact mathematical statements, example of state preparation, example of corresponding measurement, etc. Sorry if you still don't understand what has been presented, I can't think of any further simplifications. If anyone else sees how to make it simpler, please let me know!
You have not given a clear description of what you are talking about. This would mean:

(a) the system under consideration (one spin, many spins?)
(b) the state the system is prepared in (##\hat{\rho}=...##).
(c) which (spin?) observables are measured.
(d) what does "average-only validity of conservation laws" mean for you? In the standard meaning of these words it's clearly a wrong statement.
(e) what does "average-only projection" mean? It's not defined in the standard literature, and you haven't given a clear mathematical definition either.
 
  • Like
Likes physicsworks and PeroK
  • #158
Averages are indeed conserved, and I understand how a measurement can appear to violate conservation laws in QM. However, note that a measured state is in fact a partial state of a macroscopic measurement device entangled with the measured qubit. If the quantity in the measured qubit decreases by 1, could it be that the same quantity in the measurement device increases by ##1/n##, where n is the amount of that type of quantum information?
 
  • #159
PeroK said:
My understanding of what you say is that:

a) If you measure spin angular momentum about different axes, then the question of conservation is indeterminate - but conservation is not manifestly violated. And, indeed, the incompatibility of spin AM measurements about different axes precludes a comprehensive measurement of AM about all axes in any experiment. In that sense, three-dimensional spin AM in QM is fundamentally indeterminate. There is nothing special about Bell states in that respect.

b) You manifestly have conservation on average about all axes.

If that's correct, then saying you can't prove conservation of AM about the z-axis if you don't measure both particles about the z-axis is a hollow statement.
This reminds me of something I meant to say. The key to understanding the mystery of entanglement as presented by EPR, Bell, and Mermin (and many others, of course) is the assumption of counterfactual definiteness (CD), which seems to be warranted by the rotational symmetry of the Bell states giving exact conservation of spin AM in the same reference frame (i.e., when making the same spin measurements in the symmetry plane). However, if you assume CD when making different spin measurements, you get the Bell inequality, which is violated by QM. The violation of CD is characterized in many ways, e.g., complementarity, non-Boolean algebra, non-commutativity, superposition, qubit structure, Information Invariance & Continuity, etc. We're adding one more way to characterize it, i.e., "average-only" projection/conservation as I described. Why bother adding yet another characterization? Because it leads immediately to a direct analogy with SR, where NPRF (no preferred reference frame) has long been accepted as resolving the mysteries of time dilation and length contraction (see any intro physics textbook, for example).

If we had exact projection and conservation between different reference frames per CD, i.e., if we did measure ##\cos{\theta}## at ##\hat{b}## for ##|\psi\rangle = |z+\rangle## and Bob did measure ##\cos{\theta}## when Alice measured +1 for the Bell triplet state in the symmetry plane, then the ##\hat{z}## frame and Alice's frame would constitute "preferred frames" where you measure ##h## while everyone else measures a fraction of ##h## (like moving through the aether and getting some fraction of c for the speed of light).

I'll add that to the Insight, thnx.
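For concreteness, here is a minimal numerical sketch of the "average-only" conservation statement for the singlet state (the corresponding triplet state in its symmetry plane just flips the sign of the correlation). Only textbook QM is assumed; numpy and the illustrative angle ##\theta = \pi/3## are arbitrary choices. Conditioning on Alice's +1 outcome, Bob's individual outcomes are still only ##\pm 1## (in units of ##\hbar/2##), yet they average to ##-\cos\theta##:

Python:
import numpy as np

def spin(ang):
    # Spin component along an axis at angle `ang` to z in the x-z plane, in units of hbar/2
    return np.array([[np.cos(ang),  np.sin(ang)],
                     [np.sin(ang), -np.cos(ang)]])

# Singlet Bell state in the product basis |uu>, |ud>, |du>, |dd>
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

theta = np.pi / 3                        # illustrative relative angle between Alice's and Bob's axes
A = np.kron(spin(0.0), np.eye(2))        # Alice's spin component along z
B = np.kron(np.eye(2), spin(theta))      # Bob's spin component along b
P_A_plus = (np.eye(4) + A) / 2           # projector onto Alice's +1 outcome

psi_cond = P_A_plus @ psi
psi_cond = psi_cond / np.linalg.norm(psi_cond)

print(psi_cond @ B @ psi_cond)           # -cos(theta) = -0.5: conserved only on average
print(np.linalg.eigvalsh(spin(theta)))   # each individual outcome is still +1 or -1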
 
  • #160
Please make clear statements! Please define what you mean by "average-only" conservation. In the usual meaning of these words it contradicts all empirical evidence, so you must mean something different.

The only thing I can clearly guess is that you discuss two spins ##s=1/2## in the singlet state,
$$|S=0,M=0 \rangle=\frac{1}{\sqrt{2}} (|1/2,-1/2 \rangle-|-1/2,1/2 \rangle).$$
Angular-momentum conservation can now be discussed for the usual example of Bohm's version of EPR: The spin-singlet state is prepared by the decay of a spin-0 particle in its rest frame (there's no merit in overcomplicating things by discussing this in another frame where the particle moves, but if you want to, you can simply make a unitary transformation to such a frame; it won't change any conclusions, of course). Then angular-momentum conservation in fact dictates the above singlet state, and this implies that angular-momentum conservation is fulfilled event by event.

To empirically check angular-momentum conservation you must measure the spin of both particles along the same direction in each trial, i.e., you first measure along a direction ##\vec{n}_1##, and the prediction is that with probability 1/2 you get either ##m_1=+1/2## and ##m_2=-1/2## or ##m_1=-1/2## and ##m_2=+1/2##. The sum is always ##M=m_1+m_2=0##, i.e., angular-momentum conservation for this spin component holds event by event. Now you repeat this for another direction ##\vec{n}_2##, and again you find that angular-momentum conservation holds event by event.

It doesn't make sense to try to confirm angular-momentum conservation by measuring the spin of particle 1 in one direction ##\vec{n}_1## and that of particle 2 in another direction ##\vec{n}_2##, because then you never measure any total spin component. You cannot infer from such a measurement whether any component of the total spin is the same as before the mother particle's decay. To quote Peres: "Unperformed experiments have no results".

What's clear to me from all the confirmations of quantum theory against local HV theories is that observables only take predetermined values if the system is prepared in a corresponding state (an eigenstate of the corresponding self-adjoint operators of the measured observables). Thus, of course, "counterfactual definiteness" is violated.
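As a minimal numerical check of this event-by-event statement (assuming only the singlet state above and the standard Born rule; numpy and the sampled directions are illustrative choices): for any common measurement direction, the total spin component along that direction has both zero expectation value and zero variance in the singlet state, so every single trial gives ##M=0##.

Python:
import numpy as np

def spin(ang):
    # Spin component along an axis at angle `ang` to z in the x-z plane, in units of hbar/2
    return np.array([[np.cos(ang),  np.sin(ang)],
                     [np.sin(ang), -np.cos(ang)]])

# Singlet state |S=0,M=0> in the product basis |uu>, |ud>, |du>, |dd>
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

for ang in np.linspace(0.0, np.pi, 5):
    n = spin(ang)
    M = np.kron(n, np.eye(2)) + np.kron(np.eye(2), n)   # total spin component along the common axis
    mean = psi @ M @ psi
    var = psi @ (M @ M) @ psi - mean**2
    print(f"direction {ang:.2f} rad: <M> = {mean:+.2f}, Var(M) = {var:.2f}")
# Var(M) = 0 for every direction: the total component vanishes in every single trial,
# i.e. event-by-event conservation whenever both spins are measured along the SAME axis.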
 
  • Like
Likes Dragrath, LittleSchwinger and physicsworks
  • #161
RUTA said:
I have to sign off now and get back to writing the Insight and book on what I've been presenting here. If anyone has any suggestions for how to make this presentation easier to understand, contact me directly via a Physics Forums Conversation.
I am sympathetic to what you are working on. It reminds me of some of the other work on unifying Probability and Quantum Probability as outlined in https://www.math.ucdavis.edu/~greg/intro-2005.pdf

One suggestion is to start with a more concise and mathematical presentation that could be more quickly digested by experts.
 
  • Like
Likes vanhees71 and hutchphd
  • #162
CoolMint said:
RUTA is insisting that nobody understands quantum mechanics, especially how it relates to the 'classical' world.
I know and understand Vanhees' view. But I don't think RUTA is saying nobody understands QM, except in the sense that nobody understands anything. By this, I mean every theory, every single one, is based on assumptions that are simply accepted. In that sense, ultimate knowledge is unobtainable. Besides that, science rests on doubt: always, it must be somewhere in the back of your mind that this may be wrong. RUTA is saying that he believes there is a relativity principle similar to the POR (which says the laws of physics are the same in all inertial frames, or frames traveling at constant velocity relative to an inertial frame). It is a beautiful principle of maximum symmetry in inertial frames. But that does not explain why it is true. Questions like that are rampant throughout science and always will be. My favourite area of science is how to formulate theories so that the assumptions are like the POR: beautiful and intuitive. But they are assumptions whose validity depends on experiments. That is the key.


And, of course, Brian Cox is right - we all should read Feynman. Caveat - not his Lectures on Physics except as a supplement to a more usual physics text or after, without going into why.

Thanks
Bill
 
Last edited:
  • #163
Fra said:
Without that SOLID support for making preparations and logging massive amounts of data, how would you corroborate QM in the first place?
In many ways. One way often not mentioned is showing classical mechanics is a limiting case of QM. A common way is showing Feynman's path integral approach leads to the Principle Of Least Action. But a more sophisticated way is by the use of a process called coarse graining:
https://www.sciencenews.org/blog/context/gell-mann-hartle-spin-quantum-narrative-about-reality

Thanks
Bill
 
  • #164
bhobba said:
In many ways. One way often not mentioned is showing classical mechanics is a limiting case of QM. A common way is showing Feynman's path integral approach leads to the Principle Of Least Action. But a more sophisticated way is by the use of a process called coarse graining:
https://www.sciencenews.org/blog/context/gell-mann-hartle-spin-quantum-narrative-about-reality

Thanks
Bill
I am aware of that, but that is missing my point. I am not actually claiming that there IS a classical reality; on the contrary :smile: there is no evidence for any sharp Heisenberg cuts anywhere in nature.

But if you look at the theory, determining distributions with certainty and processing enough data to infer Hamiltonians and distributions etc. presumes, conceptually (IMHO at least), a SOLID reference frame for information processing and a solid spacetime. Yet this SOLID reference only exists approximately, while in the theory we treat the hard constraints as eternal and timeless. This does not add up for me.

If one only cares about the practical success this may seem esoteric, but if one looks at the structure of the theory and how its elements presumably map to nature, then QM is an effective theory at best, which means it is a potential fallacy to take the "truncated" wisdom from the effective theory and extrapolate it to hold even when searching for unification (GUT as well as gravity).

/Fredrik
 
  • #165
bhobba said:
classical mechanics is a limiting case of QM.
Classical mechanics is not just Newton's "theory"; it also represents that there is a place where information can be encoded (with certainty) and SHARED among observers. Without this, we cannot construct and conduct a quantum preparation and experiment, and observers can't agree with certainty on distributions.

So what you say, IMO implies that not only CM but also QM is "emergent". IF you agree on that, then we agree. But I am searching for HOW QM emerges, and how that is described.

So to restate my point: what you describe about how QM works is successful, explains CM in the appropriate limits, and seems to represent what we see in nature, BUT I think the theory of QM (with fixed Hilbert spaces and god-given Hamiltonians) does not seem to describe the actual inference we do.

/Fredrik
 
  • Skeptical
Likes bhobba and PeroK
  • #166
To make the point even clearer:
Fra said:
I am not actually claiming that there IS a classical reality; on the contrary :smile: there is no evidence for any sharp Heisenberg cuts anywhere in nature.
This is also why the kind of "observers" that is required to construct QM also does not exist. This is the core point.

Yes, it effectively works anyway. But when analysing the logical structure of the theory from inference, this is a problem for me at least.

/Fredrik
 
  • #167
Fra said:
Classical mechanics is not just Newton's "theory"; it also represents that there is a place where information can be encoded (with certainty) and SHARED among observers.

May I suggest you read Landau - Mechanics? It contains nothing about - information. It is, however, Classical Mechanics (non-relativistic) based on the Principle Of Least Action, easily derivable from Feynman's Path Integral Formulation.

As I have said, we cannot directly interact with the QM world. We know about it from its interactions with the classical world of everyday experience or, increasingly, from strange phenomena here in the everyday world that can only be explained by QM. Now, pinning down what the everyday world is, is a deep philosophical issue and, by the forum rules, not on topic here. For our purpose, a world out there that we experience is taken as a given. Since everything is quantum, how such a world is a limiting case of a theory that assumes it in the first place is a deep issue. The surprising thing is that tools such as coarse-grained histories, decoherence, etc., have allowed significant progress to be made - although problems remain. If you would like more detail, may I suggest a modern interpretation like Consistent Histories that delves into such issues:

https://quantum.phys.cmu.edu/CHS/histories.html

This is not an endorsement of Consistent Histories except that it is an interesting interpretation many call - Copenhagen done right. I have laid my cards on the table regarding interpretations far too many times to repeat it here. It is a reasonable starting point to answer the questions you seem interested in.

Thanks
Bill
 
Last edited:
  • #168
Fra said:
This is also why the kind of "observers" that is required to construct QM also does not exist. This is the core point.
QM can be formulated in a way that does not require observers. However, it is a good place to start viewing QM as a generalised probability theory, although, strictly speaking, even that view does not require observers. It would require a deep sojourn into the philosophy of probability, again not on topic here. If it worries you, look at probability as the Kolmogorov axioms and Generalised Probability Theory as a generalisation of those axioms. Applying an axiomatic mathematical system is also a deep but philosophical issue. As in Euclidean Geometry, we simply use intuitive ideas such as a point having position and no size, and a line having length but no breadth. Of course, such things don't exist, but they are useful abstractions in applications. It is similar to the inertial frames of SR.
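As a toy illustration of that generalisation (a minimal sketch using numpy; the particular numbers are arbitrary): a Kolmogorov state is just a probability vector and events are indicator functions, while a quantum state is a density matrix and "events" are effects, with probabilities given by the Born rule ##P(E)=\mathrm{Tr}(\hat{\rho}\hat{E})##.

Python:
import numpy as np

# Classical (Kolmogorov) probability: a probability vector over outcomes;
# an event is an indicator vector and P(event) is a simple dot product.
p = np.array([0.25, 0.75])            # a biased coin
heads = np.array([1.0, 0.0])          # indicator of "heads"
print(p @ heads)                      # 0.25

# One generalisation (quantum): states are density matrices rho, "events" are
# effects E with 0 <= E <= 1, and P(E) = Tr(rho E) (the Born rule).
ket = np.array([np.cos(np.pi / 8), np.sin(np.pi / 8)])   # a pure qubit state
rho = np.outer(ket, ket.conj())
E_up = np.array([[1.0, 0.0], [0.0, 0.0]])                # effect for "spin-up along z"
print(np.trace(rho @ E_up).real)                         # cos^2(pi/8) ~ 0.854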

Thanks
Bill
 
  • Like
Likes Dragrath and vanhees71
  • #169
bhobba said:
May I suggest you read Landau - Mechanics? It contains nothing about - information. It is, however, Classical Mechanics (non-relativistic) based on the Principle Of Least Action, easily derivable from Feynman's Path Integral Formulation.

As I have said, we cannot directly interact with the QM world.
How do you come to that conclusion? To the contrary, with more and more advanced technology we are more and more able to observe the "quantum world" (as if there were any other world than the "quantum world"). Handling generic quantum systems is nowadays more and more a matter of applied science, and more and more universities of applied sciences develop curricula for the development of "quantum technology".
 
  • Like
Likes physika and WernerQH
  • #170
We do not have direct sensory interaction with the quantum world, as, for example, the Planck constant is too small to make any useful difference or contribution, if the quantum world even exists as such between measurements. In that sense, we are probing indirectly, as you need classical-like machinery, usually bigger than a kitchen table, which is already classical.
Maybe bhobba had this in mind by 'direct' experience of the quantum world.
 
  • #171
Obligatory Consistent Histories take: We probe the quantum world by identifying properties which are 'ambivalent' (having both a classical and a quantum description, e.g. the collective degrees of freedom of some measurement apparatus), and using quantum theory to establish a logical relation between these ambivalent properties and the quantum properties we are interested in probing. I.e. Not so much a Heisenberg cut, but a 'Heisenberg overlap'
 
  • #172
Fra said:
Classical mechanics is not just Newton's "theory"; it also represents that there is a place where information can be encoded (with certainty) and SHARED among observers.
bhobba said:
May I suggest you read Landau - Mechanics? It contains nothing about - information.

A different suggestion (SCNR): Fra, your views seem to be sufficiently evolved and detailed that it would make sense to write them down in a more coherent form than just as comments on other people's questions and answers. Maybe as an FQXi essay, maybe as a paper of some form, maybe as a series of blog posts, or... I am not suggesting that you should link your PF account to those "external activities" and give away more of your identity than you want. But I do suggest that you should do some activity in that direction. Otherwise you risk kidding yourself with respect to your views and their impact.
 
  • #173
RUTA said:
When Alice and Bob make measurements in different reference frames, Alice (Bob) says Bob (Alice) must average his (her) data according to her (his) partition of the data in order to conserve spin angular momentum. All of this follows from the exact conservation of spin angular momentum responsible for the Bell state with its rotational symmetry to begin with. As long as Alice and Bob are making measurements in the same reference frame (same orientation relative to source) their outcomes will be exactly in accord with conservation of spin angular momentum. And, not surprisingly, that can be easily accounted for via local realism. The “weirdness” of entanglement occurs for measurements in different reference frames. That’s where the relative “average-only” conservation holds.
a sort of statistical consistency?
 
Last edited:
  • #174
bhobba said:
May I suggest you read Landau - Mechanics? It contains nothing about - information. It is, however, Classical Mechanics (non-relativistic) based on the Principle Of Least Action, easily derivable from Feynman's Path Integral Formulation.
Thanks for the suggestion. I haven't read that book, but if its point is to start with some principle of least action, given a Lagrangian or Hamiltonian, I can't see how that will solve any of the deeper questions.

bhobba said:
https://quantum.phys.cmu.edu/CHS/histories.html

This is not an endorsement of Consistent Histories except that it is an interesting interpretation many call - Copenhagen done right.
I'm sure it's interesting but I am not a fan of that interpretation.

/Fredrik
 
  • #175
bhobba said:
QM can be formulated in a way that does not require observers.
Yes, this is what many even want to do, i.e., solve the measurement problem by REMOVING the observer.
This is the opposite of the strategy I suggest.

I take a more QBist stance here: the agent is CENTRAL. Doing away with this is throwing the baby out with the bathwater.

When I wrote that the observers needed to construct QM "do not exist", I didn't mean that observers are not important or a problem, just that the idealisation of the "observer" that supports the theory also defines it. Instead of doing away with observers, I suggest only that we realize that an observer is more than merely a spacetime frame of reference.

bhobba said:
However, it is a good place to start viewing QM as a generalised probability theory, although, strictly speaking, even that view does not require observers.
I do view it a bit like "generalised probability" as well, but I think we see it in different ways and will most certainly not agree. The "generalised probability" is what I call "inference", and it's more than just probability on fixed state spaces, so I think inference is a better name. In this generalisation the "observer" is the subsystem that detects, postprocesses, and encodes the "observations" (i.e., the agent).

bhobba said:
It would require a deep sojourn into the philosophy of probability, again not on topic here
Indeed, the boots are already in the mud.

/Fredrik
 
  • #176
gentzen said:
A different suggestion (SCNR): Fra, your views seem to be sufficiently evolved and detailed that it would make sense to write them down in a more coherent form than just as comments on other people's questions and answers. Maybe as an FQXi essay, maybe as a paper of some form, maybe as a series of blog posts, or... I am not suggesting that you should link your PF account to those "external activities" and give away more of your identity than you want. But I do suggest that you should do some activity in that direction. Otherwise you risk kidding yourself with respect to your views and their impact.
Thanks for your concern! But I have no illusions about anything here. I've been struggling with this for 25 years by now, and the a priori chance of working this out is of course nil. I decided long ago not to officially publish any vague ideas anywhere, but to work the theory out and publish something iff it solves some of the major problems. Before then, no one will care about this any more than I care about string theory. Whoever has an idea bears the responsibility to realize it. Everything else along the way is informal discussion for me, which is often interesting for several reasons.

/Fredrik
 
  • #177
vanhees71 said:
How do you come to that conclusion? To the contrary, with more and more advanced technology we are more and more able to observe the "quantum world" (as if there were any other world than the "quantum world"). To handle generic quantum system nowadays becomes more and more applied, and more and more universities of applied sciences develop curricula for the development of "quantum technology".

I think at the moment, direct observation of the quantum world, such as with the scanning tunnelling microscope, requires the use of a macro object. But I take your point - technology is progressing rapidly, and such may (perhaps even likely) not hold in the future. So I stand corrected. It is a leftover from the early days of QM and is only an intuitive starting point to a theory based on observables. Even then, there are several QM formulations, all equivalent, some of which do not require observation:
http://math.bu.edu/people/mak/papers/Styer Am J Phys 2002.pdf

So I retract my statement, except as a starting point to a deeper understanding of QM.

Thanks
Bill
 
  • #178
vanhees71 said:
How do you come to that conclusion? To the contrary, with more and more advanced technology we are more and more able to observe the "quantum world"
This is what I consider to be the meaning of the indirect contact. I.e., the increasing complexity of both the preparation and the postprocessing of large amounts of information for the image to emerge is what creates a sort of distance in the inference chain. But yes, technology lets us reach further.

/Fredrik
 
  • #179
Sure, without measurement devices and other technology there'd not be much of physics and the other natural sciences as we know it today!
 
  • #180
It is just impossible to go much below a scale of several nm, as the familiar building blocks of matter turn to quantumness (unpredictability). Sure, single atoms can still be manipulated, but on a very different set of terms. I'd love to be able to dive into that sea of unpredictability.
 
  • #181
vanhees71 said:
Causality means that the state of a (quantum) system can be influenced only by the past and not the future. In relativistic models of spacetime this implies that there cannot be causal influences from space-like separated events. In both classical and quantum relativistic theories this has been realized by a strict use of the paradigm of local field theories. In quantum field theory it is realized by a formal mathematical demand called the "microcausality principle": the quantum fields are the building blocks for all the operators that describe observables at a point in spacetime (usually densities like charge density, energy-momentum density, etc.), and these local operators must commute with the Hamiltonian density for space-like separated space-time arguments. This rules out any "spooky actions at a distance", i.e., causal effects can only be due to signals that propagate with a speed less than or equal to the speed of light in vacuum.
Why is it that signals in QFT can't propagate backwards in time at the speed of light in a vacuum (or less), thereby violating causality?

The only interactions in the SM that have a preferred direction of time on their face are those involving the W boson, and even then, CP violation is well quantified in the CKM/PMNS matrices and CPT symmetry still holds to limit the way that time asymmetry can change the relevant laws. Moreover, observations of entanglement do not generically involve interactions that include W bosons (except, perhaps, virtual W bosons in higher-order loops).

For example, suppose that two particles are entangled and one of them is measured sometime later.

Why can't information regarding the resolution of that measurement travel back in time to the point of entanglement along the path that we perceive that the particle took to get there; and then, the information could be transmitted to the other particle from the point of entanglement to the point in time where the second entangled particle is measured?

The two particles are connected by an unbroken chain within space-time to each other in the same light cone, so that wouldn't violate locality, only causality. (It isn't even obvious to me in that case that "reality", which I agree is poorly named, would be broken.)

Doesn't the observation that a Feynman diagram can be rotated in any way desired and still hold true imply that the SM does not require causality, in the sense of there being a preferred direction of time?
 
  • #182
ohwilleke said:
Why is it that signals in QFT can't propagate backwards in time at the speed of light in a vacuum (or less), thereby violating causality?
This looks like the transactional interpretation of QM; discussion of that belongs in a separate thread in the interpretations subforum.
 
  • #183
PeterDonis said:
This looks like the transactional interpretation of QM; discussion of that belongs in a separate thread in the interpretations subforum.
I'm sure you are right. But could you help me understand what makes this a transactional interpretation of QM as opposed to what was already being discussed here? I'm not sure I grasp what the distinction is.
 
  • #184
ohwilleke said:
could you help me understand what makes this a transactional interpretation of QM as opposed to what was already being discussed here?
Actually, I just noticed that this thread is in the interpretations subforum. So the transactional interpretation can be discussed here, at least as far as how it would account for entanglement and the associated correlations that violate the Bell inequalities, and doesn't require a separate thread.
 
  • #185
ohwilleke said:
could you help me understand what makes this a transactional interpretation of QM
Look up the transactional interpretation, and you will see that it says what you were saying in the post I responded to.
 
  • #186
PeterDonis said:
Look up the transactional interpretation, and you will see that it says what you were saying in the post I responded to.
O.K., I had mostly been wondering why this wasn't an interpretation of quantum mechanics, so your previous post actually answered that question, although I appreciate knowing what this interpretation is called.

The transactional interpretation of quantum mechanics (TIQM) takes the wave function of the standard quantum formalism, and its complex conjugate, to be retarded (forward in time) and advanced (backward in time) waves that form a quantum interaction as a Wheeler–Feynman handshake or transaction. It was first proposed in 1986 by John G. Cramer, who argues that it helps in developing intuition for quantum processes.
"Cramer claims it avoids the philosophical problems with the Copenhagen interpretation and the role of the observer, and resolves various quantum paradoxes, such as quantum nonlocality, quantum entanglement and retrocausality." (from the article linked below).

and relatedly:

The Wheeler–Feynman absorber theory (also called the Wheeler–Feynman time-symmetric theory), named after its originators, the physicists Richard Feynman and John Archibald Wheeler, is an interpretation of electrodynamics derived from the assumption that the solutions of the electromagnetic field equations must be invariant under time-reversal transformation, as are the field equations themselves. Indeed, there is no apparent reason for the time-reversal symmetry breaking, which singles out a preferential time direction and thus makes a distinction between past and future. A time-reversal invariant theory is more logical and elegant. Another key principle, resulting from this interpretation and reminiscent of Mach's principle due to Tetrode, is that elementary particles are not self-interacting. This immediately removes the problem of self-energies.

It makes sense that I'm inclined towards this approach, since most of what I initially learned about quantum mechanics I learned from reading things written by Feynman.

The link calls this interpretation explicitly non-local, however, and I'm still not clear on why it is non-local instead of acausal. Apparently, it defines causality differently than @vanhees71. But, I suppose that that is really just splitting hairs.
 
  • #187
ohwilleke said:
Why is it that signals in QFT can't propagate backwards in time at the speed of light in a vacuum (or less), thereby violating causality?
This is an assumption we make in all of physics.

In classical electrodynamics we choose the retarded solutions connecting the sources (charge and current densities) with the em. field. Of course there are infinitely many Green's functions that formally also solve the Maxwell equations (among them the advanced Green's function). The reason is that classical electrodynamics is time-reversal invariant, and we need the causality assumption in addition to the Maxwell equations to impose the corresponding boundary conditions to select the retarded solution as the one describing the emission of em. waves from their sources. Of course the theory must be formulated such that such a "causality choice" is possible, and the wave equation is such an equation, and it is closely related to the relativistic spacetime model, Minkowski space, which admits such a "causal order".

In relativistic QFT the way to enable the "causal order" is the microcausality constraint, i.e., that local observables commute at space-like separated arguments. Among other things that makes the time-ordering in the perturbative evaluation of S-matrix elements frame-independent and the S-matrix elements Poincare covariant. Also it ensures the cluster-decomposition principle.
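To put the two ingredients above into formulas (standard textbook forms; sign and normalization conventions differ between texts), the retarded Green's function of the wave equation, which vanishes for ##t<0## and thereby encodes "cause before effect", is
$$\left(\frac{1}{c^2}\partial_t^2-\vec{\nabla}^2\right)G_{\text{ret}}(t,\vec{x})=\delta(t)\,\delta^{(3)}(\vec{x}),\qquad G_{\text{ret}}(t,\vec{x})=\frac{\delta(t-|\vec{x}|/c)}{4\pi|\vec{x}|},$$
and the microcausality constraint on local observables reads
$$[\hat{\mathcal{O}}_1(x),\hat{\mathcal{O}}_2(y)]=0\quad\text{for space-like separated } x \text{ and } y.$$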
ohwilleke said:
The only interactions in the SM that have a preferred direction of time on their face are those involving the W boson, and even then, CP violation is well quantified in the CKM/PMNS matrices and CPT symmetry still holds to limit the way that time asymmetry can change the relevant laws. Moreover, observations of entanglement do not generically involve interactions that include W bosons (except, perhaps, virtual W bosons in higher-order loops).
Sure, among other things the weak interaction breaks time-reversal invariance. This, however, just says that for some processes the time-reversed process does not exist in nature.
ohwilleke said:
For example, suppose that two particles are entangled and one of them is measured sometime later.

Why can't information regarding the resolution of that measurement travel back in time to the point of entanglement along the path that we perceive that the particle took to get there; and then, the information could be transmitted to the other particle from the point of entanglement to the point in time where the second entangled particle is measured?

The two particles are connected by an unbroken chain within space-time to each other in the same light cone, so that wouldn't violate locality, only causality. (It isn't even obvious to me in that case that "reality", which I agree is poorly named, would be broken.)

Doesn't the observation that a Feynman diagram can be rotated in any way desired and still hold true imply that the SM does not require causality, in the sense of there being a preferred direction of time?
The preferred direction of time in the sense of a "causal time arrow" is, indeed, an assumption you impose on any physical theory. If a model (like electrodynamics or quantum chromodynamics) is time-reversal invariant, then for any process the time-reversed process is also possible in Nature (according to this theory).

In our classical example suppose you have a point-like source of radiation (i.e., time-dependent charges and currents within some small region). The usual situation of course is that electromagnetic waves are propagating out from this source, described by the retarded solution of the Maxwell equations. As an example take the Hertzian dipole radiation treated in any textbook on electrodynamics.

The time-reversed situation is, in principle, possible. It describes some wave propagating towards the source in such a way that it is completely absorbed by this source. This situation indeed does not violate any laws of physics (including causality!). It is, however, very difficult to realize in practice. You'd have to somehow arrange the sources of this incoming wave precisely such that it gets completely absorbed when it arrives at the localized charge distribution, and you'd have to do this over a wide area far away from it very precisely, and this is practically impossible.

Of course, it's not impossible for microscopic situations, where you often can prepare the "time-reversed process".
 
  • #188
vanhees71 said:
The time reversed situation is, in principle, possible.
Don't you agree that a time-symmetric picture is more natural? Certainly for microscopic processes.

The virtue of the transactional interpretation (TI) is that it incorporates the Born rule, which in most other interpretations is an incongruent addition to unitary evolution, leading to the infamous measurement problem. Every physicist should know that a ket by itself is meaningless, and measurable quantities arise only when it is combined with a bra. And those have opposite time-dependencies. (At least in the Schrödinger and interaction pictures.)
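To spell out the "opposite time-dependencies" in the Schrödinger picture (standard textbook notation):
$$|\psi(t)\rangle=\mathrm{e}^{-\mathrm{i}\hat{H}t/\hbar}\,|\psi(0)\rangle,\qquad \langle\psi(t)|=\langle\psi(0)|\,\mathrm{e}^{+\mathrm{i}\hat{H}t/\hbar},\qquad P(a,t)=\langle\psi(t)|a\rangle\langle a|\psi(t)\rangle,$$
so any measurable probability combines a forward-evolving ket with its conjugate, "backward-evolving" bra.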

The deficiency of TI is that it doesn't explain how forward and backward traveling waves ("offer" and "confirmation" waves) give rise to transactions ("handshakes"). Moreover, these waves cannot be waves in real space. They are merely mathematical devices describing correlations between events (Green functions), and the formalism was worked out long before Cramer introduced TI: the Schwinger/Keldysh closed time-path formalism.

To give the argument another twist, one could rephrase TI in terms of particles: a transaction could be seen as (for example) the exchange of a photon traveling forwards in time and an anti-photon traveling backwards in time. Of course we are habituated to think of objects moving forward in time, but the formalism doesn't dictate this. In fact I think it's futile to try to explain quantum processes in terms of objects (be they waves or particles), because statements about the properties of those objects are inevitably contradictory. The formalism doesn't require the existence of such objects in the classical sense at all; electrons and photons enter QED only as correlation functions (Green functions) describing correlations between events.

A typical Bell-type experiment involves some "wiggling" of electrons in a Ca-atom, followed by similar "wiggling" of electrons in the detectors a few meters away, a few nanoseconds later. On what happens in between theory remains silent. We should be happy to have a theory that predicts the statistical regularities (non-local correlations) of those short-lived, localized current fluctuations. We shouldn't ask for more. :smile:
 
  • #189
WernerQH said:
Don't you agree that a time-symmetric picture is more natural?
I don't, because our experience is not time symmetric. So our physical model should not be time symmetric either.

Note that time symmetry of laws is not the same as time symmetry of a model. A particular model is based on a particular solution of the laws, not just the laws themselves. Time symmetric laws can have time asymmetric solutions; such solutions just have to come in pairs, each one the time reverse of the other. So there is no problem in building time asymmetric models using time symmetric laws.
 
  • Like
Likes Dragrath, bhobba and vanhees71
  • #190
PeterDonis said:
So there is no problem in building time asymmetric models using time symmetric laws.
I don't think anyone would argue otherwise. But this is the interpretation section. Doesn't this ad hoc requirement bother you?
 
  • #191
WernerQH said:
Don't you agree that a time-symmetric picture is more natural? Certainly for microscopic processes.
I don't know. I don't consider the "discrete spacetime symmetries" very intuitive. Why should nature be invariant under time-reversal? Our everyday experience is also such that there's a clear direction of time.

In physics the most fundamental "arrow of time" is the simply postulated causal ordering, i.e., the cause of an event must temporally precede that event.

Then one can show that various other "arrows of time" follow from this fundamental "causal arrow of time". One is the "thermodynamic arrow of time", which is defined as that direction of time for which the entropy of a coarse-grained description of the dynamics is non-decreasing (staying constant then defines a thermal-equilibrium state). The usual way to derive it is to use the Boltzmann equation, which follows from the microscopic dynamics by neglecting correlations at the two-body level, i.e., the two-body phase-space distribution function in the collision term is assumed to be well approximated by the corresponding product of one-body distribution functions. The H-theorem then follows from unitarity of the S-matrix (note that you don't need the assumption of time-reversal invariance, as often claimed in the textbook literature; see Landau & Lifshitz vol. X for this important point). Looking closely at the derivation of the Boltzmann equation, you see that this thermodynamic direction of time comes just from the assumption of the causal direction of time.
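For reference, the H-theorem alluded to here reads, in its standard textbook form (normalization conventions aside),
$$H(t)=\int\mathrm{d}^3 x\,\mathrm{d}^3 p\,f(t,\vec{x},\vec{p})\,\ln f(t,\vec{x},\vec{p}),\qquad \frac{\mathrm{d}H}{\mathrm{d}t}\le 0,$$
i.e., the coarse-grained entropy ##S=-k_{\text{B}}H## is non-decreasing in the causal direction of time, with equality in equilibrium.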
WernerQH said:
The virtue of the transactional interpretation (TI) is that it incorporates the Born rule, which in most other interpretations is an incongruent addition to unitary evolution, leading to the infamous measurement problem. Every physicist should know that a ket by itself is meaningless, and measurable quantities arise only when it is combined with a bra. And those have opposite time-dependencies. (At least in the Schrödinger and interaction pictures.)
I'm not familiar with the various interpretations. For me Born's rule is just another postulate of QT, which connects the abstract formalism to real-world observations, providing the minimal interpretation by just saying how to get the probabilities for measurement outcomes given the initial (pure or mixed) state and the Hamiltonian of the system, providing the dynamics.
WernerQH said:
The deficiency of TI is that it doesn't explain how forward and backward traveling waves ("offer" and "confirmation" waves) give rise to transactions ("handshakes"). Moreover, these waves cannot be waves in real space. They are merely mathematical devices describing correlations between events (Green functions), and the formalism was worked out long before Cramer introduced TI: the Schwinger/Keldysh closed time-path formalism.
The Schwinger-Keldysh formalism is based on the usual time evolution of quantum theory, combining the time-ordered ##\hat{U}(t,t_0)## and anti-time-ordered ##\hat{U}^{\dagger}(t,t_0)## when calculating the time evolution of the statistical operator. It's a calculational tool. I don't understand what this should have to do with forward or backward traveling waves.
WernerQH said:
To give the argument another twist, one could rephrase TI in terms of particles: a transaction could be seen as (for example) the exchange of a photon traveling forwards in time and an anti-photon traveling backwards in time. Of course we are habituated to think of objects moving forward in time, but the formalism doesn't dictate this. In fact I think it's futile to try to explain quantum processes in terms of objects (be they waves or particles), because statements about the properties of those objects are inevitably contradictory. The formalism doesn't require the existence of such objects in the classical sense at all; electrons and photons enter QED only as correlation functions (Green functions) describing correlations between events.
In the standard formalism of QFT nothing travels backwards in time, and photons are anyway identical with anti-photons, because photons are strictly neutral. The claim that antiparticles are particles moving backward in time is ironically exactly the wrong interpretation of the formalism. Microcausality demands that the free-field operators must always contain both positive- and negative-frequency modes, but one writes a creation operator in front of the negative-frequency modes and a destruction operator in front of the positive-frequency modes, so that both particles and antiparticles have positive energy and both move forward in time. In fact this "Feynman-Stueckelberg trick" saves causality by implementing the microcausality constraint.
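For reference, the standard free-field mode expansion (here for a charged scalar field, in natural units; normalization conventions vary) makes this explicit: the annihilation operator ##\hat{a}## accompanies the positive-frequency modes and the antiparticle creation operator ##\hat{b}^{\dagger}## the negative-frequency modes,
$$\hat{\phi}(x)=\int\frac{\mathrm{d}^3 k}{(2\pi)^3\sqrt{2\omega_{\vec{k}}}}\left[\hat{a}(\vec{k})\,\mathrm{e}^{-\mathrm{i}k\cdot x}+\hat{b}^{\dagger}(\vec{k})\,\mathrm{e}^{+\mathrm{i}k\cdot x}\right]_{k^0=\omega_{\vec{k}}},\qquad\omega_{\vec{k}}=\sqrt{\vec{k}^2+m^2},$$
so both particles and antiparticles have positive energy ##\omega_{\vec{k}}## and both propagate forward in time, and with both sets of modes present one gets ##[\hat{\phi}(x),\hat{\phi}^{\dagger}(y)]=0## for space-like separated ##x## and ##y##.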
WernerQH said:
A typical Bell-type experiment involves some "wiggling" of electrons in a Ca-atom, followed by similar "wiggling" of electrons in the detectors a few meters away, a few nanoseconds later. On what happens in between theory remains silent. We should be happy to have a theory that predicts the statistical regularities (non-local correlations) of those short-lived, localized current fluctuations. We shouldn't ask for more. :smile:
 
  • Like
Likes LittleSchwinger, physika and ohwilleke
  • #192
hutchphd said:
Doesn't this ad hoc requirement bother you?
Why is it an "ad hoc" requirement that our models should match what we actually observe?
 
  • Like
Likes Dragrath and vanhees71
  • #193
PeterDonis said:
Why is it an "ad hoc" requirement that our models should match what we actually observe?
My interpretation is that the apparent/effective timelessness of microphysics is plausible due to how the laws and states are decomposed as we infer them, and I do not worry about it. This asymmetry should go away if we treat information about states and information about laws on a similar footing.

I interpret it to be related to the asymmetry between the way we infer the laws and the way we infer the initial state (preparation). This asymmetry holds most clearly for small subsystems; for cosmological-scale observations the inference of states and laws blurs more, because we (the observer) cannot observe these phenomena sufficiently many times with many different initial conditions.

So what I find to be "ad hoc" is the artificial separation of "knowledge about laws" and "knowledge about initial conditions"; from the view of inference this sticks out as an inconsistency. Some sort of understanding of unification or "relation" between states and laws is missing.

If this didn't make much sense, there is a whole book-length attempt at an explanation (Time Reborn, by Lee Smolin).

/Fredrik
 
  • #194
vanhees71 said:
The Schwinger-Keldysh formalism is based on the usual time evolution of quantum theory, combining the time-ordered ##\hat{U}(t,t_0)## and anti-time-ordered ##\hat{U}^{\dagger}(t,t_0)## when calculating the time evolution of the statistical operator. It's a calculational tool. I don't understand what this should have to do with forward or backward traveling waves.
When you apply contractions of operators, anti-time-ordering implies that you have propagators going backwards in time. I understand that you prefer classical habits of thought and choose to simply redefine what "propagates".
 
  • #195
PeterDonis said:
Why is it an "ad hoc" requirement that our models should match what we actually observe?
It is "ad hoc" because that is what it is (it flows from no more fundamental consideration)
It clearly doesn't bother you so you have answered my question I believe.. I guess part of me thinks the arrow of time shoukd somehow appear on a celestial billboard !
 
  • #196
hutchphd said:
It is "ad hoc" because that is what it is (it flows from no more fundamental consideration)
By this definition, every requirement is ultimately "ad hoc" because it ultimately rests on some proposition that just "is what it is" and doesn't flow from a "more fundamental consideration". Ultimately there must always be some set of propositions that are that way; otherwise we have an infinite regress of "more fundamental considerations" that never bottom out in anything.
 
  • Like
Likes Dragrath, vanhees71 and hutchphd
  • #197
Absolutely true. But we do not know when we have reached some minimum number of "fundamental" truths (nor perhaps can we know). A phenomenological theory with 100 adjustable parameters is not equivalent to QFT even though each is rooted in "what it is". The fact that all theories are "ad hoc" does not make them equally interesting.
 
  • #198
hutchphd said:
we do not know when we have reached some minimum number of "fundamental" truths
Yes, but the rule you raised a question about was "our models needs to match our actual observations". Wouldn't this end up being part of that minimum number of fundamental truths no matter what else happens?
 
  • #199
But my point was that the phenomenological theory with 100 parameters was more "ad hoc" and therefore less interesting. It is Feynman's argument that the fundamental physical law is U=0, where U is the "unworldliness".
 
  • #200
hutchphd said:
my point was that the phenomenological theory with 100 parameters was more "ad hoc" and therefore less interesting.
But your original question to me in this subthread was why the "ad hoc" requirement for models to match our actual observations doesn't bother me. And my answer is simply that, whether you want to label the requirement as "ad hoc" or not, it seems to me like a requirement that's going to be there regardless of anything else, so why should it bother me? It shouldn't bother anyone. It's necessary to build models at all.
 