PeterDonis said:
Please give a reference to an established interpretation of QM that does not have superposition.

Please give a reference on the TI that makes the same claims you are making. I don't think one exists; I think what you are calling the "Transactional Interpretation" is not the established TI itself, but your own personal version of it. That's why I keep asking you for references.

I'm not. That's why I keep asking you for references. If you point me at a reference (a textbook or peer-reviewed paper) and say "this is the established physical model I'm talking about", then I (and other readers of this thread) can evaluate for myself whether I agree that you're describing it correctly and that it makes the same claims you are making. Without that, all we have is what you claim, and that's not enough.
Hi Peter
I understand - and appreciate - that you are trying to maintain a high quality of content here. But you are going to have to follow the references I give before you complain that I'm not giving you anything.
I'll break it down as much as I can, but I can't do all the reading and thinking for you.
I pointed you to the TI Wikipedia article because it gives you a first point of contact to judge its merit, in case you are not at all familiar with it.
To dig deeper, the very first reference on that page is Cramer's paper, "Transactional Interpretation of Quantum Mechanics", available on ResearchGate (free to download): https://www.researchgate.net/publication/226312851_Transactional_Interpretation_of_Quantum_Mechanics
Even before you download the .pdf, the images shown with the abstract are spacetime diagrams that explain the crux of the concept.
Which is that all interactions are local in a strong sense of "locality" - it's just that the absorber interacts with the emitter via an "advanced wave", meaning a wave propagating backwards in time along the light-cone surface.
So if you understand the topic of superposition, it should already be clear at this point that there is no such thing as "superposition" occurring in the paradigm of this interpretation. "Entanglement" is just what observers call a circumstance where a transaction yielded a seemingly non-local correlation between detection sites. It's really not a very complex concept if you think of it from a "static spacetime" perspective (which is why it's interesting that Einstein never made any comments about this possibility).
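(A compact way to state the crux, as I read Cramer's paper: the emitter sends out a retarded "offer wave" and each potential absorber answers with an advanced "confirmation wave"; the echo arriving back at the emitter has amplitude offer times confirmation, which is why the probability of a completed transaction at a given absorber reproduces the familiar Born rule:)

$$P_{\text{transaction at absorber } i} \;\propto\; \psi_i \, \psi_i^{*} \;=\; |\psi_i|^{2}$$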
That paper is quite long and involved, especially if you are not too familiar with the topic, so I won't blame you if you don't investigate it too carefully. But if you do, it does offer a lot of interesting commentary about the history of quantum interpretations. Unfortunately the PDF is captured as images (as opposed to text), but let me just point out a few interesting details for your benefit;
What he is getting at here is that there's non-locality of a "strong" kind (which would allow establishing simultaneity and thus would conflict with the basic tenet of SR), and a "weak" kind (which shows up as seemingly impossible correlations -> Bell experiments).
He chooses to call the second kind "non-locality" as well. Personally I find it simpler to view locality as having all interactions pass via light-like connections, and thus I'd call TI "local" too. But that's just semantics - the main thing is to understand the topic being discussed, not to agree on which words to use.
This is referring to the exact topic of my post from yesterday - about the philosophical pitfall of considering unobservable things as not existing - even when we have a fully logical reason for that observational limit to exist (in the context of SR, it is the stance that simultaneity of events does not exist).
(Here SV is "State Vector")
So this is a direct reference to Einstein complaining about the loss of the "strong" sense of locality, and he is preparing to point out the difference from the Bell type of non-locality, where no actual superluminal communication can be accomplished, but instead we are merely talking about unexpected correlations between detection events;
Then he establishes the simple fact that if you allow a handshake to occur within a static time block (or, if you wish, you can call it a transaction occurring forward and backward in time), we can explain the case of "enforcement of correlations in spatially separated measurements" (hopefully it's clear, but if not: he means Bell experiments);
And here's a simple schematic of the concept again:
There's no superposition or entanglement here - there is only the idea of a completely static spacetime, where some patterns "look like" enforced correlations, but superposition does not actually occur.
PeterDonis said:
Is not at all what I'm saying. I'm just saying that superposition--and also entanglement, which is really more the issue, as my previous post made clear--is part of standard QM (various interpretations put different meanings on it, but they all have it), so you can't claim that you are somehow making superposition just go away and also claim that you are discussing an established model. You can't have it both ways.
I'm afraid there are multiple topics getting a bit mixed up here. First of all, entanglement is a concept that was pushed into the limelight by the EPR paradox and Bell's theorem.
References to that claim here. It is true that entangled properties are problematic already in the very original QM formalism - EPR merely brought the problem into clearer focus. The Transactional Interpretation - the bog-standard version - is simply an example of an interpretation where there are no instantaneous effects across spacelike-separated events, other than as an illusion to observers who experience a forward flow of time. The point here is that TI falls out directly from Minkowski's interpretation of relativity, where the universe is seen as a static structure (nothing there restricts causal patterns to run only forward in time along light-like surfaces).
This is why I said I'm surprised Einstein never made any comment about this simple possibility. I'm almost certain he would have understood that it exists as a logical possibility, but perhaps he was not too enamored with pushing it as a realistic interpretation of reality.
I'm afraid you may also be confusing TI with some of my other comments about the fact that non-locality appears as a direct consequence of insisting that objects like photons exist in free flight. You see, TI is not the only way to preserve locality and realism, it's just one example. Now I'm sure there are numerous comments in all sorts of published papers and articles alluding to that fact, but I would hope it's quite easy for anyone to think through for themselves too, if they merely understand the basic history of quantum mechanics. After all, it strikes at the very heart of wave-particle duality. But let me spell it out too (some thinking will be required);
The EPR paradox (reference already provided) was formed as an argument against the idea that particles (such as photons) don't have a definite state of their own prior to observation. The many subtle flavors surrounding this idea are extensively discussed in the TI paper I referred to above. What EPR is trying to establish is the idea that there still are particles in reality, and that they must have a definite state prior to observation (an exact 180-degree shift from Einstein's original stance about unobservable things).
Bell's theorem is a response to the EPR argument, and the fundamental issue it points out is that local realism cannot be preserved in any view where those particles exist, with definite states, prior to observation. You see, the state of the system is already completely described by a wave function, and the correlations we get would not be problematic for locality if we didn't have any particles in free flight (but instead only quantized detection events of wave energies). This assertion is nothing but bog-standard physics.
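(To make "local realism" concrete, here is the standard quantitative statement in CHSH form - textbook material, not something specific to my argument: for measurement settings a, a' on one side and b, b' on the other,)

$$S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \ \text{(any local hidden-variable model)}, \qquad |S|_{\text{QM}} \le 2\sqrt{2}$$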
First of all, detection events of EM energy are best described as quantized. This is normally described as an electron being able to absorb energy only in discrete steps (which is often viewed as an indication of the electron being a harmonic wave in itself). The history of this view starts with Max Planck solving the "Ultraviolet Catastrophe" by assuming quantized energy absorption. This is in fact a crucial moment in the history of quantum mechanics - not something I just made up.
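(For reference, the standard textbook formulas behind that statement - not taken from the TI paper: the classical Rayleigh-Jeans spectrum diverges at high frequency, while Planck's assumption of quantized energy exchange in steps of E = hν keeps the spectrum finite:)

$$B^{\text{RJ}}_{\nu}(T) = \frac{2\nu^{2} k_{B} T}{c^{2}} \;\xrightarrow{\;\nu \to \infty\;}\; \infty, \qquad B^{\text{Planck}}_{\nu}(T) = \frac{2 h \nu^{3}}{c^{2}} \, \frac{1}{e^{h\nu / k_{B} T} - 1} \;\xrightarrow{\;\nu \to \infty\;}\; 0$$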
Also, the modern view of transparent materials is exactly that the configuration of the electrons in the material leaves a wide gap in the absorption spectrum right at the visible-light range, which means that EM radiation in that range cannot get absorbed (since absorption is quantized). At the same time, optical refraction of said light is viewed as being caused by the light passing through the material as a wave - interfering with itself and yielding a bent path as the total sum of all the interference. This is also bog-standard physics - not my idea.
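(If it helps to see numbers, here's a small Python sketch of my own comparing visible-photon energies to a band gap. The ~9 eV gap is just an often-quoted ballpark figure for fused silica, used purely as an illustration - it is not a value from any of the references above.)

```python
import math  # not strictly needed here, kept for consistency with the later sketches

# Rough numerical illustration (my own sketch, not from the cited papers).
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Photon energy E = h*c/lambda, expressed in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / EV

BAND_GAP_EV = 9.0  # illustrative band gap for a wide-gap transparent solid

for wl in (400, 550, 700):  # violet, green, red wavelengths in nm
    e = photon_energy_ev(wl)
    print(f"{wl} nm photon: {e:.2f} eV, absorbable: {e >= BAND_GAP_EV}")
```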
References here.
So, if you've never really thought through the actual logic behind Bell's Theorem (and the experiments thereof), it might seem surprising to find this out. But if you have thought them through, you might be aware of something curious. Heck, the TI paper even points this simple fact out;
(p. 654)
The point there is that in the normal wave picture, the "energy loss" (which in the QM formalism represents the probability of observation) is a cosine-squared function of the relative filter angle (Malus's law). This is bog-standard wave mechanics, not something I made up. If you have multiple filters in a chain, the energy loss at the next one is a function of its angle relative to the previous one. The coincidence rate is expected (and also observed) to be the same whether you push energy through chained filters, or place the filters far apart and measure entangled wave-fronts or photons.
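(A minimal sketch of that chained-filter arithmetic, in code so the trigonometry is explicit - my own illustration of Malus's law for ideal polarizers, not anything from Cramer's paper:)

```python
import math

def transmit_chain(intensity, angles_deg):
    """Intensity surviving a chain of ideal linear polarizers.

    The light is assumed to enter already polarized along the first angle;
    each subsequent filter transmits cos^2 of the relative angle (Malus's law).
    """
    for prev, curr in zip(angles_deg, angles_deg[1:]):
        intensity *= math.cos(math.radians(curr - prev)) ** 2
    return intensity

print(transmit_chain(1.0, [0, 30, 60]))  # chained filters: 0.75 * 0.75 = 0.5625
print(transmit_chain(1.0, [0, 60]))      # direct jump: cos^2(60 deg) = 0.25
```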
In the case of photons, there's a well-known puzzle: adding a diagonal filter between two completely orthogonal filters increases the rate of passing "photons" from 0% to 25%. This is an example of wave-particle duality - a case where the naive particle view falls apart. Meanwhile, a wave view yields exactly the expectation we observe (to figure out the expectation you just need basic trigonometry - the same cosine-squared rule).
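(The same rule gives the 0% -> 25% figure directly - a quick worked version, assuming ideal polarizers:)

$$I_{0^\circ,\,90^\circ} = I_0 \cos^2(90^\circ) = 0, \qquad I_{0^\circ,\,45^\circ,\,90^\circ} = I_0 \cos^2(45^\circ)\cos^2(45^\circ) = I_0 \cdot \tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{4} I_0$$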
So if you simply consider all of these ideas together - all already assumed to be valid by modern physics - you should be able to trace down the expectations of a standard Bell experiment. A full wave picture of that experiment yields the exact same correlation expectation via chained filters as it does via spatially separated filters. The probability of absorption at a detection site depends on the filter configuration via a simple cosine-type correlation, as can be traced with basic trigonometry. The QM-formalism expectation is purely a cosine-shaped correlation. Not that surprising, given that quantum mechanics is a wave formalism.
See;
https://en.wikipedia.org/wiki/Bell's_theorem#/media/File:Bell.svg
The linear correlations in this picture refer specifically to the idea that particles have a definite state in flight. The non-linear correlation refers to the QM expectation, where only wave propagation is modeled.
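(In case it's useful, here's a small sketch of mine of the two curves in that figure - the spin-1/2 singlet case it depicts, as I read it - so you can see where they differ; the function names are my own, not from any reference:)

```python
import math

def qm_correlation(theta_deg):
    """Quantum prediction for the spin-1/2 singlet: E(theta) = -cos(theta)."""
    return -math.cos(math.radians(theta_deg))

def linear_local_model(theta_deg):
    """The piecewise-linear 'best imitation' curve: -1 at 0 deg rising to +1 at 180 deg."""
    return -1.0 + theta_deg / 90.0

for angle in (0, 45, 90, 135, 180):
    print(f"{angle:3d} deg   QM: {qm_correlation(angle):+.3f}   "
          f"linear model: {linear_local_model(angle):+.3f}")
```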
That simply means that in a picture where you have no particles, your expectation is the wave expectation. Nothing more, nothing less. Not my idea, just pointing out features of Bell's theorem. So you see, the only case where that picture of "enforced correlations" yields an implication of non-locality is the case where the detection events - which are presumed to be quantized in themselves - are presumed to have been caused by particles instead of waves. The idea of superposition and "spooky action at a distance" requires the assumption of particles in free flight - a pure wave picture does not yield those concepts.
This is the entire point of so-called wave-particle duality; that concept exists because the particle description doesn't cut it in situations like these, and the wave picture in others. Various interpretations of QM really are fundamentally just proposals about where the limit between these behaviors lies. The idea that photons might not be a good concept is not very controversial either - of course, many physicists make comments about how neither picture is really suitable. Also, it's a good idea to remind oneself of the sociological history of the concept of photons. Reference to a published article here.
So, to get back to my point about this - it is very peculiar that everyone in the history of QM seemed to insist that things we cannot directly observe exist (even when we can't see behavior that would suggest they do), but at other times made a whole lot of fuss about unobservable things not existing. It is also interesting that everyone seems to think that wave-particle duality is an easier concept to reconcile with realism than the idea that there are no free-flight quantized carrier entities. I wouldn't consider that a foregone conclusion myself (<- but that's just an opinion right there)
What I would have asked everyone back in the day is: what happens to the excess energy of a "photon" when an "electron" cannot absorb all of it?

(That's a reference to energy conservation - not my idea)
Sorry about the length, but you keep asking for references to pretty standard things, so I can't know which parts of physics you already understand and which parts need to be spelled out... But it's always fun to discuss this, so thanks!
Regards
-Anssi