Are there signs that any Quantum Interpretation can be proved or disproved?

  • #51
vanhees71 said:
the idea that this is actually the "observable" is also not convincing, because that may be true in a "thermal sense", i.e., when you consider macroscopic observables, where the fluctuations are "negligibly small" because you "coarse grain" over large enough space-time volumes, and in this sense your interpretation is indeed really "thermal",
Thus it applies to the measurement results, which are read off from macroscopic thermal objects...
vanhees71 said:
but it doesn't apply to microscopic objects, for which we want to use and interpret quantum theory.
... even when what is measured is a microscopic degree of freedom, by design of the detector strongly correlated with some macroscopic detector property. This is precisely the condition that allows us to speak of a measurement.
 
  • #52
@A. Neumaier: If we measure properties of macroscopic bodies, things like expectation values are usually not thought to be beables but epistemic quantities. In principle, such measurement processes should also admit a quantum description where your thermal interpretation treats expectation values as beables. How is this reconciled?
 
  • #53
timmdeeg said:
In his book "Einstein's Schleier", Zeilinger says, in essence, that it is sufficient to understand the wave function just as a mental construct, so that its collapse doesn't happen in real space. I was never sure whether that is his personal view. It seems to fit, though, with what you call the "orthodox minimal interpretation", right?
The detected positions correspond 1:1 to what you would see if the collapse happened in real space.

That's as good as a model can get.

You can't just brush aside the evidence.
 
  • #54
kith said:
@A. Neumaier: If we measure properties of macroscopic bodies, things like expectation values are usually not thought to be beables but epistemic quantities. In principle, such measurement processes should also admit a quantum description where your thermal interpretation treats expectation values as beables. How is this reconciled?
Nothing in the abstract formalism forces us to interpret the trace of ##\rho A## as an expectation value. A historically unbiased name for this number is 'value of ##A## in the state ##\rho##' - this is its literal mathematical meaning when treating the state ##\rho## as a linear functional on an algebra of observables - leaving the additional qualification 'expectation' to statistical interpretations.

In the thermal interpretation, the traditional name 'expectation value' is therefore just a historical leftover from the old days when the statistical interpretation was thought to be the only reasonable one. I often use 'q-expectation value' to emphasize that quantum expectation values have the name but not the meaning in common with statistical expectation values.
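A minimal numerical illustration of this reading (the state and observable below are made up for the example; nothing here is specific to the thermal interpretation): the number is just the trace, with no ensemble anywhere in sight.

import numpy as np

# A made-up qubit density matrix (Hermitian, positive, trace 1) and an observable.
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])
A = np.array([[1.0, 0.0],
              [0.0, -1.0]])   # Pauli-z as the observable

# The 'value of A in the state rho' in the sense described above:
value = np.trace(rho @ A).real
print(value)   # 0.4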
 
  • Like
Likes gentzen
  • #55
I'm not sure if I understand this correctly. Does the thermal interpretation say that all quantum mechanical quantities which are traditionally thought of as statistical are beables and that statistics is relevant only in the classical description of measurement devices?
 
  • #56
EPR said:
The detected positions correspond 1:1 to what you would see if the collapse happened in real space.
But this notion (in contradiction to special relativity) means that energy spread out in real space would collapse to a point instantaneously. That is exactly Zeilinger's argument. QM makes no claim about the "real space" issue.
 
  • Like
Likes gentzen
  • #57
kith said:
I'm not sure if I understand this correctly. Does the thermal interpretation say that all quantum mechanical quantities which are traditionally thought of as statistical are beables and that statistics is relevant only in the classical description of measurement devices?
Not quite.

Whatever is traditionally a statistical expectation value is in the thermal interpretation a q-expectation value and hence a beable. But one can do statistics even on quantum beables, not only on classical ones. In this way one recovers in the thermal interpretation the statistical interpretation of quantum mechanics in those situations where it applies - namely when one has a large supply of instances in identically prepared states.
 
  • Like
Likes gentzen and kith
  • #58
@A. Neumaier: Would you say that the "thermal interpretation" is independent of whether or not the projection postulate holds?
 
  • #59
timmdeeg said:
But this notion (in contradiction to special relativity) means that energy spread out in real space would collapse to a point instantaneously. That is exactly Zeilinger's argument. QM makes no claim about the "real space" issue.
Yes. If you supposed the system had those properties (e.g. energy) before measurement, you'd get some nonlocality.
 
  • Like
Likes timmdeeg
  • #60
A. Neumaier said:
Thus it applies to the measurement results, which are read off from macroscopic thermal objects...

... even when what is measured is a microscopic degree of freedom, by design of the detector strongly correlated with some macroscopic detector property. This is precisely the condition that allows us to speak of a measurement.
My problem with the approach you call the "thermal interpretation" is still that it is not clear what the operational meaning of your expectation values is: you stress several times that they are not considered to have the usual probabilistic meaning, but what then is their operational meaning?

What I like about the approach in principle is that it attempts to describe the measurement process on a quantum-theoretical basis. If this could be worked out into a convincing physical picture, I think it would be real progress.

The advantage of the orthodox minimal interpretation is that it starts from clear operational concepts, i.e., the expectation values have a clear probabilistic meaning, and a measurement device is a real-world physical object, not some abstract mathematical construction like a POVM. The latter is needed for the cases that you are qualitatively describing, but it's never worked out how to construct the POVM for a given real-world (say quantum optical) apparatus like a beam splitter, mirrors, lenses, a photodetector, etc. In the standard approach (see, e.g., the textbooks by Scully and Zubairy or Garrison and Chiao), all these elements are pragmatically described by effective quantized classical models. To some extent you can also derive it from quantum many-body theory from first principles, though that is of course pretty tough.
 
  • Like
Likes dextercioby
  • #61
WernerQH said:
No, I am not an instrumentalist.
I'm an instrumentalist and a Bohmian. Looks contradictory, but it isn't. :smile:
(See the link in my signature.)
 
  • #62
If you are an instrumentalist you cannot be a Bohmian at the same time, or how are you measuring Bohm's trajectories for a single particle (sic!) in the real-world lab?
 
  • #63
Could you please clarify the difference between "instrumentalist" and "orthodox minimal interpretation" in a short way?
 
  • #64
timmdeeg said:
Could you please clarify the difference between "instrumentalist" and "orthodox minimal interpretation" in a short way?
You probably have to distinguish between the pejorative use of the term "instrumentalist" and the positive aspects actual instrumentalists see in it. I was not always an instrumentalist. I only became one after studying and extensively using M. Born and E. Wolf, Principles of Optics. My guess is that Max Born was an instrumentalist, and Arnold Sommerfeld too. They were excellent in their mastery of mathematics (Sommerfeld's rigorous solution of light scattering on the perfectly conducting half plane is incredible), and cared about what they could compute and do with their mathematics.

Let me take incoherent light as an example. There is no such thing as perfectly incoherent light, at least not in theory. Yet there are many scenarios in practice that are most appropriately modeled by using the incoherent limit. So an instrumentalist like me might draw a picture of two isolated wave packets that miss each other, and write a reassuring text like "If each wave train is alone, it cannot interfere with another wave train" (see last slide of attached pdf). But I knew that it was a lie, it was not how I thought about it myself. For me, it was not important whether the wave trains missed each other or not. It was only important that their frequencies were not exactly identical, so that the interference effects would average out in the end. But will they really exactly average out? Of course not, but I don't care.
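To make the averaging argument concrete (my own back-of-the-envelope version, not taken from the attached slides): for two superposed waves of slightly different frequencies, the time-averaged intensity is
$$\overline{\left|E_1 e^{-i\omega_1 t}+E_2 e^{-i\omega_2 t}\right|^2}=|E_1|^2+|E_2|^2+2\,\mathrm{Re}\!\left[E_1E_2^*\,\overline{e^{-i(\omega_1-\omega_2)t}}\right],$$
and the cross term is only suppressed, by a factor of order ##1/(T\,|\omega_1-\omega_2|)## for an averaging time ##T\gg 1/|\omega_1-\omega_2|## - small, but not exactly zero, which is exactly the point.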
 

Attachments: CoherentVsIncoherent.pdf

  • Like
Likes dextercioby, vanhees71 and timmdeeg
  • #65
timmdeeg said:
This article seems to question the universal validity of the projection postulate.

If correct, would this affect the interpretations of QM?

https://link.springer.com/article/10.1007/s10701-021-00452-x
A wave function is not a "thing". Neither is it a complete description of reality. Schonfeld seems to think otherwise and feels compelled to derive something inherently non-deterministic from Schrödinger's equation. What strikes me as particularly unphysical is the idea that an MeV alpha-particle, setting out as a spherical wave, can be "lured" into a particular direction by processes at the eV-scale in a metastable medium.

Something that is rarely discussed is the difference between the time-dependent and the time-independent Schrödinger equation. The stationary solutions of the time-independent Schrödinger equation are completely uncontroversial, and almost everybody agrees that those wave functions represent a time average. (The only time-dependence is an irrelevant phase factor, and of course this does not imply that the electrons are at rest.) Such a wave function describes a statistical ensemble.

Many people instantly forget this when they turn to the time-dependent Schrödinger equation. The wave function then seems to acquire a new meaning: describing an individual electron. The time-dependent Schrödinger equation describes the continuous and deterministic evolution of "something", which is completely at odds with the abruptness and randomness of processes in the real world. Quantum theory is more than Schrödinger's equation: it is a stochastic theory giving us averages and expectation values. "Measurement" and the resulting collapse of the wave function were introduced only to uphold the fiction that the wave function represents an individual system; they are superfluous adornments of the formalism.

A wave function is the best description we have of a beam of completely polarized particles; it is "complete" in the sense that there is no more comprehensive description. But it represents only a statistical ensemble. I think that the discussion of the time-dependence of wave functions is misleading. (In the Heisenberg picture the time-dependence disappears altogether.) It's also worth pointing out that the Born rule arose from the consideration of stationary solutions of the scattering problem (solutions of the time-independent Schrödinger equation).
 
  • Like
Likes timmdeeg
  • #66
timmdeeg said:
@A. Neumaier: Would you say that the "thermal interpretation" is independent of whether or not the projection postulate holds?
The thermal interpretation implies that the projection postulate holds only under special circumstances, called von Neumann measurements. It is well-known that many measurements are not of this kind; any of these disproves the general validity of the projection postulate.
 
  • Like
Likes vanhees71, timmdeeg and gentzen
  • #67
A. Neumaier said:
The thermal interpretation implies that the projection postulate holds only under special circumstances, called von Neumann measurements. It is well-known that many measurements are not of this kind; any of these disproves the general validity of the projection postulate.
Didn't the Eraser experiment specifically disprove the existence of these 'other kinds of measurements'?
 
  • #68
I don't know which "eraser experiment" you mean, but what should it have disproven? On the contrary, the examples I know are all in very good agreement with the quantum theoretical predictions, and for sure here the projection postulate doesn't hold (as for nearly any experiment involving photons), because the photons are not in an eigenstate of the measured observable after the measurement but are absorbed by the detector ;-)).
 
  • #69
I worded it wrong, I guess. I meant that the quantum eraser experiment suggests that measurements do not rely on a classical apparatus for collapse. And those types of measurements (von Neumann measurements), which are purely quantum measurements between two strictly quantum systems (not classical ones), likely do not constitute measurements, as they depend on the presence of which-way information and not on the presence of a classical apparatus.
Von Neumann at some point concluded that this setup was unsatisfactory and made the somewhat controversial move of bringing consciousness into the scheme.

Sorry if this is an incomplete account; I am trying to better understand von Neumann's reasoning, if I have grasped it correctly.
 
  • #70
vanhees71 said:
If you are an instrumentalist you cannot be a Bohmian at the same time, or how are you measuring Bohm's trajectories for a single particle (sic!) in the real-world lab?
You can play an instrument like the violin and be a Bohmian. :p
 
  • Haha
Likes Delta2 and vanhees71
  • #71
vanhees71 said:
My problem with the approach you call the "thermal interpretation" is still that it is not clear what the operational meaning of your expectation values is: you stress several times that they are not considered to have the usual probabilistic meaning, but what then is their operational meaning?

In orthodox quantum mechanics, most Hermitian operators for a multiparticle system cannot be prepared (only finitely many can, but there are uncountably many), but you still accept their probabilistic interpretation without an operational recipe.

Thus why do you require more from the thermal interpretation? Some of the beables have an operational meaning, e.g., smeared q-expectation values of the electromagnetic field, or 2-point functions through linear response theory.
vanhees71 said:
it's never worked out how to construct the POVM for a given real-world (say quantum optical) apparatus like a beam splitter, mirrors, lenses, a photodetector, etc.
This is not true. There is a large literature on how to calibrate POVMs to correspond to actual equipment using quantum tomography.
 
  • #72
A. Neumaier said:
Thus why do you require more from the thermal interpretation? Some of the beables have an operational meaning, e.g., smeared q-expectation values of the electromagnetic field, or 2-point functions through linear response theory.
My impression is that the kind of correlations you have in mind for your beables are similar to the mutual intensity function used in computational optics to handle partially coherent (quasi monochromatic) light. Those do have an operational meaning, but it is non-trivial and it needs to be explained how those can be measured given suitable circumstances. (The CoherentVsIncoherent.pdf attached to my previous comment about incoherent light and instrumentalism also contains definitions of mutual coherence and mutual intensity functions.)

The reference to linear response theory is "challenging" for me personally. I would now have to read the
two referenced papers, one with 13 pages and some relevant sections from one with 102 pages. But from browsing those papers, my suspicion is that the answer from those papers will not be good enough to help me really understand what you have in mind.

My impression is that others had similar problems, and tried to clearly explain why they don't get it:
Demystifier said:
I have already discussed the ontology problem of thermal interpretation (TI) of quantum mechanics (QM) several times in the main thread on TI.

For correlation-beables similar to the mutual intensity function, I additionally fear that only those correlations between observables whose (anti-)commutator nearly vanishes will have a comparable operational meaning. My guess is that the plan for the remaining beables is to fall back to:
Callen’s criterion: Operationally, a system is in a given state if its properties are consistently described by the theory for this state.

I am not sure whether this is good enough. I personally would probably prefer to throw out those beables whose only operational meaning is provided by Callen's criterion.
 
  • Like
Likes timmdeeg and Demystifier
  • #73
vanhees71 said:
If you are an instrumentalist you cannot be a Bohmian at the same time, or how are you measuring Bohm's trajectories for a single particle (sic!) in the real-world lab?
I don't measure Bohm's trajectories, just like an ordinary instrumentalist does not measure the wave function. The trajectories, like the wave function, are a tool (an "instrument"). But the trajectories are not merely a computational tool. They are much more a thinking tool, a tool that helps me think about quantum theory intuitively.
 
  • #74
A. Neumaier said:
In orthodox quantum mechanics, most Hermitian operators for a multiparticle system cannot be prepared (only finitely many can, but there are uncountably many), but you still accept their probabilistic interpretation without an operational recipe.

Thus why do you require more from the thermal interpretation? Some of the beables have an operational meaning, e.g., smeared q-expectation values of the electromagnetic field, or 2-point functions through linear response theory.

This is not true. There is a large literature on how to calibrate POVMs to correspond to actual equipment using quantum tomography.
What do you mean by "the operators can't be prepared"? Of course not, because there are no operators in the lab, but there are observables, and I can measure them in the lab with real-world devices with more or less accuracy. I can also prepare systems in states such that some observable(s) take more or less determined values (what I can't do, of course, is a preparation which would violate the uncertainty relations of the involved observables).

I'm not criticizing the vast literature about POVMs; I for sure don't know enough about it. I'm criticizing your approach as a foundational description of what quantum mechanics is. For a physicist it must be founded in phenomenology, i.e., you never say what your expectation values are if not defined in the standard way. If you discard the standard definition, which is usually understood and founded in an operational way on phenomena, you have to give an alternative operational definition, which you however don't do. I'm pretty sure that the edifice is sound and solid mathematically, but it doesn't make sense to me as an introduction of quantum theory as a physical theory. This may well be due to my ignorance though.
 
  • #75
vanhees71 said:
What do you mean by "the operators can't be prepared"?
I was referring to the common practice of calling self-adjoint operators observables.
vanhees71 said:
Of course not, because there are no operators in the lab, but there are observables, and I can measure them in the lab with real-world devices with more or less accuracy.
Some observables can be measured in the lab, and they sometimes correspond to operators, more often only to POVMs.

However, most observables corresponding to operators according to Born's rule cannot be observed in the lab!

vanhees71 said:
I'm criticizing your approach as a foundational description of what quantum mechanics is. For a physicist it must be founded in phenomenology,
Why? It must only reproduce phenomenology, not be founded in it.

When the atomic hypothesis was proposed (or rather revitalized), atoms were conceptual tools, not observable items. They simplified and organized the understanding of chemistry, hence their introduction was good science - though not founded in phenomenology beyond the requirement of reproducing the known phenomenology.

Similarly, energy is basic in the foundations of physics but has no direct phenomenological description. You already need a theory founded on concepts prior to phenomenology to be able to tell how to measure energy differences.

Of course these prior concepts are motivated by phenomenology, but they are not founded in it. Instead they determine how phenomenology is interpreted.

vanhees71 said:
i.e., you never say what your expectation values are if not defined in the standard way.
They are numbers associated with operators. This is enough for working with them and for obtaining all quantum phenomenology.

Tradition instead never says what the probabilities figuring in Born's rule (for arbitrary self-adjoint operators) are in terms of phenomenology since for most operators these probabilities cannot be measured. Thus there is the same gap that you demand to be absent, only at another place.

vanhees71 said:
If you discard the standard definition, which is usually understood and founded in an operational way on phenomena, you have to give an alternative operational definition, which you however don't do. I'm pretty sure that the edifice is sound and solid mathematically, but it doesn't make sense to me as an introduction of quantum theory as a physical theory.
As @gentzen mentioned in post #72, Callen's criterion provides the necessary and sufficient connection to phenomenology. Once Callen's criterion is satisfied you can do all of physics - which proves that nothing more is needed.

If you require more you need to justify why this more should be essential for physics to be predictive and explanative.
 
Last edited:
  • Like
Likes mattt and dextercioby
  • #76
gentzen said:
Callen’s criterion: Operationally, a system is in a given state if its properties are consistently described by the theory for this state.

I am not sure whether this is good enough. I personally would probably prefer to throw out those beables whose only operational meaning is provided by Callen's criterion.
The problem with this is that the foundations must be independent of the state of the art in experimental practice. But Callen's criterion gets stronger with improvements in experiments. Hence what the beables are would change with time, which is not good for a foundation.

The criterion for beables in the thermal interpretation is simple and clear - both properties making it eminently suitable for foundations.
 
  • #77
A. Neumaier said:
I was referring to the common practice of calling self-adjoint operators observables.

Some observables can be measured in the lab, and they sometimes correspond to operators, more often only to POVMs.

However, most observables corresponding to operators according to Born's rule cannot be observed in the lab!

Why? It must only reproduce phenomenology, not be founded in it.

When the atomic hypothesis was proposed (or rather revitalized), atoms were conceptual tools, not observable items. They simplified and organized the understanding of chemistry, hence their introduction was good science - though not founded in phenomenology beyond the requirement of reproducing the known phenomenology.
Of course they were founded in phenomenology, and only accepted as a hypothesis by chemists but not by many physicists. I think most physicists only got convinced by Einstein's work on thermodynamical fluctuations, like his famous Brownian-motion paper or the critical opalescence paper, etc.
A. Neumaier said:
Similarly, energy is basic in the foundations of physics but has no direct phenomenological description. You already need a theory founded on concepts prior to phenomenology to be able to tell how to measure energy differences.

Of course these prior concepts are motivated by phenomenology, but they are not founded in it. Instead they determine how phenomenology is interpreted.

They are numbers associated with operators. This is enough for working with them and for obtaining all quantum phenomenology.

Tradition instead never says what the probabilities figuring in Born's rule (for arbitrary self-adjoint operators) are in terms of phenomenology since for most operators these probabilities cannot be measured. Thus there is the same gap that you demand to be absent, only at another place.
The traditional statistical interpretation of expectation value is very simple. You learn it on day one in the introductory physics lab. You measure a quantity several times on the same system under the same conditions ("ensemble") and take the average.
A. Neumaier said:
As @gentzen mentioned in post #72, Callen's criterion provides the necessary and sufficient connection to phenomenology. Once Callen's criterion is satisfied you can do all of physics - which proves that nothing more is needed.

If you require more you need to justify why this more should be essential for physics to be predictive and explanative.
 
  • #78
vanhees71 said:
Of course they were founded in phenomenology, and only accepted as a hypothesis by chemists but not by many physicists.
Motivated and perhaps suggested by, but not founded, since they were not observable; only their consequences matched experiment. You needed to assume unobservable - nonphenomenological - atoms, and deduce from the theory predictions for them (most often actually retrodictions) that were in agreement with experiments.

Thus the relation between concepts and phenomenology is through Callen's principle, as in my thermal approach to quantum mechanics. Both are equally founded in phenomenology, and through the same argument: Theory predicts correct results, hence is appropriate.

vanhees71 said:
The traditional statistical interpretation of expectation value is very simple. You learn it on day one in the introductory physics lab. You measure a quantity several times on the same system under the same conditions ("ensemble") and take the average.
This gives expectation values of a few observables only.

But Born's rule makes claims about observables corresponding to arbitrary self-adjoint operators - most of which are inaccessible to experiment. Thus their phenomenological meaning is completely absent.

Moreover, in QFT one uses (and even you use!) expectation terminology for N-point functions which correspond to non-Hermitian operators, for which Born's rule and the statistical interpretation are inapplicable! The thermal interpretation, in contrast, has no problem with this.
 
  • Like
Likes mattt and dextercioby
  • #79
I don't argue about the mathematics of your interpretation. I only say I don't understand its foundation in the phenomenology. The paper you cited doesn't give a concrete treatment of how to get the POVMs for the examples you quote, nor do you give an operational meaning of the POVMs in general. The orthodox treatment does this by clearly saying what the probabilities are for a measurement result with an idealized detector. Of course, for real-world detectors you need to understand their limitations and describe them accordingly, but that's not part of the general formulation of a theory.

In QFT the N-point functions are not observables but functions you calculate (in some approximation like perturbation theory or resummed perturbation theory etc.) to get the observable quantities like the S-matrix, providing decay rates and cross sections which can be measured.

Also in classical electrodynamics we pretty often calculate quantities that do not describe observables, like the scalar and vector potentials of the em field, because it's simpler than directly calculating the observable fields.
 
  • Like
Likes physicsworks, gentzen and WernerQH
  • #80
vanhees71 said:
In QFT the N-point functions are not observables but functions you calculate (in some approximation like perturbation theory or resummed perturbation theory etc.) to get the observable quantities like the S-matrix, providing decay rates and cross sections which can be measured.
One can say the same about all q-expectations in the thermal interpretation. They don't need any further justification - their name is as coincidental as using the expectation terminology for the QFT n-point functions.

That q-expectations are considered to be beables only means that some of them can be accurately observed (namely if they are macroscopic and accessible to humans). This is enough to obtain observable quantities.
vanhees71 said:
The paper you cited doesn't give a concrete treatment of how to get the POVMs for the examples you quote, nor do you give an operational meaning of the POVMs in general.

Although I may have failed to emphasize it in the paper (it is stated only in a footnote - Footnote 8), this is not true:

The proof of Theorem 1.1 on p.8 of my paper Born’s rule and measurement shows how to get a POVM for an arbitrary experimentally realized response system. Prepare enough states with known density matrices (giving the ##\rho_{ij}##) and collect for them enough statistics to determine the probabilities (giving the ##p_k##) then solve the linear system (5) to get the POVM. This is the standard quantum tomography principle.
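A purely illustrative numerical sketch of this recipe (the function name and the least-squares step are my own choices, not taken from the paper; the paper's equation (5) is the exact-data version of the same linear system):

import numpy as np

# Detector tomography sketch: given known preparation states rhos[i] and
# measured outcome frequencies probs[i, k] ~ Tr(rho_i P_k), estimate the
# POVM elements P_k by solving the linear system in the entries of P_k.
def estimate_povm(rhos, probs):
    d = rhos[0].shape[0]
    # For Hermitian rho, Tr(rho P) = vec(rho)^dagger vec(P) (row-major flattening).
    A = np.array([rho.conj().reshape(-1) for rho in rhos])
    povm = []
    for k in range(probs.shape[1]):
        vecP, *_ = np.linalg.lstsq(A, probs[:, k], rcond=None)
        P = vecP.reshape(d, d)
        povm.append((P + P.conj().T) / 2)  # symmetrize; positivity/completeness still to be enforced
    return povm

One needs at least ##d^2## linearly independent preparation states for the reconstruction to be unique; with statistical noise, a least-squares (or maximum-likelihood) fit replaces the exact solution.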

The operational meaning is given by the theorem itself, which says how to get predictions for the probabilities from an arbitrary POVM, which can be compared with experiment.

No idealization is involved, except for the limited accuracy necessarily inherent in statistical estimation, due to the limitations imposed by the law of large numbers.
 
Last edited:
  • #81
Well, if your q-expectations are only calculational tools like the n-point functions in QFT, then there's no physics in this interpretation at all, because then it is nowhere said what is to be compared to real-world observables and phenomenology. That's my problem with this new attempt at an interpretation. In other words, there are no "beables" left in your interpretation (though I hate this philosophical lingo without a clear physical meaning, unfortunately introduced by Bell).

On the other hand, now all of a sudden you admit exactly what I have been saying the whole time: there's a probabilistic meaning in the formalism (of the q-expectations, I think, but that's what you otherwise always deny), and it's tested by preparing systems in a given state, measuring observables with real-world experiments, and analyzing the outcome in a statistical way. Then there's nothing new in your interpretation beyond the usual practice of the scientific community, but then it makes sense as a physical theory.

Do I now understand it right that all you want to do is to use POVMs as the probabilistic foundation of QT instead of idealized projective measurements? If this is the case, then you have to sharpen this foundation such that physicists understand what it has to do with their real-world measurements in the lab.

It's also of course not very convincing if you provide an interpretation depending on the technical state of the art of measurement devices. Physical theories are independent of this. Technological progress may make it possible to disprove a theory, and a good theory makes predictions that enable this possibility.
 
  • #82
A. Neumaier said:
gentzen said:
I personally would probably prefer to throw out those beables whose only operational meaning is provided by Callen's criterion.
The problem with this is that the foundations must be independent of the state of the art in experimental practice. But Callen's criterion gets stronger with improvements in experiments, Hence what are beables would change with time, which is not good for a foundation.
OK, independent of whether others had similar problems, I should try to only speak for myself. My mental images are heavily influenced by computational (often statistical) optics.

A corresponding question in the optics context would be the status of evanescent waves as beables. Already the question of whether a wave is evanescent has no sharp answer: a wave that is evanescent in vacuum can be optical inside a material. (So a photoresist layer in close proximity to a contact mask could still couple in some of the evanescent waves present in the vacuum between the mask and the photoresist.)

But ... the refractive index of a photoresist typically will be around 1.7, sometimes maybe 1.9. OK, there are piezoelectric materials like PZT, whose relative permittivity can range from 300 to 20000 (depending upon orientation and doping), so the refractive index can range from 17 to 140. However, this does not help, because strongly evanescent waves would be unable to directly couple-in from vacuum. (Your remark about experimental practice could now mean the invention of some wonder material that improves the in-coupling of evanescent modes so that they are at least no longer exponentially suppressed.)

My conclusion from these concrete practical considerations is that evanescent waves don't abruptly lose their status as beables. However, strongly evanescent waves do get exponentially suppressed, there is no realistic way around it, and it might make sense to suppress their status as beables similarly. One further argument for suppressing their status as beables is that the evanescent waves are less a property of the incoming partially coherent light and more a property of the experimental setup, i.e. the contact mask in my example above.

That last point is somewhat related to why "I additionally fear that only those correlations between observables whose (anti-)commutator nearly vanishes will have a comparable operational meaning". In those cases, the q-correlations are much more related to the Hamiltonian than to the state of the system.
 
  • #83
It's a bit off-topic, but I wonder, how you can produce an evanescent wave in vacuo? There are of course evanescent waves in wave guides, but that's not vacuum.
 
  • Like
Likes gentzen
  • #84
vanhees71 said:
It's a bit off-topic, but I wonder, how you can produce an evanescent wave in vacuo?
Well, it might actually be a valuable clarification. My first thought was to use a diffraction grating whose pitch is so small that the first diffraction orders are already evanescent. If one now places a photoresist directly behind the mask, one might observe the intensity distribution resulting from the interference between the zeroth and first orders in the photoresist. One problem might be that the zeroth order is too dominant so that not much can be seen.

A better idea might be to use a glass substrate for the mask such that the evanescent wave you want to produce is still an optical wave inside the glass. Now use a diffraction grating (on a flat surface of your glass substrate) such that one of the first diffraction orders is an evanescent mode with exactly the opposite wave-number of the incident wave. Now you should be able to observe much more pronounced interference patterns in the photoresist.

This is related to the most trivial way to generate an evanescent wave in vacuum: Use a prism and tilt the incident wave sufficiently such that the refracted wave is already evanescent in vacuum.
 
  • #85
vanhees71 said:
Well, if your q-expectations are only calculational tools like the n-point functions in QFT, then there's no physics in this interpretation at all, because then it is nowhere said what is to be compared to real-world observables and phenomenology. That's my problem with this new attempt at an interpretation. In other words, there are no "beables" left in your interpretation (though I hate this philosophical lingo without a clear physical meaning, unfortunately introduced by Bell).
Well, beables are calculational tools for making predictions in quantum physics, just as atoms were for Chemists in the early days of modern chemistry.
vanhees71 said:
On the other hand, now all of a sudden you admit exactly what I have been saying the whole time: there's a probabilistic meaning in the formalism (of the q-expectations, I think, but that's what you otherwise always deny), and it's tested by preparing systems in a given state, measuring observables with real-world experiments, and analyzing the outcome in a statistical way.
There is a probabilistic meaning in POVMs - they describe the probabilistic part of quantum mechanics (i.e., the part describable by the minimal statistical interpretation). I never had any other place for them.

But the thermal interpretation goes far beyond POVMs, because measurement devices are not made out of POVMs but out of quantum matter. So there must be a microscopic explanation for the ''problem of definite outcomes'' - why detectors produce objectively identifiable (though random) signals in each particular measurement. The minimal statistical interpretation (with a subjective view of the state as representing knowledge) has no explanation for this - it is an irreducible additional input to its view of quantum physics. But the thermal interpretation answers this.
vanhees71 said:
Do I now understand it right that all you want to do is to use POVMs as the probabilistic foundation of QT instead of idealized projective measurements?
This is all I want to do with POVMs - because it gets rid of the idealization and at the same time simplifies the exposition of the foundations.

But with the thermal interpretation I want to do more. I want to give a good explanation for the empirical fact that one does not need to measure an ensemble of many identical iron cubes (as Born's rule requires) and always find essentially the same result in order to reliably obtain the properties of a single iron cube. As every engineer knows, a single measurement is reliable enough. The statistical interpretation is silent about the single case, even when it is macroscopic.
vanhees71 said:
If this is the case, then you have to sharpen this foundation such that physicists understand what it has to do with their real-world measurements in the lab.
I described the connection with real-world measurements in the lab through quantum tomography, and explained it again in the present thread. What is not sharp enough in these foundations?
vanhees71 said:
It's also of course not very convincing if you provide an interpretation depending on the technical state of the art of measurement devices. Physical theories are independent of this.
My interpretation is also independent of this, as the paper shows. Dependent on the technical state of the art of measurement devices is only which POVMs are realizable, and with which accuracy.
 
  • Like
Likes gentzen and mattt
  • #86
A. Neumaier said:
But the thermal interpretation goes far beyond POVMs, because measurement devices are not made out of POVMs but out of quantum matter. So there must be a microscopic explanation for the ''problem of definite outcomes'' - why detectors produce objectively identifiable (though random) signals in each particular measurement. The minimal statistical interpretation (with a subjective view of the state as representing knowledge) has no explanation for this - it is an irreducible additional input to its view of quantum physics. But the thermal interpretation answers this.
Could you elaborate a bit on "So there must be a microscopic explanation for the ''problem of definite outcomes'' - why detectors produce objectively identifiable (though random) signals in each particular measurement" in plain language?

Is there some kind of interaction between the wavefunction and the detector on the microscopic level such that the detector "feels" the probability of a given outcome and creates it? (On this level atoms and molecules are vibrating and "feel" electromagnetic radiation, so that this question seems to make no sense unless new physics is involved).

In the double slit experiment the position of a dot on the screen corresponds to its probability. Does, according to the thermal interpretation, the whole screen act as a detector in this case?
 
  • #87
timmdeeg said:
Could you elaborate a bit on "So there must be a microscopic explanation for the ''problem of definite outcomes'' - why detectors produce objectively identifiable (though random) signals in each particular measurement" in plain language?
There must be a microscopic explanation for the ''problem of definite outcomes'' because measurement devices are made out of quantum matter, so they are described by a quantum state. The observed pointer position is a property of the measurement device. According to the statistical interpretation all we can know about the quantum system constituted by the measurement device is encoded in its quantum state. The ''problem of definite outcomes'' is to show how this quantum state encodes the definite observed pointer position, and how the unitary dynamics postulated by quantum physics leads to such a definite observed pointer position.

The statistical interpretation has no answer for this but simply assumes it as an irreducible fact - in addition to the quantum dynamics and the state interpretation.
timmdeeg said:
Is there some kind of interaction between the wavefunction and the detector on the microscopic level such that the detector "feels" the probability of a given outcome and creates it? (On this level atoms and molecules are vibrating and "feel" electromagnetic radiation, so that this question seems to make no sense unless new physics is involved).
Quantum theory of course tells what this interaction is. But it does not tell how this interaction actually achieves the observed definite pointer position.
timmdeeg said:
In the double slit experiment the position of a dot on the screen corresponds to its probability.
No. The dot corresponds to a particular position measured. The probability comes from counting the frequency and distribution of the dots.
timmdeeg said:
Does, according to the thermal interpretation, the whole screen act as a detector in this case?
It does according to every interpretation.
 
Last edited:
  • Like
Likes timmdeeg and mattt
  • #88
gentzen said:
Well, it might actually be a valuable clarification. My first thought was to use a diffraction grating whose pitch is so small that the first diffraction orders are already evanescent. If one now places a photoresist directly behind the mask, one might observe the intensity distribution resulting from the interference between the zeroth and first orders in the photoresist. One problem might be that the zeroth order is too dominant so that not much can be seen.

A better idea might be to use a glass substrate for the mask such that the evanescent wave you want to produce is still an optical wave inside the glass. Now use a diffraction grating (on a flat surface of your glass substrate) such that one of the first diffraction orders is an evanescent mode with exactly the opposite wave-number of the incident wave. Now you should be able to observe much more pronounced interference patterns in the photoresist.

This is related to the most trivial way to generate an evanescent wave in vacuum: Use a prism and tilt the incident wave sufficiently such that the refracted wave is already evanescent in vacuum.
I still don't understand how there can be evanescent em. waves in the vacuum. For me an evanescent wave is a non-propagating field like in a wave guide (a mode with a frequency below the cut-off frequency), but in the vacuum there is no such thing. The dispersion relation is always ##\omega=c k##, i.e., there are no evanescent modes in the vacuum.
 
  • #89
A. Neumaier said:
There must be a microscopic explanation for the ''problem of definite outcomes'' because measurement devices are made out of quantum matter, so they are described by a quantum state. The observed pointer position is a property of the measurement device. According to the statistical interpretation all we can know about the quantum system constituted by the measurement device is encoded in its quantum state. The ''problem of definite outcomes'' is to show how this quantum state encodes the definite observed pointer position, and how the unitary dynamics postulated by quantum physics leads to such a definite observed pointer position.

The statistical interpretation has no answer for this but simply assumes it as an irreducible fact - in addition to the quantum dynamics and the state interpretation.

Quantum theory of course tells what this interaction is. But it does not tell how this interaction actually achieves the observed definite pointer position.

No. The dot corresponds to a particular position measured. The probability comes from counting the frequency and distribution of the dots.

It does according to every interpretation.
In the orthodox minimal interpretation the problem of a definite outcome is that a macrostate (like a pointer position) is a very coarse-grained observable, and the fluctuations are small compared to the resolution with which this macroscopic observable is determined. I always thought that's also the explanation of your "thermal interpretation" until you told me that your expectation values must not be interpreted in the usual statistical sense but as something abstractly defined in the mathematical formalism without relation to an operational realization by a measurement device.

The dot on a CCD screen or photo plate or the "trajectory of a particle" in a cloud chamber are good examples. These are highly coarse-grained macroscopic observables with a resolution well coarser than the quantum limits given by the uncertainty relation.
 
  • #90
vanhees71 said:
The dot on a CCD screen or photo plate or the "trajectory of a particle" in a cloud chamber are good examples. These are highly coarse-grained macroscopic observables with a resolution well coarser than the quantum limits given by the uncertainty relation.
I agree, but this alone does not solve the problem!

What remains unanswered by the statistical interpretation is why, in the measurement of a single particle by the screen, the screen is in a macroscopically well-defined state rather than in a superposition of states where the different pixels are activated with the probabilities determined by Born's rule for the particles. For the latter is the result of applying the Schrödinger equation to the combined system (particle + screen)!

vanhees71 said:
I always thought that's also the explanation of your "thermal interpretation" until you told me that your expectation values must not be interpreted in the usual statistical sense but as something abstractly defined in the mathematical formalism without relation to an operational realization by a measurement device.
The statistical interpretation can never turn a superposition of widely spread possible outcomes (any pixel on the screen) into a state where the outcome is definite. Nothing ever is definite in the statistical interpretation; the definiteness is assumed in addition to the quantum formalism.

The thermal interpretation does not yet claim to have fully solved this problem but paves the way to its solution, since it says that certain q-expectations (rather than certain eigenvalues) are the observed things. Hence the macroscopic interpretation is immediate, since the highly coarse-grained macroscopic observables are such q-expectations.

The step missing is to prove from the microscopic dynamics of the joint system (particle + screen)
that these macroscopic observables form a stochastic process with the correct probabilities. Here the thermal interpretation currently offers only suggestive hints, mainly through reference to work by others.
 
Last edited:
  • Like
Likes dextercioby, gentzen, PeterDonis and 1 other person
  • #91
vanhees71 said:
I always thought that's also the explanation of your "thermal interpretation" until you told me that your expectation values must not be interpreted in the usual statistical sense but as something abstractly defined in the mathematical formalism without relation to an operational realization by a measurement device.
Based on the current discussion, it occurred to me that the non-ensemble interpretation of q-expectations of the thermal interpretation could be combined with Callen's criterion to arrive at an "operational falsification" interpretation of expectations (probability). That interpretation would be closely related to the frequentist interpretation, but would fix its problem with the assumption/requirement of "virtual" ensembles that allow identical experiments to be repeated arbitrarily often (which makes the frequentist interpretation non-operational and non-applicable to many practically relevant scenarios).

In order not to hijack this thread, I will open a separate thread with more explanations when I find the time.
 
  • #92
  • #93
vanhees71 said:
I always thought that's also the explanation of your "thermal interpretation" until you told me that your expectation values must not be interpreted in the usual statistical sense but as something abstractly defined in the mathematical formalism without relation to an operational realization by a measurement device.
In special cases, namely for the measurement of macroscopic properties, the q-expectations are directly related to an operational realization by a measurement device - they give the measured value of extensive quantities without any statistics. No expectations are involved in this case, a single measurement gives the value predicted by the theory.

It is only in the general case where one cannot give a relation to an operational realization by a measurement device except statistically. But this is not a drawback. Already in classical physics, one can relate certain classical observable functions of the state to experiment - namely those that do not depend very sensitively on the state. Those with sensitive dependence can only be related statistically.
 
  • #94
A. Neumaier said:
There must be a microscopic explanation for the ''problem of definite outcomes'' because measurement devices are made out of quantum matter, so they are described by a quantum state. The observed pointer position is a property of the measurement device. According to the statistical interpretation all we can know about the quantum system constituted by the measurement device is encoded in its quantum state. The ''problem of definite outcomes'' is to show how this quantum state encodes the definite observed pointer position, and how the unitary dynamics postulated by quantum physics leads to such a definite observed pointer position.
The solution is quite simple and straightforward. It is sufficient to look at a measurement from two points of view, with different cuts between classical and quantum part. Then we see that the intermediate part is described, in one cut, as a quantum object with a wave function, and in the other cut with a classical trajectory.

All one has to do is accept this as the general picture - there is also a trajectory in the quantum part. The mathematics of how to make both compatible is easy and well known - the Bohmian velocity defines the deterministic (in dBB) or average (in other realistic interpretations) velocity of that trajectory.
 
  • #95
Sunil said:
The solution is quite simple and straightforward. It is sufficient to look at a measurement from two points of view, with different cuts between classical and quantum part. Then we see that the intermediate part is described, in one cut, as a quantum object with a wave function, and in the other cut with a classical trajectory.
But Nature has no cut. Thus you only replaced the problem by the equivalent problem of explaining that we may replace the quantum description on one side of the cut by a classical description. Nobody ever has derived this from the pure quantum dynamics.
 
  • Like
Likes Lord Jestocost and vanhees71
  • #96
vanhees71 said:
I still don't understand how there can be evanescent em. waves in the vacuum. For me an evanescent wave is a non-propagating field like in a wave guide (a mode with a frequency below the cut-off frequency), but in the vacuum there is no such thing. The dispersion relation is always ##\omega=ck##, i.e., there are no evanescent modes in the vacuum.
If we write the dispersion relation as ##\omega^2/c^2=k_x^2+k_y^2+k_z^2## and assume that ##k_x## and ##k_y## are real, then we see that ##k_z^2## will get negative if ##\omega^2/c^2<k_x^2+k_y^2##. If ##k_z^2## is negative then ##k_z## is imaginary, which corresponds to an evanescent wave.

At a horizontal planar interface (perpendicular to the z-axis) between two homogeneous regions, ##k_x## and ##k_y## cannot change, because they describe the modulation of the electromagnetic field along the interface. So you can have an optical wave in a glass substrate with well defined ##k_x## and ##k_y## based on the direction of the wave. If the direction of the wave is sufficiently grazing with respect to a horizontal planar interface to vacuum, then it will become evanescent in the vacuum below the interface.
(The wave will quickly (exponentially) vanish with increasing distance from the interface. Additionally, the time average of the z-component of the Poynting vector is zero, i.e. there is no energy transported in the z-direction on average by the evanescent wave in vacuum.)
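To put numbers on this (a made-up example, just to fix magnitudes): for a plane wave in glass with refractive index ##n=1.5## hitting the glass-vacuum interface at ##60^\circ##, well beyond the critical angle of about ##41.8^\circ##, one has
$$k_x=n\,\frac{\omega}{c}\sin 60^\circ\approx 1.30\,\frac{\omega}{c}>\frac{\omega}{c},$$
so on the vacuum side ##k_z=i\sqrt{k_x^2-\omega^2/c^2}\approx 0.83\,i\,\omega/c##, and the field decays like ##e^{-|k_z|z}## with a ##1/e## decay length of roughly ##0.19\,\lambda##.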
 
  • #97
A. Neumaier said:
But Nature has no cut. Thus you only replaced the problem by the equivalent problem of explaining that we may replace the quantum description on one side of the cut by a classical description. Nobody ever has derived this from the pure quantum dynamics.
Of course. You start with a "pure quantum" description of the world, which in fact does not exist. The minimal interpretation is, essentially, only a reduced Copenhagen interpretation, so it prefers not to talk about that cut, classical part, and all that, but it has the results of the experiments formulated in the language of experiments in classical physics, with resulting classical probabilities (instead of many worlds or so). And you add the explicit hypothesis that there are no "hidden variables", in particular that there is no trajectory, even if we see it when we use the classical description between the two cuts. Because this would not be "pure quantum". And you wonder why you are unable to recreate those trajectories out of nothing after forbidding their existence?

The straightforward solution is, of course, that Nature has no cut, thus, once we see trajectories, it follows that there will be trajectories even in the regions where we are unable to see them. This is not only possible, but straightforward, with the simple mathematics of dBB theory which defines the (average in statistical interpretations) velocity out of the phase of the wave function in configuration space, and which comes essentially without mathematical competitors.

Given that such a straightforward solution with trajectories exists, it would be IMHO reasonable to send all those who propose "pure quantum theory" home until they have done their homework of deriving the trajectories we see around us from their "pure quantum theory" which they like to forbid on the fundamental level.
 
  • Skeptical
Likes PeroK
  • #98
A. Neumaier said:
I agree, but this alone does not solve the problem!

What remains unanswered by the statistical interpretation is why, in the measurement of a single particle by the screen, the screen is in a macroscopically well-defined state rather than in a superposition of states where the different pixels are activated with the probabilities determined by Born's rule for the particles. For the latter is the result of applying the Schrödinger equation to the combined system (particle + screen)!

The statistical interpretation can never turn a superposition of widely spread possible outcomes (any pixel on the screen) into a state where the outcome is definite. Nothing ever is definite in the statistical interpretation; the definiteness is assumed in addition to the quantum formalism.

The thermal interpretation does not yet claim to have fully solved this problem but paves the way to its solution, since it says that certain q-expectations (rather than certain eigenvalues) are the observed things. Hence the macroscopic interpretation is immediate, since the highly coarse-grained macroscopic observables are such q-expectations.

The step missing is to prove from the microscopic dynamics of the joint system (particle + screen)
that these macroscopic observables form a stochastic process with the correct probabilities. Here the thermal interpretation currently offers only suggestive hints, mainly through reference to work by others.
Indeed, "nothing is definite in the statistical interpretation", but that's no bug but a feature as the many highly accurate confirmations of the violation of Bell's inequalities show.

Also the famous double-slit experiment for single particles or photons confirms the predicted probability distributions for the detection of these particles or photons. That a single point on the screen is blackened for each particle registered is first of all an empirical fact. It is also well understood quantum mechanically, as already shown as early as 1929 in Mott's famous paper about ##\alpha##-particle tracks in a cloud chamber.

I believe that your thermal interpretation is the answer as soon as you allow your q-expectation values to be interpreted in the standard probabilistic way, and of course you cannot describe the macroscopic observables by microscopic dynamics, because it is their very nature to be only a coarse-grained description of the relevant macroscopic degrees of freedom, and that's also the reason for their classical behavior and the irreversibility of the measurement outcome.

If you see it as a problem to understand this irreversibility from a detailed microscopic dynamical description then also the same problem has to be considered unsolved within classical physics, but I don't know any physicist who does not accept the standard answer given by statistical physics (aka "the H theorem").
 
  • #99
A. Neumaier said:
In special cases, namely for the measurement of macroscopic properties, the q-expectations are directly related to an operational realization by a measurement device - they give the measured value of extensive quantities without any statistics. No expectations are involved in this case, a single measurement gives the value predicted by the theory.

It is only in the general case where one cannot give a relation to an operational realization by a measurement device except statistically. But this is not a drawback. Already in classical physics, one can relate certain classical observable functions of the state to experiment - namely those that do not depend very sensitively on the state. Those with sensitive dependence can only be related statistically.
But macroscopic properties are statistical averages over many microscopic degrees of freedom. It is not clear how to explain the measurement of such an observable without averages and the corresponding (quantum) statistics.

A single measurement, no matter whether you measure "macroscopic" or "microscopic" properties, never establishes a value, let alone tests any theoretical prediction, as one learns in the first session of the introductory beginner's lab!
 
  • #100
Sunil said:
You start with a "pure quantum" description of the world, which in fact does not exist.
This is not a fact but your assumption. No known fact contradicts the possibility of a "pure quantum" description of the world; in contrast, there is no sign at all that a classical description must be used in addition. Once the latter is assumed, one must show how to define the classical in terms of the more comprehensive quantum. This is the measurement problem. You simply talk it away by making this assumption.
Sunil said:
The minimal interpretation is, essentially, only a reduced Copenhagen interpretation, so it prefers not to talk about that cut, classical part, and all that, but
... it postulates a classical world in addition to the quantum world. How the two can coexist is unexplained.
Sunil said:
with the simple mathematics of dBB theory which defines the (average in statistical interpretations) velocity [...]
Given that such a straightforward solution with trajectories exists
It does not exist for quantum field theory, which is needed for explaining much of our world!
 
  • Like
Likes PeroK, gentzen and vanhees71