# Jürg Fröhlich on the deeper meaning of Quantum Mechanics

#### A. Neumaier

Summary: a pointer to a recent paper.

I'd like to draw attention to a very recent paper by Jürg Fröhlich, a well-known mathematical physicist at ETH Zürich. It starts out as follows:
Jürg Fröhlich said:
I consider it to be an intellectual scandal that, nearly one hundred years after the discovery of matrix mechanics by Heisenberg, Born, Jordan and Dirac, many or most professional physicists – experimentalists and theorists alike – admit to be confused about the deeper meaning of Quantum Mechanics (QM), or are trying to evade taking a clear standpoint by resorting to agnosticism or to overly abstract formulations of QM that often only add to the confusion. [...]
I felt that the subject had better remain a hobby until later in my career. [...]
But when I was approaching mandatory retirement I felt an urge to clarify my understanding of some of the subjects I had had to teach to my students for thirty years
Section 2 is titled ''Standard formulation of Quantum Mechanics and its shortcomings''. Surely @vanhees71 has very convincing reasons why this critique is irrelevant from his personal point of view. But the others might be interested.

Section 3 then presents a completion of QM, the ''ETH-Approach to QM''. It is too abstract to become popular - one more of many interpretations satisfying their authors but probably not a majority of quantum physicists.


#### bhobba

Mentor
Interesting paper.

At the moment I can't quite figure the difference between it and Decoherent Histories.

Need to think more and hear others' views.

Thanks
Bill

#### vanhees71

Gold Member
I don't have reasons why this critique may be irrelevant since, again, I'm not even able to understand the problem to begin with. The statement is on p. 7. Fröhlich argues for a two-spin-1/2 system in the singlet Bell state, with the state ket given by
$$|\Psi \rangle=\frac{1}{\sqrt{2}} (|1/2,-1/2 \rangle-|-1/2,1/2 \rangle).$$
Then he makes the statement that, from the obvious fact that the expectation value of $S_z=s_z \otimes 1 + 1 \otimes s_z$ is 0, it would follow that there couldn't be correlations between measurements of the two spins. These correlations are of course present due to the entanglement (it's even the maximal entanglement one can get, since it's a Bell state).

The point is that you have to do measurements on such prepared spin pairs on an "event-by-event" basis to get the correlations, i.e., for each pair you have to measure $s_z$ for both particles of the pair and then find the 100% correlation: if A measures +1/2, then B necessarily measures -1/2 and vice versa. Of course you can't learn this from the expectation values alone.
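This event-by-event point can be checked numerically in a few lines. Below is a minimal sketch (my own illustration, not from the paper or the post; the product-basis ordering is my choice): the expectation value of the total $S_z$ in the singlet state vanishes, while the joint outcome probabilities show perfect anticorrelation.

```python
import numpy as np

# Single-spin s_z in the {|+1/2>, |-1/2>} basis, and total S_z = s_z (x) 1 + 1 (x) s_z.
sz = np.diag([0.5, -0.5])
I2 = np.eye(2)
Sz = np.kron(sz, I2) + np.kron(I2, sz)

# Singlet state (|+1/2,-1/2> - |-1/2,+1/2>)/sqrt(2), with the product basis
# ordered as |++>, |+->, |-+>, |-->.
psi = np.zeros(4)
psi[1], psi[2] = 1 / np.sqrt(2), -1 / np.sqrt(2)

# The expectation value of the total S_z vanishes ...
print(psi @ Sz @ psi)      # 0.0

# ... but the joint outcome probabilities are perfectly anticorrelated:
# only (+1/2, -1/2) and (-1/2, +1/2) ever occur, each with probability 1/2.
print(np.abs(psi) ** 2)    # [0.  0.5 0.5 0. ]
```

So the vanishing expectation value and the perfect event-by-event anticorrelation coexist; the former simply averages the latter away.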

Of course, each of the two observers alone sees just unpolarized particles, i.e., the spin component is maximally indeterminate, and that's in accordance with the cited linked-cluster principle, which is of course valid in a relativistic local QFT by construction.

So I don't even get the author's argument for why there should be a contradiction with minimally interpreted QT or with the many observations on systems like this (mostly with photons), which are well understood using minimally interpreted QT, including the "funny" features of entanglement. It's all well understood and described by standard QT, e.g., in the Walborn quantum-eraser experiment we have discussed very often in this forum. I don't need to repeat it.

I also don't understand the solution of what the author seems to consider a problem, since I'm not familiar with the mathematical-physics notation the author uses. I don't have the time to learn this notation in order to understand the solution of a problem whose statement I cannot even understand to begin with. I'm sure this interpretation could become popular (or more popular, or at least discussed) when (a) the problem it tries to solve is clarified and (b) the mathematical-physics notation is translated into more common theoretical-physics notation.

#### ftr

It is too abstract to become popular
Yet he says "it furnishes quantum theory with a clear ontology". I could not understand what that ontology entailed.

#### vanhees71

Gold Member
Well, "ontology" is also one of those unclear notions of the philosophers. It's not clear at all what "ontology" means; it heavily depends on what the individual philosopher thinks it is. For the physicist, any observable fact about nature is enough "ontology". That's why I don't understand where Fröhlich's problem with QM is located. The example he brings in the above-cited paper is simply not what's observed. To the contrary, nowadays there are many experiments by the AMO community which prove the claim wrong: for more than 30 years (starting with Aspect's first experiments concerning the Bell inequalities), the strong correlations due to entanglement (indeed consistent with the linked-cluster principle valid for local relativistic QFTs, and thus valid for the Standard Model and particularly QED!) have been empirically established with ever increasing precision. It's not only expectation values that can be measured, but also the outcomes of measurements on an event-by-event basis.

So far there's not the slightest hint that QT is incomplete within the realm where it is formulated (it is only (!) incomplete with regard to the lack of a satisfactory quantum description of gravity). The "ontology" from a physicist's point of view then is simply provided by the notion of the quantum state, and this implies that there is something called "objective indeterminism" in nature, i.e., it is impossible to determine by preparation all observables definable on a given quantum system. Also the classical behavior of "everyday matter" is well understood: it's due to reducing the description to the relevant macroscopic degrees of freedom through "coarse-graining".

#### zonde

Gold Member
For the physicist any observable fact about nature is enough "ontology".
The "ontology" from a physicist's point of view then simply is provided by the notion of the quantum state,
A quantum state is not an observable fact about nature. You are contradicting yourself within a single post.

If you stick to the idea that observable facts about nature are "ontological", QM is still rather unsatisfactory ontologically. That's because statistics is a composite fact about nature: it requires some interpretation and a grouping of similar situations. The elementary facts are single detections. And NRQM as well as QFT can speak only about statistics; they cannot say how these statistics emerge from elementary facts, while the common explanation seems to fail.

#### vanhees71

Gold Member
You are right in saying that the ontology in QT is provided by both the notion of "state" and that of "observable". These notions together provide the ontology of QT, and there is no contradiction to this ontology by experiment. To the contrary, the more QT is tested, the better it gets confirmed. In particular, Fröhlich's idea of proving some contradiction is unclear and not justified by any observation. To the contrary, his example provides one of the most stringent tests of Q(F)T's consistency as a theory (validity of the linked-cluster principle) as well as of its agreement with observations; the Bell experiments are a confirmation of Q(F)T with the highest significance ever reached between theory and experiment in the history of physics.

#### zonde

Gold Member
You are right in saying that the ontology in QT is provided by both the notion of "state" and "observable".
Never said that. The things you say are so incoherent that I have no idea how to reply.

#### vanhees71

Gold Member
What specifically is "incoherent"?

My statement is that the minimally interpreted QT (which is basically Copenhagen without collapse) is all the "ontology" you need, since it describes precisely what's observed today.

The fundamental point around which all these discussions about "interpretation" revolve is the following:

(a) The state of a system is described by a positive semidefinite self-adjoint operator $\hat{\rho}$ with $\mathrm{Tr} \hat{\rho}=1$.
(b) All observables are described by self-adjoint operators $\hat{O}$. The possible outcomes of (accurate) measurements are the generalized eigenvalues of these operators.
(c) The probability to find the value $o_i$ when measuring the observable $O$ is given by
$$P(o_i)=\sum_{\beta} \langle o_i,\beta|\hat{\rho}|o_i,\beta \rangle$$
with the usual treatment of spectral values in the continuum. Here $\beta$ denotes a set of parameters (e.g., the eigenvalues of a complete set of observable operators that complement $\hat{O}$). The usual treatment is also implied if there are continuous parts in these parameters; for simplicity I use sums rather than integrals.

The ontology in this standard interpretation thus is: if a system is in a state described by $\hat{\rho}$, some observables may have determined values, others not. In any case, the probability for measuring a given possible value is given by the formula above (the Born rule in its most general form).
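For concreteness, here is a small numerical sketch of (c) (my own toy example, not from the post): for a degenerate eigenvalue, the sum over $\beta$ amounts to tracing $\hat{\rho}$ against the projector onto the corresponding eigenspace.

```python
import numpy as np

# Observable O = diag(1, 1, -1): the eigenvalue o = +1 is twofold degenerate,
# and beta labels the two basis vectors spanning that eigenspace.
P_plus = np.diag([1.0, 1.0, 0.0])    # projector onto the o = +1 eigenspace
P_minus = np.diag([0.0, 0.0, 1.0])   # projector onto the o = -1 eigenspace

# A state: rho is positive semidefinite, self-adjoint, with Tr rho = 1.
rho = np.diag([0.5, 0.3, 0.2])

# Born rule: P(o_i) = sum_beta <o_i,beta| rho |o_i,beta> = Tr(rho P_i).
p_plus = np.trace(rho @ P_plus)      # ~0.8
p_minus = np.trace(rho @ P_minus)    # ~0.2
print(p_plus, p_minus)               # the probabilities sum to 1
```

The sum over the degeneracy label $\beta$ is exactly what makes the projector (rather than a single eigenvector) appear in the trace formula.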

In my opinion that's compatible with all observations done so far. There seems to be nothing missing in this description. It is used to describe even the most accurate observations concerning entangled states, about which Fröhlich argues there is a problem. His argument doesn't match the corresponding experiments (the polarization measurements on entangled photon pairs à la Aspect, the quantum eraser, etc.), because it argues only with an expectation value, which refers to an averaging over many measurements on identically prepared systems (in short, an ensemble); in this averaging, information that can be gained according to QT is thrown away. Then, of course, the correlations predicted by entanglement are no longer observable, because this information was not recorded. But as very accurate measurements on such Bell states demonstrate, one can in fact get this information, namely the event-by-event outcomes of measurements on the entangled observables (in Fröhlich's example, the polarization state of both entangled photons for each prepared photon pair). With this information, all predicted correlations are observed: 100% correlation when measuring polarization in the same or mutually orthogonal directions, violation of Bell's inequality for adequately chosen relative angles of the polarizers, etc.

The outcome of these experiments is very clear: all predictions of QT have been empirically "verified" at a very high confidence level. Also, many (if not all) loopholes brought forward so far have been closed. No observations are left unexplained by minimally interpreted QT.

In the relativistic realm, also, no violations of Einstein causality are present. Such violations appear only if a naive collapse argument of some Copenhagen-interpretation flavors is invoked (as far as I know, Bohr was at least very careful not to put too much weight on the collapse assumption). Collapse is also not necessary to explain the said correlations due to entanglement between far-distant parts of a quantum system (like the polarization-entangled photon pair in Fröhlich's example, where the single-photon polarization measurements can be done at arbitrarily large distances, as long as neither photon is significantly disturbed on its way from the source to the detector, which would of course destroy the entanglement and correlations before both measurements have been done). As Fröhlich states, the very construction of QED as a local microcausal relativistic QFT guarantees the validity of the linked-cluster principle, and this of course also applies to the said entangled state of two photons. Performing the polarization measurements (more precisely, the detection of each photon behind the polarization filters) in setups where the detection events are space-like separated, so that mutual influence of one measurement on the other is excluded, still shows all the predicted correlations, in accordance with the linked-cluster principle. The conclusion is that, as stated within minimally interpreted QFT, the correlation is due to the state preparation and not due to causal influences between the detection events.

So again, I don't see any "incoherence" in the minimal interpretation nor the necessity for "more ontology" than provided by it, at least I don't see it in Fröhlich's argument.

#### DarMM

Gold Member
So again, I don't see any "incoherence" in the minimal interpretation nor the necessity for "more ontology" than provided by it, at least I don't see it in Fröhlich's argument.
I think I might be able to explain other people's problems to you. There are basically three problems people have with QM.

1. There is no dynamical account of which measurement outcome occurs

2. There is no dynamical account of how the correlations present in entanglement are achieved

3. Imagine I measure the state $\frac{1}{\sqrt{2}}\left(|\uparrow\rangle + |\downarrow\rangle\right)$ and say my device registers spin-up. Then the state I use afterwards is $|\uparrow\rangle$, but an external superobserver would use $\frac{1}{\sqrt{2}}\left(|\uparrow, D_{\uparrow}, L_A\rangle + |\downarrow, D_{\downarrow}, L_B\rangle\right)$, with $D_{\uparrow}$ denoting the state of my device and $L_A$ a state of the lab. That is, I use a "collapsed" state, but the superobserver does not.
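For what it's worth, the two descriptions in (3) agree on all measurements of the spin alone. Here is a small check (my own toy encoding, with the device and lab each modeled as a single qubit for simplicity): tracing the device and lab out of the superobserver's entangled state gives the same spin statistics as the equal-weight mixture of the friend's two possible collapsed states.

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# |Psi> = (|up, D_up, L_A> + |down, D_down, L_B>)/sqrt(2), with the device and
# lab states encoded as single qubits (my own simplification).
psi = (np.kron(np.kron(up, up), up) + np.kron(np.kron(down, down), down)) / np.sqrt(2)
rho_full = np.outer(psi, psi)

# Partial trace over device and lab (an 8x8 matrix viewed as 2x4 blocks;
# keep the first tensor factor, trace out the other two).
rho_spin = rho_full.reshape(2, 4, 2, 4).trace(axis1=1, axis2=3)

# Friend's description: |up><up| or |down><down|, each occurring half the time.
rho_mixture = 0.5 * np.outer(up, up) + 0.5 * np.outer(down, down)

print(np.allclose(rho_spin, rho_mixture))   # True
```

So the disagreement is only about the description of spin plus device plus lab as a whole, not about any prediction for the spin by itself.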
I think recent work, especially since Spekkens' toy model in 2004 and the clarifications of the Frauchiger-Renner argument, has shown that (3) is not really a problem or a contradiction; it only is if you accept the eigenstate-eigenvalue link, which is a very old, naive view of QM.

However (1) and (2) do seem like problems or at least something that needs completing in a further theory.

The only problem is that quantum mechanics involves non-classical correlations, that is, correlations outside the polytope obtained by assuming that all your variables belong to a single sample space. One can show (Kochen-Specker, Colbeck-Renner, etc.) that theories with correlations outside this polytope necessarily lack a dynamical account of their outcomes or correlations.
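As a concrete instance of correlations outside the classical polytope, here is a short sketch (the standard CHSH setup with the singlet state; the code and angle choices are mine): the CHSH combination of correlators reaches magnitude $2\sqrt{2}$, beyond the bound of 2 that holds whenever all four variables share a single sample space.

```python
import numpy as np

def spin_op(theta):
    """Spin measurement along angle theta in the x-z plane (eigenvalues ±1)."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

# Singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def corr(a, b):
    """Correlator E(a, b) = <psi| A(a) (x) B(b) |psi>; equals -cos(a - b)."""
    return psi @ np.kron(spin_op(a), spin_op(b)) @ psi

# Standard CHSH angle choices
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = corr(a1, b1) + corr(a1, b2) + corr(a2, b1) - corr(a2, b2)
print(S)   # -2*sqrt(2) ≈ -2.828, so |S| exceeds the classical bound of 2
```

Any assignment of the four ±1 outcomes within a single sample space caps |S| at 2, so the quantum value sits strictly outside the classical polytope.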

So you either reinterpret the formalism in a non-statistical manner (Many-Worlds, Thermal Interpretation), add additional variables to restore the single sample space (but we know they have to be nonlocal or retrocausal) or just accept that there is no account.


#### DarMM

Gold Member
Interesting paper.

At the moment I can't quite figure the difference between it and Decoherent Histories.

Need to think more and hear others' views.

Thanks
Bill
It's not very obvious from the paper, but it is a form of Consistent Histories in a sense. However, unlike Decoherent Histories it doesn't frame consistency in terms of interference terms dying off, but in terms of a certain relation holding between the observables in the state $\omega$. This is slightly different, as it can be "exact", but the major points are the same.

It's similar to how Jeffrey Bub's view is a sort of consistent histories, but one that takes consistency to be the emergence of a sub-algebra satisfying the rules of a Boolean lattice.

So these are just different notions of consistency. What's interesting is that in a typical experiment all three conditions seem to hold, i.e., interference dies off, Fröhlich's algebraic condition holds, and a sub-algebra with the lattice conditions demanded by Bub emerges.

Later edit:
Personal conjecture: I wouldn't be surprised if Fröhlich's and Bub's views are basically two ways of phrasing the same thing, i.e., the emergence of a certain algebraic structure relative to the state is what permits you to reason that macroscopic equipment has obtained a definite outcome state of which you are ignorant. Perhaps there's a theorem showing they are equivalent. We know from Spekkens' model that interference terms don't mean there isn't a single outcome, so you don't need them to vanish.
You then need decoherence to show that macroscopic events obey classical statistics, not that they occur.


#### Lord Jestocost

Gold Member
There are basically three problems people have with QM.
1. There is no dynamical account of which measurement outcome occurs
2. There is no dynamical account of how the correlations present in entanglement are achieved
Are these “serious” problems of physics? Or are these “problems” merely an expression of indignation that the ultimate reality giving rise to our “perception” of events occurring on a space-time scene cannot be grasped with recourse to classical notions and concepts?

As Berthold-Georg Englert writes in "On quantum theory":

Abstract. Quantum theory is a well-defined local theory with a clear interpretation. No “measurement problem” or any other foundational matters are waiting to be settled.


#### DarMM

Gold Member
Are these “serious” problems of physics? Or are these “problems” merely an expression of indignation that the ultimate reality giving rise to our “perception” of events occurring on a space-time scene cannot be grasped with recourse to classical notions and concepts?
I'm going to be very controlled in my response here, because I don't want this to veer into the usual stuff.

Are these serious problems? Well it depends on whether you think there has to be an account for how events occur, specifically outcomes of measurements on microscopic systems, or whether you think the current evidence from QM is enough for you to concede that you will never get one. The latter runs counter to why many people are interested in science, so it is not too surprising it is viewed as a problem.

I think saying "classical notions" undersells the problem some people have. It makes it sound as though they are attached to specific ideas like particles or fields, whereas the only "notion" they are holding to is having an explanation or account at all.

Englert's quote (from https://arxiv.org/abs/1308.5290) just shows he is not bothered by this.
Like Gell-Mann, Griffiths, Bub, Bohr, Heisenberg, Hartle, Haag, etc., he just bites the bullet: you will never have an explanation. The End.

#### A. Neumaier

Englert's quote (from https://arxiv.org/abs/1308.5290) just shows he is not bothered by this.
Like Gell-Mann, Griffiths, Bub, Bohr, Heisenberg, Hartle, Haag, etc., he just bites the bullet: you will never have an explanation. The End.
The full abstract of his paper (published in Eur. Phys. J. D) says:
Berthold Englert said:
Quantum theory is a well-defined local theory with a clear interpretation. No "measurement problem" or any other foundational matters are waiting to be settled.
This essentially echoes the credo of @vanhees71. Englert's introduction says:
Berthold Englert said:
there is no experimental fact, not a single one, that contradicts a quantum-theoretical prediction. Yet, there is a steady stream of publications that are motivated by alleged fundamental problems: We are told that quantum theory is ill-defined, that its interpretation is unclear, that it is nonlocal, that there is an unresolved “measurement problem,” and so forth.
I find both statements in this second quote fully valid, and Englert's later explanations that these are only pseudo-problems not convincing.

#### DarMM

Gold Member
Yes, and his "clear interpretation", detailed in the rest of the paper, is (Neo-)Copenhagen, i.e., there is no explanation for measurement outcomes.

Without further no-go theorems we can't proceed. Currently, the attempts to add more variables to restore a single sample space (retrocausal and nonlocal theories) haven't been generalized to QFT, and the views that attempt to reinterpret the formalism non-statistically (Many Worlds and the Thermal Interpretation) haven't been proven to give the correct observational statistics.

So we have to wait and see if one of these other views can be made to work in some way, or perhaps wait for the development of no-go theorems that either forbid them or make them look completely unnatural and fine-tuned, forcing us to "bite the bullet" of Copenhagen and its lack of explanations.

Time will tell.


#### A. Neumaier

I find both statements in this second quote fully valid, and Englert's later explanations that these are only pseudo-problems not convincing.
His final words in the paper are:
Berthold Englert said:
What, then, about the steady stream of publications that offer solutions for alleged fundamental problems, each of them wrongly identified on the basis of one misunderstanding of quantum theory or another? Well, one could be annoyed by that and join van Kampen [42] in calling it a scandal when a respectable journal prints yet another such article. No-one, however, is advocating censorship, even of the mildest kind, because the scientific debate cannot tolerate it. Yet, is it not saddening that so much of diligent effort is wasted on studying pseudo-problems?
Note that van Kampen's paper, which he cites here and which also promoted the thesis that there is no measurement problem, contains an error in its ''proof'' of this thesis.


#### charters

Currently the attempts to add more variables to restore a single sample space (retrocausal and nonlocal theories) haven't been generalized to QFT and views that attempt to reinterpret the formalism non-statistically (Many Worlds and Thermal Interpretation) haven't been proven to give the correct observational statistics.
Aharonov's "retrocausal" two time interpretation generalizes to QFT (equally as easily as MWI) and gets the correct observational statistics from a typicality assumption on the future boundary choice.

#### DarMM

Gold Member
Aharonov's "retrocausal" two time interpretation generalizes to QFT (equally as easily as MWI) and gets the correct observational statistics from a typicality assumption on the future boundary choice.
Note what I said: I didn't say that retrocausal theories can't get the statistics out; I said they haven't been fully generalized to QFT, and neither has the TSVF you are discussing. Kastner's work can be considered to have shown that it might be able to replicate aspects of QED, but I'm not aware of a full proof that it works in the QFT case.

Many Worlds has many issues with QFT, such as the absence of pure states for finite volume systems. And the Born rule has never been proven to hold.

Anyway, I'd be happy to discuss this on another thread. Either MWI or Retrocausal views.

#### ftr

has shown that (3) is not really a problem or a contradiction
Aren't 1) and 3) basically the same problem, or at least related?

#### DarMM

Gold Member
Aren't 1) and 3) basically the same problem, or at least related.
They are not the same problem, and in general they are not related; for example, theories like Spekkens' model have (3) but not (1). That is, you can have a Wigner's-friend-style problem without having the measurement problem. That's why the measurement problem, labelled (1) above, is more important to QM, and (3) isn't a real issue.

#### ftr

They are not the same problem, and in general they are not related; for example, theories like Spekkens' model have (3) but not (1). That is, you can have a Wigner's-friend-style problem without having the measurement problem. That's why the measurement problem, labelled (1) above, is more important to QM, and (3) isn't a real issue.
Well, collapse is an integral part of a standard interpretation like CI, although it is downplayed, and all the other interpretations are not that successful (at least this is my view of the general consensus) in circumventing it, because of the probability interpretation. So all the power to the TI.

EDIT: OK, it is possible to look at them as separate problems in some sense.


#### stevendaryl

Staff Emeritus
Are these “serious” problems of physics? Or are these “problems” merely an expression of indignation that the ultimate reality behind giving rise to our “perception” of events occuring on a space-time scene cannot be grasped with recourse to classical notions and concepts?
No, the difficulties of interpreting quantum mechanics are not due to the fact that it cannot be grasped in terms of classical notions.

As Berthold-Georg Englert writes in "On quantum theory":

Abstract. Quantum theory is a well-defined local theory with a clear interpretation. No “measurement problem” or any other foundational matters are waiting to be settled.
I think he's just wrong about that.

#### A. Neumaier

I think he's just wrong about that.
It's controversial, even here on PF. Different people have different criteria for ''well-defined'' and ''clear''. Those with loose criteria are easily satisfied; only those with strict ones see the problems. No amount of discussion will change this.

#### DarMM

Gold Member
It's controversial, even here on PF. Different people have different criteria for ''well-defined'' and ''clear''. Those with loose criteria are easily satisfied; only those with strict ones see the problems. No amount of discussion will change this.
I think this is pretty accurate. Either you have a problem with the standard lack of explanation of how measurement outcomes come about and how nonclassical correlations are achieved, or you don't. In the typical approach, QM doesn't give any explanation for these things. You'll either think this is an insight (i.e., that this is something which cannot be given a scientific explanation) or an incompleteness (there has to be a deeper theory telling us how they come about).

The way things currently stand, i.e., with the no-go theorems we have, this remains an issue of personal taste.

#### ftr

leaves this as an issue of personal taste.
However, I do think that some research comes very close to explaining it.
