• #1
A. Neumaier
Science Advisor
Insights Author
This Insight article presents the main features of a conceptual foundation of quantum physics with the same characteristic features as classical physics – except that the density operator takes the place of the classical phase space coordinates position and momentum. Since everything follows from the well-established techniques of quantum tomography (the art and science of determining the state of a quantum system from measurements), the new approach may have the potential to lead in time to a consensus on the foundations of quantum mechanics. Full details can be found in my paper

A. Neumaier, Quantum mechanics via quantum tomography, Manuscript (2022). arXiv:2110.05294v3

This paper gives for the first time a formally precise definition of quantum measurement that

is applicable without idealization to complex, realistic experiments;
allows one to derive the standard quantum...


Answers and Replies

  • #2
vanhees71
Science Advisor
Insights Author
Gold Member
2021 Award
Great!

That confirms my (still superficial) understanding that now I'm allowed to interpret ##\hat{\rho}## and the trace operation as expectation values in the usual statistical sense, and that makes the new approach much more understandable than what you called before "thermal interpretation". I also think that the entire conception is not much different from the minimal statistical interpretation. The only change to the "traditional" concept seems to be that you use the more general concept of POVM than the von Neumann filter measurements, which are only a special case.

The only objection I have is the statement concerning EPR. It cannot be right, because local realistic theories are not consistent with quantum-theoretical probability theory. This is proven by the violation of Bell's inequalities (and of related properties of quantum-mechanically evaluated correlation functions) by the quantum-mechanical predictions, and by the confirmation of precisely these violations in experiments.

The upshot is: As quantum theory predicts, the outcomes of all possible measurements on a system prepared in any state ##\hat{\rho}## (I take it that it is allowed also in your new conception to refer to ##\hat{\rho}## as the description of equivalence classes of preparation procedures, i.e., to interpret the word "quantum source" in the standard way) are not due to predetermined values of the measured observables. All the quantum state implies are the probabilities for the outcome of measurements. The values of observables are thus only determined by the preparation procedure if they take a certain value with 100% probability. I think within your conceptual framework, "observable" takes a more general meaning as the outcome of some measurement device ("pointer reading"), definable in the most general sense as a POVM.
 
  • #3
Demystifier
Science Advisor
Insights Author
Gold Member
What's the main new idea here? From this summary, which is written nicely and clearly, I have a feeling that I knew all this before. Do I miss something?
 
  • #4
vanhees71
Science Advisor
Insights Author
Gold Member
2021 Award
Indeed, I think it's just a reformulation of the minimal statistical interpretation, taking into account the more modern approach to represent observables by POVMS rather than the standard formulation with self-adjoint operators (referring to von Neumann filter measurements, which are a special case of POVMS).
 
  • #5
fresh_42
Mentor
Insights Author
2021 Award
... the more modern approach to represent observables by POVMS rather than the standard formulation with self-adjoint operators ...

As a layman in QM I looked up POVM and found a function ##\mu\, : \,\mathcal{A}\longrightarrow \mathcal{B(H)}## with ##0\leq \mu(A) \leq \operatorname{id}_{\mathcal{H}}## with self-adjoint operators as values. It seems to be the same difference as a distribution function is to a probability measure, i.e. more of a different wording than a different approach.

Do I miss something?
 
  • #6
Demystifier
Science Advisor
Insights Author
Gold Member
Do I miss something?
Physics? :wink:

More seriously, I don't know what the equation you wrote means, so I cannot say what you miss.
 
  • #7
fresh_42
Mentor
Insights Author
2021 Award
Physics? :wink:

More seriously, I don't know what the equation you wrote means, so I cannot say what you miss.
A function from a measure space to the space of bounded operators on a Hilbert space.
 
  • #8
RUTA
Science Advisor
Insights Author
Very nice description of how QM marries up with CM. In particular, its operational approach greatly clarifies the Born rule in terms of empiricism, which is the way I view physics as a physicist. I agree that the standard introduction contains otherwise “mysterious” mathematical abstractions. How does this resolve the mystery of entanglement?
 
  • #9
A. Neumaier
Science Advisor
Insights Author
What's the main new idea here?
New compared to what?
From this summary, which is written nicely and clearly, I have a feeling that I knew all this before.
For example, from where did you already know what I said in the very first sentence about quantum phase space coordinates?
Insight summary (first sentence) said:
This Insight article presents the main features of a conceptual foundation of quantum physics with the same characteristic features as classical physics – except that the density operator takes the place of the classical phase space coordinates position and momentum.
Do I miss something?
What I consider new for the general reader was specified at the beginning:
Insight summary said:
This Insight article [...] gives for the first time a formally precise definition of quantum measurement that
  • is applicable without idealization to complex, realistic experiments;
  • allows one to derive the standard quantum mechanical machinery from a single, well-motivated postulate;
  • leads to objective (i.e., observer-independent, operational, and reproducible) quantum state assignments to all sufficiently stationary quantum systems.
The paper shows that the amount of objectivity in quantum physics is no less than that in classical physics.
If you know how to do all this consistently you miss nothing. Otherwise you should read the full paper, where everything is argued in full detail, so that it can be easily integrated into a first course on quantum mechanics.
 
  • #10
A. Neumaier
Science Advisor
Insights Author
Very nice description of how QM marries up with CM. How does this resolve the mystery of entanglement?
The concept is nowhere needed in this approach to quantum mechanics, hence there is no mystery about it at this level.

Entangled states are just very special cases of density operators expressed in a very specific basis. They become a matter of curiosity only if one looks for extremal situations that can be prepared only for systems in which a very small number of degrees of freedom are treated quantum mechanically.
 
  • #11
A. Neumaier
Science Advisor
Insights Author
As a layman in QM I looked up POVM and found a function ##\mu\, : \,\mathcal{A}\longrightarrow \mathcal{B(H)}## with ##0\leq \mu(A) \leq \operatorname{id}_{\mathcal{H}}## with self-adjoint operators as values. It seems to be the same difference as a distribution function is to a probability measure, i.e. more of a different wording than a different approach.
In the Insight article and the accompanying paper I only use the notion of a discrete quantum measure, defined as a finite family of Hermitian, positive semidefinite operators that sum to the identity.
This is the quantum version of a discrete probability distribution, a finite family of probabilities summing to one. Thus on the level of foundations there is no need for the POVM concept.

The concept of POVMs is unnecessarily abstract, but there are simple POVMs equivalent to discrete quantum measures; see Section 4.1 of my paper.
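As a rough numerical sketch (my own illustration, not from the paper), a discrete quantum measure in this sense can be checked in a few lines of Python; the unsharpness parameter ##\eta##, the effects ##P_0,P_1##, and the state ##\rho## below are made-up examples:

```python
import numpy as np

# A discrete quantum measure: a finite family of Hermitian, positive
# semidefinite operators P_k summing to the identity.
# Hypothetical qubit example: an unsharp z-measurement with efficiency eta.
eta = 0.8
P0 = np.array([[(1 + eta) / 2, 0.0], [0.0, (1 - eta) / 2]])
P1 = np.eye(2) - P0

# Check the defining properties.
assert np.allclose(P0 + P1, np.eye(2))
assert np.all(np.linalg.eigvalsh(P0) >= 0)
assert np.all(np.linalg.eigvalsh(P1) >= 0)

# For any density matrix rho, the numbers p_k = tr(rho P_k) form a
# discrete probability distribution (nonnegative, summing to one).
rho = np.array([[0.7, 0.2], [0.2, 0.3]])  # made-up state with tr(rho) = 1
p = np.array([np.trace(rho @ P0).real, np.trace(rho @ P1).real])
print(p)  # nonnegative probabilities summing to 1
```

The final check makes the analogy concrete: the p_k play exactly the role of a discrete probability distribution, with the effects P_k as the quantum counterpart of elementary events.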
 
  • #13
RUTA
Science Advisor
Insights Author
The concept is nowhere needed in this approach to quantum mechanics, hence there is no mystery about it at this level.

Entangled states are just very special cases of density operators expressed in a very specific basis. They become a matter of curiosity only if one looks for extremal situations that can be prepared only for systems in which a very small number of states are treated quantum mechanically.
Yes, entangled states produce CM results on average, but that statement simply ignores their violation of the Bell inequality, which can also be couched as a statistical, empirical fact. Indeed, the mystery of entanglement can also be shown empirically in very small (non-statistical) samples of individual measurements. This approach is therefore worthless for resolving that mystery. It does however marry up beautifully with the reconstruction of QM via information-theoretic principles, which does resolve the mystery of the qubit and therefore entanglement.
 
  • #14
A. Neumaier
Science Advisor
Insights Author
But aren't these also special cases of POVMs as described in the Wikipedia

https://en.wikipedia.org/wiki/POVM
Yes, Wikipedia describes them (at the very top of the section headed 'Definition') as the simplest POVMs. But the general concept (as defined in the Definition inside this section of Wikipedia) is an abstract monster far too complicated for most physics students.
 
  • #15
A. Neumaier
Science Advisor
Insights Author
Yes, entangled states produce CM results on average, but that statement simply ignores their violation of the Bell inequality, which can also be couched as a statistical, empirical fact. Indeed, the mystery of entanglement can also be shown empirically in very small (non-statistical) samples of individual measurements. This approach is therefore worthless for resolving that mystery.
Most things are worthless if you apply inadequate criteria for measuring their worth. The most expensive car is worthless if you want to climb a tree.

I didn't set out to resolve what you regard here as a mystery. It is not needed for the foundations but a consequence of the general formalism once it has been derived.
It does however marry up beautifully with the reconstruction of QM via information-theoretic principles, which does resolve the mystery of the qubit and therefore entanglement.
I don't see the qubit presenting a mystery. Everything about it was known in 1852, long before quantum mechanics got off the ground.
 
  • #16
A. Neumaier
Science Advisor
Insights Author
Indeed, I think it's just a reformulation of the minimal statistical interpretation, taking into account the more modern approach to represent observables by POVMS rather than the standard formulation with self-adjoint operators (referring to von Neumann filter measurements, which are a special case of POVMS).
It is a minimal well-motivated new foundation for quantum physics including its statistical interpretation, based on a new postulate from which POVMs and everything else can be derived. And it has consequences far beyond the statistical interpretation, see the key points mentioned in post #9.
It seems to be the same difference as a distribution function is to a probability measure, i.e. more of a different wording than a different approach.

Do I miss something?
The point is that there are two quantum generalizations of probability, the old (von Neumann) one based on PVMs (in the discrete case orthogonal projectors summing to 1) and the more recent (1970+), far more generally applicable one, based on POVMs. See the Wikipedia article mentioned in post #14.
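The distinction can be made concrete with a small Python sketch (my own toy example, not from the paper or the Wikipedia article): the sharp z-measurement is a PVM, whose effects are orthogonal projectors, while the three-outcome "trine" POVM still sums to the identity but has effects that are not projectors:

```python
import numpy as np

# PVM (von Neumann case): orthogonal projectors summing to the identity.
P_up = np.array([[1.0, 0.0], [0.0, 0.0]])
P_dn = np.array([[0.0, 0.0], [0.0, 1.0]])
assert np.allclose(P_up @ P_up, P_up)               # idempotent
assert np.allclose(P_up @ P_dn, np.zeros((2, 2)))   # mutually orthogonal

# A POVM that is not a PVM: three "trine" effects (2/3)|phi_k><phi_k|
# built from real unit vectors at 60-degree steps in Hilbert space.
effects = []
for k in range(3):
    a = np.pi * k / 3
    phi = np.array([np.cos(a), np.sin(a)])
    effects.append((2.0 / 3.0) * np.outer(phi, phi))

total = sum(effects)
assert np.allclose(total, np.eye(2))                # still sums to the identity
assert not np.allclose(effects[0] @ effects[0], effects[0])  # not projectors
```

Note that the trine POVM has three outcomes on a two-dimensional Hilbert space, something no PVM can do; this is one way the POVM concept is genuinely more general.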
 
  • #17
fresh_42
Mentor
Insights Author
2021 Award
Yes, Wikipedia describes them (at the very top of the section headed 'Definition') as the simplest POVMs. But the general concept (as defined in the Definition inside this section of Wikipedia) is an abstract monster far too complicated for most physics students.
The German version is quite short, but it doesn't seem to be too complicated.
 
  • #18
A. Neumaier
Science Advisor
Insights Author
That confirms my (still superficial) understanding that now I'm allowed to interpret ##\hat{\rho}## and the trace operation as expectation values in the usual statistical sense,
There are two senses: One as a formal mathematical construct, giving quantum expectations, and
the other in a theorem stating that when you do actual measurements, the limits of the sample means agree with these theoretical quantum expectations.
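This second sense can be illustrated with a simulated measurement (a hypothetical qubit example of mine, not taken from the paper): the sample mean of the simulated outcomes approaches the quantum expectation ##\mathrm{Tr}(\rho A)##.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up qubit state and observable (sigma_z, with outcomes +1 and -1).
rho = np.array([[0.6, 0.2], [0.2, 0.4]])
A = np.array([[1.0, 0.0], [0.0, -1.0]])
quantum_expectation = np.trace(rho @ A).real  # the formal construct

# Simulate independent measurements with the Born probabilities
# p(+1) = rho_00 and p(-1) = rho_11 (for this diagonal observable).
p_up = rho[0, 0].real
outcomes = rng.choice([1.0, -1.0], size=100_000, p=[p_up, 1.0 - p_up])

# The sample mean approaches the theoretical quantum expectation.
print(outcomes.mean(), quantum_expectation)
```

The two printed numbers agree to within the statistical error of the sample mean, which shrinks like one over the square root of the sample size.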
and that makes the new approach much more understandable than what you called before "thermal interpretation".
I derive the thermal interpretation from this new approach. See Section 7.3 of my paper, and consider the paper to be a much more understandable pathway to the thermal interpretation, where in my book I still had to postulate many things without being able to derive them.
I also think that the entire conception is not much different from the minimal statistical interpretation. The only change to the "traditional" concept seems to be that you use the more general concept of POVM than the von Neumann filter measurements, which are only a special case.
The beginnings are not much different, but they are already simpler than the minimal statistical interpretation - which needs nontrivial concepts from spectral theory and a very nonintuitive assertion called Born's rule.
The only objection I have is the statement concerning EPR. It cannot be right, because local realistic theories are not consistent with quantum-theoretical probability theory. This is proven by the violation of Bell's inequalities (and of related properties of quantum-mechanically evaluated correlation functions) by the quantum-mechanical predictions, and by the confirmation of precisely these violations in experiments.
Please look at my actual claims in the paper rather than judging from the summary in the Insight article! EPR is discussed in Section 5.4. There I claim elements of reality for quantum expectations of field operators, not for Bell-local realistic theories! Thus Bell inequalities are irrelevant.
I take it that it is allowed also in your new conception to refer to ##\hat{\rho}## as the description of equivalence classes of preparation procedures, i.e., to interpret the word "quantum source" in the standard way)
No. A (clearly purely mathematical) construction of equivalence classes is not involved at all!

A quantum source is a piece of equipment emanating a beam - a particular laser, or a fixed piece of radioactive material behind a filter with a hole, etc.. Each quantum source has a time-dependent state ##\rho(t)##, which in the stationary case is independent of time ##t##.


All the quantum state implies are the probabilities for the outcome of measurements.
The quantum state implies known values of all quantum expectations (N-point functions). This includes smeared field expectation values that are (for systems in local equilibrium) directly measurable without any statistics involved. It also includes probabilities for statistical measurements.
I think within your conceptual framework, "observable" takes a more general meaning as the outcome of some measurement device ("pointer reading"), definable in the most general sense as a POVM.
It takes a meaning independent of POVMs.

  • In classical mechanics, observables are the classical phase space variables ##p,q## and everything computable from them, in particular the kinetic and potential energy, forces, etc.
  • In quantum mechanics, observables are the quantum phase space variables ##\rho## (or its matrix elements) and everything computable from them, in particular the N-point functions of quantum field theory. For example, 2-point functions are often measurable through linear response theory.
 
  • #19
A. Neumaier
Science Advisor
Insights Author
The German version is quite short, but it doesn't seem to be too complicated.
Not for a mathematician who is familiar with measure theory and has mastered the subtleties of countable additivity...

But to a physics student you need to explain (and motivate in a physics context) the notion of a measure space, which is a lot of physically irrelevant overhead!
The German version of Wikipedia then simplifies to the case of a discrete quantum measure, which is already everything needed to discuss measurement!
 
  • #20
vanhees71
Science Advisor
Insights Author
Gold Member
2021 Award
No. A (clearly purely mathematical) construction of equivalence classes is not involved at all!

A quantum source is a piece of equipment emanating a beam - a particular laser, or a fixed piece of radioactive material behind a filter with a hole, etc.. Each quantum source has a time-dependent state ##\rho(t)##, which in the stationary case is independent of time ##t##.
The point is the interpretation. In the latter formulation, that's precisely what I mean when I say that ##\hat{\rho}## is an "equivalence class of preparation procedures". It's an equivalence class, because very different equipment can result in the same "emanating beam".
The quantum state implies known values of all quantum expectations (N-point functions). This includes smeared field expectation values that are (for systems in local equilibrium) directly measurable without any statistics involved. It also includes probabilities for statistical measurements.
This I don't understand: A single measurement leads to some random result, but not the expectation value of these random results.
It takes a meaning independent of POVMs.

  • In classical mechanics, observables are the classical phase space variables ##p,q## and everything computable from them, in particular the kinetic and potential energy, forces, etc.
  • In quantum mechanics, observables are the quantum phase space variables ##\rho## (or its matrix elements) and everything computable from them, in particular the N-point functions of quantum field theory. For example, 2-point functions are often measurable through linear response theory.
Now I'm completely lost again. In the usual formalism the statistical operator refers to the quantum state and not to an observable. To determine a quantum state you need more than one measurement (of a complete set of compatible observables). See Ballentine's chapter (Sect. 8.2) on "state determination".
 
  • #21
A. Neumaier
Science Advisor
Insights Author
The point is the interpretation. In the latter formulation, that's precisely what I mean when I say that ##\hat{\rho}## is an "equivalence class of preparation procedures". It's an equivalence class, because very different equipment can result in the same "emanating beam".
It results in different emanating beams, though their properties are the same.

It's an equivalence class only in the same irrelevant sense as in the claim that ''momentum is an equivalence class of preparations of particles in a classical source''. Very different equipment can result in particles with the same momentum.

Using mathematical terminology to make such a simple thing complicated is quite unnecessary.
This I don't understand: A single measurement leads to some random result, but not the expectation value of these random results.
A single measurement of a system in local equilibrium leads to a fairly well-determined value for a current, say, and not to a random result.
Now I'm completely lost again.
Because my new approach goes beyond your minimal interpretation. You should perhaps first read the paper rather than base a discussion on just reading the summary exposition. There is a reason why I spent a lot of time giving detailed, physical arguments in the paper!
 
  • #22
A. Neumaier
Science Advisor
Insights Author
To determine a quantum state you need more than one measurement
Yes, that's what quantum tomography is about.

To accurately determine a momentum vector one also needs more than one measurement.

Thus I don't see why your comment affects any of my claims.
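For the simplest case, a qubit, the tomography idea can be sketched as follows (a standard textbook reconstruction, not the general scheme of the paper; the "unknown" state below is invented for illustration): the three Pauli expectations determine the state completely via ##\rho = \frac12(1 + r\cdot\sigma)##.

```python
import numpy as np

# Pauli matrices.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# An "unknown" qubit state (made up for illustration).
rho_true = np.array([[0.8, 0.1 - 0.2j], [0.1 + 0.2j, 0.2]])

# Tomography: measure (here: simply compute) the three Pauli expectations ...
r = [np.trace(rho_true @ s).real for s in (sx, sy, sz)]

# ... and reconstruct the state from them.
rho_rec = 0.5 * (np.eye(2) + r[0] * sx + r[1] * sy + r[2] * sz)
assert np.allclose(rho_rec, rho_true)  # three expectations fix the state
```

In a real experiment each of the three expectations is itself estimated from many repeated measurements, which is exactly why state determination always needs more than one measurement.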
 
  • #23
But I also have a deeper objection: the Everett interpretation takes quantum theory in its present form as the currency, in terms of which everything has to be explained or understood, leaving the act of observation as a mere secondary phenomenon. In my view we need to find a different outlook in which the primary concept is to make meaning out of observation and, from that derive the formalism of quantum theory.

So you think that the many-universes approach may still be useful?

Yes, I think one has to work both sides of the railroad track.

But in the meantime you're siding with Bohr.

Yes. As regards the really fundamental foundations of knowledge, I cannot believe that nature has 'built in', as if by a corps of Swiss watchmakers, any machinery, equation or mathematical formalism which rigidly relates physical events separated in time. Rather I believe that these events go together in a higgledy-piggledy fashion and that what seem to be precise equations emerge in every case in a statistical way from the physics of large numbers; quantum theory in particular seems to work like that.

But do you think that quantum theory could be just an approximate theory and that there could be a better theory?

First, let me say quantum theory in an every-day context is unshakeable, unchallengeable, undefeatable - it's battle tested. In that sense it's like the second law of thermodynamics which tells us that heat flows from hot to cold. This too is battle tested - unshakeable, unchallengeable, invincible. Yet we know that the second law of thermodynamics does not go back to any equations written down at the beginning of time, not to any 'built in' machinery - not to any corps of Swiss watchmakers - but rather to the combination of a very large number of events. It's in this sense that I feel that quantum theory likewise will some day be shown to depend on the mathematics of very large numbers. Even Einstein, who opposed quantum theory in so many ways, expressed the point of view that quantum theory would turn out to be like thermodynamics.
 
  • #24
A. Neumaier
Science Advisor
Insights Author
But I also have a deeper objection: the Everett interpretation takes quantum theory in its present form as the currency, in terms of which everything has to be explained or understood
Your deeper objection seems to have no substance that would allow one to make progress.

Whatever is taken as the currency in terms of which everything has to be explained or understood, it might be something effective due to an even deeper currency. We simply must start somewhere, and your deeper objection will always apply.

But according to current knowledge, quantum theory is a sufficient currency. Unlike in earlier ages, quantum theory explains the properties of physical reality (whatever it is, but in physics it certainly includes measurement devices!)

There are no experimental phenomena not accounted for by quantum physics, which can be taken to be the physics of the standard model plus modifications due to neutrino masses and semiclassical gravity, plus some version of Born's rule, plus all approximation schemes used to derive the remainder of physics. Thus everything beyond that is just speculation without experimental support.
 
  • #25
There are no experimental phenomena not accounted for by quantum physics,
Except maybe the mind-boggling need to reconcile gravity with quantum mechanics, both of which have solid experimental verification.
 
