The collapse of a quantum state as a joint probability construction

  • #1
Peter Morgan
TL;DR Summary
This is an invitation to discuss how well the ideas in "The collapse of a quantum state as a joint probability construction", in JPhysA 2022, allow us to rethink the measurement problem.
The titular paper can be found here, https://doi.org/10.1088/1751-8121/ac6f2f, and on arXiv as https://arxiv.org/abs/2101.10931 (which is paginated differently, but the text, equation numbers, and section numbers are the same). Please see the abstract, but in part this 24-page paper argues that we can understand the relationship between classical and quantum physics as being about how we systematically model joint and incompatible probability measures.
There is also a suggested introduction of a "super-Heisenberg picture", in which both unitary evolution and collapse are absorbed into the measurement operators, with the state time-invariant. This is in contrast to the Schrödinger picture (in which both unitary evolution and collapse are applied to the state, with measurements time-invariant) and the Heisenberg picture (in which unitary evolution is applied to the measurements and collapse is applied to the state). The super-Heisenberg picture can be understood as a largely mathematical no-collapse approach that introduces less in the way of metaphysical claims about the mathematics.
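To make the bookkeeping concrete, here is a minimal numerical sketch (my own toy example, not taken from the paper) showing that for a single measurement the Schrödinger, Heisenberg, and absorbed-operator bookkeeping all give the same probability; the distinctive content of the super-Heisenberg picture only appears once collapse for a sequence of measurements is also absorbed into the operators, as discussed below.

```python
import numpy as np

# Toy model: one qubit, unitary evolution U from t0 to t1, then a projective
# measurement P at t1.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)   # initial density matrix
theta = 0.37
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)  # unitary evolution
P = np.array([[1, 0], [0, 0]], dtype=complex)              # projector for one outcome

# Schrödinger picture: evolve the state, measure the fixed projector.
p_schrodinger = np.trace(U @ rho @ U.conj().T @ P).real

# Heisenberg picture: fixed state, evolve the measurement operator.
p_heisenberg = np.trace(rho @ U.conj().T @ P @ U).real

# Absorbed-operator bookkeeping: put the evolution (and, for sequences, the
# collapse) into a single operator C, with probability rho(C† C) and a
# time-invariant state.
C = P @ U
p_absorbed = np.trace(C.conj().T @ C @ rho).real

print(p_schrodinger, p_heisenberg, p_absorbed)   # all three agree
```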
For a background to this paper that some people have found helpful, I attach a PDF of a talk I gave at a workshop a month ago at INFN Frascati.
 

Attachments

  • Frascati2022-AsGiven.pdf
  • #2
Morbert
An interesting paper. I see echoes of some consistent histories there too, which also concerns itself with sequences of measurements. E.g., consider the lead-up to equation (9). We can start with a set of histories of joint measurements ##\{C_{ij}\}##, where ##C_{ij} = \hat{R}_{ij}(t_3) \hat{P}_j^{(B)}(t_2)\hat{P}_i^{(A)}(t_1)## and ##\hat{R}_{ij}## is the projector onto the pointer subspace corresponding to the record of the joint measurement ##(\alpha_i, \beta_j)##. Then, as per the paper, we can generate the probability density $$p(u,v) = \sum_{i,j}\delta(u-\alpha_i)\delta(v-\beta_j)\rho(C^\dagger_{ij}C_{ij}),$$ which is quite similar to (9).
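A quick numerical check of that construction, as a toy example of my own (two projective measurements on a qubit, with the record projector ##\hat{R}_{ij}## dropped for brevity): the weights ##\rho(C^\dagger_{ij}C_{ij})## form an ordinary joint distribution over the records ##(\alpha_i, \beta_j)##, and the delta functions just place those weights on the measured values.

```python
import numpy as np

# Sequential measurement of A at t1 then B at t2 on one qubit, with
# C_ij = P_j^B(t2) P_i^A(t1) and Heisenberg-picture projectors P(t) = U† P U.
rho = np.array([[0.6, 0.25], [0.25, 0.4]], dtype=complex)  # initial density matrix

def rot(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]], dtype=complex)

U1 = rot(0.4)          # evolution from t0 to t1
U2 = rot(0.4 + 1.1)    # evolution from t0 to t2

def heisenberg(P, U):
    return U.conj().T @ P @ U

# A measured in the computational basis, B in the +/- basis.
PA = [np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)]
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
PB = [np.outer(plus, plus.conj()), np.outer(minus, minus.conj())]

p = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        C = heisenberg(PB[j], U2) @ heisenberg(PA[i], U1)
        p[i, j] = np.trace(C.conj().T @ C @ rho).real

print(p)          # joint distribution over the records (alpha_i, beta_j)
print(p.sum())    # sums to 1
```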
 
  • #3
Thanks, @Morbert. Certainly I, and others, have noticed similarities between what I'm currently developing and the consistent histories interpretation, which was already quite well developed by 1990, say. I see many small differences, but because my ideas are currently spread across a number of articles and I can't claim to have a fully developed interpretation, it's not easy to point them out. I think Generalized Probability Theory and the mathematics of measurement theory have developed considerably since 1990, and Koopman's Hilbert space formalism for Classical Mechanics gives us opportunities for comparing classical and quantum mechanics in the same Hilbert space formalism that have only come out into the open in the last few years. I'm trying to leverage that more recent mathematics without being constrained too much by the consistent histories or other interpretations.
[Koopman's formalism dates from 1931, was immediately used by von Neumann and Birkhoff to prove versions of the ergodic theorem, then was almost forgotten until Sudarshan noted in 1976 that it's useful for chaos theory. As far as I know, however, the wider implication that it is classically natural to take the classical measurement theory to be noncommutative, ensuring that the Poisson bracket is an integral part of classical mechanics instead of an afterthought, was not noted until my paper in Annals of Physics 2020, "An algebraic approach to Koopman classical mechanics", https://doi.org/10.1016/j.aop.2020.168090, on arXiv as https://arxiv.org/abs/1901.00526.]

The equation you point out is indeed a very reasonable generalization of my (9), which I think of as just a particular presentation of the standard Lüders transformer. Because we have to construct joint probabilities to model a series of consecutive measurements (using my (9), your generalization, or ...), we can introduce a different state and different, commuting operators that model the same series of consecutive measurements, giving us a robust classical model for any experiment. As far as I know that realization is new, but I think it is also very old: the super-Heisenberg picture, as I call it, is just a mathematical way of doing what the Copenhagen interpretation told us we must do: describe our experiments classically and then construct a quantum mechanical model. In the super-Heisenberg picture everything is just ordinary probability, but it is clearly classically useful to introduce Generalized Probability Theory to accommodate different experimental contexts, which makes it natural to transform to the Schrödinger or Heisenberg or other pictures.
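To illustrate that last point in the most bare-bones way I can (placeholder numbers, and not the specific construction in the paper): once the joint distribution over the records is in hand, a diagonal state and commuting projectors reproduce exactly the same statistics.

```python
import numpy as np

# Take p to be a joint distribution over records (alpha_i, beta_j), e.g. as
# computed in the sketch in #2 above; placeholder numbers here.
p = np.array([[0.30, 0.12], [0.25, 0.33]])

# A commuting-operator model on C^4: a diagonal state indexed by (i, j), and
# diagonal projectors answering "was A's record i?" and "was B's record j?".
rho_cl = np.diag(p.flatten())
QA = [np.diag([1., 1., 0., 0.]), np.diag([0., 0., 1., 1.])]
QB = [np.diag([1., 0., 1., 0.]), np.diag([0., 1., 0., 1.])]

for i in range(2):
    for j in range(2):
        assert np.isclose(np.trace(rho_cl @ QA[i] @ QB[j]), p[i, j])
print("the commuting model reproduces the joint record statistics exactly")
```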

I hope it's clear that I'm trying to leverage the mathematics as far as possible, hopefully without making too many mistakes or missteps that can't be fixed. Insofar as any established interpretation is a reasonable mathematical development of the generalized probability theory that the Hilbert space and operators formalism gives us, what I'm doing should be analogous to other interpretations. I hope! So yes, there are similarities with consistent histories but there are also similarities with Copenhagen and others. I'm attempting something a little different, however, by introducing a new picture, not a new interpretation, thereby coming quite close, I think, to unifying classical and quantum pictures. This is not a solution of the measurement problem, but I think it gives us a mostly new way to rethink it.
 
  • #4
Maarten Havinga
What are the drawbacks of consistent histories? If you started from trying to improve that interpretation, which of its problems are you trying to solve? Or if not, what details of the consistent histories interpretation do you not like?

I'm just wondering. As you mention, an analysis is better than an interpretation. Any other improvements?
 
  • #5
Peter Morgan
Maarten Havinga said:
What are the drawbacks of consistent histories? If you started from trying to improve that interpretation, which of its problems are you trying to solve? Or if not, what details of the consistent histories interpretation do you not like?

I'm just wondering. As you mention, an analysis is better than an interpretation. Any other improvements?
As with any reasonable interpretation of QM, I think consistent histories is OK, because it's careful in its use of the mathematics of QM. I suspect it hasn't had as much impact as other interpretations partly because the mathematics it introduces is quite elaborate. In common with other interpretations of QM, however, I think it tries to make contact with Classical Mechanics, CM, as it existed in 1926. As I put it in the paper, this is to straw-man CM. We should be able to describe experimental results obtained in different experimental contexts systematically within a more general CM, which I describe and call CM+ in "An algebraic approach to Koopman Classical Mechanics", linked in my previous reply. We could say that I'm moving the goalposts.
What stays the same is that CM+ has the same measurement theory as QM, so that most of the reasons why we can't use CM do not apply to CM+. If CM+ were exactly the same as QM, I think CM+ would not be interesting, but it isn't the same and I think we can learn something from it.

I think it's a crucial part of the analysis that once we have CM+ in hand, we also have to understand what the difference is between quantum and thermal fluctuations, which clearly can't be the same. Quantum Field Theory fortunately gives us the necessary clue, which is that the vacuum state, which I take to be all about quantum fluctuations, is Poincaré invariant, whereas thermal fluctuations in QFT are not. This is a difference that is entirely understandable in classical physics, but it is only understandable if we work in at least 1+1 dimensions, because only then can we define the Poincaré group. That means, I think, that we have to work with a field theory: a particle property theory will not work. That is a huge change, but we've become used to saying that our best theory is a Quantum Field Theory, so I think it is not an insurmountable change. [If you're OK watching video (2× is your friend), you could watch how I try to make sense of the violation of Bell inequalities in terms of fields at about the ten minute mark in a talk I gave at IQOQI Vienna in March 2021 (for about 6-7 minutes). The more recent PDF I include above is for a shorter talk, in which I didn't have time to elaborate as much on the field aspect.]
The difference between quantum and thermal noise is quite similar to the difference between the spectrum of white noise and the spectrum of red noise, say, except that instead of being only about the frequency distribution of the noise in the time dimension, it's about the wave-number distribution in both the space and time dimensions.
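As a crude, time-dimension-only illustration of that analogy (nothing quantum here, just a numerical reminder of what the two spectra look like): white noise has a flat power spectrum, while red (Brownian) noise falls off as ##1/f^2##.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2 ** 14
white = rng.normal(size=n)        # white noise: flat spectrum
red = np.cumsum(white)            # red (Brownian) noise: integrated white noise

freqs = np.fft.rfftfreq(n, d=1.0)[1:]          # drop the zero-frequency bin
S_white = np.abs(np.fft.rfft(white))[1:] ** 2  # periodogram estimates
S_red = np.abs(np.fft.rfft(red))[1:] ** 2

# Spectral slopes on a log-log scale: about 0 for white noise, about -2 for red.
slope_white = np.polyfit(np.log(freqs), np.log(S_white), 1)[0]
slope_red = np.polyfit(np.log(freqs), np.log(S_red), 1)[0]
print(round(slope_white, 2), round(slope_red, 2))
```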

If I have this more-or-less right, I think the ideas should make it into the mainstream over the next ten years. Obviously I'm trying to persuade serious people to take the ideas seriously, but my writing is not very clear and I'm definitely coming out of left field, so it will be slow. I have to admit what must be clear to everybody: I'm an enthusiast for all this, so I may have so much wrong that nobody will ever take my work seriously. However, I believe that classical and quantum physics have been so clearly converging in the physics literature of the last 20 years that someone else will tell the story better than I can soon enough.
 
