Stochastic signal analysis in QM

In summary: The weak way is that we need signals in wires, fiber optics, exotic materials, etc. in order to build our experimental apparatus, so obviously measuring and recording any experimental result is going to involve all that stuff. The strong way is that, by involving all that stuff in our experimental apparatus, we are somehow producing phenomena that would not happen in its absence.
  • #1
Peter Morgan
Gold Member
[Moderator's note: This discussion has been spun off from another thread since it was getting into a more technical area only tangentially related to the original thread's topic. Some posts have been edited slightly for the new context.]

They're too technical for the question here, and only very indirectly connected to the OP's issue of collapse, but for Leon Cohen, there's Found. Phys. 18, 983 (1986), Proc. IEEE 77, 941 (1989), and his book Time-Frequency Analysis.

My published papers, FWIW, are Phys. Lett. A 321, 216–224 (2004); Phys. Lett. A 338, 8–12 (2005); J. Phys. A: Math. Gen. 39, 7441–7455 (2006); J. Math. Phys. 48, 122302 (2007); and EPL 87, 31002 (2009) (non-paywalled versions can be accessed on arXiv through an author search). I think of all except the first as about the relationship between quantum fields and random fields, the latter of which can be quite directly understood as a specific formalism for stochastic signal analysis. I would not think of them as articulating an interpretation, though a generous reader could take them to be moving towards one.
Also (somewhat deprecated by me), an invited chapter in R. Ablamowicz and P. Lounesto (eds.), Clifford Algebras and Spinor Structures, pp. 281–300 (which derived from my MSc thesis); an invited conference paper, AIP Conf. Proc. 889, 187 (2007); and a rather trivial comment, Eur. J. Phys. 32, L1–L2 (2011).
You'll see that there's a gap since 2009, which I think (or pretend to myself) is because I've been thinking hard about how to proceed in my research rather than just repeating what I did before.
 
Last edited by a moderator:
  • #2
Peter Morgan
I have been calling this general viewpoint a "stochastic signal analysis interpretation of quantum field theory". Part of the impetus is what I see as very much what the OP [of the original thread] mentioned, in this,

kw1 said:
Yes, we set up a detector to 'measure' some quantum phenomenon. But, particle or wave (or both), it would seem that photons, or whatever, are interacting with the world all the time and everywhere

which is why I responded to the OP [of the original thread] (there's much to like in the scare quotes around "measure" and in the "or whatever") that prima facie everything we know from modern experiments comes from analysis of signals in wires, fiber optics, and exotic materials (which are somehow coupled to whatever the local surroundings of the wire et alia may be, and also, moderating the OP's "all the time and everywhere", to a lesser extent to the wider surroundings; it used to be that we read meters and recorded the readings in lab notebooks). We present that information as a state (a normalized positive linear form) over an algebra of observables, which leads to the construction of a Hilbert space and a folium of states. That, I claim, is very close to what any sophisticated stochastic signal analysis does (indeed, in the relevant case of the electromagnetic field, it is arguably the same, but that is part of the content of my most recent paper on the arXiv, so not citable). The content of the Leon Cohen papers I cited above can be understood to be that negative values for Wigner functions are entirely natural in classical signal analysis, which in turn is, at least to a sophisticated reader, obviously because the algebra of observables of classical signal analysis is noncommutative.
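To make the Cohen point concrete, here is a toy numerical sketch of my own (assuming NumPy; the discrete Wigner-Ville convention below is one of several in the literature, and all the names are mine): the Wigner function of a perfectly classical two-tone signal goes negative, with no quantum mechanics anywhere in sight.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a complex signal x.
    Returns an (N, N) real array, time along axis 0, frequency along
    axis 1 (up to the usual factor-of-2 scaling of the discrete WVD,
    which doesn't matter for the point being made here)."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        half = min(n, N - 1 - n)           # largest usable half-lag at time n
        kernel = np.zeros(N, dtype=complex)
        for m in range(-half, half + 1):
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n, :] = np.fft.fft(kernel).real  # conjugate-symmetric kernel -> real
    return W

# A perfectly classical two-tone signal: its Wigner function develops
# oscillating cross terms between the tones, and those go negative.
t = np.arange(256)
x = np.exp(2j * np.pi * 0.1 * t) + np.exp(2j * np.pi * 0.3 * t)
W = wigner_ville(x)
print(W.min() < 0)   # True: negativity, classically
```

The negativity comes from the bilinear structure of the distribution (the cross terms between the two tones), not from anything quantum.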
 
Last edited by a moderator:
  • #3
PeterDonis
Peter Morgan said:
prima facie everything we know from modern experiments comes from analysis of signals in wires, fiber optics, and exotic materials (which are somehow coupled to whatever the local surroundings of the wire et alia may be, and also, moderating the OP's "all the time and everywhere", to a lesser extent to the wider surroundings

Ok, but there are two (actually three--see below :wink:) ways to interpret this. One, which I'll call the "weak" way, is the humdrum observation that we need signals in wires, fiber optics, exotic materials, etc. in order to build our experimental apparatus, so obviously measuring and recording any experimental result is going to involve all that stuff. I don't think anyone disputes this; in fact most people probably think it too obvious and mundane to be worth mentioning.

The other way, which I'll call the "strong" way, is that, by involving all that stuff in our experimental apparatus, we are somehow producing phenomena that would not happen in its absence. But even here there are two possible variants: the "weakly strong" variant, which says that we are "producing" these phenomena only because, as @hilbert2 noted in post #4, we do things to quantum systems in our experiments (like isolate them really, really well from their environment) that don't happen to them in "natural" circumstances; or the "really strong" variant, which says that the so-called "systems" we say we are experimenting on (electrons, photons, etc.) actually have nothing significant to do with the phenomena we measure and record--it's all in the signals in wires, fiber optics, exotic materials, etc. In other words, the experimental apparatus is not just measuring something that happens to another system; it is creating the measured phenomena itself.

I can't tell which of these variants you are arguing for. Can you elucidate?
 
  • #4
Peter Morgan
PeterDonis said:
I can't tell which of these variants you are arguing for. Can you elucidate?
Nice. I can try. The "too obvious and mundane to mention" is where (my) thinking begins. What's interesting to me about the "weak" way is that it perhaps really is agreeable to all, whether wearing a quantum or a classical hat. But then we get into how we analyze those signals (elsewhere I've pointed out that we could record the currents in every signal wire every picosecond, say, but we instead lossily compress that huge amount of information as, for example, the times when the current in any signal wire transitions from near-zero to definitely non-zero: the times of events). This seems to single out the question to ask as "what is different between classical signal analysis and quantum signal analysis?" I think this question has to be asked, which I think puts me not in the "weak" camp. For about the last 15–20 years, I've phrased that question as seeking to characterize the relationship between quantum fields and random fields. Most recently, the mathematics has become relatively clear: the relationship is close to, but not quite, isomorphism (in a clear sense, but I can't briefly describe it here).
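As a toy version of that lossy compression (my own sketch, with made-up numbers; assuming NumPy): a finely sampled signal-wire current is reduced to the list of times at which it crosses a detection threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-12                                  # sample every picosecond, as above
t = np.arange(200_000) * dt
current = rng.normal(0.0, 0.05, t.size)     # baseline noise in the wire
current[50_000:50_040] += 1.0               # one pulse: "an event"...
current[140_000:140_040] += 1.0             # ...and another

threshold = 0.5                             # "definitely non-zero current"
above = current > threshold
edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # rising-edge samples
event_times = t[edges]
print(event_times)    # two numbers survive out of 200,000 samples
```

Everything except the two event times is thrown away, which is the sense in which the compression is lossy.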
One key to this is the Koopman-von Neumann approach, which presents classical mechanics in a formalism in which an algebra of operators acts on a Hilbert space (B. O. Koopman, Proc. Nat. Acad. Sci. 17, 315 (1931); J. von Neumann, Ann. Math. 33, 587 (1932); the latter is in German; if you know of an English translation, please let me know), the point being that the algebra of operators generated by the Poisson bracket in that approach is noncommutative. This point is made in quite a few recent published papers (here I'll copy-and-paste from my arXiv paper's bibliography): V. Kisil, Geometry, Integrability and Quantization 18, 11 (2017) (arXiv:1611.05650v1 [math-ph]); D. Mauro, Phys. Lett. A 315, 28 (2003); M. de Gosson, "Symplectic Geometry and Quantum Mechanics" (Birkhäuser, Basel, 2006); M. A. de Gosson and B. J. Hiley, Found. Phys. 41, 1415 (2011). There's another, very recent paper, by Federico Zalamea, that I would like to link to but won't, because it's arXiv-only as of now (it would be published if I were the referee).
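For anyone who hasn't met Koopman-von Neumann: a bare-bones sketch, standard material rather than anything specific to my paper. A classical state becomes a wavefunction ##\psi(q,p)## on phase space, with ##|\psi|^2## the Liouville density, and the dynamics is generated by the Liouvillian
$$\hat L = -\mathrm{i}\{H,\cdot\} = -\mathrm{i}\left(\frac{\partial H}{\partial p}\frac{\partial}{\partial q}-\frac{\partial H}{\partial q}\frac{\partial}{\partial p}\right),\qquad \mathrm{i}\frac{\partial\psi}{\partial t}=\hat L\psi,$$
so that ##|\psi|^2## obeys the classical Liouville equation. The noncommutativity referred to above: the multiplication operators ##\hat q## and ##\hat p## commute with each other, but neither commutes with the derivative operators ##\hat\lambda_q=-\mathrm{i}\partial_q## and ##\hat\lambda_p=-\mathrm{i}\partial_p## out of which ##\hat L## is built; for example, ##[\hat q,\hat\lambda_q]=\mathrm{i}##.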

I definitely don't think in terms of the "really strong" variant. The signal we measure exhibits different statistics if we put the apparatus in a dark room, giving us what is commonly called "the dark rate"; or if, speaking colloquially, we turn on a light in the room that contains the apparatus; or if we make any other substantive change. The statistics change whenever we change the environment. Then it's back to asking what can be causing that change. I'm open to being really positivistic about what that whatever is, saying there's nothing there (because we can't, by definition, measure what is outside our measurement apparatus), but I don't generally think that way.

I'm not really a "weakly strong" person, either, though that seems closest. The random field can be interpreted in terms of classical probability, of which I prefer an ensemble or Bayesian interpretation, because I think those are what physicists do. If we're thinking in old-school classical-mechanical terms, we could measure the random field without affecting the random field state. A Koopman-von Neumann Hilbert-space presentation of classical mechanics, however, allows us to model noncommuting measurements, such as considering results in the Fourier basis instead of in the position basis, so that the state in that Hilbert-space formalism is arguably changed by measurements, with all the same discussions and worries one sees in QM/QFT.
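A throwaway numerical illustration of that Fourier-basis noncommutativity (again my own toy, assuming NumPy): implement ##\hat q## as multiplication and ##\hat\lambda_q=-\mathrm{i}\,\partial_q## spectrally, and check ##[\hat q,\hat\lambda_q]=\mathrm{i}## on a smooth classical amplitude.

```python
import numpy as np

# Grid on configuration space
N, L = 1024, 20.0
dq = L / N
q = (np.arange(N) - N // 2) * dq
k = 2 * np.pi * np.fft.fftfreq(N, d=dq)

def Q(f):
    """Multiplication operator q-hat."""
    return q * f

def Lam(f):
    """lambda_q = -i d/dq, computed spectrally: -i * ifft(i k fft(f))."""
    return np.fft.ifft(k * np.fft.fft(f))

psi = np.exp(-q**2).astype(complex)    # a smooth classical amplitude

commutator = Q(Lam(psi)) - Lam(Q(psi))
print(np.allclose(commutator, 1j * psi, atol=1e-8))   # True: [q, lambda_q] = i
```

Nothing quantum was assumed; the noncommutativity is entirely a property of the Fourier transform on a classical space of functions.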
I'm throwing this discussion together in response to you. I love doing this: every time I discuss it, different aspects come to the fore, my mind changes in small or large ways, and it's enormously productive for me. It's not unconsidered, but it is research, if you like, in which case I shouldn't be doing it here; yet it's all standing, however precariously, on the shoulders of the literature of the last 90 years. Anyway, my arXiv paper is more considered and, I hope, more easily understood; I've acknowledged half a dozen people, a few of them decent mathematical physicists, who have made comments that have changed the text in nontrivial ways, and so far no-one has suggested that the math needs to change in any way. If you're willing to really go at this, however, I should post a copy of the paper as I now have it, because the current arXiv v3 has no mention of Koopman-von Neumann.
 
Last edited by a moderator:
  • #5
PeterDonis
Peter Morgan said:
I should post a copy of the paper as I now have it

Can you post an update to arxiv? That would be easiest.
 
  • #6
atyy
Peter Morgan said:
They're too technical for the question here, and only very indirectly connected to the OP's issue of collapse, but for Leon Cohen, there's Found. Phys. 18, 983 (1986), Proc. IEEE 77, 941 (1989), and his book Time-Frequency Analysis.

That's interesting; I didn't know Cohen wrote a QM paper. I am familiar with his book on signal analysis, and did know that he mentions that things like the Wigner (quasi)distribution are also used in QM.
 
  • #7
Peter Morgan
PeterDonis said:
Can you post an update to arxiv? That would be easiest.
I had been holding off posting a v4 until I submitted to JMathPhys, which I had been holding off doing until the paper was perfect. I think it is; at any rate it'll have to be ready enough, because I plan to do everything tomorrow. Hey ho.
atyy said:
That's interesting; I didn't know Cohen wrote a QM paper. I am familiar with his book on signal analysis, and did know that he mentions that things like the Wigner (quasi)distribution are also used in QM.
The Found. Phys. 18, 983 (1986) paper in many ways set me off on the road I've been on since the mid-90s, so I have a very soft spot for Leon Cohen. Most of his work is about deterministic classical signal analysis, AFAIK, but this one paper is stochastic. I find that paper hard to cite, however, because its math is kinda low-level; looked at generously, though, I think of it as quite closely related to the GNS construction, or even as an elementary introduction to why you might want to use the GNS construction.
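Since the GNS construction keeps coming up, here it is in one breath (standard textbook material, nothing specific to Cohen or to me): given a state ##\omega##, i.e. a normalized positive linear form on a *-algebra ##\mathcal{A}## of observables, define
$$\langle a,b\rangle=\omega(a^*b),$$
which is a pre-inner product on ##\mathcal{A}##; quotient by the null space ##\{a\in\mathcal{A}:\omega(a^*a)=0\}## and complete to obtain a Hilbert space ##\mathcal{H}_\omega##. The algebra acts by left multiplication, ##\pi_\omega(a)[b]=[ab]##, and ##\Omega=[1]## is a cyclic vector with ##\omega(a)=\langle\Omega,\pi_\omega(a)\Omega\rangle##. That's the "state over an algebra of observables leads to the construction of a Hilbert space" of post #2.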
 
  • #8
Peter Morgan
v4 of arXiv:1709.06711 is now on the arXiv, and it's submitted to JMathPhys, so now is the time to point out where the nonsense is. The math is OK, I'm pretty sure, because it's elementary, but where the math takes our interpretation of QFT is less sure.

I don't make a connection to de Broglie-Bohm in the paper, but there is a weak similarity, in that the math of §II and of Appendix B also points out a hidden structure: a Hilbert-space presentation of a random field over Minkowski space that is present for quantum fields that have a complex structure (complex scalar fields and, using the Hodge operator, the electromagnetic field). That random field allows a manifestly Lorentz-invariant classical stochastic-signal-analysis interpretation of those quantum fields as random fields. One of the major failings of the de Broglie-Bohm interpretation, the move into the math of the Hamilton-Jacobi formalism (which compares rather unfavorably with the math of Hilbert spaces), is avoided, and Lorentz invariance is preserved, so that's good too. Indeed, the similarity to de Broglie-Bohm is weak enough that it's hardly worth mentioning.

Also unmentioned in the paper is that in the last few weeks I've been playing with the idea that it's rhetorically useful to say that working with random fields makes complementarity somewhat evaporate: wave/field duality doesn't have quite the same sense of opposites as wave/particle duality. I'm still unclear whether this is just too cute. Perhaps it's also misleading, because of course there are still operators that don't commute; but in a random-field setting we can attribute that noncommutativity entirely to the Fourier transform in the elementary case, just as in the Leon Cohen papers on time-frequency analysis, or to any coordinate transform that is generated by the Poisson bracket in the general case (which, as a friend on Facebook loves to tell me, leads us to Pontryagin duality, where I'm somewhat out of my mathematical depth; take your pick from https://www.google.com/search?q=Pontryagin+duality).

The final minus sign (the ##-k## in the last term) of the ##{}^\bullet## involution for EM-field bivector test functions,
$${}^\bullet:\mathcal{F}\rightarrow\mathcal{F};f\mapsto f^\bullet,\qquad\widetilde{f^\bullet}(k)=\frac{1}{2}(1+\mathrm{i}\star)\tilde f(k)+\frac{1}{2}(1-\mathrm{i}\star)\tilde f(-k),$$
which is most of the instrument for the connection between quantum and random fields, seems likely to be the reason no-one has thought of doing this: for quantum fields there is a justifiable focus, because of the correspondence principle, on the spectrum condition. The literature is nonetheless full of approaches that are somewhat like this, once you start looking. Critically, for random fields the Hamiltonian function has to be bounded below, but the Hamiltonian operator that is generated by the Poisson bracket is unbounded.
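As a sanity check that ##{}^\bullet## really is an involution, here is a two-line verification of my own (the only input is that ##\star^2=-1## on bivectors in Lorentzian signature, so that ##(\mathrm{i}\star)^2=1##):
$$(1\pm\mathrm{i}\star)^2=2(1\pm\mathrm{i}\star),\qquad(1+\mathrm{i}\star)(1-\mathrm{i}\star)=0,$$
so the cross terms vanish when ##{}^\bullet## is applied twice, leaving
$$\widetilde{f^{\bullet\bullet}}(k)=\tfrac{1}{4}(1+\mathrm{i}\star)^2\tilde f(k)+\tfrac{1}{4}(1-\mathrm{i}\star)^2\tilde f(k)=\tilde f(k).$$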
 
Last edited:

