How Does Quantum Mechanics Relate to Quantum Field Theory in Particle Physics?

Summary
Quantum Field Theory (QFT) is fundamentally based on Quantum Mechanics (QM), with QFT serving as an application of QM principles to fields rather than particles. The relationship between QFT and QM is complex, as QFT involves quantizing fields and using mathematical tools like Feynman diagrams and Green functions, which are also applicable in QM. While QFT textbooks often focus on mathematical developments without explicitly discussing QM, foundational concepts like the path integral and commutation relations are rooted in QM. The discussion highlights that the axioms of QM do not directly translate to relativistic QFT, as the latter does not accommodate classical measurement processes. Overall, both theories share a common mathematical structure but differ significantly in their treatment and implications.
  • #151
vanhees71 said:
Of course, we all know, how it is defined,
$$\langle E \rangle=\langle \psi|\hat{H}|\psi \rangle=\int_{\mathbb{R}^3} \mathrm{d}^3 \vec{x}\, \psi^*(\vec{x}) \left [-\frac{\Delta}{2m}+V(\vec{x}) \right ] \psi(\vec{x}).$$
This is unambiguously defined in the mathematical foundations and has nothing to do with any "interpretation weirdness".
The question was not how it is defined (which is part of the QM calculus) but why Born's rule which just says that ##|\psi(x)|^2## is the probability density of ##x## implies that the right hand side is the expected measurement value of ##H##. It is used everywhere but is derived nowhere, it seems to me.
 
  • #152
That's the strength of Dirac's formulation compared to the wave-mechanics approach. A pure state is represented by a normalized state vector ##|\psi \rangle## (more precisely the ray, but that's irrelevant for this debate). Then ##\hat{H}## has a complete set of (generalized) eigenvectors ##|E \rangle## (let's for simplicity assume that the Hamiltonian is non-degenerate). Then the probability that a system prepared in this state has energy ##E## is, according to the Born rule, given by
$$P(E)=|\langle E|\psi \rangle|^2,$$
and thus
$$\langle E \rangle = \sum_E P(E) E=\sum_E \langle \psi|E \rangle \langle E|\hat{H} \psi \rangle = \langle \psi|\hat{H} \psi \rangle.$$
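The chain of equalities above can be sanity-checked numerically in a finite-dimensional sketch (a stand-in for the actual infinite-dimensional Hilbert space; the matrix size, seed, and state below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random Hermitian "Hamiltonian" and a normalized state vector |psi>
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)

# Spectral decomposition: eigenvalues E_n, eigenvectors |E_n> (columns of U)
E, U = np.linalg.eigh(H)

# Born rule: P(E_n) = |<E_n|psi>|^2
P = np.abs(U.conj().T @ psi) ** 2

# sum_E P(E) E  versus  <psi|H|psi>
lhs = np.sum(P * E)
rhs = np.real(psi.conj() @ H @ psi)
assert np.isclose(lhs, rhs)
assert np.isclose(P.sum(), 1.0)
```

The assertion verifies that the Born-rule expectation ##\sum_E P(E) E## equals the matrix element ##\langle \psi|\hat{H}|\psi \rangle## without any reference to a particular basis.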
The latter expression can now be written in any other representation you like. In the position representation you have, e.g.,
$$\langle E \rangle = \int \mathrm{d}^3 \vec{x}_1 \int \mathrm{d}^3 \vec{x}_2 \langle \psi |\vec{x}_1 \rangle \langle \vec{x}_1|\hat{H}|\vec{x}_2 \rangle \langle \vec{x}_2 |\psi \rangle=\int \mathrm{d}^3 \vec{x}_1 \int \mathrm{d}^3 \vec{x}_2\, \psi^*(\vec{x}_1) H(\vec{x}_1,\vec{x}_2) \psi(\vec{x}_2).$$
Now you only have to calculate the matrix element. For the potential it's very simple:
$$V(\vec{x}_1,\vec{x}_2)=\langle \vec{x}_1|V(\hat{\vec{x}})|\vec{x}_2 \rangle=V(\vec{x}_2) \delta^{(3)}(\vec{x}_1-\vec{x}_2).$$
For the kinetic part, it's a bit more complicated, but also derivable from the Heisenberg algebra of position and momentum operators.

The first step is to prove
$$\langle \vec{x}|\vec{p} \rangle=\frac{1}{(2 \pi)^{3/2}} \exp(\mathrm{i} \vec{x} \cdot \vec{p}).$$
For simplicity I do this only for the 1-component of position and momentum. That the simultaneous generalized eigenvector of all three momentum components factorizes is clear.

Since ##\hat{p}## is the generator of spatial translations, it's intuitive to look at the operator
$$\hat{X}(\xi)=\exp(\mathrm{i} \xi \hat{p}) \hat{x} \exp(-\mathrm{i} \xi \hat{p}).$$
Taking the derivative wrt. ##\xi## it follows
$$\frac{\mathrm{d}}{\mathrm{d} \xi} \hat{X}(\xi)=-\mathrm{i} \exp(\mathrm{i} \xi \hat{p}) [\hat{x},\hat{p}] \exp(-\mathrm{i} \xi \hat{p}).$$
From the Heisenberg commutation relations this gives
$$\frac{\mathrm{d}}{\mathrm{d} \xi} \hat{X}=\hat{1} \; \Rightarrow \; \hat{X}(\xi)=\hat{x}+\xi \hat{1}.$$
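The resulting operator identity ##e^{\mathrm{i} \xi \hat{p}} \hat{x}\, e^{-\mathrm{i} \xi \hat{p}} = \hat{x} + \xi## can be checked on a discretized line, using the fact that ##e^{-\mathrm{i} \xi \hat{p}}## acts as a translation by ##\xi## (a sketch; the grid parameters and Gaussian test function are arbitrary choices, and ##\xi## is taken as an integer multiple of the spacing so translation is exact):

```python
import numpy as np

# Uniform grid; the test function is supported well away from the edges,
# so circular shifts (np.roll) realize exact translations.
N, L = 512, 20.0
dx = L / N
x = (np.arange(N) - N // 2) * dx
n_shift = 25
xi = n_shift * dx

psi = np.exp(-x**2)                      # Gaussian test function

# exp(-i xi p) psi(x) = psi(x - xi): roll right by n_shift
psi_t = np.roll(psi, n_shift)
# multiply by x, then translate back: exp(+i xi p) f(x) = f(x + xi)
result = np.roll(x * psi_t, -n_shift)

# Compare with (x + xi) psi(x), i.e. (x-hat + xi) acting on psi
assert np.allclose(result, (x + xi) * psi)
```

The check applies ##\hat{x}## in the translated frame and confirms it acts as ##\hat{x}+\xi## in the original frame, which is exactly the statement derived from the Heisenberg commutator.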
So we have
$$\hat{x} \exp(-\mathrm{i} \xi \hat{p}) |x=0 \rangle=\exp(-\mathrm{i} \xi \hat{p}) \hat{X}(\xi) |x=0 \rangle=\xi \exp(-\mathrm{i} \xi \hat{p}) |x=0 \rangle.$$
Then you have
$$\langle x|p \rangle=\langle \exp(-\mathrm{i} x \hat{p}) x=0|p \rangle=\langle x=0|p \rangle \exp(+\mathrm{i} p x)=N_p \exp(\mathrm{i} p x).$$
The constant ##N_p## is determined by the normalization of the momentum eigenstate as
$$\langle p|p' \rangle=\delta(p-p')=\int \mathrm{d} x \langle p|x \rangle \langle x|p' \rangle=\int \mathrm{d} x N_{p}^* N_p \exp[\mathrm{i}x(p'-p)]=2 \pi |N_{p}|^2 \delta(p-p') \; \Rightarrow \; N_{p}=\frac{1}{\sqrt{2 \pi}}.$$
Of course, the choice of phase is arbitrary.
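The normalization just fixed has a clean discrete counterpart: on a periodic grid of ##N## points the plane waves ##\langle x_j|p_k \rangle = e^{2\pi \mathrm{i} jk/N}/\sqrt{N}## form an orthonormal set, the finite-size analogue of ##\int \mathrm{d} x \, \langle p|x \rangle \langle x|p' \rangle = \delta(p-p')## (a sketch; the grid size is an arbitrary choice):

```python
import numpy as np

N = 64
idx = np.arange(N)

# F[j, k] = <x_j | p_k> = exp(2*pi*i*j*k/N) / sqrt(N)
F = np.exp(2 * np.pi * 1j * np.outer(idx, idx) / N) / np.sqrt(N)

# Orthonormality: sum_x <p|x><x|p'> = delta_{p p'}
G = F.conj().T @ F
assert np.allclose(G, np.eye(N))
```

The ##1/\sqrt{N}## here plays the role of the ##1/\sqrt{2\pi}## fixed above; in the continuum limit the Kronecker delta becomes the Dirac delta.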

Now we can also evaluate the expectation value of kinetic energy easily
$$\left \langle \frac{\vec{p}^2}{2m} \right \rangle=\int \mathrm{d}^3 \vec{x} \, \mathrm{d}^3 \vec{p} \, \frac{p^2}{2m} \langle \psi|\vec{p} \rangle \langle \vec{p}| \vec{x} \rangle \langle \vec{x} |\psi \rangle=\int \mathrm{d}^3 \vec{x} \, \mathrm{d}^3 \vec{p} \, \frac{p^2}{2m} \frac{1}{(2 \pi)^{3/2}} \exp(-\mathrm{i} \vec{p} \cdot \vec{x}) \langle \psi|\vec{p} \rangle \psi(\vec{x})= \int \mathrm{d}^3 \vec{x} \, \mathrm{d}^3 \vec{p} \left [-\frac{\Delta}{2m} \frac{1}{(2 \pi)^{3/2}} \exp(-\mathrm{i} \vec{p} \cdot \vec{x}) \right ] \langle \psi|\vec{p} \rangle \psi(\vec{x}) = \int \mathrm{d}^3 \vec{x} \int \mathrm{d}^3 \vec{p} \, \langle \psi|\vec{p} \rangle \langle \vec{p}|\vec{x} \rangle \left (-\frac{\Delta}{2m} \right) \psi(\vec{x}) = \int \mathrm{d}^3 \vec{x} \, \psi^*(\vec{x}) \left (-\frac{\Delta}{2m} \right) \psi(\vec{x}).$$
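The equality of the momentum-space and position-space forms of the kinetic-energy expectation can be verified numerically in a 1D sketch for a Gaussian wave packet (the grid, mass, and packet width are arbitrary choices; the FFT conventions follow numpy's):

```python
import numpy as np

# 1D check: <p^2/2m> computed in momentum space must match the
# position-space expression  integral psi* (-psi''/(2m)) dx.
m = 1.0
N, L = 1024, 40.0
dx = L / N
x = (np.arange(N) - N // 2) * dx
psi = np.pi ** -0.25 * np.exp(-x**2 / 2)          # normalized Gaussian

# Momentum grid and <p|psi> (up to an irrelevant phase)
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
dp = 2 * np.pi / L

E_mom = np.sum(np.abs(phi) ** 2 * p**2 / (2 * m)) * dp

# Position space: second derivative by central differences
d2psi = (np.roll(psi, -1) - 2 * psi + np.roll(psi, 1)) / dx**2
E_pos = np.sum(psi * (-d2psi) / (2 * m)) * dx

assert np.isclose(E_mom, E_pos, rtol=1e-3)
```

For this packet the analytic value is ##\langle p^2/2m \rangle = 1/4## (in units ##\hbar = 1##); the small tolerance accounts for the finite-difference error in the position-space evaluation.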
So it's not just written down but derived from the fundamental postulates plus the specific realization of a quantum theory based on the Heisenberg algebra. To derive the latter from the Galilei group alone is a bit more lengthy. See Ballentine, Quantum Mechanics, for that issue (or my QM 2 lecture notes, which, however, are in German only: http://fias.uni-frankfurt.de/~hees/publ/hqm.pdf ).
 
  • #153
vanhees71 said:
That's the strength of Dirac's formulation compared to the wave-mechanics approach.[...] So it's not just written down but derived from the fundamental postulates
From Dirac's postulates (and only if ##H## has no continuous spectrum) but not from Born's. Are Dirac's postulates somewhere available online?
 
  • #154
But there is only one set of postulates in the standard (aka Dirac-von Neumann) formulation. The statistical postulate is:
1. The set of experimentally obtainable values of an observable ##A## is the set of spectral values of the self-adjoint operator ##\hat{A}##.
2. If the state of the system for which one measures ##A## is ##\{p_k, |\psi_k\rangle\}##, then the probability to obtain ##a_n## from the discrete spectrum of ##\hat{A}## is ##P(a_n) = \sum_k p_k \langle \psi_k|\hat{P}_n|\psi_k \rangle##, while the probability density at the point ##\alpha## of the parametrization space of the continuous spectrum of ##\hat{A}## is ##P(\alpha) = \sum_k p_k \langle \psi_k|\hat{P}_\alpha|\psi_k \rangle##.

The projectors are defined in terms of the Dirac bra/ket spectral decomposition of ##\hat{A}##.
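Point 2 of the statistical postulate, restricted to a discrete spectrum, can be illustrated with a small numerical sketch (the operator, ensemble weights, and states below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A self-adjoint operator with (generically) non-degenerate spectrum
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (A + A.conj().T) / 2
a, U = np.linalg.eigh(A)          # eigenvalues a_n, eigenvectors (columns)

# Mixed state {p_k, psi_k}: two normalized states with weights p_k
p_k = np.array([0.3, 0.7])
psis = []
for _ in p_k:
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    psis.append(v / np.linalg.norm(v))

# P(a_n) = sum_k p_k <psi_k| P_n |psi_k>, with P_n = |a_n><a_n|
P = np.zeros(n)
for w, psi in zip(p_k, psis):
    P += w * np.abs(U.conj().T @ psi) ** 2

assert np.isclose(P.sum(), 1.0)   # probabilities are nonnegative, sum to 1
assert np.all(P >= 0)
```

The expression reduces to the ordinary Born rule when the ensemble contains a single state (##p_1 = 1##).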
 
  • #155
dextercioby said:
But there is only one set of postulates in the standard (aka Dirac-von Neumann) formulation. The statistical postulate is:
1. The set of experimentally obtainable values of an observable ##A## is the set of spectral values of the self-adjoint operator ##\hat{A}##.
2. If the state of the system for which one measures ##A## is ##\{p_k, |\psi_k\rangle\}##, then the probability to obtain ##a_n## from the discrete spectrum of ##\hat{A}## is ##P(a_n) = \sum_k p_k \langle \psi_k|\hat{P}_n|\psi_k \rangle##, while the probability density at the point ##\alpha## of the parametrization space of the continuous spectrum of ##\hat{A}## is ##P(\alpha) = \sum_k p_k \langle \psi_k|\hat{P}_\alpha|\psi_k \rangle##.

The projectors are defined in terms of the Dirac bra/ket spectral decomposition of ##\hat{A}##.
This version doesn't cover the argument used by vanhees71 in case H has a continuous spectrum, where the sum must be replaced by an integral.

Which version is in Dirac's book? Or do different editions have different versions? Is one of them applicable to ##H=p^2/2m##?
 
  • #156
Of course it does. The continuous spectrum is addressed by an integration over the parametrization space. The integral is a Riemann integral; the parametrization space is a subset of ##\mathbb{R}##. The spectral decomposition reads
$$\sum_n \hat{P}_n + \int \mathrm{d} \alpha \, \hat{P}_\alpha = \hat{1}.$$
This expression makes sense in the rigged-Hilbert-space formulation of QM, advocated by Arno Böhm and his coworkers.
 
  • #157
dextercioby said:
Of course it does. The continuous spectrum is addressed by an integration in parametrization space.
The question is whether the continuous case (with a Stieltjes integral in place of the sum and the interpretation of matrix elements as probability densities) is in the postulates as formulated by Dirac, or if it is just proceeding by analogy - which would mean that the foundations were not properly formulated.

The rigged Hilbert space is much later than Dirac I think - Gelfand 1964?
 
  • #158
A. Neumaier said:
The question is whether the continuous case (with a Stieltjes integral in place of the sum and the interpretation of matrix elements as probability densities) is in the postulates as formulated by Dirac, or if it is just proceeding by analogy - which would mean that the foundations were not properly formulated.

The rigged Hilbert space is much later than Dirac I think - Gelfand 1964?

No, the foundations were properly formulated by von Neumann in 1931, indeed using Stieltjes integrals to define the spectral measures. Dirac's book of 1958 has no precise statement of a set of axioms, yet it has been customary to denote the standard axioms by the name of Dirac and von Neumann (especially for the state reduction/collapse axiom).

The rigged Hilbert spaces were invented by Gel'fand and Kostyuchenko in 1955 and described at length in the 1961 book (the 4th volume of the famous Generalized Functions series), which was translated into English in 1964. I do not know whether Arno Böhm knew Russian; it may be that the book had first been translated into German, or simply that someone helped with the translation from Russian. The first application of RHS to QM was made by Arno Böhm in 1964 in a preprint (unfortunately poorly scanned) at the International Centre for Theoretical Physics in Trieste.
 
  • #159
A. Neumaier said:
The question is whether the continuous case (with a Stieltjes integral in place of the sum and the interpretation of matrix elements as probability densities) is in the postulates as formulated by Dirac, or if it is just proceeding by analogy - which would mean that the foundations were not properly formulated.

The rigged Hilbert space is much later than Dirac I think - Gelfand 1964?

There are two types of foundations - physical and mathematical. Throughout, I have meant physical while you have often meant mathematical.

The physical foundations were properly formulated by Bohr, Dirac, Heisenberg and von Neumann. Each took a slightly different view, but the key point is that quantum mechanics is a practical operational theory which only makes probabilistic predictions. The wave function, collapse etc are not real. And most importantly, quantum mechanics has a measurement problem.

The mathematical foundations were not complete at the time of von Neumann. POVMs and collapse for continuous variables came later. However, this mathematical tidying-up changed no physical concept.
 
  • #160
A. Neumaier said:
The question was not how it is defined (which is part of the QM calculus) but why Born's rule which just says that ##|\psi(x)|^2## is the probability density of ##x## implies that the right hand side is the expected measurement value of ##H##. It is used everywhere but is derived nowhere, it seems to me.

No, of course you cannot derive it from the literal Born rule. When we say "Born rule" nowadays, we mean the generalization due, e.g., to Dirac, von Neumann, and later work, such as what vanhees71 did in post #152.
 
  • #161
A. Neumaier said:
From Dirac's postulates (and only if ##H## has no continuous spectrum) but not from Born's. Are Dirac's postulates somewhere available online?
I don't know what you mean by Dirac's vs. Born's postulates. I think the best source for Dirac's point of view is still his famous textbook. What's known as Born's rule is that the modulus squared of the wave function, no matter with respect to which basis, gives the probabilities for the discrete values and the probability distributions for the continuous values of the spectrum of the self-adjoint operator. Dirac's handling of distributions (in the sense of generalized functions) was a la physics, i.e., not rigorous. Making it rigorous led the mathematicians to the development of modern functional analysis. The first mathematically rigorous formulation in the form of Hilbert-space theory goes back to John von Neumann, but his physics is a catastrophe, leading to a lot of esoteric debates concerning interpretation. His interpretation is Copenhagen plus the necessity of a conscious being to take note of the result of a measurement. So it's solipsism in some sense, and it led to Bell's famous question of when the first "collapse" might have happened after the big bang, and whether an amoeba is enough to observe something or whether you need some more "conscious" being like a mammal or a human ;-)).
 
  • #162
vanhees71 said:
The first mathematically rigorous formulation in the form of Hilbert-space theory goes back to John von Neumann, but his physics is a catastrophe, leading to a lot of esoteric debates concerning interpretation.
Now I am confused. Do you consider his interpretation in terms of consciousness to be a part of physics? That's confusing because at other places you seem to claim the opposite, that such interpretations are not physics.

Or maybe, which I would more naturally expect from you, you would like to divide his work into three aspects: mathematics, physics, and interpretation? But in that case it would not be fair to call his physics a catastrophe. His insight that measurement involves entanglement with wave functions of macroscopic apparatuses is an amazing physical insight widely adopted in modern theory of quantum measurements, irrespective of interpretations.
 
  • #163
Of course, von Neumann's interpretation is not physics but esoterics. I'm totally baffled that somebody of his caliber could come to such an idea. I think his merits concerning QT are completely mathematical, namely to have put it on a solid, mathematically strict ground in terms of Hilbert-space theory (mostly in the formulation as "wave mechanics").
 
  • #164
vanhees71 said:
Of course, von Neumann's interpretation is not physics but esoterics. I'm totally baffled that somebody of his caliber could come to such an idea.
I agree on this.

vanhees71 said:
I think his merits concerning QT are completely mathematical,
But disagree on that. I think he had physical merits too.
 
  • #165
Demystifier said:
I agree on this.

But the greatness of von Neumann is that he saw clearly, like Bohr and Dirac, that Copenhagen has a measurement problem. The great merit of these physicists is that they were very concerned about physics, unlike Peres, whose book (which is marvellous) is completely misleading in not stating the measurement problem clearly, and even hints that it does not exist in the ensemble interpretation.

Also I don't think von Neumann's idea of consciousness causing collapse is that different from Bohr or even Landau and Lifshitz's classical/quantum cut, which is a subjective cut. It's the same as Dirac agreeing that there is an observer problem - somehow there has to be an observer/consciousness/classical-quantum cut, which are more or less the same thing.
 
  • #166
That's the great miracle. After all this time people think that there is a measurement problem, but where is it when accepting the minimal interpretation?

Where is the necessity of a classical/quantum cut or a collapse? I just need real-world lab equipment and experimentalists able to handle it to do measurements on whatever system they can prepare in whatever way, make a model within QT, and compare my prediction to the outcome of the measurements. My prediction and the measurements are probabilistic and statistical, respectively. The more than 90 years of application of QT to real-world experimental setups and real-world observations are a great success story. So where is the real physics problem? There may be a problem in some metaphysical sense, depending on the belief or world view of the one or the other physicist, but no problem concerning the natural-science side of affairs.
 
  • #167
vanhees71 said:
That's the great miracle. After all this time people think that there is a measurement problem, but where is it when accepting the minimal interpretation?

Where is the necessity of a classical/quantum cut or a collapse? I just need real-world lab equipment and experimentalists able to handle it to do measurements on whatever system they can prepare in whatever way, make a model within QT, and compare my prediction to the outcome of the measurements. My prediction and the measurements are probabilistic and statistical, respectively. The more than 90 years of application of QT to real-world experimental setups and real-world observations are a great success story. So where is the real physics problem? There may be a problem in some metaphysical sense, depending on the belief or world view of the one or the other physicist, but no problem concerning the natural-science side of affairs.

A simple way to see it is that even in the minimal interpretation, one has deterministic unitary evolution and probabilistic evolution due to the Born rule. If one extends deterministic evolution to the whole universe, then there is no room for probability. So the wave function cannot extend to the whole universe. Deciding where it stops, i.e., where the boundary between deterministic evolution and stochastic evolution lies, is the classical/quantum cut.
 
  • #168
I've never claimed that QT is applicable to a single "event" like the "entire universe" ;-)).
 
  • #169
The minimal ensemble interpretation is not a solution of the measurement problem. It is a clever way of avoiding talk about the measurement problem.
 
  • #170
vanhees71 said:
I've never claimed that QT is applicable to a single "event" like the "entire universe" ;-)).
How about single electron?
 
  • #171
QT makes probabilistic predictions about the behavior of a single electron. You can take a single electron and prepare it very often in the same state and statistically analyse the result to test the probabilistic predictions. A single measurement on a single electron doesn't tell much concerning the validity of the probabilistic predictions.
 
  • #172
vanhees71 said:
I've never claimed that QT is applicable to a single "event" like the "entire universe" ;-)).

Yes, so one needs an ensemble of subsystems of the universe. The choice of subsystem is the classical/quantum cut.
 
  • #173
This is a bit too short an answer to be convincing. Why is choosing a subsystem of the universe the classical/quantum cut? Matter as we know it cannot be described completely by classical physics at all. So how can just taking a lump of matter as the choice of a subsystem define a classical/quantum cut?
 
  • #174
vanhees71 said:
This is a bit too short an answer to be convincing. Why is choosing a subsystem of the universe the classical/quantum cut? Matter as we know it cannot be described completely by classical physics at all. So how can just taking a lump of matter as the choice of a subsystem define a classical/quantum cut?

Well, if you agree that quantum mechanics cannot describe the whole universe, but it can describe subsystems of it, then it seems that at some point quantum mechanics stops working.
 
  • #175
vanhees71 said:
That's the great miracle. After all this time people think that there is a measurement problem, but where is it when accepting the minimal interpretation?

The Born interpretation itself seems to me to require a choice of basis before it can be applied. The rule gives the probability for obtaining various values for the results of measurements. I don't see how you can make sense of the Born rule without talking about measurements. How can you possibly compare QM to experiment unless you have a rule saying: if you do such and such, you will get such and such value? (Or: if you do such and such many times, the values will be distributed according to such and such probability.)
 
  • #176
vanhees71 said:
This is a bit too short an answer to be convincing. Why is choosing a subsystem of the universe the classical/quantum cut? Matter as we know it cannot be described completely by classical physics at all. So how can just taking a lump of matter as the choice of a subsystem define a classical/quantum cut?

Well, I'm not sure that the cut needs to be classical/quantum, but in order to compare theory with experiment, there needs to be such a thing as "the outcome of an experiment". If the theory predicts that you have a probability ##P## of getting outcome ##O##, then it has to be possible to get a definite outcome in order to compile statistics and compare with the theoretical prediction. But for the subsystem described by quantum mechanics, there are no definite outcomes. The system is described by superpositions such as ##\alpha |\psi_1\rangle + \beta |\psi_2\rangle##. So it seems to me that we distinguish between the system under study, which we treat as evolving continuously according to Schrödinger's equation, and the apparatus/detector/observer, which we treat as having definite (although nondeterministic) outcomes. That's the split that is sometimes referred to as the classical/quantum split, and it seems that something like it is necessary in interpreting quantum mechanics as a probabilistic theory.
 
  • #177
atyy said:
Well, if you agree that quantum mechanics cannot describe the whole universe, but it can describe subsystems of it, then it seems that at some point quantum mechanics stops working.
Sure. But what has this to do with the quantum/classical cut? Classical physics is also not working!
 
  • #178
stevendaryl said:
The Born interpretation itself seems to me to require a choice of basis before it can be applied. The rule gives the probability for obtaining various values for the results of measurements. I don't see how you can make sense of the Born rule without talking about measurements. How can you possibly compare QM to experiment unless you have a rule saying: if you do such and such, you will get such and such value? (Or: if you do such and such many times, the values will be distributed according to such and such probability.)
Sure, it requires a choice of basis, but that's the choice of what you measure, because you have to choose the eigenbasis of the operator representing the observable you choose to measure. There's nothing very surprising.

QT subscribes only to the 2nd formulation in parentheses: "if you do such and such many times, the values will be distributed according to such and such probability." That's precisely how QT in the minimal formulation works: "doing such and such" is called "preparation" in the formalism and defines what a state (pure or mixed) is, and "the values" refer to an observable you choose to measure. The prediction of QT is that in the given state the probability (distribution) to find a value of this measured observable is given by Born's rule.
 
  • #179
vanhees71 said:
Sure. But what has this to do with the quantum/classical cut? Classical physics is also not working!

Yes, classical/quantum cut does not literally mean classical. It just means where we take QM to stop working, and where we get definite outcomes.
 
  • #180
stevendaryl said:
Well, I'm not sure that the cut needs to be classical/quantum, but in order to compare theory with experiment, there needs to be such a thing as "the outcome of an experiment". If the theory predicts that you have a probability ##P## of getting outcome ##O##, then it has to be possible to get a definite outcome in order to compile statistics and compare with the theoretical prediction. But for the subsystem described by quantum mechanics, there are no definite outcomes. The system is described by superpositions such as ##\alpha |\psi_1\rangle + \beta |\psi_2\rangle##. So it seems to me that we distinguish between the system under study, which we treat as evolving continuously according to Schrödinger's equation, and the apparatus/detector/observer, which we treat as having definite (although nondeterministic) outcomes. That's the split that is sometimes referred to as the classical/quantum split, and it seems that something like it is necessary in interpreting quantum mechanics as a probabilistic theory.
Sure, but where is the problem? The very success of very accurate measurements in accordance with the predictions of QT shows that there is no problem. To understand how a measurement apparatus works, ask the experimentalists/engineers who invented it which model of the apparatus they had in mind when constructing it. It's almost always classical, and that the classical approximation works is shown by the very success of the apparatus in measuring what it is supposed to measure.

Another question is how to understand the classical behavior of macroscopic objects from QT, including that of measurement devices (which are, of course, themselves just macroscopic objects, obeying the same quantum laws of nature as any other). I think this is quite well understood in terms of quantum statistics and appropriate effective coarse-grained descriptions of macroscopic observables derived from QT.
 
