Haag's Theorem & Wightman Axioms: Solving Problems in QFT?

In summary, Haag's theorem shows that the Hilbert space of an interacting relativistic QFT in 4 space-time dimensions must look quite different from a Fock space; the Wightman axioms spell out how the vacuum sector of such a theory should look. Recently, a colleague pointed me to a review of Haag's theorem and related works which supports this claim. It is a pity, however, that teachers of QFT do not discuss asymptotic states of interacting particles more often, as this would help to clear up some of the uncertainty surrounding the theory.
  • #1
ftr
QFT seems to be a bit sick, with the cluster decomposition assumption etc. So here come Haag's theorem and the Wightman axioms to the rescue, or do they? What do these cures actually say differently from generic QFT? Do they solve any practical problems? If not, why the fuss, millennium prize and all?
 
  • #2
Only relativistic QFT in 4 space-time dimensions is sick because interacting fields are not well-defined in any logically coherent sense.

Theoretical physicists predicting particle properties work around this using either finite lattice approximations, which are the quantum field analogue of solving partial differential equations with finite difference methods and which break all spacetime symmetries, or low-order formal renormalized perturbation theory, which stands to a logically precise specification roughly as Euler's treatment of functions through formal power series a few hundred years ago stands to modern analysis. These methods seem to work and give sensible physical predictions, but they have no proper logical foundation.
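As a toy sketch of how a lattice regularization breaks the spacetime symmetries (this is only an illustration, not one of the actual constructions used in particle physics): on a spatial lattice with spacing ##a##, the dispersion relation of a free scalar field replaces ##k^2## by ##(2/a)^2\sin^2(ka/2)##, so Lorentz invariance is violated near the cutoff momentum ##\pi/a## and recovered only as ##a \to 0##.

```python
import numpy as np

# Toy sketch: dispersion relation of a free scalar field on a 1D spatial
# lattice with spacing a. The lattice replaces k^2 by (2/a)^2 sin^2(ka/2),
# so Lorentz invariance is violated for momenta near the cutoff pi/a and
# recovered only in the continuum limit a -> 0.
def omega2_lattice(k, m, a):
    return m**2 + (2.0 / a) ** 2 * np.sin(k * a / 2.0) ** 2

def omega2_continuum(k, m):
    return m**2 + k**2

m, a = 1.0, 0.1
print(omega2_lattice(0.1, m, a), omega2_continuum(0.1, m))              # nearly equal
print(omega2_lattice(np.pi / a, m, a), omega2_continuum(np.pi / a, m))  # very different
```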

The Wightman axioms spell out how the vacuum sector of such a theory should look like according to our present best guess, but (unlike in lower dimensions where there are many constructions) no interacting relativistic QFT in 4 space-time dimensions has been constructed in the more than 50 years of existence of the Wightman axioms. Neither are there nonexistence theorems. The simplest tractable case is widely held to be 4D Yang-Mills theory, which is the reason why there is a millennium prize on solving it. See https://www.physicsoverflow.org/217...osons-infinite-and-discrete?show=21846#a21846 for more comments on the latter.

Cluster decomposition is an important principle without which physics would be impossible. It basically says that particles far away from all other particles behave independently of each other. This is the basic reason why we can look at (properly prepared) small objects independently of other objects. For its physical basis see, on the informal level, Weinberg's QFT book, Vol. I, Chapter 5. On the rigorous level, it guarantees the uniqueness of the vacuum state; see any book on algebraic quantum field theory.
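Schematically, cluster decomposition says that vacuum expectation values factorize at large spacelike separation:

$$\langle \Omega|A(x)\,B(y)|\Omega\rangle \;\longrightarrow\; \langle \Omega|A(x)|\Omega\rangle\,\langle \Omega|B(y)|\Omega\rangle \qquad \text{as } |x-y|\to\infty \text{ (spacelike)},$$

i.e., the connected correlation between far-separated observables decays to zero.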

Haag's theorem is hardly related to that; it only says that the CCR representation of the free field cannot extend to interacting fields, the interacting Hilbert space must look quite different from a Fock space. More precisely, while they have to be isomorphic as Hilbert spaces (all infinite-dimensional separable Hilbert spaces are isomorphic), the additional structure needed to set up quantum fields cannot be respected by such an isomorphism.
 
  • #3
Recently a colleague after a semester-long discussion on the foundations of QFT pointed me to this very nice review on Haag's theorem and related works:

https://link.springer.com/article/10.1007/s10670-005-5814-y
http://philsci-archive.pitt.edu/2673/1/earmanfraserfinalrevd.pdf

Don't worry! Although it appears to be about philosophy, it's in fact physics.

Whenever a serious teacher of QFT comes to explain asymptotic states of interacting particles, the S-matrix, and the LSZ reduction formalism, s/he gets doubts and discusses them with colleagues, of course with not too much success; it's indeed an unsolved problem to find a mathematically rigorous formulation of 4D relativistic QFT with interacting particles.

FAPP, QFT as used by HEP phenomenologists works with great success (even too much, since the Standard Model stands up against all experimental efforts to disprove it; recently LHCb came up with the next attack, but so far only at a bit over 2% confidence level ;-)). The way out is a practical approach: just put the system in a box with periodic boundary conditions, use perturbation theory, and renormalize. Then you get results with up to 12 significant digits of accuracy. As Arnold said, the other way out is the lattice approach and high computing power (at least in QCD, where you get, e.g., the hadron mass spectrum out pretty nicely too, and that's clearly beyond perturbation theory).

That's the approach Weinberg chooses in his book: simply ignore Haag et al. That is a pity, however, since I think one should be aware of it!
 
  • #4
vanhees71 said:
put the system in a box with periodic boundary conditions
This works for kinetic or hydrodynamic questions. But the resulting compactification destroys scattering theory since under these conditions the spectrum becomes wholly discrete.

vanhees71 said:
Then you get results with up to 12 significant digits accuracy.
This ultra-high accuracy is not obtained in this way but through NRQED: One uses a nonrelativistic ##1/c^2## expansion to get rid of the problems with the relativistic theory and then works essentially with nonrelativistic quantum mechanics with relativistic corrections.
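Schematically, the ##1/c^2## expansion referred to here starts from the relativistic energy and expands it in inverse powers of ##c##:

$$E = \sqrt{m^2c^4 + \mathbf{p}^2 c^2} = mc^2 + \frac{\mathbf{p}^2}{2m} - \frac{\mathbf{p}^4}{8m^3c^2} + O(1/c^4),$$

so that one works with nonrelativistic quantum mechanics plus systematic relativistic correction terms.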
 
  • #5
Thanks both for your replies.

vanhees71 said:
mathematically rigorous formulation of 4D relativistic QFT with interacting particles

Since I am not a professional theorist, my opinion probably doesn't add too much. But I feel that "mathematically rigorous" should mean physically rigorous. I am well aware of the successes, but I think they are mainly due to the correctness of QM itself. The QFT formulation is heroic, to be sure, but I have read the story of its development: a very messy process of experimentation (with its own problems, bumps, etc.) and theoreticians trying to score.

But let me ask this: I think the main issue with QFT is removing infinities and the renormalization of mass and charge, and I think these are PHYSICAL problems and not "mathematical" ones. Do you agree?
 
  • #6
A. Neumaier said:
This ultra-high accuracy is not obtained in this way but through NRQED

ftr said:
but I think it is mainly due to correctness of QM itself.

We wrote this within minutes of each other. Although I am no expert, I had deduced that.
 
  • #7
A. Neumaier said:
This ultra-high accuracy is not obtained in this way but through NRQED: One uses a nonrelativistic ##1/c^2## expansion to get rid of the problems with the relativistic theory and then works essentially with nonrelativistic quantum mechanics with relativistic corrections.
Where do you need the non-relativistic approximation to evaluate the anomalous magnetic moment of the electron?
 
  • #8
ftr said:
But let me ask this: I think the main issue with QFT is removing infinities and the renormalization of mass and charge, and I think these are PHYSICAL problems and not "mathematical" ones. Do you agree?
Renormalization has nothing to do with infinite radiative corrections per se. It comes to the rescue there, but even if everything were perfectly finite, you'd need to renormalize to adjust the free parameters of the theory (wave-function normalization, masses, coupling constants) to observations.
 
  • #9
Practical QFT is, e.g., the standard model of particle physics. Most physicists nowadays understand it to be a low-energy theory, i.e., to make good predictions at the energies we use in experiments, it is not necessary for the theory to exist at all energies. This is the Wilsonian viewpoint, and it is a great advance for understanding why renormalization is OK and has nothing to do with removing infinities etc. The main problem in this practical way of thinking is that it needs a non-perturbatively defined quantum theory, e.g., lattice gauge theory; but while lattice QED is thought to be OK, chiral fermions on the lattice are still problematic.

The Millennium prize has to do with 4D relativistic QFTs that exist at all energies (and not just at low energies).
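For concreteness, the standard non-perturbative definition alluded to here is Wilson's lattice gauge theory, whose action is built from traces of products of link variables around elementary plaquettes:

$$S = \beta \sum_{p} \left(1 - \frac{1}{N_c}\,\mathrm{Re}\,\mathrm{Tr}\, U_p\right), \qquad \beta = \frac{2N_c}{g^2},$$

which is exactly gauge invariant at any lattice spacing and reduces to the Yang-Mills action in the naive continuum limit.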
 
  • #10
ftr said:
it is mainly due to correctness of QM itself.
This doesn't explain the high accuracy achieved! QM without QED does not have such high predictive accuracy.

vanhees71 said:
Where do you need the non-relativistic approximation to evaluate the anomalous magnetic moment of the electron?
To first loop order you don't need it but you don't get very high accuracy. For highest accuracy you cannot work with the standard Feynman approach; it is too daunting a computational task.
atyy said:
4D relativistic QFTs that exist at all energies (and not just at low energies).
Because of Poincare invariance, any relativistic QFT (in flat spacetime) will exist at all energies (and not just at low energies).
 
  • #11
A. Neumaier said:
To first loop order you don't need it but you don't get very high accuracy. For highest accuracy you cannot work with the standard Feynman approach; it is too daunting a computational task.
I thought Kinoshita had calculated ##(g-2)/2## to order ##\alpha_{\text{em}}^5##. Are there non-relativistic approximations involved? I don't see anything like this mentioned, e.g., here

https://arxiv.org/abs/1205.5368
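For orientation, the leading term of that series is Schwinger's famous ##\alpha/2\pi##; a quick numerical check (using the approximate value ##\alpha \approx 1/137.036##, just for illustration):

```python
import math

# Schwinger's leading-order result for the electron's anomalous magnetic
# moment: a_e = (g-2)/2 = alpha / (2*pi) + higher-order corrections.
alpha = 1 / 137.036  # fine-structure constant (approximate value)
a_e_leading = alpha / (2 * math.pi)
print(a_e_leading)  # ~0.0011614, already within ~0.2% of the measured value
```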

A. Neumaier said:
Because of Poincare invariance, any relativistic QFT (in flat spacetime) will exist at all energies (and not just at low energies).

But there is trouble with some QFTs (including QED) at high energies, at least perturbatively (the Landau pole). The modern consensus is that the Standard Model is a low-energy approximation of some other theory, of which we have no clue (not even whether it exists at all).
 
  • #12
A. Neumaier said:
This doesn't explain the high accuracy achieved! QM without QED does not have such high predictive accuracy.

I know you need QED for precision; I was just thinking that QT is real, while QFT is just a technique. Maybe one day they will be united :smile:
 
  • #13
ftr said:
I was just thinking that QT is real, QFT is just a technique

What are you basing this on?
 
  • #14
PeterDonis said:
What are you basing this on?

It just looks to me like QFT takes QM, massages it, and turns it into a calculational tool.
 
  • #15
ftr said:
It just looks to me like QFT takes QM, massages it, and turns it into a calculational tool.

You're leaving out at least three key physical phenomena that QFT can predict and ordinary non-relativistic QM can't:

(1) The existence of processes where particles are created or destroyed;

(2) The existence of antiparticles;

(3) The connection between spin and statistics.
 
  • #16
ftr said:
I know you need QED for precision; I was just thinking that QT is real, while QFT is just a technique. Maybe one day they will be united :smile:

QFT is a type of QT.

Condensed matter uses non-relativistic QFT. Non-relativistic QFT can be derived from the non-relativistic Schroedinger equation for many identical particles.
 
  • #17
vanhees71 said:
Are there non-relativistic approximations involved?
No. The paper you cited calculates some contributions up to the 10th order and even includes hadronic contributions (phenomenologically included into QED), since they already affect the 10th decimal. Indeed, I checked in the references with the actual calculation details that no expansion in inverse powers of c is made. Thus the results are covariant.

I had mixed up the computation of ##g-2## with the computation of the hyperfine structure and the Lamb shift, where NRQED is essential to get high precision.
 
  • #19
Well, if you need a cutoff to define the theory, then you admit that it's an effective theory valid at most up to the cutoff scale. The Epstein-Glaser approach avoids UV divergences because it is more careful in dealing with products of distribution-valued operators; thus it solves a mathematical problem, but not the physical problem of the Landau pole. As far as I know, QED is an example of a QFT more likely not to exist in a strict sense than, e.g., QCD, which is asymptotically free; but for no realistic (i.e., (1+3)-dimensional) theory of interacting quantum fields has there been a proof that it exists in a strict sense. On the other hand, FAPP this doesn't matter much, because we have only a limited energy available anyway, and it's quite likely that our contemporary Standard Model will fail at high enough energies somewhere. Today, nobody knows where that scale might be. Maybe there's really a desert up to the Planck scale, where one can definitely expect something to happen concerning quantum effects of gravity. Then HEP with accelerators is doomed, funding-wise :-(.
 
  • #20
ftr said:
But let me ask this ,I think the main issue with QFT is removing infinities and renormalization of the mass and charge, I think these are PHYSICAL problems and not "mathematical", do you agree.
No. A lattice approximation with periodic boundary conditions defines a conceptually and mathematically valid and unproblematic theory. It does not have infinities, and does not need renormalization (it is, instead, possibly part of the process of renormalization - namely a regularized theory).

Such a lattice theory can be interpreted as some approximation of the field theory, and gives all the observable (physical) results, with some accuracy. So, the lattice theory is mathematically fine, and it is (at least if the computing power is sufficient, the lattice fine enough and the volume big enough) also physically fine. In principle, a lattice theory could be even a candidate for a more fundamental theory, given that it has neither physical nor mathematical problems, and QFT would be simply the large distance approximation of such a fundamental lattice theory.
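As a deliberately minimal illustration of such a finite, well-defined lattice theory: the Euclidean two-point function of a free scalar field on a periodic 1D lattice is an ordinary finite sum, computable exactly, with nothing to diverge.

```python
import numpy as np

# Free scalar field on a periodic 1D Euclidean lattice (spacing a = 1):
# G(x) = (1/N) * sum_k exp(i k x) / (m^2 + khat^2),  khat = 2 sin(k/2).
# Everything here is finite; there are no divergences to renormalize away.
N, m = 64, 0.5
k = 2.0 * np.pi * np.arange(N) / N        # the N allowed momenta on the ring
khat2 = (2.0 * np.sin(k / 2.0)) ** 2      # lattice momentum squared
G = np.fft.ifft(1.0 / (m**2 + khat2)).real
print(G[:4])  # decays away from the source, roughly exponentially in x
```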

What is not fine is the metaphysics. The lattice theory breaks Lorentz covariance, and some people consider Lorentz covariance as something obligatory, for some metaphysical reasons - relativistic symmetry being a fundamental insight or so.
 
  • #21
A. Neumaier said:
This ultra-high accuracy is not obtained in this way but through NRQED: One uses a nonrelativistic ##1/c^2## expansion to get rid of the problems with the relativistic theory and then works essentially with nonrelativistic quantum mechanics with relativistic corrections.

The 12 decimal places of accuracy being talked about is a very specific quantity--the magnetic moment of the electron, right?
 
  • #22
atyy said:
The main problem in this practical way of thinking is that it needs a non-perturbatively defined quantum theory, e.g., lattice gauge theory; but while lattice QED is thought to be OK, chiral fermions on the lattice are still problematic.
I think this is not a serious problem. For vector gauge fields, we have Wilson lattice gauge fields, which realize exact gauge symmetry. Exact gauge symmetry is what we need to describe massless gauge bosons, and thus QCD or QED. But those are anyway not chiral.

For the chiral gauge actions, a lattice realization which gives only approximate gauge symmetry seems fine, no? If not, why not? Massive gauge fields being non-renormalizable is not an issue if we do effective field theory: in the large-distance limit they will be equivalent to the closest renormalizable theory, so they will look (have to look) like the renormalizable example we know, with exact but broken gauge symmetry.

But the chiral gauge action is the only problem here. Otherwise, we have nothing chiral in the SM; everything comes in pairs of Dirac fermions.
 
  • #23
A. Neumaier said:
Haag's theorem is hardly related to that; it only says that the CCR representation of the free field cannot extend to interacting fields, the interacting Hilbert space must look quite different from a Fock space. More precisely, while they have to be isomorphic as Hilbert spaces (all infinite-dimensional separable Hilbert spaces are isomorphic), the additional structure needed to set up quantum fields cannot be respected by such an isomorphism.

As I understand it, perturbation theory typically involves treating the interacting states as a perturbation of free-particle states. So does Haag's theorem imply that there is something fundamentally wrong with doing this? (In that case, the fact that it works pretty well is a little mysterious.)
 
  • #24
stevendaryl said:
As I understand it, perturbation theory typically involves treating the interacting states as a perturbation of free-particle states. So does Haag's theorem imply that there is something fundamentally wrong with doing this? (In that case, the fact that it works pretty well is a little mysterious.)
It works pretty well because in a lattice approximation with periodic boundary conditions no problem exists; thus, as far as a technique can be applied to a finite lattice theory, it has to work fine. Haag's theorem becomes relevant only beyond the lattice theory.
 
  • #25
vanhees71 said:
Well, if you need a cutoff to define the theory, then you admit that it's an effective theory valid at most up to the cutoff scale. The Epstein-Glaser approach avoids UV divergences because it is more careful in dealing with products of distribution-valued operators; thus it solves a mathematical problem, but not the physical problem of the Landau pole. As far as I know, QED is an example of a QFT more likely not to exist in a strict sense than, e.g., QCD, which is asymptotically free; but for no realistic (i.e., (1+3)-dimensional) theory of interacting quantum fields has there been a proof that it exists in a strict sense. On the other hand, FAPP this doesn't matter much, because we have only a limited energy available anyway, and it's quite likely that our contemporary Standard Model will fail at high enough energies somewhere. Today, nobody knows where that scale might be. Maybe there's really a desert up to the Planck scale, where one can definitely expect something to happen concerning quantum effects of gravity. Then HEP with accelerators is doomed, funding-wise :-(.
The Epstein-Glaser approach has no cutoff and hence no problem with the Landau pole. Everything is Poincare covariant at all stages, and nothing at all can go wrong at high energies. The renormalization parameter ##\Lambda## that has to be chosen in this approach has not the slightest effect on the infinite order results, only on the finite loop approximations. The only forbidden thing is to pick ##\Lambda## close to the Landau pole, since there the Landau singularity strikes. For all other choices of ##\Lambda##, exactly the same theory, namely QED (in the case under discussion) results, since its parameters are the physical mass ##m## and the physical charge ##e##, and not ##\Lambda##.

Note that all this is true independent of the fact that QED (or the standard model) does not reflect Nature at very short distances. This discrepancy is completely unrelated to Landau's old arguments.
 
  • #26
Denis said:
A lattice approximation [...] does not need renormalization
This is a gross misunderstanding of the nature of lattice theories. These all need finite renormalization, which is done by matching the parameters defining the lattice and the interaction with the physical data through appropriate data fits. The coupling constants needed to achieve this match depend very sensitively on the lattice spacing!
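That sensitivity can be sketched with the one-loop running of a pure SU(##N_c##) gauge coupling (illustrative numbers only; the value of Lambda below, in lattice units, is a hypothetical choice):

```python
import math

# One-loop asymptotic freedom: keeping the physics fixed while shrinking
# the lattice spacing a requires the bare coupling to run as
#   g^2(a) = 1 / (2 * b0 * log(1 / (a * Lam))),  b0 = 11*Nc / (48*pi^2).
# Lam = 0.005 (lattice units) is an illustrative, hypothetical scale.
def bare_coupling_sq(a, Lam=0.005, Nc=3):
    b0 = 11.0 * Nc / (48.0 * math.pi**2)
    return 1.0 / (2.0 * b0 * math.log(1.0 / (a * Lam)))

for a in (0.1, 0.05, 0.01):
    print(a, bare_coupling_sq(a))  # the bare coupling shrinks as a -> 0
```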
 
  • #27
stevendaryl said:
The 12 decimal places of accuracy being talked about is a very specific quantity--the magnetic moment of the electron, right?
Yes. Only very few quantities are measured to such high precision (and computed with the corresponding extremely large computational effort). For most observable quantities, much less accuracy suffices to match experiments.
 
  • #28
stevendaryl said:
As I understand it, perturbation theory typically involves treating the interacting states as a perturbation of free-particle states. So does Haag's theorem imply that there is something fundamentally wrong with doing this? (In that case, the fact that it works pretty well is a little mysterious.)
Yes. There is something fundamentally wrong! The glaring evidence for this is the infinities that delayed the development of QED (started in 1928) by about 20 years.

The cure is the renormalization program. It sacrifices the Hilbert space (at finite times) to restore finiteness and predictability. In lower dimensions, mathematical physicists have been able to reconstruct the correct (non-Fock) Hilbert space and thus also restored consistency with ordinary quantum mechanics. In 4 dimensions the resolution of this problem is still wide open; at present, formal, non-converging power series expansions replace actual functions. But formal power series do not form a Hilbert space. The real difficulty is to construct a good inner product of whatever takes the role of the state vectors.
 
  • #30
Demystifier said:
The Haag's theorem is an artifact of taking the infinite-volume limit too seriously.
No. It is not an artifact, since only the infinite volume limit gives physically correct results. Even in lattice calculations, one must try to take the limit by extrapolation to get agreement with experiment.

The infinite volume is a necessary feature of Poincare invariance, which is the very basis of all relativistic particle physics.

Demystifier said:
It is brilliantly explained
Brilliant explanations may still be far off the mark! The power is in the facts, not in brilliant words.
 
  • #31
A. Neumaier said:
No. It is not an artifact, since only the infinite volume limit gives physically correct results. Even in lattice calculations, one must try to take the limit by extrapolation to get agreement with experiment.

The infinite volume is a necessary feature of Poincare invariance, which is the very basis of all relativistic particle physics.
I disagree, especially with the claim that only infinite volume gives physically correct results. In addition, Poincare invariance (translation invariance, to be more precise) on large distances is broken by general relativity. Infinite volume is just an approximation, which often works fine when the wavelength of the particles is much smaller than the distance on which translation invariance may be broken, e.g., by cosmological effects.
 
  • #32
Demystifier said:
Poincare invariance (translation invariance to be more precise) on large distances is broken by general relativity.
But the standard model, on which all particle physics is based, is completely independent of general relativity.

Drop Poincare invariance, and you have essentially no constraints on the kind of action to consider. The terms kept in the standard model are distinguished solely by Poincare invariance, renormalizability, and the assumed internal symmetry group, all extracted from experimental data and very well verified.
 
  • #33
A. Neumaier said:
But the standard model, on which all particle physics is based, is completely independent of general relativity.

Drop Poincare invariance, and you have essentially no constraints on the kind of action to consider. The terms kept in the standard model are distinguished solely by Poincare invariance, renormalizability, and the assumed internal symmetry group, all extracted from experimental data and very well verified.
Nothing of this represents evidence that we live in an infinite volume.
 
  • #34
Demystifier said:
Nothing of this represents evidence that we live in an infinite volume.
This doesn't matter; I didn't claim that. Instead I claimed that
A. Neumaier said:
The infinite volume is a necessary feature of Poincare invariance, which is the very basis of all relativistic particle physics.
 
  • #35
A. Neumaier said:
This doesn't matter; I didn't claim that.
You claimed that "only the infinite volume limit gives physically correct results". Do you still stand with this claim?
 
