Insights Causal Perturbation Theory - Comments

Summary
Causal perturbation theory is discussed as a mathematically well-defined framework for constructing quantum field theories, though it only yields an asymptotic series and lacks a rigorous treatment of infrared limits. The conversation highlights the historical contributions of Bogoliubov and Shirkov, as well as Epstein and Glaser, in developing this theory, with emphasis on the need for rigorous treatment of time ordering and infrared problems. Participants debate the relationship between causal perturbation theory and Wilsonian effective field theories, suggesting that while causal perturbation theory reproduces standard results, it cannot be framed within a cutoff approach. The discussion also touches on the implications of causal perturbation theory for understanding the nature of particles like electrons as infraparticles. Overall, the theory is positioned as a valuable tool for approximating results in quantum field theory, despite its limitations in rigor and completeness.
  • #31
A. Neumaier said:
In the light of the recent discussion starting here, I updated this Insight article, adding, in particular, detail to the section ''Axioms for causal quantum field theory''.
Is there a typo or omission in the last sentence of this paragraph:
Unfortunately, models proving that QED (or other interacting local quantum field theories) exists have not yet been constructed. On the other hand, there are also no arguments proving rigorously that such models exist. For a fully rigorous solution – a problem which for interacting 4-dimensional relativistic quantum field theories is open.
?
 
  • #32
strangerep said:
Is there a typo or omission in the last sentence of this paragraph:
?
Thanks for pointing it out. Indeed, the paragraph was garbled. I corrected it and added some more information. Now the paragraph reads
Unfortunately, models proving that QED (or another interacting local quantum field theory in 4 spacetime dimensions) exists have not yet been constructed. On the other hand, constructions are available in 2 and 3 spacetime dimensions, and no arguments are known proving rigorously that such models cannot exist in 4 dimensions. Finding a fully rigorous construction for an interacting 4-dimensional local quantum field theory or proving that it cannot exist is therefore a widely open problem. My bet is that a rigorous construction of QED will be found one day.
 
Last edited:
  • Like
Likes Tendex and vanhees71
  • #33
A. Neumaier said:
However, the whole procedure makes perturbative sense also without these requirements.
In particular, for quantum field theory in curved space-time one sacrifices condition 1, with success; see work by Stefan Hollands.

Which papers of Stefan Hollands? I took a quick look at https://arxiv.org/abs/1105.3375 which introduces an ultraviolet cutoff, then takes it to infinity, so it seems a bit different from causal perturbation theory which "nowhere introduces nonphysical entities (such as cutoffs, bare coupling constants, bare particles or virtual particles)".
 
  • Like
Likes vanhees71
  • #34
atyy said:
Which papers of Stefan Hollands? I took a quick look at https://arxiv.org/abs/1105.3375 which introduces an ultraviolet cutoff, then takes it to infinity, so it seems a bit different from causal perturbation theory which "nowhere introduces nonphysical entities (such as cutoffs, bare coupling constants, bare particles or virtual particles)".
Yes, it is different. Hollands is not doing causal perturbation theory since, as I said, in his work condition 1 (i.e., covariance) is sacrificed, by using a cutoff.

Note that post #29 was more generally about starting with asymptotic Fock space, not about causal perturbation theory itself, where preserving conditions 1 and 2 throughout the construction is essential. I added a clarifying sentence.
 
  • Like
Likes atyy
  • #35
A couple of general comments on this "causal perturbation theory"; the first is rather cosmetic. Since, as discussed in the Lattice QED thread, it is not an exactly or completely causal (in the sense of microcausal) theory due to its perturbative limitations, isn't referring to it as a "Causal theory" a bit of a misnomer? I know there are historical reasons starting with the work of Bogoliubov and I guess that it refers to the global causality of asymptotic states rather than to the "in principle" exact (micro)causality usually explained in regular QFT textbooks when explaining the locality axiom, but still, maybe that's why the title of Scharf's book uses first the less confusing phrase "Finite QED".

The other comment concerns the insistence on underlining the absence of cutoffs or series truncations (as in a rigorous BPHZ renormalization scheme) as some constructive property of the theory, given that, as also commented in the other thread, in a perturbative setting, i.e. renormalized order by order, validity only term by term is quite a "truncation" of the theory and therefore we are indeed dealing at best with effective field theories, perhaps better defined mathematically but not constructing any theory that gets us closer to the nonperturbative local QFT. Perhaps an appropriate mathematical analogy is with numerical brute-force searches for Riemann hypothesis zeros off the critical line, which always remain equally infinitely far from confirming the hypothesis.

In this sense, if one assumes from the start the existence of the non-perturbative local QFT (this is what Dyson and Feynman did), this seems to me like an empty exercise in rigor and old perturbation theory was fine; and if one doesn't assume it, it is mostly useless.
 
  • Like
Likes vanhees71
  • #36
Tendex said:
isn't referring to it as a "Causal theory" a bit of a misnomer?
It is generally used; I am not responsible for the name. I think it is called causal since causality dictates the axioms and the derived conditions for the distribution splitting.
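For concreteness, the causality input can be stated as Bogoliubov's factorization condition on the S-matrix functional; a schematic form (sign and ordering conventions differ between Bogoliubov–Shirkov and Epstein–Glaser/Scharf) is
$$S(g_1+g_2)=S(g_2)\,S(g_1)\qquad\text{whenever } \operatorname{supp} g_2\cap\big(\operatorname{supp} g_1+\bar V_-\big)=\emptyset,$$
i.e. when the interaction switched on by ##g_2## lies nowhere in the causal past of that switched on by ##g_1##, with later factors written to the left. The conditions for the distribution splitting are derived from this requirement order by order.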
Tendex said:
validity only term by term is quite a "truncation" of the theory
Of course; I never claimed anything else. By making approximations, any computational scheme necessarily truncates the theory and hence violates locality. This even holds in 2D and 3D, where local QFTs have been constructed rigorously but computations still need to employ approximations.

My emphasis was that all truncations are covariant and hence relativistic in the standard sense of the word. Only locality is slightly violated.
Tendex said:
not constructing any theory that gets us closer to the nonperturbative local QFT.
The complete, infinite order construction satisfies all axioms, and hence the local commutation rules, in the sense of formal power series. In this sense it is closer to nonperturbative local QFT.
Tendex said:
if one assumes from the start the existence of the non-perturbative local QFT (this is what Dyson and Feynman did), this seems to me like an empty exercise in rigor and old perturbation theory was fine
Old perturbation theory only defines the perturbative S-matrix, but not the operators, and hence not the finite time dynamics. Thus it lacks much of what makes a quantum theory well-defined.
Tendex said:
Perhaps an appropriate mathematical analogy is with numerical brute-force searches for Riemann hypothesis zeros off the critical line, which always remain equally infinitely far from confirming the hypothesis.
This is a valid comparison. Indeed, the Riemann hypothesis, global existence of solutions of the Navier-Stokes equations, and the construction of an interacting local QFT in 4D are three of the six open Millennium problems. They share the fact that numerically everything of interest is established without doubt, but the mathematical techniques for producing rigorous arguments are not sufficiently developed to lead to success on this level.
 
Last edited:
  • Like
Likes Tendex
  • #37
Then I have to ask again: What else do you gain in (vacuum) QFT beyond the S-matrix?

In the standard approach, the S-matrix is what describes the observable effects like cross sections and decay rates. It deals with transition-probability rates between asymptotic free initial states (where you have a physical definition of the states as "particles") and asymptotic free final states.

Is there an idea that there are physically observable interpretations of states defined by the "transient" field operators, and if so, what are they and how can they be measured?
 
  • #38
vanhees71 said:
What else do you gain in (vacuum) QFT beyond the S-matrix?
I had already answered this:

BPHZ perturbation theory, say, only defines the perturbative S-matrix, but neither operators nor Wightman N-point functions, and hence no finite-time dynamics. Moreover, it gives nothing for states different from the ground state. Thus it lacks much of what makes a quantum theory well-defined, quite independently of what can be measured.

For comparison, if all that ordinary quantum mechanics could compute for a few particle system were its ground state and the S-matrix, we would have the status quo of 1928, very far from the current state of the art in few particle quantum mechanics.
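As a sketch of how the causal framework goes beyond the S-matrix: interacting field operators can be obtained from the S-matrix functional by Bogoliubov's formula, written schematically (conventions vary between references) as
$$\varphi_{\rm int}(x)\;=\;\left.S(g)^{-1}\,\frac{\delta}{i\,\delta h(x)}\,S(g,h)\right|_{h=0},$$
where ##S(g,h)## is the S-matrix with the interaction switched by ##g## and an additional external source ##h## coupled to the field. Unordered N-point functions then arise, at least formally, as vacuum expectation values of products of these interacting fields, order by order in ##g##.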
 
  • #39
Tendex said:
not an exactly or completely causal (in the sense of microcausal) theory due to its perturbative limitations, isn't referring to it as a "Causal theory" a bit of a misnomer? I know there are historical reasons starting with the work of Bogoliubov and I guess that it refers to the global causality of asymptotic states rather than to the "in principle" exact (micro)causality usually explained in regular QFT textbooks when explaining the locality axiom
I think the name ''microcausality'' for the spacelike commutation rule, though quite common, is the real misnomer. The commutation rule is rather characteristic of locality (''experiments at the same time but different places can be independently prepared''), as indicated by the title of Haag's book. Locality is intrinsically based on spacelike commutation and cannot be discussed without it or directly equivalent properties.

On the other hand, the relation between causality and spacelike commutation is indirect, restricted to relativistic QFT. Moreover, the relation works only in one direction since spacelike commutation requires a notion of causality for its definition, while causality can be discussed easily without spacelike commutation.

Indeed, causality (''the future does not affect the past'') is conceptually most related to dispersion relations (where causal arguments enter in an essential way throughout quantum mechanics, even in the nonrelativistic case) and to Lorentz invariance (already in classical mechanics where ''microcausality'' is trivially valid). Both figure very prominently in causal perturbation theory.
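To spell out the dispersion-relation statement: if a retarded response function ##\chi(\omega)## describes a response that never precedes its stimulus and decays suitably for large ##|\omega|##, it is analytic in the upper half of the complex ##\omega##-plane and satisfies the Kramers–Kronig relation
$$\operatorname{Re}\chi(\omega)\;=\;\frac{1}{\pi}\,\mathcal P\!\int_{-\infty}^{\infty}\frac{\operatorname{Im}\chi(\omega')}{\omega'-\omega}\,d\omega'.$$
This expresses causality without any reference to spacelike commutators and holds already in nonrelativistic quantum mechanics.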
Tendex said:
maybe that's why the title of Scharf's book uses first the less confusing phrase "Finite QED".
Scharf's QED is finite order by order, just as causal perturbation theory is causal order by order. Since Scharf does not produce finite results in the limit of infinite order you should complain against the appropriateness of the label ''finite'' with the same force as you complain against the label ''causal'' in causal perturbation theory.
 
Last edited:
  • Like
Likes dextercioby
  • #40
A. Neumaier said:
I had already answered this:

BPHZ perturbation theory, say, only defines the perturbative S-matrix, but neither operators nor Wightman N-point functions, and hence no finite-time dynamics. Moreover, it gives nothing for states different from the ground state. Thus it lacks much of what makes a quantum theory well-defined, quite independently of what can be measured.

For comparison, if all that ordinary quantum mechanics could compute for a few particle system were its ground state and the S-matrix, we would have the status quo of 1928, very far from the current state of the art in few particle quantum mechanics.
Yes, that's formally clear. What I don't see is what you gain physics-wise. What is the physical observable related to finite-time dynamics? I also don't understand why you say that in standard PT you don't get Wightman functions. Of course you can calculate them perturbatively within the usual formalism.
 
  • Like
Likes Tendex
  • #41
vanhees71 said:
Yes, that's formally clear. What I don't see is what you gain physics-wise. What is the physical observable related to finite-time dynamics?
What do you mean? The Hamiltonian together with the Schrödinger equation tells how a state changes over a finite time. But in the textbook formalism, the Hamiltonian comes out infinite and cannot be used.
vanhees71 said:
I also don't understand why you say that in standard PT you don't get Wightman functions. Of course you can calculate them perturbatively within the usual formalism.
The textbook formalism only gives the time-ordered N-point functions. How do you time-unorder them?
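To make the point concrete for the simplest case, the time-ordered and Wightman two-point functions are related by
$$\langle T\,\phi(x)\phi(y)\rangle\;=\;\theta(x^0-y^0)\,W(x,y)+\theta(y^0-x^0)\,W(y,x),\qquad W(x,y)=\langle\phi(x)\phi(y)\rangle,$$
so the textbook machinery only delivers the combination on the left. For the vacuum two-point function one can still recover ##W## from the spectral (Källén–Lehmann) representation, but for higher N-point functions there is no equally simple way to undo the time ordering.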
 
  • #42
The question is what I gain from finite-time states in QFT. The problem is first how to interpret them. All you need are transition probabilities between observable states, which have a physical meaning. The closest to something finite in time are, e.g., long-baseline experiments for neutrinos, which you handle with the usual S-matrix theory using wave packets as asymptotic initial and final states, or coincidence measurements of multi-photon states. Also here you need the corresponding correlation functions, which can be evaluated perturbatively, and afaik that's all that's needed in quantum optics to describe the observables.

The "fixed-ordered Wightman functions" should be calculable by deriving the corresponding Feynman rules for them, but again my question: to calculate which observable predictions do you need them for?
 
  • Like
Likes Tendex
  • #43
vanhees71 said:
The question is what I gain from finite-time states in QFT.
If you are only interested in the interpretation of collision experiments, nothing. But if you want to do simulations in time of what happens, you need it. You are doing such simulations, so it is strange that you ask.

As you well know, this goes far beyond BPHZ, which is the textbook material that was under discussion above. Simulations in time are usually done with the CTP formalism, which can produce all required information in a nonperturbative way, given some perturbatively computed input.

The latter must be renormalized, which is done in a nonrigorous ad hoc way. Presumably it can be placed on a more rigorous basis by using the causal techniques.

This may even resolve some of the causality issues reported in the CTP literature. (This was maybe around 10 years ago; I haven't followed it up. Are these problems satisfactorily resolved by now?)
 
  • #44
vanhees71 said:
The "fixed-ordered Wightman functions" should be calculable by deriving the corresponding Feynman rules for them
But there is no Dyson series for them, so Feynman rules cannot be derived in the textbook way.

And CTP only produces 2-point Wightman functions, but not the ##N##-point functions for ##N>2##.
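For reference, a sketch of the contour-ordered two-point functions delivered by the CTP (Schwinger–Keldysh) formalism, with ##+## and ##-## labelling the forward and backward branches of the time contour (label conventions differ between authors):
$$G^{++}=\langle T\,\phi(x)\phi(y)\rangle,\quad G^{--}=\langle \tilde T\,\phi(x)\phi(y)\rangle,\quad G^{-+}=\langle \phi(x)\phi(y)\rangle,\quad G^{+-}=\langle \phi(y)\phi(x)\rangle.$$
The Wightman-type information obtained directly in this way is thus restricted to the ##N=2## case.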
 
  • #45
A. Neumaier said:
I think the name ''microcausality'' for the spacelike commutation rule, though quite common, is the real misnomer. The commutation rule is rather characteristic of locality (''experiments at the same time but different places can be independently prepared''), as indicated by the title of Haag's book. Locality is intrinsically based on spacelike commutation and cannot be discussed without it or directly equivalent properties.

On the other hand, the relation between causality and spacelike commutation is indirect, restricted to relativistic QFT. Moreover, the relation works only in one direction since spacelike commutation requires a notion of causality for its definition, while causality can be discussed easily without spacelike commutation.

Indeed, causality (''the future does not affect the past'') is conceptually most related to dispersion relations (where causal arguments enter in an essential way throughout quantum mechanics, even in the nonrelativistic case) and to Lorentz invariance (already in classical mechanics where ''microcausality'' is trivially valid). Both figure very prominently in causal perturbation theory.
I basically agree with the above. There is a degree of mixing between the different terms in certain contexts, though. In the free theory, as in the classical case you mention, causality and microcausality trivially overlap, and relativistically this is reinforced by the symmetries, which include time reversal and spacetime translations; the latter are also present in the putative non-perturbative local QFT. There you have the triad of Poincaré covariance, locality and unitarity inextricably united through the analytic dispersion relations, extended from the Kramers relations to the whole complex plane.

Of course, in the perturbative theory this has to be spoiled a little, and (always assuming the existence of the non-perturbative theory it approximates, which is what allows the necessary deformation of the free fields) locality splits from the other two.

A. Neumaier said:
Scharf's QED is finite order by order, just as causal perturbation theory is causal order by order. Since Scharf does not produce finite results in the limit of infinite order you should complain against the appropriateness of the label ''finite'' with the same force as you complain against the label ''causal'' in causal perturbation theory.
Well, I think I complained about the concepts behind it (which is what I care about) enough in other threads to get my point across. My comment, as I said, was purely about the word "causal" and its connotations, and the conflation with locality and (micro)causality that is known to create endless confusion in Bell-like discussions.
 
  • #46
Wightman propagators obviously belong to the axiomatic QFT proposal; I guess that by what you get from "the standard perturbative theory" vanhees71 means the plain Feynman propagator.
 
  • #47
Tendex said:
Wightman propagators obviously belong to the axiomatic QFT proposal; I guess that by what you get from "the standard perturbative theory" vanhees71 means the plain Feynman propagator.
Yes, but he had asked what can be computed from approximate field operators that is not accessible by standard perturbative theory. I gave as examples the unordered N-point correlation functions. These are defined independently of the Wightman axioms; the latter just specify their desired covariance and locality conditions.
 
  • #48
Since CPT is UV-complete, does it suggest a physically motivated regularization scheme for old-fashioned QFT?
 
  • #49
HomogenousCow said:
Since CPT is UV-complete, does it suggest a physically motivated regularization scheme for old-fashioned QFT?
It is a physically motivated renormalization scheme replacing old-fashioned QFT, making regularization unnecessary. Why should one still want to regularize?
 
  • #50
A. Neumaier said:
If you are only interested in the interpretation of collision experiments, nothing. But if you want to do simulations in time of what happens, you need it. You are doing such simulations, so it is strange that you ask.

As you well know, this goes far beyond BPHZ, which is the textbook material that was under discussion above. Simulations in time are usually done with the CTP formalism, which can produce all required information in a nonperturbative way, given some perturbatively computed input.

The latter must be renormalized, which is done in a nonrigorous ad hoc way. Presumably it can be placed on a more rigorous basis by using the causal techniques.

This may even resolve some of the causality issues reported in the CTP literature. (This was maybe around 10 years ago; I haven't followed it up. Are these problems satisfactorily resolved by now?)
Yes, I'm doing simulations of what's going on in heavy-ion collisions, but that's far from the claim that you have a physical interpretation of "transient states". Take, e.g., the calculations I've done for dilepton production, i.e., the production of ##\text{e}^+ \text{e}^-##- and ##\mu^+ \mu^-##-pairs from a hot and dense strongly interacting medium.

The medium itself is described by either a fireball ("blastwave") parametrization of a hydrodynamic medium or in a coarse-graining approach with a relativistic transport-model simulation. In any case one maps the many-body system's evolution to a local-thermal-equilibrium situation.

For the dilepton-production rates we use spectral functions from an equilibrium-QFT calculation. The QFT observable here is the thermal electromagnetic-current autocorrelation function, i.e., the "retarded" expectation value of ##\hat{j}^{\mu}(x) \hat{j}^{\nu}(y)## with respect to the grand-canonical statistical operator. This can be evaluated either in the Matsubara formalism and then analytically continued to the corresponding retarded two-point function, or one uses the Schwinger-Keldysh real-time formalism to evaluate this retarded two-point function directly. What enters the dilepton-production-rate formula is the imaginary part of the retarded two-point function, via the famous McLerran formula.
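For readers following along, the central object here can be written schematically (normalization conventions vary) as the thermal retarded current–current correlator
$$\Pi^{\mu\nu}_{\rm ret}(q)\;=\;-i\int d^4x\;e^{iq\cdot x}\,\theta(x^0)\,\big\langle\big[\hat{j}^{\mu}(x),\hat{j}^{\nu}(0)\big]\big\rangle_T ,$$
whose imaginary part, weighted with a Bose–Einstein factor and the lepton phase-space factors, enters the McLerran–Toimela dilepton rate.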

So everything from QFT is within the standard formalism of thermal correlation functions. There's no need to physically interpret transient states; all "particles" observed are calculated in the sense of the usual concept of asymptotic free states.

The description of the bulk medium is in terms of semi-classical transport theories, behind which, when looked at from the point of view of many-body QFT, also lies the interpretation of "particles" in terms of asymptotic free states.
 
  • #51
A. Neumaier said:
It is a physically motivated renormalization scheme replacing old-fashioned QFT, making regularization unnecessary. Why should one still want to regularize?
Well, in a way you also regularize by using the "smeared" operators, and that's very physical. Already in classical electrodynamics plane waves are mode functions, i.e., a calculational tool to solve the Maxwell equations for physically realistic fields, i.e., for fields with finite total energy, momentum, and angular momentum. It's just using generalized eigenfunctions of the appropriate self-adjoint operators (in this case the d'Alembert operator).
Indeed, of all the attempts to make QFT mathematically rigorous, this causal-perturbation-theory approach with "smearing" of the distribution-like operators is the most physically plausible, and I have no quibbles with it in principle. I only don't see what one gains from it physics-wise, i.e., which physically observable quantities can be calculated with it that cannot be calculated within the standard scheme. In standard PT dim. reg. is very convenient as a regularization scheme, and the renormalized theory is anyway independent of the regularization scheme. You can also do the subtraction in a BPHZ-like manner without intermediate regularization, but that can be tricky, and dim. reg. is just more convenient.
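To make the "smearing" explicit in formulas: in the causal approach the field is an operator-valued distribution evaluated on test functions,
$$\phi(f)\;=\;\int d^4x\; f(x)\,\phi(x),\qquad f\in\mathcal S(\mathbb R^4),$$
and the coupling itself is multiplied by a test function ##g(x)## that switches the interaction on and off smoothly; physical quantities are approached in the adiabatic limit ##g\to\text{const}##.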
 
  • #52
vanhees71 said:
So everything from QFT is within the standard formalism of thermal correlation functions. There's no need to physically interpret transient states; all "particles" observed are calculated in the sense of the usual concept of asymptotic free states.

The description of the bulk medium is in terms of semi-classical transport theories, behind which, when looked at from the point of view of many-body QFT, also lies the interpretation of "particles" in terms of asymptotic free states.
Transient states are important, but they are states of the interacting quantum field, not particle states. Particle states are only meaningful as asymptotic states. I didn't claim any relation of field operators to an unphysical interpretation of transient states as particle states; transient states are more complex objects.

But I claimed that N-point functions other than those occurring in the textbook description of QFT are relevant. You confirm this by referring to retarded correlation functions of currents:
vanhees71 said:
The QFT observable here is the thermal electromagnetic-current autocorrelation function, i.e., the "retarded" expectation value of
These are not covered by the BPHZ approach to renormalization.
vanhees71 said:
in a way you also regularize by using the "smeared" operators
No; these are just arbitrary functions, dummy parameters in the causal approach, analogous to the ##x## in a perturbative calculation of ##e^x##.
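To spell out the analogy: the perturbative S-matrix of causal perturbation theory is constructed as a formal power series in the switching function (Epstein–Glaser/Scharf normalization, up to conventions),
$$S(g)\;=\;\mathbf 1+\sum_{n=1}^{\infty}\frac{1}{n!}\int d^4x_1\cdots d^4x_n\;T_n(x_1,\dots,x_n)\,g(x_1)\cdots g(x_n),$$
so ##g## plays the role of the expansion variable, like ##x## in ##e^x=\sum_n x^n/n!##, and not that of a regulator that has to be removed at the end.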
vanhees71 said:
In standard PT dim. reg. is very convenient as a regularization scheme, and the renormalized theory is anyway independent of the regularization scheme.
It is a physically meaningless regularization scheme, as ##4-\epsilon##-dimensional space is unphysical and not even mathematically well-defined. Moreover, the independence of the regularization scheme is an assumption, not something proved; on the contrary, there are disputes, since in certain situations there seem to be disagreements.
 
  • #53
At least in equilibrium many-body QFT you can show that order by order all you need to renormalize are the vacuum pieces of the proper vertex functions. In this sense BPHZ is sufficient.

Do you have an example, where two properly applied regularization schemes lead to different results for the renormalized quantities?

I thought it's self-evident that with fixed renormalization conditions the proper vertex functions are unique and then the physical quantities like S-matrix elements, pole masses defining physical masses of particles/resonances, etc. are independent of this scheme. The reason is that you can use, in principle, some subtraction scheme a la BPHZ without any intermediate regularization, using just the renormalization conditions for the proper vertex functions.
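A minimal illustration of such a subtraction without intermediate regularization, sketched for a single logarithmically divergent one-loop bubble in a scalar theory with the renormalization condition that the subtracted function vanishes at ##p=0##:
$$\Pi_{\rm ren}(p)\;\propto\;\int d^4k\left[\frac{1}{(k^2-m^2+i\epsilon)\,\big((k+p)^2-m^2+i\epsilon\big)}-\frac{1}{(k^2-m^2+i\epsilon)^2}\right].$$
The difference of the two integrands falls off fast enough at large ##k## that the integral converges; for higher orders with overlapping divergences the full BPHZ forest formula organizes the nested subtractions.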
 
  • #54
vanhees71 said:
At least in equilibrium many-body QFT you can show that order by order all you need to renormalize are the vacuum pieces of the proper vertex functions. In this sense BPHZ is sufficient.

Do you have an example, where two properly applied regularization schemes lead to different results for the renormalized quantities?
It depends on the meaning of the undefined term 'properly applied'. By convention, regularization schemes are deemed properly applied when they lead to the same results for the renormalized quantities. This is the only known criterion.

Sometimes one gets different results, however...

Then one has to investigate why they differ and which result (if any) is to be trusted. I believe that I read a couple of papers with examples where dimensional regularization was thought to be in error. But I need more time to check this and to retrieve the references.

vanhees71 said:
I thought it's self-evident that with fixed renormalization conditions the proper vertex functions are unique and then the physical quantities like S-matrix elements, pole masses defining physical masses of particles/resonances, etc. are independent of this scheme.
It is self-evident only until counterexamples are found, which force one to be more specific about how to ''properly apply'' the technique beyond what can be found in standard sources.

It would be self-evident if the schemes were rigorously derived from an undisputed rigorous definition of the theory. But the latter does not exist yet.

It is similar to what happens with self-adjointness. You acknowledge in your lecture notes that it is a necessary property of Hamiltonians. But you seem to take it as self-evident that all Hamiltonians actually used have this property. At least you never give sufficient conditions that would allow readers to check for themselves the self-adjointness of Hamiltonians written down formally. Usually, the property holds, but there are exceptions, and they are heuristically recognized only by producing faulty results. Few physicists care about giving proof of the self-adjointness of the Hamiltonians they use. I even wonder how you would check one for self-adjointness.
 
Last edited:
  • #55
Well yes, we are physicists, not mathematicians.

One example where you pretty often find wrong statements in introductory textbooks is the box with rigid boundary conditions and the claim that there is a momentum operator. ;-)
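For readers who want the precise statement behind this example (a standard functional-analytic fact): on ##L^2([0,L])## with rigid-box (Dirichlet) boundary conditions the candidate momentum operator is
$$\hat p=-i\hbar\,\frac{d}{dx},\qquad D(\hat p)=\{\psi\in H^1([0,L]):\psi(0)=\psi(L)=0\},$$
which is symmetric but not self-adjoint; its deficiency indices are ##(1,1)##, and the self-adjoint extensions require quasi-periodic boundary conditions ##\psi(L)=e^{i\vartheta}\psi(0)##, incompatible with the rigid box.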

I still don't know which examples you have in mind where the standard regularization techniques lead to erroneous results.

One example may be the ##\gamma^5## problem in dim reg and the chiral anomaly, but this has to be solved anyway by arguing which current has to be anomalously broken and which one must stay conserved.
 
  • #56
vanhees71 said:
I still don't know which examples you have in mind where the standard regularization techniques lead to erroneous results.
It is called more politely "renormalization ambiguities".
Wu said:
The conventional scale-setting procedure assigns an arbitrary range and an arbitrary systematic error to fixed-order pQCD predictions. In fact, this ad hoc procedure gives results which depend on the choice of the renormalization scheme, and it is in conflict with the standard scale-setting procedure used in QED. Predictions for physical results should be independent of the choice of scheme or other theoretical conventions.
This is a quote from the abstract of https://arxiv.org/pdf/1302.0599.pdf. It says explicitly that this is a desirable assumption, not an achieved fact. See also
Even when it can be done, showing equivalence of two renormalization schemes is usually a highly nontrivial matter leading to a publication. This means that assessing whether a renormalization procedure is "properly applied" is in all but the simplest cases more an art than a science.
 
  • Like
  • Informative
Likes dextercioby and vanhees71
  • #57
Yes, sure, that's a much more puzzling problem than what I had in mind with my statement above. What I meant is that for a given Feynman diagram for a proper vertex function you get a unique answer, given a renormalization scheme usually involving a renormalization scale, independent of the intermediate regularization you use. So these proper vertex functions and the S-matrix elements within the chosen renormalization scheme are independent of the chosen regularization, but of course dependent on the chosen renormalization scheme and on the renormalization scale.

The S-matrix is of course only independent of the renormalization scale up to the order of the expansion parameter (couplings or ##\hbar##, number of loops...) taken into account, and one can resum the leading logarithms by using RG equations to define running couplings that minimize that dependence.
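For concreteness, the leading-log resummation gives the familiar one-loop running coupling, e.g. in QCD with ##n_f## active flavours,
$$\alpha_s(\mu^2)\;=\;\frac{\alpha_s(\mu_0^2)}{1+\dfrac{b_0}{4\pi}\,\alpha_s(\mu_0^2)\,\ln\dfrac{\mu^2}{\mu_0^2}},\qquad b_0=11-\frac{2}{3}n_f,$$
so the residual scale dependence of a prediction truncated at a given order is formally of the next order in ##\alpha_s##.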

The problem of the uncertainty concerning the dependence on the renormalization scheme and on the corresponding renormalization scale is of course also not strictly solved (nor is the problem of how to define an exact QFT in 1+3 dimensions). To estimate the uncertainty there are some hand-waving rules, e.g., setting the scale around the energy scale of the experiment you want to describe and varying the renormalization scale by some factors around this value, leading to some "uncertainty band".

In thermal pQCD the scale is often chosen at ##2\pi T## (##T## the temperature) and varied around this value by a factor of 2 up and down. It's also a problem that even for the bulk thermodynamic quantities and the equation of state the perturbative series (as far as it can be evaluated anyway) is "not well convergent".

All these problems are far from being solved. Does "causal perturbation theory" offer new ansätze for this problem? That would of course be very interesting, but in the end don't you just get the proper vertex functions of the standard formalism within a special way to regularize, or a special renormalization scheme somehow defined by the "smearing procedure"?

Perhaps I should have a closer look at Scharf's book again. When I looked at it last, some years ago, I had the impression that it's just another technique to get rid of the UV-divergence problems, leading in the end to the same results as the standard physicists' methods, which are a lot simpler to use.
 
  • #58
Does the careful splitting of causal distributions explained in chapter 3 of Scharf's book involve the same dispersion relations and analytic-continuation techniques (for which vol. 2 of Reed and Simon's Methods of Modern Mathematical Physics is given as a reference) as those used for local quantum fields in the Wightman axioms, and also in Green's functions to guarantee positive energy under time reversal?
 
  • Like
Likes vanhees71
  • #59
Tendex said:
Does the careful splitting of causal distributions explained in chapter 3 of Scharf's book involve the same dispersion relations and analytic-continuation techniques (for which vol. 2 of Reed and Simon's Methods of Modern Mathematical Physics is given as a reference) as those used for local quantum fields in the Wightman axioms, and also in Green's functions to guarantee positive energy under time reversal?
As far as I can tell, yes. These techniques are very general.
 
  • Like
Likes vanhees71
  • #60
But then it's indeed equivalent to the standard techniques for evaluating proper vertex functions within a given renormalization scheme.
 
