# Insights Interview with Mathematician and Physicist Arnold Neumaier - Comments

1. Dec 11, 2017

### A. Neumaier

Last edited by a moderator: Dec 11, 2017
2. Dec 11, 2017

### dextercioby

A warm thank-you to all four of you for making this read possible.

3. Dec 11, 2017

### Staff: Mentor

I can't express my gratitude enough for this extremely insightful and informative interview.

Arnold has always been one of my favorite posters; I have learned an enormous amount from him.

In particular, his efforts to dispel, and explain in detail, myths like the reality of virtual particles are much appreciated, and IMHO very important given the huge amount of confusion about the issue, especially among beginners.

Thanks
Bill

4. Dec 11, 2017

### ftr

Arnold, I am interested in your view on EPR. Also, the physical alternative to "virtual particles" seems to be taboo. It seems to me that nonlocality is of the essence.

5. Dec 12, 2017

### A. Neumaier

See the following extended discussions:

https://www.physicsforums.com/threa...is-not-weird-unless-presented-as-such.850860/

The physical alternative is the interpretation of virtual particles as contributions to an infinite series of approximations to scattering amplitudes. There is nothing nonlocal in relativistic QFT; see my post on extended causality.

Last edited: Dec 12, 2017
6. Dec 12, 2017

### DrChinese

Nice definitions. I assume you consider an entangled system (of 2 or more particles) to be an "extended object", correct?

7. Dec 12, 2017

### A. Neumaier

Yes.

Even a single particle is a slightly extended object. In quantum field theory, there are no point particles - only ''pointlike'' particles (which means particles obtained from point particles through renormalization - which makes them slightly extended).

On the other hand, entangled systems of two particles can be very extended objects if the distance between the centers of mass of the particles is large, as in long-distance entanglement experiments. But these very extended objects are very fragile (metastable in the thermal interpretation, due to decoherence by the environment), and it takes a lot of experimental expertise to maintain them in an entangled state.

Last edited: Dec 12, 2017
8. Dec 12, 2017

### thephystudent

Thanks for the insight!

Do you believe gravity will turn out to be fundamentally an emergent force, as in Verlinde's picture (or other forces, e.g., the Coulomb force, https://arxiv.org/abs/1704.04048)?
Or would you consider such ideas rather meaningless due to their low falsifiability?

9. Dec 12, 2017

### A. Neumaier

That Coulomb's (nonrelativistic) force comes out of (relativistic) QED is well known and no surprise. Getting gravity as emergent is in my opinion quite unlikely but not impossible; in fact there have been attempts to do so. (See this link and the surrounding discussion.)

10. Jan 2, 2018

### Urs Schreiber

I feel that this statement deserves more qualification:

What Scharf argues is that, from just the mathematics of perturbative QFT, there is no technical problem with a "non-renormalizable" Lagrangian: non-renormalizability simply means (by definition) that infinitely many constants need to be chosen when renormalizing. While this may be undesirable, it is not mathematically inconsistent (unless one adopts some non-classical foundation of mathematics without, say, the axiom of choice; but this is not the issue physicists are commonly concerned with). Simply make this choice, and that's it.

I feel that the more popular Wilsonian perspective on this point is really pretty much the same statement, just phrased in different words: The more popular statement is that gravity makes sense as an effective field theory at any given cutoff energy, and that one needs to measure/fix further counterterms as one increases the energy scale.

So I feel there is actually widespread agreement on this point, just some difference in terminology. But the true issue is elsewhere: Namely the above statements apply to perturbation theory about a fixed gravitational background. The true issue of quantizing gravity is of course that the concept of background causality structure as used in Epstein-Glaser or Haag-Kastler does not apply to gravity, since for gravity the causal structure depends on the fields. For this simple reason it is clear that beyond perturbation theory, gravity definitely does not have "canonical quantization" if by this one means something fitting established axioms for QFT.

Instead, if quantum gravity really does exist as a quantum field theory (instead of, say, as a holographic dual of a quantum field theory), then this necessarily needs some slightly more flexible version of the Epstein-Glaser and/or Haag-Kastler-type axioms on causality: One needs to have a concept of causality that depends on the fields itself.

I am aware of one program aiming to address and solve this:
I think this deserves much more attention.

Last edited: Jan 2, 2018
11. Jan 3, 2018

### A. Neumaier

This has nothing to do with the axiom of choice; the number of constants to choose is only countably infinite. The level of mathematical and physical consistency is precisely the same as when, in a context where people are used to working with polynomials defined by finitely many parameters, someone suggests using power series instead. The complaint that power series are not predictive because one needs infinitely many parameters to specify them is completely unfounded: it is well known how to specify infinitely many parameters by a finite formula!
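To make the polynomial-versus-power-series analogy concrete, here is a minimal sketch (the choice of the exponential series and the truncation orders are illustrative, not anything from the discussion):

```python
from math import exp, factorial

# A power series has infinitely many coefficients, yet a finite
# formula can specify them all: here c_n = 1/n! defines exp(x).
def coefficient(n):
    return 1.0 / factorial(n)

def truncated_series(x, order):
    # Finite-dimensional approximation: keep terms up to 'order'.
    return sum(coefficient(n) * x**n for n in range(order + 1))

# The finite formula pins down every coefficient, so predictions
# can be made to any desired accuracy by truncating high enough.
x = 0.5
for order in (2, 5, 10):
    print(order, abs(truncated_series(x, order) - exp(x)))
```

The truncation error shrinks rapidly with the order, illustrating why infinitely many parameters given by a finite rule are no obstacle to predictivity.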
In causal perturbation theory there is no cutoff, so Wilson's point of view is less relevant. Everything is constructed exactly; there is nothing effective. Gravity is no exception! Of course one can still make approximations to simplify a theory to an approximate effective low energy theory in the Wilson sense, but this is not intrinsic in the causal approach. (Nobody thinks of power series as being only effective approximations of something that fundamentally should be polynomials in a family of fundamental variables.)

This is not really a problem. It is well known that, just as massless spin-1 quantization produces gauge invariance, so massless spin-2 quantization produces diffeomorphism invariance. Hence as long as two backgrounds describe the same smooth manifold when the metric is ignored, they are for constructive purposes equivalent. Thus one may simply choose at each point a local coordinate system consisting of orthogonal geodesics, obtaining a Minkowski parameterization in which one can quantize canonically. Locality and diffeomorphism invariance make the construction behave correctly in every other frame.

Last edited: Jan 3, 2018
12. Jan 3, 2018

### Urs Schreiber

Careful with sweeping statements.

Sure, but nevertheless, it needs a choice axiom, here the axiom of countable choice. Not that it matters for the real point of concern in physics, I just mentioned it for completeness.

Don't confuse the way to make a choice with the space of choices. A formula is a way to write down a choice. But there are still many formulas.

Incidentally, this is what the string perturbation series gives: a formula for producing a certain choice of the infinitely many counterterms in (some extension of) perturbative gravity. To some extent one may think of perturbative string theory as parameterizing (part of) the space of choices of renormalization parameters for gravity by 2d SCFTs. If these in turn arise as sigma models, this gives a way to parameterize these choices by differential geometry. It seems that the subspace of choices thus parameterized is still pretty large, though ("landscape"). Unfortunately, despite much discussion, little is known for sure about this.

Not in perturbation theory, but that's clear.

Last edited: Jan 3, 2018
13. Jan 3, 2018

### Urs Schreiber

Yes, maybe a key problem is to understand the relation between the Haag-Kastler axioms (local nets) for the algebras of quantum observables and similar axioms for the operator product expansion. These days there is a growing community trying to phrase QFT in terms of factorization algebras, and since these generalize vertex operator algebras in 2d, they are to be thought of as formalizing OPEs in Euclidean (Wick-rotated) field theory. Recently there has been a suggestion on how to relate these to causal perturbation theory/pAQFT:
but I suppose many things still remain to be understood.

14. Jan 3, 2018

### A. Neumaier

No. The parameters come with a definite grading and finitely many parameters for each grade; hence one can make the choice constructive (in many ways, e.g., by forcing all parameters of large grade to vanish).
Of course. But making choices does not require a nonconstructive axiom. Each of the possible choices deserves to be called a possible theory of quantum gravity, so there are many testable constructive choices (and possibly some nonconstructive ones if one admits a corresponding axiom).

Making the right choice is, as with any theory building, a matter of matching Nature through experiment. But of course, given our limited capabilities, there is not much to distinguish different proposed formulas for choosing the parameters; see C.P. Burgess, Quantum Gravity in Everyday Life: General Relativity as an Effective Field Theory. Only one new parameter (the coefficient of curvature^2) appears at one loop, where Newton's constant of gravitation becomes a running coupling constant with $$G(r) = G - \frac{167}{30\pi}\,\frac{G^2}{r^2} + \cdots$$ in terms of a renormalization length scale $r$ (in units where $\hbar = c = 1$); the correction is already far below the current observability limit.
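To see why this correction is unobservable, one can estimate its size; a rough back-of-the-envelope sketch, assuming the standard restoration of $\hbar$ and $c$ in which the relative correction is of order $(\ell_P/r)^2$ with $\ell_P$ the Planck length (the nuclear length scale used below is my own choice of example):

```python
from math import pi

# Planck length in meters (CODATA value, rounded).
l_planck = 1.616e-35

def relative_correction(r):
    # Relative one-loop shift of Newton's constant at distance r,
    # |delta G / G| ~ (167 / (30 pi)) * (l_P / r)**2.
    return (167.0 / (30.0 * pi)) * (l_planck / r) ** 2

# Even at a nuclear length scale of 1 fm the correction is tiny.
print(relative_correction(1e-15))  # roughly 5e-40
```

A relative effect of order $10^{-40}$ at femtometer distances is hopelessly far below any conceivable measurement, which is the point being made.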

15. Jan 3, 2018

### Urs Schreiber

It's two different perspectives on the same phenomenon of pQFT. This is discussed in section 5.2 of Brunetti-Dütsch-Fredenhagen 09.

16. Jan 3, 2018

### A. Neumaier

Yes. The Stueckelberg renormalization group is what is left of the Wilson renormalization semigroup when no approximation is permitted. In causal perturbation theory one only has the former. It describes a parameter redundancy of any fixed theory.

Approximating a theory by a simpler one to get better numerical access is a completely separate thing from constructing the theory in the first place. Causal perturbation theory clearly separates these issues, concentrating on the second.

17. Jan 3, 2018

### Urs Schreiber

The space of choices is of finite dimension in each degree, but it is not a finite set in each degree. In general, given a function $p : S \to \mathbb{Z}$, one needs a choice principle to pick a section unless there is secretly some extra structure around. Maybe that's what you are claiming.

This sounds like you are claiming that there is a preferred section (zero section), when restricting to large elements. This doesn't seem to be the case to me, in general. But maybe I am missing something.

18. Jan 3, 2018

### Urs Schreiber

Not sure what this is debating, I suppose we agree on this. What I claimed is that the choice of renormalization constants in causal perturbation theory (in general infinite) corresponds to the choice of couplings/counterterms in the Wilsonian picture. It's two different perspectives on the same subject: perturbative QFT.

19. Jan 3, 2018

### vanhees71

Well, conventional BPHZ also has no UV cutoff, only a renormalization scale.

Of course, a theory which is not Dyson renormalizable (like, e.g., chiral perturbation theory) necessarily needs a kind of "cut-off scale", since to make sense of the theory you have to read it as an expansion in powers of energy-momentum. Since it's a low-energy theory, you need to say what the "large scale" is (in $\chi$PT it's $4 \pi f_{\pi} \simeq 1 \; \text{GeV}$).
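The quoted breakdown scale follows directly from the pion decay constant; a quick numerical check, assuming the standard value $f_\pi \approx 92\,\text{MeV}$ (not stated in the post):

```python
from math import pi

f_pi = 0.092  # pion decay constant in GeV (approximately 92 MeV)

# The chiral expansion is organized in powers of (p / Lambda_chi),
# with breakdown scale Lambda_chi ~ 4 * pi * f_pi.
lambda_chi = 4 * pi * f_pi
print(round(lambda_chi, 2))  # roughly 1.16, i.e. ~1 GeV
```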

20. Jan 3, 2018

### A. Neumaier

I am claiming that each particular choice made by a particular formula gives a valid solution. In particular, I gave the simplest constructive choice that leads to a valid solution, namely setting to zero enough higher-order parameters in a given renormalization fixing scheme (one that relates the available choices to values of higher-order coefficients in the expansion of some observables, just as the standard renormalization scales are fixed by matching physical constants such as masses, charges, and the gravitational constant). This gives one constructive solution for each renormalization fixing scheme, each of them consistent with experiment for the foreseeable future.

There are of course many other constructive solutions, but there is no way to experimentally distinguish between them in my lifetime. Thus my simple solution is adequate and, in view of Ockham's razor, the best. You may claim that the dependence on the renormalization fixing scheme is ugly, but this doesn't matter - Nature's choices don't have to be beautiful; only the general theory behind them should have this property.

Maybe there are additional principles (such as string theory) giving extra structure that would lead to other choices, but being experimentally indistinguishable in the foreseeable future, there is no compelling reason to prefer them unless working with them is considerably simpler than working with my simple recipe.

Last edited: Jan 3, 2018
21. Jan 3, 2018

### A. Neumaier

Wilson's perspective is intrinsically approximate; changing the scale changes the theory. This is independent of renormalized perturbation theory (whether causal or not), where the theory tells us that there is a vector space of parameters from which to choose one point that defines the theory. The vector space is finite-dimensional iff the theory is renormalizable.

In the latter case we pick a parameter vector by calculating its consequences for observable behavior and matching as many key quantities as the vector space of parameters has dimensions. This is deemed sufficient and needs no mathematical axiom of choice of any sort. Instead it involves experiments that restrict the parameter space to sufficiently narrow regions.

In the infinite-dimensional case the situation is similar, except that we would need to match infinitely many key quantities, which we do not (and will never) have. This just means that we are not able to tell which precise choice Nature is using.

But we don't know this anyway for any physical theory - even in QED, the best theory we ever had, we know Nature's choice only to 12 digits of accuracy or so. Nevertheless, QED is very predictive.

Infinite dimensions do not harm predictability elsewhere in physics. In fluid dynamics, the solutions of interest belong to an infinite-dimensional space. But we are always satisfied with finite-dimensional approximations - the industry pays a lot for finite element simulations because its results are very useful in spite of their approximate nature. Thus there is nothing bad in not knowing the infinite-dimensional details as long as we have good enough finite-dimensional approximations.
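The point about finite-dimensional approximations sufficing can be illustrated with a toy truncation; a sketch only, where piecewise-linear interpolation of a smooth function stands in for the finite-element solvers mentioned (the function and element counts are my own choices):

```python
import math

def piecewise_linear_error(f, a, b, n):
    # Max error of interpolating f by n linear "elements" on [a, b],
    # estimated by sampling on a fine grid.
    nodes = [a + (b - a) * i / n for i in range(n + 1)]
    def interp(x):
        k = min(int((x - a) / (b - a) * n), n - 1)
        x0, x1 = nodes[k], nodes[k + 1]
        t = (x - x0) / (x1 - x0)
        return (1 - t) * f(x0) + t * f(x1)
    xs = [a + (b - a) * i / 1000 for i in range(1001)]
    return max(abs(f(x) - interp(x)) for x in xs)

# Doubling the number of elements cuts the error roughly fourfold,
# so a modest finite-dimensional model already serves for predictions
# even though the exact solution lives in an infinite-dimensional space.
for n in (4, 8, 16):
    print(n, piecewise_linear_error(math.sin, 0.0, math.pi, n))
```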

22. Jan 3, 2018

### A. Neumaier

Yes, this is quite similar to causal perturbation theory.

23. Jan 3, 2018

### A. Neumaier

This is nice in that it illustrates the concepts on free scalar fields, so that one can understand them without all the technicalities that come later with the renormalization. I don't yet have a good feeling for factorization algebras, though.

24. Jan 3, 2018

### Urs Schreiber

I'd eventually enjoy a more fine-grained technical discussion of some of these matters, to clear up the issues. But for the moment I'll leave it at that.

By the way, not only may we view the string perturbation series as a way to choose these infinitely many renormalization parameters for gravity by way of other data, but the same is also true for "asymptotic safety". Here it's the postulate of being on a finite-dimensional subspace in the space of couplings that amounts to the choice.

25. Jan 4, 2018

### Urs Schreiber

The space of choices of renormalization parameters at each order is not a vector space but an affine space. There is no invariant meaning of "setting to zero" these parameters unless one has already chosen an origin in these affine spaces. The latter may be addressed as a choice of renormalization scheme, but this just gives another name to the choice to be made; it still does not give a canonical choice.

You know this, but here are pointers to the details for those readers who have not seen this:

In the original Epstein-Glaser 73 the choice at order $\omega$ happens on p. 27, where it says "we choose a fixed auxiliary function $w \in \mathcal{S}(\mathbb{R}^n)$ such that...". With the choice of this function they build one solution to the renormalization problem at this order (for them, a splitting of distributions), which they call $(T^+, T^-)$. With this "origin" chosen, every other solution of the renormalization at that order is labeled by a vector space of renormalization constants $c_\alpha$ (on their p. 28, after "The most general solution"). It might superficially seem as if we could renormalize canonically by declaring "choose all $c_\alpha$ to be zero". But this is an illusion: the choice is now in the "scheme" $w$ relative to which the $c_\alpha$ are given.

In the modern reformulation of Epstein-Glaser's work in terms of extensions of distributions in Brunetti-Fredenhagen 00 the analogous step happens on p. 24 in or below equation (38), where at order $\omega$ bump functions $\mathfrak{w}_\alpha$ are chosen. The theorem 5.3 below that states then that with this choice, the space of renormalization constants at that order is given by coefficients relative to these choices $\mathfrak{w}_\alpha$.

One may succinctly summarize this statement by saying that the space of renormalization parameters at each order, while not having a preferred element (in particular, not being a vector space with a zero element singled out), is a torsor over a vector space, meaning that after any one point is chosen, the remaining points form a vector space relative to this point. That more succinct formulation of theorem 5.3 in Brunetti-Fredenhagen 00 is made, for instance, as corollary 2.6 on p. 5 of Bahns-Wrochna 12.
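The torsor structure can be mimicked in a few lines; this is purely schematic, with the "schemes" and numbers invented for illustration (they carry no physical meaning):

```python
# An affine space of renormalization choices: points are valid
# renormalizations, and only differences of points are vectors.
# "Set the constant to zero" is meaningful only relative to a
# chosen origin, i.e. a chosen scheme.

scheme_w = 0.7   # origin fixed via an auxiliary choice w (invented number)
scheme_v = -1.3  # a different auxiliary choice (invented number)

def renormalization(origin, c):
    # A point of the affine space: the chosen origin shifted by
    # a renormalization constant c from the vector space.
    return origin + c

# "c = 0" in scheme w and "c = 0" in scheme v name different points:
assert renormalization(scheme_w, 0.0) != renormalization(scheme_v, 0.0)

# But the difference of any two points is scheme-independent:
p = renormalization(scheme_w, 0.5)
q = renormalization(scheme_w, 0.1)
print(p - q)  # 0.4: differences live in the vector space
```

The design point: no element of the affine space is distinguished, so "zero" is always relative to a scheme, exactly as in the Epstein-Glaser construction described above.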

Hence for a general Lagrangian there is no formula for choosing the renormalization parameters at each order. It is only in very special situations that we may give a formula for choosing the infinitely many renormalization parameters. Three prominent such situations are the following:

1) If the theory is "renormalizable", in that it so happens that after some finite order the space of choices of parameters contains a single point. In that case we may make a finite number of choices and then the remaining choices are fixed.

2) If we assume the existence of a "UV-critical hypersurface" (e.g. Nink-Reuter 12, p. 2), which comes down to postulating a finite dimensional submanifold in the infinite dimensional space of renormalization parameters and postulating/assuming that we make a choice on this submanifold. Major extra assumptions here. If they indeed happen to be met, then the space of choices is drastically shrunk.

3) We assume a UV-completion by a string perturbation series. This only works for field theories which are not in the "swampland" (Vafa 05). It transforms the space of choices of renormalization parameters into the space of choices of full 2d SCFTs of central charge 15, the latter also known as the "perturbative landscape". Even though this space has received a lot of press, it seems that way too little is known about it to say much at the moment. But that's another discussion.

There might be more, but the above three seem to be the most prominent ones.