A. Neumaier said:
Standard renormalized QED at 6 loops is a perfectly well-defined covariant quantum field theory that gives excellent predictions.
Is it a theory at all? It is nothing but an approximation for a particular experiment, namely scattering of particles which start and end as free particles far away.
A. Neumaier said:
Its only defect is that it (extremely slightly) violates the Wightman axioms. Since you discard Wightman's axioms as well, you have no reasons left to consider QED as ill-defined. Thus you should care about standard QED.
No, it is not even a consistent theory. And I do not care about the accuracy of an approximation to a not even well-defined theory; I care first about having a well-defined theory.
A. Neumaier said:
These are already well developed, to the point of giving results with 12 decimals of relative accuracy!
There are lots of well-defined theories completely unrelated to experiment. They are completely irrelevant. To claim physical content for a theory you need to show that you can reproduce the experimental results!
Once I have a well-defined theory, which I have if I use a lattice regularization, I can start using your renormalized QED at 6 loops to compute approximations. So, no problem. Nobody forbids me to use such not-even-theories as approximations for particular situations like scattering.
A. Neumaier said:
Thus to make a Bohmian version of QED based on a lattice you need to spell out which precise lattice field theory (at which lattice spacing, with which interaction constants) you want to consider.
I can consider a particular lattice theory in general, with unspecified constants. Who was it who referenced the paper where lattice computations were used to follow the renormalization down to the scale where the Landau pole should appear, but it did not appear on the lattice? So, computing the renormalization is possible and has already been done, and in this case all lattice approximations are well-defined theories. All one has to do is to compute with this program the resulting large-distance limit of the constants and to compare them with observation.
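As a side remark on where that pole would sit: the standard one-loop estimate (a textbook formula including only the electron loop; the numbers below are my own illustration, not the lattice computation referenced above) already puts the perturbative Landau pole absurdly far above the Planck scale:

```python
import math

# One-loop running of the QED coupling (electron loop only):
#   alpha(mu) = alpha0 / (1 - (2*alpha0/(3*pi)) * ln(mu/mu0))
alpha0 = 1.0 / 137.035999   # fine-structure constant at the electron scale
mu0 = 0.000511              # reference scale: electron mass in GeV

def alpha(mu):
    """One-loop running QED coupling at scale mu (in GeV)."""
    return alpha0 / (1.0 - (2.0 * alpha0 / (3.0 * math.pi)) * math.log(mu / mu0))

# The perturbative Landau pole: the scale where the denominator vanishes.
mu_pole = mu0 * math.exp(3.0 * math.pi / (2.0 * alpha0))

print(f"alpha at 100 GeV: {alpha(100.0):.6f}")     # slightly larger than alpha0
print(f"one-loop pole near 1e{math.log10(mu_pole):.0f} GeV")
```

The pole lands around 10^277 GeV, some 260 orders of magnitude above the Planck scale, so a lattice with spacing anywhere near the Planck length never sees it.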
A. Neumaier said:
For lack of computational evidence you would have to prove theoretically (not just say some handwaving words!) - which is probably impossible in the face of QED triviality and the Fermion doubling problem - that you can accurately approximate this lattice theory in some way that reproduces the standard low energy results of QED. Only then you have a substantiated claim.
QED triviality is not a problem of the lattice theory; it is a problem which appears only in the limit of the lattice spacing going to zero, a limit which I explicitly propose not to take. To go with the lattice spacing below the Planck length simply makes no sense at all. Don't forget that a lattice theory remains well-defined even if the interaction constant is greater than 1, while you will fail completely with your Feynman diagrams.
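The point that a regularized theory stays well-defined at strong coupling can be made with a zero-dimensional toy model (my own illustration, not part of the argument above): the "path integral" on a finite lattice reduces to an ordinary convergent integral, finite for any value of the coupling, even though the perturbation series in the coupling has zero radius of convergence:

```python
import math

def Z(lam, half_width=10.0, n=200_000):
    """Toy partition function Z(lam) = integral of exp(-x^2/2 - lam*x^4) dx,
    evaluated by midpoint quadrature. Finite for ANY lam >= 0."""
    h = 2.0 * half_width / n
    total = 0.0
    for i in range(n):
        x = -half_width + (i + 0.5) * h
        total += math.exp(-0.5 * x * x - lam * x ** 4) * h
    return total

print(Z(0.0))   # free theory: sqrt(2*pi) ~ 2.5066
print(Z(10.0))  # coupling of 10: still a perfectly finite number
```

Nothing diverges at lam = 10; only the attempt to expand Z in powers of lam fails. A lattice field theory is the same kind of object, just with many integration variables instead of one.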
Then, fermion doubling is first of all a problem of accuracy. It appears if you approximate the first derivative at a node n with ##\frac{f(n+1)-f(n-1)}{2h}##, but not if you use the less accurate ##\frac{f(n+1)-f(n)}{h}##. Just to clarify that it is not unsolvable in principle. But, ok, even if we prefer the higher accuracy, we can get rid of unnecessary doublers. In this case, we can use staggered fermions, which reduce the doublers to four. Then, to regularize the theory, we need discretization in space only, not in time. If one uses the original Dirac equation with the ##\alpha_i,\beta## matrices, this gives a staggered evolution equation on a 3D lattice. This reduces the doubling by another factor of two, thus giving two Dirac fermions. That is completely sufficient for the SM, where fermions appear only in electroweak doublets.
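The doubling statement above can be checked in a few lines (my own sketch): in Fourier space the symmetric difference turns into ##\sin(kh)/h##, which vanishes not only at ##k=0## but also at the Brillouin zone edge ##k=\pi/h##, producing a spurious second low-energy mode. The one-sided difference has no second zero, at the price of being only first-order accurate:

```python
import math

h = 1.0  # lattice spacing (illustrative)

def symmetric(k):
    """Fourier symbol magnitude of (f(n+1) - f(n-1)) / (2h): |sin(k*h)| / h."""
    return abs(math.sin(k * h)) / h

def forward(k):
    """Fourier symbol magnitude of (f(n+1) - f(n)) / h: |e^{ikh} - 1| / h."""
    return abs(complex(math.cos(k * h) - 1.0, math.sin(k * h))) / h

k_edge = math.pi / h  # edge of the Brillouin zone

print(symmetric(k_edge))  # ~ 0: a second zero mode, the doubler
print(forward(k_edge))    # 2/h: no doubler at the zone edge
```

The symmetric symbol's extra zero at ##k=\pi/h## is exactly the doubler; staggered fermions then reinterpret these extra modes as components of fewer physical fermions rather than removing them by hand.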
For details, with the explicit 3D lattice, see
arxiv:0908.0591