Having revisited renormalization, I find that group theory has put some distance between it and the realm of the purely ad hoc. There is no denying it works. I have some discomfort with the approach, and I don't think I'm alone; but it is also safe to say no one has come up with a better one. The main issue is the apparent arbitrariness in choosing which solutions to ignore. The risk of discarding a solution that later turns out to be meaningful appears unavoidable. Of course you can always come back and revisit these solutions if need be, but that seems a rather haphazard way of doing science. The history of this controversy is interesting in itself. Bear with me if some of these references are dated and the questions raised have since been satisfactorily answered.
First, by Popper: Moreover, the situation is unsatisfactory even within electrodynamics, in spite of its predictive successes. For the theory, as it stands, is not a deductive system. It is, rather, something between a deductive system and a collection of calculating procedures of somewhat ad hoc character. I have in mind, especially, the so-called 'method of renormalization': at present, it involves the replacement of an expression of the form 'lim log x - lim log y' by the expression 'lim (log x - log y)'; a replacement for which no better justification is offered than that the former expression turns out to be equal to ∞ − ∞, and therefore to be indeterminate, while the latter expression leads to excellent results (especially in the calculation of the so-called Lamb-Retherford shift). It should be possible, I think, either to find a theoretical justification for the replacement or else to replace the physical idea of renormalization by another physical idea - one that allows us to avoid these indeterminate expressions. (Karl Popper: Quantum Theory and the Schism in Physics).
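To make Popper's point concrete, here is a minimal schematic in my own notation, not his: write the divergent quantities in terms of a cutoff Λ and two reference scales m_1 and m_2 (both placeholders chosen purely for illustration). Taking the two limits separately gives an indeterminate ∞ − ∞, while taking the limit of the difference gives a finite, cutoff-independent answer:

\[
  \lim_{\Lambda\to\infty}\log\frac{\Lambda}{m_1}
  \;-\;
  \lim_{\Lambda\to\infty}\log\frac{\Lambda}{m_2}
  \;=\; \infty - \infty
  \quad\text{(indeterminate),}
\]
\[
  \lim_{\Lambda\to\infty}\!\left(
    \log\frac{\Lambda}{m_1} - \log\frac{\Lambda}{m_2}
  \right)
  \;=\; \log\frac{m_2}{m_1}
  \quad\text{(finite).}
\]

The second form is the kind of cancellation renormalized calculations rely on; Popper's complaint is that the justification for preferring it is that it works, not that it follows from the theory.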
By Sachs: The well known trouble with RQFT [Relativistic Quantum Field Theory] is that when its formal expression is examined for its solutions, it is found that it does not have any! This is because of infinities that are automatically generated in this formulation. After this failure of the quantum theory was discovered, renormalization computational techniques were invented that provide a recipe for subtracting away the infinities and thereby generating finite predictions--some which had amazing empirical success. But the trouble is that a) such a scheme is not demonstrably mathematically consistent (implying that, in principle, any number of predictions could come from the same physical situations, though one of them is empirically correct) and b) there still remains the problem that there are no finite solutions for the problem. (http://www.compukol.com/mendel/)
By Penrose: A popular approach to quantum field theory is via 'path integrals', which involve forming quantum linear superpositions not just of different particle states (as with ordinary wave functions), but of entire space-time histories of physical behaviour (see Feynman 1985, for a popular account). However, this approach has additional infinities of its own, and one makes sense of it only via the introduction of various 'mathematical tricks'. Despite the undoubted power and impressive accuracy of quantum field theory (in those few cases where the theory can be fully carried through), one is left with a feeling that deeper understandings are needed before one can be confident of any 'picture of physical reality' that it may seem to lead to. (Roger Penrose: The Emperor's New Mind).
By Davies: ...it may be assumed that the theory will last. Nevertheless, the presence of the infinite quantities which are formally removed by the renormalisation procedure is worrying. (Paul Davies: Superforce: The Search for a Grand Unified Theory of Nature).
By Feyerabend: This procedure [renormalization] consists in crossing out the results of certain calculations and replacing them by a description of what is actually observed. (http://en.wikipedia.org/wiki/Paul_Feyerabend)
And of course the famous one by Feynman: But no matter how clever the word, it is what I call a dippy process! Having to resort to such hocus pocus has prevented us from proving that the theory of quantum electrodynamics is mathematically self-consistent. I suspect that renormalization is not mathematically legitimate. (http://www.quantummatter.com/documents/Einstein-WebPage.pdf)
And one by the somewhat obscure Hanson: When he is absorbed in high-energy problems the scientist must renormalize his equations in order to continue with his physics at all, but the consequences may be more costly than the gain. Before renormalization, he assumed, let us say, that a certain volume of space contained a number of particles, and the physical problem was to discover the probabilities that these particles would have certain positions, velocities or densities. But the mathematical form in which he describes his experiment is altered by renormalization, and altered in such a way that he can no longer assume that there really are any actual particles within the volume he is considering . . . since the equations are applicable either way, particle or not. The physicist's probability determinations then become shadowy references to the 'ghost states of particles', the references themselves are called 'negative probabilities'; and it is difficult to attach any real physical sense to these expressions. (N.R. Hanson: Quanta and Reality).
I do not think group theory puts to rest all of these past criticisms.