What is quantum field theory and why was it developed? What is its relation to quantum mechanics?
Quantum field theory is a theory in which the basic variables are fields, and particles come in secondarily as "quanta" of the fields. The first quantum field theory to be fully developed was quantum electrodynamics, aka QED, in the late 1940s. The reason it was developed at that time, although preliminary work had been done in prewar Europe, was that microwave technology had been highly developed in the radar laboratories during WWII. So two men, Lamb and Retherford, were able to obtain the microwave spectrum of pure hydrogen. And they discovered that one of the lines in the spectrum was shifted a little bit from the position (energy) predicted for it by the best existing theory. This "Lamb shift" became a challenge for the theorists to explain.
Early on, the idea was proposed that the electron was interacting with a cloud of virtual particles drawn around it from the quantum vacuum (which many still thought of as the "Dirac sea"), and Bethe did a back-of-the-envelope approximation that showed the idea could work. Within a year Schwinger had developed a field theory, which still had a few bugs in it. Then Feynman introduced his path integrals and diagrams, and was at first ignored. Then a letter came from Japan saying that Tomonaga had independently developed a theory much like Schwinger's.
Finally Freeman Dyson worked through everybody's theories and showed they were all equivalent, and effectively invented the technology that is now taught as QED in the beginning chapters of QFT textbooks.
When the new theory was applied to the Lamb shift it was astonishingly accurate. It became the ruling theory of high energy physics.
In 1954 Yang and Mills introduced their non-abelian gauge theory with gauge group SU(2), in an attempt to do for the strong force what QED had done for electromagnetism. But Yang-Mills theory had serious problems, and was nearly forgotten as time moved on into the 1960s. In the second half of the sixties, though, Faddeev and Popov showed how to quantize Yang-Mills, and Veltman and 't Hooft showed how to renormalize it. Weinberg and Salam created the SU(2)×U(1) electroweak gauge theory, unifying the electromagnetic and weak forces, and finally in the 1970s several people formulated quantum chromodynamics (QCD), the theory of the strong force, and it was united (not just pasted together) with the electroweak theory to become the Standard Model. Field theory triumphant.
Thanks for the reply. Could you elaborate on the above statement a bit.
Whereas a pure particle theory, like Schrödinger's or Dirac's, has its physics defined by the momenta and positions of particles (which may each range over continua, but are finite in number), a field theory deals in objects (fields) which have infinitely many degrees of freedom. Things that were ordinary functions in particle theories are now functionals. Variables that used to take on numeric values are now distributions.
This makes a huge difference. For example, you cannot guarantee that the product of two distributions exists, which means that all the mathematics of field theory has finicky special cases and tricks to it. Dyson was the first to really cope with these issues, because the facts about distributions only became clear with Laurent Schwartz's work in the late 1940s.
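A toy numerical illustration of why products of distributions are dangerous (just a sketch, nothing specific to QFT): approximate the Dirac delta by narrowing Gaussians. Each approximant integrates to 1, so the delta itself is a perfectly good distribution, but the integral of its square blows up as the regulator is removed.

```python
import numpy as np
from scipy.integrate import quad

def delta_approx(x, eps):
    """Gaussian approximation to the Dirac delta; -> delta(x) as eps -> 0."""
    return np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

for eps in (0.1, 0.01, 0.001):
    # the integral of delta_eps stays 1, but the integral of delta_eps^2 diverges
    area, _ = quad(lambda x: delta_approx(x, eps), -1, 1, points=[0])
    area_sq, _ = quad(lambda x: delta_approx(x, eps)**2, -1, 1, points=[0])
    print(f"eps={eps}: integral of delta = {area:.4f}, integral of delta^2 = {area_sq:.1f}")
```

The squared integral grows like 1/eps, so "delta squared" has no limit: exactly the kind of ill-defined product that renormalization has to tame.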
Typically a field theory starts with a classical Lagrangian, with its Noether currents and possibly a covariant derivative. Then this is quantized, which is now a highly non-trivial business. Because the usual approach involves tacit products of distributions, there are singularities to be handled*.
The way these singularities are handled is by first regularization, and then renormalization. Regularization creates a non-physical but mathematically consistent deformation of the theory, which is used to complete the quantization. Renormalization then shoves the singularities into an external multiplier, where they don't interfere with the innards of the theory, completing the process and removing the deformation.
*There are methods which do not involve the singularities, but they do not produce the handy calculations associated with ordinary renormalization.
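To make the regularize-then-renormalize two-step concrete, here is a deliberately over-simplified sketch (a made-up logarithmically divergent "loop integral", not a real QED calculation): the cutoff Λ is the regulator that keeps everything finite, and the renormalized quantity, a difference of the integral at two mass scales, is independent of the cutoff, so the regulator can be removed at the end.

```python
import numpy as np

def loop_integral(m, cutoff):
    """Toy 'loop integral' int_m^Lambda dk/k = ln(Lambda/m); diverges as cutoff -> infinity."""
    return np.log(cutoff / m)

# Regularization: keep the cutoff finite so every expression is well defined.
# Renormalization: only the difference between two scales is "physical" here;
# the cutoff dependence cancels in it, so the regulator can be removed.
for cutoff in (1e3, 1e6, 1e9):
    bare = loop_integral(1.0, cutoff)                                  # grows without bound
    renormalized = loop_integral(1.0, cutoff) - loop_integral(10.0, cutoff)
    print(f"cutoff={cutoff:.0e}: bare={bare:.3f}, renormalized={renormalized:.3f}")
# the renormalized value is ln(10) for every cutoff
```

Real regularization schemes (dimensional, Pauli-Villars, lattice) are far subtler, but the logic is the same: deform, compute, isolate the divergence, remove the deformation.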
Is this saying that, basically, instead of functions mapping points between sets they are mapping functions to functions? That is, each argument is now a function and not a discrete set of points? Sounds like you are dealing with function spaces, and this reminds me of stuff I studied in real analysis.
Any online resources?
Well, if you have Postscript, there's http://www.pt.tu-clausthal.de/~aswl/scripts/qft.html [Broken].
Protonman, you have the idea. It's more like functional analysis than measure theory, though both of them come in.
Is quantization some sort of advanced general method of statistical analysis? If so, then when do we use a quantization procedure? In what situations does it apply? For example, does a quantization procedure apply when you know you must have a solution, but it is inherently impossible to narrow the answer to only one solution? So you must then calculate the probability of every possible solution and see how the possibilities interfere with each other, a Feynman type of integration? Otherwise, it seems disturbing to have methods only applicable to one situation: a loss of generality.
Quantization apparently means different things to different people! See the discussion of Strings, Branes, and LQG about Thiemann's quantization of "The LQG String". But pretty generally quantization is a process applied to a classical theory to produce a quantum theory. It converts coordinates into states in a Hilbert space and variables into operators on the Hilbert space. And those operators are constrained to obey the commutation rules that enforce uncertainty. What else may be required of a "true quantization" seems to be controversial.
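As a concrete illustration of "variables become operators obeying commutation rules" (a sketch only: the Hilbert space is truncated to finitely many dimensions, so the relation can't hold exactly everywhere), here are the position and momentum operators of a harmonic oscillator built from ladder operators in numpy; away from the truncation edge they satisfy the canonical relation [x, p] = i (in units with hbar = 1):

```python
import numpy as np

N = 50                             # truncate the infinite-dimensional Hilbert space
n = np.arange(1, N)
a = np.diag(np.sqrt(n), k=1)       # annihilation operator: a|n> = sqrt(n)|n-1>
adag = a.conj().T                  # creation operator

# position and momentum as operators (units with hbar = m = omega = 1)
x = (a + adag) / np.sqrt(2)
p = 1j * (adag - a) / np.sqrt(2)

comm = x @ p - p @ x
# away from the truncation edge, [x, p] = i * identity: the canonical
# commutation relation that quantization imposes
print(np.allclose(comm[:N-1, :N-1], 1j * np.eye(N - 1)))
```

The last diagonal entry of the commutator is spoiled by the truncation, which is itself a tiny lesson: the exact commutation relation genuinely needs an infinite-dimensional space.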
For me, there must be relatively many people in comparable numbers on different sides of an issue for it to be "controversial". This certainly isn't the case with LQG-quantization which only leads to wrong theories that have nothing to do with the physical universe.
Jeff, may I gently suggest you read your sig?
Can we state the quintessential geometry of a valid quantization process? Where do the various entities live: in the tangent or cotangent space, in the tangent or cotangent bundle, etc.?
SelfAdjoint learned over the last few weeks that LQG quantization is unphysical. So instead of just being honest about this, he's chosen to finesse this fact by claiming that the validity of the standard methods of quantization - you know, the ones experiment has shown over and over are correct - is a controversial issue when it's really not.
Jeff, I am not finessing. Notice this thread over on S.P.R. where several people are discussing quantization and what it requires. Some say a central charge is required, one guy plumps for an energy tensor that is annihilated by some nonzero vector, and so on. Urs' discussions at the Ulm meeting, which seem to have come to some conclusions, show that the concept is not completely well-defined in physicists' minds.
And I do not appreciate your spiteful way of imputing motives to me.
Note my original post
Several people discussing an issue in a public forum doesn't equal controversy. There's no controversy about whether the way LQG imposes constraints is valid: it isn't.
I've made the point in the past that the fact that LQG is popular only outside the physics community is telling. Your reaction to this was to say that "head counts" are not a good way to judge this issue, as if we were talking about sociology or something. Yet all it takes for you to conclude that an issue is controversial is to happen upon a brief exchange on the subject among a few people you don't know and whose comments you at best only partially understand. I'm not spiteful. You're hypocritical.
Name calling is not allowed here.
I'm not interested in the least in the content of other threads, or in your personal issues with SelfAdjoint and with LQG. You are entitled to your opinion, but you should be able to express it without resorting to such language.
This is not true. There are many groups working on this. They are serious physicists that do understand what they are doing. You may not agree with their methods or interpretations, but calling their effort "outside the physics community" is a gross mischaracterization.
I'm sorry, but I didn't deserve this admonition, since if you check you'll see that I'd already changed "hypocrite" to "hypocritical", to be in line with SelfAdjoint's saying that I was "spiteful", which isn't true.
Again, I'm sorry, but no there aren't.
Once again, I'm sorry, but I didn't say that lqg isn't being pursued within the physics community but rather only that it's unpopular within the physics community, which is a fact.
Some places in which there is research on LQG:
Penn State University
Max-Planck-Institut für Gravitationsphysik (within the Albert Einstein Institute)
Institute for Nuclear Sciences (Mexico)
University of Michigan
Perimeter Institute for Theoretical Physics (Canada)
Centre for Theoretical Physics (Marseille, France)
University of Rome "La Sapienza"
Max-Planck-Institut (Leipzig, Germany)
Universidad de la Republica (Uruguay)
University of Nottingham
Universidad de Oviedo (Spain)
University of Cambridge
Imperial College, London.
Strings are definitely more popular, I agree, but your assertion that LQG "only leads to wrong theories that have nothing to do with the physical universe" is greatly unjustified, and it can mislead people into thinking that it is not an active area of research anymore. It is.
You are entitled to your opinion, of course, but don't misrepresent it as an agreed-upon fact.
I repeat my assertion that quantization, particularly the question of what constitutes a physically meaningful quantization, is controversial, or at the very least unresolved. Different physicists give different answers.
There is first quantization, and then second quantization of quantum field theory. Is there a third quantization? Why or why not?
Because they haven't found a need for it. If and when they ever do, they'll introduce and define it then. Some physicists think the "first quantization/second quantization" terminology is a misnomer. But pretty generally, first quantization does the particle, or string, and second quantization does the field.
quantization is the process of going from a classical theory to a quantum theory. both canonical quantization and path integral quantization have worked, and yield the same results, and are in excellent agreement with a wide variety of experiments.
some of the best tested theories of all time are quantum theories. to say that the procedure of quantization is "controversial" just because some very speculative theories of gravity that are completely removed from experiment diverge from the agreed-upon methods is not very fair. canonical quantization has been undergraduate physics since the 1920s, and is anything but controversial.
you can quantize as many times as you want. see Baez for the details.
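a small numerical check of the claim that the two quantization routes agree, using the harmonic oscillator as a toy model (a sketch, in units with hbar = m = omega = 1): the euclidean path integral, evaluated via its transfer matrix on a time lattice, reproduces the ground-state energy 1/2 that canonical quantization gives.

```python
import numpy as np

# lattice path integral for the harmonic oscillator: the euclidean transfer
# kernel K(x, x') propagates the system one imaginary-time step a
a = 0.1                                  # time step (small, so Trotter error is tiny)
x = np.linspace(-5, 5, 401)              # spatial grid
dx = x[1] - x[0]
X, Xp = np.meshgrid(x, x)
K = np.exp(-(X - Xp)**2 / (2 * a) - a * (X**2 + Xp**2) / 4) / np.sqrt(2 * np.pi * a)

# the largest eigenvalue of the transfer operator is exp(-a * E0)
lam = np.linalg.eigvalsh(K * dx).max()
E0 = -np.log(lam) / a
print(E0)   # close to 0.5, the canonical-quantization answer hbar*omega/2
```

the grid range, step, and tolerance here are my own choices for the sketch; the point is only that summing over paths and diagonalizing operators land on the same physics.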