Quantum Field Theory

Tom Mattson

Staff Emeritus
Gold Member
Hello everyone!

This is the rebirth of my thread in PF v2.0 entitled "Do you know QM and SR?" Since I started that thread, a second edition of the book (Warren Siegel's Fields) has been released. The URL is:

http://xxx.lanl.gov/pdf/hep-th/9912205

I'll post some of the more useful comments from the old thread shortly.


Tom Mattson

OK, first off my notes on the first subsection of Chapter 1 are on the web, here:

Second, once it came to light that group theory is a central mathematical theme in QFT, the following links were provided:

http://members.tripod.com/~dogschool
http://www.wspc.com/books/physics/0097.html [Broken]
http://www.wspc.com/books/physics/1279.html [Broken]

Third, the subject of the physical meaning of the commutator came up. Here was my attempt to explain it:

In classical mechanics, a quantity A that does not depend explicitly on time and whose Poisson bracket with the Hamiltonian vanishes, {A,H} = 0, is conserved; and for every conserved quantity there is an associated symmetry. The result carries over to quantum mechanics if one takes the commutator [A,H] = AH - HA instead of the Poisson bracket: [A,H] = 0 implies that A is conserved (again, provided that A does not depend explicitly on time).

Commutators are also involved in the uncertainty principle. When [A,B] = 0, A and B are simultaneously diagonalizable. That means there are simultaneous eigenfunctions of both operators, so both observables can be measured simultaneously. If [A,B] is nonzero, then there is an uncertainty relation between A and B. An example is [x,p] = i(hbar).
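For readers who want to see this concretely, here is a small numerical sketch (not from the original thread; the matrices are the standard Pauli matrices):

```python
import numpy as np

# Pauli matrices (standard definitions)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# sx and sy do not commute: [sx, sy] = 2i*sz, so there is an
# uncertainty relation between the observables they represent.
comm_xy = sx @ sy - sy @ sx
print(np.allclose(comm_xy, 2j * sz))   # True

# sz commutes with a Hamiltonian built from sz and the identity,
# so the observable sz is conserved under that Hamiltonian.
H = 2.0 * sz + np.eye(2)
comm_zH = sz @ H - H @ sz
print(np.allclose(comm_zH, 0))         # True
```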

Fourth, the subject of Hermiticity of operators came up. Again, here is what I had to say on the subject:

The dagger signifies the Hermitian conjugate. For a matrix A, A(+) is the complex conjugate of the transpose. An operator is Hermitian if A = A(+). The Hermitian conjugate is important for (at least) two reasons:
1. A = A(+) --> A has real eigenvalues, so all physical observables correspond to Hermitian operators.
2. An operator is unitary if A(+) = A^(-1). (Someone correct me if this is wrong): All operators that form a group with Hermitian generators are unitary. Unitarity is necessary to guarantee probability conservation.
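As a quick numerical illustration of point 1, here is a sketch I am adding (the random matrix is arbitrary, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = M + M.conj().T            # A = A(+) by construction, i.e. Hermitian

eigenvalues = np.linalg.eigvals(A)
print(np.allclose(eigenvalues.imag, 0, atol=1e-10))  # True: real spectrum
```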

A(+) is "A-dagger". It seems that superscripts are not done in the same way here as they were in the last version of PF.

The following link on brackets was supplied:

I'll post some more of the relevant discussion when I get a chance.


Hurkyl

Staff Emeritus
Gold Member
All operators that form a group with Hermitian generators are unitary. Unitarity is necessary to guarantee probability conservation.
Hrm, did you mean to say "unitary generators" instead of "hermitian generators"?

BTW that last link is a nice one! The section on generators makes more sense now; the way it was presented by Siegel seemed fairly arbitrary: it worked, but there was no motivation for it.

Hurkyl


Tom Mattson

Originally posted by Hurkyl
Hrm, did you mean to say "unitary generators" instead of "hermitian generators"?
Nope.

Take the time evolution operator, for instance:

U(t,t0)=exp[-iH(t-t0)/hbar]

U is unitary, and it is generated by H, a Hermitian operator.
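Tom's point can be checked numerically. Below is a sketch (my own, with an arbitrary 2x2 Hermitian H and hbar = 1) that builds U = exp(-iHt) by diagonalizing H and verifies that the result is unitary:

```python
import numpy as np

H = np.array([[1.0, 0.5],
              [0.5, 2.0]])            # Hermitian (real symmetric here)
t = 0.7                               # arbitrary time, hbar = 1

# exp(-iHt) via the spectral decomposition H = V diag(E) V^dagger
E, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * E * t)) @ V.conj().T

print(np.allclose(U @ U.conj().T, np.eye(2)))   # True: U is unitary
```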

arivero

Gold Member
Peskin

Aside, let me note that at superstringtheory.com people are doing (for free) a reading of Peskin & Schroeder. This week they are starting chapter nine.

Hurkyl

Oh right, Lie generator. (bonks self) OK, I withdraw my objection.

Hurkyl

Tom Mattson

OK, back to my recap:

Fifth, the subject of symmetry in the context of indices came up. My remarks:

He's talking about symmetry/antisymmetry under an exchange of indices.
Take S_ij = x_i y_j + x_j y_i.
That combination is symmetric under the exchange i <--> j, because S_ij = S_ji.
Now take A_ij = x_i y_j - x_j y_i.
That combination is antisymmetric, because A_ij = -A_ji.
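A quick way to see this numerically (my sketch, with arbitrary vectors x and y):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

S = np.outer(x, y) + np.outer(y, x)   # S_ij = x_i y_j + x_j y_i
A = np.outer(x, y) - np.outer(y, x)   # A_ij = x_i y_j - x_j y_i

print(np.allclose(S, S.T))    # True: symmetric under i <--> j
print(np.allclose(A, -A.T))   # True: antisymmetric under i <--> j
```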

Sixth, there was the ever-so-troublesome issue of correspondence. Specifically, what the heck are those terms of order (hbar)^2 in the correspondence between the commutator and the Poisson bracket?

A guy (lethe) at another forum (sciforums) helped me out here:

Tom, you have written:

in my QM courses, I learned that the correspondence principle was simply (1/(i hbar))[A,B]_QM --> {a,b}_classical
the correspondence principle that you learned in QM does not always work like this.

a reminder:
the correspondence principle says: turn your classical Poisson bracket into your quantum commutator, and then write your classical function with the classical canonical variables replaced by the corresponding quantum operators.

for example, if the classical function is qp, the quantum operator cannot be QP, but rather must be (1/2)(QP + PQ). for functions of degree 3 or higher it gets more complicated. you have to also include some quantum anomaly terms when taking the Poisson bracket to the commutator.

for example, one can show that the correspondence principle takes {p q^2, p^2 q} = 3 p^2 q^2 --> (1/(3 i hbar))[Q^3, P^3] + O(hbar^2)

that term on the end there means that we have to add a quantum anomaly term when we quantize the system, in order to be consistent with the assumptions of the correspondence principle. but notice that the anomaly term is of order hbar^2.
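The classical side of lethe's example is easy to verify mechanically. Here is a minimal sketch (mine, not lethe's) that represents polynomials in q and p as dictionaries mapping exponent pairs (i, j), for q^i p^j, to coefficients, and computes {f,g} = (df/dq)(dg/dp) - (df/dp)(dg/dq):

```python
def mul(f, g):
    """Product of two polynomials in (q, p)."""
    out = {}
    for (i1, j1), c1 in f.items():
        for (i2, j2), c2 in g.items():
            key = (i1 + i2, j1 + j2)
            out[key] = out.get(key, 0) + c1 * c2
    return out

def dq(f):  # partial derivative with respect to q
    return {(i - 1, j): c * i for (i, j), c in f.items() if i}

def dp(f):  # partial derivative with respect to p
    return {(i, j - 1): c * j for (i, j), c in f.items() if j}

def poisson(f, g):
    """{f, g} = df/dq dg/dp - df/dp dg/dq, dropping zero terms."""
    a, b = mul(dq(f), dp(g)), mul(dp(f), dq(g))
    out = dict(a)
    for k, c in b.items():
        out[k] = out.get(k, 0) - c
    return {k: c for k, c in out.items() if c}

f = {(2, 1): 1}   # p q^2, i.e. q^2 p^1
g = {(1, 2): 1}   # p^2 q, i.e. q^1 p^2
print(poisson(f, g))   # {(2, 2): 3}, i.e. 3 q^2 p^2
```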

you can also get quantum anomaly terms in your hamiltonian when you, for example, try to quantize a non-cartesian system. here again, the anomaly term is of order hbar^2.

so in the classical limit, hbar --> 0, and all variables commute.

in the semiclassical limit, divide by hbar, then take hbar --> 0: the anomaly terms disappear, the commutators simply become the Poisson brackets, and the hamiltonian becomes the classical hamiltonian.

i believe that there is no self-consistent axiomatic way to specify the rules for quantization.

tom, i haven't looked at any of those documents you listed as references. do they treat this issue?

is it clear what's going on here? i'm getting this mostly from ticciati. also shankar has a bit about this.

i think this is what's going on with siegel, but i ain't positive, so i would appreciate any feedback.

let me explain that in a little more detail:

let's say that i have a classical system that i want to quantize. this means that i want to write down a mapping z that, at the very least, satisfies the following conditions:

z(p) = P
z(q) = Q

z({f(p,q), g(p,q)}) = (1/(i hbar)) [z(f(p,q)), z(g(p,q))]

where f and g are any functions.

naively, you might want the mapping to be a little stronger. you might want it to take

z({f(p,q), g(p,q)}) = (1/(i hbar)) [f(P,Q), g(P,Q)]

but we will see that even the weaker condition above is not possible. using just these three assumptions, you can discover, by considering all the commutators, that z(p^2) = P^2 + k for some constant number k, and similarly for q. then taking the Poisson bracket of those two functions immediately shows that

z(pq) = (1/2)(PQ + QP)

so my second stronger condition above is violated. if i wanted to preserve the functional form of the observable, i would have to require z(pq) = PQ.

this fact is not so distressing, however: because the classical variables commute, there is some ordering ambiguity. and furthermore, this symmetric sum is required to make the quantum observable hermitian.

however, the problem gets worse when you go to third degree functions.

if you calculate the Poisson bracket for those third-degree terms i mentioned above, then plug the mapping into that Poisson bracket and compare it with the commutator, you will see that the mapping no longer preserves the bracket-to-commutator isomorphism, except in the hbar --> 0 limit.

according to ticciati, though, this is only of academic interest, since in QFT we never have to quantize systems with cross terms like that.

Pellman and Alis also had some helpful comments, but unfortunately I did not copy them in time.

Tom Mattson

On to Fermions...

This section is written in anticipation of later studying Supersymmetry (SUSY). Looking at the superstringtheory.com site, I see that the two main mathematical topics unique to SUSY are *drumroll*

Anticommuting variables and Graded Lie Algebras

So, it's not surprising that we find both of those in Chapter 1 of a book written by a string theorist.

Hurkyl's thoughts:

So, I imagine then that the vector space involved in the problems is either a Grassmann algebra, or is implicitly extended to one, which leaves the |x> notation for a vector kinda moot because the vector is identified with the operator x, but I suppose kets are still useful for clarity's sake.
I imagine then that |0> is not the zero vector, but is instead the multiplicative identity for the algebra? That would explain how the one homework problem could possibly work:
|x> = e^(xa+)|0>

I also imagine then that each "anticommuting variable" is simply a generator of the algebra? So then any vector v in the algebra of a single anticommuting variable x can be written:
v = a 1 + b x = a|0> + b|x>
?
And then for two anticommuting variables x & y:
v = a 1 + b x + c y + d xy = a|0> + b|x> + c|y> + d|xy>
?

And Rutwig's invaluable advices:

The notation of Grassmann numbers comes from the old days, when the notation of c- and q-numbers for complex numbers and operators in a Hilbert space was common. These numbers are in fact elements of the exterior algebra, and the anticommutation rule is nothing but the statement of the elementary property of the wedge product of linear forms.

A little addendum. All these commutation/anticommutation relations indeed show that we will have to enlarge the concept of the Poisson bracket or Lie bracket to a more general structure that covers them. If we write the anticommutation as the ordinary bracket, the operators will be symmetric, and therefore do not belong to ordinary structures. However, since bosons behave like ordinary algebras and the boson-fermion action gives a fermion, the fermions will have a module structure with respect to the bosons. It follows that these are all the ingredients needed to define a supersymmetry algebra.

since Tom made me curious with the question, I have had a look at the book and seen that brackets are formally introduced after the question of commutation/anticommutation of bosons and fermions. This is bad, as I see it, since in the end everything reduces to these or similar brackets, and the isolated presentation leads to confusion.

And Heumpje was quite helpful here, too:

For instance:
∫dc 1 = 0
∫dc c = 1
or
(d/dc)(c*c) = -c*
while
(d/dc)(cc*) = c*
where c is a Grassmann number and c* its complex conjugate.
When calculating a path integral you have the problem that, because of the introduced minus signs in the exponent, your integral becomes indefinite. To prevent this, Grassmann numbers are introduced. It doesn't solve anything in the end, since you're now stuck with a set of numbers that are nice from a mathematical viewpoint but hard to understand.
There is a nice introduction to the use of Grassmann numbers in:
"Quantum Many-Particle Systems" by Negele & Orland, Frontiers in Physics series from Addison-Wesley, 1988
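Heumpje's rules can be encoded in a few lines. The sketch below (mine, not from the thread) represents an element a0 + a1*c + a2*c* + a3*(c c*) of the two-generator Grassmann algebra as a coefficient list [a0, a1, a2, a3] in the basis (1, c, c*, c c*), and implements the left derivatives; the Berezin integral ∫dc is the same operation as d/dc:

```python
def d_c(e):
    """Left derivative d/dc in the basis (1, c, c*, c c*):
    d/dc 1 = 0, d/dc c = 1, d/dc c* = 0, d/dc (c c*) = c*."""
    a0, a1, a2, a3 = e
    return [a1, 0, a3, 0]

def d_cstar(e):
    """Left derivative d/dc*: to act on c c*, first anticommute c* past c
    (c c* = -c* c), picking up a minus sign: d/dc* (c c*) = -c."""
    a0, a1, a2, a3 = e
    return [a2, -a3, 0, 0]

one  = [1, 0, 0, 0]
c    = [0, 1, 0, 0]
cs_c = [0, 0, 0, -1]    # c* c = -c c*
c_cs = [0, 0, 0, 1]     # c c*

print(d_c(one))    # [0, 0, 0, 0]   i.e. the Berezin integral of 1 is 0
print(d_c(c))      # [1, 0, 0, 0]   i.e. the Berezin integral of c is 1
print(d_c(cs_c))   # [0, 0, -1, 0]  i.e. (d/dc)(c* c) = -c*
print(d_c(c_cs))   # [0, 0, 1, 0]   i.e. (d/dc)(c c*) = c*
```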

That's all I want to bring over from PF v2.0 right now. I have a little bit on Lie groups (the 3rd subsection), but I want to try to pick it up here with Fermions next time, if for no other reason than to figure out what I am doing wrong on those exercises.

lethe

did you guys figure out what the half bracket half brace: [A,B} notation means? that was kind of confusing to me...

rutwig

Originally posted by lethe
did you guys figure out what the half bracket half brace: [A,B} notation means? that was kind of confusing to me...
This notation is meant to point out that you have two types of commutators: the ordinary Lie bracket [,] and the symmetric bracket {,} corresponding to the fermion action. But this is the worst possible notation and leads to confusion. It can all be expressed using only the bracket [,] and requiring the Jacobi superidentity.

regards,

pellman

Anybody get exercise IA2.2 on pg 45? Here's how I see it. (I'm going to use P instead of psi).

The delta function is P' - P, the most general function of P is a + bP, so the integrand is

(P' - P)(a - bP') = aP' - aP -bP'^2 + bPP' (P'^2 term vanishes)
= (a + bP)P' - aP

The anti-derivative of this is (a + bP)P'^2/2 - aPP' = aPP', but what are the limits of integration for P'? All "P-space?" What space is that anyway? And does it even have the necessary characteristics (e.g., the correct topology) to allow integration to be done with anti-derivatives?

I don't see how we can get a + bP out of this.

- Todd

Tom Mattson

Originally posted by pellman
Anybody get exercise IA2.2 on pg 45? Here's how I see it. (I'm going to use P instead of psi).

I, too, am stumped here. Oh, Ruuuutwiiiig!

The delta function is P '- P, the most general function of P is a + bP, so the integrand is
Whoa, are you saying that the delta function is the simple difference of P and P'? I thought it was a distribution defined similarly to the delta function for "normal" (commuting) variables?

The anti-derivative of this is (a + bP)P'^2/2 - aPP' = aPP', but what are the limits of integration for P'?
Siegel explicitly states that the integration is indefinite.

I found a paper at the LANL arXiv that gives a rundown of this Grassmann calculus. I will dig up the link and post it here, as well as take another crack at this one.

rutwig

Originally posted by Tom
I, too, am stumped here. Oh, Ruuuutwiiiig!
Ok, here is another assertion of Siegel's which is not very clear. The intention here is to generalize the known analysis to "superanalytic" functions, that is, functions having commuting and anticommuting variables. Let v be an anticommuting variable and let it be given an (infinitesimal) displacement by an (infinitesimal) odd supernumber dv. If f: F --> G (F odd supernumbers, G a Grassmann algebra) is a function, then f(v) must also have a displacement, with:

df(v) = dv[(d/dv)f(v)] = [f(v)(d/dv)]dv (**)

(d/dv) acting from the left is called the left derivative operator, and acting from the right, the right derivative operator.

Now, f(v) = av + b is the most general solution to (**). In fact, expand v as a series in the Grassmann algebra, and similarly f(v); regard the coefficients of the latter as functions of the coefficients of v and vary them infinitesimally, so that the expression for dv is obtained. Then look at the conditions which the coefficients of f(v) must satisfy in order to agree with (**).
Since f is linear, it suffices to give sense to ∫dv and ∫v dv. Now, if the equation (known for distributions)

∫ [(d/dx)f(x)] dx = 0

shall also hold for anticommuting beasts, then

∫dv = 0 and ∫v dv = y (y is a supernumber). Thus

∫ f(v+a) dv = ∫ f(v) dv and

∫ f(v)[(d/dv)g(v)] dv = ∫ [f(v)(d/dv)] g(v) dv, where the derivative acts on g as a left derivative on the left-hand side and on f as a right derivative on the right-hand side, as in (**).

pellman

Originally posted by Tom

Whoa, are you saying that the delta function is the simple difference of P and P'? I thought it was a distribution defined similarly to the delta function for "normal" (commuting) variables?
That's the problem. We're trying to show that δ(ψ) = ψ. (I've got this math-symbol thing figured out now.)
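This claim can be checked mechanically. Below is a small general-purpose Grassmann algebra in Python (my sketch, not from the thread): elements are dictionaries mapping sorted tuples of generator names to coefficients, products pick up a sign from sorting the generators, and the left Berezin integral over a generator anticommutes it to the front and strips it. With the convention ∫dψ' ψ' = 1, the element ψ' - ψ really does act as δ(ψ' - ψ):

```python
class Grassmann:
    """Element of a Grassmann algebra: {sorted generator tuple: coefficient}."""

    def __init__(self, terms=None):
        self.terms = {k: v for k, v in (terms or {}).items() if v != 0}

    @staticmethod
    def gen(name):
        return Grassmann({(name,): 1})

    @staticmethod
    def scalar(c):
        return Grassmann({(): c})

    def __add__(self, other):
        out = dict(self.terms)
        for k, v in other.terms.items():
            out[k] = out.get(k, 0) + v
        return Grassmann(out)

    def __sub__(self, other):
        return self + other * (-1)

    def __mul__(self, other):
        if not isinstance(other, Grassmann):   # scalar multiplication
            return Grassmann({k: v * other for k, v in self.terms.items()})
        out = {}
        for k1, c1 in self.terms.items():
            for k2, c2 in other.terms.items():
                if set(k1) & set(k2):
                    continue                   # a repeated generator squares to 0
                gens, sign = list(k1 + k2), 1
                for i in range(len(gens)):     # bubble sort; each neighbor
                    for j in range(len(gens) - 1 - i):   # swap flips the sign
                        if gens[j] > gens[j + 1]:
                            gens[j], gens[j + 1] = gens[j + 1], gens[j]
                            sign = -sign
                key = tuple(gens)
                out[key] = out.get(key, 0) + sign * c1 * c2
        return Grassmann(out)

    def berezin(self, name):
        """Left Berezin integral over `name`: terms without it integrate to
        zero; otherwise anticommute `name` to the front and drop it."""
        out = {}
        for k, c in self.terms.items():
            if name in k:
                i = k.index(name)
                rest = k[:i] + k[i + 1:]
                out[rest] = out.get(rest, 0) + (-1) ** i * c
        return Grassmann(out)

psi, psip = Grassmann.gen('x'), Grassmann.gen('y')   # psi, psi'
a, b = 2, 3
f = Grassmann.scalar(a) + psip * b                   # f(psi') = a + b psi'
delta = psip - psi                                   # candidate delta(psi' - psi)
result = (delta * f).berezin('y')                    # integrate over psi'
print(result.terms)   # {(): 2, ('x',): 3}, i.e. a + b psi = f(psi)
```

So, at least in this toy implementation and with this sign convention, ∫dψ' (ψ' - ψ) f(ψ') = f(ψ), which is the defining property of the delta function.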

Originally posted by rutwig
Now, if the equation (known for distributions)

∫ [(d/dx)f(x)] dx = 0

shall also hold for anticommuting beasts,
Why should it hold at all? For a plain old commuting scalar variable, for instance,

∫ [(d/dx)f(x)] dx = f(b) - f(a)

if b and a are the endpoints. In any case, it would only be special cases that vanish, right?

pellman

Easier(?) question

I'm really more interested in Ex IA2.3 on page 47. (Note: there is a correction to this exercise on the errata page.)

Can someone please show me how

{a,a*} = {ζ, -∂/∂ζ}
= -ζ ∂/∂ζ - (∂/∂ζ)ζ
= 1 ?

If ζ were a usual commuting variable, then [ζ, -∂/∂ζ] = -ζ ∂/∂ζ + (∂/∂ζ)ζ = 1 is easy to show. How does the anticommuting nature of ζ change the behavior of the derivative here?

(Edited for more)

1.
{a,a*}ψ(ζ) = {ζ, -∂/∂ζ}ψ(ζ)
= -ζ(∂/∂ζ)ψ(ζ) - (∂/∂ζ)[ζψ(ζ)]
= -ζ(∂/∂ζ)ψ(ζ) + ζ(∂/∂ζ)ψ(ζ) - 1·ψ(ζ)
(the plus sign arises because I guess ζ and the derivative anticommute, although I don't really see it.)
= -ψ(ζ)
=> {a,a*} = -1

2. Another approach
Let ψ(ζ) = A + Bζ. Then...

{a,a*}ψ(ζ) = -ζ(∂/∂ζ)(A + Bζ) - (∂/∂ζ)[ζ(A + Bζ)]
= -Bζ - (∂/∂ζ)(Aζ)   (ζ² = 0)
= -Bζ - A
= -ψ(ζ)
=> {a,a*} = -1

(even more)

Okay. The reason 1 and 2 give {a,a*} = -1 is that I am working in the <ζ|ψ> representation and the operators are acting on the bra instead of the ket, and the negative derivative becomes a positive derivative in that case. That is,

<ζ|a = (a*|ζ>)* = (-∂/∂ζ |ζ>)*
= +∂/∂ζ <ζ|

and

<ζ|a* = <ζ|ζ

Why we get the plus sign is still a mystery to me.
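Independently of the Grassmann-calculus sign subtleties above, the canonical anticommutator itself can be checked in the two-component Fock representation (a sketch I am adding; basis |0>, |1> with a|1> = |0>):

```python
import numpy as np

# Fermionic annihilation operator in the basis (|0>, |1>): a|1> = |0>, a|0> = 0
a = np.array([[0, 1],
              [0, 0]], dtype=complex)
adag = a.conj().T                      # creation operator a+

anticommutator = a @ adag + adag @ a   # {a, a+}
print(np.allclose(anticommutator, np.eye(2)))   # True: {a, a+} = 1
print(np.allclose(a @ a, 0))                    # True: a^2 = 0
```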


rutwig

Re: Easier(?) question

Originally posted by pellman
I'm really more interested in Ex IA2.3 on page 47. (Note: there is a correction to this exercise on the errata page.)

Again there is a mistake of signs in the book, due to an unconventional choice. The definition of the fermionic oscillator is fixed, and set for convenience (see the realization of the Heisenberg algebra). Now, if we set a|θ> = θ|θ>, then |θ> = exp(θa*)|0>, and we obtain a*|θ> = (∂/∂θ)|θ>.

Strong recommendation: take another book, Siegel has not been revised carefully. The text I used to recommend can be found at:

http://arturo.fi.infn.it/casalbuoni/lezioni99.pdf [Broken]


rutwig

Originally posted by pellman
Why should it hold at all? For a plain old commuting scalar variable, for instance,
for anticommuting variables the comparison with ordinary ones inevitably leads to confusion. Conventional prejudices must be given up, since here no measure theory plays a role. And it is required to hold.

pellman

Re: Re: Easier(?) question

Originally posted by rutwig

Strong recommendation: take another book, Siegel has not been revised carefully. The text I used to recommend can be found at:

http://arturo.fi.infn.it/casalbuoni/lezioni99.pdf [Broken]
Thanks, rutwig!


pellman

I have corresponded with Siegel about ExIA2.3 and he has added another comment to the errata page. However, his comment is brief and may still be confusing.

If any of you are interested, just let me know and I will type up the exercise as it should correctly appear and post it here.

- Todd

Tom Mattson

Originally posted by pellman
If any of you are interested, just let me know and I will type up the exercise as it should correctly appear and post it here.
Could ya?

Thanks.

Rutwig--I am printing out that other book. It looks much more conventional, which is good, but I want to ask you something. What would it take for us to be able to get through Siegel's book? I ask because he has a trilogy of books (QFT, SUSY, and String Theory) online, and I was hoping to get through them all.

rutwig

Originally posted by Tom
Rutwig--I am printing out that other book. It looks much more conventional, which is good, but I want to ask you something. What would it take for us to be able to get through Siegel's book? I ask because he has a trilogy of books (QFT, SUSY, and String Theory) online, and I was hoping to get through them all.
I fear that I do not fully understand what you mean. The problem I see with Siegel's books is that they have grown out of his lectures at Stony Brook, and the texts have been conceived in that sense; that is, I do not have the impression they were written for people not attending the lectures. This would explain the occasional imprecise points, or at least the places where it is not transparent what exactly is meant. They are excellent books, but my reading is that the topics should not be completely unknown to the reader.
On the question of SUSY, a knowledge of the representation theory of the Lorentz (and indeed Poincaré) group is recommended, specifically to see what the Dirac, Weyl, and Majorana spinors are (this follows at once from the point of view of Clifford algebras, which is the underlying formalism used to define the Dirac matrices and corresponds to the natural generalization of space reflections for the covering group of SO(3)).
Also some Yang-Mills formalism, etc. Physically, it is assumed that the reader has followed or is following regular physics lectures. Any graduate or undergraduate (I don't know well the equivalence of the European/American educational subdivisions) should not have problems with the physical content. In any case, since I have not yet seen the two other books, I cannot comment with full clarity. I will comment on this later.

pellman

Ex IA2.3(b) with corrections (part (a) is okay as is)

Define eigenstates of the annihilation operator ("coherent states") by

a|ζ> = ζ|ζ>

where ζ is anticommuting. Show that this implies

a+|ζ> = (-∂/∂ζ)|ζ>
|ζ> = exp(-ζa+)|0>
|ζ' + ζ> = exp(-ζ'a+)|ζ>
x^(a+a)|ζ> = |xζ>
<ζ|ζ'> = e^(-ζ*ζ')
∫ dζ*dζ e^(+ζ*ζ) |ζ><ζ| = const
(I haven't gotten this one yet and am not sure of the normalization. It's probably equal to π or 2π.)

Define wave functions in this space, ψ(ζ*) = <ζ|ψ>. Taylor expand them in ζ*, and compare this to the usual two-component representation using |0> and a+|0> as a basis.

Note:

You can't really show that the anticommutator and a|ζ> = ζ|ζ> alone necessarily imply a+|ζ> = (-∂/∂ζ)|ζ>. You also need the expression for <ζ|ζ'>. Or instead you can derive <ζ|ζ'> by assuming a+|ζ> = (-∂/∂ζ)|ζ>. See this thread for more: https://www.physicsforums.com/showthread.php?s=&threadid=791

Also, keep in mind that <ζ|a = (a+|ζ>)+ = (-∂/∂ζ*)<ζ|.

I haven't attempted part c yet.


pellman

Looks like this thread is temporarily dead. I'm pausing with Fields too so that I can learn some perturbation theory, which I have until now neglected. To anyone reading, I for one am definitely returning to this book -- in about a month probably -- so don't be put off by the lack of recent posts.

- Todd

damgo

I bailed on Siegel a while ago too, and switched back to Peskin & Schroeder (sometimes I feel like I change QFT books more often than I change clothes), at least until I can refresh myself on Hamilton-Jacobi theory and Poisson brackets, which I never really learned in the first place.

BTW, rutwig, can you recommend a good book/resource to learn about Clifford algebras?

Tom Mattson

Originally posted by pellman
To anyone reading, I for one am definitely returning to this book -- in about a month probably -- so don't be put off by the lack of recent posts.
Same here. The big thing holding me up is that damn section on Fermions. I have never seen these "anticommuting numbers" before (operators yes, but numbers no). I have the book by Berezin, The Method of Second Quantization, to which Siegel refers, but I think I would have to read a book on Functional Analysis before I could get through that one.

Since that kind of heavy, rigorous treatment is clearly not needed to solve the exercises in Siegel, I am trying just to learn the Grassmann calculus. Unfortunately, there is no single consolidated source for it that I can find. So, I am trying to put together a comprehensive tutorial on it from the following documents:

http://www.physics.rutgers.edu/~coleman/mbody/pdf [Broken]
Density Operators for Fermions
Quasiclassical and Statistical Properties of Fermion Systems
On Generalized Super-Coherent States
Fermions, Bosons, Anyons, Bolzmanions and Lie-Hopf Algebras

Maybe if my review is good enough, I'll send it to Am. J. Phys.

Who knows?

