
(Simple) Derivation of Yang-Mills Equations

  1. Jul 8, 2013 #1
Hello all, my teacher assigned a problem related to the Yang-Mills equations in my general relativity class, and I just wanted to ask a couple of questions about it. I believe it is a simplified version of the Yang-Mills theory you encounter in particle physics.

Basically, assuming that [itex] F_{\mu\nu} → F_{\mu\nu} + A_\mu A_\nu - A_\nu A_\mu [/itex] (for some non-commuting potentials), we were instructed to derive the field equations from the action [itex] S = \frac{1}{4} \int tr ( F_{\mu\nu}F^{\mu \nu}) d^4x [/itex]. First, I wondered if we could just ignore the fact that we are taking a trace; I am really not sure what difference this makes to the equations. He presented the problem with basically only the details I described above, so perhaps the trace only matters when dealing with a more complex problem; I'm not entirely sure. Anyway, I decided to try varying with respect to [itex] \delta A_\nu[/itex], and the math went like:

[itex] \delta S = \frac{1}{4} \int tr ( F_{\mu\nu} \delta F^{\mu \nu} + \delta F^{\mu\nu} F_{\mu \nu} ) d^4x [/itex]

    [itex] \delta S = \frac{1}{2} \int tr ((\delta F_{\mu\nu} F^{\mu \nu} )) d^4x [/itex]

    [itex] \delta S = \frac{1}{2} \int tr (F^{\mu\nu} (\partial_\mu \delta A_\nu - \partial_\nu \delta A_\mu + \delta(A_\mu A_\nu - A_\nu A_\mu))) d^4x [/itex]

    [itex] \delta S = \int tr (F^{\mu\nu} (\partial_\mu \delta A_\nu + (\delta A_\mu A_\nu + A_\mu \delta A_\nu))) d^4x [/itex]
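    (To spell out the index swapping: by the antisymmetry of [itex] F^{\mu\nu} [/itex], relabeling the dummy indices [itex] \mu \leftrightarrow \nu [/itex] in the second derivative term gives

    [itex] -tr(F^{\mu\nu} \partial_\nu \delta A_\mu) = -tr(F^{\nu\mu} \partial_\mu \delta A_\nu) = tr(F^{\mu\nu} \partial_\mu \delta A_\nu) [/itex]

    so the two derivative terms combine into [itex] 2\, tr(F^{\mu\nu} \partial_\mu \delta A_\nu) [/itex], and the four commutator terms pair up in the same way, cancelling the overall factor of 1/2.)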

    Now, using integration by parts and boundary conditions on the first term, together with the same sort of index swapping as above, I made it about as far as I could on my own and arrived at

    [itex] \delta S = \int tr ((-\partial_\mu F^{\mu\nu} \delta A_\nu +F^{\mu\nu} (-\delta A_\nu A_\mu + A_\mu \delta A_\nu))) d^4x [/itex]

    So then I supposed that I could multiply by some sort of inverse matrix, or something of the sort. I also assumed that we could ignore the trace (which, again, I'm not sure is correct; I have never varied the trace of a matrix before). My final equation came out to be:

    [itex] \partial_\mu F^{\mu\nu} + (\delta A_\nu A_\mu (\delta A_\nu)^{-1} - A_\mu) F^{\mu\nu} = 0 [/itex]

    However, something has definitely gone wrong; I know I need to get rid of that pesky [itex] \delta A_\nu [/itex]. I also tried using the Euler-Lagrange equations, but I had even less luck there. I would just like to bounce some ideas off you guys and perhaps get pointed toward some good literature.
  3. Jul 8, 2013 #2
    In your second-to-last equation, why don't you split up the trace using tr(X+Y+Z) = tr(X) + tr(Y) + tr(Z) and then use the cyclic property of the trace on the Y term to get the ##\delta A## in the same spot as it is in the X and Z terms? Then you can combine the three terms back together again and get a nice expression for ##\delta S/\delta A##, as desired.
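    If it helps, here is a quick numerical sanity check of that cyclic property (the matrices below are just random stand-ins for ##F##, ##\delta A##, and ##A##, not actual fields):

    ```python
    # Spot-check the cyclic property of the trace: tr(F dA A) = tr(A F dA).
    # F, dA, A are arbitrary random matrices standing in for the fields.
    import numpy as np

    rng = np.random.default_rng(0)
    F, dA, A = (rng.standard_normal((3, 3)) for _ in range(3))

    lhs = np.trace(F @ dA @ A)
    rhs = np.trace(A @ F @ dA)
    print(np.isclose(lhs, rhs))  # True
    ```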

    Regarding the trace: I bet you can convince yourself that your method for taking the variation of a trace is correct if you try a few concrete examples. Pick a specific matrix ##M## and a variation ##\delta M##. Then compute the variation ##\delta ({\rm tr} (M)) = {\rm tr}(M + \delta M) - {\rm tr} (M)## and compare to your assumption that ##\delta ({\rm tr} M) = {\rm tr} (\delta M)##.

    In the end you get an equation of the form ##{\rm tr} (M \delta A) = 0## which must hold for any value of the matrix ##\delta A##. See if you can think up examples of matrices ##\delta A## that force particular entries in the matrix M to be zero. If you can do so for every entry in ##M##, then the equation reduces to ##M = 0##, as desired.
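    Concretely, the matrices that do the job are the elementary matrices: taking ##\delta A## to have a single nonzero entry picks out one entry of ##M##. A quick numpy illustration (with a random matrix standing in for ##M##):

    ```python
    # If tr(M dA) = 0 for every dA, then M = 0: choosing dA = E_ij
    # (1 in entry (i, j), zero elsewhere) gives tr(M E_ij) = M[j, i],
    # so each entry of M is forced to vanish in turn.
    import numpy as np

    rng = np.random.default_rng(1)
    M = rng.standard_normal((3, 3))  # random stand-in for M

    for i in range(3):
        for j in range(3):
            E = np.zeros((3, 3))
            E[i, j] = 1.0
            assert np.isclose(np.trace(M @ E), M[j, i])
    ```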
  4. Jul 8, 2013 #3
    Ahh, I have never seen this property before. Using that I arrived at

    [itex] \partial_\mu F^{\mu\nu} = F^{\mu\nu} A_\mu - A_\mu F^{\mu\nu} [/itex]

    Is this a reasonable answer for such a problem? Sorry, but I have no intuition for Yang-Mills, and it is a very dense subject.

    I just saw your edit about the trace; I will play around with it and see if I can't convince myself. I understand that the trace is linear, and hence I felt this was an okay way to approach it, but it felt odd that in the end I completely disregard the fact that I am taking a trace in order to reach this conclusion. Really, the line above should be enclosed in a trace operator, but I feel as if that makes no difference.
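    For the record, writing out the cyclic step explicitly: using [itex] tr(F^{\mu\nu} \delta A_\nu A_\mu) = tr(A_\mu F^{\mu\nu} \delta A_\nu) [/itex], the variation becomes

    [itex] \delta S = \int tr((-\partial_\mu F^{\mu\nu} - A_\mu F^{\mu\nu} + F^{\mu\nu} A_\mu)\, \delta A_\nu)\, d^4x [/itex]

    and demanding that this vanish for arbitrary [itex] \delta A_\nu [/itex] gives [itex] \partial_\mu F^{\mu\nu} + A_\mu F^{\mu\nu} - F^{\mu\nu} A_\mu = 0 [/itex].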
  5. Jul 8, 2013 #4

    king vitamin


    Of course, saying that the traces of two matrices are equal is a much weaker statement than saying that the two matrices themselves are equal, so it's very important that the equation you end up with is not just a relation between traces.

    The key is that the variation of A is totally arbitrary: you want the action to be an extremum for any possible variation. So you just need to convince yourself that δA can be chosen such that tr(MδA) = 0 forces an arbitrarily chosen component M_{ij} to be zero. Then, for each component of M, you can find a variation of A forcing that component to vanish.
  6. Jul 10, 2013 #5
  7. Jul 10, 2013 #6
    Looks about right. Often this would be written as

    ##D_\mu F^{\mu \nu} = 0##

    where the covariant derivative ##D## acts on a general matrix-valued field ##G## as

    ##D_\mu G = \partial_\mu G + [A_\mu, G]##

    (The brackets are a matrix commutator.)

    The covariant derivative in Yang-Mills theory is supposed to closely parallel the covariant derivative in GR. And ##A## and ##F## in Yang-Mills theory are respectively the analogs of the Christoffel symbol ##\Gamma## and the Riemann curvature tensor ##R_{\mu\nu\rho\lambda}## in GR.
  8. Jul 11, 2013 #7
    Alright, thank you guys. This does seem to closely parallel the material we're about to get into in GR, so I am pretty glad to have worked through the problem. Also, thank you for linking to that more geometrical derivation; it was a bit advanced for my taste, but it was interesting to see nonetheless.