# Conjugate Transpose for a vector

1. Sep 22, 2009

### Hepth

In particle physics, we commonly have the gamma matrices, whose conjugate transpose raises or lowers the index. Does the same rule apply to ANY indexed quantity? What about scalars/vectors like momentum?

Is the conjugate of momentum:

$$\left(q_\mu\right)^\dagger = q^\mu$$

The reason I ask is I am trying to compute:

$$\left(\sigma_{\mu\nu} q^\nu \right)^\dagger=$$

I get:
$$\left(\sigma_{\mu\nu}\right)^\dagger =-\sigma^{\nu \mu}$$
but, since the q is a "scalar" quantity when summed over, it doesn't change any signs under the adjoint, correct?

Last edited: Sep 22, 2009
2. Sep 23, 2009

### CompuChip

In general, it is not true that
$$\left(q_\mu\right)^\dagger = q^\mu$$
for arbitrary q.
Operators for which this is true are called Hermitian. In quantum mechanics, all observables are Hermitian (position, momentum and energy being some of them).
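To make this concrete, here is a minimal numpy sketch, assuming a truncated harmonic-oscillator basis with hbar = m = omega = 1 (my own illustration, not from the thread): the position and momentum matrices built from the ladder operator are Hermitian, as observables must be.

```python
import numpy as np

# Truncated harmonic-oscillator basis (assumed units: hbar = m = omega = 1)
n = 6
a = np.diag(np.sqrt(np.arange(1, n)), k=1)    # lowering operator: a|k> = sqrt(k)|k-1>
x = (a + a.conj().T) / np.sqrt(2)             # position operator x = (a + a^dagger)/sqrt(2)
p = 1j * (a.conj().T - a) / np.sqrt(2)        # momentum operator p = i(a^dagger - a)/sqrt(2)

# Observables are Hermitian: O^dagger = O
assert np.allclose(x, x.conj().T)
assert np.allclose(p, p.conj().T)
print("x and p are Hermitian in this truncated basis")
```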

3. Sep 23, 2009

### Hepth

Yeah, I think I figured that out as I finished writing the question, but I guess you can't delete posts in homework help after they've been edited? So I let it stay.

So this means while I'm calculating a decay amplitude of a process I NEED to keep track of the indices on all my q's.

BUT if I have an operator/Hamiltonian, say, what I have above: $$\sigma_{\mu\nu} q^\nu$$ this will end up being a vector (gammas) after contraction, correct? But the conjugate expression, $$q^\mu \sigma^{\mu \nu}$$, doesn't contract. I mean, sure, the sums are still implied, but it doesn't leave me with an invariant, or an operator that transforms the same way as the original, right?

4. Sep 23, 2009

### CompuChip

I don't think
$$q^\mu \sigma^{\mu \nu}$$
is a valid expression. You might mean
$$q_\mu \sigma^{\mu \nu}$$
but that will transform like a vector (only one with the index upstairs).

5. Sep 23, 2009

### Hepth

That's what I think, but following my notation:

$$\left(\sigma_{\mu\nu} q^\nu\right)^\dagger$$
=
$$\left(q^\nu\right)^\dagger \left(\sigma_{\mu\nu}\right)^\dagger$$
=
$$q^\nu \left(\sigma_{\mu\nu}\right)^\dagger$$

$$\left(\sigma_{\mu\nu}\right)^\dagger = -\frac{i}{2} \left[\gamma_\mu , \gamma_\nu \right]^\dagger = -\frac{i}{2}\left( \left(\gamma_\mu \gamma_\nu \right)^\dagger - \left(\gamma_\nu \gamma_\mu \right)^\dagger\right) = -\frac{i}{2}\left( \gamma_\nu^\dagger \gamma_\mu^\dagger - \gamma_\mu^\dagger \gamma_\nu^\dagger \right)$$
with $$\gamma_\mu^\dagger = \gamma^\mu$$ right?
so
$$=-\frac{i}{2}\left( \gamma^\nu \gamma^\mu - \gamma^\mu \gamma^\nu \right) =\frac{i}{2}\left( \gamma^\mu \gamma^\nu - \gamma^\nu \gamma^\mu \right) = \sigma^{\mu \nu}$$

Therefore
$$\left(\sigma_{\mu\nu} q^\nu\right)^\dagger = q^\nu \sigma^{\mu \nu}$$

Am I doing one of these wrong?
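This derivation can be spot-checked numerically. A minimal numpy sketch, assuming the Dirac representation and metric diag(1,-1,-1,-1) (so the identity is representation-dependent as stated), verifying $\left(\sigma_{\mu\nu}\right)^\dagger = \sigma^{\mu\nu}$ component by component:

```python
import numpy as np

# Pauli matrices and 2x2 blocks
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
Z2 = np.zeros((2, 2), dtype=complex)

# Dirac-representation gammas with upper index; metric diag(1, -1, -1, -1)
g0 = np.block([[I2, Z2], [Z2, -I2]])
gammas_up = [g0] + [np.block([[Z2, s], [-s, Z2]]) for s in (s1, s2, s3)]
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# Lowered-index gammas: gamma_mu = g_{mu nu} gamma^nu
gammas_dn = [eta[mu, mu] * gammas_up[mu] for mu in range(4)]

def sigma(g, mu, nu):
    """sigma_{mu nu} = (i/2)[gamma_mu, gamma_nu], for whichever index position g holds."""
    return 0.5j * (g[mu] @ g[nu] - g[nu] @ g[mu])

# Check (sigma_{mu nu})^dagger = sigma^{mu nu} for every index pair
for mu in range(4):
    for nu in range(4):
        lhs = sigma(gammas_dn, mu, nu).conj().T
        rhs = sigma(gammas_up, mu, nu)
        assert np.allclose(lhs, rhs), (mu, nu)
print("(sigma_{mu nu})^dagger == sigma^{mu nu} in the Dirac basis")
```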

6. Sep 24, 2009

### turin

Is that independent of the basis that you choose for the gammas? (I would be surprised if it is, but also happy to know about this.) Is there a reason that you have chosen to avoid such a construction as:
$$A\gamma^{\mu\dagger}A=\gamma^{\mu}$$
(where, for me $A$ usually means $\gamma^{0}$)?

7. Sep 24, 2009

### turin

It depends on your notation.¹ At least I agree that this is an archaic notation (since the unification of GR and QM is in fashion). Also, the OP obviously wants to make a distinction between raised and lowered indices, but I'm not clear what this distinction should really be in the OP's context.

¹ See, for example, J. J. Sakurai, *Advanced Quantum Mechanics*, Addison-Wesley (1967), ISBN 0-201-06710-2.

8. Sep 24, 2009

### Hepth

If you're in the Dirac basis, http://en.wikipedia.org/wiki/Gamma_matrices,
and your metric is {1,-1,-1,-1} diagonal (for calculating this cross section), then isn't it true that
$$\left(\gamma^0\right)^\dagger = \gamma_0 = \gamma^0$$
$$\left(\gamma^1\right)^\dagger = -\gamma^1 = \gamma_1$$
$$\left(\gamma^2\right)^\dagger = -\gamma^2 = \gamma_2$$
$$\left(\gamma^3\right)^\dagger = -\gamma^3 = \gamma_3$$

Ahh, so you're saying I have to choose a basis first. Well, can't I just choose the Dirac basis and stick with it? Or should/must I leave it completely general? I mean, I'm using Dirac spinors for my wavefunctions.
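Those four relations can be checked directly; a minimal numpy sketch, assuming the standard Dirac representation from the Wikipedia page linked above and metric diag(1,-1,-1,-1):

```python
import numpy as np

# Pauli matrices and 2x2 blocks
s = [np.array(m, dtype=complex) for m in
     ([[0, 1], [1, 0]], [[0, -1j], [1j, 0]], [[1, 0], [0, -1]])]
I2, Z2 = np.eye(2, dtype=complex), np.zeros((2, 2), dtype=complex)

# Dirac representation: gamma^0 = diag(I, -I), gamma^i = [[0, sigma_i], [-sigma_i, 0]]
gamma = [np.block([[I2, Z2], [Z2, -I2]])] + \
        [np.block([[Z2, si], [-si, Z2]]) for si in s]
eta = [1, -1, -1, -1]   # diagonal of the metric

# (gamma^mu)^dagger should equal gamma_mu = eta_mu * gamma^mu (no sum)
for mu in range(4):
    assert np.allclose(gamma[mu].conj().T, eta[mu] * gamma[mu])
print("(gamma^mu)^dagger = gamma_mu holds in the Dirac basis")
```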

9. Sep 24, 2009

### Hepth

Using obviously $$\gamma_\mu = g_{\mu \nu} \gamma^\nu$$

10. Sep 24, 2009

### turin

I'm not saying that you have to, but I always do. What I'm really suggesting is that you can just use
$$\gamma^{\mu\dagger}=\gamma^{0}\gamma^{\mu}\gamma^{0}$$
and not worry about the raising and lowering with the Hermitian conjugate (for Lorentz tensor indices anyway). However, this does provide some kind of uniformity to the peculiar raising and lowering of spinor and flavor indices that one encounters in SUSY. I just never realized that I should also be raising and lowering a Lorentz tensor index under Hermitian conjugation.

11. Sep 24, 2009

### turin

So, you're saying that
$$\gamma^{\mu\dagger} = g_{\mu \nu} \gamma^\nu$$
so that the Hermitian conjugate acts on the gamma matrices in the same way that the metric tensor does. Then, I think that this is just a special property of the gamma matrices (and even perhaps a special subset of representations), and you should not extend this to general Lorentz-vector-valued objects. For instance, if $q^\mu$ is a Hermitian Lorentz-vector-operator, then your rule would suggest that
$$q^\mu = g_{\mu \nu} q^\nu$$
which would only be true in certain frames (i.e. the rest frame of q using your metric signature).
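A quick numerical illustration of that last point, with a made-up example momentum (my numbers, purely for illustration): lowering flips the spatial components, so $q^\mu$ and $g_{\mu\nu}q^\nu$ only agree when the spatial part vanishes.

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])   # metric diag(1, -1, -1, -1)
q_up = np.array([5.0, 1.0, 2.0, 3.0])    # generic momentum, not at rest
q_dn = eta @ q_up                        # q_mu = g_{mu nu} q^nu

# In general q^mu != q_mu, since lowering flips the spatial components...
assert not np.allclose(q_up, q_dn)

# ...but they agree when the spatial part vanishes, i.e. in the rest frame
q_rest_up = np.array([5.0, 0.0, 0.0, 0.0])
assert np.allclose(q_rest_up, eta @ q_rest_up)
print("q^mu = q_mu only when the spatial components vanish")
```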

12. Sep 24, 2009

### Hepth

No, I wasn't trying to generalize, just talking about gamma matrices; actually doing the matrix math, you see, in the Dirac basis at least, that it is true for a {1,-1,-1,-1} metric.

But, as was said earlier, I just ended up using $$\gamma_0 \gamma_\mu^\dagger \gamma_0 = \gamma_\mu$$

Now I get to simplify 256 combinations of gamma matrices....
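For simplifying long gamma-matrix products by hand, the standard trace identities make a useful numerical cross-check; a minimal numpy sketch, assuming the Dirac representation and metric diag(1,-1,-1,-1):

```python
import itertools
import numpy as np

# Dirac-representation gammas, metric diag(1, -1, -1, -1)
s = [np.array(m, dtype=complex) for m in
     ([[0, 1], [1, 0]], [[0, -1j], [1j, 0]], [[1, 0], [0, -1]])]
I2, Z2 = np.eye(2, dtype=complex), np.zeros((2, 2), dtype=complex)
gamma = [np.block([[I2, Z2], [Z2, -I2]])] + \
        [np.block([[Z2, si], [-si, Z2]]) for si in s]
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# Tr(gamma^mu gamma^nu) = 4 g^{mu nu}
for mu in range(4):
    for nu in range(4):
        assert np.isclose(np.trace(gamma[mu] @ gamma[nu]), 4 * eta[mu, nu])

# Tr(gamma^mu gamma^nu gamma^rho gamma^sigma)
#   = 4 (g^{mu nu} g^{rho sigma} - g^{mu rho} g^{nu sigma} + g^{mu sigma} g^{nu rho})
for mu, nu, rho, sg in itertools.product(range(4), repeat=4):
    tr = np.trace(gamma[mu] @ gamma[nu] @ gamma[rho] @ gamma[sg])
    expected = 4 * (eta[mu, nu] * eta[rho, sg]
                    - eta[mu, rho] * eta[nu, sg]
                    + eta[mu, sg] * eta[nu, rho])
    assert np.isclose(tr, expected)
print("two- and four-gamma trace identities verified numerically")
```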

13. Sep 25, 2009

### turin

There are computer algebra systems for that. Are you familiar with REDUCE?

14. Sep 26, 2009

### Hepth

No, but I heard about FeynCalc for Mathematica, which I was informed should do it.

15. Sep 26, 2009

### turin

That works if you're willing to sell your soul ... No, seriously, that should work feyn. REDUCE is simply the free alternative that came to mind.