Hermitian conjugation of a four-vector

miemie0205

Homework Statement


$$M=C/m(k\cdot k'\,g^{\mu\nu} - k^{\nu}k'^{\mu})\epsilon^*_{\mu}(k,\lambda)\,\epsilon_{\nu}(k',\lambda')$$
Calculate $$\sum _{\lambda} |M|^2$$

Homework Equations


$$\sum _{\lambda}\epsilon ^*_{\mu}\epsilon _{\nu}=-g_{\mu\nu}$$

The Attempt at a Solution


Firstly, I find $$M^{\dagger}= C/m[k\cdot k'\,g_{\mu\nu} + k_{\nu}k'_{\mu}]\epsilon_{\nu}\epsilon^*_{\mu}$$
Is this right? I'm confused because I usually calculate the Hermitian conjugate of an operator, not of a specific four-vector like $$k_{\mu}$$.
 
You should probably clarify what each factor is. I'm guessing that the ##\epsilon_\mu(k,\lambda)## are complex-valued polarization vectors which mutually commute? If so, I'm not sure you need ##M^\dagger##, but merely ##\overline M## (i.e., the complex conjugate of ##M##). In that case the final factors would be something like ##\epsilon_\mu \, \epsilon^*_\nu##.

Then perform the computation directly. (Choose a different pair of dummy indices in ##\overline M## or you'll end up in a mess.)
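
For instance (just a sketch, assuming ##C## is a real constant and ##k, k'## are real, so that only the polarization vectors pick up a conjugation), with a fresh pair of dummy indices ##\alpha,\beta## the conjugate would look like
$$\overline M = C/m\left(k\cdot k'\, g^{\alpha\beta} - k^\beta k'^{\alpha}\right)\epsilon_\alpha(k,\lambda)\,\epsilon^*_\beta(k',\lambda') ~.$$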
 
My bad. I need to correct some points.
$$M=C/m(k\cdot k' g^{\mu\nu} - k^\nu k'^\mu)\epsilon^*_\mu(k,\lambda) \epsilon^*_\nu(k',\lambda') $$
where $$\epsilon^*_\mu(k,\lambda)$$ and $$\epsilon^*_\nu(k',\lambda')$$ are polarization vectors.
I am fine with conjugating these polarization vectors; the conjugates are $$\epsilon_\mu(k,\lambda)$$ and $$\epsilon_\nu(k',\lambda')$$.
And the conjugate of $$k\cdot k' g^{\mu\nu}$$ is $$k\cdot k' g_{\mu\nu}$$
What is the conjugate of $$k^\nu k'^\mu$$? Is it $$k^{*\nu}k'^{*\mu}$$?
From all the books on quantum physics that I have, I know how to take the Hermitian or complex conjugate of an operator or a matrix, but I am confused about 4-momenta like $$k^\nu k'^\mu$$.
 
miemie0205 said:
And the conjugate of $$k\cdot k' g^{\mu\nu}$$ is $$k\cdot k' g_{\mu\nu}$$
Are you sure about that? :oldwink:

Aren't those things all real-valued? I'm guessing that ##{\mathbf k}## is a real-valued vector, not complex-valued. (?)

Btw, ##g^{\mu\nu}## denotes the components of the matrix inverse to the matrix whose components are ##g_{\mu\nu}##.

miemie0205 said:
What is the conjugate of $$k^\nu k'^\mu$$? Is it $$k^{*\nu}k'^{*\mu}$$?
If ##{\mathbf k}## is a real-valued vector, you don't have to do anything. (Note that ##M## as a whole is a scalar, afaict.)

If ##{\mathbf k}## were complex-valued, then you'd just take the complex conjugate of each component.
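
For example (purely illustrative, not from your problem): a complex 4-vector ##a^\mu## conjugates component by component,
$$(a^\mu)^* = \big((a^0)^*,\,(a^1)^*,\,(a^2)^*,\,(a^3)^*\big) ~,$$
whereas for a real photon momentum like ##k^\mu## the conjugate is just ##k^\mu## again.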

HTH.
 
From the book that I am using, the metric tensor is
$$g_{\mu\nu}=g^{\mu\nu}= \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & -1 & 0 & 0\\
0 & 0 & -1 & 0\\
0 & 0 & 0 & -1
\end{pmatrix}$$
and k, k' are the four-momenta of the two photons in the final state of $$h(q) \to \gamma(k,\lambda)\,\gamma(k',\lambda')$$,
where h(q) is the Higgs boson. So k and k' are real, aren't they?
What confused me is that $$k, k'$$ (with no indices) and $$k^\nu, k'^\mu$$ appear together in the parentheses.
I have to take this Particle Physics class in less than three weeks, so I apologize if my questions are silly. :cry:
 
Yes, ##k## and ##k'## are real-valued 4-vectors. Note that ##k## (no index) usually means the whole vector, whereas ##k^\mu## means the ##\mu##'th component thereof. Hence, ##k \cdot k'## is probably shorthand for ##k^\alpha k'_\alpha##, which is the same as ##g_{\alpha\beta} k^\alpha k'^\beta##.
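
Spelled out with the metric you quoted (nothing new, just expanding the shorthand):
$$k\cdot k' = g_{\alpha\beta}\,k^\alpha k'^{\beta} = k^0 k'^0 - k^1 k'^1 - k^2 k'^2 - k^3 k'^3 ~.$$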

Btw, I don't regard any question as "silly" if the student is genuinely trying to learn. :oldbiggrin:
We just need to zero in on exactly what you don't understand.
 
So ##k\cdot k'=\sum_{\alpha} k_\alpha k'^\alpha##, right?
Within the same book, when they mention a 4-vector they write p (without an arrow above it). A normal vector they write as ##\vec p## or ##\bf p##, and ##p^\mu## is the contravariant vector, ##p^\mu=(E, p_x, p_y, p_z)##.
Then I can rewrite M in another form, as follows:
$$M=C/m(k\cdot k'\, g^{\mu\nu} - k^\nu k'^\mu)\epsilon^*_\mu(k,\lambda)\, \epsilon^*_\nu(k',\lambda') = C/m(g_{\mu\nu}k^\mu k'^\nu - k^\nu k'^\mu)\epsilon^*_\mu(k,\lambda)\, \epsilon^*_\nu(k',\lambda')$$
At this point everything makes sense.
And are ##g_{\mu\nu}## and ##g^{\mu\nu}## equal?
 
miemie0205 said:
So ##k\cdot k'=\sum_{\alpha} k_\alpha k'^\alpha##, right?
Yes. But since you need to ask this question, read the Wikipedia page about the Einstein summation convention. I was using that convention when I wrote ##k_\alpha k'^\alpha## -- the summation over ##\alpha## is understood and need not be shown explicitly.
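
If it helps to see the convention in action, here is a minimal NumPy sketch (the numbers are made up and have nothing to do with your problem):

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -), matching the book's convention
g = np.diag([1.0, -1.0, -1.0, -1.0])

# Two made-up (massless) photon four-momenta, k^mu = (E, px, py, pz)
k  = np.array([3.0, 0.0, 0.0, 3.0])
kp = np.array([2.0, 0.0, 2.0, 0.0])

# k . k' = g_{ab} k^a k'^b : the repeated indices a and b are summed over
dot = np.einsum('ab,a,b->', g, k, kp)

# The same contraction written out as an explicit double sum
dot_explicit = sum(g[a, b] * k[a] * kp[b] for a in range(4) for b in range(4))

print(dot, dot_explicit)  # both print 6.0
```

The 'ab,a,b->' string tells einsum to sum over every repeated index, which is exactly what the summation convention leaves implicit.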

miemie0205 said:
Within the same book
Which book? (Btw, it's always best to mention your sources at the beginning of this kind of thread.)

miemie0205 said:
when they mention a 4-vector they write p (without an arrow above it). A normal vector they write as ##\vec p## or ##\bf p##, and ##p^\mu## is the contravariant vector, ##p^\mu=(E, p_x, p_y, p_z)##
You'd have to check the author's conventions carefully. Sometimes the over-arrow denotes a 3-vector. Sometimes bold font denotes a 3-vector, while ordinary font denotes a 4-vector. Sometimes Greek indices denote 4-vector indices, while Latin indices (i, j, k, etc.) denote 3-vector indices.

miemie0205 said:
Then I can rewrite M in another form, as follows:
$$M=C/m(k\cdot k'\, g^{\mu\nu} - k^\nu k'^\mu)\epsilon^*_\mu(k,\lambda)\, \epsilon^*_\nu(k',\lambda') = C/m(g_{\mu\nu}k^\mu k'^\nu - k^\nu k'^\mu)\epsilon^*_\mu(k,\lambda)\, \epsilon^*_\nu(k',\lambda')$$
At this point everything makes sense.
NO. Your right hand side is wrong. You should have written $$C/m(k_\alpha k'^\alpha g^{\mu\nu} - k^\nu k'^\mu)\epsilon^*_\mu(k,\lambda) \epsilon^*_\nu(k',\lambda') ~.$$Can you see the difference? In the original ##M## there's implicit summation over ##\mu## and ##\nu##, so ##M## is a scalar. You can't just introduce more of the same indices inside the expression. (Write out ##M## with explicit summation signs to see what I mean.)
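
Spelled out with explicit summation signs (the same expression, just unabbreviated), the original ##M## reads
$$M = C/m\sum_{\mu=0}^{3}\sum_{\nu=0}^{3}\left[\left(\sum_{\alpha=0}^{3} k_\alpha k'^{\alpha}\right)g^{\mu\nu} - k^\nu k'^{\mu}\right]\epsilon^*_\mu(k,\lambda)\,\epsilon^*_\nu(k',\lambda') ~,$$
so ##\mu## and ##\nu## are already summed over inside ##M## and cannot be reused for a further contraction.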

Btw, both your ##\epsilon## now seem to have a star, but in your OP only one of them did.

miemie0205 said:
And are ##g_{\mu\nu}## and ##g^{\mu\nu}## equal?
Those components happen to be numerically equal in this case because that matrix is its own inverse. I.e., let $$g ~:=~ \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & -1 & 0 & 0\\
0 & 0 & -1 & 0\\
0 & 0 & 0 & -1
\end{pmatrix}$$Then ##g^{-1} = g##. The matrix equation ##g g^{-1} = 1## is written in component form as
$$g_{\mu\lambda}\, g^{\lambda\nu} ~=~ \delta_\mu^\nu ~,$$where ##\delta_\mu^\nu## is the Kronecker delta. You'll need to become familiar with translating between such matrix notation and component notation, and understand how to use the summation convention on repeated indices.
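
A quick numerical check of that last identity (a NumPy sketch, not specific to this problem):

```python
import numpy as np

# g_{mu nu}: metric with lower indices in the (+, -, -, -) convention
g_lower = np.diag([1.0, -1.0, -1.0, -1.0])

# g^{mu nu}: by definition the matrix inverse of g_lower;
# for this diagonal metric it turns out to be the same matrix
g_upper = np.linalg.inv(g_lower)

# Component form of g g^{-1} = 1:  g_{mu lambda} g^{lambda nu} = delta_mu^nu
# (the repeated index lambda is summed, leaving the Kronecker delta)
delta = np.einsum('ml,ln->mn', g_lower, g_upper)

print(np.allclose(g_upper, g_lower))   # True: the metric is its own inverse here
print(np.allclose(delta, np.eye(4)))   # True: the contraction gives the identity
```

The einsum contraction over the repeated index plays the role of the implicit sum over ##\lambda## in the component equation.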
 