Divergence of the product of a Killing vector and the energy-momentum tensor vanishes. Why?

Derivator
Hi,

in my book, it says:
-----------------------
Because of T^{\mu\nu}{}_{;\nu} = 0 and the symmetry of T^{\mu\nu}, it holds that

\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = 0
-----------------------

(Here, T^{\mu\nu} is the energy-momentum tensor and \xi_\mu is a Killing vector. The semicolon indicates the covariant derivative, i.e. (\,)_{;\nu} denotes the generalized divergence.)

I don't understand why from

" T^{\mu\nu}{}{}_{;\nu} = 0 and the symmetry of T^{\mu\nu} "

it follows that

\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = 0

must hold.

---
derivator
 
What results when

\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = 0

is expanded?
 
\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = \left(T^{\mu\nu}\right)_{;\nu}\xi_\mu + T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = 0 + T^{\mu\nu} \left(\xi_\mu\right)_{;\nu}

So T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} should be equal to 0. But why?
 
Use the fact that T^{\mu\nu} is symmetric.
 
Let's write T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} without the Einstein summation convention:

\sum_\mu\sum_\nu T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\mu\right)_{;\nu} = - \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu}

I see no chance to get it equal to 0.
:-(
 
Is

\sum_\mu\sum_\nu T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = \sum_\alpha\sum_\beta T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}

correct?
 
Why shouldn't it be correct? You only changed the names of the indices?!
 
Derivator said:
Why shouldn't it be correct? You only changed the names of the indices?!

What about

\sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu} = \sum_\beta\sum_\alpha T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}
 
T^{\mu\nu}(\xi_\mu)_{;\nu} = \frac{1}{2}\left( T^{\mu\nu}\xi_{\mu;\nu} + T^{\nu\mu}\xi_{\nu;\mu} \right) = \frac{1}{2}T^{\mu\nu}\left(\xi_{\mu;\nu} + \xi_{\nu;\mu} \right) = 0
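
(The last equality uses Killing's equation, which a Killing vector satisfies by definition:

\xi_{\mu;\nu} + \xi_{\nu;\mu} = 0

so the symmetric T^{\mu\nu} is contracted with an expression that vanishes identically.)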
 
George Jones said:
What about

\sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu} = \sum_\beta\sum_\alpha T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}
Yes, it's also correct. But I don't see your point.
 
Derivator said:
Yes, it's also correct. But I don't see your point.

You wrote
Derivator said:
\sum_\mu\sum_\nu T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\mu\right)_{;\nu} = - \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu}

Substitute the relabeled expressions into the left and right sides of the above.
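
Spelling the substitution out: relabeling \mu\to\alpha, \nu\to\beta on the left-hand side and \nu\to\alpha, \mu\to\beta on the right-hand side gives

\sum_\alpha\sum_\beta T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta} = - \sum_\alpha\sum_\beta T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}

so the sum equals its own negative and must vanish, i.e. T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = 0.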
 
Oh I see! Thanks!

---------

@samalkhaiat

How do you justify this step:
T^{\mu\nu}(\xi_\mu)_{;\nu} = \frac{1}{2}\left( T^{\mu\nu}\xi_{\mu;\nu} + T^{\nu\mu}\xi_{\nu;\mu} \right)

?

Is it true that you only relabeled the summation indices in

T^{\nu\mu}\xi_{\nu;\mu}

?
 
That is correct.
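
Explicitly: since both indices are summed over, renaming the dummy indices \nu\to\mu and \mu\to\nu gives

T^{\nu\mu}\xi_{\nu;\mu} = T^{\mu\nu}\xi_{\mu;\nu}

so the average of the two terms equals the original expression; the symmetry T^{\nu\mu} = T^{\mu\nu} is what then lets you factor out T^{\mu\nu} in the following step.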
 