Divergence of the product of a Killing vector and the energy-momentum tensor vanishes. Why?


Discussion Overview

The discussion centers around the divergence of the product of a Killing vector and the energy-momentum tensor, specifically exploring why the expression \((T^{\mu\nu}\xi_\mu)_{;\nu} = 0\) holds true under certain conditions. Participants delve into the implications of the symmetry of the energy-momentum tensor and the properties of covariant derivatives.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant questions the derivation of \((T^{\mu\nu}\xi_\mu)_{;\nu} = 0\) from the conditions \(T^{\mu\nu}{}_{;\nu} = 0\) and the symmetry of \(T^{\mu\nu}\).
  • Another participant suggests expanding \((T^{\mu\nu}\xi_\mu)_{;\nu}\) to analyze its components.
  • A participant provides an expansion leading to the expression \((T^{\mu\nu}\xi_\mu)_{;\nu} = 0 + T^{\mu\nu}(\xi_\mu)_{;\nu}\), questioning why the remaining term must equal zero.
  • Discussion includes the symmetry of \(T^{\mu\nu}\) as a potential factor in the analysis.
  • Participants explore the implications of relabeling indices in summations and whether it affects the validity of their expressions.
  • One participant presents a formula involving the average of two terms, questioning the justification for this step.
  • Clarifications are made regarding the relabeling of summation indices and its correctness.

Areas of Agreement / Disagreement

Participants express differing views on the derivation and implications of the expressions involving the energy-momentum tensor and the Killing vector. No consensus is reached on the justification of certain steps in the derivation.

Contextual Notes

Participants rely on the properties of the energy-momentum tensor and the Killing vector, but the discussion reveals uncertainties regarding the implications of symmetry and the treatment of indices in summations.
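
For reference, the derivation the thread converges on can be written compactly; it combines the conservation law, the symmetry of \(T^{\mu\nu}\), and Killing's equation \(\xi_{\mu;\nu} + \xi_{\nu;\mu} = 0\):

[tex]
\begin{align}
\left(T^{\mu\nu}\xi_\mu\right)_{;\nu}
  &= T^{\mu\nu}{}_{;\nu}\,\xi_\mu + T^{\mu\nu}\xi_{\mu;\nu}
   = T^{\mu\nu}\xi_{\mu;\nu} \\
  &= \tfrac{1}{2}\left( T^{\mu\nu}\xi_{\mu;\nu} + T^{\nu\mu}\xi_{\nu;\mu} \right)
   &&\text{(relabel dummy indices in the second term)} \\
  &= \tfrac{1}{2}\,T^{\mu\nu}\left( \xi_{\mu;\nu} + \xi_{\nu;\mu} \right)
   &&\text{(symmetry } T^{\nu\mu} = T^{\mu\nu}\text{)} \\
  &= 0 &&\text{(Killing's equation)}
\end{align}
[/tex]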

Derivator
Hi,

in my book, it says:
-----------------------
Because of [tex]T^{\mu\nu}{}_{;\nu} = 0[/tex] and the symmetry of [tex]T^{\mu\nu}[/tex], it holds that

[tex]\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = 0[/tex]
-----------------------

(Here, [tex]T^{\mu\nu}[/tex] is the energy-momentum tensor and [tex]\xi_\mu[/tex] a Killing vector. The semicolon indicates the covariant derivative, i.e. [tex](\;)_{;}[/tex] is the generalized divergence.)

I don't understand why from

" [tex]T^{\mu\nu}{}_{;\nu} = 0[/tex] and the symmetry of [tex]T^{\mu\nu}[/tex] "

it follows that

[tex]\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = 0[/tex]

must hold.

---
derivator
 
What results when

[tex]\left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = 0[/tex]

is expanded?
 
[tex] \left(T^{\mu\nu}\xi_\mu\right)_{;\nu} = T^{\mu\nu}{}_{;\nu}\,\xi_\mu + T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = 0 + T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} [/tex]

So [tex] T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} [/tex] should be equal to 0. But why?
 
Use the fact that T^{\mu\nu} is symmetric.
 
Let's write [tex] T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} [/tex] without the Einstein summation convention:

[tex] \sum_\mu\sum_\nu T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\mu\right)_{;\nu} = - \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu} [/tex]

I see no way to get this to equal 0.
:-(
 
Is

[tex]\sum_\mu\sum_\nu T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = \sum_\alpha\sum_\beta T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}[/tex]

correct?
 
why shouldn't it be correct? You only changed the names of the indices?!
 
Derivator said:
why shouldn't it be correct? You only changed the names of the indices?!

What about

[tex]\sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu} = \sum_\beta\sum_\alpha T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}[/tex]
 
[tex] T^{\mu\nu}(\xi_\mu)_{;\nu} = (1/2)\left( T^{\mu\nu}\xi_{\mu;\nu} + T^{\nu\mu}\xi_{\nu;\mu} \right) = (1/2)T^{\mu\nu}\left(\xi_{\mu;\nu} + \xi_{\nu;\mu} \right) = 0[/tex]
 
  • #10
George Jones said:
What about

[tex]\sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu} = \sum_\beta\sum_\alpha T^{\alpha\beta} \left(\xi_\alpha\right)_{;\beta}[/tex]
Yes, it's also correct. But I don't see your point.
 
  • #11
Derivator said:
Yes, it's also correct. But I don't see your point.

You wrote
Derivator said:
[tex]\sum_\mu\sum_\nu T^{\mu\nu} \left(\xi_\mu\right)_{;\nu} = \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\mu\right)_{;\nu} = - \sum_\mu\sum_\nu T^{\nu\mu} \left(\xi_\nu\right)_{;\mu}[/tex]

Substitute the relabeled expressions into the left and right sides of the above.
 
  • #12
Oh I see! Thanks!

---------

@samalkhaiat

how do you justify this step:
[tex] T^{\mu\nu}(\xi_\mu)_{;\nu} = (1/2)\left( T^{\mu\nu}\xi_{\mu;\nu} + T^{\nu\mu}\xi_{\nu;\mu} \right) [/tex]

?

Is it true that you only relabeled the summation indices in
[tex] T^{\nu\mu}\xi_{\nu;\mu} [/tex]?
 
  • #13
That is correct.
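
The key fact underlying the final step — the full contraction of a symmetric tensor with an antisymmetric tensor vanishes — can be checked numerically. A minimal sketch with NumPy, using random matrices as stand-ins for [tex]T^{\mu\nu}[/tex] and [tex]\xi_{\mu;\nu}[/tex] (the latter is antisymmetric by Killing's equation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for T^{mu nu}: symmetric by construction.
A = rng.standard_normal((4, 4))
T = A + A.T                      # T[m, n] == T[n, m]

# Stand-in for xi_{mu;nu}: antisymmetric, as Killing's equation requires.
B = rng.standard_normal((4, 4))
xi_cov = B - B.T                 # xi_cov[m, n] == -xi_cov[n, m]

# Full contraction sum_{mu,nu} T^{mu nu} xi_{mu;nu}:
# each (m, n) term cancels against its (n, m) partner.
contraction = np.einsum('mn,mn->', T, xi_cov)
print(abs(contraction) < 1e-12)  # True
```

The cancellation is exactly the index-relabeling argument from the thread: the (m, n) and (n, m) terms of the double sum are negatives of each other.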
 