Index Raising in Linearized General Relativity

In summary: applying the flat metric to the linearized Christoffel symbols, I expected [itex]\eta^{\alpha\beta}[/itex] to raise each [itex]\beta[/itex] to an [itex]\alpha[/itex] in place, but the book writes the first term with the raised [itex]\alpha[/itex] in the first index slot.
  • #1
alex3
I'm reading a few textbooks on GR (Straumann, Schutz, Hartle) and am a little confused by the short section each has on linearized GR.

1. Relevant equations

Using Straumann, the Ricci tensor to first order in [itex]h[/itex] (the quadratic [itex]\Gamma\Gamma[/itex] terms drop out at this order) is given by

[tex]
R_{\mu\nu} =
\partial_{\lambda} \Gamma^{\lambda}_{\phantom{k}\nu\mu} -
\partial_{\nu} \Gamma^{\lambda}_{\phantom{k}\lambda\mu}
[/tex]

with the Christoffel symbols given by

[tex]
\Gamma^{\alpha}_{\phantom{k}\mu\nu}
=
\frac{1}{2}\eta^{\alpha\beta}
(
h_{\mu\beta,\nu} +
h_{\beta\nu,\mu} -
h_{\mu\nu,\beta}
)
[/tex]
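
As a side check, this linearized formula can be verified symbolically. Below is a minimal sympy sketch; the single-component perturbation [itex]h_{12} = f(t,z)[/itex] is a toy example of my own choosing, not from any of the textbooks.

[code]
import sympy as sp

t, x, y, z = sp.symbols('t x y z')
coords = [t, x, y, z]
eps = sp.Symbol('epsilon')

# Flat metric with signature (+, -, -, -) and a toy perturbation
# carrying a single wave-like component h_{12} = h_{21} = f(t, z).
eta = sp.diag(1, -1, -1, -1)
h = sp.zeros(4, 4)
h[1, 2] = h[2, 1] = sp.Function('f')(t, z)

g = eta + eps * h
g_inv = g.inv()

def gamma(a, mu, nu):
    """Gamma^a_{mu nu} = (1/2) g^{ab} (g_{mu b,nu} + g_{b nu,mu} - g_{mu nu,b}),
    truncated to first order in epsilon."""
    expr = sum(
        sp.Rational(1, 2) * g_inv[a, b] * (
            sp.diff(g[mu, b], coords[nu])
            + sp.diff(g[b, nu], coords[mu])
            - sp.diff(g[mu, nu], coords[b])
        )
        for b in range(4)
    )
    return sp.series(sp.simplify(expr), eps, 0, 2).removeO()

# Example: gamma(1, 0, 2) returns -epsilon/2 * df/dt, which matches the
# term (1/2) eta^{11} h_{12,0} of the linearized formula above.
[/code]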

2. The problem

My problem is that the book is confusing me with the next equality. This is what I expected when applying the flat metric:

[tex]
\Gamma^{\alpha}_{\phantom{k}\mu\nu}
=
\frac{1}{2}
(
\eta^{\alpha\beta}h_{\mu\beta,\nu} +
\eta^{\alpha\beta}h_{\beta\nu,\mu} -
\eta^{\alpha\beta}h_{\mu\nu,\beta}
)
\\
\Gamma^{\alpha}_{\phantom{k}\mu\nu}
=
\frac{1}{2}
(
h_{\mu\phantom{\alpha},\nu}^{\phantom{k}\alpha} +
h^{\alpha}_{\phantom{\alpha}\nu,\mu} -
h_{\mu\nu}^{\phantom{\mu\nu},\alpha}
)
[/tex]

i.e. the flat metric raises all [itex]\beta[/itex]'s to [itex]\alpha[/itex]'s.

However, the book gets this

[tex]
\Gamma^{\alpha}_{\phantom{k}\mu\nu}
=
\frac{1}{2}
(
h^{\alpha}_{\phantom{\alpha}\mu,\nu} +
h^{\alpha}_{\phantom{\alpha}\nu,\mu} -
h_{\mu\nu}^{\phantom{\mu\nu},\alpha}
)
[/tex]

So, the problem is in the first term: how come the book is able to swap the [itex]\alpha[/itex] and [itex]\mu[/itex] like that?
 
  • #2
Because [itex]h_{\mu \beta , \nu}[/itex] is symmetric in [itex]\mu[/itex] and [itex]\beta[/itex],

[tex]h_{\mu\phantom{\alpha},\nu}^{\phantom{k}\alpha} = \eta^{\alpha\beta} h_{\mu\beta,\nu} = \eta^{\alpha \beta} h_{\beta \mu , \nu} = h^{\alpha}_{\phantom{\alpha}\mu,\nu}.[/tex]
 
  • #3
How do we know that [itex]h_{\alpha\beta}[/itex] is symmetric? I can't see it mentioned anywhere. The only condition I see is [itex]\lvert h_{\alpha\beta}\rvert \ll 1[/itex].
 
  • #4
alex3 said:
How do we know that [itex]h_{\alpha\beta}[/itex] is symmetric? I can't see it mentioned anywhere. The only condition I see is [itex]\lvert h_{\alpha\beta}\rvert \ll 1[/itex].

[itex]h_{\alpha\beta} = g_{\alpha\beta} - \eta_{\alpha\beta}[/itex], and [itex]g[/itex] and [itex]\eta[/itex] are both symmetric.
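
Written out, using the symmetry of both terms on the right-hand side:

[tex]h_{\alpha\beta} = g_{\alpha\beta} - \eta_{\alpha\beta} = g_{\beta\alpha} - \eta_{\beta\alpha} = h_{\beta\alpha}[/tex]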
 
  • #5
Why do we assume [itex]g_{\alpha\beta}[/itex] is symmetric then? Is that a property we assume of all metrics? I didn't think we did. Do we assume symmetry of [itex]g_{\alpha\beta}[/itex] as it deviates only slightly from the Minkowski metric?
 
  • #6
alex3 said:
Why do we assume [itex]g_{\alpha\beta}[/itex] is symmetric then? Is that a property we assume of all metrics?

In standard general relativity, yes.

alex3 said:
I didn't think we did. Do we assume symmetry of [itex]g_{\alpha\beta}[/itex] as it deviates only slightly from the Minkowski metric?

No, a symmetric [itex]g[/itex] can differ substantially from the Minkowski metric.

If the metric weren't symmetric, it would not always have a tangent space isomorphic to Minkowski spacetime: for a metric tensor field that is not symmetric, there is at least one point (event) at which the metric on the tangent space fails to be symmetric.
 
  • #7
Got it now, thank you very much!
 

1. What is index raising in linearized general relativity?

Index raising is the operation of converting a lower (covariant) index into an upper (contravariant) index by contracting the tensor with the inverse metric. In linearized general relativity the indices of the perturbation [itex]h_{\mu\nu}[/itex] are raised with the flat inverse metric [itex]\eta^{\alpha\beta}[/itex].
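
For a concrete picture, here is a minimal numerical sketch in Python. The signature convention diag(1, -1, -1, -1) and the perturbation values are illustrative assumptions, not taken from the thread.

[code]
import numpy as np

# Minkowski metric with signature (+, -, -, -); in these coordinates
# its inverse has the same components.
eta = np.diag([1.0, -1.0, -1.0, -1.0])
eta_inv = np.linalg.inv(eta)

# A made-up small symmetric perturbation h_{mu nu}.
h = 1e-3 * np.array([
    [2.0, 0.5, 0.0, 0.0],
    [0.5, 1.0, 0.3, 0.0],
    [0.0, 0.3, 1.0, 0.1],
    [0.0, 0.0, 0.1, 2.0],
])

# Raise the first index: h^alpha_mu = eta^{alpha beta} h_{beta mu}.
h_mixed = np.einsum('ab,bm->am', eta_inv, h)
[/code]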

2. Why is index raising important in linearized general relativity?

Index raising is important in linearized general relativity because the key quantities mix covariant and contravariant indices: building the Christoffel symbols above, taking the trace [itex]h = \eta^{\mu\nu}h_{\mu\nu}[/itex], and contracting indices in the field equations all require moving indices up or down.

3. How is index raising different from index lowering?

Index raising and index lowering are inverse operations. Raising contracts with the inverse metric [itex]\eta^{\alpha\beta}[/itex], while lowering contracts with the metric [itex]\eta_{\alpha\beta}[/itex] itself. This moves an index between the upper and lower positions; with the Minkowski metric the components of the affected slot at most change sign.
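
Continuing the numerical sketch from question 1 (same assumed arrays), lowering uses eta itself:

[code]
# Lower the index again: h_{alpha mu} = eta_{alpha beta} h^beta_mu.
h_back = np.einsum('ab,bm->am', eta, h_mixed)
[/code]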

4. Can index raising be applied to any tensor in linearized general relativity?

Yes, any index of any tensor in linearized general relativity can be moved: a lower (covariant) index is raised with the inverse metric and an upper (contravariant) index is lowered with the metric, one slot at a time, so mixed tensors such as [itex]h^{\alpha}_{\phantom{\alpha}\nu}[/itex] arise naturally.

5. Is index raising reversible in linearized general relativity?

Yes, index raising is reversible in linearized general relativity: raising an index and then lowering it returns the original tensor, because the metric and its inverse contract to the identity, [itex]\eta^{\alpha\beta}\eta_{\beta\gamma} = \delta^{\alpha}_{\phantom{\alpha}\gamma}[/itex].
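
This can be checked directly on the arrays from the sketch above:

[code]
# Raise-then-lower is the identity, since eta_inv @ eta is the Kronecker delta.
assert np.allclose(h_back, h)
[/code]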
