Linearized Gravity & Metric Perturbation when Indices Raised


Discussion Overview

The discussion revolves around the topic of linearized gravity, specifically focusing on the decomposition of the metric into a flat Minkowski metric plus a small perturbation. Participants explore the implications of raising indices on the metric and the perturbation, questioning the conventional results and the definitions involved in tensor index manipulation.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants express confusion regarding the relationship between the raised indices of the metric and the perturbation, particularly questioning the sign convention used in the expression for the inverse metric.
  • One participant suggests that the Minkowski metric has a predefined meaning that differs from the symbol obtained when raising indices, implying that this affects the interpretation of the tensor.
  • Another participant explains that the inverse metric is derived from the original metric and that the perturbation introduces complications that lead to a negative sign in the raised indices.
  • There is a discussion about whether raising indices on the perturbation should yield the same results as raising indices on the Minkowski metric, with some arguing that it does not due to the different metrics being used.
  • Participants also discuss the importance of using proper references for tensor manipulation rules, with some expressing skepticism about personal notes compared to established textbooks.
  • One participant mentions the need for clarity on the significance of the Minkowski metric's components in the context of linearized gravity.
  • Several participants share recommendations for textbooks on tensor analysis, indicating a desire for resources that include exercises and clear explanations.

Areas of Agreement / Disagreement

The thread does converge on a resolution: the minus sign arises because, in linearized gravity, indices are by convention raised and lowered with the Minkowski metric ##\eta## rather than the full metric ##g##, and the two differ by a quantity of first order in ##h##. The original poster accepts this explanation. Some disagreement remains over the reliability of personal notes compared with established textbooks.

Contextual Notes

There are unresolved questions about the assumptions underlying the manipulation of indices and the definitions of the metrics involved. The discussion highlights the complexity of tensor analysis in the context of linearized gravity and the potential for differing interpretations based on the conventions used.

Who May Find This Useful

This discussion may be useful for students and researchers interested in linearized gravity, tensor analysis, and the nuances of metric perturbations in general relativity.

George Keeling
TL;DR
##g^{\mu\nu}=\eta^{\mu\nu} \pm h^{\mu\nu}##?
I have just met linearized gravity, where we decompose the metric into a flat Minkowski metric plus a small perturbation,$$g_{\mu\nu}=\eta_{\mu\nu}+h_{\mu\nu},\qquad \left|h_{\mu\nu}\right|\ll1,$$from which we 'immediately' obtain$$g^{\mu\nu}=\eta^{\mu\nu}-h^{\mu\nu}.$$I don't obtain that. My rule book for tensor index manipulation says you can always raise all the indices in an equation, so I get$$g^{\mu\nu}=\eta^{\mu\nu}+h^{\mu\nu}.$$Here's how to get a minus sign. For small changes,$$\delta\left(g^{\mu\rho}g_{\rho\nu}\right)=g^{\mu\rho}\,\delta\left(g_{\rho\nu}\right)+\delta\left(g^{\mu\rho}\right)g_{\rho\nu}=0$$$$\Rightarrow g^{\mu\rho}\,\delta\left(g_{\rho\nu}\right)g^{\nu\sigma}=-\delta\left(g^{\mu\rho}\right)g_{\rho\nu}g^{\nu\sigma}=-\delta\left(g^{\mu\sigma}\right).$$Now if I set ##\delta g\rightarrow h## I get$$g^{\mu\rho}g^{\nu\sigma}h_{\rho\nu}=-h^{\mu\sigma}.$$So raising indices on ##h_{\mu\nu}## produces the minus sign! Why should I accept this exceptional result instead of the usual one, ##g^{\mu\rho}g^{\nu\sigma}h_{\rho\nu}=h^{\mu\sigma}##?

@haushofer proved the minus sign a different way here but that doesn't tell me why the 'normal' result is wrong.
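[Editor's note] The two candidate signs can be checked numerically. This is a sketch, not from the thread; it assumes a small random symmetric perturbation of the diagonal Minkowski metric and compares the exact inverse metric against both first-order guesses:

```python
import numpy as np

# Minkowski metric with signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
eta_inv = np.linalg.inv(eta)  # numerically equal to eta itself

# small symmetric perturbation h_{mu nu}
rng = np.random.default_rng(0)
h = 1e-6 * rng.standard_normal((4, 4))
h = (h + h.T) / 2

g = eta + h               # g_{mu nu} = eta_{mu nu} + h_{mu nu}
g_inv = np.linalg.inv(g)  # exact g^{mu nu}

# h^{mu nu} = eta^{mu a} eta^{nu b} h_{ab}, i.e. both indices raised with eta
h_up = eta_inv @ h @ eta_inv

# minus sign: residual is O(h^2) ~ 1e-12; plus sign: residual is O(h) ~ 1e-6
err_minus = np.abs(g_inv - (eta_inv - h_up)).max()
err_plus = np.abs(g_inv - (eta_inv + h_up)).max()
print(err_minus < 1e-10, err_plus > 1e-7)  # minus agrees to first order, plus does not
```

The point of the check: with ##h^{\mu\nu}## raised by ##\eta##, only ##g^{\mu\nu}=\eta^{\mu\nu}-h^{\mu\nu}## matches the exact inverse to first order.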
 
George Keeling said:
Summary:: ##g^{\mu\nu}=\eta^{\mu\nu} \pm h^{\mu\nu}##?

I don't obtain that. In my rule book for tensor index manipulation it says you can always raise all the indices on an equation
This is wrong in this case because ##\eta^{\mu\nu}## has a predefined meaning that is different from the symbol you get when you raise the indices with the metric g. It is therefore not a tensor in this case.
 
It's because ##g^{\mu \nu}## is obtained from ##g_{\mu \nu}## as the inverse matrix, i.e., via
$$g^{\mu \nu} g_{\nu \rho}=\delta_{\rho}^{\mu}.$$
Now make
$$g_{\nu \rho}=\eta_{\nu \rho}+h_{\nu \rho}$$
and
$$g^{\mu \nu}=\eta^{\mu \nu}+\tilde{h}^{\mu \nu},$$
where
$$(\eta_{\mu \nu})=(\eta^{\mu \nu})=\mathrm{diag}(1,-1,-1,-1).$$
From this you get, neglecting the term in second order of ##h_{\mu \nu}##,
$$g^{\mu \nu} g_{\nu \rho} = (\eta^{\mu \nu}+\tilde{h}^{\mu \nu})(\eta_{\nu \rho} + h_{\nu \rho})
=\delta_{\rho}^{\mu}+{h^{\mu}}_{\rho} + {\tilde{h}^{\mu}}_{\rho} +\ldots \stackrel{!}{=} \delta_{\rho}^{\mu}.$$
Note that here by definition the raising and lowering of indices is done with the Minkowski metric ##\eta_{\mu \nu}## or ##\eta^{\mu \nu}## respectively. We get
$${\tilde{h}^{\mu}}_{\rho} = -{h^{\mu}}_{\rho}$$
or raising the 2nd index again with ##\eta^{\rho \nu}##:
$$\tilde{h}^{\mu \nu}=-h^{\mu \nu}.$$
So the additional minus sign comes from the extra convention to lower and raise indices with ##\eta_{\mu \nu}## and ##\eta^{\mu \nu}##, respectively, when working within "linearized GR".
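[Editor's note] The first-order relation ##\tilde{h}^{\mu\nu}=-h^{\mu\nu}## can also be verified symbolically. A 2×2 toy version (a sketch, not from the thread; ##\epsilon## is a bookkeeping parameter and a, b, c are arbitrary symbols):

```python
import sympy as sp

eps, a, b, c = sp.symbols('epsilon a b c')
eta = sp.diag(1, -1)  # 2D analogue of the (+,-,-,-) convention

# symmetric first-order perturbation h_{mu nu}
h = eps * sp.Matrix([[a, b], [b, c]])

g = eta + h
g_inv = g.inv()

# tilde-h with both indices up, defined by g^{mu nu} = eta^{mu nu} + tilde-h^{mu nu}
h_tilde = g_inv - eta.inv()

# h with both indices raised by eta
h_up = eta.inv() * h * eta.inv()

# to first order in epsilon, tilde-h^{mu nu} + h^{mu nu} should vanish
diff = (h_tilde + h_up).applyfunc(lambda e: sp.series(e, eps, 0, 2).removeO())
print(sp.simplify(diff) == sp.zeros(2, 2))  # True: tilde-h = -h^{mu nu} + O(eps^2)
```

Here the exact inverse supplies ##\tilde{h}##, and truncating at first order reproduces the minus sign without any hand expansion.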
 
Orodruin said:
It is therefore not a tensor in this case.
Raising indices on ##g_{\mu\nu}## and ##\eta_{\mu\nu}## worked but on ##h_{\mu\nu}## didn't. Is ##h_{\mu\nu}## the It you refer to? (Edit: No, it's ##\eta_{\mu\nu}## or ##{\widetilde{\eta}}^{\mu\nu},## see below)
 
vanhees71 said:
Note that here by definition the raising and lowering of indices is done with the "Minkowski metrix"
When raising indices on ##h## I don't think it makes any difference whether you use ##g## or ##\eta##, or whether there's a + or a - in front of ##h^{\mu\nu}##, because nearly everything is second order or smaller: $$g^{\mu\sigma}g^{\nu\rho}h_{\sigma\rho}=\left(\eta^{\mu\sigma}\pm h^{\mu\sigma}\right)\left(\eta^{\nu\rho}\pm h^{\nu\rho}\right)h_{\sigma\rho}=\eta^{\mu\sigma}\eta^{\nu\rho}h_{\sigma\rho}\pm\eta^{\mu\sigma}h^{\nu\rho}h_{\sigma\rho}\ldots$$
The proof you give is much like the one I gave. Yours raises and lowers indices on ##{\widetilde{h}}^{\mu\nu}## and ##h_{\mu\nu}## in the normal way. Why is it suddenly OK to do that?
I'm still confused. 😥
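[Editor's note] The second-order claim made here is itself easy to confirm numerically. A sketch, not from the thread, assuming a small random symmetric ##h##: raising the indices of ##h_{\mu\nu}## with ##g## or with ##\eta## gives results that differ only at ##O(h^2)##.

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
eta_inv = np.linalg.inv(eta)

# small random symmetric perturbation, entries of order 1e-4
rng = np.random.default_rng(1)
h = 1e-4 * rng.standard_normal((4, 4))
h = (h + h.T) / 2

g = eta + h
g_inv = np.linalg.inv(g)

# raise both indices of h_{mu nu} once with g and once with eta
h_up_g = g_inv @ h @ g_inv
h_up_eta = eta_inv @ h @ eta_inv

# relative difference is O(h): the choice of metric only matters at second order
rel = np.abs(h_up_g - h_up_eta).max() / np.abs(h_up_eta).max()
print(rel < 1e-2)
```

So for the perturbation itself the choice of raising metric is invisible at the order kept in linearized gravity; the sign question only bites for the full inverse metric.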
 
George Keeling said:
Raising indices on ##g_{\mu\nu}## and ##\eta_{\mu\nu}## worked but on ##h_{\mu\nu}## didn't. Is ##h_{\mu\nu}## the It you refer to?
Raising the indices on ##g## and ##\eta## does not use the same metric. That is the entire point. They differ by a quantity of order h.
 
Consider the one-dimensional manifold with metric ##g= 1+h##. The inverse metric is then ##1/g = 1/(1+h) = 1-h + \mathcal O(h^2)##. This already tells you that the + sign in the inverse is wrong.
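[Editor's note] The one-dimensional example, spelled out numerically (a sketch, not from the thread):

```python
# one-dimensional "metric" g = 1 + h: compare the exact inverse with both candidates
h = 1e-3
exact = 1.0 / (1.0 + h)   # exact inverse metric
minus = 1.0 - h           # 1 - h: agrees to O(h^2)
plus = 1.0 + h            # 1 + h: wrong already at O(h)
print(abs(exact - minus), abs(exact - plus))  # ~1e-6 vs ~2e-3
```

The residual of ##1-h## is ##h^2/(1+h)##, second order, while ##1+h## misses by ##2h## at leading order.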
 
George Keeling said:
In my rule book for tensor index manipulation
What rule book? Please give a specific reference.
 
Orodruin said:
Raising the indices on g and η does not use the same metric. That is the entire point. They differ by a quantity of order h.
At last I get it:
we can raise indices on$$g_{\mu\nu}=\eta_{\mu\nu}+h_{\mu\nu}$$and get$$g^{\mu\nu}=\eta^{\mu\nu}+h^{\mu\nu},$$but the ##\eta^{\mu\nu}## in that equation is the symbol obtained by raising indices with the full metric ##g##. So its components are not the usual$${\widetilde{\eta}}^{\mu\nu}=\left(\begin{matrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\\\end{matrix}\right)$$but something slightly different. We want to use ##{\widetilde{\eta}}^{\mu\nu}##, not ##\eta^{\mu\nu}##. So we'll also have to replace ##h^{\mu\nu}## by an unknown tensor ##{\widetilde{h}}^{\mu\nu}##, and we get$$g^{\mu\nu}={\widetilde{\eta}}^{\mu\nu}+{\widetilde{h}}^{\mu\nu},$$which is the same as
vanhees71 said:
##g^{\mu \nu}=\eta^{\mu \nu}+\tilde{h}^{\mu \nu},##
give or take a tilde.
I had also missed the significance of the following line
vanhees71 said:
##(η_{μν})=(η^{μν})=diag(1,−1,−1,−1).##
and I am using the opposite sign convention (and using up more screen space). With that noted, I can just follow vanhees71.

Thank you both for your patience!
 
PeterDonis said:
What rule book? Please give a specific reference.
It's my collected notes on various rules about tensors. You probably wouldn't call it a proper book!
 

George Keeling said:
It's my collected notes on various rules about tensors.
Since you open with a claim about there being an error in Carroll, and your claim is erroneous, you should not be depending on these to be accurate. I would strongly suggest using a textbook on tensor analysis as a reference if you want to be sure you're using the right rules. I would also strongly suggest asking for input from others before treating any claim about an error in a published textbook as accurate.

George Keeling said:
You probably wouldn't call it a proper book!
Indeed not.
 
Well, I think it's a good attitude not to blindly believe what's printed in textbooks or even peer-reviewed papers. If I had a recipe against typos, misconceptions, and other errors, I'd have made a lot of money selling the method ;-)). Of course, it's likely that one has misunderstood the textbook, but if you can't rederive some result in it, it's good to question whether the author of the textbook or paper may be in error and to look for other treatments of the subject.
 
PeterDonis said:
Since you open with a claim about there being an error in Carroll,
Sorry if it looked like that. I should have emphasised that it was me being so thick that I couldn't understand.
 
One more thing
PeterDonis said:
I would strongly suggest using a textbook on tensor analysis
Can you recommend such a textbook with lots of exercises?
 
George Keeling said:
Can you recommend such a textbook with lots of exercises?
If you mean a math textbook, I can't recommend one since my knowledge of tensor analysis comes from physics textbooks. The one I first learned it from was MTW, but that might be too heavy (both in the literal and figurative sense--the book weighs more than any other book in my library by a fair margin, excepting only the unabridged dictionary and the CRC). Carroll's book (or his free online lecture notes) might be a better, lighter introduction from a physics viewpoint, but it might not meet the "lots of exercises" part.
 
I know which textbook I would suggest ... but I am quite biased on the matter ;)
 
That I can warmly suggest:

https://www.taylorfrancis.com/books...l-methods-physics-engineering-mattias-blennow

I learned GR first for myself from Landau&Lifshitz vol. 2. That provides the necessary math in a very efficient manner; it is, however, restricted to holonomic bases. Then I got MTW and also learned the more modern Cartan approach. It emphasizes the geometrical interpretation ("geometrodynamics"). If you want less geometry and a more physical approach, Weinberg's book (1971) is a gem (as are all textbooks by Weinberg).
 
Strangely, I have three of the books mentioned! So I guess I have plenty to read and study. Thanks again.
 
