# Evaluating metric tensor in a primed coordinate system

1. Dec 5, 2017

### vibhuav

I am trying to learn GR. Two of my books on tensors give an example of evaluating the inertia tensor in a primed coordinate system (for example, a rotated one) from that in an unprimed coordinate system using the equation $I' = R I R^{-1}$, where $R$ is the transformation matrix, $R^{-1}$ is its inverse, and $I$ is the inertia matrix.

Is this method of transformation of (inertia) tensor from one coordinate system to another applicable for all tensors? In particular, can I use this method to evaluate the metric tensor in the primed coordinate system, given the metric tensor in the unprimed system and the transformation matrix?

Assuming it is applicable, I attempted to evaluate the metric tensor in the Cartesian coordinate system from the 2D polar system (using $g'(\text{Cart}) = T\, g(\text{polar})\, T^{-1}$, where $T$ is the transformation matrix from polar to Cartesian), and was expecting an identity matrix with diagonal elements 1 and all others 0.

I am evaluating the metric tensor as follows:

$$g'(\text{Cart}) =
\begin{bmatrix}
\cos\theta & -r\sin\theta\\
\sin\theta & r\cos\theta \end{bmatrix} \times
\begin{bmatrix}
1 & 0\\
0 & r^2 \end{bmatrix} \times
\begin{bmatrix}
\cos\theta & \sin\theta\\
-\frac{\sin\theta}{r} & \frac{\cos\theta}{r} \end{bmatrix}$$

Instead of the identity matrix for the metric tensor in Cartesian coordinates, I ended up with:
$$\begin{bmatrix}
\cos^2\theta + r^2 \sin^2\theta & \cos\theta\sin\theta - r^2 \cos\theta\sin\theta\\
\cos\theta\sin\theta - r^2 \cos\theta\sin\theta & \sin^2\theta + r^2 \cos^2\theta \end{bmatrix}$$
Almost there, but not quite; the $r^2$ is messing it up.

What am I missing? Since the Euclidean space is flat, the Cartesian coordinate system has an identity matrix as a metric tensor, right?
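(The attempted computation above can be reproduced numerically; this is a minimal NumPy sketch, not part of the original post, evaluated at a sample point with $r \neq 1$ to make the discrepancy visible:)

```python
import numpy as np

r, theta = 2.0, 0.7  # sample polar point; any r != 1 exposes the problem

# Jacobian of the polar -> Cartesian map (the matrix T from the post)
T = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])

g_polar = np.diag([1.0, r**2])  # metric components in polar coordinates

# The attempted transformation T g T^{-1} ...
g_attempt = T @ g_polar @ np.linalg.inv(T)

# ... is not the identity when r != 1, matching the symbolic result above
print(np.round(g_attempt, 6))
```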

2. Dec 5, 2017

### Staff: Mentor

Yes.

It looks like you are multiplying the matrices left to right. Try multiplying them right to left. (Note that right to left multiplication is the usual convention.)

Yes.

3. Dec 5, 2017

### Staff: Mentor

Actually, the grouping (left first vs. right first) can't matter because matrix multiplication is associative. But I was able to get the right answer (the identity matrix) by first multiplying the right two factors, then multiplying the result by the left factor. So I'm not sure what is going wrong by doing it the other way.

4. Dec 5, 2017

### Ibix

Shouldn't that be $R^T$, not $R^{-1}$? Because if we expect that $R g R^{-1} = I$ then it follows that $g = R^{-1} I R$ and hence $g = I$.

5. Dec 5, 2017

### Orodruin

Staff Emeritus
First of all, a tensor is not a matrix or vice versa. A rank two tensor can be represented by one, but it really is a different object.

The transformation rule you quote is valid only for rank-two tensors that have one covariant and one contravariant index. In the case of the inertia tensor, you are likely using Cartesian coordinates, and then index placement really does not matter. Hence, your $I' = R I R^{-1}$ is not the general rule, but the special case in which it holds regardless of index placement. In the case where you have two contravariant indices, the matrix representation will instead be of the form
$$T' = R T R^T,$$
where $R^T$ is the transpose of $R$, not its inverse. Note that, in the case of rotations, $R^T = R^{-1}$ and the transformation rule indeed is the same. For the case of two covariant indices, you would instead get
$$T' = (R^{-1})^T T R^{-1}.$$
These relations follow directly from the more illustrative
$$\newcommand{\dd}[2]{\frac{\partial #1}{\partial #2}} T'^{ab} = \dd{x'^a}{x^c} \dd{x'^b}{x^d} T^{cd} \quad \mbox{and} \quad T'_{ab} = \dd{x^c}{x'^a} \dd{x^d}{x'^b} T_{cd}.$$
Note that the components of $R$ are $\partial x'^a/\partial x^c$ and those of $R^{-1}$ are $\partial x^c/\partial x'^a$.
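(As a concrete illustration of that component rule, not part of the original post, the double contraction $T'_{ab} = \frac{\partial x^c}{\partial x'^a}\frac{\partial x^d}{\partial x'^b} T_{cd}$ can be written as a single `einsum` in NumPy, here applied to the polar metric at a sample point:)

```python
import numpy as np

r, theta = 2.0, 0.7  # sample polar point

# R[a, c] = dx'^a / dx^c for the polar -> Cartesian map x' = (x, y)
R = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])
Rinv = np.linalg.inv(R)  # Rinv[c, a] = dx^c / dx'^a

G_polar = np.diag([1.0, r**2])  # g_cd in polar coordinates

# T'_{ab} = (dx^c/dx'^a)(dx^d/dx'^b) T_{cd}: contract both covariant
# indices with the inverse Jacobian (sum over c and d)
G_cart = np.einsum('ca,db,cd->ab', Rinv, Rinv, G_polar)

print(np.round(G_cart, 6))  # identity matrix, as expected in Cartesian coordinates
```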

If you want to consider the appropriate transformation rule for the metric with $R G R^{-1}$, then you need to raise one of the covariant indices, which turns the metric into the Kronecker delta, represented by the identity matrix $I$. You would then obtain
$$R I R^{-1} = RR^{-1} = I,$$
which is correct as the Kronecker delta is represented by the identity matrix in all coordinate systems. For your metric tensor transformation, you would obtain (both indices are covariant)
$$G' = \begin{pmatrix} \cos\theta & \sin\theta\\ -r\sin\theta & r\cos\theta \end{pmatrix}^{-1}\begin{pmatrix} 1 & 0 \\ 0 & r^2\end{pmatrix} \begin{pmatrix} \cos\theta & -r \sin\theta\\ \sin\theta & r \cos\theta \end{pmatrix}^{-1} = \begin{pmatrix} \cos\theta & -\frac 1r \sin\theta\\ \sin\theta & \frac 1r \cos\theta \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & r^2\end{pmatrix} \begin{pmatrix} \cos\theta & \sin\theta\\ -\frac 1r \sin\theta & \frac 1r \cos\theta \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$

Edit: Clearly I started writing this before the previous posts so in no way should it be taken as a critique of those. Personally, I try to stay away from matrix representations as much as possible as the actual transformation rules tend to get lost. Only use it if you are working in Cartesian coordinates (i.e., you are only interested in rotations) or know precisely what you are doing.

6. Dec 5, 2017

### vibhuav

Great answer; thanks a lot! There are so many details to know, and you shed some light for me on how contravariant and covariant indices are to be handled. As you said,
$T'^{ab} = \frac{\partial x'^a}{\partial x^c} \frac{\partial x'^b}{\partial x^d} T^{cd}$ and $T'_{ab} = \frac{\partial x^c}{\partial x'^a} \frac{\partial x^d}{\partial x'^b} T_{cd}$ are more illustrative.

7. Dec 5, 2017

### Orodruin

Staff Emeritus
Happy to be of help.

To be honest, the word "illustrative" might have been the wrong one to use. "Appropriate" or "illuminating" might have been better choices. Also, it makes sense to point out explicitly that for the case $R^T = R^{-1}$, you would also have the covariant transformation turning into the one you quoted in the OP as then
$$(R^{-1})^T T R^{-1} = (R^T)^T T R^{-1} = R T R^{-1}.$$
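(A quick numerical check of that identity, not part of the original post: for an orthogonal $R$, the covariant rule $(R^{-1})^T T R^{-1}$ and the mixed rule $R T R^{-1}$ agree for an arbitrary symmetric component matrix:)

```python
import numpy as np

phi = 0.4
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])  # a rotation, so R^T = R^{-1}

T = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # arbitrary symmetric tensor components

cov = np.linalg.inv(R).T @ T @ np.linalg.inv(R)  # covariant (two lower indices) rule
mixed = R @ T @ np.linalg.inv(R)                 # mixed-index rule from the OP

print(np.allclose(cov, mixed))  # True, precisely because R is orthogonal
```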

On an unrelated note, I could not help noticing that your $\LaTeX$ typesetting of the matrix elements was "r\ sin\theta" (etc.), which produces $r\ sin\theta$. I understand that you wanted the whitespace to separate the $sin$ from the $r$, but there is a more technically correct way of doing it: functions such as sine, cosine, the exponential function, etc., should be typeset with the function name in regular (upright) font rather than in the math-mode variable font. When you write "rsin\theta", $\LaTeX$ interprets it all as a string of five variables and typesets it as such. If you instead use "r\sin\theta", the sine function will be typeset appropriately as $r\sin\theta$. Many functions like this are predefined, but if you find one missing you can add your own using the \DeclareMathOperator command from the amsmath package (essentially, always include amsmath ...). Normally I would not bother pointing it out, but from your efforts it seems as if you did pay some attention to how the $\LaTeX$ appeared, and it is a useful thing to know when (like me) you are a bit of a $\LaTeX$ perfectionist.

8. Dec 5, 2017

### vibhuav

Ah yes, and good observation!

I loved typesetting math equations with $\LaTeX$ when I was in college, almost 25 years ago, but lately I don't use it. Being able to typeset with $\LaTeX$ is one of the perks of Physics Forums, and I tried to recollect as much as I could while writing the original post, but obviously I have forgotten many details. Maybe I'll bring out my book and keep it handy; thanks for goading me towards that!