Tensors: raising and lowering indices


Irid
From Carroll's textbook:

1. The problem statement
Imagine we have a tensor [tex]X^{\mu \nu}[/tex] with components

[tex]X^{\mu \nu} = \begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}[/tex]

Find the components of: (a) [tex]{X^\mu}_\nu[/tex]; (b) [tex]{X_\mu}^\nu[/tex].

2. The attempt at a solution
I don't really understand what I am doing here; I'm just following examples from the textbook. First, temporarily rename

[tex]X^{\mu \nu} \rightarrow X^{\mu \sigma}[/tex]

Then, lower an index:

[tex]{X^\mu}_\nu = \eta_{\nu \sigma} X^{\mu \sigma}[/tex]

where [tex]\eta_{\nu \sigma}[/tex] is the metric. Multiply the matrices:

[tex]{X^\mu}_\nu = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]

Then, for part (b), rename

[tex]X^{\mu \nu} \rightarrow X^{\sigma \nu}[/tex]

and lower another index:

[tex]{X_\mu}^\nu = \eta_{\mu \sigma} X^{\sigma \nu}[/tex]

However, in matrix notation it's the same procedure, so the answer for (b) is

[tex]{X_\mu}^\nu = \begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]

3. Questions
If the matrices are the same, what is the difference between these two tensors: (a) [tex]{X^\mu}_\nu[/tex]; (b) [tex]{X_\mu}^\nu[/tex]?
As I understand, I had a (2,0) tensor to begin with, multiplied it by the metric (0,2) to obtain two (1,1) tensors. What is that good for? What's the difference in the rank of tensors if the matrices look similar, only a few components change?
 
Irid said:
[tex]{X^\mu}_\nu = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]
This is the correct multiplication to perform. However, you have not performed it correctly.

Raising and lowering of indices is useful for balancing equations. For an equation to hold, the indices that are not contracted must agree on both sides of the equation. Sometimes it will be necessary to swap an upper and lower index in order to get this to happen. In special relativity, the metric [tex]\eta[/tex] used to raise and lower is somewhat boring. However, in general relativity a more interesting metric, usually denoted [tex]g[/tex], is used; [tex]\eta[/tex] is a special case of this metric [tex]g[/tex]. When you start working with it, the practice you get by using the simpler [tex]\eta[/tex] will come in handy.
 
Okay, you're just making a small mistake when you're turning index notation into matrix notation.

You're right that [tex]X^{\mu}{}_{\nu}=\eta_{\nu \sigma}X^{\mu \sigma}[/tex]

Now since we're writing this in terms of indices, it doesn't matter which order you write [tex]\eta_{\nu \sigma}[/tex] and [tex]X^{\mu \sigma}[/tex] in (as long as you keep the indices in place, of course).

So you can write [tex]X^{\mu}{}_{\nu}=\eta_{\nu \sigma}X^{\mu \sigma}=X^{\mu \sigma}\eta_{\nu \sigma}=X^{\mu \sigma}\eta_{\sigma \nu}[/tex]
(where the last step follows because [tex]\eta_{\nu \sigma}[/tex] is symmetric)

And this is matrix multiplication in the order [tex]X^{\mu}{}_{\nu}=[X][\eta][/tex]
since you sum over columns in [tex][X][/tex] and rows in [tex][\eta][/tex] (it may help to write out the product explicitly to see this).
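For instance, writing one component out explicitly (just expanding the sum over [tex]\sigma[/tex], nothing new assumed):

[tex]{X^\mu}_\nu = X^{\mu 0}\eta_{0 \nu} + X^{\mu 1}\eta_{1 \nu} + X^{\mu 2}\eta_{2 \nu} + X^{\mu 3}\eta_{3 \nu}[/tex]

which is row [tex]\mu[/tex] of [tex][X][/tex] dotted into column [tex]\nu[/tex] of [tex][\eta][/tex] - exactly the [tex](\mu, \nu)[/tex] entry of the product [tex][X][\eta][/tex].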

Now, the other way you did correctly since [tex]X_{\mu}{}^{\nu}=\eta_{\mu \sigma}X^{\sigma \nu}=[\eta][X][/tex]

So, the matrices should not be the same (as you probably expected). The difference is in what basis the components are with respect to. [tex]X^{\mu}{}_{\nu}[/tex] are the components of a (1,1) tensor in the [tex]\hat{e}\otimes\hat{\theta}[/tex] basis (to use Carroll's notation - pg. 21), and [tex]X_{\mu}{}^{\nu}[/tex] are the components of the same (1,1) tensor in the [tex]\hat{\theta}\otimes\hat{e}[/tex] basis. Note these are the same tensor expressed in different bases. You can also make a (0,2) tensor by lowering both indices of [tex]X^{\mu \sigma}[/tex]. The (0,2), (1,1), and original (2,0) tensors are all different objects labeled by the same letter - it's the index placement that differentiates them.
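As a quick numerical sanity check of the two orderings (a minimal sketch, not from the thread, assuming only NumPy):

[code]
import numpy as np

# Minkowski metric, signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# Given components of X^{mu nu}
X = np.array([[ 2.0, 0.0, 1.0, -1.0],
              [-1.0, 0.0, 3.0,  2.0],
              [-1.0, 1.0, 0.0,  0.0],
              [-2.0, 1.0, 1.0, -2.0]])

X_ul = X @ eta   # X^mu_nu : lowering the second index flips the first column
X_lu = eta @ X   # X_mu^nu : lowering the first index flips the first row

print(np.array_equal(X_ul, X_lu))  # False - the two component matrices differ
[/code]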

edit: hah, it took me too long to type this - you beat me jimmy!
 
OK, so I now see that (a):

[tex]{X^\mu}_\nu = \eta_{\nu \sigma} X^{\mu \sigma} = [X][\eta] = \begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}\begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & 1 & -1\\
1 & 0 & 3 & 2\\
1 & 1 & 0 & 0\\
2 & 1 & 1 & -2
\end{pmatrix}[/tex]

and (b):

[tex]{X_\mu}^\nu = \eta_{\mu \sigma} X^{\sigma \nu} = [\eta][X] = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
2 & 0 & 1 & -1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}=\begin{pmatrix}
-2 & 0 & -1 & 1\\
-1 & 0 & 3 & 2\\
-1 & 1 & 0 & 0\\
-2 & 1 & 1 & -2
\end{pmatrix}[/tex]
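Writing the same contractions index-by-index reproduces both matrix products (a sketch, not from the thread, assuming NumPy's einsum):

[code]
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
X = np.array([[ 2.0, 0.0, 1.0, -1.0],
              [-1.0, 0.0, 3.0,  2.0],
              [-1.0, 1.0, 0.0,  0.0],
              [-2.0, 1.0, 1.0, -2.0]])

# (a) X^mu_nu = eta_{nu sigma} X^{mu sigma}, contracting sigma explicitly
a = np.einsum('ns,ms->mn', eta, X)
# (b) X_mu^nu = eta_{mu sigma} X^{sigma nu}
b = np.einsum('ms,sn->mn', eta, X)

print(np.array_equal(a, X @ eta))  # True: (a) is [X][eta]
print(np.array_equal(b, eta @ X))  # True: (b) is [eta][X]
[/code]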

Thus, the order of matrices in multiplication is determined by the position of the index being raised (or lowered). Suppose I wanted to raise an index, then

[tex]{X^\mu}_\nu = \eta^{\mu \sigma} X_{\sigma \nu} = [\eta] [X]\, ?[/tex]

Also, why is the metric such a special tensor that it's used to lower and raise these indices?
 
Irid said:
Also, why is the metric such a special tensor that it's used to lower and raise these indices?
Given [tex]A^{\mu}[/tex], by definition [tex]A_0 = -A^0, A_i = A^i[/tex]. [tex]\eta[/tex] is the matrix that accomplishes this transformation.
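Spelled out as a matrix acting on a column of components (just restating the definition above):

[tex]A_\mu = \eta_{\mu \nu} A^\nu = \begin{pmatrix}
-1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}\begin{pmatrix}
A^0\\ A^1\\ A^2\\ A^3
\end{pmatrix} = \begin{pmatrix}
-A^0\\ A^1\\ A^2\\ A^3
\end{pmatrix}[/tex]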
 
Starts getting clearer...

OK, what if the index being raised is in the middle, such as

[tex]{{X_{\mu}}^\nu}_\rho = \eta^{\nu \sigma} X_{\mu \sigma \rho} = [\eta][X] \quad \text{OR} \quad [X][\eta][/tex]

or is this operation simply not allowed?
 
Another question: how do I symmetrize [tex]X^{\mu \nu}[/tex] to get [tex]X^{(\mu \nu)}[/tex]? I tried lowering both indices and then raising them back in a different order, but eventually the metric and inverse metric just cancel and I obtain [tex]X^{\mu \nu} = X^{\nu \mu}[/tex]. I don't know how to obtain [tex]X^{\nu \mu}[/tex].
 
Irid said:
Starts getting clearer...

OK, what if the index being raised is in the middle, such as

[tex]{{X_{\mu}}^\nu}_\rho = \eta^{\nu \sigma} X_{\mu \sigma \rho} = [\eta][X] \quad \text{OR} \quad [X][\eta][/tex]

or is this operation simply not allowed?

It's allowed. You just can't write it as a matrix multiplication, since X isn't a matrix. But if you know all of the components of X and eta, you wouldn't have any trouble computing any component of the raised-index tensor, would you?
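For concreteness, that component-by-component contraction is easy to spell out (a sketch assuming NumPy; the rank-3 array X3 is a hypothetical placeholder, since no components were given in the thread):

[code]
import numpy as np

# Inverse metric eta^{mu nu}; numerically the same diagonal matrix in SR
eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])

# Hypothetical components of X_{mu sigma rho} (none were given in the thread)
X3 = np.zeros((4, 4, 4))

# X_mu^nu_rho = eta^{nu sigma} X_{mu sigma rho}: contract over the middle slot
X_raised = np.einsum('ns,msr->mnr', eta_inv, X3)
print(X_raised.shape)  # (4, 4, 4) - still rank 3, just not a matrix product
[/code]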
 
Irid said:
Another question: how do I symmetrize [tex]X^{\mu \nu}[/tex] to get [tex]X^{(\mu \nu)}[/tex]? I tried lowering both indices and then raising them back in a different order, but eventually the metric and inverse metric just cancel and I obtain [tex]X^{\mu \nu} = X^{\nu \mu}[/tex]. I don't know how to obtain [tex]X^{\nu \mu}[/tex].

It's [itex](X^{\mu \nu} + X^{\nu \mu})/2[/itex]. There's nothing to 'compute' until you have more information about X. E.g. [itex]X^{(01)}=(X^{01} + X^{10})/2[/itex].
 
Dick said:
It's [itex](X^{\mu \nu} + X^{\nu \mu})/2[/itex]. There's nothing to 'compute' until you have more information about X. E.g. [itex]X^{(01)}=(X^{01} + X^{10})/2[/itex].

Well, I know the matrix representation of X. Isn't that enough?
 
Irid said:
Well, I know the matrix representation of X. Isn't that enough?

Then just add the matrix of X and the transpose of the matrix of X and divide by 2.
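A minimal sketch of that recipe (assuming NumPy and the matrix from the problem statement):

[code]
import numpy as np

X = np.array([[ 2.0, 0.0, 1.0, -1.0],
              [-1.0, 0.0, 3.0,  2.0],
              [-1.0, 1.0, 0.0,  0.0],
              [-2.0, 1.0, 1.0, -2.0]])

# X^{(mu nu)} = (X^{mu nu} + X^{nu mu}) / 2
X_sym = (X + X.T) / 2

print(X_sym[0, 1])  # X^{(01)} = (0 + (-1)) / 2 = -0.5
[/code]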
 
Yeah, I suspected that

[tex]\left[ X^{\nu \mu} \right] = \left[ X^{\mu \nu} \right]^T[/tex]

but you can never be sure... OK, enough help to solve the problem, I hope I'll understand more as I learn about the subject further. I'm sure I'll post some more newbie questions shortly :))
 
