
Tensors: raising and lowering indices

  1. Sep 7, 2008 #1
    From Carroll's textbook:

    1. The problem statement
    Imagine we have a tensor [tex]X^{\mu \nu}[/tex] with components

    [tex]X^{\mu \nu} = \begin{pmatrix}
    2 & 0 & 1 & -1\\
    -1 & 0 & 3 & 2\\
    -1 & 1 & 0 & 0\\
    -2 & 1 & 1 & -2
    \end{pmatrix}[/tex]

    Find the components of: (a) [tex]{X^\mu}_\nu[/tex]; (b) [tex]{X_\mu}^\nu[/tex].


    2. The attempt at a solution
    I don't really understand what I'm doing here; I'm just following examples from the textbook. First, temporarily rename the dummy index:

    [tex]X^{\mu \nu} \rightarrow X^{\mu \sigma}[/tex]

    Then, lower an index:

    [tex]{X^\mu}_\nu = \eta_{\nu \sigma} X^{\mu \sigma}[/tex]

    where [tex]\eta_{\nu \sigma}[/tex] is the metric. Multiplying the matrices:

    [tex] {X^\mu}_\nu = \begin{pmatrix}
    -1 & 0 & 0 & 0\\
    0 & 1 & 0 & 0\\
    0 & 0 & 1 & 0\\
    0 & 0 & 0 & 1
    \end{pmatrix}\begin{pmatrix}
    2 & 0 & 1 & -1\\
    -1 & 0 & 3 & 2\\
    -1 & 1 & 0 & 0\\
    -2 & 1 & 1 & -2
    \end{pmatrix}=\begin{pmatrix}
    -2 & 0 & 1 & -1\\
    1 & 0 & 3 & 2\\
    1 & 1 & 0 & 0\\
    2 & 1 & 1 & -2
    \end{pmatrix}[/tex]

    Then, for part (b), rename

    [tex]X^{\mu \nu} \rightarrow X^{\sigma \nu}[/tex]

    and lower another index:

    [tex]{X_\mu}^\nu = \eta_{\mu \sigma} X^{\sigma \nu}[/tex]

    However, in matrix notation it's the same procedure, so the answer for (b) is

    [tex]{X_\mu}^\nu = \begin{pmatrix}
    -2 & 0 & 1 & -1\\
    1 & 0 & 3 & 2\\
    1 & 1 & 0 & 0\\
    2 & 1 & 1 & -2
    \end{pmatrix}[/tex]

    3. Questions
    If the matrices are the same, what is the difference between these two tensors: (a) [tex]{X^\mu}_\nu[/tex]; (b) [tex]{X_\mu}^\nu[/tex]?
    As I understand it, I had a (2,0) tensor to begin with and multiplied it by the (0,2) metric to obtain two (1,1) tensors. What is that good for? What's the difference between tensors of different rank if the matrices look similar, with only a few components changing sign?
     
  3. Sep 8, 2008 #2
    This is the correct multiplication to perform. However, you have not performed it correctly.

    Raising and lowering of indices is useful for balancing equations. For an equation to hold, the indices that are not contracted must agree on both sides of the equation. Sometimes it will be necessary to swap an upper and lower index in order to get this to happen. In special relativity, the metric [tex]\eta[/tex] used to raise and lower indices is somewhat boring. However, in general relativity a more interesting metric, usually denoted [tex]g[/tex], is used. [tex]\eta[/tex] is a special case of this metric [tex]g[/tex]. When you start working with it, the practice you get by using the simpler [tex]\eta[/tex] will come in handy.
     
  4. Sep 8, 2008 #3
    Okay, you're just making a small mistake when you're turning index notation into matrix notation.

    You're right that [tex]X^{\mu}{}_{\nu}=\eta_{\nu \sigma}X^{\mu \sigma}[/tex]

    Now since we're writing this in terms of indices, it doesn't matter which order you write [tex]\eta_{\nu \sigma}[/tex] and [tex]X^{\mu \sigma}[/tex] in (as long as you write in the indices of course).

    So you can write [tex]X^{\mu}{}_{\nu}=\eta_{\nu \sigma}X^{\mu \sigma}=X^{\mu \sigma}\eta_{\nu \sigma}=X^{\mu \sigma}\eta_{\sigma \nu}[/tex]
    (where the last step follows because [tex]\eta_{\nu \sigma}[/tex] is symmetric)

    And this is matrix multiplication in the order [tex]X^{\mu}{}_{\nu}=[X][\eta][/tex]
    since you sum over columns in [tex][X][/tex] and rows in [tex][\eta][/tex] (it may help to write out the product explicitly to see this).

    Now, the other way you did correctly since [tex]X_{\mu}{}^{\nu}=\eta_{\mu \sigma}X^{\sigma \nu}=[\eta][X][/tex]

    So, the matrices should not be the same (as you probably expected). The difference is in what basis the components are with respect to. [tex]X^{\mu}{}_{\nu}[/tex] are the components of a (1,1) tensor in the [tex]\hat{e}\otimes\hat{\theta}[/tex] basis (to use Carroll's notation - pg. 21), and [tex]X_{\mu}{}^{\nu}[/tex] are the components of the same (1,1) tensor in the [tex]\hat{\theta}\otimes\hat{e}[/tex] basis. Note these are the same tensor expressed in different bases. You can also make a (0,2) tensor by lowering both indices of [tex]X^{\mu \sigma}[/tex]. The (0,2), (1,1), and original (2,0) tensors are all different objects labeled by the same letter - it's the index placement that differentiates them.
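    If it helps, the distinction is easy to check numerically. Here is a quick NumPy sketch (purely illustrative, not from Carroll) using the metric diag(-1, 1, 1, 1) and the X from the problem:

```python
import numpy as np

# Minkowski metric diag(-1, 1, 1, 1) and the tensor X from the problem
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
X = np.array([[ 2., 0., 1., -1.],
              [-1., 0., 3.,  2.],
              [-1., 1., 0.,  0.],
              [-2., 1., 1., -2.]])

# X^mu_nu = eta_{nu sigma} X^{mu sigma}: contract over the SECOND slot of X
X_upper_lower = np.einsum('ns,ms->mn', eta, X)

# X_mu^nu = eta_{mu sigma} X^{sigma nu}: contract over the FIRST slot of X
X_lower_upper = np.einsum('ms,sn->mn', eta, X)

print(np.allclose(X_upper_lower, X @ eta))        # True  -> [X][eta]
print(np.allclose(X_lower_upper, eta @ X))        # True  -> [eta][X]
print(np.allclose(X_upper_lower, X_lower_upper))  # False -> different matrices
```

    Note that the einsum calls don't care about the order of the two factors, only about which slot of X gets contracted -- which is exactly the point about index notation made above.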

    edit: hah, it took me too long to type this - you beat me jimmy!
     
  5. Sep 9, 2008 #4
    OK, so I now see that (a):

    [tex]{X^\mu}_\nu = \eta_{\nu \sigma} X^{\mu \sigma} = [X][\eta] = \begin{pmatrix}
    2 & 0 & 1 & -1\\
    -1 & 0 & 3 & 2\\
    -1 & 1 & 0 & 0\\
    -2 & 1 & 1 & -2
    \end{pmatrix}\begin{pmatrix}
    -1 & 0 & 0 & 0\\
    0 & 1 & 0 & 0\\
    0 & 0 & 1 & 0\\
    0 & 0 & 0 & 1
    \end{pmatrix}=\begin{pmatrix}
    -2 & 0 & 1 & -1\\
    1 & 0 & 3 & 2\\
    1 & 1 & 0 & 0\\
    2 & 1 & 1 & -2
    \end{pmatrix}
    [/tex]

    and (b):

    [tex]{X_\mu}^\nu = \eta_{\mu \sigma} X^{\sigma \nu} = [\eta][X] = \begin{pmatrix}
    -1 & 0 & 0 & 0\\
    0 & 1 & 0 & 0\\
    0 & 0 & 1 & 0\\
    0 & 0 & 0 & 1
    \end{pmatrix}\begin{pmatrix}
    2 & 0 & 1 & -1\\
    -1 & 0 & 3 & 2\\
    -1 & 1 & 0 & 0\\
    -2 & 1 & 1 & -2
    \end{pmatrix}=\begin{pmatrix}
    -2 & 0 & -1 & 1\\
    -1 & 0 & 3 & 2\\
    -1 & 1 & 0 & 0\\
    -2 & 1 & 1 & -2
    \end{pmatrix}
    [/tex]

    Thus, the order of matrices in multiplication is determined by the position of the index being raised (lowered). Suppose I wanted to raise an index, then

    [tex]{X^\mu}_\nu = \eta^{\mu \sigma} X_{\sigma \nu} = [\eta] [X]\, ? [/tex]

    Also, why is the metric such a special tensor that it's used to lower and raise these indices?
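    As a sanity check on the two products above (an illustrative NumPy sketch, not from the textbook): in special relativity [tex]\eta[/tex] is its own inverse, so raising an index undoes the corresponding lowering:

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
X = np.array([[ 2., 0., 1., -1.],
              [-1., 0., 3.,  2.],
              [-1., 1., 0.,  0.],
              [-2., 1., 1., -2.]])

X_a = X @ eta    # (a) X^mu_nu: the first COLUMN changes sign
X_b = eta @ X    # (b) X_mu^nu: the first ROW changes sign

# Raising back with the inverse metric eta^{mu sigma} recovers the original:
eta_inv = np.linalg.inv(eta)            # equal to eta itself here
assert np.allclose(X_a @ eta_inv, X)
assert np.allclose(eta_inv @ X_b, X)
```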
     
  6. Sep 9, 2008 #5
    Given [tex]A^{\mu}[/tex], then by definition, [tex]A_0 = -A^0, A_i = A^i[/tex]. [tex]\eta[/tex] is the matrix that accomplishes this transformation.
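    Concretely (illustrative components, not from the problem):

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
A_up = np.array([3.0, 1.0, 4.0, 1.0])   # arbitrary components A^mu

A_down = eta @ A_up                      # A_mu = eta_{mu nu} A^nu
print(A_down)                            # [-3.  1.  4.  1.]: only A^0 flips sign
```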
     
  7. Sep 9, 2008 #6
    Starts getting clearer...

    OK, what if the index being raised is in the middle, such as

    [tex]{{X_{\mu}}^\nu}_\rho = \eta^{\nu \sigma} X_{\mu \sigma \rho} = [\eta][X] \quad \text{OR} \quad [X][\eta][/tex]

    or is this operation simply not allowed?
     
  8. Sep 9, 2008 #7
    Another question: how do I symmetrize, i.e. obtain [tex]X^{(\mu \nu)}[/tex], given [tex]X^{\mu \nu}[/tex]? I tried to lower both indices and then raise them back in a different order, but eventually the metric and inverse metric just multiply out and I obtain [tex]X^{\mu \nu} = X^{\nu \mu}[/tex]. I don't know how to obtain [tex]X^{\nu \mu}[/tex].
     
  9. Sep 9, 2008 #8

    Dick


    It's allowed. You just can't write it as a matrix multiplication since X isn't a matrix. But if you know all of the components of X and eta, you wouldn't have any trouble computing any component of the raised-index tensor, would you?
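    For instance (an illustrative NumPy sketch with made-up components for the three-index X), the contraction over the middle slot is just a sum:

```python
import numpy as np

eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])   # eta^{nu sigma}: same components in SR
X3 = np.arange(64.0).reshape(4, 4, 4)      # placeholder components X_{mu sigma rho}

# X_mu^nu_rho = eta^{nu sigma} X_{mu sigma rho}: contract the MIDDLE slot
X3_raised = np.einsum('ns,msr->mnr', eta_inv, X3)

# Component by component this reads:
# X3_raised[m, n, r] = sum over s of eta_inv[n, s] * X3[m, s, r]
```

    Since this [tex]\eta[/tex] is diagonal, the net effect is simply to flip the sign of every component whose middle index is 0.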
     
  10. Sep 9, 2008 #9

    Dick


    It's [itex](X^{\mu \nu} + X^{\nu \mu})/2[/itex]. There's nothing to 'compute' until you have more information about X. E.g. [itex]X^{(01)}=(X^{01} + X^{10})/2[/itex].
     
  11. Sep 9, 2008 #10
    Well, I know the matrix representation of X. Isn't that enough?
     
  12. Sep 9, 2008 #11

    Dick


    Then just add the matrix of X and the transpose of the matrix of X and divide by 2.
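    In matrix form that's one line (illustrative NumPy, using the X from the problem):

```python
import numpy as np

X = np.array([[ 2., 0., 1., -1.],
              [-1., 0., 3.,  2.],
              [-1., 1., 0.,  0.],
              [-2., 1., 1., -2.]])

X_sym = (X + X.T) / 2     # X^{(mu nu)} = (X^{mu nu} + X^{nu mu}) / 2

assert np.allclose(X_sym, X_sym.T)   # the result is symmetric
print(X_sym[0, 1])                   # (0 + (-1)) / 2 = -0.5
```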
     
  13. Sep 9, 2008 #12
    Yeah, I suspected that

    [tex]X^{\mu \nu} = \left( X^{\nu \mu} \right)^T[/tex]

    but you can never be sure... OK, that's enough help to solve the problem. I hope I'll understand more as I study the subject further. I'm sure I'll post some more newbie questions shortly :))
     