Raising/Lowering indices and matrix multiplication

  1. Jan 23, 2012 #1
    Please read the following carefully. The point of the following is to distinguish between [tex]T^{\mu}{}_{\nu}[/tex] and [tex]T_{\mu}{}^{\nu}[/tex], which clearly involves a metric tensor. But when you want to go from component manipulation to matrix operations you have to be careful: components are scalars, so their products commute, but the matrix products they represent do not.

    Setup: let M be a manifold and p a point on the manifold. Let V be the tangent space at p and V* the cotangent space at p. Let [tex]T:V^*\times V^*\to \mathbb{R}[/tex] be a (2,0) tensor. Choosing a basis for V, we can talk about the components of T: [tex]T^{\mu\nu}[/tex]. Let g be the metric tensor, so [tex]g:V\times V\to \mathbb{R}[/tex], with inverse [tex]g^{-1}:V^*\times V^*\to \mathbb{R}[/tex].

    If we only "fill" one of the slots of T we can create maps like [tex]T:V^*\to V[/tex], and we can do the same for g (think Riesz rep. theorem): [tex]g:V\to V^*[/tex].

    So in that sense we can create some new maps: [tex]gT:V^*\to V^*[/tex] and [tex]Tg:V\to V[/tex].

    My Claim:

    the components of gT are [tex]g_{\mu\nu}T^{\nu\lambda}=T_{\mu}{}^{\lambda}[/tex], while the components of Tg are [tex]T^{\nu\lambda}g_{\lambda\mu}=T^{\nu}{}_{\mu}[/tex]. Moreover, I claim that to work out the matrix products you use gT and Tg as guidelines. A professor disagrees, however; he says he's never seen the metric act from the right (and in component form it is usually written on the left).

    How, using matrices, would you lower the first or second index on [tex]T^{\mu\nu}[/tex], and does the order matter? Looking at the above in a coordinate-free manner, with composition of functions, it does seem to matter.
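
    One way to settle this numerically: here's a minimal numpy sketch (the 2×2 values in g and T below are made up purely for illustration) checking that contracting the first index of T against g matches the matrix product gT, while contracting the second index matches Tg:

    [code]
    import numpy as np

    g = np.array([[1.0, 0.3],
                  [0.3, 2.0]])   # symmetric metric components g_{mu nu} (invented)
    T = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # components T^{mu nu}, deliberately non-symmetric

    # Lowering the first index: g_{mu nu} T^{nu lambda}
    first_lowered = np.einsum('mn,nl->ml', g, T)
    assert np.allclose(first_lowered, g @ T)   # matrix product gT

    # Lowering the second index: T^{nu lambda} g_{lambda mu}
    second_lowered = np.einsum('nl,lm->nm', T, g)
    assert np.allclose(second_lowered, T @ g)  # matrix product Tg

    # For a non-symmetric T the two results differ as matrices:
    print(np.allclose(first_lowered, second_lowered))  # False
    [/code]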

    Thank you in advance.
     
  3. Jan 23, 2012 #2

    Fredrik

    Not sure what you mean by the "components of Tg", since it's not a tensor. (It doesn't have codomain ℝ). Of course, Tg can be used to define a (1,1) tensor, which you might also want to denote by Tg:
    $$Tg(u,\omega)=\omega(Tg(u)).$$ Is this the one you want to find the components of? We have ##(Tg)_i{}^j=Tg(e_i,e^j)=e^j(Tg(e_i))##.
     
  4. Jan 23, 2012 #3
    I believe:

    [tex]gT\rightarrow ( g_{\mu \nu}T^{\nu \lambda })[/tex]
    [tex]Tg\rightarrow ( g_{\lambda \mu}T^{\nu \lambda })[/tex]

    where gT and Tg represent the matrix multiplication of the components of T and g.
     
    Last edited: Jan 23, 2012
  5. Jan 23, 2012 #4
    It looks like you're multiplying by 'g' first each time, but the position of the summed index doesn't match matrix multiplication in that order.
     
  6. Jan 23, 2012 #5
    Maybe this will make more sense:

    [tex]g_{\mu \nu}T^{\nu \lambda }=

    \begin{bmatrix}
    g_{11} & g_{12}\\
    g_{21} & g_{22}
    \end{bmatrix}

    \begin{bmatrix}
    T^{11} & T^{12}\\
    T^{21} & T^{22}
    \end{bmatrix}
    [/tex]

    [tex]g_{\lambda \mu }T^{\nu \lambda }=

    \begin{bmatrix}
    T^{11} & T^{12}\\
    T^{21} & T^{22}
    \end{bmatrix}

    \begin{bmatrix}
    g_{11} & g_{12}\\
    g_{21} & g_{22}
    \end{bmatrix}[/tex]

    Unless that's not what you were asking.
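
    To put hypothetical numbers into those matrices (the diagonal g below is just an invented, loosely Minkowski-like illustration), the two products come out visibly different for a non-symmetric T:

    [code]
    import numpy as np

    g = np.array([[1.0, 0.0],
                  [0.0, -1.0]])  # invented 2D Minkowski-like metric g_{mu nu}
    T = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # non-symmetric T^{mu nu}

    print(g @ T)   # lowers the first index:  g_{mu nu} T^{nu lambda}
    # [[ 1.  2.]
    #  [-3. -4.]]
    print(T @ g)   # lowers the second index: T^{nu lambda} g_{lambda mu}
    # [[ 1. -2.]
    #  [ 3. -4.]]
    [/code]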
     
    Last edited: Jan 23, 2012
  7. Jan 23, 2012 #6
    Elfmotat,

    That is what I'm asking. So you agree that, matrix-wise, you need to operate on different sides to lower different indices? Do you have a reference? The instructor is somewhat out of date and responds better to references.
     
  8. Jan 23, 2012 #7
    In the case of the first one (##g_{\mu\nu}T^{\nu\lambda}##) you're summing over the second index of g and the first index of T. Multiplying the corresponding matrices like I did above allows this to happen, by definition of matrix multiplication; i.e. you get ##g_{\mu 1}T^{1\lambda}+g_{\mu 2}T^{2\lambda}##.
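
    You can verify that summation pattern directly: writing ##\sum_\nu g_{\mu\nu}T^{\nu\lambda}## as explicit loops reproduces the (μ, λ) entry of the matrix product, by the definition of matrix multiplication. A sketch with placeholder random arrays:

    [code]
    import numpy as np

    g = np.random.rand(2, 2)   # placeholder components g_{mu nu}
    T = np.random.rand(2, 2)   # placeholder components T^{nu lambda}

    # Explicit sum over the contracted index nu, exactly as written above:
    lowered = np.zeros((2, 2))
    for mu in range(2):
        for lam in range(2):
            for nu in range(2):
                lowered[mu, lam] += g[mu, nu] * T[nu, lam]

    # By definition of matrix multiplication this is just (gT)_{mu lambda}:
    assert np.allclose(lowered, g @ T)
    [/code]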
     
  9. Jan 23, 2012 #8
    Oh, I totally agree. It seems well motivated both from a matrix-multiplication standpoint and from a composition-of-functions standpoint, but I lost points on this one. And he's stubborn, so it almost doesn't matter if I'm right. It's pretty frustrating.
     
  10. Jan 23, 2012 #9

    Fair enough, I have been a bit sloppy with my meaning. But would you agree that Tg can be thought of as a linear transformation on V? Let me try to explain.

    I'm going to use Roman indices to make the TeXing easier.

    Let [tex]e_i[/tex] be a basis for V and [tex]a^i[/tex] the corresponding dual basis in V*. Then [tex]v\in V[/tex] can be expressed as [tex]v=v^i e_i[/tex] and the metric tensor as [tex]g=g_{ij}a^i\otimes a^j[/tex] and T as [tex]T=T^{ij}e_i\otimes e_j[/tex].

    So I imagine feeding v into the second slot of g, i.e. g(_,v), which gives [tex]g_{ij}v^j a^i[/tex], an element of the cotangent space V*. Then I put this into the second slot of T:

    [tex]T(\mbox{_},g(\mbox{_},v))=T(\mbox{_},g_{ij}v^j a^i)=g_{ij}v^jT^{mn}e_m\delta_n^i=g_{ij}v^jT^{mi}e_m[/tex]

    So the object T(_,g(_,v)) has components [tex]T^{mi}g_{ij}[/tex], which implies the matrix product Tg to produce a new function (not a tensor?), namely a linear transformation on V.

    To summarize: creating a linear transformation on V out of the tensor T, using the metric tensor g, requires the product Tg, which can be computed with their matrix representations, but with g acting from the right. Am I off track, Fredrik?
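
    The whole composition can also be checked numerically. This sketch (all values invented) feeds v through g first, then through T, and compares with the single matrix Tg acting on v from the left:

    [code]
    import numpy as np

    g = np.array([[2.0, 0.5],
                  [0.5, 1.0]])   # g_{ij}, symmetric (invented values)
    T = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # T^{mn}
    v = np.array([1.0, -2.0])    # v^j

    omega = np.einsum('ij,j->i', g, v)   # g(_, v): covector with components g_{ij} v^j
    w = np.einsum('mi,i->m', T, omega)   # T(_, omega): vector with components T^{mi} omega_i

    # Same thing in one step: the matrix Tg acting on v from the left.
    assert np.allclose(w, (T @ g) @ v)
    [/code]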
     
  11. Jan 24, 2012 #10

    Fredrik

    Yes. I understood your definition of Tg. If A and B respectively denote the maps ##\omega\mapsto T(\cdot,\omega)## and ##v\mapsto g(\cdot,v)##, then Tg is defined as ##A\circ B##. So
    $$Tg(v)=A(B(v))=A(g(\cdot,v))=T(\cdot,g(\cdot,v)).$$ In the abstract index notation, ##g(\cdot,v)## is written as ##g_{ab}v^b## and ##T(\cdot,\omega)## as ##T^{ab}\omega_b##. So Tg(v) is written as ##T^{ab}g_{bc}v^c##. By the usual magic of the abstract index notation, this expression can be interpreted in several different ways. Since it has a free index upstairs, it can be interpreted as a member of V, or as the corresponding member of V**. If we choose the latter interpretation, we can write ##Tg(v)(\omega)=T^{ab}g_{bc}v^c\omega_a##. The function that takes ##(\omega,v)## to that number is the (1,1) tensor I mentioned in my previous post. If it's denoted by Tg, its components are ##(Tg)^i{}_j=T^{ik}g_{kj}##.

    Isn't the only reason you end up with g on the right that you chose to fill the second slot of T and g instead of the first?
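
    Fredrik's point is easy to confirm numerically too: filling the first slots instead puts g on the left in the component expression, which as a matrix acting on a column vector is (gT) transposed, generally different from Tg. A sketch with invented values:

    [code]
    import numpy as np

    g = np.array([[2.0, 0.5],
                  [0.5, 1.0]])   # g_{ij}, symmetric (invented)
    T = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # T^{mn}, non-symmetric
    v = np.array([1.0, -2.0])    # v^i

    # Fill the FIRST slot of g, then the FIRST slot of T:
    # w^n = v^i g_{im} T^{mn}  -- the contraction g_{im} T^{mn} has g on the left.
    w = np.einsum('i,im,mn->n', v, g, T)

    # As column vectors this is (gT) transposed acting on v (using that g is symmetric)...
    assert np.allclose(w, (g @ T).T @ v)
    # ...which is generally NOT the same as the second-slot version (Tg)v:
    print(np.allclose(w, (T @ g) @ v))  # False for non-symmetric T
    [/code]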
     