Raising/Lowering indices and matrix multiplication


Discussion Overview

The discussion revolves around the manipulation of indices in tensor operations, specifically the role of the metric tensor in raising and lowering indices and how those operations translate into matrix multiplication. Participants explore the implications of these operations in the context of tensors defined on manifolds and their components in various bases.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant emphasizes the distinction between ##T^{\mu}{}_{\nu}## and ##T_{\mu}{}^{\nu}##, noting the importance of the metric tensor in these operations.
  • Another participant questions the interpretation of Tg, suggesting it does not represent a tensor but can define a (1,1) tensor through a specific mapping.
  • Some participants propose that the components of gT and Tg can be expressed in terms of matrix multiplication, with specific indices being summed over.
  • There is discussion about whether the metric tensor can act from the right in these operations, with some participants agreeing that the order of operations does matter.
  • A later reply clarifies that Tg can be viewed as a linear transformation on V, leading to further exploration of its components and implications.
  • One participant expresses frustration over an instructor's differing view on the proper handling of indices and matrix multiplication.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the interpretation of Tg and the proper handling of the metric tensor in index manipulation. Multiple competing views remain regarding the roles of gT and Tg in tensor operations.

Contextual Notes

Some participants note that the definitions and interpretations of the tensor operations depend on the chosen basis and the specific slots being filled, which may lead to different outcomes in the manipulation of indices.

homology
Please read the following carefully. The point of the following is to distinguish between ##T^{\mu}{}_{\nu}## and ##T_{\mu}{}^{\nu}##, which clearly involves a metric tensor. But when you want to go from component manipulation to matrix operations you have to be careful: components are scalars and their products are commutative, but the matrix products they represent are not.

Setup: let M be a manifold and p a point on the manifold. Let V be the tangent space at p and V* the cotangent space at p. Let ##T:V^*\times V^*\to \mathbb{R}## be a (2,0) tensor. Choosing a basis for V, we can talk about the components of T: ##T^{\mu\nu}##. Let g be the metric tensor, so ##g:V\times V\to \mathbb{R}##, with inverse ##g^{-1}:V^*\times V^*\to \mathbb{R}##.

If we only "fill" one of the slots of T, we can create maps like ##T:V^*\to V##, and we can do the same for g (think Riesz representation theorem): ##g:V\to V^*##.

So in that sense we can create some new maps: ##gT:V^*\to V^*## and ##Tg:V\to V##.

My Claim:

The components of gT are ##g_{\mu\nu}T^{\nu\lambda}=T_{\mu}{}^{\lambda}##, while the components of Tg are ##T^{\nu\lambda}g_{\lambda\mu}=T^{\nu}{}_{\mu}##. Moreover, I claim that to work out the matrix products you use gT and Tg as guidelines. A professor disagrees, however; he says he's never seen the metric act from the right (and in component form it is usually written on the left).

How, using matrices, would you lower the first or second index on ##T^{\mu\nu}##, and does the order matter? Looking at the above in a coordinate-free manner, with composition of functions, it does seem to matter.
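
A minimal numerical sketch of the claim (assuming numpy; the 2×2 components below are hypothetical placeholders, since only the index placement matters):

```python
import numpy as np

# Hypothetical 2x2 components for T (a (2,0) tensor) and g (the metric).
T = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # T[nu, lam] = T^{nu lam}
g = np.array([[1.0, 0.5],
              [0.5, 2.0]])   # g[mu, nu] = g_{mu nu}

# Lowering the FIRST index: g_{mu nu} T^{nu lam} -> metric acts from the left.
gT = np.einsum('mn,nl->ml', g, T)
assert np.allclose(gT, g @ T)

# Lowering the SECOND index: T^{nu lam} g_{lam mu} -> metric acts from the right.
Tg = np.einsum('nl,lm->nm', T, g)
assert np.allclose(Tg, T @ g)

# The order matters whenever T is not symmetric:
print(np.allclose(gT, Tg))  # False for this T
```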

Thank you in advance.
 
Not sure what you mean by the "components of Tg", since it's not a tensor. (It doesn't have codomain ℝ). Of course, Tg can be used to define a (1,1) tensor, which you might also want to denote by Tg:
$$Tg(u,\omega)=\omega(Tg(u)).$$ Is this the one you want to find the components of? We have ##(Tg)_i{}^j=Tg(e_i,e^j)=e^j(Tg(e_i))##.
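
A sketch of that component extraction (assuming numpy; the components are hypothetical placeholders, and ##e_i## is realized as the standard basis):

```python
import numpy as np

# Hypothetical components; only the index bookkeeping matters.
T = np.array([[1.0, 2.0], [3.0, 4.0]])   # T^{ab}
g = np.array([[1.0, 0.5], [0.5, 2.0]])   # g_{ab}

def Tg_map(u):
    # The map Tg : V -> V; as matrices this is u |-> (T g) u.
    return T @ (g @ u)

# (Tg)_i^j = e^j(Tg(e_i)): feed in basis vectors and read off the j-th
# component of the result (0-based indices in code).
n = 2
components = np.array([[Tg_map(np.eye(n)[i])[j] for j in range(n)]
                       for i in range(n)])
assert np.allclose(components, (T @ g).T)  # rows indexed by i, columns by j
```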
 
I believe:

##gT \rightarrow (g_{\mu\nu}T^{\nu\lambda})##
##Tg \rightarrow (g_{\lambda\mu}T^{\nu\lambda})##

where gT and Tg represent the matrix multiplication of the components of T and g.
 
elfmotat said:
I believe:

##gT \rightarrow (g_{\mu\nu}T^{\nu\lambda})##
##Tg \rightarrow (g_{\lambda\mu}T^{\nu\lambda})##

where gT and Tg represent the matrix multiplication of the components of T and g.

It looks like you're multiplying by g first each time, but the sum doesn't make it look like matrix multiplication.
 
homology said:
It looks like you're multiplying by g first each time, but the sum doesn't make it look like matrix multiplication.

Maybe this will make more sense:

$$g_{\mu\nu}T^{\nu\lambda}=
\begin{bmatrix}
g_{11} & g_{12}\\
g_{21} & g_{22}
\end{bmatrix}
\begin{bmatrix}
T^{11} & T^{12}\\
T^{21} & T^{22}
\end{bmatrix}$$

$$g_{\lambda\mu}T^{\nu\lambda}=
\begin{bmatrix}
T^{11} & T^{12}\\
T^{21} & T^{22}
\end{bmatrix}
\begin{bmatrix}
g_{11} & g_{12}\\
g_{21} & g_{22}
\end{bmatrix}$$

Unless that's not what you were asking.
 
Elfmotat,

That is what I'm asking. So you agree that, matrix-wise, you need to operate on different sides to lower different indices? Do you have a reference? The instructor is somewhat out of date and responds better to references.
 
In the case of the first one, ##g_{\mu\nu}T^{\nu\lambda}##, you're summing over the second index in g and the first index in T. Multiplying the corresponding matrices like I did above allows this to happen, by definition of matrix multiplication. I.e. you get ##g_{\mu 1}T^{1\lambda}+g_{\mu 2}T^{2\lambda}##.
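
A minimal check of that sum (assuming numpy; hypothetical components, with 0-based indices in code):

```python
import numpy as np

T = np.array([[1.0, 2.0], [3.0, 4.0]])   # T^{nu lam}
g = np.array([[1.0, 0.5], [0.5, 2.0]])   # g_{mu nu}

# Writing out the sum over nu explicitly for the 2x2 case:
# (gT)_{mu}{}^{lam} = g_{mu 1} T^{1 lam} + g_{mu 2} T^{2 lam}
explicit = np.outer(g[:, 0], T[0, :]) + np.outer(g[:, 1], T[1, :])
assert np.allclose(explicit, g @ T)  # matches matrix multiplication
```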
 
elfmotat said:
In the case of the first one, ##g_{\mu\nu}T^{\nu\lambda}##, you're summing over the second index in g and the first index in T. Multiplying the corresponding matrices like I did above allows this to happen, by definition of matrix multiplication.

I.e. you get ##g_{\mu 1}T^{1\lambda}+g_{\mu 2}T^{2\lambda}##.

Oh, I totally agree. It seems motivated both from a matrix-multiplication standpoint and from a composition-of-functions standpoint, but I lost points on this one. And he's stubborn, so it almost doesn't matter if I'm right. It's pretty frustrating.
 
Fredrik said:
Not sure what you mean by the "components of Tg", since it's not a tensor. (It doesn't have codomain ℝ). Of course, Tg can be used to define a (1,1) tensor, which you might also want to denote by Tg:
$$Tg(u,\omega)=\omega(Tg(u)).$$ Is this the one you want to find the components of? We have ##(Tg)_i{}^j=Tg(e_i,e^j)=e^j(Tg(e_i))##.


Fair enough, I have been a bit sloppy with my meaning. Though would you agree that Tg can be thought of as a linear transformation on V? Let me try to explain.

I'm going to use Roman indices to make the TeXing easier.

Let ##e_i## be a basis for V and ##a^i## the corresponding dual basis in V*. Then ##v\in V## can be expressed as ##v=v^i e_i##, the metric tensor as ##g=g_{ij}a^i\otimes a^j##, and T as ##T=T^{ij}e_i\otimes e_j##.

So I imagine feeding v to g: ##g(\cdot,v)##, which gives ##g_{ij}v^j a^i##, an element of the cotangent space V*. Then I put this into the second slot of T:

$$T(\cdot,g(\cdot,v))=T(\cdot,g_{ij}v^j a^i)=g_{ij}v^jT^{mn}e_m\delta^i_n=g_{ij}v^jT^{mi}e_m$$

So the object ##T(\cdot,g(\cdot,v))## has components ##T^{mi}g_{ij}##, which implies a matrix product Tg that produces a new function (not a tensor?) but a linear transformation on V.

To summarize: to create a linear transformation on V out of the tensor T, using the metric tensor g, requires the product Tg, which can be computed with their matrix representations, but with g acting from the right. Am I off track, Fredrik?
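
A small numpy sketch of that summary (hypothetical components; 0-based indices in code):

```python
import numpy as np

T = np.array([[1.0, 2.0], [3.0, 4.0]])   # T^{mn}
g = np.array([[1.0, 0.5], [0.5, 2.0]])   # g_{ij}
v = np.array([1.0, -2.0])                # v^j

# Step 1: feed v into the second slot of g -> the covector g_{ij} v^j.
omega = np.einsum('ij,j->i', g, v)

# Step 2: feed that covector into the second slot of T -> the vector g_{ij} v^j T^{mi}.
w = np.einsum('mi,i->m', T, omega)

# This is exactly the matrix product Tg acting on v, with g on the RIGHT:
assert np.allclose(w, (T @ g) @ v)
```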
 
homology said:
Though would you agree that Tg can be thought of as a linear transformation on V?
Yes. I understood your definition of Tg. If A and B respectively denote the maps ##\omega\mapsto T(\cdot,\omega)## and ##v\mapsto g(\cdot,v)##, then Tg is defined as ##A\circ B##. So
$$Tg(v)=A(B(v))=A(g(\cdot,v))=T(\cdot,g(\cdot,v)).$$ In the abstract index notation, ##g(\cdot,v)## is written as ##g_{ab}v^b## and ##T(\cdot,\omega)## as ##T^{ab}\omega_b##. So Tg(v) is written as ##T^{ab}g_{bc}v^c##. By the usual magic of the abstract index notation, this expression can be interpreted in several different ways. Since it has a free index upstairs, it can be interpreted as a member of V, or as the corresponding member of V**. If we choose the latter interpretation, we can write ##Tg(v)(\omega)=T^{ab}g_{bc}v^c\omega_a##. The function that takes ##(\omega,v)## to that number is the (1,1) tensor I mentioned in my previous post. If it's denoted by Tg, its components are ##(Tg)^i{}_j=T^{ik}g_{kj}##.
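
To make the bookkeeping concrete, a minimal numpy sketch (hypothetical components; only the index placement matters):

```python
import numpy as np

T = np.array([[1.0, 2.0], [3.0, 4.0]])   # T^{ab}
g = np.array([[1.0, 0.5], [0.5, 2.0]])   # g_{bc}
v = np.array([1.0, -2.0])                # v^c
omega = np.array([0.5, 3.0])             # omega_a

# Components of the (1,1) tensor: (Tg)^i_j = T^{ik} g_{kj}, i.e. T @ g.
Tg = np.einsum('ik,kj->ij', T, g)

# Evaluate Tg(v)(omega) = T^{ab} g_{bc} v^c omega_a two ways:
direct = np.einsum('ab,bc,c,a->', T, g, v, omega)
via_components = omega @ (Tg @ v)
assert np.isclose(direct, via_components)
```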

homology said:
So the object ##T(\cdot,g(\cdot,v))## has components ##T^{mi}g_{ij}##, which implies a matrix product Tg that produces a new function (not a tensor?) but a linear transformation on V.
Isn't the only reason you end up with g on the right that you chose to fill the second slot of T and g instead of the first?
 
