Raising/Lowering Indices w/ Metric Tensor

SUMMARY

The discussion focuses on the operations involving tensors, specifically the process of raising and lowering indices using the metric tensor ##\eta^{\mu \sigma}##. Participants clarify the equivalence of expressions involving tensors, such as ##A^{\sigma}{}_{\nu} e^{\nu} \otimes e_{\sigma}## and ##T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}##, emphasizing the importance of index ordering in tensor products. The conversation highlights that while the mathematical operations yield similar results, the arrangement of indices can lead to different interpretations of the tensors involved. Ultimately, the discussion underscores the necessity of maintaining consistent index notation to avoid confusion.

PREREQUISITES
  • Understanding of tensor notation and operations
  • Familiarity with metric tensors, specifically ##\eta^{\mu \sigma}##
  • Knowledge of tensor products and their properties
  • Basic concepts of linear algebra and vector spaces
NEXT STEPS
  • Study the properties of metric tensors in detail
  • Learn about tensor contraction and its applications
  • Explore the implications of index notation in tensor calculus
  • Investigate the differences between covariant and contravariant tensors
USEFUL FOR

Mathematicians, physicists, and students studying differential geometry or general relativity who seek to deepen their understanding of tensor operations and index manipulation.

cianfa72
TL;DR
Rules for raising or lowering indices via the metric tensor
I'm still confused about the notation used for operations involving tensors.
Consider the following simple example:
$$\eta^{\mu \sigma} A_{\mu \nu} = A_{\mu \nu} \eta^{\mu \sigma}$$
Using the rules for raising an index through the (inverse) metric tensor ##\eta^{\mu \sigma}## we get ##A^{\sigma}{}_{\nu}##. However, if we work out the contraction explicitly using the contraction operator ##C_{\alpha}^{\mu}(\,)## we get:

$$C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) = A_{\mu \nu} \eta^{\mu \sigma} e^{\mu} (e_{\mu}) e^{\nu} \otimes e_{\sigma} = A_{\mu \nu} \eta^{\mu \sigma} e^{\nu} \otimes e_{\sigma}$$
The latter is a tensor, say ##T = T_{\nu} {}^{\sigma} e^{\nu} \otimes e_{\sigma}##.

Is it the same as ##A^{\sigma}{}_{\nu} e_{\sigma} \otimes e^{\nu}## ?
 
You made a mistake in your work;
\begin{align*}C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) &= A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} (e_{\mu}) e^{\nu} \otimes e_{\sigma} \\ &= A_{\mu \nu} \eta^{\mu \sigma} e^{\nu} \otimes e_{\sigma} \\ &={A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma}\end{align*}
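The corrected contraction is easy to check numerically. Below is a minimal sketch with NumPy's `einsum`; the 4-dimensional Minkowski metric and the random components ##A_{\mu \nu}## are purely illustrative choices, not anything fixed by the thread:

```python
import numpy as np

# Illustrative choice: 4-dimensional Minkowski metric with signature (-,+,+,+);
# in these coordinates the inverse metric eta^{mu sigma} has the same matrix.
eta_inv = np.diag([-1.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # components A_{mu nu}

# Raise the first index: A^{sigma}_{nu} = eta^{mu sigma} A_{mu nu} (sum over mu)
A_up = np.einsum('ms,mn->sn', eta_inv, A)

# The scalar components commute, so writing the factors in the other
# order, A_{mu nu} eta^{mu sigma}, yields the identical array.
A_up2 = np.einsum('mn,ms->sn', A, eta_inv)

assert np.allclose(A_up, A_up2)
```

The `einsum` subscript string mirrors the index notation directly: repeated labels are summed, so `'ms,mn->sn'` is exactly ##\eta^{\mu \sigma} A_{\mu \nu} = A^{\sigma}{}_{\nu}##.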
 
ergospherical said:
You made a mistake in your work;
\begin{align*}C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) &= A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} (e_{\mu}) e^{\nu} \otimes e_{\sigma} \\ &= A_{\mu \nu} \eta^{\mu \sigma} e^{\nu} \otimes e_{\sigma} \\ &={A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma}\end{align*}
Oops, yes: on the RHS of the first line, summing over ##\alpha## gives the second line, and then summing over ##\mu## gives the result.

Maybe I'm missing the point: in your result, the ##\nu## index in ##A^{\sigma}{}_{\nu}## actually refers to the first factor in the tensor product ##e^{\nu} \otimes e_{\sigma}##, not to the second one, whereas in the expression ##T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}## it is the first index ##\nu## that refers to the first factor.

Is the following correct?

$${A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma} = T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}$$
 
cianfa72 said:
Is the following correct?
$${A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma} = T_{\nu}{}^{\sigma} e^{\nu} \otimes e_{\sigma}$$
Well, yes, but only because that’s how you defined ##{T_{\nu}}^{\sigma}##…

You are worrying too much. It’s conventional to maintain the same horizontal ordering of the component indices and the tensor arguments (so that it’s easy to tell which slot is which), but you can do whatever you want.
 
Sorry, it seems to me that we actually get two different answers by reversing the order of the 'index raising' operation, namely:
$$C_{\alpha}^{\mu} (A_{\alpha \nu} \eta^{\mu \sigma} e^{\alpha} \otimes e^{\nu} \otimes e_{\mu} \otimes e_{\sigma}) = {A^{\sigma}}_{\nu} e^{\nu} \otimes e_{\sigma}$$
then if we reverse the order we get:
$$C_{\alpha}^{\mu} (\eta^{\mu \sigma} A_{\alpha \nu} e_{\mu} \otimes e_{\sigma} \otimes e^{\alpha} \otimes e^{\nu}) = {A^{\sigma}}_{\nu} e_{\sigma} \otimes e^{\nu}$$

The two are really two different tensors. Where is the mistake?
 
It’s nothing more significant than the ordering of the vector and co-vector arguments.
 
Suppose ##n = 2##; in the two cases we get:
$$A^{1}{}_{1} e^{1} \otimes e_{1} + A^{1}{}_{2} e^{2} \otimes e_{1} + A^{2}{}_{1} e^{1} \otimes e_{2} + A^{2}{}_{2} e^{2} \otimes e_{2}$$ $$A^{1}{}_{1} e_{1} \otimes e^{1} + A^{1}{}_{2} e_{1} \otimes e^{2} + A^{2}{}_{1} e_{2} \otimes e^{1} + A^{2}{}_{2} e_{2} \otimes e^{2}$$
So the difference is really just the order of the slots into which the vector and covector are plugged.
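To make the slot-order point concrete, here is a small numeric sketch (an illustrative 2-dimensional example with ##\eta^{\mu \sigma} = \mathrm{diag}(-1, 1)##, chosen only for this demonstration). Both orderings store exactly the same components ##A^{\sigma}{}_{\nu}##; only the array axis assigned to each slot is swapped:

```python
import numpy as np

# Illustrative 2-dimensional example: eta^{mu sigma} = diag(-1, 1)
eta_inv = np.diag([-1.0, 1.0])
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))  # components A_{mu nu}

A_up = np.einsum('ms,mn->sn', eta_inv, A)  # A^{sigma}_{nu}

# First ordering, A^{sigma}_{nu} e^{nu} (x) e_{sigma}: axis 0 is the
# vector slot (nu), axis 1 the covector slot (sigma).
T1 = A_up.T          # T1[nu, sigma]
# Second ordering, A^{sigma}_{nu} e_{sigma} (x) e^{nu}: slots swapped.
T2 = A_up            # T2[sigma, nu]

# Identical components; only the slot order differs, so T1(v, w) = T2(w, v).
assert np.allclose(T1, T2.T)
```

As arrays the two tensors are just transposes of one another, which is the multilinear-map statement ##T_1(v, \omega) = T_2(\omega, v)## in component form.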

Does the same thing hold for cases like the following?

##\eta_{\mu \alpha} A^{\alpha \nu} \eta_{\sigma \nu} \Rightarrow A_{\mu \sigma} e^{\mu} \otimes e^{\sigma}##

##\eta_{\sigma \nu} A^{\alpha \nu} \eta_{\mu \alpha} \Rightarrow A_{\mu \sigma} e^{\sigma} \otimes e^{\mu}##
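The same numeric check works for lowering both indices; a sketch under the same illustrative assumptions (Minkowski metric, random components):

```python
import numpy as np

# Illustrative setup: Minkowski metric eta_{mu alpha} = diag(-1, 1, 1, 1)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
rng = np.random.default_rng(2)
A_uu = rng.standard_normal((4, 4))  # components A^{alpha nu}

# Lower both indices: A_{mu sigma} = eta_{mu alpha} A^{alpha nu} eta_{sigma nu}
A_dd = np.einsum('ma,an,sn->ms', eta, A_uu, eta)

# Reversing the order of the scalar factors leaves the component array
# unchanged; only the assignment of slots to the tensor-product factors
# (e^{mu} (x) e^{sigma} versus e^{sigma} (x) e^{mu}) can differ.
A_dd2 = np.einsum('sn,an,ma->ms', eta, A_uu, eta)

assert np.allclose(A_dd, A_dd2)
```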
 
Yeah, chill, it’s just like the difference between ##f(x,y) = x^2 y## and ##g(x,y) = y^2 x##, whereby ##f(x,y) = g(y,x)##.
 
ergospherical said:
Yeah, chill, it’s just like the difference between ##f(x,y) = x^2 y## and ##g(x,y) = y^2 x##, whereby ##f(x,y) = g(y,x)##.
Yes, the point confusing me is that when we formally raise and/or lower tensor indices through the metric tensor, we need to take the slot order into account (in the above example, the order of the slots 'waiting' for vectors to be plugged in).

Hence, from the point of view of the tensor obtained by the raising/lowering operations through the metric tensor, the slot order makes the difference.
 
