Contravariant and covariant indices

SUMMARY

The discussion clarifies the notation of contravariant and covariant indices in the context of Lorentz transformations, specifically addressing whether to write ##\Lambda^\mu{}_\nu## or ##\Lambda^\mu_\nu##. It establishes that the placement of indices affects the interpretation of transformations, particularly in matrix representations. The relationship between ##\Lambda## and its inverse is given by ##\Lambda^{-1}=\eta^{-1}\Lambda^T\eta##, leading to the conclusion that ##\Lambda_\nu{}^\mu = (\Lambda^{-1})^\mu{}_\nu \neq \Lambda^\mu{}_\nu## when the conventions for raising and lowering indices with ##\eta_{\mu\nu}## and ##\eta^{\mu\nu}## are used.

PREREQUISITES
  • Understanding of Lorentz transformations and their properties
  • Familiarity with matrix representations of linear transformations
  • Knowledge of tensor notation and index manipulation
  • Concept of raising and lowering indices using metric tensors
NEXT STEPS
  • Study the properties of Lorentz transformations in detail
  • Learn about the implications of matrix representations in physics
  • Explore tensor calculus and its applications in general relativity
  • Investigate the conventions for raising and lowering indices in various contexts
USEFUL FOR

Physicists, mathematicians, and students studying relativity, linear algebra, or tensor analysis who seek to deepen their understanding of index notation and its implications in transformations.

spookyfish
When we write contravariant and covariant indices, for example for the Lorentz transformation, does it matter whether we write ##\Lambda^\mu{}_\nu## or ##\Lambda^\mu_\nu##?
That is, does it matter whether the ##\nu## index sits to the right of ##\mu## or directly beneath it?
 
Lorentz transformations are linear operators on ##\mathbb R^4## (or ##\mathbb R^2## or ##\mathbb R^3##), so they can be represented by matrices. (See https://www.physicsforums.com/showthread.php?t=694922 for more about matrix representations of linear transformations.) I will not make any notational distinction between a linear operator and its matrix representation with respect to the standard basis.
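For concreteness, here's a quick numerical sketch of that idea, using numpy. The specific transformation (an x-boost with speed ##\beta = 0.6##, in units with ##c = 1##) is an illustrative choice of mine, not something from the thread:

```python
import numpy as np

# An x-boost with illustrative speed beta = 0.6 (units with c = 1)
beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)
Lam = np.array([
    [gamma,         -gamma * beta, 0.0, 0.0],
    [-gamma * beta,  gamma,        0.0, 0.0],
    [0.0,            0.0,          1.0, 0.0],
    [0.0,            0.0,          0.0, 1.0],
])

# The linear operator acts on a four-vector (t, x, y, z)
# by ordinary matrix multiplication
x = np.array([1.0, 0.5, 0.0, 0.0])
print(Lam @ x)  # the boosted four-vector
```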

Let ##\Lambda## be an arbitrary Lorentz transformation. By definition of a Lorentz transformation, we have ##\Lambda^T\eta\Lambda=\eta##, which implies that ##\Lambda^{-1}=\eta^{-1}\Lambda^T\eta##. Let's use the notational convention that for every matrix ##X##, the entry on row ##\mu##, column ##\nu## is denoted by ##X^\mu{}_\nu##. If we combine this convention with the definition of matrix multiplication, our formula for ##\Lambda^{-1}##, and the convention that every index that appears twice is summed over, we get
$$(\Lambda^{-1})^\mu{}_\nu = (\eta^{-1})^\mu{}_\rho (\Lambda^T)^\rho{}_\sigma \eta^\sigma{}_\nu = (\eta^{-1})^\mu{}_\rho \Lambda^\sigma{}_\rho \eta^\sigma{}_\nu.$$ This is where things get funny. It's conventional to write ##\eta_{\mu\nu}## instead of ##\eta^\mu{}_\nu##, and ##\eta^{\mu\nu}## instead of ##(\eta^{-1})^\mu{}_\nu##. If we use this convention, we have
$$(\Lambda^{-1})^\mu{}_\nu = \eta^{\mu\rho} \Lambda^\sigma{}_\rho \eta_{\sigma\nu}.$$ Now if we also use the convention that ##\eta^{\mu\nu}## raises indices and ##\eta_{\mu\nu}## lowers them, we end up with
$$(\Lambda^{-1})^\mu{}_\nu = \Lambda_\nu{}^\mu.$$ So whenever ##\Lambda## isn't its own inverse (the identity is one such exception; a nontrivial boost is not), we have
$$\Lambda_\nu{}^\mu = (\Lambda^{-1})^\mu{}_\nu \neq \Lambda^\mu{}_\nu.$$ As you can see, the inequality is a result of the definitions of ##\eta_{\mu\nu}## and ##\eta^{\mu\nu}##, so if you use a notational convention that denotes these things by something else, or doesn't use these things to raise and lower indices, it may be OK to write ##\Lambda^\mu_\nu##.
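To make the conclusion concrete, here's a minimal numerical check of the whole chain, again with numpy and the same illustrative x-boost. The signature ##(-,+,+,+)## is my assumption; the checks come out the same for ##(+,-,-,-)##:

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric (signature assumed)
eta_inv = np.linalg.inv(eta)           # matrix of eta^{mu nu}; equals eta here

beta = 0.6                             # illustrative boost speed, c = 1
gamma = 1.0 / np.sqrt(1.0 - beta**2)
Lam = np.array([
    [gamma,         -gamma * beta, 0.0, 0.0],
    [-gamma * beta,  gamma,        0.0, 0.0],
    [0.0,            0.0,          1.0, 0.0],
    [0.0,            0.0,          0.0, 1.0],
])

# Defining property of a Lorentz transformation: Lambda^T eta Lambda = eta
assert np.allclose(Lam.T @ eta @ Lam, eta)

# Hence Lambda^{-1} = eta^{-1} Lambda^T eta
Lam_inv = eta_inv @ Lam.T @ eta
assert np.allclose(Lam_inv @ Lam, np.eye(4))

# The matrix whose (mu, nu) entry is Lambda_nu^mu is eta^{-1} Lambda^T eta,
# i.e. the inverse -- and it differs from Lambda^mu_nu for a nontrivial boost
assert np.allclose(eta_inv @ Lam.T @ eta, Lam_inv)
assert not np.allclose(Lam_inv, Lam)
print("Lambda_nu^mu = (Lambda^{-1})^mu_nu differs from Lambda^mu_nu, as expected")
```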
 
