Transforming between contra and covariant vectors

  • #1
dyn


Hi.
The book I am using gives the following equations for the Lorentz transformations of contravariant and covariant vectors:

##x'^\mu = \Lambda^\mu_\nu x^\nu## ( 1 )

##x'_\mu = \Lambda_\mu^\nu x_\nu## ( 2 )

where the two Lorentz transformation matrices are the inverses of each other. I am trying to get equation (2) from equation (1), but if I lower the index on the LHS of (1) using the metric ##g_{\rho\mu}## and apply it to both sides of (1), I get

##x'_\rho = \Lambda_{\rho\nu} x^\nu##

Then I'm stuck, because how can I lower the ##\nu## on ##x^\nu## when ##\nu## already appears twice on the RHS, so I can't contract with a metric carrying ##\nu##?
Thanks
 

Answers and Replies

  • #2
Orodruin
The relation between the contravariant and covariant components is ##x^\nu = g^{\nu\mu}x_\mu##.
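For a concrete feel for this relation, here is a quick numerical check (a Python/NumPy sketch; the (+, -, -, -) signature and the four-vector components are illustrative assumptions, since the thread fixes no convention):

```python
import numpy as np

# Minkowski metric, signature (+, -, -, -) -- an assumed convention,
# since the thread never states the book's choice.
g_dn = np.diag([1.0, -1.0, -1.0, -1.0])   # g_{mu nu}
g_up = np.linalg.inv(g_dn)                # g^{mu nu}

x_up = np.array([2.0, 1.0, 0.0, 3.0])     # x^mu, made-up components

x_dn = g_dn @ x_up      # lower the index: x_mu = g_{mu nu} x^nu
x_back = g_up @ x_dn    # raise it again:  x^nu = g^{nu mu} x_mu

print(x_dn)                       # [ 2. -1. -0. -3.]
print(np.allclose(x_back, x_up))  # True
```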
 
  • #3
stevendaryl
Hi.
The book I am using gives the following equations for the Lorentz transformations of contravariant and covariant vectors:

##x'^\mu = \Lambda^\mu_\nu x^\nu## ( 1 )

##x'_\mu = \Lambda_\mu^\nu x_\nu## ( 2 )

where the two Lorentz transformation matrices are the inverses of each other. I am trying to get equation (2) from equation (1), but if I lower the index on the LHS of (1) using the metric ##g_{\rho\mu}## and apply it to both sides of (1), I get

##x'_\rho = \Lambda_{\rho\nu} x^\nu##

Then I'm stuck, because how can I lower the ##\nu## on ##x^\nu## when ##\nu## already appears twice on the RHS, so I can't contract with a metric carrying ##\nu##?
Thanks
You have

##x'^\mu = \Lambda^\mu_\nu x^\nu \Rightarrow g^{\mu \rho} x'_\rho = \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda##

Now, you operate on both sides with the ##g## to get:

##x'_\rho = g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda##

So the transformation matrix for the lowered components is ##g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda}##

The final step is to realize that ##g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda} = \Lambda^\lambda_\rho##. That might seem obvious, but it's actually not, because ##\Lambda## is not a tensor; the two indices refer to different coordinate systems. So it's not immediately obvious that you can raise and lower indices the way you could with a tensor.
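That last identity can be sanity-checked numerically. The sketch below (assuming a beta = 0.6 boost along x and the (+, -, -, -) metric; neither is specified in the thread) verifies the Lorentz condition ##\Lambda^T g \Lambda = g## and then confirms that ##g \Lambda g^{-1}## equals the inverse transpose of ##\Lambda##, which is exactly why the matrices in (1) and (2) are inverses of each other:

```python
import numpy as np

# Assumed example boost: beta = 0.6 along x, so gamma = 1.25.
beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)
L = np.array([[ gamma,      -gamma*beta, 0.0, 0.0],
              [-gamma*beta,  gamma,      0.0, 0.0],
              [ 0.0,         0.0,        1.0, 0.0],
              [ 0.0,         0.0,        0.0, 1.0]])

g = np.diag([1.0, -1.0, -1.0, -1.0])

# Lorentz condition: Lambda^T g Lambda = g
print(np.allclose(L.T @ g @ L, g))               # True

# The matrix g Lambda g^{-1} that transforms the lowered components ...
L_tilde = g @ L @ np.linalg.inv(g)

# ... equals the inverse transpose of Lambda:
print(np.allclose(L_tilde, np.linalg.inv(L).T))  # True
```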
 
  • #4
dyn
You have

##x'^\mu = \Lambda^\mu_\nu x^\nu \Rightarrow g^{\mu \rho} x'_\rho = \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda##

Now, you operate on both sides with the ##g## to get:

##x'_\rho = g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda##
Thanks for your reply. Can you explicitly explain this step for me? What do I multiply each side of the equation by, in terms of indices?
 
  • #5
Orodruin
You are not to multiply each side with anything. You are to insert the relation given in #2, which @stevendaryl did for you explicitly in the first line of #3.
 
  • #6
dyn
I don't understand that step.
 
  • #7
Orodruin
It is just an insertion of a known relation.
 
  • #8
dyn
It is just an insertion of a known relation.
I presume you mean the relation ##g^{\alpha\beta} g_{\alpha\gamma} = \delta^\beta_\gamma##?
So that means I multiply each side of the equation by ##g_{\mu\alpha}##?
 
  • #9
stevendaryl
Thanks for your reply. Can you explicitly explain this step for me? What do I multiply each side of the equation by, in terms of indices?
Let me write it without any indices. I think you will find that there is only one way to put in indices so that it makes sense.

  1. ##x' = \Lambda x##
  2. Let's write: ##x = g^{-1} g x##
  3. Substituting this expression for ##x## into equation 1: ##x' = \Lambda g^{-1} g x##
  4. Operate on 3 using ##g##: ##g x' = g \Lambda g^{-1} g x##
  5. Now, let's define the combination: ##\widetilde{x} \equiv g x##
  6. Also, ##\widetilde{x'} \equiv g x'##
  7. And ##\widetilde{\Lambda} \equiv g \Lambda g^{-1}##
  8. So the covariant transformation law is: ##\widetilde{x'} = \widetilde{\Lambda} \widetilde{x}##
There is really only one way that you can insert indices into the equations 1-8 that makes sense.
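Those eight steps can also be checked line by line in matrix form. The following sketch reuses the assumed beta = 0.6 boost, metric convention, and made-up components from above:

```python
import numpy as np

beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)
L = np.array([[ gamma,      -gamma*beta, 0.0, 0.0],
              [-gamma*beta,  gamma,      0.0, 0.0],
              [ 0.0,         0.0,        1.0, 0.0],
              [ 0.0,         0.0,        0.0, 1.0]])
g = np.diag([1.0, -1.0, -1.0, -1.0])
x = np.array([2.0, 1.0, 0.0, 3.0])       # made-up contravariant components

x_prime       = L @ x                    # step 1: x' = Lambda x
x_tilde       = g @ x                    # step 5: x-tilde = g x
x_prime_tilde = g @ x_prime              # step 6: x'-tilde = g x'
L_tilde       = g @ L @ np.linalg.inv(g) # step 7: Lambda-tilde = g Lambda g^-1

# step 8: the covariant transformation law holds
print(np.allclose(x_prime_tilde, L_tilde @ x_tilde))  # True
```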
 
  • #10
dyn
Thanks for that, but it's the placement of indices in each step that confuses me. Was the following statement correct?
I presume you mean the relation ##g^{\alpha\beta} g_{\alpha\gamma} = \delta^\beta_\gamma##?
So that means I multiply each side of the equation by ##g_{\mu\alpha}##?
 
  • #11
stevendaryl
Thanks for that, but it's the placement of indices in each step that confuses me. Was the following statement correct?
As I said, there really is only one way to do the indices so that it makes sense.

But yes: start with ##x'^\mu = \Lambda^\mu_\nu x^\nu##, multiply both sides by ##g_{\mu \alpha}## (summing over ##\mu##), and then rewrite ##x^\nu## as ##g^{\nu \lambda} x_\lambda##.
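This contraction can be spelled out index by index with np.einsum, again under the assumed boost and metric (both illustrative): contracting ##g_{\mu \alpha} \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda## over ##\mu##, ##\nu##, ##\lambda## reproduces ##x'_\alpha##.

```python
import numpy as np

beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)
L = np.array([[ gamma,      -gamma*beta, 0.0, 0.0],
              [-gamma*beta,  gamma,      0.0, 0.0],
              [ 0.0,         0.0,        1.0, 0.0],
              [ 0.0,         0.0,        0.0, 1.0]])
g_dn = np.diag([1.0, -1.0, -1.0, -1.0])  # g_{mu nu}
g_up = np.linalg.inv(g_dn)               # g^{mu nu}

x_up = np.array([2.0, 1.0, 0.0, 3.0])    # x^nu, made up
x_dn = g_dn @ x_up                       # x_lambda

# g_{mu alpha} Lambda^mu_nu g^{nu lambda} x_lambda, summed over mu, nu, lambda:
lhs = np.einsum('ma,mn,nl,l->a', g_dn, L, g_up, x_dn)

# x'_alpha obtained directly by transforming and then lowering:
rhs = g_dn @ (L @ x_up)

print(np.allclose(lhs, rhs))             # True
```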
 
  • #12
dyn
The final step is to realize that ##g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda} = \Lambda^\lambda_\rho##. That might seem obvious, but it's actually not, because ##\Lambda## is not a tensor; the two indices refer to different coordinate systems. So it's not immediately obvious that you can raise and lower indices the way you could with a tensor.
I always see ##\Lambda## as a 4x4 matrix, and I always thought a matrix was a tensor of rank 2. I see ##\Lambda^\mu_\nu## as the entry in row ##\mu## and column ##\nu## of that matrix, but I'm unsure how the rows and columns relate to the inverse ##\Lambda_\mu^\nu##.
 
  • #13
Nugatory
I always thought a matrix was a tensor of rank 2
The individual components of a rank-2 tensor can be represented as a matrix, but not all matrices are representations of rank-2 tensors.

The components of a tensor transform in a particular way when you change coordinate systems; not all matrices have that property. Also when you represent a tensor as a matrix, you lose the distinction between contravariant and covariant components.
 
  • Like
Reactions: dyn
