# Transforming between contra and covariant vectors

Hi.
The book I am using gives the following equations for the Lorentz transformations of contravariant and covariant vectors

##x'^\mu = \Lambda^\mu_\nu x^\nu## (1)

##x'_\mu = \Lambda_\mu^\nu x_\nu## (2)

where the two Lorentz transformation matrices are inverses of each other. I am trying to derive equation (2) from equation (1). If I lower the index on the LHS of (1) using the metric ##g_{\rho\mu}## and apply it to both sides of (1), I get

##x'_\rho = \Lambda_{\rho\nu} x^\nu## (where ##\Lambda_{\rho\nu} = g_{\rho\mu}\Lambda^\mu_\nu##)

Then I'm stuck, because how can I lower the ##\nu## on ##x^\nu## when ##\nu## already appears twice on the RHS, so I can't use a metric with ##\nu## in it?
Thanks

Orodruin
Staff Emeritus
Homework Helper
Gold Member
The relation between the contravariant and covariant components is ##x^\nu = g^{\nu\mu}x_\mu##.

stevendaryl
Staff Emeritus

You have

##x'^\mu = \Lambda^\mu_\nu x^\nu \Rightarrow g^{\mu \rho} x'_\rho = \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda##

Now, you operate on both sides with the ##g## to get:

##x'_\rho = g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda} x_\lambda##

So the transformation matrix for the lowered components is ##g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda}##

The final step is to realize that ##g_{\mu \rho} \Lambda^\mu_\nu g^{\nu \lambda} = \Lambda^\lambda_\rho##. That might seem obvious, but it's actually not, because ##\Lambda## is not a tensor; the two indices refer to different coordinate systems. So it's not immediately obvious that you can raise and lower indices the way you could with a tensor.
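As a numerical sanity check of that final identity (a sketch in plain Python, not from the thread: the signature ##(+,-,-,-)## and the boost speed ##\beta = 0.6## are example assumptions), one can verify that ##g \Lambda g^{-1}## equals the transpose of ##\Lambda^{-1}##, which is the matrix form of the statement that the two transformation matrices in (1) and (2) are inverses of each other:

```python
import math

def matmul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Minkowski metric with signature (+, -, -, -); as a matrix it is its own inverse
g = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, -1, 0], [0, 0, 0, -1]]

# Lorentz boost along x with beta = 0.6 (arbitrary example value)
beta = 0.6
gamma = 1 / math.sqrt(1 - beta**2)
L = [[gamma, -gamma * beta, 0, 0],
     [-gamma * beta, gamma, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]

# The matrix acting on covariant components: g Lambda g^{-1} (here g^{-1} = g)
L_tilde = matmul(matmul(g, L), g)

# The inverse of a boost with speed beta is the boost with speed -beta
L_inv = [[gamma, gamma * beta, 0, 0],
         [gamma * beta, gamma, 0, 0],
         [0, 0, 1, 0],
         [0, 0, 0, 1]]
L_inv_T = [[L_inv[j][i] for j in range(4)] for i in range(4)]

# Check: g Lambda g^{-1} equals the inverse transpose of Lambda
ok = all(abs(L_tilde[i][j] - L_inv_T[i][j]) < 1e-12
         for i in range(4) for j in range(4))
print(ok)  # True
```

For a symmetric boost matrix the transpose is invisible, but it matters in general; the index contraction ##\Lambda_\mu{}^\lambda \Lambda^\mu{}_\nu = \delta^\lambda_\nu## is what the inverse-transpose relation encodes.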

dyn
Thanks for your reply. Can you explicitly explain this step for me? What do I multiply each side of the equation by, in terms of indices?

Orodruin
Staff Emeritus
Homework Helper
Gold Member
You are not to multiply each side with anything. You are to insert the relation given in #2, which @stevendaryl did for you explicitly in the first line of #3.

I don't understand that step

Orodruin
Staff Emeritus
Homework Helper
Gold Member
It is just an insertion of a known relation.


I presume you mean the relation ##g^{\alpha\beta} g_{\alpha\gamma} = \delta^\beta_\gamma##?
So that means I multiply each side of the equation by ##g_{\mu\alpha}##?

stevendaryl
Staff Emeritus

Let me write it without any indices. I think you will find that there is only one way to put in indices so that it makes sense.

1. ##x' = \Lambda x##
2. Let's write: ##x = g^{-1} g x##
3. Substituting this expression for ##x## into equation 1: ##x' = \Lambda g^{-1} g x##
4. Operate on 3 using ##g##: ##g x' = g \Lambda g^{-1} g x##
5. Now, let's define the combination: ##\widetilde{x} \equiv g x##
6. Also, ##\widetilde{x'} \equiv g x'##
7. And ##\widetilde{\Lambda} \equiv g \Lambda g^{-1}##
8. So the covariant transformation law is: ##\widetilde{x'} = \widetilde{\Lambda} \widetilde{x}##
There is really only one way that you can insert indices into the equations 1-8 that makes sense.
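For reference, here is one consistent way to put the indices into steps 1-8 above (a sketch, following the index conventions already used in this thread):

```latex
\begin{align*}
&\text{1.} & x'^\mu &= \Lambda^\mu_\nu\, x^\nu \\
&\text{2.} & x^\nu &= g^{\nu\lambda}\, g_{\lambda\sigma}\, x^\sigma \\
&\text{3.} & x'^\mu &= \Lambda^\mu_\nu\, g^{\nu\lambda}\, g_{\lambda\sigma}\, x^\sigma \\
&\text{4.} & g_{\rho\mu}\, x'^\mu &= g_{\rho\mu}\, \Lambda^\mu_\nu\, g^{\nu\lambda}\, g_{\lambda\sigma}\, x^\sigma \\
&\text{5.} & x_\lambda &\equiv g_{\lambda\sigma}\, x^\sigma \\
&\text{6.} & x'_\rho &\equiv g_{\rho\mu}\, x'^\mu \\
&\text{7.} & \Lambda_\rho^\lambda &\equiv g_{\rho\mu}\, \Lambda^\mu_\nu\, g^{\nu\lambda} \\
&\text{8.} & x'_\rho &= \Lambda_\rho^\lambda\, x_\lambda
\end{align*}
```

Substituting definitions 5-7 into step 4 gives step 8 directly.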

Thanks for that, but it's the placement of indices in each step that confuses me. Was the following statement correct?
I presume you mean the relation ##g^{\alpha\beta} g_{\alpha\gamma} = \delta^\beta_\gamma##?
So that means I multiply each side of the equation by ##g_{\mu\alpha}##?

stevendaryl
Staff Emeritus

As I said, there really is only one way to do the indices so that it makes sense.

But yes: you start with ##x'^\mu = \Lambda^\mu_\nu x^\nu## and multiply both sides by ##g_{\mu \alpha}## (summing over ##\mu##). Then you rewrite ##x^\nu## as ##g^{\nu \lambda} x_\lambda##.
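A numerical version of that recipe (a sketch in plain Python; the signature ##(+,-,-,-)##, the boost speed ##\beta = 0.6##, and the components of ##x## are arbitrary example choices): lowering the index after transforming gives the same covariant components as transforming the lowered vector with ##g\Lambda g^{-1}##:

```python
import math

def matvec(A, v):
    """Apply a square matrix (nested lists) to a vector."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

# Minkowski metric, signature (+, -, -, -); equal to its own inverse
g = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, -1, 0], [0, 0, 0, -1]]

beta = 0.6  # arbitrary example boost speed
gamma = 1 / math.sqrt(1 - beta**2)
L = [[gamma, -gamma * beta, 0, 0],
     [-gamma * beta, gamma, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]

x = [2.0, 1.0, -0.5, 3.0]   # arbitrary contravariant components x^nu
x_low = matvec(g, x)        # covariant components x_nu = g_{nu mu} x^mu

# Route 1: transform x^nu with Lambda, then lower with g_{mu alpha}
route1 = matvec(g, matvec(L, x))

# Route 2: transform the covariant components with g Lambda g^{-1},
# applied right-to-left: first g^{-1} (= g here), then Lambda, then g
route2 = matvec(g, matvec(L, matvec(g, x_low)))

ok = all(abs(a - b) < 1e-12 for a, b in zip(route1, route2))
print(ok)  # True
```

The two routes agreeing is exactly the substitution ##x^\nu = g^{\nu\lambda} x_\lambda## carried out numerically.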

dyn
I always see ##\Lambda## as a 4x4 matrix, and I always thought a matrix was a tensor of rank 2. I see ##\Lambda^\mu_\nu## as the entry in row ##\mu## and column ##\nu## of that matrix, but I'm unsure how the rows and columns relate to the inverse ##\Lambda_\mu^\nu##.

Nugatory
Mentor
I always thought a matrix was a tensor of rank 2
The individual components of a rank-2 tensor can be represented as a matrix, but not all matrices are representations of rank-2 tensors.

The components of a tensor transform in a particular way when you change coordinate systems; not all matrices have that property. Also when you represent a tensor as a matrix, you lose the distinction between contravariant and covariant components.
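To make "transform in a particular way" concrete (a sketch in plain Python, assuming signature ##(+,-,-,-)## and an example boost with ##\beta = 0.6##): the metric is a genuine rank-2 tensor, and the Lorentz condition ##\Lambda^\alpha{}_\mu\, g_{\alpha\beta}\, \Lambda^\beta{}_\nu = g_{\mu\nu}## says its components come out the same in every inertial frame:

```python
import math

def matmul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

# Minkowski metric, signature (+, -, -, -)
g = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, -1, 0], [0, 0, 0, -1]]

beta = 0.6  # arbitrary example boost speed
gamma = 1 / math.sqrt(1 - beta**2)
L = [[gamma, -gamma * beta, 0, 0],
     [-gamma * beta, gamma, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]

# Lambda^alpha_mu g_{alpha beta} Lambda^beta_nu, i.e. L^T g L as matrices;
# for a Lorentz transformation this reproduces g itself
g_transformed = matmul(matmul(transpose(L), g), L)

ok = all(abs(g_transformed[i][j] - g[i][j]) < 1e-12
         for i in range(4) for j in range(4))
print(ok)  # True
```

An arbitrary 4x4 array of numbers carries no such transformation rule at all, which is why "matrix" and "rank-2 tensor" are not synonyms.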
