# Coordinate Transformation

This is an intuitively very simple problem, but I am unable to complete it with mathematical rigor. Here is the deal:

Consider a coordinate system $(u,v,w,p)$ in which the metric tensor has the following non-zero components: $g_{uv}= g_{ww}=g_{pp}=1$. Find the coordinate transformation from $(u,v,w,p)$ to the regular coordinates $(t,x,y,z)$.

In the first part of the same question you are supposed to prove that the coordinate space $(u,v,w,p)$ is flat, which is easy enough to see: because the metric components are constant, all Riemann tensor components are zero, hence the space is Minkowski. Intuitively, this implies that the transformation between the coordinates will be linear, because the standard $(t,x,y,z)$ coordinates also form a Minkowski space. But I am unable to prove it mathematically.
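The flatness claim can be checked mechanically. A minimal sympy sketch (the metric matrix below encodes the stated components, assuming the metric is symmetric so ##g_{vu} = g_{uv} = 1##): since every partial derivative of a constant metric vanishes, all Christoffel symbols are zero, and with them the Riemann tensor.

```python
import sympy as sp

# Coordinates (u, v, w, p) and the constant metric from the problem:
# non-zero components g_uv = g_vu = g_ww = g_pp = 1.
u, v, w, p = sp.symbols('u v w p')
coords = [u, v, w, p]
g = sp.Matrix([[0, 1, 0, 0],
               [1, 0, 0, 0],
               [0, 0, 1, 0],
               [0, 0, 0, 1]])
ginv = g.inv()

# Christoffel symbols Gamma^a_{bc} = (1/2) g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc}).
# Every derivative of a constant metric is zero, so all Gamma vanish.
Gamma = [[[sum(sp.Rational(1, 2) * ginv[a, d] *
               (sp.diff(g[d, c], coords[b]) + sp.diff(g[d, b], coords[c])
                - sp.diff(g[b, c], coords[d]))
               for d in range(4))
           for c in range(4)] for b in range(4)] for a in range(4)]

assert all(Gamma[a][b][c] == 0
           for a in range(4) for b in range(4) for c in range(4))
```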

Here is what I did. The question had a hint to calculate $e_u.e_u$ and $e_v.e_v$ (here $e$ represents a basis vector in either coordinate system and the subscript indicates which coordinate it belongs to; I am dropping the vector arrows/circumflexes). From the metric tensor in these coordinates, both of these dot products are zero, since $g_{uu} = g_{vv} = 0$.

So I wrote the conversion between the bases of the two systems,
$$e_{\alpha '} = \Lambda^\alpha_{\alpha '}\, e_\alpha, \qquad \Lambda^\alpha_{\alpha '} = \frac{\partial x^\alpha}{\partial x^{\alpha '}}$$
Here the primed indices represent the $(u,v,w,p)$ coordinates and the unprimed ones represent $(t,x,y,z)$. This gives the following dot product,
$$e_{\alpha '}.e_{\beta '}= \Lambda^\alpha_{\alpha '}\Lambda^\beta_{\beta '}e_\alpha.e_\beta = \frac{\partial x^\alpha}{\partial x^{\alpha '}}\frac{\partial x^\beta}{\partial x^{\beta '}}e_\alpha.e_\beta$$
I can use the orthonormality of the $(t,x,y,z)$ basis to simplify $e_\alpha.e_\beta$, but even then, all this relation gives me is a bunch of (actually 16) partial differential equations, and I do not know where to go from there.
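One way to see why the linear ansatz closes the problem: if ##x^\alpha## depends linearly on the primed coordinates with a constant coefficient matrix, the Jacobian ##\partial x^\alpha / \partial x^{\alpha'}## is just that matrix, so the 16 PDEs collapse into purely algebraic conditions. A symbolic sketch with a fully general constant matrix (the symbol names `Aij` are my own labels, not from the problem):

```python
import sympy as sp

# Fully symbolic constant matrix A and the linear ansatz
# x^alpha = A^alpha_{alpha'} x^{alpha'}.
u, v, w, p = sp.symbols('u v w p')
primed = sp.Matrix([u, v, w, p])
A = sp.Matrix(4, 4, lambda i, j: sp.Symbol(f'A{i}{j}'))
x = A * primed  # (t, x, y, z) as linear functions of (u, v, w, p)

# The Jacobian d x^alpha / d x^{alpha'} of the ansatz is the constant matrix A.
J = x.jacobian(primed)
assert J == A

# So the PDE system  (dx^a/dx^a')(dx^b/dx^b') g_ab = g_a'b'  collapses into
# the algebraic condition  A^T eta A = g',  with eta = diag(-1, 1, 1, 1).
eta = sp.diag(-1, 1, 1, 1)
gprime = sp.Matrix([[0, 1, 0, 0],
                    [1, 0, 0, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1]])
residual = sp.expand(J.T * eta * J - gprime)
# The residual matrix is symmetric, so only 10 of the 16 equations are independent.
assert residual == residual.T
```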

How do I prove that the partial differential equations have a solution,

$$x^\alpha = A^\alpha_{\alpha '}x^{\alpha '}$$

where the ##A^\alpha_{\alpha '}## are constants? Surely it can be seen that this is *one* solution of the system of PDEs, but is it the only solution? I arrived at the answer intuitively, but I want to be able to see mathematically that it is a solution. Also, how do I end up calculating the values of ##A^\alpha_{\alpha '}##?


stevendaryl
I think you're making it harder than it needs to be. You weren't asked to prove that there is only one transformation that works, you were just asked to find one. Obviously, it's not unique, because you can always combine your transformation with a rotation, boost or translation to get another transformation.

I would say that you should just check to see if you can solve the problem with a constant matrix ##A^\alpha_{\alpha'}##. Then it wouldn't be a differential equation at all, but algebra: Find ##A## such that ##A^\alpha_{\alpha'} A^\beta_{\beta'} g_{\alpha \beta} = g_{\alpha' \beta'}##

Alright. So I get 16 equations for the 16 components of ##A^\alpha_{\alpha '}##. So it is solvable. Just not sure how to calculate the values of the individual ##A^\alpha_{\alpha '}##.

Orodruin
Alright. So I get 16 equations for the 16 components of ##A^\alpha_{\alpha '}##. So it is solvable. Just not sure how to calculate the values of the individual ##A^\alpha_{\alpha '}##.
Well, two of the coordinates are already orthonormal. I suggest you do not touch anything that has to do with them.

Well, the complication for me is that the equations are quadratic in ##A^\alpha_{\alpha '}##. For example, ##\vec{e}_w.\vec{e}_w=-(A^t_w)^2+(A^x_w)^2+(A^y_w)^2+(A^z_w)^2=1##.

stevendaryl
Well, the complication for me is that the equations are quadratic in ##A^\alpha_{\alpha '}##. For example, ##\vec{e}_w.\vec{e}_w=-(A^t_w)^2+(A^x_w)^2+(A^y_w)^2+(A^z_w)^2=1##.

On @Orodruin's suggestion, you should just leave ##w## and ##p## alone, and focus on ##u## and ##v##.

Assume that ##y=w## and ##z=p##. That means:
1. ##A^y_w = 1##
2. ##A^y_p = A^y_u = A^y_v = 0##
3. ##A^z_p = 1##
4. ##A^z_w = A^z_u = A^z_v = 0##
5. ##A^t_w = A^t_p = 0##
6. ##A^x_w = A^x_p = 0##
So that leaves 4 numbers to figure out:

##A^t_u, A^t_v, A^x_u, A^x_v##

The equation ##g_{\alpha' \beta'} = A^\alpha_{\alpha'} A^\beta_{\beta'} g_{\alpha \beta}## gives you 4 equations:
1. ##g_{uu} = A^t_u A^t_u g_{tt} + A^t_u A^x_u g_{tx} + A^x_u A^t_u g_{xt} + A^x_u A^x_u g_{xx}##
2. ##g_{uv} = A^t_u A^t_v g_{tt} + A^t_u A^x_v g_{tx} + A^x_u A^t_v g_{xt} + A^x_u A^x_v g_{xx}##
3. ##g_{vu} = A^t_v A^t_u g_{tt} + A^t_v A^x_u g_{tx} + A^x_v A^t_u g_{xt} + A^x_v A^x_u g_{xx}##
4. ##g_{vv} = A^t_v A^t_v g_{tt} + A^t_v A^x_v g_{tx} + A^x_v A^t_v g_{xt} + A^x_v A^x_v g_{xx}##
You know
1. ##g_{xx} = 1##
2. ##g_{xt} = g_{tx} = 0##
3. ##g_{tt} = -1## (I'm assuming this is the convention you're using?)
4. ##g_{uu} = 0##
5. ##g_{uv} = g_{vu} = 1##
6. ##g_{vv} = 0##
Yes, the equations are quadratic, so that means that there are multiple solutions. Just come up with one solution.
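For concreteness, here is a numerical check of one candidate that satisfies all of the equations above. The specific choice ##t = (u - v)/\sqrt{2}##, ##x = (u + v)/\sqrt{2}##, ##y = w##, ##z = p## is my own null-coordinate guess, not taken from the thread; other solutions differ from it by a Lorentz transformation.

```python
import numpy as np

# One candidate solution (an assumption -- the thread leaves the final choice
# to the reader): t = (u - v)/sqrt(2), x = (u + v)/sqrt(2), y = w, z = p.
s = 1 / np.sqrt(2)
A = np.array([[s, -s, 0, 0],   # row t: coefficients of (u, v, w, p)
              [s,  s, 0, 0],   # row x
              [0,  0, 1, 0],   # row y
              [0,  0, 0, 1]])  # row z

eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric in (t, x, y, z)
gprime = np.array([[0, 1, 0, 0],       # target metric in (u, v, w, p)
                   [1, 0, 0, 0],
                   [0, 0, 1, 0],
                   [0, 0, 0, 1.0]])

# g_{a'b'} = A^a_{a'} A^b_{b'} g_{ab}  is the matrix identity  A^T eta A = g'.
assert np.allclose(A.T @ eta @ A, gprime)
```

In particular the ##u## column ##(1/\sqrt{2}, 1/\sqrt{2}, 0, 0)## is null, reproducing ##g_{uu} = 0## from equation 1.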

Nice. Thanks a bunch. I was just not entirely sure that I was allowed to make that assumption; with it, the problem simplifies greatly. In retrospect, the book did ask to refer to a previous problem from another chapter, which was basically the same but restricted to ##t## and ##x##.