This is really more maths than physics, but I think that's the sort of thing only physicists know...
Homework Statement
I need to show that although V^{\beta}_{,\alpha} and V^{\mu} \Gamma^{\beta}_{\mu \alpha} don't transform like tensors, their sum, the covariant derivative, does.
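To fix notation, by "the covariant derivative" here I mean the combination
V^{\beta}_{;\alpha} \equiv V^{\beta}_{,\alpha} + V^{\mu} \Gamma^{\beta}_{\mu \alpha}
so the claim is that V^{\beta}_{;\alpha} transforms like a (1,1) tensor.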
Homework Equations
Regular transformation laws for tensors: T^{\alpha' \beta'}_{\gamma'\delta'} = \Lambda^{\alpha'}_{\alpha} \Lambda^{\beta'}_{\beta} \Lambda^{\gamma}_{\gamma'} \Lambda^{\delta}_{\delta'} T^{\alpha\beta}_{\gamma\delta}
Also: T_{,\alpha} \equiv \frac{\partial T}{\partial x^{\alpha}}
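I'm also using the transformation of the vector components themselves, with \Lambda denoting the Jacobian of the coordinate change (these are the conventions I'm assuming):
V^{\beta'} = \Lambda^{\beta'}_{\beta} V^{\beta}, \qquad \Lambda^{\beta'}_{\beta} \equiv \frac{\partial x^{\beta'}}{\partial x^{\beta}}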
The Attempt at a Solution
This is really lengthy, so I'll post only what I've gotten after trying to transform the terms in the question (which takes long enough on its own).
I cannot show that their sum transforms like a tensor, though!
What I get:
1. V^{\beta'}_{,\alpha'} = \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} V^{\beta}_{,\alpha} + \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta'}_{\beta, \alpha} V^{\beta}
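For what it's worth, (1) came straight from the product rule, so I don't think the problem is there:
V^{\beta'}_{,\alpha'} = \Lambda^{\alpha}_{\alpha'} \frac{\partial}{\partial x^{\alpha}} \left( \Lambda^{\beta'}_{\beta} V^{\beta} \right) = \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} V^{\beta}_{,\alpha} + \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta'}_{\beta, \alpha} V^{\beta}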
2. V^{\mu'}\Gamma^{\beta'}_{\mu'\alpha'} = \Lambda^{\mu'}_{\nu}V^{\nu} ( \Lambda^{\beta'}_{\beta} \Lambda^{\mu}_{\mu'} \Lambda^{\alpha}_{\alpha'} \Gamma^{\beta}_{\mu\alpha} + \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta}_{\mu', \alpha} )
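Here I plugged in the transformation law I'm using for the Christoffel symbols (written out before contracting with V^{\mu'}, in case my starting point is already off):
\Gamma^{\beta'}_{\mu'\alpha'} = \Lambda^{\beta'}_{\beta} \Lambda^{\mu}_{\mu'} \Lambda^{\alpha}_{\alpha'} \Gamma^{\beta}_{\mu\alpha} + \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta}_{\mu', \alpha}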
Now, assuming I handled the second expression properly, after some manipulations and use of \Lambda^{\mu'}_{\nu} \Lambda^{\mu}_{\mu'} = \delta^{\mu}_{\nu} I get:
2'. V^{\mu'}\Gamma^{\beta'}_{\mu'\alpha'} = \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} V^{\mu}\Gamma^{\beta}_{\mu\alpha} + \Lambda^{\alpha}_{\alpha'} V^{\nu} \Lambda^{\beta'}_{\nu, \alpha}
Now, it is clear that when we add these two equations, 1 and 2', the first terms are exactly the tensor transformation we want.
That means the remaining terms need to somehow cancel out, but I don't see how or why that happens. Actually, the "extra parts" seem to be equal, so according to my "calculations" they add up rather than cancel. I cannot see where a minus sign could come from.
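To make it explicit, the leftover pieces of (1) and (2') are, assuming my algebra above is right (which is exactly what I'm doubting):
\Lambda^{\alpha}_{\alpha'} \Lambda^{\beta'}_{\beta, \alpha} V^{\beta} + \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta'}_{\nu, \alpha} V^{\nu} = 2 \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta'}_{\beta, \alpha} V^{\beta}
since \nu is just a dummy index. For the covariant derivative to transform like a tensor, this combination should vanish instead.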
I'm either missing something here, or I did something wrong.
I'm sorry for not posting the whole derivation; it would just take forever... I'd really appreciate an answer!
Tomer.