# Tensor Exercise

1. Feb 8, 2012

### Tomer

This is really more maths than physics, but I think that's the sort of thing only physicists know...

1. The problem statement, all variables and given/known data
I need to show that although $V^{\beta}_{,\alpha}$ and $V^{\mu} \Gamma^{\beta}_{\mu \alpha}$ don't transform like tensors, their sum, the covariant derivative, does.

2. Relevant equations

Regular transformation laws for tensors: $T^{\alpha' \beta'}_{\gamma'\delta'} = \Lambda^{\alpha'}_{\alpha} \Lambda^{\beta'}_{\beta} \Lambda^{\gamma}_{\gamma'} \Lambda^{\delta}_{\delta'} T^{\alpha\beta}_{\gamma\delta}$

Also: $T_{,\alpha} \equiv \frac{\partial T}{\partial x^{\alpha}}$

3. The attempt at a solution

This is really lengthy, so I'll only post what I got after transforming the two terms in the question (which takes long enough).
I cannot show that their sum transforms like a tensor, though!

What I get:
1. $V^{\beta'}_{,\alpha'} = \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} V^{\beta}_{,\alpha} + \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta'}_{\beta, \alpha} V^{\beta}$

2. $V^{\mu'}\Gamma^{\beta'}_{\mu'\alpha'} = \Lambda^{\mu'}_{\nu}V^{\nu} ( \Lambda^{\beta'}_{\beta} \Lambda^{\mu}_{\mu'} \Lambda^{\alpha}_{\alpha'} \Gamma^{\beta}_{\mu\alpha} + \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} \Lambda^{\beta}_{\mu', \alpha})$

Now, assuming that I properly played with the second expression, I get after some manipulations and usage of $\Lambda^{\mu'}_{\nu} \Lambda^{\mu}_{\mu'} = \delta^{\mu}_{\nu}$:

2'. $V^{\mu'}\Gamma^{\beta'}_{\mu'\alpha'} = \Lambda^{\beta'}_{\beta} \Lambda^{\alpha}_{\alpha'} V^{\mu}\Gamma^{\beta}_{\mu\alpha} + \Lambda^{\alpha}_{\alpha'} V^{\nu} \Lambda^{\beta'}_{\nu, \alpha}$

Now, it is clear that when we add these two equations, 1 and 2', the first parts are exactly the tensor transformations we want.
That means that the two extra terms have to cancel each other out somehow, but I don't see how or why that happens. Actually, the "extra parts" seem to be equal, so according to my "calculations" they add up instead of cancelling. I can't see where a minus sign could come from.
I'm either missing something here, or I did something wrong.
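(A concrete way to see the claim, added as a sanity check: on the flat plane in polar coordinates, a CAS can verify that the partial derivative alone fails to transform tensorially while the full covariant derivative succeeds. The coordinates and the vector field below are my own choices, not from the post; this is a sketch with sympy, not part of the derivation asked about.)

```python
import sympy as sp

# Sanity check on the flat plane: the partial derivative alone does not
# transform as a tensor, but partial derivative + Gamma term does.
r, th = sp.symbols('r theta', positive=True)
x, y = r*sp.cos(th), r*sp.sin(th)           # Cartesian in terms of polar
coords = [r, th]                            # the "primed" coordinates

# Jacobians: Linv[b, b'] = dx^b/dx'^{b'},  L[b', b] = dx'^{b'}/dx^b
Linv = sp.Matrix([[sp.diff(x, r), sp.diff(x, th)],
                  [sp.diff(y, r), sp.diff(y, th)]])
L = sp.simplify(Linv.inv())

# A vector field via its Cartesian components V^b, then its polar components
Vcart = sp.Matrix([x, x*y])                 # arbitrary choice of field
Vpol = sp.simplify(L * Vcart)               # V^{b'} = L^{b'}_b V^b

# Christoffel symbols of the flat metric in polar coordinates:
# Gamma^r_{th th} = -r, Gamma^th_{r th} = Gamma^th_{th r} = 1/r (zero in Cartesian)
Gam = [[[0, 0], [0, 0]], [[0, 0], [0, 0]]]  # Gam[b][m][a] = Gamma^b_{m a}
Gam[0][1][1] = -r
Gam[1][0][1] = Gam[1][1][0] = 1/r

# Covariant derivative computed directly in polar coordinates
Dpol = sp.Matrix(2, 2, lambda b, a:
    sp.diff(Vpol[b], coords[a]) + sum(Gam[b][m][a]*Vpol[m] for m in range(2)))

# In Cartesian coordinates the covariant derivative is just V^b_{,a};
# chain rule, since Vcart is written as a function of (r, theta)
Dcart = sp.Matrix(2, 2, lambda b, a:
    sum(sp.diff(Vcart[b], coords[c]) * L[c, a] for c in range(2)))

# Transform the Cartesian result as a (1,1) tensor and compare
Dtrans = sp.simplify(L * Dcart * Linv)      # L^{b'}_b D^b_a dx^a/dx'^{a'}
print(sp.simplify(Dtrans - Dpol))           # zero matrix: the sum is tensorial

# The partial derivative alone fails the same comparison
Ppol = sp.Matrix(2, 2, lambda b, a: sp.diff(Vpol[b], coords[a]))
print(sp.simplify(Dtrans - Ppol))           # nonzero: leftover Gamma terms
```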

I'm sorry for not posting the whole derivation, it would just take forever... I'd really, really appreciate an answer!

Tomer.

Last edited: Feb 8, 2012
2. Feb 8, 2012

### Fredrik

Staff Emeritus
Looks like when you "played with the second expression" you tried to use the product rule like this: $f'g = (fg)' - fg'$,
...and either missed the minus sign completely, or put it on the term that's $=0$ anyway.

3. Feb 8, 2012

### Tomer

I'm not sure what you mean.
By playing with it, I meant using relations like:

$\Lambda^{\beta'}_{\beta} \Lambda^{\beta}_{\mu',\alpha} = \Lambda^{\beta'}_{\mu',\alpha}$
Or is that wrong?
It now occurs to me that I might have assumed that $\Lambda^{\beta}_{\mu',\alpha}$ transforms like a tensor where it actually doesn't. Could this be it?
But then how do I simplify the second expression? It's so confusing :-\

For a scalar function I know that $f_{,\alpha}=f_{;\alpha}$, and therefore $f_{,\alpha}$ is a tensor. Can I not claim the same about a component of a tensor, which is also a scalar function? Or does that sound really dumb?

And thanks a lot for the reply, btw.

Edit: ok, it is dumb. A tensor component is obviously not a scalar, since it depends on the basis. But then do you have hints on how to simplify the sum of the extra terms? There are so many indices I'm going blind.

UPDATE: Ok!! I did it using your hint! (Well, it was more your guess about what I had done, which I actually hadn't!)

I used the fact that:
$\Lambda^{\beta'}_{\beta} \Lambda^{\beta}_{\mu',\alpha} = (\Lambda^{\beta'}_{\beta} \Lambda^{\beta}_{\mu'})_{,\alpha} - \Lambda^{\beta'}_{\beta,\alpha} \Lambda^{\beta}_{\mu'} = \delta^{\beta'}_{\mu',\alpha}-\Lambda^{\beta'}_{\beta,\alpha} \Lambda^{\beta}_{\mu'} = -\Lambda^{\beta'}_{\beta,\alpha} \Lambda^{\beta}_{\mu'}$
Which, after some work, turns out to be exactly the negative of the "extra term" in 1. I hope it's correct now, and I understand my (terrible) mistake.
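(Spelling out that "some work", as a reconstruction of the skipped step: substitute the identity into the unsimplified extra term of 2 and contract $\Lambda^{\beta}_{\mu'}\Lambda^{\mu'}_{\nu} = \delta^{\beta}_{\nu}$:

$$\Lambda^{\mu'}_{\nu}V^{\nu}\,\Lambda^{\beta'}_{\beta}\,\Lambda^{\alpha}_{\alpha'}\,\Lambda^{\beta}_{\mu',\alpha} = -\Lambda^{\mu'}_{\nu}V^{\nu}\,\Lambda^{\alpha}_{\alpha'}\,\Lambda^{\beta'}_{\beta,\alpha}\,\Lambda^{\beta}_{\mu'} = -\Lambda^{\alpha}_{\alpha'}\,\Lambda^{\beta'}_{\beta,\alpha}\,\delta^{\beta}_{\nu}V^{\nu} = -\Lambda^{\alpha}_{\alpha'}\,\Lambda^{\beta'}_{\beta,\alpha}\,V^{\beta},$$

which is exactly minus the extra term in 1, so the two cancel in the sum.)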

Thanks a lot Fredrik.

Last edited: Feb 8, 2012
4. Feb 8, 2012

### Fredrik

Staff Emeritus
Sorry, I was away from the computer for a while after I made that post. Yes, what you did there at the end is exactly what I thought you had tried to do. I didn't check all the details, but since you're getting the result you want, it's probably correct now.
