Differentiation with respect to covariant component of a vector

Telemachus
I want to prove that differentiation with respect to a covariant component gives a contravariant vector operator. I'm following Jackson's Classical Electrodynamics. He first shows that differentiation with respect to a contravariant component of the coordinate vector transforms as the component of a covariant vector operator.

For this he implicitly uses a change of variables, ##\displaystyle x^{\beta}=x^{\beta}(x'^{\alpha})## (the unprimed coordinates as functions of the primed ones). So, using the rule for implicit differentiation:

##\displaystyle \frac{\partial}{\partial x'^{\alpha}}= \frac{\partial x^{\beta}}{\partial x'^{\alpha}} \frac{\partial}{\partial x^{\beta}}##
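For comparison, the defining transformation law of a covariant vector (primes denoting the new frame) is

$$A'_{\alpha}=\frac{\partial x^{\beta}}{\partial x'^{\alpha}}A_{\beta},$$

so the derivative above picks up exactly the Jacobian factor of a covariant component, which is why it is said to transform as a covariant vector operator.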

Now I want to show that ##\displaystyle \frac{\partial}{\partial x'_{\alpha}}=\frac{\partial x'^{\alpha}}{\partial x^{\beta}} \frac{\partial}{\partial x_{\beta}}##. I think this is the expression I should find, but I'm not sure; Jackson doesn't give it explicitly.

He suggests using ##x_{\alpha}=g_{\alpha \beta}x^{\beta}##, where ##g## is the metric tensor.
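One way the hint can be used (a minimal sketch, assuming the metric components are constant, as they are for the Minkowski metric of special relativity): inverting the hint gives ##x^{\beta}=g^{\beta\alpha}x_{\alpha}##, so

$$\frac{\partial}{\partial x_{\alpha}}=\frac{\partial x^{\beta}}{\partial x_{\alpha}}\frac{\partial}{\partial x^{\beta}}=g^{\alpha\beta}\frac{\partial}{\partial x^{\beta}},$$

i.e. ##\partial/\partial x_{\alpha}## is just the index-raised version of ##\partial/\partial x^{\beta}##. Since the latter transforms as a covariant vector operator and raising an index with the metric turns a covariant quantity into a contravariant one, the contravariant transformation law follows.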


I need some guidance for this.

Thanks in advance.
 
OK, I worked it this way. I supposed I had a scalar function of ##x_{\alpha}##, ##f(x_{\alpha})##. And because ##x_{\alpha}=g_{\alpha \beta}x^{\beta}##, I have ##x_{\alpha}=x_{\alpha}(x^{\beta})##.

Then I differentiated f:

##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x^{\beta}}=g_{\alpha \beta}\frac{\partial f}{\partial x_{\alpha}}## Here I've used that the metric tensor is independent of the coordinates. I'm not sure whether that's right in general, but I'm working in special relativity and I think it holds in this particular case.

Then I have

##\displaystyle \frac{\partial }{\partial x_{\alpha}}=g_{\alpha \beta}\frac{\partial }{\partial x_{\alpha}}##

This isn't the transformation law I'm looking for. Anyway, it seems intuitive that if differentiation with respect to a contravariant component of the coordinate vector transforms as the component of a covariant vector operator, then differentiation with respect to a covariant component should transform as a contravariant vector, because there must be some sort of symmetry between the space and its dual space.
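As a quick sanity check of that intuition (my own example, not from Jackson), take the invariant ##f=x_{\mu}x^{\mu}=g^{\mu\nu}x_{\mu}x_{\nu}## with the constant Minkowski metric. Then

$$\frac{\partial f}{\partial x_{\alpha}}=2g^{\alpha\nu}x_{\nu}=2x^{\alpha},$$

an object carrying an upper index, just as the duality argument suggests.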
 
Another way would be just to consider a change of variables ##x_{\alpha}=x_{\alpha}(x_{\beta})## and a function ##f(x_{\alpha})##, and differentiate using the chain rule.

Then I get the desired result, but it is not clear to me why in the first case, when differentiating with respect to a contravariant component, I should use implicit differentiation, while in this other case I get the result just by direct differentiation using the chain rule.

I was wrong; I don't get the desired result. I get: ##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}##

And I should have: ##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\beta}} \frac{\partial x^{\alpha}}{\partial x^{\beta}}##
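Writing the new frame with primes to keep the two coordinate systems distinct, what connects the two expressions above is the identity (a sketch, assuming a constant metric and a transformation that preserves it, i.e. a Lorentz transformation)

$$\frac{\partial x_{\beta}}{\partial x'_{\alpha}}=g_{\beta\gamma}\,\frac{\partial x^{\gamma}}{\partial x'^{\delta}}\,g^{\delta\alpha}=\frac{\partial x'^{\alpha}}{\partial x^{\beta}}.$$

The first equality just moves the indices with the constant metric; the second uses the fact that the transformation leaves ##g_{\alpha\beta}## unchanged. Inserting this into the chain rule ##\partial/\partial x'_{\alpha}=(\partial x_{\beta}/\partial x'_{\alpha})\,\partial/\partial x_{\beta}## gives exactly the contravariant transformation law.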
 
I think I was making bad use of the chain rule. So here I go again: ##x_{\alpha}=x_{\alpha}(x_{\beta})##, ##f=f(x_{\alpha})##.

##\displaystyle \frac{df}{dx_{\beta}}=\frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}##

Now I've used that ##g_{\alpha \beta}=g_{\beta \alpha}## and that ##dx^{\alpha}=g^{\alpha \beta}dx_{\beta}##

Then:

##\displaystyle \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}=\frac{g^{\beta \alpha}}{g^{\alpha \beta}}\frac{\partial x_{\alpha}}{\partial x_{\beta}} \frac{\partial f}{\partial x_{\alpha}}=\frac{\partial x^{\beta}}{\partial x^{\alpha}}\frac{\partial f}{\partial x_{\alpha}}##

So that

##\displaystyle \frac{\partial }{\partial x_{\beta}}= \frac{\partial x^{\beta}}{\partial x^{\alpha}} \frac{\partial }{\partial x_{\alpha}}##

As I wanted to show.

EDIT: I see I have a problem here too: I have repeated indices more than twice, so this approach is wrong too.
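Whatever the cleanest index bookkeeping turns out to be, the identity these attempts keep circling around, ##\partial x_{\beta}/\partial x'_{\alpha}=\partial x'^{\alpha}/\partial x^{\beta}##, can be checked concretely. Here is a minimal sympy sketch for a pure boost along x (the use of sympy, the (+,-,-,-) signature, and the rapidity parametrisation are my choices, not from the thread):

```python
import sympy as sp

# Minkowski metric, signature (+,-,-,-), and a boost along x with rapidity chi.
chi = sp.symbols('chi', real=True)
g = sp.diag(1, -1, -1, -1)

# Jacobian of the coordinate change: Lam[a, b] = dx'^a / dx^b.
Lam = sp.Matrix([
    [sp.cosh(chi), -sp.sinh(chi), 0, 0],
    [-sp.sinh(chi), sp.cosh(chi), 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
])
# Inverse Jacobian dx^a / dx'^b: the same boost with opposite rapidity.
Lam_inv = Lam.subs(chi, -chi)

# Sanity checks: Lam_inv really is the inverse, and the boost preserves the metric.
assert sp.simplify(Lam * Lam_inv) == sp.eye(4)
assert sp.simplify(Lam.T * g * Lam) == g

# dx_b / dx'_a = g_{bc} (dx^c / dx'^d) g^{da}: lower/raise the indices of the
# inverse Jacobian with the metric ...
lowered = g * Lam_inv * g.inv()
# ... and compare with dx'^a / dx^b, which as a matrix indexed by (b, a) is Lam.T.
assert sp.simplify(lowered) == Lam.T

print("dx_beta/dx'_alpha = dx'^alpha/dx^beta holds for this boost")
```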
 