Differentiation with respect to covariant component of a vector

Homework Help Overview

The discussion revolves around the differentiation with respect to covariant components of a vector, specifically in the context of proving that such differentiation yields a contravariant vector operator. The original poster references Jackson's Classical Electrodynamics and explores the transformation properties of differentiation with respect to contravariant and covariant components.

Discussion Character

  • Conceptual clarification, mathematical reasoning, assumption checking

Approaches and Questions Raised

  • Participants discuss the use of implicit differentiation and the chain rule in the context of differentiating scalar functions of covariant components. There is exploration of the relationship between covariant and contravariant components, particularly regarding the symmetry between space and its dual space.

Discussion Status

Participants are actively engaging with the problem, attempting various approaches to derive the transformation properties of differentiation. Some have expressed uncertainty about their reasoning and the application of the chain rule, while others are questioning the assumptions made regarding the metric tensor and its independence from coordinates.

Contextual Notes

There is mention of working within the framework of special relativity, which may impose specific constraints on the assumptions being discussed. Additionally, participants note issues with repeated indices and the implications of using implicit versus direct differentiation methods.

Telemachus
I want to prove that differentiation with respect to a covariant component gives a contravariant vector operator. I'm following Jackson's Classical Electrodynamics. First, he shows that differentiation with respect to a contravariant component of the coordinate vector transforms as the component of a covariant vector operator.

For this, he implicitly uses a change of variables: ##\displaystyle x'^{\alpha}=x'^{\alpha} (x^{ \beta } )## (primes denoting the new frame). So, using the rule for implicit differentiation:

##\displaystyle \frac{ \partial}{ \partial x'^{ \alpha} }= \frac{ \partial x^{\beta} }{ \partial x'^{\alpha} } \frac{\partial}{ \partial x^{ \beta} }##

Now I want to show that: ##\displaystyle \frac{\partial}{\partial x'_{\alpha}}=\frac{\partial x'^{\alpha}}{\partial x^{\beta}} \frac{\partial}{\partial x_{ \beta}}##. I think this is the expression I should find, but I'm not sure; Jackson doesn't give it explicitly.

He suggests to use that ##x_{\alpha}=g_{\alpha \beta}x^{\beta}##, g is the metric tensor.
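A sketch of how that hint might be used, assuming the metric is constant (as it is in special relativity), so that the relation between the two kinds of components can simply be inverted with the inverse metric ##g^{\alpha\beta}##:

```latex
% With a constant metric, x_\alpha = g_{\alpha\beta} x^\beta inverts to
x^{\beta} = g^{\beta\alpha} x_{\alpha}
\qquad\Longrightarrow\qquad
\frac{\partial x^{\beta}}{\partial x_{\alpha}} = g^{\beta\alpha} ,
% so the chain rule relates the two derivative operators:
\frac{\partial}{\partial x_{\alpha}}
  = \frac{\partial x^{\beta}}{\partial x_{\alpha}}\,
    \frac{\partial}{\partial x^{\beta}}
  = g^{\alpha\beta}\,\frac{\partial}{\partial x^{\beta}} .
```

In other words, the covariant-component derivative is the index-raised version of the contravariant-component derivative.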


I need some guidance for this.

Thanks in advance.
 
Ok, I worked it this way. I supposed I had a scalar function of the covariant components, ##f(x_{\alpha})##. Because ##x_{\alpha}=g_{\alpha \beta}x^{\beta}##, I have ##x_{\alpha}=x_{\alpha}(x^{\beta})##.

Then I differentiated f:

##\displaystyle \frac{df}{dx^{\beta}}= \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x^{\beta}}=g_{\alpha \beta}\frac{\partial f}{\partial x_{\alpha}}##. I've used that the metric tensor is independent of the coordinates; I'm not sure if that's right in general, but I'm working with special relativity and I think it holds in this particular case.

Then I have

##\displaystyle \frac{\partial }{\partial x^{\beta}}=g_{\alpha \beta}\frac{\partial }{\partial x_{\alpha}}##

This isn't the transformation law I'm looking for. Anyway, it seems intuitive that it should hold: if differentiation with respect to a contravariant component of the coordinate vector transforms as the component of a covariant vector operator, then differentiation with respect to a covariant component should transform as a contravariant vector, because there must be some sort of symmetry between the space and its dual space.
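That symmetry can be made precise, assuming a constant metric and a metric-preserving (Lorentz) change of coordinates; the primed frame here is my own notation for the transformed coordinates:

```latex
% Raise the index with the (constant) metric, use the known covariant
% transformation of \partial/\partial x^\beta, then lower the dummy index:
\frac{\partial}{\partial x'_{\alpha}}
  = g^{\alpha\beta}\,\frac{\partial}{\partial x'^{\beta}}
  = g^{\alpha\beta}\,\frac{\partial x^{\gamma}}{\partial x'^{\beta}}\,
    \frac{\partial}{\partial x^{\gamma}}
  = g^{\alpha\beta}\,\frac{\partial x^{\gamma}}{\partial x'^{\beta}}\,
    g_{\gamma\delta}\,\frac{\partial}{\partial x_{\delta}}
  = \frac{\partial x'^{\alpha}}{\partial x^{\delta}}\,
    \frac{\partial}{\partial x_{\delta}} ,
% where the last equality uses that Lorentz transformations preserve the
% metric (\Lambda^{T} g \Lambda = g): raising and lowering both indices of
% the inverse Jacobian gives back the forward Jacobian.
```

The final expression is exactly the contravariant transformation law.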
 
Another way would be to just consider a change of variables ##x_{\alpha}=x_{\alpha}(x_{\beta})## and a function ##f(x_{\alpha})##, and differentiate using the chain rule.

Then I get the desired result, but it is not clear to me why, in the first case, when differentiating with respect to a contravariant component, I should use implicit differentiation, while in this other case I get the result just by direct differentiation and the chain rule.

I was wrong, I don't get the desired result, I get: ##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}##

And I should have: ##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\beta}} \frac{\partial x^{\alpha}}{\partial x^{\beta}}##
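One thing that might resolve the implicit-versus-direct puzzle: both relations describe the same change of frame, just written through different components, so the chain rule has the same shape in each case (primes are my notation for the new frame):

```latex
% Same coordinate change, expressed two ways; the chain rule is symmetric:
\frac{\partial}{\partial x'^{\alpha}}
  = \frac{\partial x^{\beta}}{\partial x'^{\alpha}}\,
    \frac{\partial}{\partial x^{\beta}} ,
\qquad
\frac{\partial}{\partial x'_{\alpha}}
  = \frac{\partial x_{\beta}}{\partial x'_{\alpha}}\,
    \frac{\partial}{\partial x_{\beta}} .
% What remains is to show that, for metric-preserving transformations,
% \partial x_{\beta}/\partial x'_{\alpha} = \partial x'^{\alpha}/\partial x^{\beta},
% which is what makes the second operator contravariant.
```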
 
I think I was making bad use of the chain rule. So, here I go again: ##x_{\alpha}=x_{\alpha}(x_{\beta})##, ##f=f(x_{\alpha})##.

##\displaystyle \frac{df}{dx_{\beta}}=\frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}##

Now I've used that ##g_{\alpha \beta}=g_{\beta \alpha}## and that ##dx^{\alpha}=g^{\alpha \beta}dx_{\beta}##

Then:

##\displaystyle \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}=\frac{g^{\beta \alpha}}{g^{\alpha \beta}}\frac{\partial x_{\alpha}}{\partial x_{\beta}} \frac{\partial f}{\partial x_{\alpha}}=\frac{\partial x^{\beta}}{\partial x^{\alpha}}\frac{\partial f}{\partial x_{\alpha}}##

So that

##\displaystyle \frac{\partial }{\partial x_{\beta}}= \frac{\partial x^{\beta}}{\partial x^{\alpha}} \frac{\partial }{\partial x_{\alpha}}##

As I wanted to show.

EDIT: I see I have a problem here too: I have repeated indices more than twice, so this approach is wrong as well.
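As a numerical sanity check on the identity the thread is circling, in its constant-metric form ##\frac{\partial f}{\partial x_{\alpha}}=g^{\alpha\beta}\frac{\partial f}{\partial x^{\beta}}##, here is a minimal sketch using the Minkowski metric; the test function `f_up` is made up purely for illustration:

```python
import numpy as np

# Minkowski metric, signature (+,-,-,-); constant, so x_a = g_ab x^b is linear.
g = np.diag([1.0, -1.0, -1.0, -1.0])
g_inv = np.linalg.inv(g)  # inverse metric g^{ab} (equals g here)

def f_up(x_up):
    """Hypothetical scalar function of the contravariant components x^a."""
    x_dn = g @ x_up                      # lower the index: x_a = g_ab x^b
    return x_up @ x_dn + x_up[0] * x_up[1]

def f_dn(x_dn):
    """The same scalar viewed as a function of the covariant components x_a."""
    return f_up(g_inv @ x_dn)            # raise the index: x^a = g^{ab} x_b

def num_grad(func, x, h=1e-6):
    """Central-difference gradient of a scalar function."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (func(x + e) - func(x - e)) / (2.0 * h)
    return grad

x_up = np.array([1.0, 2.0, -0.5, 3.0])  # arbitrary event coordinates x^a
x_dn = g @ x_up                          # corresponding covariant components

d_up = num_grad(f_up, x_up)  # components of df/dx^a (a covariant vector)
d_dn = num_grad(f_dn, x_dn)  # components of df/dx_a (should be contravariant)

# The two gradients should be related by the inverse metric:
# df/dx_a = g^{ab} df/dx^b
print(np.allclose(d_dn, g_inv @ d_up))  # prints True
```

Differentiating with respect to the lowered components just raises the index of the gradient, which is the constant-metric statement of the result being sought.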
 