# Differentiation with respect to a covariant component of a vector

• Telemachus
In summary, the thread works through the proof that differentiation with respect to a covariant component gives a contravariant vector operator. The proof is attempted using a change of variables and the chain rule for implicit differentiation. The first attempts do not yield the desired result, and alternative approaches are tried. Ultimately, the law of transformation obtained is ##\displaystyle \frac{\partial}{\partial x_{\beta}}= \frac{\partial x^{\beta}}{\partial x^{\alpha}} \frac{\partial}{\partial x_{\alpha}}##.
Telemachus
I want to prove that differentiation with respect to a covariant component gives a contravariant vector operator. I'm following Jackson's Classical Electrodynamics. First, he shows that differentiation with respect to a contravariant component of the coordinate vector transforms as the component of a covariant vector operator.

For this, he implicitly uses a change of variables, ##\displaystyle x^{\alpha}=x^{\alpha} (x^{ \beta } )##, so by the chain rule for implicit differentiation:

##\displaystyle \frac{ \partial}{ \partial x^{ \alpha} }= \frac{ \partial x^{\beta} }{ \partial x^{\alpha} } \frac{\partial}{ \partial x^{ \beta} }##
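As a sanity check on this covariant rule (my own sketch, not from Jackson), it can be verified numerically for an arbitrary linear change of variables ##x'^{\alpha}=L^{\alpha}{}_{\beta}x^{\beta}##; the matrix ##L##, the scalar function, and the finite-difference step below are all assumptions of the example:

```python
import numpy as np

# Numerical check that the gradient with respect to contravariant
# coordinates transforms covariantly: d/dx'^a = (dx^b/dx'^a) d/dx^b
# for a linear change of variables x' = L x.

rng = np.random.default_rng(0)

# An arbitrary invertible linear transformation x'^a = L^a_b x^b.
L = rng.normal(size=(4, 4)) + 4 * np.eye(4)
Linv = np.linalg.inv(L)          # dx^b/dx'^a = (L^-1)^b_a

def f(x):
    # An arbitrary smooth scalar function of the coordinates x^a.
    return np.sin(x[0]) + x[1] * x[2] ** 2 + np.exp(0.1 * x[3])

def grad(func, x, h=1e-6):
    # Central-difference gradient d(func)/d x^a.
    g = np.zeros(4)
    for a in range(4):
        e = np.zeros(4); e[a] = h
        g[a] = (func(x + e) - func(x - e)) / (2 * h)
    return g

x = rng.normal(size=4)
xp = L @ x                              # primed coordinates

fp = lambda xp_: f(Linv @ xp_)          # the same scalar expressed in x'
lhs = grad(fp, xp)                      # d f / d x'^a
rhs = Linv.T @ grad(f, x)               # (dx^b/dx'^a) d f / d x^b

print(np.allclose(lhs, rhs, atol=1e-5))   # True: transforms covariantly
```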

Now I want to show that ##\displaystyle \frac{\partial}{\partial x_{\alpha}}=\frac{\partial x^{\alpha}}{\partial x^{\beta}} \frac{\partial}{\partial x_{ \beta}}##. I think this is the expression I should find, but I'm not sure; Jackson doesn't give it explicitly.

He suggests using ##x_{\alpha}=g_{\alpha \beta}x^{\beta}##, where ##g_{\alpha \beta}## is the metric tensor.
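In special relativity this lowering is concrete: with the Minkowski metric ##g=\mathrm{diag}(1,-1,-1,-1)## (Jackson's signature), ##x_{\alpha}=g_{\alpha \beta}x^{\beta}## just flips the sign of the spatial components. A minimal numerical sketch (the sample vector is an arbitrary choice):

```python
import numpy as np

# Index lowering x_a = g_ab x^b with the Minkowski metric,
# signature (+,-,-,-).
g = np.diag([1.0, -1.0, -1.0, -1.0])

x_contra = np.array([2.0, 1.0, -3.0, 0.5])   # x^a = (ct, x, y, z)
x_co = g @ x_contra                           # x_a = g_ab x^b

print(x_co)   # [ 2.  -1.   3.  -0.5] : spatial components flip sign

# The inverse metric g^ab raises the index back.
g_inv = np.linalg.inv(g)                      # equals g for Minkowski
print(np.allclose(g_inv @ x_co, x_contra))    # True
```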

I need some guidance for this.

Ok, I worked it this way. I supposed I had a scalar function of ##x_{\alpha}##, ##f(x_{\alpha})##. And because ##x_{\alpha}=g_{\alpha \beta}x^{\beta}## I have that ##x_{\alpha}=x_{\alpha}(x^{\beta})##.

Then I differentiated f:

##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x^{\beta}}=g_{\alpha \beta}\frac{\partial f}{\partial x_{\alpha}}## Here I've used that the metric tensor is independent of the coordinates. I'm not sure that's right in general, but I'm working in special relativity and I think it holds in this particular case.

Then I have

##\displaystyle \frac{\partial }{\partial x_{\alpha}}=g_{\alpha \beta}\frac{\partial }{\partial x_{\alpha}}##

This isn't the law of transformation I'm looking for. Still, it seems intuitive that it should hold: if differentiation with respect to a contravariant component of the coordinate vector transforms as the component of a covariant vector operator, then differentiation with respect to a covariant component should transform as a contravariant vector, since there should be some sort of symmetry between the space and its dual.

Another way would be just to consider a change of variables ##x_{\alpha}=x_{\alpha}(x_{\beta})## and a function ##f(x_{\alpha})##, and differentiate using the chain rule.

Then I get the desired result, but it is not clear to me why, in the first case, differentiating with respect to a contravariant component required implicit differentiation, while in this case the result follows from direct differentiation and the chain rule.

I was wrong, I don't get the desired result, I get: ##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}##

And I should have: ##\displaystyle \frac{df}{dx_{\alpha}}= \frac{\partial f}{\partial x_{\beta}} \frac{\partial x^{\alpha}}{\partial x^{\beta}}##

I think I was misusing the chain rule. So, here I go again: ##x_{\alpha}=x_{\alpha}(x_{\beta})##, ##f=f(x_{\alpha})##.

##\displaystyle \frac{df}{dx_{\beta}}=\frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}##

Now I've used that ##g_{\alpha \beta}=g_{\beta \alpha}## and that ##dx^{\alpha}=g^{\alpha \beta}dx_{\beta}##

Then:

##\displaystyle \frac{\partial f}{\partial x_{\alpha}} \frac{\partial x_{\alpha}}{\partial x_{\beta}}=\frac{g^{\beta \alpha}}{g^{\alpha \beta}}\frac{\partial x_{\alpha}}{\partial x_{\beta}} \frac{\partial f}{\partial x_{\alpha}}=\frac{\partial x^{\beta}}{\partial x^{\alpha}}\frac{\partial f}{\partial x_{\alpha}}##

So that

##\displaystyle \frac{\partial }{\partial x_{\beta}}= \frac{\partial x^{\beta}}{\partial x^{\alpha}} \frac{\partial }{\partial x_{\alpha}}##

As I wanted to show.

EDIT: I see I have a problem here too: I have repeated indices more than twice, so this approach is wrong as well.
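For the record, here is a sketch (my own, not from the thread) of a derivation that avoids the repeated-index problem. It assumes, as in special relativity, that the transformation is linear and that the metric is the same constant matrix in both frames, and it uses primes to distinguish the two coordinate systems:

```latex
% Primes distinguish the new frame; x'^\alpha = \Lambda^\alpha{}_\beta x^\beta
% is linear and the metric g is constant and frame independent.
\begin{aligned}
\frac{\partial}{\partial x'_{\alpha}}
  &= g^{\alpha\gamma}\,\frac{\partial}{\partial x'^{\gamma}}
  &&\text{(raise the index with the constant metric)}\\
  &= g^{\alpha\gamma}\,\frac{\partial x^{\delta}}{\partial x'^{\gamma}}\,
     \frac{\partial}{\partial x^{\delta}}
  &&\text{(the covariant rule already established)}\\
  &= g^{\alpha\gamma}\,\frac{\partial x^{\delta}}{\partial x'^{\gamma}}\,
     g_{\delta\beta}\,\frac{\partial}{\partial x_{\beta}}
  &&\text{(lower the index on the derivative)}\\
  &= \frac{\partial x'^{\alpha}}{\partial x^{\beta}}\,
     \frac{\partial}{\partial x_{\beta}}.
\end{aligned}
```

The last step uses the defining property of Lorentz transformations, ##g^{\alpha\gamma}\frac{\partial x^{\delta}}{\partial x'^{\gamma}}g_{\delta\beta}=\frac{\partial x'^{\alpha}}{\partial x^{\beta}}##, which follows from ##\Lambda^{T}g\Lambda=g##. Each index now appears at most twice, and dropping the primes reproduces the transformation law sought here.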


## What is differentiation with respect to a covariant component of a vector?

Differentiation with respect to a covariant component of a vector is the operation ##\displaystyle \frac{\partial}{\partial x_{\alpha}}##: it measures the rate of change of a quantity as one covariant component of the coordinate vector is varied. It is a fundamental operation in tensor calculus and is used to analyze the behavior of scalar and vector quantities in various applications.

## Why is differentiation with respect to a covariant component of a vector important?

Differentiation with respect to covariant component of a vector is important because it allows us to understand how a vector quantity changes in response to changes in a particular variable. This can be useful in solving problems in physics, engineering, and other fields where vectors are used to describe quantities such as velocity, acceleration, and force.

## What is the difference between covariant and contravariant components of a vector?

Covariant and contravariant components are two ways of representing the same vector with respect to a coordinate system. Under a change of coordinates, contravariant components transform like the coordinate differentials ##dx^{\alpha}##, while covariant components transform with the inverse transformation, like the gradient ##\displaystyle \frac{\partial}{\partial x^{\alpha}}##. The two are related through the metric tensor, and the distinction is important for understanding how vectors behave under transformations.
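As an illustration (my own sketch, not part of the thread), one can check numerically that under a Lorentz boost the two kinds of components transform oppositely and that their contraction is invariant; the boost speed and sample vectors below are arbitrary choices:

```python
import numpy as np

# Contravariant components transform with the boost L, covariant ones
# with its inverse transpose, and the contraction a_b b^b is invariant.
g = np.diag([1.0, -1.0, -1.0, -1.0])

beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)
# Boost along x: x'^a = L^a_b x^b.
L = np.array([[gamma, -gamma * beta, 0, 0],
              [-gamma * beta, gamma, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])

a_contra = np.array([1.0, 2.0, 3.0, 4.0])
b_contra = np.array([0.5, -1.0, 2.0, 0.0])
a_co = g @ a_contra                     # a_b = g_bc a^c

# Transform each type of component.
b_contra_p = L @ b_contra               # b'^a = L^a_b b^b
a_co_p = np.linalg.inv(L).T @ a_co      # a'_a = (L^-1)^b_a a_b

# The covariant transform agrees with lowering in the primed frame.
print(np.allclose(a_co_p, g @ (L @ a_contra)))             # True

# The scalar a_b b^b is frame independent.
print(np.isclose(a_co @ b_contra, a_co_p @ b_contra_p))    # True
```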

## How do you perform differentiation with respect to a covariant component of a vector?

To perform differentiation with respect to a covariant component, first express the quantity of interest in terms of the covariant components ##x_{\alpha}## (for instance via ##x_{\alpha}=g_{\alpha \beta}x^{\beta}##). Then apply the standard rules of differentiation to each component separately, treating the other components as constants. The resulting operator, ##\displaystyle \frac{\partial}{\partial x_{\alpha}}##, transforms as the component of a contravariant vector.
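A numerical sketch of this procedure in special relativity (the scalar function and sample point are arbitrary choices): differentiate a scalar with respect to the covariant components by finite differences and compare with the metric-raised gradient with respect to the contravariant components, ##\displaystyle \frac{\partial f}{\partial x_{\alpha}}=g^{\alpha \beta}\frac{\partial f}{\partial x^{\beta}}##:

```python
import numpy as np

# Check d f / d x_a = g^ab d f / d x^b for the Minkowski metric.
g = np.diag([1.0, -1.0, -1.0, -1.0])
g_inv = np.linalg.inv(g)

def f_contra(x):
    # An arbitrary smooth scalar, as a function of x^a.
    return x[0] ** 2 + np.sin(x[1]) * x[2] + x[3] ** 3

def f_co(x_co):
    # The same scalar re-expressed as a function of x_a = g_ab x^b.
    return f_contra(g_inv @ x_co)

def grad(func, x, h=1e-6):
    # Central-difference gradient, one component at a time.
    out = np.zeros(4)
    for a in range(4):
        e = np.zeros(4); e[a] = h
        out[a] = (func(x + e) - func(x - e)) / (2 * h)
    return out

x = np.array([0.3, 1.2, -0.7, 0.9])
lhs = grad(f_co, g @ x)            # d f / d x_a
rhs = g_inv @ grad(f_contra, x)    # g^ab d f / d x^b

print(np.allclose(lhs, rhs, atol=1e-5))   # True
```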

## What are some practical applications of differentiation with respect to a covariant component of a vector?

Differentiation with respect to covariant component of a vector has many practical applications. For example, it is used in physics to analyze the motion of objects in different coordinate systems, in engineering to design systems that respond to changing variables, and in computer graphics to manipulate the appearance of 3D objects. It is also a fundamental concept in the study of general relativity and differential geometry.
