Understanding Covariant Derivative: d Explained

SUMMARY

The discussion focuses on the definition and understanding of the covariant derivative, specifically the term 'd' in the formula. The covariant derivative is expressed as \(\nabla_{\alpha}T^{\beta\gamma}=\frac{\partial{T^{\beta\gamma}}}{\partial{x^{\alpha}}}+\Gamma^{\beta}_{d\alpha}T^{d\gamma}+\Gamma^{\gamma}_{d\alpha}T^{\beta d}\). Participants clarify that 'd' represents a summation index over the dimensions of the manifold, which can vary based on the context, such as in three-dimensional Euclidean space or in general relativity. The importance of consistent notation, particularly the use of Greek and Latin indices, is emphasized for clarity in tensor calculus.

PREREQUISITES
  • Understanding of tensor notation and indices
  • Familiarity with the concept of covariant derivatives
  • Knowledge of Einstein's summation convention
  • Basic principles of differential geometry
NEXT STEPS
  • Study the properties of covariant derivatives in differential geometry
  • Learn about the role of Christoffel symbols in tensor calculus
  • Explore the implications of covariant derivatives in general relativity
  • Review Sean Carroll's Lecture Notes on General Relativity for deeper insights
USEFUL FOR

Students and professionals in mathematics, physics, and engineering who are working with differential geometry, tensor analysis, or general relativity will benefit from this discussion.

TromboneNerd
I get in essence what the covariant derivative is and what it does, but I am having trouble with the definition, of all things:

$$\nabla_{\alpha}T^{\beta\gamma}=\frac{\partial T^{\beta\gamma}}{\partial x^{\alpha}}+\Gamma^{\beta}_{d\alpha}T^{d\gamma}+\Gamma^{\gamma}_{d\alpha}T^{\beta d}$$

I'm good with everything except d. What is it? Where did it come from? It's just tacked into the formula, and I don't understand what it is. Any help would be greatly appreciated.
 
Have you learned Einstein's summation convention yet? You just sum over all values for d.

Whenever an index is repeated, with one occurrence as an upper index and one as a lower index, the understanding is to sum over all possible values of that index.
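Here is a minimal numerical sketch of what the convention abbreviates, in NumPy; the arrays, shapes, and index letters are made up for illustration and are not from the thread. Contracting the repeated index d in \(\Gamma^{\beta}_{d\alpha}T^{d\gamma}\) is just an ordinary sum over d:

```python
import numpy as np

# Illustrative 3-dimensional example (placeholder data, not a real metric):
# contract the repeated index d in Gamma[b, d, a] * T[d, g].
n = 3
Gamma = np.random.rand(n, n, n)  # Gamma[b, d, a], standing in for Christoffel symbols
T = np.random.rand(n, n)         # T[d, g], standing in for a rank-2 tensor

# Explicit loops, spelling out what the summation convention hides:
correction = np.zeros((n, n, n))
for b in range(n):
    for g in range(n):
        for a in range(n):
            correction[b, g, a] = sum(Gamma[b, d, a] * T[d, g] for d in range(n))

# Equivalent one-liner: einsum sums over the repeated label 'd'.
assert np.allclose(correction, np.einsum('bda,dg->bga', Gamma, T))
```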
 
Yes, I am familiar with the summation convention, but is it assumed to be basic three-dimensional Euclidean space? So is it

$$\nabla_{\alpha}T^{\beta\gamma}=\sum_{i=1}^{3}\frac{\partial T^{\beta\gamma}}{\partial x^{\alpha}}+\Gamma^{\beta}_{d_i\alpha}T^{d_i\gamma}+\Gamma^{\gamma}_{d_i\alpha}T^{\beta d_i}$$

where \(d_1\), \(d_2\), \(d_3\) each represent a different spatial dimension? Or is it

$$\nabla_{\alpha}T^{\beta\gamma}=\sum_{d=1}^{n}\frac{\partial T^{\beta\gamma}}{\partial x^{\alpha}}+\Gamma^{\beta}_{d\alpha}T^{d\gamma}+\Gamma^{\gamma}_{d\alpha}T^{\beta d}$$
where d itself is summed over? And if so, what determines n: the number of indices in the tensor? I don't see why you would insert 1, 2, ..., n into the equation where the d's are if this is the case; they don't seem to be operating as coefficients. Could you please go into further detail?
 
Should be:

$$(\nabla_{\alpha}T)^{\beta\gamma}=\frac{\partial T^{\beta\gamma}}{\partial x^{\alpha}}+\Gamma^{\beta}_{\alpha\sigma}T^{\sigma\gamma}+\Gamma^{\gamma}_{\alpha\sigma}T^{\beta\sigma}$$

The above means: for every choice of the three fixed indices \(\alpha,\beta,\gamma\), for instance (in 3 dimensions) \(\alpha=2,\beta=3,\gamma=2\), we compute:

$$(\nabla_{2}T)^{32}=\frac{\partial T^{32}}{\partial x^{2}}+\sum_{\sigma=1}^{3}\Gamma^{3}_{2\sigma}T^{\sigma 2}+\sum_{\sigma=1}^{3}\Gamma^{2}_{2\sigma}T^{3\sigma}$$

\((\nabla_{2}T)^{32}\) should be read as the (3,2) component of the tensor \(\nabla_{2}T\). Usually it is written without parentheses as \(\nabla_{2}T^{32}\), but it is good to keep the parentheses in mind: it is not the covariant derivative of a component of the tensor, but a component of the covariant derivative of the tensor.
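For anyone who wants to see this bookkeeping done by machine, here is a hedged NumPy sketch of the same computation. The arrays Gamma, T, and dT are placeholders (in a real problem they would come from a metric and from actually differentiating the tensor field), and the code uses 0-based indices, so the component \((\nabla_{2}T)^{32}\) becomes index (1, 2, 1):

```python
import numpy as np

n = 3
# Placeholder data: in practice Gamma comes from the metric, and dT from
# differentiating the tensor field at a point.
Gamma = np.random.rand(n, n, n)  # Gamma[beta, alpha, sigma]
T = np.random.rand(n, n)         # T[beta, gamma]
dT = np.random.rand(n, n, n)     # dT[alpha, beta, gamma] = dT^{beta gamma}/dx^alpha

# (nabla_alpha T)^{beta gamma} = dT^{bg}/dx^a
#                              + Gamma^b_{a s} T^{s g} + Gamma^g_{a s} T^{b s}
nablaT = (dT
          + np.einsum('bas,sg->abg', Gamma, T)
          + np.einsum('gas,bs->abg', Gamma, T))

# The worked component from the post: 1-based (alpha, beta, gamma) = (2, 3, 2)
# is 0-based (1, 2, 1).
a, b, g = 1, 2, 1
check = (dT[a, b, g]
         + sum(Gamma[b, a, s] * T[s, g] for s in range(n))
         + sum(Gamma[g, a, s] * T[b, s] for s in range(n)))
assert np.isclose(nablaT[a, b, g], check)
```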
 
Neither of the expressions in #3 makes sense, since you have an extra index (c) on the right that isn't being summed over and doesn't appear on the left.

All the indices run from 1 to n, where n is the number of dimensions of the manifold, unless the manifold is the spacetime of special or general relativity. In those cases, the convention is to have the indices run from 0 to 3 instead of from 1 to 4.

You shouldn't use Greek letters for some of the indices and Latin letters for others, because a letter of a different type from the one used for most indices usually signals that it's not just another index. For example, in relativity, one convention that many people use is to have Greek indices run from 0 to 3 and Latin indices run from 1 to 3.
 
@Fredrik
I corrected my expression several times, and also changed the summation index to a Greek one, as you indicated. In principle it can be anything, but it's better not to mix Greek and Latin, or lowercase and capital, etc.
 
I wrote all of that before I saw your post. My comments were about the expressions in post #3.
 
That c should be an alpha. I changed it, and it should be correct now. I wrote it on paper with c before that, so I got myself a little mixed up.
 
I've visited this thread several times without comprehending the nature of your stumbling block. Now I may understand.

Consider just the covariant derivative of a vector to keep things simple.

$$\nabla_{\mu}V^{\nu} = \partial_{\mu} V^{\nu} + \Gamma^{\nu}_{\mu\lambda}V^{\lambda}$$

Rewrite this as

$$\nabla_{\mu}V^{\nu} = \partial_{\mu} V^{\nu} + (\Gamma_{\mu})^{\nu}_{\lambda}V^{\lambda}$$

Take one dimension at a time. For example,

$$\nabla_{1}V^{\nu} = \partial_{1} V^{\nu} + (\Gamma_{1})^{\nu}_{\lambda}V^{\lambda}$$

Here the derivative is taken in the \(x^1\) direction. For each direction in which we differentiate, the connection supplies a correction to the ordinary derivative of the vector components: once \(\mu\) is fixed, \(\Gamma_{\mu}\) is just an n x n matrix that takes the components of the vector and gives a correction for each component.
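To make the matrix picture concrete, here is a small illustrative sketch (again with placeholder arrays, not thread data): fixing the derivative direction \(\mu\) turns \(\Gamma\) into an ordinary n x n matrix, and the connection term becomes a matrix-vector product.

```python
import numpy as np

n = 3
Gamma = np.random.rand(n, n, n)  # Gamma[mu, nu, lam], i.e. (Gamma_mu)^nu_lam
V = np.random.rand(n)            # vector components V^nu (placeholder)
dV = np.random.rand(n, n)        # dV[mu, nu] = partial_mu V^nu (placeholder)

mu = 0                           # fix the derivative direction, e.g. x^1
Gamma_mu = Gamma[mu]             # an n x n matrix once mu is fixed

# (nabla_mu V)^nu = partial_mu V^nu + (Gamma_mu)^nu_lam V^lam:
# the connection correction is just a matrix-vector product.
nabla_mu_V = dV[mu] + Gamma_mu @ V
```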

Sean Carroll's Lecture Notes on General Relativity, available online, does a much better job of this; see page 56 for a far more lucid explanation.
 
