A Calculating ##\nabla_W V## in General Relativity

bres gres
TL;DR Summary
I am trying to understand why
$$\nabla_{W} V \neq W(V),$$
but I got stuck, so I read some material online.
In the language of general relativity, we know that we can write ##\nabla_{W} V## in this form:
$$\nabla_{W} V = w^i \frac{\partial (V^j e_j)}{\partial u^i} = w^i e_i (V^j e_j) = W(V),$$
where ##W = w^i \frac{\partial}{\partial u^i}## acts on the vector ##V##, i.e. ##W## is a vector used as an operator.

But when the connection is not torsion-free, we know that
$$\nabla_{V} W - \nabla_{W} V = [V,W] + T(V,W),$$
where ##T(V,W)## is the torsion tensor, which (when ##T = 0##) implies
$$[V,W] = VW - WV = \nabla_{V} W - \nabla_{W} V.$$
I just want to know why my derivation is incorrect, since I cannot prove that the two sides are NOT equal.
Thank you.
 
##W(V)## where ##W## and ##V## are vector fields does not make any sense. Vector fields are directional derivatives that act on scalar fields. In order to have a directional derivative for vector (or more generally, tensor) fields, you need to introduce a connection. This connection cannot be specified using the commutator of vector fields because it does not satisfy linearity in the direction, which is a condition for an affine connection:
$$
\nabla_{fV}W = f\nabla_V W.
$$
This is not satisfied for ##[V,W]##. The definition of the torsion tensor is that
$$
T(V,W) = \nabla_V W - \nabla_W V - [V,W].
$$
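To see the failure of linearity explicitly (a one-line check, using only that ##W## obeys the Leibniz rule on products of functions): for any smooth function ##f## and scalar field ##g##,
$$[fV, W]g = fV(W(g)) - W(fV(g)) = fV(W(g)) - W(f)\,V(g) - fW(V(g)) = \left(f[V,W] - W(f)\,V\right)g,$$
so ##[fV,W] = f[V,W] - W(f)\,V##, which differs from ##f[V,W]## whenever ##W(f) \neq 0##.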
 
Orodruin said:
##W(V)## where ##W## and ##V## are vector fields does not make any sense. Vector fields are directional derivatives that act on scalar fields. In order to have a directional derivative for vector (or more generally, tensor) fields, you need to introduce a connection. This connection cannot be specified using the commutator of vector fields because it does not satisfy linearity in the direction, which is a condition for an affine connection:
$$
\nabla_{fV}W = f\nabla_V W.
$$
This is not satisfied for ##[V,W]##. The definition of the torsion tensor is that
$$
T(V,W) = \nabla_V W - \nabla_W V - [V,W].
$$

Therefore, what is ##[V,W]## itself? I think it is actually ##VW - WV##.
 
##[V,W]## is the vector field such that ##[V,W]f = V(W(f)) - W(V(f))## for all functions ##f##. This is sometimes (sloppily) denoted ##[V,W] = VW - WV##, but this is not the same as ##V(W) - W(V)##, which does not make sense.
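In coordinates this can be made explicit (a standard computation, writing ##V = V^i \partial_i## and ##W = W^j \partial_j##):
$$[V,W]f = V^i\partial_i\left(W^j\partial_j f\right) - W^i\partial_i\left(V^j\partial_j f\right) = \left(V^i\partial_i W^j - W^i\partial_i V^j\right)\partial_j f,$$
since the second-derivative terms ##V^iW^j\,\partial_i\partial_j f## cancel by symmetry. This is why ##[V,W]## is again a vector field (a first-order operator) even though ##VW## and ##WV## separately are not.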
 
Orodruin said:
##[V,W]## is the vector field such that ##[V,W]f = V(W(f)) - W(V(f))## for all functions ##f##. This is sometimes (sloppily) denoted ##[V,W] = VW - WV##, but this is not the same as ##V(W) - W(V)##, which does not make sense.
I see.
I am watching the video in this link. At 4:37 the presenter expands the vector in this form, and I get confused, because he lets
$$V(U) = v^i e_i(U^j e_j),$$
since this is what we understand as ##\nabla_V U##, where
$$\nabla_V U = v^i e_i(U^j e_j).$$
I cannot see the difference between them. So what is the problem? I am trying to understand why they are "the same" in the first step.
 
I just fixed the mistakes in my reply and in the question.
Thanks for your help.

I am copying and pasting the LaTeX code from somewhere else and trying to modify it.
 
Orodruin said:
##W(V)## where ##W## and ##V## are vector fields does not make any sense. Vector fields are directional derivatives that act on scalar fields. In order to have a directional derivative for vector (or more generally, tensor) fields, you need to introduce a connection. This connection cannot be specified using the commutator of vector fields because it does not satisfy linearity in the direction, which is a condition for an affine connection:
$$
\nabla_{fV}W = f\nabla_V W.
$$
This is not satisfied for ##[V,W]##. The definition of the torsion tensor is that
$$
T(V,W) = \nabla_V W - \nabla_W V - [V,W].
$$
Do you mean ##[V,W]## can be specified because it satisfies linearity in some direction? Why?
I am still trying to understand what ##[V,W]## actually is, and how it "links back" to ##\nabla_V W - \nabla_W V##.

I just think ##[V,W]## implies that the basis vectors commute, whereas $$\nabla_V W - \nabla_W V$$ does not.

Is that correct in the general case?

The main thing is that I don't understand what makes ##[V,W]## different from $$\nabla_V W - \nabla_W V$$ in the above calculation.
 
By definition, torsion is a tensor of type (1, 2) given by ## T(X, Y) = \nabla_X Y - \nabla_Y X -\left[X, Y\right]##. If we assume the condition that ##T=0##, we get ##\nabla_X Y - \nabla_Y X =\left[X, Y\right]##. To see what this implies in terms of components, from the definition we have ##\left[X, Y\right]^i = X^j Y^{i}{}_{,j} - Y^j X^i{}_{,j}## and ##\nabla_X Y - \nabla_Y X = X^j Y^i{}_{;j} - Y^j X^i{}_{;j} = X^j Y^i{}_{,j}+X^j Y^k \Gamma^i{}_{jk} - Y^j X^i{}_{,j}-Y^j X^k \Gamma^i{}_{jk}##. Then ##T=0## implies ##X^j Y^k \Gamma^i{}_{jk}-Y^j X^k \Gamma^i{}_{jk}=X^j Y^k\left(\Gamma^i{}_{jk}-\Gamma^i{}_{kj}\right)=0##. We see that the torsion is zero if and only if the components of the connection are symmetric on the lower two indices.
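To make the index computation concrete, here is a minimal sympy sketch (my own illustration, not part of the thread; the metric and the test fields ##X##, ##Y## are arbitrary choices). It computes the Christoffel symbols of the flat metric in polar coordinates and checks that ##\nabla_X Y - \nabla_Y X - [X,Y]## vanishes componentwise:

```python
# Minimal sympy check (illustration only): for the Levi-Civita connection of the
# flat metric in polar coordinates, nabla_X Y - nabla_Y X - [X, Y] = 0.
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
n = 2

# Flat 2D metric in polar coordinates: ds^2 = dr^2 + r^2 dtheta^2
g = sp.Matrix([[1, 0], [0, r**2]])
ginv = g.inv()

# Christoffel symbols Gamma^i_{jk} = (1/2) g^{il} (g_{lj,k} + g_{lk,j} - g_{jk,l})
def Gamma(i, j, k):
    return sp.Rational(1, 2) * sum(
        ginv[i, l] * (sp.diff(g[l, j], coords[k])
                      + sp.diff(g[l, k], coords[j])
                      - sp.diff(g[j, k], coords[l]))
        for l in range(n))

# Arbitrary test vector fields X = X^i e_i and Y = Y^i e_i
X = [r * th, sp.sin(th)]
Y = [r**2, sp.cos(th) / r]

# (nabla_A B)^i = A^j B^i_{,j} + A^j B^k Gamma^i_{jk}
def nabla(A, B, i):
    return (sum(A[j] * sp.diff(B[i], coords[j]) for j in range(n))
            + sum(A[j] * B[k] * Gamma(i, j, k) for j in range(n) for k in range(n)))

# [X, Y]^i = X^j Y^i_{,j} - Y^j X^i_{,j}
def bracket(i):
    return sum(X[j] * sp.diff(Y[i], coords[j])
               - Y[j] * sp.diff(X[i], coords[j]) for j in range(n))

for i in range(n):
    # Torsion component T^i; simplifies to 0 for both i
    print(sp.simplify(nabla(X, Y, i) - nabla(Y, X, i) - bracket(i)))
```

Running this prints 0 for both components, which is exactly the statement above: the symmetric Christoffel symbols drop out of the antisymmetrized combination.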
 
Cem said:
By definition, torsion is a tensor of type (1, 2) given by ## T(X, Y) = \nabla_X Y - \nabla_Y X -\left[X, Y\right]##. If we assume the condition that ##T=0##, we get ##\nabla_X Y - \nabla_Y X =\left[X, Y\right]##. To see what this implies in terms of components, from the definition we have ##\left[X, Y\right]^i = X^j Y^{i}{}_{,j} - Y^j X^i{}_{,j}## and ##\nabla_X Y - \nabla_Y X = X^j Y^i{}_{;j} - Y^j X^i{}_{;j} = X^j Y^i{}_{,j}+X^j Y^k \Gamma^i{}_{jk} - Y^j X^i{}_{,j}-Y^j X^k \Gamma^i{}_{jk}##. Then ##T=0## implies ##X^j Y^k \Gamma^i{}_{jk}-Y^j X^k \Gamma^i{}_{jk}=X^j Y^k\left(\Gamma^i{}_{jk}-\Gamma^i{}_{kj}\right)=0##. We see that the torsion is zero if and only if the components of the connection are symmetric on the lower two indices.

This makes sense to me.
Thank you!
 
However, note that the torsion being equal to zero still does not mean that ##\nabla_W V## is equal to ##WV##, as the original question suggests. The former is a vector field (and therefore a first-order derivative operator), whereas the latter is a second-order differential operator.
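Concretely (in the same coordinate notation as the original post): acting on a function ##f##,
$$WVf = w^i\partial_i\left(V^j\partial_j f\right) = w^i\left(\partial_i V^j\right)\partial_j f + w^iV^j\,\partial_i\partial_j f,$$
and the ##\partial_i\partial_j f## term makes ##WV## second order, whereas
$$(\nabla_W V)f = \left(w^i\partial_i V^j + w^iV^k\Gamma^j{}_{ik}\right)\partial_j f$$
contains only first derivatives of ##f##.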
 