# Lie derivative vector fields, show Leibniz rule holds

1. Jul 21, 2017

### binbagsss

1. The problem statement, all variables and given/known data

2. Relevant equations

$V=V^u \partial_u$

I am a bit confused with the notation used for the Lie Derivative of a vector field written as the commutator expression:

Not using the commutator expression I have:

$(L_vU)^v = V^u \partial_u U^v - U^u\partial_u V^v$ (1)

When using the commutator expression however I sometimes see it written as :

simply, without an index, indicating a tensor of rank zero: $L_v w = [v,w]$
and sometimes with an index : $(L_v w)^u = [v,w]^u$

Due to this I am confused as to what the commutator should be once expanded out.

Going with the notation my guess is that :

$(L_v W)^u= V^u w^{\alpha} \partial_{\alpha} - W^u v^{\alpha} \partial_{\alpha}$

$L_v W = (v^u \partial_u w^v - w^u\partial_u v^v) \partial_v$

So I see the expression multiplying the $\partial_v$ is a tensor of rank 1, and the expression agrees with (1).

Is this what the notation means, or could someone please fill me in?
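One way to see why the commutator is again a vector field (a first-order operator), which may help with the notation question: apply $[v,w]$ to a test function $g$ and note that the second-derivative terms cancel,

$$[v,w](g) = v^u\partial_u\left(w^v\partial_v g\right) - w^u\partial_u\left(v^v\partial_v g\right) = \left(v^u\partial_u w^v - w^u\partial_u v^v\right)\partial_v g,$$

since the mixed terms $v^u w^v\,\partial_u\partial_v g$ appear once with each sign and cancel on relabelling the dummy indices.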

Since the question specifies no index, I am going to assume the index-free expression is intended.

3. The attempt at a solution

So for the LHS I get:

$v^u\partial_u f w^v \partial_v - f w^u \partial_u v^v \partial_v$

RHS:

$f(v^u\partial_u w^v \partial_v - w^u \partial_u v^v\partial_v) + w^v\partial_v v^u\partial_uf$

(where the last term comes from the expression given for the Lie derivative acting on a scalar)

so the first term from the LHS and the third term from the RHS agree, but not the rest...

Many thanks in advance

2. Jul 21, 2017

### andrewkirk

First note the definition of the Lie derivative of a vector field (for brevity I'll use the lower-comma notation to indicate differentiation of a scalar in a coordinate direction). In a local coordinate system, the $a$th component of $\mathcal L_v u$ is:
$$(\mathcal L_vu)^a=[v,u]^a=v^c\partial_cu^a-u^c\partial_c v^a = v^cu^a_{,c}-u^cv^a_{,c}$$
Then, substituting the vector field $fw$ for $u$ in the above, we have
\begin{align*}
\left(\mathcal L_v(fw) - f\mathcal L_vw - w \mathcal L_vf\right)^a
&= [v^c(fw)^a_{,c}-(fw)^cv^a_{,c}]
- f[v^cw^a_{,c}-w^cv^a_{,c}]
- w^a \left(v^c f_{,c}\right)
\\
&= [v^c(f_{,c}w^a+fw^a_{,c})-fw^cv^a_{,c}]
- f[v^cw^a_{,c}-w^cv^a_{,c}]
- w^a v^c f_{,c}
\end{align*}

and collecting terms, we see that this cancels out to zero.
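This cancellation can also be checked in coordinates with a computer algebra system. Below is a minimal sketch using sympy; the particular fields $v$, $w$ and function $f$ are arbitrary choices of mine for illustration, not from the problem.

```python
import sympy as sp

x, y = sp.symbols('x y')
coords = [x, y]

# Arbitrarily chosen smooth components v^a, w^a and scalar f (illustrative only)
v = [x*y, sp.sin(x)]
w = [y**2, x + y]
f = sp.exp(x) * y

def lie_bracket(a_field, b_field):
    """Components [a, b]^i = a^c d_c b^i - b^c d_c a^i in these coordinates."""
    return [sum(a_field[c] * sp.diff(b_field[i], coords[c])
                - b_field[c] * sp.diff(a_field[i], coords[c])
                for c in range(2))
            for i in range(2)]

# LHS: L_v(f w), componentwise
lhs = lie_bracket(v, [f * w[i] for i in range(2)])

# RHS: f L_v w + (L_v f) w, where L_v f = v^c d_c f
Lv_f = sum(v[c] * sp.diff(f, coords[c]) for c in range(2))
bracket_vw = lie_bracket(v, w)
rhs = [f * bracket_vw[i] + Lv_f * w[i] for i in range(2)]

# The difference simplifies to zero in each component
print([sp.simplify(lhs[i] - rhs[i]) for i in range(2)])  # [0, 0]
```

Any other smooth choices of $v$, $w$, $f$ give the same result, since the identity holds term by term.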

I think where your attempt went off-course is here:
The LHS of this equation is a scalar, being a component of a vector. But the RHS is a vector, being a linear combination of the items $\partial_\alpha$, which are vectors.
The correct version of this is in my first equation above, using $a$ instead of $u$ for the index, and $u$ instead of $W$ for the vector.

Also, to avoid confusion, either always use upper case, or always use lower case, for vectors. By mixing the two, as happens in that quote, confusion can arise as to what is a vector, what is a scalar coordinate and what is an index.

Last edited: Jul 21, 2017
3. Jan 16, 2018

### binbagsss

(Apologies for re-bumping an old thread, but I thought it better than starting a new one on the same thing.)

I am unsure how you get this expression for the $a$th component from the vector field expression we are given, working with what I took to be rank-$(0,0)$ tensors: $V=V^u \partial_u$. Should it be obvious via an index/dimension analysis, or is it even more obvious than that?

I see that the expression is consistent with being covariant: if $\partial_u \to \nabla_u$ the expression still holds and is covariant, so I could almost trial-and-error my way to the correct expression; however, I would like a better understanding.

many thanks.

4. Jan 16, 2018

### binbagsss

I think I am also pretty confused by the terminology (what is a vector, etc.); perhaps I am confusing the terminology for vectors/scalars in flat space with that of differential geometry and curved space.

I am pretty sure that I was taught the following in my GR course:

$V= V^u\partial_u$ is termed a 'vector field'. However, from what I have been taught (or, most likely, thought I had been taught), this is a tensor of rank $(0,0)$ and therefore a scalar; a rank-$(1,0)$ tensor is a vector, and a rank-$(0,1)$ tensor is a covector.
But I can see that what you say about the LHS makes sense, the index selecting a component. Regarding the RHS: from linear algebra, a linear combination of vectors is a summation of constants multiplying the vectors, so since the index selects a component, looking at it like this, yes, I see you have (scalar $\cdot$ scalar) $\times$ vector $+$ (scalar $\cdot$ scalar) $\times$ vector.

5. Jan 16, 2018

### Staff: Mentor

$V=V^u\partial_u$ is a vector field, because it has not yet been evaluated at a particular point. You get vectors from $V_p=\left. V^u \partial_u\right|_{p}$, and scalars if it is applied to real-valued functions (and evaluated at $p$): $V_p.f=\left. V^u \partial_u\right|_{p}(f)$

6. Jan 18, 2018

### binbagsss

Yeah, but are these still not both tensors of rank $(0,0)$? I was talking about what I was given in my lecture notes: that a rank-$(1,0)$ tensor is a vector, a rank-$(0,1)$ tensor a covector, etc.

I've also never seen this notation where the point of evaluation becomes part of the index notation; I thought it was just what the field is a function of. And it isn't $V^p$ either, no? But $V_p$?

7. Jan 18, 2018

### Orodruin

Staff Emeritus
$p$ is a point in the manifold, not an index.

8. Jan 18, 2018

### Orodruin

Staff Emeritus
I think you are confusing tensors with tensor components. In many cases, GR literature will not distinguish the two. Is the reason you think $V^a \partial_a$ is a scalar that there are no free indices? $\partial_a$ are not the components of another tensor, they are the basis vectors that span the tangent space. A sum of vectors is another vector, i.e., $V^a \partial_a$ is a tangent vector. A tensor by itself does not come with indices, its components have indices.

Compare how in a Euclidean space you would have the basis $\vec e_1$, $\vec e_2$, $\vec e_3$ and write an arbitrary vector $\vec v$ as
$$\vec v = v^1 \vec e_1 + v^2 \vec e_2 + v^3 \vec e_3 = v^i \vec e_i.$$
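In concrete numbers, the same point can be sketched with numpy (my own minimal example, not from the thread): the $v^i$ are plain scalars, and the sum $v^i \vec e_i$ is itself a vector.

```python
import numpy as np

# Standard basis e_1, e_2, e_3 of R^3 (rows of the identity matrix)
e = np.eye(3)

# The components v^i are plain numbers (scalars)...
v_components = np.array([2.0, -1.0, 3.0])

# ...and v = v^i e_i, a sum of scalar-times-basis-vector, is itself a vector
v = sum(v_components[i] * e[i] for i in range(3))

print(v)  # [ 2. -1.  3.]
```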

9. Jan 18, 2018

### Staff: Mentor

Yes.
Here's a short summary:
https://www.physicsforums.com/insights/what-is-a-tensor/
Somewhere the point at which the derivative is evaluated has to be noted. If we have $f(x)=x^2$ then the derivative is $f'(x)=2x$, but the tangent vector at $x=p$ is defined by the slope $2p$, which is denoted by $\left. \dfrac{d}{dx}\right|_{p} f(x) = 2p\,$, a scalar. Meanwhile $x \mapsto f'(x)$ is a function, called the derivative, and $f \mapsto f'$ is also a function, a linear function called the gradient. And $\mathbf{v}_p=(1,2p)$ is the tangent vector at $p$; the vector field is $\{(\mathbf{v}(x),x)=((1,2x),x)\,\vert \,x \in \mathbb{R}\}$.
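The $f(x)=x^2$ example above can be sketched in sympy, separating the derivative-as-function from the scalar slope at a point:

```python
import sympy as sp

x, p = sp.symbols('x p')
f = x**2

# The derivative is itself a function of x ...
fprime = sp.diff(f, x)            # 2*x

# ... while the tangent slope at the point x = p is a scalar
slope_at_p = fprime.subs(x, p)    # 2*p

print(fprime, slope_at_p)  # 2*x 2*p
```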
Here are a couple of different ways to view a derivative: (the first section)
https://www.physicsforums.com/insights/journey-manifold-su2mathbbc-part/

10. Jan 19, 2018

### binbagsss

ok thanks, how about the case when I don't have basis vectors involved, no derivative, something like $a^u b_u$. Is this a scalar because it is rank $(0,0)$ with no free indices, or is it a scalar because I should think of it as the component of a vector multiplied by the component of a covector? How about $V^u$? I thought this is a vector, rank $(1,0)$, but instead should I think of it as a scalar, the component of a vector, as I seem to have been told in the replies in this thread and the other thread?

11. Jan 19, 2018

### binbagsss

As suspected, but I was thrown off by fresh-42's response while trying to relate it to my question.

In the component expression for the Lie derivative, we seek an index corresponding to that component, not simply an evaluation at some point, right?