# Homework Help: Lie derivative vector fields, show the Leibniz rule holds

1. Jul 21, 2017

### binbagsss

1. The problem statement, all variables and given/known data

2. Relevant equations

$V=V^u \partial_u$

I am a bit confused with the notation used for the Lie Derivative of a vector field written as the commutator expression:

Not using the commutator expression I have:

$(L_V U)^v = V^u \partial_u U^v - U^u\partial_u V^v$ (1)

When using the commutator expression however I sometimes see it written as :

simply, without an index, indicating a tensor of rank zero: $L_v w = [v,w]$,
and sometimes with an index: $(L_v w)^u = [v,w]^u$.

Due to this I am confused as to what the commutator should be once expanded out.

Going with the notation my guess is that :

$(L_v W)^u= V^u w^{\alpha} \partial_{\alpha} - W^u v^{\alpha} \partial_{\alpha}$

$L_v W = (v^u \partial_u w^v - w^u\partial_u v^v) \partial_v$

So I see that the expression multiplying $\partial_v$ is a tensor of rank 1, and the expression agrees with (1).

Is this what the notation means, or could someone please fill me in?

Since the question specifies no index, I am going to assume the expression to prove is $L_v(fw) = f\,L_v w + (L_v f)\,w$.

3. The attempt at a solution

So for the LHS I get:

$v^u\partial_u (f w^v) \partial_v - f w^u (\partial_u v^v) \partial_v$

RHS:

$f(v^u\partial_u w^v \partial_v - w^u \partial_u v^v\partial_v) + (v^u\partial_u f)\,w^v\partial_v$

(where the last term comes from the expression given for the lie derivative acting on a scalar)

so the first term from the LHS and the third term from the RHS agree, but not the rest...

2. Jul 21, 2017

### andrewkirk

First note the definition of the Lie derivative of a vector field (for brevity I'll use the lower-comma notation to indicate differentiation of a scalar in a coordinate direction). In a local coordinate system, the $a$th component of $\mathcal L_v u$ is:
$$(\mathcal L_vu)^a=[v,u]^a=v^c\partial_cu^a-u^c\partial_c v^a = v^cu^a_{,c}-u^cv^a_{,c}$$
Then, substituting the vector field $fw$ for $u$ in the above, we have
\begin{align*}
\left(\mathcal L_v(fw) - f\mathcal L_vw - w \mathcal L_vf\right)^a
&= [v^c(fw)^a_{,c}-(fw)^cv^a_{,c}]
- f[v^cw^a_{,c}-w^cv^a_{,c}]
- w^a \left(v^c f_{,c}\right)
\\
&= [v^c(f_{,c}w^a+fw^a_{,c})-fw^cv^a_{,c}]
- f[v^cw^a_{,c}-w^cv^a_{,c}]
- w^a v^c f_{,c}
\end{align*}

and collecting terms, we see that this cancels out to zero.
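This cancellation is easy to check symbolically. Below is a minimal sketch (my own code, not from the thread) that builds the component formula $(\mathcal L_v w)^a = v^c w^a_{,c} - w^c v^a_{,c}$ for generic two-dimensional fields in sympy and verifies the Leibniz identity component by component.

```python
# Symbolic check of the Leibniz rule L_v(f w) = f L_v w + (L_v f) w
# in two dimensions, with generic component functions (sympy).
import sympy as sp

x = sp.symbols('x0 x1')
# Generic component functions v^a(x), w^a(x) and a generic scalar f(x).
v = [sp.Function(f'v{a}')(*x) for a in range(2)]
w = [sp.Function(f'w{a}')(*x) for a in range(2)]
f = sp.Function('f')(*x)

def lie_components(v, w):
    """(L_v w)^a = v^c d_c w^a - w^c d_c v^a."""
    return [sum(v[c]*sp.diff(w[a], x[c]) - w[c]*sp.diff(v[a], x[c])
                for c in range(2)) for a in range(2)]

fw = [f*w[a] for a in range(2)]
lhs = lie_components(v, fw)                         # L_v(f w)
lvf = sum(v[c]*sp.diff(f, x[c]) for c in range(2))  # L_v f = v^c d_c f
rhs = [f*t + lvf*w[a] for a, t in enumerate(lie_components(v, w))]

for a in range(2):
    print(sp.simplify(lhs[a] - rhs[a]))  # 0 for each component
```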

I think where your attempt went off-course is here:
$$(L_v W)^u= V^u w^{\alpha} \partial_{\alpha} - W^u v^{\alpha} \partial_{\alpha}$$
The LHS of this equation is a scalar, being a coordinate of a vector. But the RHS is a vector, being a linear sum of the items $\partial_\alpha$, which are vectors.
The correct version of this is in my first equation above, using $a$ instead of $u$ for the index, and $u$ instead of $W$ for the vector.

Also, to avoid confusion, either always use upper case, or always use lower case, for vectors. By mixing the two, as happens in that quote, confusion can arise as to what is a vector, what is a scalar coordinate and what is an index.

Last edited: Jul 21, 2017
3. Jan 16, 2018

### binbagsss

(Apologies for re-bumping an old thread, but I thought this better than starting a new one on the same thing.)

I am unsure how you get this expression for the $a$th component from the vector field expression we are given, working with the rank-$(0,0)$ tensors $V=V^u \partial_u$. Should it be obvious via an index/dimension analysis, or is it even more obvious than that?

I see that the expression is consistent with being covariant: if $\partial_u \to \nabla_u$ then the expression holds and is covariant, so I could almost trial-and-error my way to the correct expression; however, I would like a better understanding.

many thanks.

4. Jan 16, 2018

### binbagsss

I think I am also pretty confused with terminology, perhaps: what is a vector, etc. Perhaps I am confusing the terminology of vectors/scalars in flat space with that of differential geometry and curved space.

I am pretty sure that I was taught the following in my GR course:

$V= V^u\partial_u$ is termed a 'vector field'. However, from what I have been taught (or, most likely, thought I had been taught) this is a tensor of rank $(0,0)$ and therefore a scalar; a rank $(1,0)$ tensor is a vector, and a rank $(0,1)$ tensor is a covector.
But I can see that what you say about the LHS makes sense, the index selecting a component. Regarding the RHS: from linear algebra, a linear combination of vectors is a summation of constants multiplying the vectors, so since the index selects a component, looking at it like this, yes, I see you have scalar·scalar × vector + scalar·scalar × vector.

5. Jan 16, 2018

### Staff: Mentor

$V=V^u\partial_u$ is a vector field, because it has not yet been evaluated at a particular point. You get vectors from $V_p=\left. V^u \partial_u\right|_{p}$, and scalars if it is applied to real-valued functions (and evaluated at $p$): $V_p.f=\left. V^u \partial_u\right|_{p}(f)$.
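The field → vector → scalar progression can be sketched concretely. Below is a small example of mine (the component functions are arbitrary choices) showing a field $V = V^x\partial_x + V^y\partial_y$ applied to a function, giving another function, which only becomes a number once evaluated at a point $p$.

```python
# A vector field applied to a scalar function yields a scalar field;
# evaluating at a point p yields a number (sympy sketch).
import sympy as sp

x, y = sp.symbols('x y')
Vx, Vy = y, x**2             # components of the field V = Vx d/dx + Vy d/dy
f = x*y                      # a scalar function

# V.f = Vx df/dx + Vy df/dy, still a function (a scalar field)
Vf = Vx*sp.diff(f, x) + Vy*sp.diff(f, y)
print(Vf)                    # x**3 + y**2

# Evaluated at p = (1, 2): a scalar
print(Vf.subs({x: 1, y: 2}))  # 5
```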

6. Jan 18, 2018

### binbagsss

Yeah, but are these still not both tensors of rank $(0,0)$? I was talking about what I was given in my lecture notes: that a rank-$(1,0)$ tensor is a vector, a rank-$(0,1)$ tensor a covector, etc...

I've also never seen this notation where the point being evaluated at becomes the index, or part of the index notation; it is just what it is a function of. And it isn't $V^p$ either, no? But $V_p$?

7. Jan 18, 2018

### Orodruin

Staff Emeritus
$p$ is a point in the manifold, not an index.

8. Jan 18, 2018

### Orodruin

Staff Emeritus
I think you are confusing tensors with tensor components. In many cases, GR literature will not distinguish the two. Is the reason you think $V^a \partial_a$ is a scalar that there are no free indices? $\partial_a$ are not the components of another tensor, they are the basis vectors that span the tangent space. A sum of vectors is another vector, i.e., $V^a \partial_a$ is a tangent vector. A tensor by itself does not come with indices, its components have indices.

Compare how in a Euclidean space you would have the basis $\vec e_1$, $\vec e_2$, $\vec e_3$ and write an arbitrary vector $\vec v$ as
$$\vec v = v^1 \vec e_1 + v^2 \vec e_2 + v^3 \vec e_3 = v^i \vec e_i.$$
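The Euclidean analogy above can be written out directly: the components are plain numbers, the basis elements are vectors, and the linear combination is again a vector. A minimal numpy sketch (numbers chosen arbitrarily):

```python
# v = v^i e_i: numeric components multiplying basis vectors give a vector.
import numpy as np

e = np.eye(3)                        # basis vectors e_1, e_2, e_3 (rows)
comp = np.array([2.0, -1.0, 3.0])    # components v^i: plain numbers
v = sum(comp[i] * e[i] for i in range(3))  # the linear combination v^i e_i
print(v)  # [ 2. -1.  3.]
```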

9. Jan 18, 2018

### Staff: Mentor

Yes.
Here's a short summary:
https://www.physicsforums.com/insights/what-is-a-tensor/
Somewhere the point at which the derivative is evaluated has to be noted. If we have $f(x)=x^2$ then the derivative is $f'(x)=2x$, but the tangent vector at $x=p$ is defined by the slope $2p$, which is denoted by $\left. \dfrac{d}{dx}\right|_{p} f(x) = 2p$, a scalar. Meanwhile $x \mapsto f'(x)$ is a function called the derivative, and $f \mapsto f'$ is also a function, a linear function called the gradient. And $\mathbf{v}_p=(1,2p)$ is the tangent vector at $p$. The vector field is $\{(\mathbf{v}(f),p)=((1,2x),x)\,\vert \,x \in \mathbb{R}\}$.
Here are a couple of different ways to view a derivative: (the first section)
https://www.physicsforums.com/insights/journey-manifold-su2mathbbc-part/
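The $f(x)=x^2$ example above can be reproduced in a few lines of sympy, separating the derivative-as-a-function from the scalar slope at a point and the tangent vector built from it:

```python
# Derivative as a function vs. the scalar slope at a point p (sympy).
import sympy as sp

x, p = sp.symbols('x p')
f = x**2
fprime = sp.diff(f, x)        # the derivative as a function: 2*x
slope = fprime.subs(x, p)     # evaluated at x = p: the scalar 2*p
tangent = (1, slope)          # the tangent vector v_p = (1, 2p)
print(fprime, slope, tangent)
```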

10. Jan 19, 2018

10. Jan 19, 2018

### binbagsss

Ok thanks. How about the case when I don't have basis vectors involved, no derivative, something like $a^u b_u$: is this a scalar because it is rank $(0,0)$ with no free indices, or is it a scalar because I should think of it as the component of a vector multiplied by the component of a vector? How about $V^u$: I thought this was a vector, rank $(1,0)$, but should I instead think of it as a scalar, the component of a vector, as I seem to have been told in replies in this thread and the other thread?

11. Jan 19, 2018

### binbagsss

As suspected, but I was thrown off by fresh_42's response while trying to make the response helpful to addressing my question. In the component of the Lie derivative expression, we seek an index corresponding to this component, not simply evaluating at some point, right?

12. Mar 23, 2018

### binbagsss

Okay, and so compared to the correct version you give, this correct version is not a linear sum of the $\partial_\alpha$ because? The difference between $V^u\partial_u U^c$ and $V^u u^a \partial_a$ is only that in the first case the derivative acts on a vector component, whereas in the second case it doesn't; in both cases it is summed over. So how does the fact that it acts on a vector component give another vector component, leaving vector component × vector component, which is a scalar?

13. Mar 23, 2018

### binbagsss

Ok, so scalar fields and vector fields are invariant under coordinate transformations but the independent components are not. And so, if these components transform in a certain way, we say the corresponding tensor to which these components belong is a vector, scalar, etc. But, looking at the statement 'a rank (1,0) object is a vector': rank is picked up by looking at the indices, and the indices label the components. So when you look at $\partial_a V^a$ do you conclude (1,1), or do you just disregard the partial derivative in this index-counting and treat it completely differently?
And so do you say both $V^u$ and $V^u \partial_u = V$ are tensors of rank $(1,0)$, or do you not talk about vector 'fields' $V$ in terms of ranks? What about if I have $\partial_a V^u$: going with the above responses I conclude it is simply the derivative of a vector component and so is, let me label it, $V'^u$, a rank-(1,0) tensor. But what about $\partial_u V^u$, summed over now instead: still rank (1,0)? Thanks.

14. Mar 23, 2018

### Orodruin

Staff Emeritus
You have to be a bit more precise here. What you need to look at are the free indices. Do not count the summation indices. This ($\partial_a V^u$) is not a tensor component. It does not transform correctly. You just count the indices of the components. Here $\partial_u$ is the basis vector.

15. Mar 23, 2018

### binbagsss

Agreed, it is invariant, so we would not have a transformation rule... oh, I've contradicted myself with 'when we deduce rank we look at the components'... ha, i.e. only free indices and not summed over.

Okay, so can I ask though: for $a_ub^u$ do I conclude it is a scalar, or can I not conclude, since there are no free indices? I guess you conclude scalar, but I've got the $V^u\partial_u$ case in my mind; though as above, $\partial_u$ should not be included in this analysis?

16. Mar 23, 2018

### andrewkirk

Can you try re-writing the question? I don't understand it. It sounds like you are requesting a comparison of something to itself.

17. Mar 24, 2018

### binbagsss

Yeah, I noticed it was a quote of a quote and so my initial line had not posted, which is why I went on to explain, writing out the relevant terms I was talking about (missing, however, the minus term from the commutator; irrelevant though, since both objects have the same tensor properties for the question I had at hand).
The difference between a sum (as I should perhaps have said) of terms like $V^u\partial_u U^c$ and terms like $V^u u^a \partial_a$ is only that in the first case the derivative acts on a vector component, whereas in the second case it doesn't; in both cases it is summed over. So does the fact that it acts on a vector component in the first case give another vector component, leaving vector component × vector component, which is a scalar?

18. Mar 24, 2018

### Dick

$a_ub^u$ is invariant because its two factors transform in opposite ways. It's a scalar because both factors are numbers (presumably). $V^u\partial_u$ is also invariant for the same reason. But it's not a scalar, because $\partial_u$ isn't a number, it's an operator. It's a vector. Is that what you are asking?

19. Mar 24, 2018

### andrewkirk

The first one is more precisely written as $(V^u\partial_u) U^c$, which is a vector, post-multiplied by a scalar. It is not a vector acting on a scalar: there is no defined action of a vector on a scalar. The closest we have to such a thing is the directional derivative of a scalar field $f$ in direction $\vec V$. But that is written as $\nabla_{\vec V} f$, not $\vec V f$.

Usually we write scalar multiplication as pre-multiplication. There is no separate definition for post-multiplication. It is poor style to use it, because it creates confusion; but if it is used, it means the same thing as pre-multiplication. So a clearer way to write the first item is $U^c(V^u\partial_u)$, which is a standard pre-multiplication of a vector by a scalar.

I cannot tell what the second item $V^u u^a \partial_a$ is supposed to mean. It looks like the symbol $u$ is used both as an index for vector $\vec V$ and as a vector itself, with components $u^a$. That generates a naming conflict that makes the symbol string uninterpretable.
I find that, in seeking to understand symbol strings in this discipline, it is very helpful to identify the nature of each item in the string, and work out whether it can validly act upon the item that succeeds it. Rearranging the order of the terms can sometimes clarify this, subject to the restriction that we can only rearrange the order of two terms that form a scalar multiplication.

20. Mar 24, 2018

### stevendaryl

Staff Emeritus
That's a matter of definition. Some authors do say that a vector field $V$ in a manifold $\mathcal{M}$ is defined to be an operator on scalar fields that obeys the Leibniz rule.

21. Mar 24, 2018

### andrewkirk

Fair enough. Let's adopt that definition and see if we can make something out of post #17. Before doing anything, to dispel confusion, let's insist on using upper case for vectors, so that the second item becomes $V^u U^a \partial_a$. Next, let's remove the name conflict between $u$ and $U$ by replacing all lower case $u$, which are indices, by $b$. The first and second items then become:
$$V^b\partial_b U^c \quad\textrm{and}\quad V^b U^a \partial_a$$
Adopting the above-mentioned definition of the action of a vector on a scalar, we interpret these as
$$V^b \frac{dU^c}{dx^b} \quad\textrm{and}\quad V^b U^a \partial_a$$
Then the first item, after Einstein-summing over $b$, is a scalar expression that has $c$ as a free index, and the second is a vector, which can also be written as $V^b\vec U$, thereby highlighting that $b$ is a free index.

22. Mar 24, 2018

### stevendaryl

Staff Emeritus
It's important to distinguish between vectors and the components of vectors.

If you have a collection of vector fields $e_1, e_2, ...$, and you have a collection of scalar fields $\phi^1, \phi^2, ...$, then you can combine them to get a new vector field: $V = \phi^1 e_1 + \phi^2 e_2 + ...$. The components of a vector field are just scalar fields. So $V^u$ is a scalar field. It's not a (1,0) tensor. It can be a component of a (1,0) tensor, but it's a perfectly good scalar field in its own right. And $\partial_a V^u$ is another scalar field. It isn't a (1,1) tensor.

What's confusing is that some authors use index notation to indicate tensors of particular ranks. So they would use $V^\mu$ to mean a vector, rather than a component of a vector.
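The claim that $\partial_a V^u$ is not a tensor component can be seen concretely even in one dimension. The sketch below (my own example, with an arbitrarily chosen field and coordinate change) takes $V^x = x$ and $x = y^2$: the component transforms with the Jacobian, $V^y = (dy/dx)V^x$, but if $\partial_a V^b$ were a $(1,1)$ tensor component we would need $\partial_y V^y = (dy/dx)(dx/dy)\,\partial_x V^x = \partial_x V^x = 1$, which fails.

```python
# 1D demonstration that the plain derivative of a vector component
# does not transform as a (1,1) tensor component (sympy).
import sympy as sp

x, y = sp.symbols('x y', positive=True)
Vx = x                           # component V^x in the x chart
dy_dx = sp.diff(sp.sqrt(x), x)   # coordinate change x = y^2, so y = sqrt(x)

# The component transforms with the Jacobian: V^y = (dy/dx) V^x,
# then re-express everything in terms of y.
Vy = (dy_dx * Vx).subs(x, y**2)
print(sp.simplify(Vy))           # y/2

# d_x V^x = 1, but d_y V^y = 1/2: the naive derivative picks up an
# extra term from the changing Jacobian, so it is not a tensor.
print(sp.diff(Vy, y))            # 1/2
```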

23. Mar 24, 2018

### stevendaryl

Staff Emeritus
When people write $V^\mu e_\mu$ (where $e_\mu$ is a basis vector, often written as $\partial_\mu$ when it's a coordinate basis), that does not mean a scalar. It means a vector. The index $\mu$ does not mean which component, it means which vector. $e_1, e_2, ...$ are vectors, and a linear combination of vectors is another vector. That's all that $V^\mu e_\mu$ means: a linear combination of the basis vectors.