# The Lagrangian/Hamiltonian dynamics of a particle moving in a manifold

1. Dec 29, 2009

### vertices

Hi

I am trying to work through the solution to the attached problem (see attachments). Now, I can't understand several things in the solution:

The Lagrangian in question is:

$$L=\frac{m}{2}\,g_{ij}(x)\,\dot{x}^{i}\dot{x}^{j}$$

1) Is $$g_{ij}$$ a matrix with diag(-1,1,1,1), i.e. the metric tensor? If so, why has it been labelled $$g_{ij}(x)$$, i.e. a function of x? Is Einstein's summation convention implied in the Lagrangian?

2) Why is the momentum:

$$p_{i}:=\frac{\partial L}{\partial \dot{x}^{i}}=m\,g_{ij}\,\dot{x}^{j}$$

...surely there is a factor of 1/2 missing (is the answer quoted in the solutions just wrong?)

3) The bit in the solution where he says $${\delta}A=0$$ - I think he is just setting the following expression to zero:

$$\delta A=\int dt\left[\frac{\partial L}{\partial x^{i}}\,\delta x^{i}+\frac{\partial L}{\partial \dot{x}^{i}}\,\delta\dot{x}^{i}\right]=0$$ (from the theory of functionals)

what I don't understand is, why he is saying that:

$$\frac{\partial L}{\partial x^{i}}=\frac{m}{2}\,\partial_{i}g_{jk}\,\dot{x}^{j}\dot{x}^{k}$$

-where has the 'k' come from?

and why is it that

$$m\,g_{jk}\,\delta\dot{x}^{i}\,\dot{x}^{j}$$

equals the expression two lines below it?

As ever, any help would be much appreciated:)

#### Attached Files:

• SOLUTION.pdf
Last edited: Dec 29, 2009
2. Dec 29, 2009

### George Jones

Staff Emeritus
I'm not sure what manifold is being used, but assume it is Minkowski spacetime. The coordinate system used doesn't have to be a standard Cartesian inertial coordinate system. What do the components of the metric for special relativity look like when spherical coordinates are used?
No. Show why you think there is a factor of 1/2 missing. (I think I already know what you did, but I want you to show it.)

3. Dec 30, 2009

### vertices

Interesting question - I am not too sure about this, but I suspect the metric would look exactly the same, as we require:

$$c\,dt^{2}-dr^{2}+d\phi^{2}+d\theta^{2}=0,$$ of course, 't' also being a co-ordinate in space-time.

well:

$$L=\frac{m}{2}\,g_{ij}(x)\,\dot{x}^{i}\dot{x}^{j}$$

So to get the momentum, we just differentiate with respect to $${\dot{x}}^{i}$$, holding all other variables constant. There is one $${\dot{x}}^{i}$$, which just disappears from the Lagrangian, so the Lagrangian is pretty much unchanged (so the 1/2 should be there?).

Thanks.

4. Dec 30, 2009

### George Jones

Staff Emeritus
This seems reasonable, but take a look at

This can be seen geometrically as well.
But if you differentiate with respect to $\dot{x}^i$, then the repeated index $i$ appears three times, while the Einstein convention allows only for indices that appear once or twice in a term. For differentiation in this situation, use an index that has not yet appeared; for example, differentiate with respect to $\dot{x}^k$. Don't forget to use the product rule, as it is the product rule that makes the factor of 1/2 disappear.

What do you get when you evaluate

$$\frac{d \dot{x}^i}{d \dot{x}^k}?$$

Last edited: Dec 30, 2009
5. Dec 30, 2009

### vertices

I see.

If I differentiate wrt $$\dot{x}^k$$, I get

$$\frac{\partial L}{\partial \dot{x}^{k}}=\frac{m}{2}g_{ij}\left[\frac{d \dot{x}^{i}}{d \dot{x}^{k}}\,\dot{x}^{j}+\frac{d \dot{x}^{j}}{d \dot{x}^{k}}\,\dot{x}^{i}\right]$$

This is equal to:

$$\frac{m}{2}g_{ij}[{\delta}_{ik}\dot{x}^{j}+{\delta}_{jk}\dot{x}^{i}]$$

which equals:

$$\frac{m}{2}g_{ij}\dot{x}^{j}$$ if k=i

OR

$$\frac{m}{2}g_{ij}\dot{x}^{i}$$ if k=j

So the factor of 1/2 doesn't cancel as such. Where am I going wrong?

Last edited: Dec 31, 2009
6. Dec 30, 2009

### vertices

And another question: by the summation convention, is the Lagrangian:

$$L=\frac{m}{2}\,g_{ij}(x)\,\dot{x}^{i}\dot{x}^{j}$$

really

$$L=\sum_{i}\sum_{j}\frac{m}{2}\,g_{ij}(x)\,\dot{x}^{i}\dot{x}^{j}$$

?

7. Jan 1, 2010

### vertices

Hi George...

I was wondering if you had any thoughts about where I am going wrong in relation to the factor of a half? I am very anxious to know!

Thanks:)

8. Jan 3, 2010

### diazona

Yes, that's what the summation convention is: any index which occurs once as a superscript and once as a subscript is implicitly summed over. Sometimes people use a "relaxed" summation convention in which any index that occurs twice, even when they're both subscripts or both superscripts, is implicitly summed over, but that's not all that common and it has the potential to be more confusing.
Remember, though, that i and j are repeated indices and are thus being summed over. So the case k = i corresponds to exactly one term in the sum over i, and the case k = j corresponds to exactly one term in the sum over j. It's not an either/or situation; both of those terms,
$$\frac{m}{2}g_{ij}\dot{x}^{j}$$
AND
$$\frac{m}{2}g_{ij}\dot{x}^{i}$$
occur exactly once, so the result is the sum of both.

Here's a better, less hand-wavy way of looking at it: you have
$$\frac{m}{2}g_{ij}\bigl[\delta^i_k \dot{x}^j + \delta^j_k \dot{x}^i\bigr]$$
(I fixed the index positioning on your delta tensors.) Splitting this up into two terms, you get
$$\frac{m}{2}g_{ij}\delta^i_k \dot{x}^j + \frac{m}{2}g_{ij}\delta^j_k \dot{x}^i$$
which is equal to
$$\frac{m}{2}g_{kj} \dot{x}^j + \frac{m}{2}g_{ik} \dot{x}^i$$
Now, in one of these terms (doesn't matter which) you can (a) switch the order of indices in the metric, since it's a symmetric tensor, and (b) relabel the repeated index, either i or j. If I do this in the second term, I get
$$\frac{m}{2}g_{kj} \dot{x}^j + \frac{m}{2}g_{kj} \dot{x}^j = mg_{kj} \dot{x}^j$$
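diazona's index bookkeeping can be sanity-checked symbolically. A minimal SymPy sketch (the 2D coordinates, velocity symbols, and generic symmetric metric here are my own illustrative choices, not from the thread):

```python
import sympy as sp

# Coordinates and velocities as independent symbols (2D for brevity)
x0, x1, v0, v1 = sp.symbols('x0 x1 v0 v1')
m = sp.symbols('m', positive=True)

# A generic symmetric metric whose entries depend on position
g00 = sp.Function('g00')(x0, x1)
g01 = sp.Function('g01')(x0, x1)
g11 = sp.Function('g11')(x0, x1)
g = sp.Matrix([[g00, g01], [g01, g11]])
v = sp.Matrix([v0, v1])

# L = (m/2) g_ij xdot^i xdot^j, written as an explicit double sum
L = sp.Rational(1, 2) * m * (v.T * g * v)[0]

# Differentiate with respect to xdot^k, taking k = 0
pk = sp.diff(L, v0)

# Expected: p_k = m g_{kj} xdot^j  (the 1/2 cancels via the product rule)
expected = m * (g[0, 0] * v0 + g[0, 1] * v1)
print(sp.simplify(pk - expected))  # 0
```

The product rule produces two terms, and the symmetry of $g_{ij}$ lets them merge, which is exactly the relabeling step above.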

9. Jan 3, 2010

### vela

Staff Emeritus
The matrix diag(-1,1,1,1) is a representation of the Minkowski space metric in its natural basis. If you were to use a different set of coordinates, e.g. spherical, or you were looking at a different manifold, you'd have a different matrix, one that could have elements that are functions of the coordinates. Generally speaking, the matrix elements are functions of the coordinates.

In the second part of the problem (in the PDF), for example, the manifold is a 2-sphere, and $$g_{ij} = \mathrm{diag}(1, \sin^2\phi)$$.
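vela's point, that the metric components are generally functions of the coordinates, can be illustrated by computing the induced metric of the unit 2-sphere from an embedding in R^3. A minimal SymPy sketch (the embedding and the convention that $\phi$ is the polar angle are my own choices, made so that the result matches diag(1, sin²φ) above):

```python
import sympy as sp

# Embed the unit 2-sphere in R^3 with coordinates (phi, theta),
# phi being the polar angle (an illustrative convention).
phi, theta = sp.symbols('phi theta')
X = sp.Matrix([sp.sin(phi) * sp.cos(theta),
               sp.sin(phi) * sp.sin(theta),
               sp.cos(phi)])

coords = [phi, theta]
# Induced metric: g_ij = (dX/du^i) . (dX/du^j)
g = sp.Matrix(2, 2, lambda i, j:
              sp.simplify(X.diff(coords[i]).dot(X.diff(coords[j]))))
print(g)  # Matrix([[1, 0], [0, sin(phi)**2]])
```

The off-diagonal components vanish and $g_{\theta\theta}=\sin^2\phi$ depends on position, which is exactly why $g_{ij}$ is written as $g_{ij}(x)$.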

10. Jan 4, 2010

### vertices

Fantastic - I didn't realise you could do this but it makes perfect sense. Thanks again diazona.

Aah this is what George alluded to above, but I didn't put two and two together - thanks vela.

The issue now is to show why this is the case:

$$\frac{\partial L}{\partial x^{i}}= \frac{m}{2}\,\partial_i g_{jk}\,\dot{x}^{j}\dot{x}^{k} - (\partial_k g_{ij})\,\dot{x}^{k}\dot{x}^{j}$$

I cannot see how you can differentiate an expression with respect to a variable that doesn't even appear in it. (x and x(dot) are independent variables in the Lagrangian formulation.)

11. Jan 4, 2010

### vela

Staff Emeritus
Look at the way you wrote the Lagrangian the very first time, in your first post. $$x^i$$ appears via the metric.

12. Jan 4, 2010

### vertices

Oh okay. So it is a function of x^i.

In which case, I do understand where the first term comes from, but since the velocities are independent of the position, I don't see where the second term comes from, and why it is negative. Can you shed any light on this?

(And, I don't want to go off on a tangent, but what does it mean to differentiate a matrix, do we literally differentiate each of the components of it - or has the matrix been converted into an expression somehow? I know that question is probably really stupid, so sorry for asking it!)

13. Jan 6, 2010

### vertices

I am unashamedly bumping this thread (I'm not normally so obsessed with Lagrangians believe it or not(!) - it's just that I have an exam coming up quite soon!)

Thanks to the helpful contributors in this thread, I understand that the metric $$g_{ij}$$ is a function of position, and thus can be differentiated wrt a position coordinate. This explains the first term in the expression below, but not the second one:

$$\frac{\partial L}{\partial x^{i}}= \frac{m}{2}\,\frac{\partial}{\partial x^{i}} g_{jk}\,\dot{x}^{j}\dot{x}^{k} - \left(\frac{\partial}{\partial x^{k}} g_{ij}\right)\dot{x}^{k}\dot{x}^{j}$$

As a reminder, the Lagrangian in question is:

$$L=\frac{m}{2}\,g_{ij}(x)\,\dot{x}^{i}\dot{x}^{j}$$

In particular, what is going on with the indices?

I'm guessing I can't see something quite obvious...

Last edited: Jan 6, 2010
14. Jan 8, 2010

### diazona

OK, so in summary: your Lagrangian is
$$L = \frac{m}{2}g_{ij}(x) \dot{x}^i \dot{x}^j$$
and you need to show that
$$\frac{\partial L}{\partial x^i} = \frac{m}{2}\partial_i g_{jk} \dot{x}^j \dot{x}^k - \frac{m}{2}(\partial_k g_{ij})\dot{x}^k \dot{x}^j$$
I'm guessing you just forgot to write the $\frac{m}{2}$ in the last term; it has to be there for unit consistency... anyway, just making sure I've got the problem straight. Having these two separate threads is confusing.

Now that I look at it, I don't think that second term should be there at all. If we relabel the i index in the Lagrangian to k, so
$$L = \frac{m}{2}g_{kj}(x) \dot{x}^k \dot{x}^j$$
and differentiate with respect to $x^i$,
$$\frac{\partial L}{\partial x^i} = \frac{m}{2}\frac{\partial}{\partial x^i}\bigl[g_{kj}(x) \dot{x}^k \dot{x}^j\bigr]$$
I can write this out component-by-component, for example in 2D space:
$$\frac{\partial L}{\partial x^0} = \frac{m}{2}\frac{\partial}{\partial x^0}\bigl[g_{00}(x) \dot{x}^0 \dot{x}^0 + g_{01}(x) \dot{x}^0 \dot{x}^1 + g_{10}(x) \dot{x}^1 \dot{x}^0 + g_{11}(x) \dot{x}^1 \dot{x}^1\bigr]$$
and similarly for the other component of the derivative.

Now switch to the other expression, the answer you're looking for. That can be written
$$\frac{\partial L}{\partial x^i} = \frac{m}{2}\frac{\partial}{\partial x^i}\bigl[g_{jk} \dot{x}^j \dot{x}^k\bigr] - \frac{m}{2}\frac{\partial g_{ij}}{\partial x^k}\dot{x}^k \dot{x}^j$$
and again, I can write it out in full components ($x^0$ only),
$$\frac{\partial L}{\partial x^0} = \frac{m}{2}\frac{\partial}{\partial x^0}\bigl[g_{00}(x) \dot{x}^0 \dot{x}^0 + g_{01}(x) \dot{x}^0 \dot{x}^1 + g_{10}(x) \dot{x}^1 \dot{x}^0 + g_{11}(x) \dot{x}^1 \dot{x}^1\bigr] - \frac{m}{2}\biggl[\frac{\partial g_{00}(x)}{\partial x^0} \dot{x}^0 \dot{x}^0 + \frac{\partial g_{01}(x)}{\partial x^0} \dot{x}^0 \dot{x}^1 + \frac{\partial g_{00}(x)}{\partial x^1} \dot{x}^1 \dot{x}^0 + \frac{\partial g_{01}(x)}{\partial x^1} \dot{x}^1 \dot{x}^1\biggr]$$
For one thing, this expression has derivatives with respect to $x^1$ which aren't present in the other expression above. That means it definitely can't be true for a general metric - for example, it looks like
$$g_{ij} = \begin{pmatrix}\bigl(x^1/a\bigr)^2 & 0 \\ 0 & 1\end{pmatrix}$$
would be a counterexample. And besides, the expansion of the first term is identical to the expansion of the derivative of the Lagrangian, from above. I really don't see any reason for the mysterious second term to be there, and it's hard to miss things when you write out all the components.

Where did that expected answer come from, anyway?
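diazona's candidate counterexample can be checked directly. This SymPy sketch (symbol names are mine) confirms that for the metric diag((x¹/a)², 1) the derivative of the Lagrangian contains only the single $\frac{m}{2}\partial_i g_{jk}\dot{x}^j\dot{x}^k$ term:

```python
import sympy as sp

# diazona's candidate counterexample metric: g = diag((x1/a)^2, 1)
x0, x1, v0, v1, a, m = sp.symbols('x0 x1 v0 v1 a m')
g = sp.Matrix([[(x1 / a)**2, 0],
               [0,           1]])
v = sp.Matrix([v0, v1])
L = sp.Rational(1, 2) * m * (v.T * g * v)[0]

# dL/dx^0: the metric has no x0 dependence, so this vanishes
print(sp.diff(L, x0))  # 0

# dL/dx^1: only the (m/2) d_1 g_jk xdot^j xdot^k term appears,
# i.e. m * x1 * v0**2 / a**2, with no extra second term
print(sp.diff(L, x1))
```

If the quoted answer's extra $-\frac{m}{2}(\partial_k g_{ij})\dot{x}^k\dot{x}^j$ term were really there, the second derivative above would pick up an additional contribution; it doesn't.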

15. Jan 8, 2010

### diazona

Generally speaking, yeah, you just differentiate each component. It all follows from the definition of the derivative. For a matrix A that is a function of x,
$$\frac{\mathrm{d}A}{\mathrm{d}x} = \lim_{\epsilon\to 0}\frac{A(x + \epsilon) - A(x)}{\epsilon}$$
(I'm sure you know this) This definition involves only linear operations, i.e. addition, subtraction, and multiplication or division by scalars, so the derivative is a linear operator as well. And to apply any linear operator to a matrix, you just apply it to each element.
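A quick illustration of this entrywise differentiation in SymPy (the matrix here is an arbitrary example of my own):

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[x**2, sp.sin(x)],
               [1,    sp.exp(x)]])

# Differentiating a matrix just differentiates each entry
dA = A.diff(x)
print(dA)  # Matrix([[2*x, cos(x)], [0, exp(x)]])
```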

Things get a little messier in general relativity (or differential geometry), because the subtraction is no longer necessarily a linear operator. That's because $A(x + \epsilon)$ and $A(x)$ are defined at different points in space, so they're no longer part of the same vector space, and you have to specify a mapping between the vector space of matrices at $x + \epsilon$ and the vector space of matrices at $x$ in order to be able to subtract them.

(That's not what's going on in your Lagrangian problem, is it?)

16. Jan 8, 2010

### vertices

Thanks again for the replies diazona..

Yes, your penultimate post makes it very clear that the second term should not be there!

Perhaps I should state the problem in full. What I don't understand is why this:

$$\frac{d}{dt} (mg_{ij}\dot{x}^j ) - \frac{1}{2}m \frac{\partial}{\partial x^i}(g_{kj})\dot{x}^k \dot{x}^j =0$$

implies this:

$${\partial}_k g_{ij} \dot{x}^k \dot{x}^j + g_{ij} \ddot{x}^j - \frac{1}{2} {\partial}_i g_{jk} \dot{x}^j \dot{x}^k=0$$

(It's something my lecturer wrote down on the blackboard; there is a possibility I might have copied it down incorrectly, but this is unlikely.)

As regards where

$$\frac{\partial L}{\partial x^i} = \frac{m}{2}\partial_i g_{jk} \dot{x}^j \dot{x}^k - \frac{m}{2}(\partial_k g_{ij})\dot{x}^k \dot{x}^j$$

came from:

I assumed that $$\frac{d}{dt} (mg_{ij}\dot{x}^j )$$ is just equal to the $$mg_{ij} \ddot{x}^j$$ term in the second expression. So if we get rid of both these equal terms we are left with:

$$- \frac{1}{2} \frac{\partial}{\partial x^i}(g_{kj})\dot{x}^k \dot{x}^j = {\partial}_k g_{ij} \dot{x}^k \dot{x}^j - \frac{1}{2} {\partial}_i g_{jk} \dot{x}^j \dot{x}^k$$

Although it seems a trivial step to me, perhaps I was wrong to do this?

Thanks

Last edited: Jan 8, 2010
17. Jan 8, 2010

### diazona

Aha, yeah, it was that assumption that messed you up. The metric depends on position, and the position depends on time along the trajectory, so
$$\frac{\mathrm{d}}{\mathrm{d}t}\bigl[g_{ij}\dot{x}^j\bigr] = \frac{\mathrm{d}g_{ij}}{\mathrm{d}t}\dot{x}^j + g_{ij}\ddot{x}^j = \frac{\partial g_{ij}}{\partial x^k}\dot{x}^k\dot{x}^j + g_{ij}\ddot{x}^j$$
(the last equality follows from the chain rule)
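The chain-rule step can be verified for a concrete, arbitrarily chosen position-dependent metric component (the functional form of g below is purely illustrative):

```python
import sympy as sp

t = sp.symbols('t')
x0 = sp.Function('x0')(t)
x1 = sp.Function('x1')(t)

# A concrete position-dependent metric component (illustrative choice)
g = sp.exp(x0) + x1**2

# Left side: total time derivative of g(x(t)) * xdot^1
lhs = sp.diff(g * sp.diff(x1, t), t)

# Right side: chain rule (dg/dx^k) xdot^k xdot^1, plus g * xddot^1
rhs = ((sp.diff(g, x0) * sp.diff(x0, t)
        + sp.diff(g, x1) * sp.diff(x1, t)) * sp.diff(x1, t)
       + g * sp.diff(x1, t, 2))

print(sp.simplify(lhs - rhs))  # 0
```

So the total time derivative of $g_{ij}\dot{x}^j$ really does generate the extra $\partial_k g_{ij}\,\dot{x}^k\dot{x}^j$ term, even though the metric has no explicit time dependence.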

18. Jan 8, 2010

### vertices

Fantastic! I can't believe I spent so many hours on something as trivial as this!

Thanks for all your help diazona!!