# Entropy and the Lagrangian

1. Mar 6, 2015

### Jimster41

just working my way through Susskind's "Theoretical Minimum". The Lagrangian formalism is novel territory for me, so this may be a dumb question. Kind of multiple choice, or fill in a real answer.

Why is there no term for the Entropy of a system in the Lagrangian?
Is it because time is an independent background variable?
Or is Entropy included in potential energy?
Or is it because the Lagrangian is approximate (like Newton) or something?
Or is it implied by "the principle of least action", where L is minimized in the action?
Or am I missing it altogether?

Last edited: Mar 6, 2015
2. Mar 6, 2015

### Staff: Mentor

Because in the formulation involved, there is no dissipation of mechanical energy (to thermal energy).
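A minimal numerical sketch of this point (my own illustration, not from the thread): for a conservative Lagrangian such as $L = \frac{1}{2}mv^2 - \frac{1}{2}kx^2$, the resulting equation of motion conserves total mechanical energy, so there is no channel through which energy could dissipate to thermal form. The mass and spring constant below are arbitrary choices.

```python
# Integrate the equation of motion from L = (1/2) m v^2 - (1/2) k x^2
# (a harmonic oscillator) and check that total mechanical energy stays
# constant -- the dynamics contain no dissipative term.

m, k = 1.0, 4.0        # mass and spring constant (arbitrary units)
x, v = 1.0, 0.0        # initial position and velocity
dt = 1e-4              # time step

def energy(x, v):
    """Total mechanical energy: kinetic + potential."""
    return 0.5 * m * v**2 + 0.5 * k * x**2

E0 = energy(x, v)
for _ in range(200_000):           # integrate for 20 time units
    # velocity-Verlet (leapfrog) step for a = -(k/m) x
    a = -(k / m) * x
    v_half = v + 0.5 * dt * a
    x = x + dt * v_half
    v = v_half + 0.5 * dt * (-(k / m) * x)

drift = abs(energy(x, v) - E0) / E0
print(drift)   # relative energy drift stays tiny over many oscillations
```

The symplectic integrator is chosen deliberately: it mirrors the conservative structure of the underlying dynamics, so the numerical energy error stays bounded instead of accumulating.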

Chet

3. Mar 6, 2015

### Jimster41

Okay, thanks. I found this in trying to understand your use of terms. Hadn't seen it laid out so definitively across the "Types of Energy".

http://www.energyeducation.tx.gov/energy/section_1/topics/forms_of_energy/index.html

Thermal Energy: The Energy of Moving or Vibrating Molecules
Mechanical Energy: The Energy of a substance or system because of its motion

But, uh.... isn't every "thing" that is moving made of "moving molecules"?

So if I had to pick an answer from my list: would I be correct in saying the Lagrangian (at least as Susskind introduces it in Lecture 6, which is about mechanics) is an approximation of "how things really work", restricted to an ideal mechanical world, and that at the end of the day any "complete" Lagrangian has to account for entropy's non-conservative effect (which is the only non-conserved... thing?)? In other words, in a real system there is always some dissipation of mechanical energy to thermal energy, and therefore entropy.
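To make the "real system" contrast concrete, here is a hedged toy sketch (mine, not Susskind's): the same oscillator as above, but with a drag term $-cv$ added by hand to the equation of motion to stand in for dissipation of mechanical energy to thermal energy. The plain Lagrangian $L = T - V$ produces no such term. The coefficient `c` is an arbitrary choice.

```python
# Damped oscillator: m x'' = -k x - c x'. The -c x' drag term is a
# stand-in for mechanical energy being lost to thermal energy; it does
# not come from the conservative Lagrangian L = T - V.

m, k, c = 1.0, 4.0, 0.5    # mass, spring constant, damping (arbitrary)
x, v = 1.0, 0.0
dt = 1e-4

def mech_energy(x, v):
    """Mechanical energy only -- the part that is no longer conserved."""
    return 0.5 * m * v**2 + 0.5 * k * x**2

E0 = mech_energy(x, v)
for _ in range(100_000):          # integrate for 10 time units
    a = (-k * x - c * v) / m
    v += dt * a
    x += dt * v                   # semi-implicit Euler step

print(mech_energy(x, v) < E0)     # True: mechanical energy has decreased
```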

I'm open to just holding onto the fact that this seems a bit confusing because it's decomposing reality step-wise. I just hate that feeling that I'm missing something other people find totally clear.

Last edited by a moderator: May 7, 2017
4. Mar 6, 2015

### Staff: Mentor

I think you pretty much captured the essence of it. As far as applying the Lagrangian formalism to a macroscopic system, my background is limited, so I don't know whether inclusion of dissipative effects is easily done.

Chet

5. Mar 6, 2015

### Jimster41

Thank you sir.

Later: Digging around some more, trying to get a sense of how meaningful or meaningless this question might be (or whether there's a pool of answers waiting down some strenuous path of complicated stuff that I can at least hope to reach someday), I did find this.

http://en.wikipedia.org/wiki/GENERIC_formalism

Last edited: Mar 6, 2015
6. Mar 6, 2015

### jfizzix

I could be wrong, but isn't entropy a statistical artifact that is a function of the possible configurations your system could be in while having the same macroscopically observable properties?

If so, there's no real reason for a Lagrangian describing the specific microscopic dynamics to explicitly include the entropy.

7. Mar 6, 2015

### Jimster41

Do you mean "macroscopic dynamics" in the second paragraph?

If so, I think I get you, and that's consistent with where I got to, and I hadn't thought of it that way.

What was bugging me is that, as I understand it, no isolated system trajectory through phase space can be perfectly reversible over time (no physically real one, anyway). Configuration space has a probabilistic slope, with an energy equivalence, so there has to be some "entropic" energy penalty tacked onto at least one of the Lagrangian steps in that trajectory, eventually.

I'd be lying if I said this was all totally clear (which is why I was asking for some clarification). I'm just trying to make sure I'm not too confused to make it to the next chapter.

Last edited: Mar 6, 2015
8. Mar 6, 2015

### kith

The equations of motion of Lagrangian, Hamiltonian and Newtonian mechanics describe the actual dynamics of a system. Systems of our everyday experience have a very large number of degrees of freedom (a typical order of magnitude is $6.022 \times 10^{23}$, the number of particles in a mole). This makes it impossible to use the equations of motion directly in general. Doing so is only possible in special cases. For rigid bodies, for example, it is sufficient to consider 6 degrees of freedom because all the distances between the constituents are approximated as fixed. Furthermore, you can often approximate the motion of a body by the motion of its center of mass.

But for gases, for example, such approximations can't be made and the equations of motion can't be used directly. People like Boltzmann and Gibbs used statistical reasoning instead which led to statistical mechanics and the derivation of thermodynamics from fundamental mechanics.

For macroscopic observables like temperature, many possible microscopic states correspond to the same value. Such observables are not related to the actual microscopic state of the system but to the ensemble of possible microscopic states compatible with a given macroscopic state. Entropy, for example, is a measure of how many such microscopic states are compatible with the macroscopic state. This is expressed in Boltzmann's formula.

If we use the equations of motion directly, we are only looking at a single microscopic state and its time evolution. So the entropy is simply zero all the time.
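The microstate-counting picture above can be illustrated with a toy model (my own sketch, not from the post): N coins, where the "macrostate" is the number of heads and the microstates are the individual head/tail sequences. Boltzmann's formula then reads $S = k_B \ln W$ with $W = \binom{N}{n}$; $k_B$ is set to 1 for simplicity.

```python
# Boltzmann entropy S = ln W for a toy system of N coins, where W is
# the number of head/tail sequences compatible with a given head count.

from math import comb, log

N = 100

def entropy(n_heads):
    """S = ln W with W = C(N, n_heads) microstates, k_B = 1."""
    return log(comb(N, n_heads))

# A fully specified microstate has W = 1, so S = 0 -- the "single
# trajectory" case: following one exact microstate gives zero entropy.
print(entropy(0))                      # 0.0: only one all-tails sequence

# The macrostate compatible with the most microstates (n = N/2)
# has the maximal entropy.
print(max(range(N + 1), key=entropy))  # 50
```

This matches the point that entropy is a property of the ensemble of compatible microstates, not of any single microscopic trajectory.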

Last edited: Mar 6, 2015
9. Mar 6, 2015

### Jimster41

I see, jfizzix, was saying it correctly. I believe I understand. I wouldn't have got confused if Susskind had qualified the introduction in that way, but that would have been awkward since he hasn't even introduced entropy yet. Part of the problem is that I'm coming at it with some (a teeny tiny bit) of knowledge and trying to get things back into the correct order...