# Entropy definition

1. Nov 10, 2009

### muzialis

Hi All,

I have a question on entropy.

I had some exposure to classical thermodynamics, and I remember entropy being defined as the integral of dQ / T ("differential" heat / absolute temperature).

Hence for any adiabatic transformation the entropy change is zero.

Then I came across a different definition, involving the logarithm of the possible number of states, with its related ideas of order, disorder, time arrow, etc.

I tried to understand more and so far I am not even sure the two are equivalent.

Indeed, one can imagine a system where work is done, achieving some improvement in order, still in adiabatic conditions.

I understand this is a trivial argument, and would welcome any help, reference or comment.

Thank you and Best Regards

Muzialis

2. Nov 10, 2009

### Gerenuk

Actually that's not a trivial question.
Heat is not difficult to define here: it's what's left over, so $\mathrm{d}Q=\mathrm{d}E+p\mathrm{d}V$. However, it's not obvious how one should define temperature here! The definition via reversible Carnot engines shows that, under some assumptions, and if you manipulate a set of systems reversibly, the temperatures and entropy can indeed be found from $\mathrm{d}S=\mathrm{d}Q/T$. For this definition you have to make sure you really know what reversible means for whatever system you are dealing with.

That's a definition I prefer. Using this definition one can show that if you want to maximize entropy while having a definite energy and volume, you get
$$\mathrm{d}E=x\mathrm{d}S-y\mathrm{d}V$$
where x and y are just parameters at this point. Physical reasoning identifies y as the pressure, provided it is constant over the whole boundary. Now, if you also identify x as the temperature, you get back to the same definition as in the thermodynamic case.
All this derivation requires some statistical mechanics. You can look up some concepts. I could write down the derivation once you understand it a little.

Basically you can take the $S=\ln \Omega$ definition of entropy and if you are willing to identify x as the temperature (you don't have another definition of temperature at this point), then you get back to the thermodynamic definition.

Or you stay with macroscopic quantities only, search for reversible processes, and define entropy and temperature by $T_1/T_2=\mathrm{d}Q_1/\mathrm{d}Q_2$ and $\mathrm{d}S=\mathrm{d}Q/T$ if you observe a heat transfer (under reversible conditions for the set of systems) due to a Carnot engine which at the same time generates or consumes work.
With this macroscopic picture you cannot say anything about microscopic states. If your microscopic system turns out to have the required reversible dynamics, then you can equate these definitions again.

Strictly speaking, the Carnot definition can't say anything about entropy or temperature if the total process isn't reversible. In that case the best you can do is hope you are dealing with a constant-temperature process, so that you at least know the temperature (which is what it was before). I think there is no guarantee that the entropy doesn't change wildly. The only thing you can say is that a free system will evolve only to a state of increased total entropy.
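As a concrete check of the macroscopic definition, here is a small sketch of my own (using an ideal gas, for which $\mathrm{d}Q = C_V\,\mathrm{d}T + (nRT/V)\,\mathrm{d}V$ along a quasi-static path; the constants and end states are assumed values): integrating dQ/T numerically along two different reversible paths between the same end states gives the same entropy change, which is what makes S a state function.

```python
import math

# For an ideal gas, dQ = Cv dT + (nRT/V) dV, so dS = dQ/T = Cv dT/T + nR dV/V.
# Integrating numerically along two different quasi-static paths between the
# same end states should give the same entropy change.
n_R = 8.314      # n*R for one mole, J/K
Cv = 1.5 * n_R   # monatomic ideal gas heat capacity at constant volume

def entropy_change(path, steps=100_000):
    """Integrate dQ/T along a parametrized path t -> (T, V), t in [0, 1]."""
    dS, dt = 0.0, 1.0 / steps
    for k in range(steps):
        T0, V0 = path(k * dt)
        T1_, V1_ = path((k + 1) * dt)
        Tm, Vm = 0.5 * (T0 + T1_), 0.5 * (V0 + V1_)   # midpoint rule
        dS += Cv * (T1_ - T0) / Tm + n_R * (V1_ - V0) / Vm
    return dS

T1, V1, T2, V2 = 300.0, 1.0, 450.0, 2.5   # assumed end states (K, arbitrary volume units)
straight = lambda t: (T1 + t * (T2 - T1), V1 + t * (V2 - V1))
detour   = lambda t: (T1 + t**2 * (T2 - T1), V1 + math.sin(math.pi * t / 2) * (V2 - V1))

exact = Cv * math.log(T2 / T1) + n_R * math.log(V2 / V1)
print(entropy_change(straight), entropy_change(detour), exact)
```

Both numerical integrals agree with the closed form $C_V \ln(T_2/T_1) + nR \ln(V_2/V_1)$, independent of the path taken.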

3. Nov 12, 2009

### muzialis

Gerenuk,

thank you for your useful explanation. I am looking up the concepts but would still be very interested in seeing the derivation of the equation you proposed.

I also thought I would present the concrete case that stimulated my curiosity.
It is the classic theory of rubber elasticity, with the main traits presented, for example, at http://www.doitpoms.ac.uk/tlplib/stiffness-of-rubber/entropy.php

Rubber stiffness is explained by entropic considerations (using the log definition), but I am struggling to understand how rubber's stiffness could be justified in adiabatic (hence dS = 0) conditions.

Thank you very much again

Muzialis

4. Nov 12, 2009

### Mapes

Make sure to distinguish between the total entropy change (which is zero for an adiabatic reversible process) and the configurational entropy change of the polymer chains (which decreases with deformation).

Just because the first is approximately zero doesn't disprove the model of rubber elasticity. Entropy can also change (or balance out to zero) due to a temperature change in the material. (Question: would adiabatically, reversibly deformed rubber get hotter or colder?)

5. Nov 12, 2009

### muzialis

Hi there,

and many thanks for your response. The introduction of total and partial entropy confused me even more, but I suppose that is usual just before understanding.

If you stretch rubber it gives out heat. I appreciate that, in adiabatic conditions, this would mean the temperature would increase.

Still, if in adiabatic conditions the total entropy variation is zero, and the conformational entropy is changing, what else is balancing out the entropy? How can this be done by temperature?

thank you very much.

Kindest Regards

Muzialis

6. Nov 12, 2009

### Mapes

The entropy balances out through an increase in temperature. The configurational entropy of the straightened chains decreases, the vibrational entropy of the bonds increases with increasing temperature, and the total entropy change is approximately zero.

Interestingly, the opposite is predicted in metals; since stretching increases the entropy of primary bonds, the temperature is predicted to decrease during adiabatic reversible stretching.

(But note that no process is truly reversible, and irreversibility will tend to increase the temperature during stretching in both cases due to lost work.)
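The entropy balance above can be turned into a back-of-envelope number. This is my own illustrative sketch: the heat capacity and conformational entropy change are assumed values, not data from the thread.

```python
# If stretching removes conformational entropy dS_conf < 0 and the process is
# adiabatic and reversible (dS_total = 0), the vibrational entropy must rise by
# the same amount, which means the temperature rises by roughly dT = -T*dS_conf/C.
T = 300.0          # K, initial temperature
C = 1.8            # J/K, heat capacity of a small rubber sample (assumed value)
dS_conf = -0.006   # J/K, conformational entropy lost on stretching (assumed value)

dT = -T * dS_conf / C
print(f"estimated adiabatic temperature rise: {dT:.2f} K")  # ≈ 1.00 K
```

With these (made-up) numbers the sample warms by about a degree, consistent with the everyday observation that a quickly stretched rubber band feels warm against the lip.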

7. Nov 12, 2009

### muzialis

Mapes,

many thanks for this.

Basically, the thermodynamic definition is concerned with global entropy, while you are talking about partial contributions.

But then, I am still perplexed. In the entropic theory of rubber elasticity, it is said that the restoring force after a sample is stretched derives from the lower (conformational) entropy of this better-organized state, hence creating the tendency to go back to the relaxed configuration (higher entropy).

From what you say, this concept applies to the conformational entropy. Why does it not apply to the bond vibrational one?
In the equation dU = TdS + FdL (just the first law of thermodynamics for a piece of rubber stretched by a force F along a length L), the dS is the total entropy, not the vibrational or conformational one...

But my most vivid thanks for all your posts, I am sure I am getting there

Muzialis

8. Nov 12, 2009

### Mapes

Because bond energy dominates over entropy when we're talking about primary bonds. The mechanism of elastic recovery in this case is the driving force to find a minimum-energy bond length rather than the driving force to increase conformational entropy. (Does this answer your question?)

9. Nov 12, 2009

### muzialis

10. Nov 12, 2009

### Mapes

I must be misunderstanding. What's your complete question?

11. Nov 12, 2009

### muzialis

Mapes,

in relation to our last conversation, the question is: why does rubber recover from the stretched configuration?

The bond energy is not deemed the cause at all in the link I sent you; rather, the reason is the conformational entropy.

But as you say, the conformational entropy is counterbalanced by the vibrational one (as one would expect from the fact that the transformation is adiabatic, dS = 0).

So now, if dS = 0, then the equation in the link, F = dS / dL, indicated as the origin of the force, says that in adiabatic conditions rubber has no stiffness!

Hope it is clearer now, and thanks for your patience

Muzialis

12. Nov 12, 2009

### Mapes

Ah, got it. It's two different entropies. For a polymer,

$$F\approx -T\frac{\partial S_\mathrm{conformational}}{\partial L}$$

$$dS_\mathrm{total}=0$$

13. Nov 12, 2009

### muzialis

Mapes,

that is exactly the point! Why, for "a polymer", does entropy equal the conformational entropy? Is there not a level of arbitrariness there? In the laws of thermodynamics no specific entropy is mentioned!

Thanks again

Muzialis

14. Nov 12, 2009

### Mapes

Well, there are always vibrational, configurational, conformational, electronic, nuclear, etc., components of entropy. But the conformational entropy is what gets tweaked most when you stretch an elastomer. That's why the other terms are assumed to be negligible in that first equation.

15. Nov 12, 2009

### muzialis

Mapes,

you have to forgive this combination of lack of understanding and stubbornness...

I understand there might be various entropic components, but if, as you say, any but the conformational one is considered negligible for the case of an elastomer, then the entropy balance cannot be zero (as it is those other, supposedly negligible, contributions that would ensure the total entropy variation in an adiabatic process is zero).

Are we not facing a contradiction here?

Basically I say: adiabatic, hence dQ = 0, hence dS = 0. But if every contribution to entropy apart from the conformational one is negligible, then dS cannot be zero.

Thank you for this most interesting discussion

Muzialis

16. Nov 12, 2009

### Mapes

I do see what you're saying. The problem may lie in treating a polymer thermodynamically as a homogeneous system. When considering the stretching of individual chains, this assumption breaks down, and we have to assume that the only significant consequence of stretching is a decrease in conformational entropy. Sorry, I haven't seen a better explanation.

17. Nov 12, 2009

### muzialis

Mapes,

I see what you say.

Let me thank you very much for your valuable inputs.

Have a good evening

Marco

18. Nov 12, 2009

### Gerenuk

I'll do the derivation for energy and volume. It is similar if you have a string with a length (instead of a volume).
First you consider a lot of identical systems, since all your statements about one system are really statistical statements about an ensemble of many copies of your system. If you look at all these systems, then $n_i$ of them will be in a state $i$ with energy $E_i$ and volume $V_i$. Let's take an example. If you label the states with indices $i\in \{1,\dots,5\}$, the ensemble might look like 112322355333343, or, ordered, 112223333333455 (I ordered the copies according to their index). Now, if all other combinations of 1, 2, 3, 4, 5 are also possible (say 111111111113335), how many orderings give exactly the same combination as in my first example? Combinatorics tells us that (as order doesn't play a role) it is
$\frac{15!}{2!\,3!\,7!\,1!\,2!}$, or in general
$$\Omega=\frac{n!}{\prod_i n_i!}$$
Of course we assume that, looking at the problem this way with the labeling, a state like 112223333333455 is much more likely than 111111111111111, since the latter is unique whereas the first is the same as a reordered 131233223334535 and all its other permutations. Defining
$$S=\ln\Omega$$
that's basically the second law of thermodynamics, saying that the entropy of an isolated system always increases (I skip the Boltzmann constant for brevity). Since there are so many particles involved in a gas, one can show that the "more likely" states are actually incredibly more likely than all others. That's why the second law isn't violated even though it could be.
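The counting in the example above can be reproduced directly; here is a small sketch of the multinomial formula applied to the sequence from the post:

```python
from math import factorial
from collections import Counter

# The sequence from the post contains the state labels with multiplicities n_i;
# the number of distinct orderings is the multinomial coefficient n! / prod(n_i!).
sequence = "112322355333343"
counts = Counter(sequence)            # {'3': 7, '2': 3, '1': 2, '5': 2, '4': 1}

omega = factorial(len(sequence))
for n_i in counts.values():
    omega //= factorial(n_i)          # exact integer division at every step

print(omega)   # 15!/(2!·3!·7!·1!·2!) = 10810800
```

So this one "macrostate" already corresponds to over ten million labeled arrangements, while 111111111111111 corresponds to exactly one.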

--------------- TBC ---------------

Last edited: Nov 12, 2009
19. Nov 12, 2009

### Gerenuk

We can introduce the probabilities
$$p_i=\frac{n_i}{n}$$
that a copy (of those many) is in state i. With the assumption that n is a very large number (and some more large-number assumptions), we can use Stirling's approximation of the factorial to derive
$$S=-\sum p_i\ln p_i$$

And remember we argued that the entropy should attain a maximum
$$S\to\text{max}$$
Additionally, we would want the average energy and volume of all these copies to be the values of energy and volume that we measure on our single system. This corresponds to the equations
$$E=\sum_i E_i p_i$$
$$V=\sum_i V_i p_i$$
and of course
$$1=\sum_i p_i$$
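The Stirling step can be checked numerically. This is my own sketch (the probabilities are assumed values): it compares $\ln\Omega / n$ with $-\sum_i p_i \ln p_i$ for growing $n$ at fixed $p_i$.

```python
import math

# Check that ln(n!/prod(n_i!)) / n approaches -sum(p_i ln p_i) as n grows,
# with p_i = n_i/n held fixed -- the Stirling approximation step.
p = [0.5, 0.3, 0.2]   # assumed probabilities (chosen so n*p_i is an integer below)

def exact_entropy_per_copy(n):
    counts = [round(pi * n) for pi in p]   # occupation numbers n_i = p_i * n
    # lgamma(k+1) = ln(k!), which avoids overflow for large n
    log_omega = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_omega / n

shannon = -sum(pi * math.log(pi) for pi in p)
for n in (10, 100, 10_000, 1_000_000):
    print(n, exact_entropy_per_copy(n), shannon)
```

The exact count per copy is always slightly below the Shannon value and converges to it; the correction is of order $(\ln n)/n$.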

Last edited: Nov 12, 2009
20. Nov 12, 2009

### Gerenuk

The task of maximizing the entropy under the given constraints can be solved with Lagrange multipliers, to find
$$p_i=\frac{e^{\beta E_i-\alpha V_i}}{Z}$$
$$Z=\sum_i e^{\beta E_i-\alpha V_i}$$

Plugging $p_i$ back into the definition of $S$ and doing a bit of maths, you get
$$\mathrm{d}S=\beta \mathrm{d}E-\alpha \mathrm{d}V$$
which you can rearrange to
$$\mathrm{d}E=\frac{1}{\beta}\mathrm{d}S-\frac{\alpha}{\beta}\mathrm{d}V$$
Now we define these global parameters, which so far have no meaning, as temperature and pressure. The identification with pressure actually makes sense, since we know from physics, where pressure has a meaning, that $\mathrm{d}E=-p\mathrm{d}V$ (for adiabatic changes).

This way you arrive at your identity
$$\mathrm{d}E=T\mathrm{d}S-p\mathrm{d}V$$

The advantage over the direct thermodynamic definition is that you still have the framework behind it, with all its equations for the possible states of the copies, $E_i$, $V_i$, and the probabilities $p_i$, which you can calculate with the partition function $Z$, provided you know these system state energies and volumes. So you can calculate the macroscopic quantities from the microscopic model, but not the other way round. You still know that
$$p_i=\frac{e^{-(E_i+pV_i)/T}}{Z}$$
$$Z=\sum_i e^{-(E_i+pV_i)/T}$$
$$S=\frac{E+pV}{T}+\ln Z$$
(Note: maybe I got some signs wrong somewhere but the result is correct)
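These closing identities can be verified numerically. This is my own toy check (the states, temperature and pressure are made-up values; note that the pV term appears alongside E when the volume states are kept in the exponent):

```python
import math

# Build p_i = exp(-(E_i + p*V_i)/T)/Z for a few made-up states and verify that
# the Gibbs entropy S = -sum p_i ln p_i equals (E + p*V)/T + ln Z, where E and
# V are the ensemble averages.
E_states = [0.0, 1.0, 2.5, 4.0]   # assumed state energies
V_states = [1.0, 1.2, 0.9, 1.5]   # assumed state volumes
T, p = 1.7, 0.4                   # temperature and pressure (arbitrary units)

weights = [math.exp(-(Ei + p * Vi) / T) for Ei, Vi in zip(E_states, V_states)]
Z = sum(weights)
probs = [w / Z for w in weights]

E = sum(pi * Ei for pi, Ei in zip(probs, E_states))   # average energy
V = sum(pi * Vi for pi, Vi in zip(probs, V_states))   # average volume
S = -sum(pi * math.log(pi) for pi in probs)           # Gibbs entropy

print(S, (E + p * V) / T + math.log(Z))   # the two should agree
```

The agreement is exact (to floating-point precision), since taking $\ln p_i = -(E_i + pV_i)/T - \ln Z$ inside the entropy sum reproduces the identity term by term.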

Last edited: Nov 12, 2009