What is the true definition of entropy?

In summary: in classical thermodynamics, entropy is defined through the integral of dq / T (differential heat / absolute temperature). For the definition involving the logarithm of the number of possible states, with its related ideas of order, disorder, the arrow of time, etc., one can show that maximizing entropy at a definite energy and volume leads to [itex]\mathrm{d}E=x\mathrm{d}S-y\mathrm{d}V[/itex]. Deriving this equation requires some statistical mechanics.
  • #1
muzialis
Hi All,

I have a question on entropy.

I had some exposure to classical thermodynamics and I remember the entropy being defined as the integral of dq / T ("differential" heat / absolute temperature).

Hence for any adiabatic transformation the entropy change is zero.

Then I came across a different definition, involving the logarithm of the possible number of states, with its related ideas of order, disorder, time arrow, etc.

I tried to understand more and so far I am not even sure the two are equivalent.

Indeed, one can imagine a system where work is done, achieving some improvement in order, still in adiabatic conditions.

I understand this may be a trivial argument and would welcome any help, reference, or comment.

Thank you and Best Regards

Muzialis
 
  • #2
Actually that's not a trivial question.
muzialis said:
I had some exposure to classical thermodynamics and I remember the entropy being defined as the integral of dq / T ("differential" heat / absolute temperature).
Heat is not difficult to define here: it's what's left over, so [itex]\mathrm{d}Q=\mathrm{d}E+p\mathrm{d}V[/itex]. However, it's not obvious how one should define temperature here! The definition with reversible Carnot engines shows that, under some assumptions, and if you manipulate a set of systems reversibly, the temperatures and entropy can indeed be found from [itex]\mathrm{d}S=\mathrm{d}Q/T[/itex]. For this definition you have to make sure you really know what reversible means for whatever system you are dealing with.

muzialis said:
Then I came across a different definition, involving the logarithm of the possible number of states, with its related ideas of order, disorder, time arrow, etc.
That's the definition I prefer. Using it, one can show that if you maximize entropy while keeping a definite energy and volume, you get
[tex]\mathrm{d}E=x\mathrm{d}S-y\mathrm{d}V[/tex]
where x and y are just parameters at this point. Physical reasoning identifies y as the pressure, provided the pressure is constant over the whole boundary. If you also identify x as the temperature, you get back to the same definition as in the thermodynamic case.
This derivation requires some statistical mechanics. You can look up the concepts; I could write down the derivation once you understand them a little.

Basically you can take the [itex]S=\ln \Omega[/itex] definition of entropy and if you are willing to identify x as the temperature (you don't have another definition of temperature at this point), then you get back to the thermodynamic definition.

Or you stay with macroscopic quantities only, look for reversible processes, and define entropy and temperature by [itex]T_1/T_2=\mathrm{d}Q_1/\mathrm{d}Q_2[/itex] and [itex]\mathrm{d}S=\mathrm{d}Q/T[/itex] when you observe a heat transfer (under reversible conditions for the set of systems) driven by a Carnot engine that at the same time generates or consumes work.
With this macroscopic picture you cannot say anything about microscopic states. If your microscopic system turns out to have the required reversible dynamics, then you can equate these definitions again.
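As a small numerical illustration of that Carnot bookkeeping (the reservoir temperatures and heat below are made-up example values, not from the thread), here is a sketch showing that reversible heat transfer through a Carnot engine leaves the total reservoir entropy unchanged while extracting work:
[code]
# Reversible Carnot cycle bookkeeping: the exchanged heats satisfy
# Q_cold / Q_hot = T_cold / T_hot, so the total entropy change of the reservoirs is zero.
# All numbers are arbitrary illustrative values.

T_hot, T_cold = 500.0, 300.0   # reservoir temperatures in kelvin (assumed)
Q_hot = 1000.0                 # heat drawn from the hot reservoir in joules (assumed)

Q_cold = Q_hot * T_cold / T_hot      # heat rejected to the cold reservoir
W = Q_hot - Q_cold                   # work delivered by the engine

dS_hot = -Q_hot / T_hot              # entropy change of the hot reservoir
dS_cold = +Q_cold / T_cold           # entropy change of the cold reservoir

print(f"work out: {W:.1f} J")
print(f"total reservoir entropy change: {dS_hot + dS_cold:.2e} J/K")  # ~0 for a reversible cycle
[/code]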

muzialis said:
Indeed, one can imagine a system where work is done, achieving some improvement in order, still in adiabatic conditions.
Strictly speaking, the Carnot definition can't say anything about entropy or temperature if the total process isn't reversible. In that case the best you can do is hope you are dealing with a constant-temperature process, so that you at least know the temperature (it is what it was before). I don't think there is any guarantee about how much the entropy changes. The only thing you can say is that a system left to itself will evolve only to a state of increased total entropy.
 
  • #3
Gerenuk,

thank you for your useful explanation. I am looking up the concepts but would still be very interested in seeing the derivation of the equation you proposed.

I also thought I would present the concrete case that stimulated my curiosity.
It is the classic theory of rubber elasticity, the main traits of which are presented at, for example, http://www.doitpoms.ac.uk/tlplib/stiffness-of-rubber/entropy.php

Rubber stiffness is motivated by entropic considerations (using the log definition), but I am struggling to understand how rubber's stiffness could be justified in adiabatic (hence dS=0) conditions.

Thank you very much again

Muzialis
 
  • #4
Make sure to distinguish between the total entropy change (which is zero for an adiabatic, reversible process) and the change in configurational entropy of the polymer chains (the configurational entropy decreases with deformation).

Just because the first is approximately zero doesn't disprove the model of rubber elasticity. Entropy can also change (or balance out to zero) due to a temperature change in the material. (Question: would adiabatically, reversibly deformed rubber get hotter or colder?)
 
  • #5
Hi there,

and many thanks for your response. The distinction between total and partial entropy confused me even more, but that is usual just before understanding something.

If you stretch rubber it gives heat out. I appreciate that, in adiabatic conditions, this would mean the temperature would increase.

Still, if in adiabatic conditions the total entropy variation is zero, and the conformational entropy is changing, what else is balancing out the entropy? How can this be done by temperature?

thank you very much.

Kindest Regards

Muzialis
 
  • #6
The entropy balances out through the increase in temperature. The configurational entropy of the straightened chains decreases, the vibrational entropy of the bonds increases with increasing temperature, and the total entropy change is approximately zero.

Interestingly, the opposite is predicted in metals; since stretching increases the entropy of primary bonds, the temperature is predicted to decrease during adiabatic reversible stretching.

(But note that no process is truly reversible, and irreversibility will tend to increase the temperature during stretching in both cases due to lost work.)
 
  • #7
Mapes,

many thanks for this.

Basically the thermodynamic definition is concerned with global entropy, while you are talking about partial contributions.

But then, I am still perplexed. In the entropic theory of rubber elasticity, it is said that the restoring force after a sample is stretched derives from the lower (conformational) entropy of this better-organized state, hence creating the tendency to go back to the relaxed configuration (higher entropy).

From what you say, this concept applies to conformational entropy. Why does it not apply to the bond vibrational one?
In the equation dU = T dS + F dL (just the first law of thermodynamics for a piece of rubber stretched by a force F along a length L), the dS is the total entropy, not specifically vibrational or conformational.

But my warmest thanks for all your posts; I am sure I am getting there.

Muzialis
 
  • #8
Because bond energy dominates over entropy when we're talking about primary bonds. The mechanism of elastic recovery in this case is the driving force to find a minimum-energy bond length rather than the driving force to increase conformational entropy. (Does this answer your question?)
 
  • #10
I must be misunderstanding. What's your complete question? :smile:
 
  • #11
Mapes,

in relation to our last conversation, the question is, why does rubber recover from the stretched configuration?

The bond energy is not deemed the cause at all in the link I sent you; rather, the reason given is the conformational entropy.

But as you say, the conformational entropy is counterbalanced by the vibrational one (as one would expect from the fact that the transformation is adiabatic, dS = 0).

So now, if this dS = 0, then the equation in the link, F = dS/dL, indicated as the origin of the force, says that in adiabatic conditions rubber has no stiffness!

I hope it is clearer now, and thanks for your patience.

Muzialis
 
  • #12
Ah, got it. It's two different entropies. For a polymer,

[tex]F\approx -T\,\frac{\partial S_\mathrm{conformational}}{\partial L}[/tex]

For an adiabatic, reversible process,

[tex]dS_\mathrm{total}=0[/tex]
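To make the first relation concrete, here is a minimal sketch assuming the standard Gaussian (freely jointed) chain model, for which the conformational entropy falls quadratically with end-to-end extension; the chain parameters are made-up illustrative values, not taken from the linked page:
[code]
import numpy as np

# Entropic spring sketch for a single Gaussian chain (assumed model):
# S_conf(L) = const - 3 k_B L^2 / (2 N b^2), so F = -T dS_conf/dL = 3 k_B T L / (N b^2).

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
N = 1000             # number of segments (hypothetical chain)
b = 5e-10            # segment length in metres (hypothetical)

def entropic_force(L):
    """Retractive force of one Gaussian chain stretched to end-to-end distance L."""
    return 3.0 * k_B * T * L / (N * b * b)

for L in np.array([10e-9, 50e-9, 100e-9]):   # a few extensions
    print(f"L = {L*1e9:5.1f} nm  ->  F = {entropic_force(L)*1e12:.2f} pN")
[/code]
The force scales linearly with T, which is the signature of an entropy-dominated (rather than bond-energy-dominated) restoring force.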
 
  • #13
Mapes,

that is exactly the point! Why, for "a polymer", does the entropy equal the conformational entropy? Is there not a level of arbitrariness there? In the laws of thermodynamics no specific entropy is mentioned!

Thanks again

Muzialis
 
  • #14
Well, there are always vibrational, configurational, conformational, electronic, nuclear, etc., components of entropy. But the conformational entropy is what gets tweaked most when you stretch an elastomer. That's why the other terms are assumed to be negligible in that first equation.
 
  • #15
Mapes,

you have to forgive this combination of lack of understanding and stubbornness...

I understand there might be various entropic components, but if, as you say, every contribution other than the conformational one is negligible for an elastomer, then the entropy balance cannot be zero (since it would be those supposedly negligible contributions that have to keep the total entropy variation at zero in an adiabatic process).

Are we not facing a contradiction here?

Basically I say: adiabatic, hence dQ = 0, hence dS = 0. If every contribution to the entropy apart from the conformational one is negligible, then dS cannot be zero.

Thank you for this most interesting discussion

Muzialis
 
  • #16
I do see what you're saying. The problem may lie in treating a polymer thermodynamically as a homogeneous system. When considering the stretching of individual chains, this assumption breaks down, and we have to assume that the only significant consequence of stretching is a decrease in conformational entropy. Sorry, I haven't seen a better explanation.
 
  • #17
Mapes,

I see what you say.

Let me thank you very much for your valuable input.

Have a good evening

Marco
 
  • #18
muzialis said:
I am looking up the concepts but would still be very interested in seeing the derivation of the equation you proposed.
I'll do the derivation for energy and volume. It is similar if you have a string with a length (instead of a volume).
First you consider a large number of identical systems, since all your statements about one system are really statistical statements about an ensemble of many copies of it. If you look at all these copies, then [itex]n_i[/itex] of them will be in state [itex]i[/itex] with energy [itex]E_i[/itex] and volume [itex]V_i[/itex]. Let's take an example. If you label the possible states with indices [itex]i\in \{1,\dots,5\}[/itex], the ensemble might look like 112322355333343, or ordered 112223333333455 (I sorted the copies according to their state index). Now, if all other sequences of 1,2,3,4,5 are also possible (say 111111111113335), how many sequences correspond to exactly the same set of occupation numbers as in my first example? Combinatorics tells us that (as order doesn't play a role) it is
[itex]\frac{15!}{2!3!7!1!2!}[/itex] or in general
[tex]
\Omega=\frac{n!}{\prod n_i!}
[/tex]
Of course, looking at the problem this way with the labeling, a distribution like 112223333333455 is much more likely than 111111111111111, since the latter corresponds to a single arrangement whereas the former is the same as a reordered 131233223334535 and all its other permutations. Defining
[tex]
S=\ln\Omega
[/tex]
this is basically the second law of thermodynamics, saying that the entropy of an isolated system always increases (I skip the Boltzmann constant for brevity). Since there are so many particles involved in a gas, one can show that the "more likely" states are actually incredibly more likely than all others. That's why the second law isn't observed to be violated even though in principle it could be.
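As a quick check of these counts, a small Python sketch (just an illustration using the occupation numbers from the example above) computing [itex]\Omega[/itex] and [itex]S=\ln\Omega[/itex]:
[code]
from math import factorial, log

def multiplicity(counts):
    """Number of distinct orderings of an ensemble with the given occupation numbers n_i."""
    n = sum(counts)
    omega = factorial(n)
    for n_i in counts:
        omega //= factorial(n_i)
    return omega

# Example from the post: 15 copies with occupations (2, 3, 7, 1, 2) over states 1..5,
# compared with all 15 copies sitting in a single state (e.g. 111111111111111).
for counts in ([2, 3, 7, 1, 2], [15, 0, 0, 0, 0]):
    omega = multiplicity(counts)
    print(counts, " Omega =", omega, " S = ln(Omega) =", round(log(omega), 3))
[/code]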

--------------- TBC ---------------
 
  • #19
We can introduce the probabilities
[tex]
p_i=\frac{n_i}{n}
[/tex]
that a given copy among those many is in state i. Assuming that n is a very large number (and with some further large-number assumptions), we can use Stirling's approximation of the factorial to derive
[tex]
S=-\sum p_i\ln p_i
[/tex]
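A quick numerical check (with the earlier occupation numbers scaled up so that Stirling's approximation becomes good) that [itex]\ln\Omega \approx -n\sum_i p_i\ln p_i[/itex], i.e. that the entropy per copy is [itex]-\sum_i p_i\ln p_i[/itex]:
[code]
from math import lgamma, log

# Scale up the example occupations so that Stirling's approximation is accurate.
scale = 1000
counts = [2 * scale, 3 * scale, 7 * scale, 1 * scale, 2 * scale]
n = sum(counts)

# Exact per-copy entropy from the multinomial count: (1/n) ln( n! / prod n_i! ).
# ln(k!) is computed as lgamma(k + 1) to avoid building huge integers.
ln_omega = lgamma(n + 1) - sum(lgamma(n_i + 1) for n_i in counts)
exact_per_copy = ln_omega / n

# Probability form obtained via Stirling's approximation: -sum p_i ln p_i
probs = [n_i / n for n_i in counts]
stirling_per_copy = -sum(p * log(p) for p in probs)

print(f"exact (per copy)    : {exact_per_copy:.6f}")
print(f"Stirling (per copy) : {stirling_per_copy:.6f}")   # nearly equal for large n
[/code]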

And remember we argued that the entropy should attain a maximum
[tex]
S\to\text{max}
[/tex]
Additionally, we want the average energy and volume of all these copies to be the values of energy and volume that we measure on our single system. This corresponds to the equations
[tex]
E=\sum_i E_i p_i
[/tex]
[tex]
V=\sum_i V_i p_i
[/tex]
and of course
[tex]
1=\sum_i p_i
[/tex]
 
  • #20
The task of maximizing the entropy under the given constraints can be solved with Lagrange multipliers to find
[tex]
p_i=\frac{e^{-\beta E_i-\alpha V_i}}{Z}
[/tex]
[tex]
Z=\sum_i e^{-\beta E_i-\alpha V_i}
[/tex]
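Purely as a sanity check, one can also do this maximization numerically and confirm that the constrained maximizer has this exponential form, i.e. that [itex]\ln p_i[/itex] is an affine function of [itex](E_i, V_i)[/itex]; the state energies, volumes, and target averages below are arbitrary toy values:
[code]
import numpy as np
from scipy.optimize import minimize

# Maximize S = -sum p_i ln p_i subject to sum p_i = 1, sum E_i p_i = E, sum V_i p_i = V,
# then check that ln p_i = a + b E_i + c V_i (a Gibbs-like exponential form).
rng = np.random.default_rng(0)
E_i = rng.uniform(0.0, 2.0, size=8)      # hypothetical state energies
V_i = rng.uniform(0.5, 1.5, size=8)      # hypothetical state volumes

w = np.exp(-E_i); w /= w.sum()           # some attainable target averages
E_target, V_target = np.dot(w, E_i), np.dot(w, V_i)

neg_entropy = lambda p: np.sum(p * np.log(p))
constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
    {"type": "eq", "fun": lambda p: np.dot(E_i, p) - E_target},
    {"type": "eq", "fun": lambda p: np.dot(V_i, p) - V_target},
]
res = minimize(neg_entropy, np.full(8, 1.0 / 8), method="SLSQP",
               constraints=constraints, bounds=[(1e-9, 1.0)] * 8)
p = res.x

# Least-squares fit of ln p_i against (1, E_i, V_i); a tiny residual confirms the form.
A = np.column_stack([np.ones(8), E_i, V_i])
coef, residual, *_ = np.linalg.lstsq(A, np.log(p), rcond=None)
print("ln p_i = a + b E_i + c V_i with (a, b, c) =", np.round(coef, 3))
print("fit residual:", residual)
[/code]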

Plugging [itex]p_i[/itex] back in into the definitions of [itex]S[/itex] and doing a bit of maths you get
[tex]
\mathrm{d}S=\beta \mathrm{d}E+\alpha \mathrm{d}V
[/tex]
which you can rearrange to
[tex]
\mathrm{d}E=\frac{1}{\beta}\mathrm{d}S-\frac{\alpha}{\beta}\mathrm{d}V
[/tex]
Now we define these global parameters, which so far have no physical meaning, as temperature [itex]T=1/\beta[/itex] and pressure [itex]p=\alpha/\beta[/itex]. The identification with pressure actually makes sense, since we know from contexts where pressure already has a meaning that [itex]\mathrm{d}E=-p\mathrm{d}V[/itex] (for adiabatic changes).

This way you arrive at your identity
[tex]
\mathrm{d}E=T\mathrm{d}S-p\mathrm{d}V
[/tex]

The advantage over the direct thermodynamic definition is that you still have the underlying framework, with all its equations for the possible states of the copies [itex]E_i[/itex], [itex]V_i[/itex] and the probabilities [itex]p_i[/itex], which you can calculate with the partition function [itex]Z[/itex], provided you know these state energies and volumes. So you can calculate in one direction, but not the other way round. You still know that
[tex]
p_i=\frac{e^{-(E_i+pV_i)/T}}{Z}
[/tex]
[tex]
Z=\sum_i e^{-(E_i+pV_i)/T}
[/tex]
[tex]
S=\frac{E+pV}{T}+\ln Z
[/tex]
(Note: maybe I got some signs wrong somewhere :smile: but the result is correct)
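A toy numerical check of these final formulas (arbitrary made-up state energies, volumes, temperature, and pressure) showing that [itex]S=-\sum_i p_i\ln p_i[/itex] agrees with [itex](E+pV)/T+\ln Z[/itex]:
[code]
import numpy as np

# Pick arbitrary state energies/volumes and (T, p), build p_i = exp(-(E_i + p V_i)/T)/Z,
# and verify that -sum p_i ln p_i equals (E + p V)/T + ln Z.
rng = np.random.default_rng(1)
E_i = rng.uniform(0.0, 2.0, size=10)
V_i = rng.uniform(0.5, 1.5, size=10)
T, p = 0.7, 0.3

weights = np.exp(-(E_i + p * V_i) / T)
Z = weights.sum()
p_i = weights / Z

E = np.dot(p_i, E_i)                      # average energy
V = np.dot(p_i, V_i)                      # average volume
S_direct = -np.sum(p_i * np.log(p_i))
S_formula = (E + p * V) / T + np.log(Z)

print(f"S from -sum p ln p  : {S_direct:.6f}")
print(f"S from (E+pV)/T+lnZ : {S_formula:.6f}")   # the two should match
[/code]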
 
  • #21
Just a small side note: entropy in classical physics cannot be defined as the "log of the number of possible states". The reason is that the number of microstates is infinitely large, so the entropy would be ill-defined. But since you only ever encounter differences in entropy between two systems, rather than "absolute" entropy, you can still get around this.

Only in QM is the number of states accessible to a system finite.
 
  • #22
Gerenuk said:
Strictly speaking, the Carnot definition can't say anything about entropy or temperature if the total process isn't reversible. In that case the best you can do is hope you are dealing with a constant-temperature process, so that you at least know the temperature (it is what it was before). I don't think there is any guarantee about how much the entropy changes. The only thing you can say is that a system left to itself will evolve only to a state of increased total entropy.

That's not true. The actual process you are looking at does not have to be reversible, nor do entropy and temperature have to be fixed (or even be defined) during the process. The decisive point is that entropy and temperature are state functions so that it is sufficient to know the (equilibrium) starting and end point of your process to calculate the change of entropy.
 
  • #23
DrDu said:
That's not true. The actual process you are looking at does not have to be reversible, nor do entropy and temperature have to be fixed (or even be defined) during the process.
I was talking about the Carnot definition only. Prompted by another thread here, I went through that Carnot derivation with reversible engines in detail. I found that temperature is only defined for a set of systems whose total entropy is constant. Moreover, you need to identify constant-temperature processes, and it's hard to tell what happens to the entropy in those constant-temperature processes.
 

1. What is entropy?

Entropy is a scientific concept used to describe the amount of disorder or randomness in a system. In other words, it is a measure of the energy in a system that is no longer available to do useful work.

2. How is entropy measured?

Entropy is typically measured in units of joules per kelvin (J/K) in the International System of Units (SI). It can also be measured in other units depending on the specific system being studied.

3. What is the relationship between entropy and thermodynamics?

Entropy is a fundamental concept in thermodynamics, which is the study of energy and its transformations. The second law of thermodynamics states that the entropy of an isolated system never decreases over time, meaning that the amount of usable energy tends to decrease.

4. How is entropy related to information theory?

In information theory, entropy is used to measure the amount of uncertainty or randomness in a message. This concept was first introduced by Claude Shannon in 1948 and has since become an important tool in the fields of computer science, mathematics, and physics.
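For example, a short sketch (illustrative only) of Shannon's formula [itex]H=-\sum_i p_i\log_2 p_i[/itex] applied to the empirical character frequencies of a message:
[code]
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Shannon entropy in bits per symbol, from the empirical character frequencies."""
    counts = Counter(message)
    total = len(message)
    # H = sum p * log2(1/p), with p = c/total for each distinct character
    return sum((c / total) * log2(total / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))      # 0.0 bits: no uncertainty at all
print(shannon_entropy("abababab"))      # 1.0 bit per symbol
print(shannon_entropy("hello world"))   # about 2.85 bits per symbol
[/code]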

5. Can entropy be reversed?

According to the second law of thermodynamics, the total entropy of an isolated system never decreases over time. While it is possible to decrease the entropy in one part of a system, the overall entropy of the system and its surroundings will still increase. Therefore, entropy cannot be globally reversed in an isolated system.
