What is Entropy

On the Meaning of Entropy


Introduction

The intent of this article is to clarify the meaning of entropy and, while doing so, point out how a deeper analysis of entropy in classical statistical mechanics gives us some hints at an underlying quantum description of physical systems.

Thermodynamics and Entropy

The term entropy was coined by the 19th-century German physicist Rudolf Clausius in his effort to put the French engineer Sadi Carnot's scientific study of steam engines on a firmer footing.  Another key ingredient was James Joule's measurement of the mechanical equivalent of heat.  A demonstration of this is often given in high school and college physics labs, where a falling weight stirs an insulated container of water and the rise in temperature is compared to the amount of work done by the falling mass.  With these results Carnot, to some extent, and later Clausius were able to quantify the extent to which energetic processes are recoverable and how much heat energy can be recovered as useful work.

Clausius’ formula below expresses how this quantity, entropy, changes when a body absorbs or emits heat energy.

[tex]\delta S = \frac{\delta Q}{T}[/tex]

Somehow within the internal workings of material systems entropy relates heat energy and temperature.

How Entropy Is Used

In the case of a heat engine we can imagine taking a certain amount of energy out of a body maintained at a constant temperature ##T_1##.  Since we're working at a constant temperature, the Clausius formula works for a finite exchange.  (We don't have to integrate.)

[tex]\Delta S = \frac{\Delta Q_1}{T_1}[/tex]

If we do some amount of work ##W## with some of that energy, the work itself carries no entropy, so it contributes nothing to the entropy balance.  The remaining energy ##Q_2## must be dumped at a lower temperature ##T_2##, raising the entropy there by at least as much as we lowered the entropy of the heat source.

[tex]\frac{\Delta Q_2}{T_2}=\frac{\Delta Q_1 -W}{T_2} \ge \frac{\Delta Q_1}{T_1}[/tex]

You can now solve this inequality to see how much work you can draw from a given amount of heat as a function of the ratio of absolute temperatures.  The crucial aspect allowing the conversion of heat to useful work is the tendency of heat energy to move from higher temperature to lower.
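Solving the inequality above for ##W## gives the familiar Carnot bound on the extractable work:

[tex]\frac{\Delta Q_1 - W}{T_2} \ge \frac{\Delta Q_1}{T_1} \quad\to\quad W \le \Delta Q_1\left(1 - \frac{T_2}{T_1}\right)[/tex]

For example (with numbers chosen purely for illustration), drawing ##\Delta Q_1 = 1000\,\text{J}## from a source at ##T_1 = 600\,\text{K}## while dumping waste heat at ##T_2 = 300\,\text{K}## yields at most ##500\,\text{J}## of useful work.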

Entropy as a Measure of Ambiguity

Entropy has often been described as a measure of disorder.  However, in my “disorderly” house, I remember where most everything is.  While it is a mess, I can quickly identify which stack something is in, and guess fairly accurately how deep it is in the stack based on how long it’s been since I last saw it.  However, when I had a housekeeper coming in to clean twice a week, I was hard pressed to find anything after she had put my house “in order”.  While I would often, tongue-in-cheek, list entropy reversal in the ‘for’ line of my checks to my housekeeper, I was in point of fact misrepresenting the meaning of entropy.

So let us consider a much simpler example than the disarray of my home.  Consider the situation where I have six blocks, three of which are red and three of which are green.  Now consider two arrangements:

R1 R2 R3 G4 G5 G6

Arrangement 1

 

G5 R2 R1 G6 R3 G4

Arrangement 2

One might use these as examples of a system with lower entropy (Arrangement 1) vs. higher entropy (Arrangement 2).  However, technically both situations have zero entropy because there is no ambiguity in the arrangements (at the level at which we are considering them).  To see that the ‘order’ vs ‘disorder’ quality is relative, consider now a distinct way of labeling the blocks.

Suppose I label each block with one of the following sequences of 10 X’s and O’s.  You’ll notice that no matter how you arrange the blocks, there will be one column in which all of the X’s occur on one side and all of the O’s occur on the other.

1. XXXXXXXXXX
2. XXXXOOOOOO
3. XOOOXXXOOO
4. OXOOXOOXXO
5. OOXOOXOXOX
6. OOOXOOXOXX

For example, with Arrangement 2 above (blocks in the order 5, 2, 1, 6, 3, 4) the third column reads XXXOOO:

5. OOXOOXOXOX
2. XXXXOOOOOO
1. XXXXXXXXXX
6. OOOXOOXOXX
3. XOOOXXXOOO
4. OXOOXOOXXO

Such ‘ordering’ is relative to our labeling of the cases.
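Since it may not be obvious that every one of the ##6! = 720## possible arrangements produces such a column, here is a minimal brute-force check in Python (a sketch of my own, using the six labels listed above):

```python
from itertools import permutations

# Block labels from the article, indexed 1..6 (strings of 10 X's and O's).
labels = {
    1: "XXXXXXXXXX",
    2: "XXXXOOOOOO",
    3: "XOOOXXXOOO",
    4: "OXOOXOOXXO",
    5: "OOXOOXOXOX",
    6: "OOOXOOXOXX",
}

def has_split_column(order):
    """True if some column reads XXXOOO or OOOXXX when the blocks are in this order."""
    for col in range(10):
        column = "".join(labels[block][col] for block in order)
        if column in ("XXXOOO", "OOOXXX"):
            return True
    return False

# Check all 6! = 720 orderings of the six blocks.
assert all(has_split_column(order) for order in permutations(labels))
print("Every arrangement has a column with all X's on one side and all O's on the other.")
```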

So why does it seem that entropy is a measure of disorder?  The answer lies in how we typically describe systems.  So-called orderly systems can be specified exactly in few words, while disorderly systems require a much longer description to be specified just as exactly.  It is the exactness that counts, and we begin to understand that what entropy quantifies is actually ambiguity.

Returning to our red and green blocks, suppose I tell you only that the three left blocks are red and the three right blocks are green.  I don’t specify which red or green block is in which position.

R? R? R? G? G? G?

Figure 3

Then there are [itex]3!=6[/itex] ways the red blocks may be rearranged without changing my specification and likewise [itex]3!=6[/itex] ways the green blocks may be rearranged, giving a total of [itex]6^2=36[/itex] distinct arrangements, any one of which may be the case given what I specified.  We can assign an entropy to this by taking the logarithm: [itex]S=k_B \ln(36)[/itex].  Which base of logarithm we use is a question of the constant of proportionality, as the change-of-base formula shows.  We’ll thus leave an arbitrary constant in the entropy formula.  This is Boltzmann’s formula and the constant is named for him.  Its value, while arbitrary in this discussion, will be fixed when we decide how to incorporate this statistical mechanical quantity into the actual thermodynamic usage.

As to why we take the logarithm, that is so that, in combining independent systems, the entropy will add.  Recall we have 6 possibilities for the red and 6 possibilities for the green.   These we can treat as subsystems and the total entropy is the sum of these partial entropies:

[tex]S = S_R + S_G = k_B \ln(6) + k_B \ln(6) = k_B \ln(36)[/tex]
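As a quick sanity check on the counting and on the additivity of the logarithm, here is a small Python sketch (an illustration of mine, not part of the formal argument) that enumerates the 36 consistent arrangements and verifies that ##\ln 6 + \ln 6 = \ln 36##:

```python
import math
from itertools import permutations

k_B = 1.0  # the arbitrary constant of proportionality discussed above

# All arrangements consistent with "three reds on the left, three greens on the right".
reds = ("R1", "R2", "R3")
greens = ("G4", "G5", "G6")
consistent = {left + right
              for left in permutations(reds)
              for right in permutations(greens)}

W = len(consistent)                         # 3! * 3! = 36 microstates
S_total = k_B * math.log(W)                 # Boltzmann entropy of the whole description
S_red = k_B * math.log(math.factorial(3))   # entropy of the red subsystem
S_green = k_B * math.log(math.factorial(3)) # entropy of the green subsystem

print(W)                                       # 36
print(math.isclose(S_total, S_red + S_green))  # True
```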

Now one might be rightly worried about the fact that entropy is not actually a property of a physical system in a physical state.  It is rather a property of classes of systems or of system descriptions.  This is worrisome because it seems one could say, “Well, I know this big tank of gas is in some physical state, so let’s call that state ‘Bunny’, and then my assertion that the gas is in the ‘Bunny’ state is an exact specification and I’ve just reduced its entropy to zero.”

This is one of a couple of ways that the analysis of entropy presages the advent of quantum theory.  It is not enough for us to simply label a system’s state and claim we know it.  To make an assertion about a physical system we are implicitly asserting that an observation has been made, and such an observation is a physical interaction with the system.  The entropy of our statement then has a direct connection to the physical history of the actual physical system.  Also, to sustain the truth of our assertion about the physical system we implicitly assert that the dynamics of that system has been constrained in some specific way.

Entropy of Continuous Systems

The Boltzmann entropy was extended by Gibbs, who related it to the probability distribution over system states rather than just counts of equally likely substates.  For discrete systems we have:

[tex]S= -k_B \sum_{n} p_n \ln(p_n)[/tex]

where we are summing with our index ##n## over the set of all possible states, each with probability ##p_n##.  Note that for the zero-probability cases the limit [itex]\lim_{x\to  0} x \ln(x) = 0[/itex] is asserted.  This is fine for our discrete cases.  When it comes to continuous systems, however, we must deal with a few issues.  Certainly our summation will be replaced by integration over a continuous space or manifold.  The problem arises when it comes to units and scale.
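Here is a minimal Python sketch of this discrete formula, with the zero-probability convention handled explicitly (the example distributions are arbitrary choices of mine):

```python
import math

def gibbs_entropy(probs, k_B=1.0):
    """Discrete Gibbs entropy S = -k_B * sum(p * ln p), treating p = 0 terms as zero."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over the 36 block arrangements recovers the Boltzmann value:
uniform = [1 / 36] * 36
print(math.isclose(gibbs_entropy(uniform), math.log(36)))  # True

# A sharper (less ambiguous) distribution has lower entropy:
peaked = [0.9] + [0.1 / 35] * 35
print(gibbs_entropy(peaked) < gibbs_entropy(uniform))      # True
```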

For a classical point particle in the presence of known forces, all future observations can be determined by an initial knowledge of the particle’s mass, position, and velocity.  We could equivalently give its mass, position, and momentum (velocity times mass), or its mass, mass moment (position times mass), and velocity, or some increasingly complex function of these quantities.  But in the end we can express its state (for a fixed mass) as a point in some six-dimensional space.  If we restrict the particle’s motion to two or one dimension then we can likewise reduce the state space to a four- or two-dimensional space respectively.

It turns out that as we consider regions of phase space (in the 1 position plus 1 momentum case) it is the unit of action (momentum·distance = energy·time) which defines the measure of a region of state space, so that entropy can be defined independently of the shape of that region and purely in terms of the action area.  In 2-dimensional phase space…

[Figure: Phase Space, showing two regions of equal area]

the two displayed areas being equal implies equal entropies.  When comparing different systems, it is this common unit of action that sets the scale for entropy.  For more general systems we describe them in terms of probability distributions over the set of states, i.e. over phase space.  To then calculate the entropy we perform the area integral

[tex] S = -k_B\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \rho\ln(\rho)dx dp [/tex]

where  ##\rho(x,p)## is the probability density over phase space.
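To make the continuous formula concrete, here is a small numerical sketch (my own illustration, using an arbitrarily chosen Gaussian density ##\rho(x,p)## and setting ##k_B = 1##).  It evaluates the integral on a grid and compares the result with the known closed-form value for a Gaussian.  Note that the numerical value shifts if you rescale the units of ##x## and ##p##, which is precisely the unit-of-action issue raised above.

```python
import numpy as np

# Illustrative density: independent Gaussians in x and p (widths chosen arbitrarily).
sigma_x, sigma_p = 1.0, 2.0
x = np.linspace(-10 * sigma_x, 10 * sigma_x, 1001)
p = np.linspace(-10 * sigma_p, 10 * sigma_p, 1001)
X, P = np.meshgrid(x, p)
rho = (np.exp(-X**2 / (2 * sigma_x**2) - P**2 / (2 * sigma_p**2))
       / (2 * np.pi * sigma_x * sigma_p))

# S = -k_B * integral of rho*ln(rho) dx dp, with k_B = 1 and 0*ln(0) treated as 0.
integrand = np.where(rho > 0, rho * np.log(rho), 0.0)
dx, dp = x[1] - x[0], p[1] - p[0]
S_numeric = -np.sum(integrand) * dx * dp

S_exact = np.log(2 * np.pi * np.e * sigma_x * sigma_p)  # closed-form Gaussian result
print(S_numeric, S_exact)  # the two agree closely
```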

How Ambiguity Relates to Thermodynamic Entropy

Above, two seemingly very different definitions of entropy have been given.  One is a number associated with the evolution of thermodynamic systems and the other is related to information, a quantifier of ambiguity.  To reconcile the two we need to understand how we describe and prescribe physical systems.

The state of a given classical physical system is completely determined by numerically specifying the configurations of all its components and also their rates of change.  Then, given a deterministic dynamic, we can predict any past or future state exactly (supposing no unaccounted-for external influence perturbs the dynamics).  Even if there is some ambiguity, and hence entropy, in our knowledge of the system state (that is to say, we at best know a probability distribution over the set of all possible states), we still, given deterministic dynamics, will not lose any information.  The informational entropy of the system will not change over time.  But no system is ever completely immune from unaccountable external influences.  For example, we have recently detected gravitational waves, the ripples of undulating space-time itself spreading outward from a distant cataclysmic merging of two black holes.  The very fact that our hypothesized physical systems exist in a larger universe means we can never completely remove random influences on the dynamics.  So a system evolving over time will of necessity increase in ambiguity unless and until that ambiguity, and hence its entropy, reaches some maximal value.

So we go back to one of the most basic thermodynamic thought experiments.  Two thermal systems at different temperatures are allowed to exchange energy with each other but with no net exchange of energy to the outside environment.  The environment can only “jostle the elbows” of the interactions; we make sure that it cannot draw from or add to the net energy.

So we have System 1, with all of its available energy maximally randomized so that its entropy is maximized to ##S_1##, and it is at a given temperature ##T_1##.  Likewise System 2 has its entropy maximized to ##S_2## and its temperature is ##T_2##.  When we consider the composite system, which classically will have a state space that is the Cartesian product of those of the two component systems, we will see that their entropies add to give the entropy of the composite system, ##S=S_1 + S_2##.

Now we consider a small amount of dynamic evolution of the composite system wherein the total entropy must necessarily seek its maximal value.

[tex] \delta S = \delta S_1 + \delta S_2 = \frac{1}{T_1} \delta Q_1 + \frac{1}{T_2} \delta Q_2[/tex]

But by our prescription any change in energy of one system will be balanced by an opposite change in the energy of the other, ##\delta Q_2 = -\delta Q_1##.  We thus see that energy flowing in one direction (in a random fashion) will increase the entropy of the composite system.  Assume for the moment that ##T_2 > T_1##.  Then, assuming the composite entropy increases, we can derive the following:

[tex]\delta S > 0\quad \to\quad \left( \frac{1}{T_1} - \frac{1}{T_2}\right)\delta Q_1>0 \quad \to \quad \delta Q_1 > 0[/tex]

since [tex] T_2>T_1 \to \frac{1}{T_1} > \frac{1}{T_2} \to \left(\frac{1}{T_1}-\frac{1}{T_2}\right)>0[/tex]

So if energy is allowed to jump randomly between the two systems, energy flowing from the warmer to the cooler will allow the total entropy to increase.  This is a differential relationship, momentarily treating the temperatures as constant.  But in point of fact, for most systems temperature increases with increasing energy, and thus the cooler system will warm and the warmer system will cool until both reach a common temperature, which we can then describe as the temperature of the composite system.  The entropy of the composite system has been maximized and it has a definite temperature.
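As a quick numerical illustration (with arbitrarily chosen temperatures), suppose ##T_1 = 300\,\text{K}##, ##T_2 = 400\,\text{K}##, and a small amount of energy ##\delta Q_1 = 1\,\text{J}## passes from the warmer System 2 into the cooler System 1:

[tex]\delta S = \delta Q_1\left(\frac{1}{T_1} - \frac{1}{T_2}\right) = 1\,\text{J}\times\left(\frac{1}{300\,\text{K}} - \frac{1}{400\,\text{K}}\right) = \frac{1}{1200}\,\frac{\text{J}}{\text{K}} > 0[/tex]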

Maxwell’s Demon and Quantum Nature

There should still be something nagging at the mind of the astute reader here.  Take your physical system with its temperature and positive entropy and assume for a moment you can observe it completely.  You then remove any ambiguity in its state and instantly, without physically affecting the system, you’ve reduced its entropy to zero.  Or, less dramatically, assume you measure one somewhat ambiguous property of that system; you will thereby have reduced its entropy by whatever amount that knowledge reduced its statistical ambiguity.  How can we change a system’s entropy without physically affecting it?

This issue, in another form, was best dramatized by James Clerk Maxwell’s infernal thought experiment.  Maxwell imagined two tanks of gas at equilibrium with a valve between them controlled by a demon who could observe the states of gas atoms near the valve.  The demon, when he saw a faster-than-average atom approaching from one side (and none from the other), would open the valve just long enough to let the atom through.  Likewise, when a slower-than-average atom approached the valve from the other side, he would open it just long enough to let that one through.  This demon, using his knowledge of the states of the nearby particles, could patiently, without changing the energies of any individual atoms or expending any energy himself, seemingly violate the 2nd Law of Thermodynamics.  The demon with pure knowledge is reversing the thermodynamically irreversible.  There are many variations but the basic premise is this: knowledge can be utilized to effect changes of thermodynamic entropy.

There is a profound resolution to this seeming paradox.  There is an assumption that must be made in order to invoke Maxwell’s demon or any similar observation-based entropy change.  That assumption was stated above as “You can observe the physical system without affecting it.”  In point of fact, acts of observation are changes in the physical state of the observer (or measuring device) caused by the specific state of the system.  The observer must causally interact with that which he observes, and that interaction is always two-way.  What’s more, to amplify (and thereby copy arbitrarily) the information observed, the measuring device must have a fundamentally thermodynamic component.

In short, to physically observe a system closely enough to decrease its entropy by a certain amount, the observing mechanism is required to increase entropy elsewhere by more.  You see this with the heat sinks on amplifiers and the cryogenic systems for our more sensitive detectors.  Most importantly, you see this as a fundamental aspect of the quantum description of nature.

Comments

Chestermiller says: I think it would be helpful to expand the section on How Entropy is Used. Its practical uses are much more significant than in understanding heat and work. Its major use is in chemical thermodynamics to allow us to quantify interphase chemical equilibrium of multicomponent systems (distillation, absorption, adsorption, crystallization, liquid-liquid extraction, etc.) and also in quantifying the equilibrium constant for chemical reactions.
