Entropy: Definition, Misconceptions & Increase in Closed System

  • Context: Undergrad 
  • Thread starter: Saado
  • Tags: Definition, Entropy

Discussion Overview

The discussion centers around the concept of entropy, addressing its definitions, common misconceptions, and the implications of entropy's behavior in closed systems. Participants explore theoretical aspects, practical examples, and the relationship between entropy and energy availability.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants express confusion over the use of "disorder" as a definition of entropy, questioning its subjectivity and relevance.
  • One participant clarifies that entropy can be understood as a measure of statistical dispersion, with a formal definition involving Boltzmann's constant and the number of configurations of a system.
  • It is noted that while entropy in closed systems tends to increase or remain constant, this is a statistical tendency rather than a strict law of motion.
  • A participant raises the idea that in small systems, it is possible to observe decreases in entropy, although this is highly unlikely in larger systems.
  • Another participant suggests that there are multiple definitions of entropy that apply in different contexts, emphasizing the importance of the second law of thermodynamics across these definitions.
  • One participant discusses the challenges of finding true closed systems in real-world scenarios, using the example of a cooling kettle affected by external factors.
  • There is a proposal that entropy can be viewed as a form of equilibrium, with analogies drawn to physical processes like water settling or the breakdown of materials over time.
  • Concerns are raised about the concept of "available energy" in relation to entropy, particularly regarding how energy disperses and the implications for closed systems.

Areas of Agreement / Disagreement

Participants express a range of views on the definitions and implications of entropy, with no clear consensus reached on the best way to understand the concept or the nature of closed systems.

Contextual Notes

Some definitions of entropy may depend on specific contexts, and there are unresolved questions about the practical existence of closed systems outside of experimental conditions. Additionally, the relationship between entropy and energy availability remains complex and not fully clarified.

Saado
A lot of the less maths-y definitions of entropy talk about disorder and how disordered a system is. I'm given to understand that entropy is a measure of energy over temperature. Could someone clear up these misconceptions? I don't understand why 'disorder' is used. Isn't that subjective?

Second question. I don't understand why entropy has to increase or stay constant in a closed system over time. Surely, say, in a room where air molecules are bouncing around, they would at some point move to a more 'ordered' state?
 
So, there are two definitions of entropy.
The first (and oldest) one is based on heat and temperature.
The second, more modern definition is based on "disorder".

"disorder" is a confusing term.
To put it a touch more precisely, entropy is a measure of statistical dispersion, or how spread out the system is, among all its possible configurations.

The formal definition of the entropy S of a system is:
S = k_B log(W)
where k_{B} is Boltzmann's constant, and W is the number of "ways" that all the states of the atoms that make the system up can be arranged so as to give the same overall state of the object (total energy, particle number, volume, etc). Boltzmann's constant is there to relate the modern definition of entropy to the old definition of entropy, so everything works out right. There is a little bit of arbitrariness in what counts as a "way", but whether or not we include, say, nuclear spin into the entropy, only shifts it by a constant amount, and doesn't affect most calculations.
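As a toy illustration of the counting definition above (my own sketch, not from the thread): take N particles in a box and define the macrostate by how many sit in the left half. The number of arrangements W(n) is the binomial coefficient C(N, n), and the even split dwarfs everything else:

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(n_ways: int) -> float:
    """S = k_B * ln(W) for a macrostate realized by W microstates."""
    return K_B * log(n_ways)

# Macrostate: "n of the N particles are in the left half of the box".
# W(n) = C(N, n) counts the arrangements producing that macrostate.
N = 100
ways = {n: comb(N, n) for n in (0, 25, 50)}

for n, w in ways.items():
    print(f"n={n}: W={w}, S={boltzmann_entropy(w):.3e} J/K")
```

The n = 50 macrostate has roughly 10^29 as many microstates as n = 25, and C(100, 0) = 1 (entropy exactly zero), which is why "spread out among configurations" and "high entropy" go together.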

Technically, there is no dynamical reason why the entropy has to increase or stay constant in a closed system over time. It is a fact of statistics, rather than something written into the laws of motion that it does.

The only reason the entropy of large-scale closed systems seems to inexorably increase or stay constant is that this is overwhelmingly more likely than every other possibility.

For a small chamber with an ultra-high vacuum of maybe a dozen atoms, you could wait long enough and, at some point, see all of those atoms on one side of the container and none on the other. It may take a while, but it almost certainly will happen.

If you instead take a liter size vessel with air at room temperature and atmospheric pressure, you're waiting for the time when roughly a hundred billion trillion atoms will happen to be on one side of the bottle and not the other. Although there is a nonzero probability, you would have to wait an unbelievably long time to see such an occurrence (unbelievably long, even compared to the age of the Universe).
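To put numbers on this (a quick sketch of my own): if each molecule independently has a 1/2 chance of being in the left half, the probability that all N are there at once is (1/2)^N, so working in log10 keeps the figures manageable:

```python
from math import log10

def log10_prob_all_one_side(n_particles: float) -> float:
    """log10 of the probability that every one of N independent
    particles is in the left half: p = (1/2)^N,
    so log10(p) = -N * log10(2)."""
    return -n_particles * log10(2)

# A dozen atoms: p = (1/2)^12 ≈ 2.4e-4 -- plausible to observe eventually.
print(log10_prob_all_one_side(12))

# ~1e22 molecules in a liter of air: log10(p) ≈ -3e21, hopeless.
print(log10_prob_all_one_side(1e22))
```

That second exponent is why "wait for the air to bunch up on one side" is not a practical objection to the second law.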

On the less absurd scale of things, you would probably be able to see minute fluctuations back and forth in the entropy of a closed system. The entropy of a closed system is not truly nondecreasing, but a nondecreasing entropy is overwhelmingly the most likely thing to see.
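These fluctuations are easy to simulate. Below is a minimal sketch of my own, using the classic Ehrenfest urn model as a stand-in for molecules hopping between the two halves of a box: with a handful of particles the "all on one side" state recurs constantly, while a few dozen particles already make it effectively unobservable.

```python
import random

def count_extreme_states(n_particles: int, steps: int, seed: int = 0) -> int:
    """Ehrenfest urn model: each step, one randomly chosen particle
    hops to the other side. Count how often the 'all on one side'
    macrostate occurs."""
    rng = random.Random(seed)
    left = n_particles  # start with everything on the left
    extremes = 0
    for _ in range(steps):
        # The probability that the hopping particle leaves the left
        # side is proportional to how many particles are there.
        if rng.random() < left / n_particles:
            left -= 1
        else:
            left += 1
        if left in (0, n_particles):
            extremes += 1
    return extremes

# With 6 particles the extreme state recurs thousands of times...
print(count_extreme_states(6, 100_000))
# ...with 40 particles it essentially never does.
print(count_extreme_states(40, 100_000))
```

In the long run a state with n particles on the left occurs with probability C(N, n)/2^N, so for N = 6 the two extremes together appear about 1/32 of the time, while for N = 40 the fraction is of order 10^-12.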
 
I suggest you start at entropy disambiguation on Wikipedia, and follow some of the links. There are many definitions of entropy, all correct but with differing contexts and emphasis.

In some contexts the disorder makes sense. In others the unknown information makes sense. In thermo, the definitions jfizzix gave make sense.

The important thing IMHO is that the second law of thermodynamics applies to all the definitions.
 
Thank you both. That helped a lot.
 
Hi, this is also one that I struggle with. And every time I read something that defines it, I get lost in the jargon and the formulas. I'm good with algebra, but I struggle to tie an equation to a theory or law, which is why analogies work much better for me. There are a few aspects of entropy I'd like to understand and please please do correct me.

The first aspect is the closed system explanation. Is there anywhere outside experimental situations where you would find a closed system? Whenever I try to think of one, I think it's closed but then realize there are outside factors. For example, a kettle that's cooling is doing so because of the room it is in, and the room temperature is affected by airconditioning, external temperatures, insulation etc. And the external temperatures are affected by local weather, heat energy from the sun etc. If there are no closed systems in real situations, are the experimental ones really only for establishing baseline formulas etc?

Secondly, my understanding of entropy is twofold, yet related. I understand it as a kind of equilibrium. Like the boiled kettle going from a higher temperature on a steady slope down to room temperature. In the same way, I understand it as analogous to water finding its lowest place to settle. In that same way, all "ordered" materials will eventually find their lowest point by breaking down and becoming disordered, like a piece of steel or a human body will both eventually become part of the soil again.

One definition I've read is the hardest one of all in some respects. I've read it as the decrease in energy available to do work. Conservation of energy says that we never lose energy, it just changes form. So it's my understanding that "available" energy is like electricity in a battery, which is converted into work. But after the battery is depleted of electricity, it still has energy, but not in a form we can use. In some respects, a battery is both an analogy of entropy and also an observable mechanism of that decrease in available energy, except that it breaks down as both analogy and true observation on so many levels. Energy has been removed from (is no longer part of) the battery and been converted to heat or movement or photons, and so that energy has gone elsewhere. What remains of the depleted battery are elements that can no longer offer energy. And the energy that has gone elsewhere has dispersed, but into what I don't know. Does all "available" energy eventually radiate out to space as heat or photons? I guess this is where a closed system becomes a useful tool.

Apologies for the length and meandering.
 
narrator said:
Hi, this is also one that I struggle with. And every time I read something that defines it, I get lost in the jargon and the formulas. I'm good with algebra, but I struggle to tie an equation to a theory or law, which is why analogies work much better for me. There are a few aspects of entropy I'd like to understand and please please do correct me.

The first aspect is the closed system explanation. Is there anywhere outside experimental situations where you would find a closed system? Whenever I try to think of one, I think it's closed but then realize there are outside factors. For example, a kettle that's cooling is doing so because of the room it is in, and the room temperature is affected by airconditioning, external temperatures, insulation etc. And the external temperatures are affected by local weather, heat energy from the sun etc. If there are no closed systems in real situations, are the experimental ones really only for establishing baseline formulas etc?
You are mistaken about the definition of a closed system. A closed system is one that does not exchange mass with the surroundings, but it is fully capable of exchanging energy with the surroundings in the form of both work and heat.
Secondly, my understanding of entropy is twofold, yet related. I understand it as a kind of equilibrium. Like the boiled kettle going from a higher temperature on a steady slope down to room temperature. In the same way, I understand it as analogous to water finding its lowest place to settle. In that same way, all "ordered" materials will eventually find their lowest point by breaking down and becoming disordered, like a piece of steel or a human body will both eventually become part of the soil again.

One definition I've read is the hardest one of all in some respects. I've read it as the decrease in energy available to do work. Conservation of energy says that we never lose energy, it just changes form. So it's my understanding that "available" energy is like electricity in a battery, which is converted into work. But after the battery is depleted of electricity, it still has energy, but not in a form we can use. In some respects, a battery is both an analogy of entropy and also an observable mechanism of that decrease in available energy, except that it breaks down as both analogy and true observation on so many levels. Energy has been removed from (is no longer part of) the battery and been converted to heat or movement or photons, and so that energy has gone elsewhere. What remains of the depleted battery are elements that can no longer offer energy. And the energy that has gone elsewhere has dispersed, but into what I don't know. Does all "available" energy eventually radiate out to space as heat or photons? I guess this is where a closed system becomes a useful tool.

Apologies for the length and meandering.
The concept of entropy originally developed from the theoretical understanding that evolved during the development of the second law of thermodynamics. The second law was the result of experimental observations. These experiments indicated that all materials and systems must exhibit a unique physical property, which was dubbed entropy. This property applies to the thermodynamic equilibrium states of the material or system.

The second law is captured mathematically by noting that the change in entropy in going from one thermodynamic equilibrium state to another is greater than or equal to the heat transferred to the system, divided by the temperature at the boundary through which the heat flows, for any process path connecting the two states. For a so-called reversible process path (a subset of all the possible process paths between the two equilibrium states), the equal sign applies. This provides a way of measuring or calculating the entropy change between the two equilibrium states.
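In symbols, the inequality described above is the Clausius inequality (my notation, not from the post):

```latex
\Delta S = S_2 - S_1 \;\ge\; \int_1^2 \frac{\delta Q}{T_b},
```

where $\delta Q$ is the heat transferred to the system and $T_b$ is the temperature at the boundary where it crosses. Equality holds for a reversible path, so the entropy change can be computed as $\Delta S = \int_1^2 \delta Q_{\mathrm{rev}}/T$.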

Chet
 
Reading about Maxwell's demon helped me understand better why entropy doesn't decrease in closed systems. It's a thought experiment, very easy to visualize and follow.
 
Enclose said:
Reading about Maxwell's demon helped me understand better why entropy doesn't decrease in closed systems. It's a thought experiment, very easy to visualize and follow.
I think you mean isolated systems, not closed systems. Entropy can certainly decrease in closed systems.

Chet
 
Oops, indeed I do, Chestermiller. Thanks for pointing that out.
 
