Thermodynamics: Explaining Entropy for KiltedEngineer

  • Thread starter: KiltedEngineer
  • Tags: Entropy
SUMMARY

The discussion clarifies the concept of entropy in thermodynamics, emphasizing that it is a measure of energy that is unavailable for doing useful work rather than merely a representation of disorder. Entropy (S) is mathematically defined as S = -k ∑ p_i log(p_i), where k is Boltzmann's constant and p_i represents the probability of the system being in state i. A system with a single state has zero entropy, indicating order, while multiple states lead to higher entropy. The relationship between microstates and macrostates is illustrated through examples such as rolling dice and the behavior of gases during thermodynamic processes.

PREREQUISITES
  • Understanding of basic thermodynamic concepts
  • Familiarity with statistical mechanics
  • Knowledge of Boltzmann's constant
  • Basic mathematical skills for probability and logarithms
NEXT STEPS
  • Study the implications of the second law of thermodynamics
  • Learn about the relationship between entropy and temperature in thermodynamic processes
  • Explore statistical mechanics and its application to thermodynamic systems
  • Investigate the concept of microstates and macrostates in greater detail
USEFUL FOR

Mechanical engineering students, physics enthusiasts, and professionals in thermodynamics seeking to deepen their understanding of entropy and its implications in energy systems.

KiltedEngineer
I am a mechanical engineering student, but have yet to take thermodynamics. For months now I have been reading up on thermo and am actually quite interested in it. However, I am still quite confused by the concept of entropy. It's pretty much the consensus that the "disorder" explanation is not accurate, and that entropy is instead a measure of the energy that is wasted or unavailable to do useful work in a system. Can someone explain how this concept of disorder arises from the mathematical definition, and whether there is any truth to it?

Thank you!
KiltedEngineer
 
One way to define entropy mathematically is to relate it to the probability distribution describing a system: S = -k ∑_{i=1}^{N} p_i log(p_i)
The index i labels the different possible configurations of the system, of which there are N in total, k is Boltzmann's constant, and p_i is the probability that the system is in state i. [Each p_i is a number between 0 and 1, so its logarithm lies between -∞ and 0; each term -p_i log(p_i) is therefore non-negative, and the entropy is always greater than or equal to zero.]
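A quick numerical sketch of this definition (taking k = 1 and natural logarithms for simplicity):

```python
import math

def entropy(probs, k=1.0):
    """Gibbs/Shannon entropy S = -k * sum(p_i * log(p_i)).
    Terms with p_i = 0 contribute nothing, since p*log(p) -> 0 as p -> 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# One certain state (N = 1, p_1 = 1): entropy is zero -- "ordered".
s_one = entropy([1.0])

# Four equally likely states: entropy is log(4), the maximum for N = 4.
s_four = entropy([0.25] * 4)
```

With a single certain state the sum collapses to 1·log(1) = 0, matching the "zero entropy means perfect order" statement below; spreading probability over more states drives the value up.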

If a system has only one possible state, then N = 1 and p_1 = 1, and thus S = 0. Thus if we know exactly the configuration of a system, then it has zero entropy, and we can describe this state as "ordered." For example, if we have a crystal which is so cold that all the atoms sit exactly in their lattice sites, then we know exactly the state of the crystal, so its entropy is zero. The regular pattern of the location of the atoms is what we refer to as the crystal's order.

On the other hand, if the system could be in many different states (i.e. N is big and many of the p_i's are non-negligible compared to the rest), then the entropy will be large. As an example, when we heat up a crystal, the atoms start wiggling in their lattice sites, so if we were to look at the crystal at various times, the atoms could be in many different possible positions. Thus it has a higher entropy than a cool crystal. There is somewhat less order when the atoms do not conform exactly to the lattice pattern.
 
"Disorder" is really an imprecise word to describe entropy because it makes you define the word "order". We're saying that ordered means there are relatively few microstates that correspond to the macrostate. Disorder just means the opposite, that a larger number of microstates correspond to the macrostate.

An example of how microstates and macrostates are related is rolling a pair of dice. There are 11 macrostates of the system, 2-12, the sum of the numbers on the dice. There are 36 microstates, the combinations of ways the two dice can roll. There are 6 microstates that correspond to the macrostate "7", but 1 microstate that corresponds to the macrostate "2".
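The counting in the dice example can be checked directly with a short Python sketch:

```python
from collections import Counter

# Enumerate all 36 microstates of two dice: ordered pairs (a, b).
microstates = [(a, b) for a in range(1, 7) for b in range(1, 7)]

# Group microstates by macrostate: the sum of the two dice (2 through 12).
macrostates = Counter(a + b for a, b in microstates)

# 6 microstates -- (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) -- give the
# macrostate "7", but only (1,1) gives the macrostate "2".
```

Printing `macrostates` shows the familiar triangular distribution: the macrostate "7" is the most "disordered" (most microstates), while "2" and "12" are the most "ordered".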

We can talk about the entropy of a substance, like a gas, without referring to a thermodynamic process. You can think of it as a property, like temperature or pressure. When we measure the temperature of a system, it's really a kind of average of the energy. Ignoring units for a moment, say the average energy per particle is 10 and we have 3 particles in our system. All 3 particles can have an energy of 10, making the average 10. Alternatively, 1 of them can have energy 28 and the other 2 can have energy 1 each, making the average 10 again. There are a number of ways we can distribute the energy among the particles (microstates) and still get the same average (macrostate), just like in the dice example. Classically, the change in entropy is defined as the heat transferred divided by the temperature at which the transfer occurs, dS = δQ/T.
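The counting in that example can be sketched under a hypothetical simplification: suppose each of the 3 particles carries a non-negative integer amount of energy, and the total is 30 (an average of 10 each). The microstates of this macrostate can then be enumerated by brute force:

```python
from itertools import product

# Macrostate: 3 particles sharing a total energy of 30 (average 10 each).
# Microstates: ordered triples of non-negative integer energies summing to 30.
total_energy, n_particles = 30, 3
microstates = [e for e in product(range(total_energy + 1), repeat=n_particles)
               if sum(e) == total_energy]

# Both distributions from the example above are microstates of this macrostate:
# (10, 10, 10) and (28, 1, 1) each sum to 30.
```

By the stars-and-bars formula there are C(32, 2) = 496 such triples, so this one macrostate hides hundreds of microstates even for a toy three-particle system.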

Now, we talk about entropy as well when we refer to a thermodynamic process, like heat transfer. Heat always flows spontaneously from a higher temperature to a lower temperature, and as the energy is transferred, the entropy of the "hotter" source decreases while the entropy of the "colder" source increases. This happens because the number of accessible states is tied to the temperature of the substance. The second law of thermodynamics states that the combined entropy of the total system (the "hot" and "cold" sources together) can never decrease as a result of the heat transfer. Since "disorder" is linked to the number of microstates that correspond to the macrostate, the "disorder" either stays the same or increases.
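This bookkeeping can be illustrated numerically (with made-up temperatures, and assuming reservoirs large enough that their temperatures stay essentially constant, so ΔS ≈ Q/T for each):

```python
# Heat Q flows spontaneously from a hot reservoir to a cold one.
Q = 100.0        # joules transferred (hypothetical value)
T_hot = 500.0    # kelvin (hypothetical)
T_cold = 300.0   # kelvin (hypothetical)

dS_hot = -Q / T_hot      # hot source loses entropy:  -0.2 J/K
dS_cold = +Q / T_cold    # cold source gains entropy: +0.333... J/K
dS_total = dS_hot + dS_cold   # positive, as the second law requires
```

The cold reservoir gains more entropy per joule than the hot one loses (because it divides by a smaller T), so ΔS_total > 0 for any T_hot > T_cold; the total entropy would only stay constant in the limit T_hot = T_cold, where no net heat flows.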
 
