Thermodynamics: Explaining Entropy for KiltedEngineer

  • Thread starter: KiltedEngineer
  • Tags: Entropy
AI Thread Summary
Entropy is mathematically defined as S=-k ∑ p_i log(p_i), where S represents entropy, k is Boltzmann's constant, and p_i is the probability of the system being in state i. A system with a single possible state has zero entropy, indicating a high level of order, while systems with multiple states exhibit higher entropy, reflecting greater disorder or complexity. The concept of disorder is misleading; it is more accurate to describe entropy in terms of the number of microstates corresponding to a macrostate. In thermodynamic processes, heat transfer results in a decrease in entropy for the hot source and an increase for the cold source, ensuring that the total entropy of the system never decreases. Understanding entropy involves recognizing its relationship to energy distribution and the statistical nature of thermodynamic states.
KiltedEngineer
I am a mechanical engineering student, but I have yet to take thermodynamics. For months now I have been reading up on thermo and am actually quite interested in it. However, I am still quite confused by the concept of entropy. It's pretty much the consensus that the "disorder" explanation is not accurate, and that entropy is instead a measure of the energy that is wasted or unavailable to do useful work in a system. Can someone explain how this concept of disorder arises from the mathematical definition, and whether there is any truth to it?

Thank you!
KiltedEngineer
 
One way to mathematically define entropy S is to relate it to the probability distribution describing a system: S = -k \sum_{i=1}^{N} p_i \log(p_i)
The index i labels the different possible configurations of the system, of which there are N in total, k is Boltzmann's constant, and p_i is the probability that the system is in state i. [The p_i are all numbers between 0 and 1, so each \log(p_i) lies between -∞ and 0; every term -p_i \log(p_i) is therefore non-negative, and the entropy is always greater than or equal to zero.]

If a system has only one possible state, then N=1 and p_1=1, and thus S=0. So if we know the exact configuration of a system, it has zero entropy, and we can describe this state as "ordered." For example, if we have a crystal that is so cold that all the atoms sit exactly on their lattice sites, then we know the state of the crystal exactly, so its entropy is zero. The regular pattern of the atoms' locations is what we refer to as the crystal's order.

On the other hand, if the system could be in many different states (i.e., N is big and many of the p_i are non-negligible compared to the rest), then the entropy will be large. As an example, when we heat up a crystal, the atoms start wiggling about their lattice sites, so if we were to look at the crystal at various times, the atoms could be in many different positions. Thus it has a higher entropy than the cold crystal. There is somewhat less order when the atoms do not conform exactly to the lattice pattern.
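To make those two limits concrete, here is a minimal Python sketch of the formula above (the function name and the example distributions are just for illustration, with k set to 1):

```python
import math

def entropy(probs, k=1.0):
    """Gibbs entropy S = -k * sum(p_i * log(p_i)); 0*log(0) is taken as 0."""
    return k * sum(-p * math.log(p) for p in probs if p > 0)

# One possible state (N = 1, p_1 = 1): we know the configuration exactly.
print(entropy([1.0]))                     # 0.0

# Four equally likely states: S = k*log(4), the maximum possible for N = 4.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386

# Four states but one dominates: fewer effectively accessible states, smaller S.
print(entropy([0.85, 0.05, 0.05, 0.05]))  # ~0.588
```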
 
"Disorder" is really an imprecise word to describe entropy because it makes you define the word "order". We're saying that ordered means there are relatively few microstates that correspond to the macrostate. Disorder just means the opposite, that a larger number of microstates correspond to the macrostate.

An example of how microstates and macrostates are related is rolling a pair of dice. There are 11 macrostates of the system, the sums 2 through 12. There are 36 microstates, the ordered combinations of ways the two dice can land. Six microstates correspond to the macrostate "7," but only one microstate corresponds to the macrostate "2."
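Here's a small Python sketch of that dice count (just an illustration, with k set to 1), tallying how many microstates back each macrostate:

```python
import math
from collections import Counter

# Every ordered roll of two dice is a distinct microstate: 36 in total.
microstates = [(a, b) for a in range(1, 7) for b in range(1, 7)]

# The macrostate is the sum of the two dice; tally microstates per macrostate.
counts = Counter(a + b for a, b in microstates)

for total in sorted(counts):
    omega = counts[total]   # number of microstates for this macrostate
    s = math.log(omega)     # Boltzmann entropy S = k*ln(omega), with k = 1
    print(f"sum {total:2d}: {omega} microstates, S = {s:.3f}")

# sum  2: 1 microstates, S = 0.000   <- only snake eyes; we know the roll exactly
# sum  7: 6 microstates, S = 1.792   <- the most "disordered" macrostate
```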

We can talk about the entropy of a substance, like a gas, without referring to a thermodynamic process. You can think of it as a property, like temperature or pressure. When we measure the temperature of a system, it's really a kind of average of the total energy. Setting units aside for a moment, say the energy of the system is 10 per particle on average and we have 3 things in our system. All 3 things can have an energy of 10, making the average 10. Alternatively, 1 of them can have energy 28 and the other 2 can have energy 1, making the average 10 again. There are a number of ways we can distribute the energy among the particles (microstates) and still get the same average (macrostate), just like in the dice example. The classical definition of entropy is tied to heat rather than counted states: the change in entropy is the heat transferred reversibly divided by the temperature, dS = δQ_rev/T.
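A minimal sketch of that counting idea in Python (the whole-number energy units and particle count are assumptions for illustration): enumerate every way 3 particles can share 30 total units, i.e., every microstate of the macrostate "average energy 10."

```python
import math
from itertools import product

N_PARTICLES = 3    # the 3 "things" from the example above
TOTAL_ENERGY = 30  # 30 total units -> average of 10 per particle

# Brute-force every assignment of whole-number energies and keep the ones
# whose total matches the macrostate (fine at this tiny scale).
microstates = [e for e in product(range(TOTAL_ENERGY + 1), repeat=N_PARTICLES)
               if sum(e) == TOTAL_ENERGY]

omega = len(microstates)
print(omega)            # 496 microstates share the macrostate "average = 10"
print(math.log(omega))  # Boltzmann entropy S = k*ln(omega) ~ 6.21, with k = 1
```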

Now, we also talk about entropy when we refer to a thermodynamic process, like heat transfer. Heat always flows from a higher temperature to a lower temperature, and as the energy is transferred, the entropy of the "hotter" source decreases while the entropy of the "colder" source increases. This happens because the number of accessible states is tied to the temperature of the substance. The second law of thermodynamics states that the combined entropy of the total system (the "hot" and "cold" sources together) can never decrease as a result of the heat transfer. Since "disorder" is linked to the number of microstates that correspond to the macrostate, the "disorder" either stays the same or increases.
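A quick worked example with assumed numbers: suppose Q = 100 J of heat flows from a hot reservoir at T_{hot} = 400 K to a cold one at T_{cold} = 300 K.

\Delta S_{hot} = -Q/T_{hot} = -100/400 = -0.25 J/K
\Delta S_{cold} = +Q/T_{cold} = +100/300 ≈ +0.33 J/K
\Delta S_{total} = -0.25 + 0.33 ≈ +0.08 J/K > 0

The cold reservoir gains more entropy than the hot one loses because the same heat is divided by a smaller temperature, so the total entropy increases, exactly as the second law requires.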
 