Trying to reconcile two definitions of Entropy

In summary, entropy is a measure of the tendency to disperse heat and is related to temperature: when two systems at different temperatures are placed in contact, the colder body's entropy increases by more than the warmer body's decreases, so the total entropy grows. Entropy is also a measure of the number of states of a system, and it can be reduced by supplying more information. The "order" analogy should be used carefully, as it can be hard to define what counts as ordered. The distinction between macrostates and microstates is central to understanding entropy: a macrostate with more accessible microstates is a higher-entropy state. In the Ising model, entropy plays a role in the breakdown of magnetism and in phase transitions.
  • #1
badatstuff
My question is regarding a few descriptions of entropy. I'm actually unsure whether my understanding of each version is correct, so I'm looking for a two-birds-with-one-stone answer that fixes my misunderstanding of each and then hopefully links them together.

1) A measure of the tendency to disperse heat: e.g. the thermodynamic relation $(\partial S/\partial U)_V = 1/T$ says that if you increase the energy of a system while T is low, it has more of an effect on the entropy than if you increased the energy by the same amount in a system with high T.

If you place two systems S_1 and S_2 in contact with each other, with temperatures T_1 > T_2, they eventually reach equilibrium. But the 1/T relation means the entropy of the colder body increases faster than the entropy of the warmer body decreases?
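To make the bookkeeping explicit (a worked step added here, using δQ for a small amount of heat passed from body 1 to body 2):

$$dS_{\text{total}} = -\frac{\delta Q}{T_1} + \frac{\delta Q}{T_2} = \delta Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right) > 0 \quad \text{for } T_1 > T_2.$$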

2) Entropy is a measure of "disorder": e.g. a typical textbook example - someone indifferent to the final outcome stacks books on a bookshelf in completely random orientations, and the odds of them all ending up in alphabetical order are low because of the number of possible states. Entropy is a measure of the number of states. You can reduce the number of states by giving the book stacker more information: telling him he can only stack them upright reduces the number of permutations, and so reduces the entropy. In terms of a system of particles, more energy means more accessible states and so higher entropy?
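As a rough sketch of the counting argument (the book and orientation numbers below are made up for illustration):

```python
import math

# Illustrative model: N distinguishable books, each of which can sit in one
# of k orientations. The number of shelf arrangements is W = N! * k**N, and
# a Boltzmann-style entropy is S = ln(W) (taking k_B = 1).
def shelf_entropy(n_books, n_orientations):
    w = math.factorial(n_books) * n_orientations ** n_books
    return math.log(w)

# Telling the stacker "upright only" removes the orientation freedom,
# shrinking the number of allowed states and hence the entropy.
print(shelf_entropy(10, 4))  # random orientations allowed
print(shelf_entropy(10, 1))  # upright only: fewer states, lower entropy
```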

I'm struggling to see the similarity between having a lack of information about a system and seeing a colder body and wanting to give it heat, i.e. feeling the need to ruin its "order" too and make it disordered.

Thanks for your time
 
  • #2
DrClaude
badatstuff said:
1) A measure of the tendency to disperse heat: e.g. the thermodynamic relation $(\partial S/\partial U)_V = 1/T$ says that if you increase the energy of a system while T is low, it has more of an effect on the entropy than if you increased the energy by the same amount in a system with high T.
That equation actually says more about temperature than it does about entropy. You can take it as the thermodynamic definition of temperature.

badatstuff said:
If you place two systems S_1 and S_2 in contact with each other, with temperatures T_1 > T_2, they eventually reach equilibrium. But the 1/T relation means the entropy of the colder body increases faster than the entropy of the warmer body decreases?
Correct, so the overall entropy increases when heat flows from warm to cold.
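A quick numeric check of that statement (illustrative values, not from the thread):

```python
# Move a small amount of heat q from a hot body at T1 to a cold body at T2.
T1, T2, q = 400.0, 200.0, 1.0    # K, K, J (arbitrary example values)

dS_hot  = -q / T1    # warm body loses entropy slowly (divided by large T)
dS_cold = +q / T2    # cold body gains entropy quickly (divided by small T)

print(dS_hot + dS_cold)          # positive: -1/400 + 1/200 = +0.0025 J/K
```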

badatstuff said:
2) Entropy is a measure of "disorder": e.g. a typical textbook example - someone indifferent to the final outcome stacks books on a bookshelf in completely random orientations, and the odds of them all ending up in alphabetical order are low because of the number of possible states. Entropy is a measure of the number of states. You can reduce the number of states by giving the book stacker more information: telling him he can only stack them upright reduces the number of permutations, and so reduces the entropy. In terms of a system of particles, more energy means more accessible states and so higher entropy?

I'm struggling to see the similarity between having a lack of information about a system and seeing a colder body and wanting to give it heat, i.e. feeling the need to ruin its "order" too and make it disordered.
You have to be careful with the "order" analogy. It breaks down very rapidly (especially since it can be hard to define what is ordered).

Do you know the difference between a macrostate and a microstate?
 
  • #3
badatstuff
Thank you for your reply.

DrClaude said:
Do you know the difference between a macrostate and a microstate?

Microstates are the different ways you can rearrange your system while it remains in the same macroscopic state - would that be a correct distinction?
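A toy counting example may help fix the distinction (N = 20 spins is an arbitrary choice):

```python
from math import comb, log

# N two-state spins (up/down). The macrostate is "how many spins are up";
# the microstates are the individual arrangements realizing it.
N = 20
for n_up in range(N + 1):
    omega = comb(N, n_up)   # number of microstates in this macrostate
    S = log(omega)          # Boltzmann entropy with k_B = 1
    print(n_up, omega, round(S, 2))

# The n_up = N/2 macrostate has by far the most microstates, so a randomly
# chosen microstate almost certainly looks "half up, half down".
```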

DrClaude said:
You have to be careful with the "order" analogy. It breaks down very rapidly (especially since it can be hard to define what is ordered).

Throughout my undergraduate studies senior lecturers kept referring to disorder/order, and then when I went online to clarify what it meant, people didn't seem to like that definition, so I left "disorder" in quotation marks.

I'm currently doing something on the Ising model, and this was a rabbit hole I entered. I'm trying to solidify my understanding so I can look at what entropy means for the Ising model, e.g. the Curie temperature and the breakdown of magnetism, the relationship between the "disorder" of the spins at high temperature and entropy, as well as its involvement in phase transitions (if any).
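For what it's worth, a minimal Metropolis sketch of the 2D Ising model (assuming J = 1, k_B = 1, and periodic boundaries; the exact critical temperature for this model is about 2.269) shows the ordered/disordered crossover mentioned above:

```python
import math
import random

def metropolis(L=16, T=2.0, steps=200_000, seed=0):
    """Return |magnetization per spin| after a simple Metropolis run."""
    rng = random.Random(seed)
    s = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(steps):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = s[(i + 1) % L][j] + s[(i - 1) % L][j] \
           + s[i][(j + 1) % L] + s[i][(j - 1) % L]
        dE = 2 * s[i][j] * nb                  # cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    return abs(sum(sum(row) for row in s)) / L ** 2

# Below Tc the spins order (|m| near 1, a low-entropy macrostate); above Tc
# thermal fluctuations win and |m| collapses toward 0.
for T in (1.5, 2.27, 3.5):
    print(T, round(metropolis(T=T), 3))
```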
 
  • #4
NFuller
badatstuff said:
I'm struggling to see the similarities between having a lack of information and seeing a colder body and wanting give it heat/feel the need to ruin its "order" too and make it disordered.
Macrostates with a greater number of accessible microstates are higher entropy states than macrostates with fewer microstates. Thus the number of high entropy configurations is larger than the number of low entropy configurations. Since the particle dynamics are considered stochastic, these states are selected at random. This means that the system will have a preference for the high entropy states simply because there are more of them.

If the system is allowed to evolve in time from a non-equilibrium state to an equilibrium state, the system will naturally keep moving between macrostates at random until it finds the one with an overwhelmingly large number of microstates. At this point the system will keep randomly selecting this same configuration and will have reached thermal equilibrium. The second law says that this equilibrium state is also a maximum entropy state. Thus states with a large number of microstates correspond to high entropy states. I recently wrote an article detailing this here https://www.physicsforums.com/insights/statistical-mechanics-part-equilibrium-systems/
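A small simulation in the same spirit (an Ehrenfest-urn-style toy model, not from the linked article): N particles sit in box A or box B, and one randomly chosen particle hops at each step. The macrostate "number in A" drifts toward N/2, the macrostate with the most microstates, and then fluctuates around it.

```python
import random

rng = random.Random(1)
N = 1000
in_A = N                       # start far from equilibrium: all in box A
for step in range(10_001):
    if rng.randrange(N) < in_A:
        in_A -= 1              # the chosen particle was in A; it hops to B
    else:
        in_A += 1              # it was in B; it hops to A
    if step % 2000 == 0:
        print(step, in_A)      # heads toward ~500 and stays nearby
```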
 
  • #5
badatstuff
NFuller said:
Macrostates with a greater number of accessible microstates are higher entropy states than macrostates with fewer microstates. Thus the number of high entropy configurations is larger than the number of low entropy configurations. Since the particle dynamics are considered stochastic, these states are selected at random. This means that the system will have a preference for the high entropy states simply because there are more of them.

If the system is allowed to evolve in time from a non-equilibrium state to an equilibrium state, the system will naturally keep moving between macrostates at random until it finds the one with an overwhelmingly large number of microstates. At this point the system will keep randomly selecting this same configuration and will have reached thermal equilibrium. The second law says that this equilibrium state is also a maximum entropy state. Thus states with a large number of microstates correspond to high entropy states. I recently wrote an article detailing this here https://www.physicsforums.com/insights/statistical-mechanics-part-equilibrium-systems/

Thanks very much for your explanation, as well as the article. Cleared a lot up for me!
 

1. What is entropy and how is it defined?

Entropy is a measure of the disorder or randomness in a system. It is defined differently in different fields, but the most commonly used definition is the thermodynamic one, which treats entropy as a measure of how energy is distributed among the accessible states of a system.
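For reference, the two standard formulas behind that definition (the Clausius and Boltzmann forms):

$$dS = \frac{\delta Q_{\text{rev}}}{T}, \qquad S = k_B \ln \Omega,$$

where $\Omega$ is the number of microstates compatible with the macrostate.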

2. What is the difference between the thermodynamic definition and the information theory definition of entropy?

The thermodynamic definition of entropy is based on the physical properties of a system, such as temperature and energy, while the information theory definition is based on the amount of uncertainty or information in a system. The two definitions are related, but they are not interchangeable.
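The relationship is easy to see side by side: the Gibbs entropy and the Shannon entropy have the same functional form and differ only in the constant and the base of the logarithm (the example distribution below is chosen arbitrarily):

```python
import math

k_B = 1.380649e-23  # J/K

def gibbs(p):
    # Gibbs entropy: -k_B * sum(p ln p), in J/K
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

def shannon(p):
    # Shannon entropy: -sum(p log2 p), in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(gibbs(p), shannon(p))  # same shape, different scale factor and units
```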

3. How do we reconcile these two definitions of entropy?

One way to reconcile the two definitions is to view entropy as a measure of the randomness or uncertainty in a system, whether it is physical or informational. This approach allows for the application of both definitions in different contexts.

4. Can entropy be negative?

In the thermodynamic definition, entropy is always positive or zero. The Shannon entropy of a discrete distribution is likewise never negative; however, the differential entropy of a continuous distribution can be negative when the distribution is highly concentrated, meaning there is little uncertainty and high predictability in the system.
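A concrete instance of that caveat (for a Gaussian density, the differential entropy is h = ½ ln(2πeσ²), which is negative for small σ):

```python
import math

def gaussian_diff_entropy(sigma):
    # Differential entropy of a normal distribution with std dev sigma
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

print(gaussian_diff_entropy(1.0))  # positive
print(gaussian_diff_entropy(0.1))  # negative: a very concentrated density
```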

5. How is entropy used in different fields of science?

Entropy is a fundamental concept in thermodynamics, statistical mechanics, and information theory. It is also used in fields such as biology, ecology, and economics to understand the flow of energy and information in complex systems.
