Entropy and enthalpy difference

  • Context: Undergrad
  • Thread starter: Mr_Bojingles
  • Tags: Difference, Entropy

Discussion Overview

The discussion centers around the concept of entropy, its definitions, and its relationship to the second law of thermodynamics. Participants explore various interpretations of entropy, including its connection to disorder, energy dispersion, and microstates. The conversation reflects a mix of theoretical understanding and personal frustration with the complexities of the topic.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant expresses frustration in understanding entropy, noting conflicting definitions found online, particularly regarding its relationship to disorder and energy dispersion.
  • Another participant introduces the Boltzmann definition of entropy (S = k ln Ω), explaining that it relates to the number of microscopic states a system can occupy, suggesting that systems tend to move towards states of higher entropy.
  • A different viewpoint challenges the disorder analogy, arguing that disorder is subjective and that the relationship between microstates and entropy is more fundamental.
  • One participant discusses a specific example involving two rings of material at low temperatures, arguing that the concept of energy dispersal does not adequately explain the increase in entropy when the rings are brought into contact.
  • Another participant emphasizes the importance of thinking in terms of microstates and suggests that entropy can be viewed as something that "flows" between systems, although clarifying that nothing is actually flowing.
  • There is a mention that entropy is conserved in reversible processes but created during energy transfer in response to gradients, highlighting the complexity of the concept.
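The Boltzmann relation mentioned above can be checked numerically. A minimal sketch, assuming two independent subsystems whose microstate counts multiply when combined (the microstate counts below are illustrative, not physical):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(omega):
    """S = k * ln(Omega) for a system with omega microstates."""
    return K_B * math.log(omega)

# For two independent subsystems the microstate counts multiply,
# so the entropies add: S(O1 * O2) = S(O1) + S(O2).
o1, o2 = 10**6, 10**9          # illustrative microstate counts
s_combined = boltzmann_entropy(o1 * o2)
s_sum = boltzmann_entropy(o1) + boltzmann_entropy(o2)
print(s_combined, s_sum)       # equal up to floating-point precision
```

This additivity is why the logarithm appears in the definition: it turns the multiplicative counting of microstates into the additive behavior expected of a thermodynamic entropy.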

Areas of Agreement / Disagreement

Participants generally agree that entropy is a challenging concept with multiple interpretations, but there is no consensus on a singular definition or understanding. Various competing views on how to conceptualize entropy remain present in the discussion.

Contextual Notes

Participants note the difficulty in defining energy and its relationship to entropy, as well as the subjective nature of disorder. The discussion reflects a range of assumptions and interpretations that are not fully resolved.

  • #31
Count Iblis said:
Yes, unless you let the energy depend on what side is up, this won't affect energy or temperature. So, let's assume that each dot on the dice has a small mass. The energy of a die then depends on which side is up. You can then consider a large number N of dice and then consider how the laws of thermodynamics are modified.

Then the dice aren't fair. See, for example, Las Vegas dice; those are balanced.

http://www.gpigaming.com/usa_products_dice.shtml
 
  • #32
Although it may sometimes make sense to define the information entropy for dice, it is not of much help to call it the "physical" entropy of the dice. A die lying on the table will forever stay with the same number up; it will not explore the microcanonical states accessible to it in principle. In other words, a die is not ergodic. That's an important difference from a thermodynamic system.
 
  • #33
Yes, the "dice number degrees of freedom" will not reach thermal equilibrium. But once the dice are thrown and are rolling, thermodynamics can be applied. When they stop rolling, they will be frozen in some state.

This is similar to how, in the early universe, the neutron/proton ratio was frozen at exp(-Δm c^2/kT), with Δm the mass difference between the neutron and the proton and T the freeze-out temperature, explaining the present-day ratio of hydrogen to helium.
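The freeze-out ratio quoted above can be evaluated with commonly cited values: the measured neutron-proton mass difference is about 1.293 MeV, and the freeze-out temperature is usually quoted as roughly 0.8 MeV in energy units. A sketch:

```python
import math

# Commonly quoted values; T is expressed in energy units (k_B absorbed).
delta_mc2 = 1.293   # MeV, neutron-proton mass difference
t_freeze = 0.8      # MeV, approximate freeze-out temperature

ratio = math.exp(-delta_mc2 / t_freeze)
print(f"n/p at freeze-out ~ {ratio:.2f}")  # roughly 1/5
```

Subsequent neutron decay before nucleosynthesis lowers this ratio somewhat further, which is consistent with the observed helium abundance.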
 
  • #34
But that doesn't answer my other questions.

In particular the one about just stating the number. Of course I could pick any number.

And doesn't that bring us straight to that famous phrase

"In the beginning was the Word"

Where did we stop talking physics?
 
  • #35
DrDu said:
Although it may sometimes make sense to define the information entropy for dice, it is not of much help to call it the "physical" entropy of the dice. A die lying on the table will forever stay with the same number up; it will not explore the microcanonical states accessible to it in principle. In other words, a die is not ergodic. That's an important difference from a thermodynamic system.

It's an interesting problem... if you roll dice but don't look at them, you can apply statistical methods and ergodic processes to make predictions. But once you look at the result, you can't. By the same token, while you are receiving a message from me, the entropy has a meaning. Once you have the message, the entropy is zero, because there is no uncertainty.

Sounds suspiciously like a 'measurement problem'...
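The point about entropy vanishing once the outcome is observed can be made concrete with Shannon entropy. A minimal sketch for a fair die, before and after looking at it:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p), in bits; terms with p == 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before looking: six equally likely faces.
before = shannon_entropy([1/6] * 6)          # log2(6), about 2.585 bits

# After looking: the outcome is known, one face has probability 1.
after = shannon_entropy([1, 0, 0, 0, 0, 0])  # 0 bits, no uncertainty

print(before, after)
```

The entropy here quantifies the observer's uncertainty, not a property of the die itself, which is exactly the distinction drawn in the posts above.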
 
  • #36
In my opinion, after many years of observing many opinions, entropy is the "state" or condition of relative order of the entire process contained within the envelope of a closed system undergoing an irreversible process, with perfect order being 0% entropy and total disorder being 100% entropy. Since the system is closed and the process irreversible, any process will proceed toward balancing its forces, energies, pressures, etc., until no further action is possible. Theory says 100%, but real-world systems never achieve that.

In thermodynamics, 100% entropy would be the total and uniform distribution of all the heat contained within the envelope, with no remaining ability to perform work. Anything less than 100% would imply areas of higher heat concentration, and therefore the ability to perform work.

The key words are "closed system" and "irreversible".
Bob.
 
  • #37
What is the relationship of enthalpy and entropy?
 
  • #38
Enthalpy is a measure of energy. Entropy is direction. When energy decreases, entropy increases. According to Brian Greene (The Fabric of the Cosmos), the entropy of the universe has been increasing since the "Big Bang."
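One standard thermodynamic relation linking the two quantities is the Gibbs free energy, G = H - TS: at constant temperature and pressure, a process is spontaneous when ΔG = ΔH - TΔS is negative. A sketch with illustrative (not measured) numbers:

```python
def delta_g(delta_h, temp, delta_s):
    """delta_G = delta_H - T * delta_S (J/mol, K, J/(mol*K))."""
    return delta_h - temp * delta_s

# Illustrative values only: an endothermic process (delta_H > 0)
# can still be spontaneous if the entropy increase is large enough.
dh = 6.0e3   # J/mol
ds = 22.0    # J/(mol*K)
print(delta_g(dh, 298.15, ds))  # negative at 25 C -> spontaneous
```

The sign of ΔG flips at T = ΔH/ΔS, which is why the same process can be spontaneous at one temperature and not at another.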
 
  • #39
How is energy defined as "unavailable"?
 
