Entropy and enthalpy difference

  • Thread starter: Mr_Bojingles
  • Tags: Difference, Entropy

Summary
Entropy is a complex concept often described as a measure of disorder or the distribution of energy within a system, closely tied to the second law of thermodynamics. It quantifies the number of microscopic states a system can occupy, with higher entropy indicating a greater number of possible configurations. The confusion arises from differing interpretations, such as viewing entropy as disorder versus energy dispersal, which can lead to misconceptions. Understanding entropy through the lens of microstates clarifies that as systems exchange energy, the number of accessible microstates increases, contributing to overall entropy. Ultimately, grasping entropy requires a shift in perspective from traditional definitions to a statistical understanding of energy distribution.
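For reference, the statistical relation the summary is alluding to is Boltzmann's formula (a standard textbook result, not quoted anywhere in the thread), where Ω is the number of microstates compatible with the macroscopic state:

```latex
S = k_B \ln \Omega
```

More accessible microstates mean a larger Ω and hence a larger entropy, which is the "number of possible configurations" picture described above.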
  • #31
Count Iblis said:
Yes, unless you let the energy depend on what side is up, this won't affect energy or temperature. So, let's assume that each dot on the dice has a small mass. The energy of a die then depends on which side is up. You can then consider a large number N of dice and then consider how the laws of thermodynamics are modified.

Then the dice aren't fair. See, for example, Las Vegas dice; those are balanced.

http://www.gpigaming.com/usa_products_dice.shtml
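
To make Count Iblis's suggestion concrete, here is a minimal sketch under purely illustrative assumptions: the face showing n pips is assigned energy E_n = n·ε (the values of ε and kT below are arbitrary choices of mine, not from the thread), and a single die in contact with a heat bath is treated with ordinary Boltzmann statistics:

```python
import math

def die_thermodynamics(eps=1.0, kT=2.0):
    """Canonical ensemble for a single die whose energy depends on the
    upward face: E_n = n * eps for n = 1..6 (an illustrative assumption)."""
    energies = [n * eps for n in range(1, 7)]
    # Boltzmann weights and partition function
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    probs = [w / Z for w in weights]
    # Mean energy and Gibbs/Shannon entropy (in units of k_B)
    mean_E = sum(p * E for p, E in zip(probs, energies))
    S = -sum(p * math.log(p) for p in probs)
    return probs, mean_E, S

probs, mean_E, S = die_thermodynamics()
print("face probabilities:", [round(p, 3) for p in probs])
print("mean energy:", round(mean_E, 3), " entropy/k_B:", round(S, 3))
```

In the limit kT ≫ ε all six faces become equally likely and the entropy approaches ln 6 ≈ 1.79, the fair-die value; at low temperature the die freezes into its lowest-energy face and the entropy drops toward zero.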
 
  • #32
Although it may sometimes make sense to define an information entropy for dice, it is certainly not of much help to call it the "physical" entropy of the dice. A die lying on the table will forever stay with the same number up; it will not explore the microcanonical states accessible to it in principle. In other words, a die is not ergodic. That's an important difference from a thermodynamic system.
 
  • #33
Yes, the "dice number degrees of freedom" will not reach thermal equilibrium. But once they are thrown and are rolling, then thermodynamics can beapplied. When they stop rolling, they will be frozen in some state.

This is analogous to how, in the early universe, the neutron/proton ratio was frozen in at exp(-Δm c²/kT), with Δm the mass difference between the neutron and the proton and T the freeze-out temperature, which explains the present-day ratio of hydrogen to helium.
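
As a rough worked number for this formula (the values below are standard textbook figures, Δm c² ≈ 1.29 MeV for the neutron-proton mass difference and kT ≈ 0.7 MeV for the freeze-out temperature; neither appears in the post itself):

```python
import math

delta_m_c2 = 1.293  # neutron-proton mass difference in MeV (standard value)
kT_freeze = 0.7     # approximate freeze-out temperature in MeV (illustrative)

ratio = math.exp(-delta_m_c2 / kT_freeze)
print(f"n/p at freeze-out ~ {ratio:.2f}")  # roughly 0.16, about 1 neutron per 6 protons
```

Subsequent neutron decay before nucleosynthesis lowers this toward the n/p ≈ 1/7 usually quoted, which is what sets the present helium fraction.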
 
  • #34
But that doesn't answer my other questions.

In particular, the one about just stating the number. Of course I could pick any number.

And doesn't that bring us straight to that famous phrase

"In the beginning was the Word"

Where did we stop talking physics?
 
  • #35
DrDu said:
Although it may sometimes make sense to define an information entropy for dice, it is certainly not of much help to call it the "physical" entropy of the dice. A die lying on the table will forever stay with the same number up; it will not explore the microcanonical states accessible to it in principle. In other words, a die is not ergodic. That's an important difference from a thermodynamic system.

It's an interesting problem... if you roll dice but don't look at them, you can apply statistical methods and ergodic processes to make predictions. But once you look at the result, you can't. By the same token, while you are receiving a message from me, the entropy has a meaning. Once you have the message, the entropy is zero, because there is no uncertainty.

Sounds suspiciously like a 'measurement problem'...
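
The point that the entropy of the message vanishes once it is received can be stated with the Shannon entropy, H = -Σ pᵢ log₂ pᵢ. A minimal sketch, using a fair die as the illustrative example:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p), ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

before = [1/6] * 6          # die thrown but not yet looked at: 6 equally likely faces
after = [0, 0, 0, 1, 0, 0]  # result observed (say a 4): no remaining uncertainty

print(shannon_entropy(before))  # log2(6) ≈ 2.585 bits
print(shannon_entropy(after))   # 0.0 bits
```

The entropy belongs to the probability distribution you assign before looking; observation collapses the distribution and the uncertainty with it.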
 
  • #36
In my opinion, after many years of observing many opinions: entropy is the "state" or condition of relative order of the entire process contained within the envelope of a closed system undergoing an irreversible process, with perfect order being 0% entropy and total disorder being 100% entropy. Since the system is closed and the process irreversible, any process will proceed toward balancing its forces, energies, pressures, etc. until no further action is possible. Theory says 100%, but real-world systems never achieve that.

In thermodynamics, 100% entropy would be the total and uniform distribution of all the heat contained within the envelope, with no remaining ability to perform work. Anything less than 100% would imply regions of higher heat concentration, and therefore the ability to perform work.

The key words are "closed system" and "irreversible".
Bob.
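
The point above about heat concentration enabling work is essentially the Carnot limit, η = 1 - T_cold/T_hot: once the heat is uniformly distributed (T_cold = T_hot) no work can be extracted. A quick numeric sketch with arbitrary illustrative temperatures:

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum fraction of heat convertible to work between two reservoirs (temperatures in kelvin)."""
    return 1 - T_cold / T_hot

print(carnot_efficiency(600, 300))  # 0.5: half the heat can be converted to work
print(carnot_efficiency(300, 300))  # 0.0: uniform temperature, no work possible
```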
 
  • #37
What is the relationship between enthalpy and entropy?
 
  • #38
Enthalpy is a measure of a system's energy content. Entropy indicates the direction of spontaneous change: as usable energy is dissipated, entropy increases. According to Brian Greene (The Fabric of the Cosmos), the entropy of the universe has been increasing since the Big Bang.
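
For the question in #37, the standard textbook relationship between the two quantities (not spelled out in the reply above) runs through the Gibbs free energy:

```latex
H = U + pV, \qquad G = H - TS, \qquad \Delta G = \Delta H - T\,\Delta S
```

At constant temperature and pressure, a process is spontaneous when ΔG < 0, which is how enthalpy changes and entropy changes trade off against one another.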
 
  • #39
How is energy defined as "unavailable"?
 
