A system is always in a well-defined state, so is the entropy constant?

SUMMARY

The discussion centers on the concept of entropy in statistical mechanics, specifically the relationship between an observer's knowledge of a system and its entropy. It establishes that when a system's state is fully specified by the positions and momenta of its particles (a microstate), the entropy is zero. Conversely, when the system is described by macroscopic quantities (a macrostate), the entropy reflects the collection of compatible microstates. The conversation concludes that the variation of entropy is contingent on the observer's knowledge, consistent with Liouville's Theorem: the fine-grained phase-space density is conserved along trajectories, so entropy increases only when the individual particles are no longer tracked.

PREREQUISITES
  • Understanding of statistical mechanics principles
  • Familiarity with microstates and macrostates
  • Knowledge of Liouville's Theorem
  • Basic grasp of phase space and entropy calculations
NEXT STEPS
  • Study the implications of Liouville's Theorem in classical mechanics
  • Explore the mathematical formulation of microstates and macrostates
  • Investigate the role of phase space in statistical mechanics
  • Learn about the relationship between entropy and information theory
USEFUL FOR

This discussion is beneficial for physicists, particularly those specializing in statistical mechanics, as well as students and researchers interested in the foundational concepts of entropy and its implications in thermodynamics.

spocchio
This is my apparently trivial problem, which probably means I haven't understood what entropy is.

We have all faced the statement that entropy always increases.
With the power of statistical mechanics we can calculate the variation of entropy between two states, as the difference between the entropy of the final state and that of the initial state.
To do this we suppose that the distribution of our system is

\rho = \frac{1}{Z}\, e^{-\beta H(q,p)}
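
For reference, the entropy being computed from this distribution is the Gibbs entropy,

S = -k_B \int \rho \ln \rho \; d^{3N}q \, d^{3N}p ,

which for the canonical \rho above evaluates to S = k_B \left( \ln Z + \beta \langle H \rangle \right).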

In reality the distribution of a system is always a Dirac delta in phase space, so there is no phase-space integral to do to calculate the entropy, and the entropy is therefore always constant.

\rho = \prod_i \delta(q_i - q_{i,t})\,\delta(p_i - p_{i,t})
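
Taken literally, the Gibbs integral of this delta distribution diverges, which is why one usually coarse-grains phase space into cells of volume h^{3N}. A distribution concentrated in a single cell then corresponds to exactly one microstate, giving

S = k_B \ln 1 = 0 ,

so a fully specified state carries zero entropy.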

It seems to me that entropy depends on the knowledge that we have of the system. Observers with deeper knowledge should see a different variation of entropy; in particular, someone who knows exactly the position and momentum of each particle should see no variation of entropy.
Am I right?
 
Bill_K
Yes, you're exactly right. If the state of a system is specified completely in terms of the positions and momenta of each individual particle, it's called a "microstate". Such a state has zero entropy. If the state is specified in terms of macroscopic quantities such as density and pressure, it's called a "macrostate." A macrostate is a collection of microstates.
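
In Boltzmann's formulation this counting is explicit: a macrostate compatible with \Omega microstates has entropy

S = k_B \ln \Omega ,

so the entropy measures how many microscopic configurations the macroscopic description leaves undetermined.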
spocchio said:
someone who knows exactly the position and momentum of each particle should see no variation of entropy.
And even stronger: someone who follows the position and momentum of each particle will not see a variation of entropy (Liouville's Theorem). An increase in entropy happens only when you fail to track the individual particles.
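
Concretely, Liouville's Theorem says the fine-grained density is conserved along the Hamiltonian flow,

\frac{d\rho}{dt} = \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0 ,

and a conserved \rho makes the Gibbs entropy -k_B \int \rho \ln \rho \, dq \, dp exactly constant in time; only coarse-graining lets it grow.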
 
Studiot
spocchio said:
our system
Should you not add the word 'closed' before 'system'?
 
spocchio
Studiot said:
Should you not add the word 'closed' before 'system'?
Nice question; I don't know the answer. What would change?

Bill_K said:
Yes, you're exactly right.
Can we say that it is not the knowledge of H(q,p) that gives the desired entropy? In conclusion, in statistical mechanics only \rho(q,p) uniquely determines the entropy, with the requirement that \frac{d\rho}{dt}=0.
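
A minimal numerical sketch of this observer-dependence (not from the thread; discrete cells stand in for coarse-grained phase space, and k_B = 1):

import numpy as np

def gibbs_entropy(p):
    """Discrete Gibbs entropy -sum(p ln p); empty cells contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # the limit of p ln p as p -> 0 is 0
    return -np.sum(p * np.log(p))

n_cells = 1000

# Observer who knows the exact microstate: all probability in one cell.
delta = np.zeros(n_cells)
delta[42] = 1.0
print(gibbs_entropy(delta))   # 0.0

# Observer with only macroscopic knowledge: probability spread over all cells.
coarse = np.full(n_cells, 1.0 / n_cells)
print(gibbs_entropy(coarse))  # ln(1000), about 6.91

The exact microstate ("delta") observer assigns zero entropy; the coarse observer assigns ln(1000), illustrating that the entropy reflects the distribution \rho one assigns, not the underlying trajectory.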
 
