A system is always in a well-defined state, so the entropy is constant?

Discussion Overview

The discussion revolves around the concept of entropy in statistical mechanics, particularly whether a system is always in a well-defined state and how this affects the calculation of entropy. Participants explore the relationship between microstates, macrostates, and the knowledge of the observer regarding the system's state.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant suggests that if a system's distribution is a Dirac delta function in phase space, then entropy remains constant, as there is no integral to calculate.
  • Another participant agrees, stating that a complete specification of a system's state in terms of positions and momenta results in zero entropy, distinguishing between microstates and macrostates.
  • There is a query about whether the term "closed" should be added before "system," indicating uncertainty about its implications.
  • A later reply questions whether it is knowledge of the Hamiltonian that determines the entropy, proposing that the distribution function alone uniquely determines the entropy, subject to the stationarity condition dρ/dt = 0.

Areas of Agreement / Disagreement

Participants generally agree on the relationship between knowledge of the system and entropy, but there is uncertainty regarding the implications of defining the system as "closed" and the role of the Hamiltonian in determining entropy.

Contextual Notes

The discussion includes assumptions about the definitions of microstates and macrostates, as well as the conditions under which entropy is considered constant. There are unresolved questions about the implications of system closure and the role of the Hamiltonian in entropy calculations.

spocchio
This is my apparently trivial problem, which probably means I haven't understood what entropy is.

We have all faced the statement that entropy always increases.
With the power of statistical mechanics we can calculate the variation of entropy between two states, taking the difference between the entropy of the final state and that of the initial state.
To do this, we suppose that the distribution of our system is

\rho = \exp(-\beta H(q,p))

In reality, however, the distribution of a system is always a Dirac delta in phase space, so there is no integral over phase space left to perform in calculating the entropy, and the entropy is therefore always constant:

\rho = \prod_i \delta(q_i - q_i(t))\,\delta(p_i - p_i(t))
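
(To be explicit about the definition implied here: by entropy I mean the Gibbs entropy,

S = -k_B \int \rho \ln \rho \; d^{3N}q \, d^{3N}p

which is a functional of the distribution \rho alone.)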

It seems to me that entropy depends on the knowledge that we have of the system. Observers with deeper knowledge should see a different variation of entropy; in particular, someone who knows exactly the position and momentum of each particle should see no variation of entropy.
Am I right?
 
Yes, you're exactly right. If the state of a system is specified completely in terms of the positions and momenta of each individual particle, it's called a "microstate". Such a state has zero entropy. If the state is specified in terms of macroscopic quantities such as density and pressure, it's called a "macrostate." A macrostate is a collection of microstates.
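One way to see the zero is Boltzmann's counting formula: if W is the number of microstates compatible with what you know,

S = k_B \ln W

and a complete specification means W = 1, hence S = 0.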
spocchio said:
someone who knows exactly the position and momentum of each particle should see no variation of entropy.
Even more strongly, someone who follows the position and momentum of each particle will not see a variation of entropy (Liouville's theorem). An increase in entropy only happens when you fail to track the individual particles.
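
A minimal numerical sketch of that last point (the free-streaming model, cell size, and initial condition below are illustrative choices, not anything from the thread): the exact microstate is tracked at every step, yet the entropy computed from coarse cell occupations still rises.

import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative): N free particles on a periodic 1D box [0, 1),
# started in a small region -- a low-entropy macrostate.
N = 100_000
x = rng.uniform(0.0, 0.1, N)    # every particle initially in [0, 0.1)
v = rng.normal(0.0, 1.0, N)     # Maxwellian velocities

def coarse_entropy(positions, n_cells=20):
    # Coarse-grained (Boltzmann-style) entropy of the cell occupations,
    # S = -sum_i p_i ln p_i, in units of k_B; empty cells contribute 0.
    counts, _ = np.histogram(positions, bins=n_cells, range=(0.0, 1.0))
    p = counts[counts > 0] / positions.size
    return -np.sum(p * np.log(p))

dt = 0.01
for step in range(201):
    if step % 50 == 0:
        # The exact microstate (x, v) is in hand at every step, so an observer
        # tracking it loses nothing; the coarse-grained entropy, computed only
        # from cell occupations, still rises toward its maximum ln(n_cells).
        print(f"t = {step * dt:4.2f}   coarse-grained S = {coarse_entropy(x):.3f}")
    x = (x + v * dt) % 1.0      # exact, reversible free streaming

Reversing every velocity at the end and running the same loop again would march the coarse-grained entropy back down, which is the sense in which the increase reflects discarded information rather than the microscopic dynamics.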
 
spocchio said:
our system

Should you not add the word 'closed' before system?
 
Studiot said:
Should you not add the word 'closed' before system?
Nice question; I don't know the answer. What would it change?

Bill_K said:
Yes, you're exactly right.
Can we say, then, that it is not the knowledge of H(q,p) that gives the desired entropy? In conclusion, (in statistical mechanics) it is only \rho(q,p) that uniquely determines the entropy, with the requirement that \frac{d\rho}{dt}=0.
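
A sketch of why this holds for the Gibbs entropy (a standard argument, assuming boundary terms in phase space vanish): S[\rho] = -k_B \int \rho \ln \rho \, d\Gamma is a functional of \rho alone, and the Hamiltonian enters only through the Liouville equation \partial_t \rho = \{H, \rho\}. Along that evolution,

\frac{dS}{dt} = -k_B \int \{H, \rho\}\,(\ln\rho + 1)\, d\Gamma = -k_B \int \{H, \rho\ln\rho\}\, d\Gamma = 0

since the Poisson bracket with H integrates to zero over phase space. So H fixes how \rho moves, but at each instant the entropy is read off from \rho itself.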
 
