A system is always in a well-defined state, so the entropy is constant?

  • Thread starter spocchio
  • #1
This is my apparently trivial problem, which probably means I haven't understood what entropy is.

We have all encountered the statement that entropy always increases.
With statistical mechanics we can calculate the change in entropy between two states, as the difference between the entropy of the final state and that of the initial state.
To do this we assume that the distribution of our system is, for example, the canonical one:

[itex]\rho = \frac{1}{Z}\, e^{-\beta H(q,p)}[/itex]

In reality, however, the distribution of a system is always a Dirac delta in phase space, so there is no genuine phase-space integral to do when calculating the entropy, and the entropy is therefore always constant:

[itex]\rho=\prod_i \delta\!\left(q_i-q_i(t)\right)\,\delta\!\left(p_i-p_i(t)\right)[/itex]
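
To be explicit about which entropy I mean, take the Gibbs definition as a working sketch (my assumption here, with the usual factor of [itex]k_B[/itex]):

[tex]S[\rho] = -k_B \int \rho(q,p)\,\ln\rho(q,p)\; dq\, dp[/tex]

For the canonical [itex]\rho[/itex] above this gives the familiar [itex]S = k_B\left(\ln Z + \beta\langle H\rangle\right)[/itex], but for the delta distribution there is no spread in phase space at all (strictly speaking the integral is not even well defined for a delta), so an exactly known microstate carries no entropy that could change.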

It seems to me that entropy depends on the knowledge we have of the system. Observers with deeper knowledge should see a different change in entropy; in particular, someone who knows exactly the position and momentum of each particle should see no change in entropy at all.
Am I right?
 

Answers and Replies

  • #2
Bill_K
Science Advisor
Insights Author
Yes, you're exactly right. If the state of a system is specified completely in terms of the positions and momenta of each individual particle, it's called a "microstate". Such a state has zero entropy. If the state is specified in terms of macroscopic quantities such as density and pressure, it's called a "macrostate." A macrostate is a collection of microstates.
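
One way to make "zero entropy" concrete is the Boltzmann counting form (a sketch, with [itex]\Omega[/itex] taken as the number of microstates compatible with the given macrostate):

[tex]S = k_B \ln \Omega[/tex]

A completely specified microstate is compatible only with itself, so [itex]\Omega = 1[/itex] and [itex]S = k_B \ln 1 = 0[/itex], whereas a macrostate such as "given density and pressure" is compatible with an enormous number of microstates and has [itex]S > 0[/itex].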
someone who knows exactly the position and momentum of each particle should see no change in entropy at all.
And even stronger: someone who follows the position and momentum of each particle will not see any change in entropy (Liouville's theorem). An increase in entropy only happens when you fail to track the individual particles.
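
A quick sketch of why, assuming the Gibbs entropy [itex]S=-k_B\int\rho\ln\rho\;dq\,dp[/itex] from the first post: Liouville's theorem says the fine-grained density is constant along the Hamiltonian flow,

[tex]\frac{d\rho}{dt} = \frac{\partial\rho}{\partial t} + \{\rho, H\} = 0,[/tex]

and that flow preserves phase-space volume, so the integral [itex]\int\rho\ln\rho\;dq\,dp[/itex] is simply carried along unchanged and [itex]\frac{dS}{dt}=0[/itex]. Only a coarse-grained entropy, computed after averaging [itex]\rho[/itex] over cells you cannot resolve, can increase.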
 
  • #3
our system
Should you not add the word 'closed' before system?
 
  • #4
Should you not add the word 'closed' before system?
Nice question. I don't know the answer; what would change?

Yes, you're exactly right.
Can we say that it is not the knowledge of H(q,p) that gives the desired entropy? In other words, in statistical mechanics it is only [itex]\rho(q,p)[/itex] that uniquely determines the entropy, with the requirement that [itex]\frac{d \rho}{dt}=0[/itex].
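
Written out with the same Gibbs definition as before (so still under that assumption), the entropy is a functional of [itex]\rho[/itex] alone,

[tex]S[\rho] = -k_B \int \rho \ln \rho \; dq\, dp ,[/tex]

and [itex]H(q,p)[/itex] enters only through whichever [itex]\rho[/itex] we assign to the system (for example the canonical one), while the requirement [itex]\frac{d\rho}{dt}=0[/itex] guarantees that this [itex]S[/itex] does not change in time.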
 
