A system is always in a well-defined state, so the entropy is constant?

AI Thread Summary
Entropy is fundamentally linked to the knowledge of a system's microstates versus macrostates. When a system's state is fully specified by the positions and momenta of its particles, it is a single microstate and has zero entropy, and Liouville's theorem guarantees that the entropy stays constant under the exact dynamics. In statistical mechanics, entropy increase arises when the individual particles are not tracked and only macroscopic quantities are retained. The discussion also raises the question of whether the term "closed system" should be specified. Ultimately, it is the distribution function, rather than the Hamiltonian alone, that uniquely determines the entropy in statistical mechanics.
spocchio
This is my apparently trivial problem, which probably means I haven't understood what entropy is.

We have all faced the statement that entropy always increases.
With the power of statistical mechanics we can calculate the variation of entropy between two states, as the difference between the entropy of the final state and that of the initial state.
To do so, we suppose that the distribution of our system is the canonical one,

\rho = e^{-\beta H(q,p)} / Z

In reality, the distribution of a system is always a Dirac delta in phase space, so there is no phase-space integral to perform when computing the entropy, and the entropy is therefore always constant:

\rho = \prod_i \delta(q_i - q_{i,t}) \, \delta(p_i - p_{i,t})

It seems to me that entropy depends on the knowledge that we have of the system. Observers with deeper knowledge should see a different variation of entropy; in particular, someone who knows exactly the position and momentum of each particle should see no variation of entropy.
Am I right?
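
A quick numerical sketch of the contrast (my own illustration, not from the thread; the energy levels and temperature are arbitrary): the discrete Gibbs entropy S = -\sum_i p_i \ln p_i is positive for a Boltzmann distribution but exactly zero for a delta distribution.

import numpy as np

def gibbs_entropy(p):
    # Entropy in units of k_B; p_i = 0 terms contribute nothing (0 ln 0 -> 0)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

energies = np.linspace(0.0, 5.0, 50)  # hypothetical discrete energy levels
beta = 1.0                            # assumed inverse temperature

# Canonical distribution: p_i proportional to exp(-beta * E_i), normalized
boltzmann = np.exp(-beta * energies)
boltzmann /= boltzmann.sum()

# Delta distribution: the microstate is known exactly
delta = np.zeros_like(energies)
delta[0] = 1.0

print(gibbs_entropy(boltzmann))  # positive: entropy of the canonical ensemble
print(gibbs_entropy(delta))      # 0.0: a fully known state has zero entropy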
 
Yes, you're exactly right. If the state of a system is specified completely in terms of the positions and momenta of each individual particle, it's called a "microstate". Such a state has zero entropy. If the state is specified in terms of macroscopic quantities such as density and pressure, it's called a "macrostate." A macrostate is a collection of microstates.
spocchio said:
someone who knows exactly the position and momentum of each particle should see no variation of entropy.
And even stronger, someone who follows the position and momentum of each particle will not see a variation of entropy (Liouville's theorem). An increase in entropy only happens when you fail to track the individual particles.
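
A small simulation (my own sketch, not part of the thread) of that distinction: every point below is tracked exactly, so the fine-grained entropy never changes, but an observer who only counts how many points fall in each phase-space cell sees the coarse-grained entropy grow. The dynamics here is the Chirikov standard map, an area-preserving map that respects Liouville's theorem; the kick strength K, the grid size, and the initial blob are arbitrary choices.

import numpy as np

def standard_map(q, p, K=2.0):
    # One step of the Chirikov standard map on the unit torus (area-preserving)
    p_new = (p + K * np.sin(2 * np.pi * q) / (2 * np.pi)) % 1.0
    q_new = (q + p_new) % 1.0
    return q_new, p_new

def coarse_entropy(q, p, n_cells=20):
    # Entropy of the histogram over an n_cells x n_cells grid of phase-space cells
    hist, _, _ = np.histogram2d(q, p, bins=n_cells, range=[[0, 1], [0, 1]])
    prob = hist.ravel() / hist.sum()
    prob = prob[prob > 0]
    return -np.sum(prob * np.log(prob))

rng = np.random.default_rng(0)
q = rng.uniform(0.0, 0.05, 10000)  # start with all points in one small cell
p = rng.uniform(0.0, 0.05, 10000)

for step in range(10):
    print(step, coarse_entropy(q, p))
    q, p = standard_map(q, p)

# The coarse-grained entropy rises toward its maximum ln(n_cells^2), even
# though each point's trajectory is followed exactly.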
 
spocchio said:
our system

Should you not add the word 'closed' before system?
 
Studiot said:
Should you not add the word 'closed' before system?
Nice question; I don't know the answer. What would change?

Bill_K said:
Yes, you're exactly right.
Can we say that it is not the knowledge of H(q,p) that gives the desired entropy? In conclusion, in statistical mechanics it is only \rho(q,p) that uniquely determines the entropy, with the requirement that \frac{d\rho}{dt} = 0.
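
To spell that out (standard textbook material, added here for completeness rather than a quote from the thread): the Gibbs entropy is a functional of \rho alone,

S[\rho] = -k_B \int \rho(q,p) \, \ln \rho(q,p) \, d\Gamma

and Liouville's theorem,

\frac{d\rho}{dt} = \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0

says the Hamiltonian flow transports \rho while preserving phase-space volume, so any integral of the form \int f(\rho) \, d\Gamma is a constant of the motion and dS/dt = 0 under the exact dynamics.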
 