This is an apparently trivial problem, which probably means I haven't understood what entropy is.
We have all encountered the statement that entropy always increases.
With the tools of statistical mechanics we can compute the variation of entropy between two states as the difference between the entropy of the final state and that of the initial state.
To do this we assume that the distribution of our system is the canonical one,
[itex]\rho=\frac{e^{-\beta H(q,p)}}{Z}[/itex]
where [itex]Z[/itex] is the partition function.
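If I understand correctly, the entropy computed from this distribution is the Gibbs entropy (writing it out in my own notation, with [itex]k_B[/itex] Boltzmann's constant and the integral running over all of phase space):
[itex]S[\rho]=-k_B\int \rho(q,p)\,\ln\rho(q,p)\,dq\,dp[/itex]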
In reality the distribution of a single system is always a Dirac delta in phase space,
[itex]\rho=\prod_i \delta (q_i-q_{i,t})\,\delta (p_i-p_{i,t})[/itex]
so the phase-space integral that defines the entropy is trivial, and the entropy is always constant.
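If I am not mistaken, the same conclusion follows from Liouville's theorem: the fine-grained distribution is transported unchanged along the Hamiltonian flow, so the Gibbs entropy written above cannot change in time,
[itex]\frac{d\rho}{dt}=\frac{\partial\rho}{\partial t}+\{\rho,H\}=0 \;\Rightarrow\; \frac{dS[\rho]}{dt}=0[/itex]
(For a delta distribution [itex]S[/itex] is formally divergent, but in any case it is time-independent.)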
It seems to me that entropy depends on the knowledge we have of the system. Observers with deeper knowledge should see different variations of entropy; in particular, someone who knows exactly the position and momentum of each particle should see no variation of entropy at all.
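As a toy illustration of what I mean (just a sketch, with coarse-graining cells of my own choosing): if an observer can only locate the system within phase-space cells labelled [itex]i[/itex], with probabilities [itex]p_i[/itex], his entropy is
[itex]S=-k_B\sum_i p_i \ln p_i[/itex]
which vanishes when he knows the exact cell ([itex]p_i=1[/itex] for a single [itex]i[/itex]) and grows as the probabilities spread over more cells.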
Am I right?