Problem understanding entropy (two different definitions?)

AI Thread Summary
Entropy has two main definitions: one related to randomness and the other to energy unavailable for work. Statistical entropy is broader in scope, while thermodynamic entropy applies specifically to systems near equilibrium. The Gibbs entropy formula demonstrates the equivalence of these two concepts under certain conditions, particularly within a canonical ensemble. For systems close to thermodynamic equilibrium, statistical entropy can be accurately approximated by thermodynamic entropy. Understanding this relationship is crucial for grasping the principles of thermodynamics and statistical mechanics.
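For reference, the Gibbs formula the summary refers to, with a minimal sketch of its canonical-ensemble reduction (the notation ##k_B##, ##Z##, ##U##, ##F## is standard but assumed here, not taken from the thread):
$$S = -k_B \sum_i p_i \ln p_i.$$
In the canonical ensemble, ##p_i = e^{-E_i/k_B T}/Z## with ##Z = \sum_i e^{-E_i/k_B T}##, so
$$S = -k_B \sum_i p_i \left( -\frac{E_i}{k_B T} - \ln Z \right) = \frac{U}{T} + k_B \ln Z = \frac{U - F}{T},$$
where ##U = \sum_i p_i E_i## and ##F = -k_B T \ln Z##. This recovers the thermodynamic relation ##F = U - TS##, tying the statistical definition to the energy unavailable for work.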
James Brown
Entropy has two definitions. One is about the degree of randomness and the other is about energy that is not available to do work. What is the relationship between them?
 
James Brown said:
Entropy has two definitions. One is about the degree of randomness and the other is about energy that is not available to do work. What is the relationship between them?
The former is statistical entropy and the latter is thermodynamic entropy. Statistical entropy is more general; thermodynamic entropy is defined only for systems close to thermodynamic equilibrium. It can be shown (see the box in the link in the post above) that, for such near-equilibrium systems, statistical entropy is well approximated by thermodynamic entropy.
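A minimal numerical sketch of this agreement (not from the thread; the energy levels, temperature, and units below are arbitrary illustrative choices) comparing the Gibbs statistical entropy with the thermodynamic expression ##(U - F)/T## for a canonical ensemble:

```python
import numpy as np

k_B = 1.0  # Boltzmann constant in natural units (illustrative assumption)
T = 2.0    # temperature, arbitrary illustrative value
E = np.array([0.0, 1.0, 3.0])  # energy levels of a toy three-level system

# Canonical (Boltzmann) probabilities p_i = exp(-E_i / k_B T) / Z
w = np.exp(-E / (k_B * T))
Z = w.sum()
p = w / Z

# Statistical (Gibbs) entropy: S = -k_B * sum_i p_i ln p_i
S_stat = -k_B * np.sum(p * np.log(p))

# Thermodynamic entropy from U and F: S = (U - F) / T, with F = -k_B T ln Z
U = np.sum(p * E)
F = -k_B * T * np.log(Z)
S_thermo = (U - F) / T

print(S_stat, S_thermo)  # the two values agree to floating-point precision
```

Because the canonical distribution is exactly the one for which the two definitions coincide, the printed values match; for distributions far from equilibrium, only the statistical formula still applies.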
 