Problem understanding entropy (two different definitions?)

  • Context: Undergrad
  • Thread starter: James Brown
  • Tags: Definitions, Entropy
SUMMARY

This discussion clarifies the two definitions of entropy: statistical entropy and thermodynamic entropy. Statistical entropy, which quantifies randomness, is the more general notion, while thermodynamic entropy relates to energy unavailable for work and is defined only for systems near thermodynamic equilibrium. The relationship between the two is demonstrated through the Gibbs entropy formula, which reduces to the classical thermodynamic entropy in the canonical ensemble. This equivalence shows that, near equilibrium, thermodynamic entropy is a good approximation to statistical entropy.

PREREQUISITES
  • Understanding of Gibbs entropy formula
  • Familiarity with classical thermodynamics
  • Knowledge of statistical mechanics
  • Concept of canonical ensembles
NEXT STEPS
  • Study the Gibbs entropy formula in detail
  • Explore classical thermodynamic principles
  • Research statistical mechanics and its applications
  • Investigate canonical ensemble theory and its implications
USEFUL FOR

Students and professionals in physics, particularly those studying thermodynamics and statistical mechanics, as well as researchers interested in the foundational concepts of entropy.

James Brown
Entropy has two definitions: one in terms of the degree of randomness, and one in terms of energy that is not available to do work. What is the relationship between them?
 
James Brown said:
Entropy has two definitions: one in terms of the degree of randomness, and one in terms of energy that is not available to do work. What is the relationship between them?
The former is statistical entropy and the latter is thermodynamic entropy. Statistical entropy is more general, while thermodynamic entropy is only defined for systems close to thermodynamic equilibrium. It can be shown (see the box in the link in the post above) that statistical entropy for systems close to thermodynamic equilibrium can be well approximated by thermodynamic entropy.
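As a concrete check of this equivalence, here is a minimal numerical sketch (the three energy levels and the choice of units with k_B = T = 1 are arbitrary illustrations, not from the thread): the Gibbs entropy S = -Σ p_i ln p_i, evaluated with canonical probabilities p_i = e^(-E_i/kT)/Z, equals the thermodynamic expression S = (U - F)/T with U = ⟨E⟩ and F = -kT ln Z.

```python
import math

# Hypothetical three-level system; energies and kT are arbitrary choices.
energies = [0.0, 1.0, 2.5]
kT = 1.0  # units with k_B = 1 and T = 1

# Canonical (Boltzmann) probabilities: p_i = exp(-E_i/kT) / Z
weights = [math.exp(-E / kT) for E in energies]
Z = sum(weights)
p = [w / Z for w in weights]

# Statistical (Gibbs) entropy: S = -sum_i p_i ln p_i
S_gibbs = -sum(pi * math.log(pi) for pi in p)

# Thermodynamic route: S = (U - F)/T, with U = <E> and F = -kT ln Z
U = sum(pi * E for pi, E in zip(p, energies))
F = -kT * math.log(Z)
S_thermo = (U - F) / kT  # T = 1 in these units

print(S_gibbs, S_thermo)  # the two values agree
```

The agreement is exact here because the probabilities are exactly canonical; for a system only close to equilibrium, the thermodynamic value approximates the statistical one, as noted above.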
 
