Difference between conventional and fundamental entropy?

SUMMARY

The discussion clarifies the distinction between conventional entropy and fundamental entropy as presented in Kittel's "Thermal Physics." Conventional entropy, also referred to as thermodynamic entropy, is measurable and carries units (joules per kelvin), while fundamental entropy, closely related to information entropy or Shannon entropy, is a dimensionless quantity representing the "missing information" about a system's microstate. The two are related by Boltzmann's constant, which connects the measurable thermodynamic state of a system to its underlying microstates. Understanding this relationship is key to grasping the broader role of entropy in both thermodynamics and information theory.

PREREQUISITES
  • Understanding of thermodynamic concepts, including temperature and pressure.
  • Familiarity with statistical mechanics and microstates versus macrostates.
  • Basic knowledge of information theory, particularly Shannon entropy.
  • Awareness of Boltzmann's constant and its significance in thermodynamics.
NEXT STEPS
  • Study the relationship between thermodynamic entropy and information entropy in detail.
  • Explore Boltzmann's equation and its applications in statistical mechanics.
  • Investigate practical examples of entropy in thermodynamic systems.
  • Learn about the implications of entropy in information theory and data compression.
USEFUL FOR

Students of thermal physics, physicists interested in statistical mechanics, and professionals in fields related to thermodynamics and information theory will benefit from this discussion.

tmbrwlf730
I'm reading Thermal Physics by Kittel and I feel I'm having trouble really grasping what entropy is. From my lower-division courses, entropy can pretty much be defined as a measure of the randomness in a system: ice has very little randomness, so lower entropy; water has more randomness because the molecules can move around, so higher entropy. Kittel brings up conventional entropy and fundamental entropy, and I'm having trouble understanding the difference and why we have both. Conventional entropy seems to be the one you learn in lower-division courses; it has units, and it's the fundamental entropy multiplied by Boltzmann's constant. The book states the fundamental entropy is a pure number. So why have both, and how does one benefit over the other in certain problems? Thank you.
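For reference, here is the relation the question is describing, sketched in Kittel's notation: the fundamental entropy σ is the logarithm of the multiplicity g (the number of microstates consistent with the macrostate), and the conventional entropy S follows by multiplying by Boltzmann's constant:

$$\sigma = \ln g, \qquad S = k_B\,\sigma = k_B \ln g, \qquad k_B \approx 1.381 \times 10^{-23}\ \mathrm{J/K}$$

Since g is a count of microstates, σ is dimensionless, while S inherits the units J/K from k_B, which is why the book calls the fundamental entropy a pure number.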
 
What you are calling "fundamental entropy" is, I think, better known as "information entropy", or "Shannon entropy" as applied to thermodynamics. "Conventional entropy" is probably better known as "thermodynamic entropy". I think the best way to think of entropy is as "missing information": any time you have probabilities, you don't completely know what is going on, so you have "missing information", or information entropy.

Take a coin flip. Before you flip the coin, you have a 50 percent probability of heads and 50 percent of tails. Information theory says you are missing one "bit" of information; the entropy is one bit.

In thermodynamics, suppose you have a gas and you know its temperature, pressure, and number of particles. That means you know its thermodynamic state, or "macrostate". But you don't know the position or velocity of every particle of the gas; that's the "microstate". Many different microstates could give you the same macrostate, and which microstate the gas is in is "missing information", the "fundamental entropy" of the gas. Thermodynamic entropy is something you can measure thermodynamically, and it was Boltzmann's great discovery that the two are related by a constant, now called "Boltzmann's constant".
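Here is a small numerical sketch of both pictures (not from the thread; the multiplicity value g is an arbitrary illustrative choice): the Shannon entropy of a fair coin comes out to exactly one bit, and a microstate count g converts to conventional entropy via S = k_B ln g.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value)

def shannon_entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two equally likely outcomes -> one bit of missing information.
print(shannon_entropy_bits([0.5, 0.5]))  # 1.0

def fundamental_entropy(g):
    """Fundamental (dimensionless) entropy: sigma = ln g, g = microstate count."""
    return math.log(g)

def conventional_entropy(g):
    """Conventional (thermodynamic) entropy in J/K: S = k_B * ln g."""
    return K_B * fundamental_entropy(g)

# Hypothetical multiplicity, chosen only to show the scale set by k_B:
g = 1e20
print(fundamental_entropy(g))   # sigma ~ 46.05, a pure number
print(conventional_entropy(g))  # S ~ 6.36e-22 J/K
```

Note how the two notions connect even for the coin: one bit equals ln 2 nats of fundamental entropy, and multiplying by Boltzmann's constant gives k_B ln 2 ≈ 9.57 × 10⁻²⁴ J/K of thermodynamic entropy per bit of missing information.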
 
