Why is Entropy defined as a fraction of heat over temperature?


Discussion Overview

The discussion revolves around the definition of entropy as the ratio of heat to temperature (S = Q/T) and seeks to understand the conceptual basis for this formulation. Participants explore the implications of this definition in the context of thermodynamics, randomness, and probability, while also questioning its intuitive understanding and historical development.

Discussion Character

  • Exploratory
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions how the formula S = Q/T can be understood similarly to density, seeking a logical example that connects heat and temperature to randomness or probability.
  • Another participant discusses differentiating the entropy definition, noting that dQ = T dS means the entropy increases by 1/T per unit of heat energy given to the system.
  • A participant expresses curiosity about the historical reasoning behind defining entropy as dS = dQ/T, suggesting that it feels like a concept that should be more intuitively understood.
  • One participant provides an analogy comparing entropy to wealth distribution in countries, emphasizing the need for a parameter to distinguish different states of systems.
  • Another participant shares a practical example involving thermal energy and an ideal engine, illustrating how energy requirements relate to entropy changes.
  • Several participants emphasize the importance of understanding the change in entropy rather than its absolute value, noting that it cannot be directly measured.

Areas of Agreement / Disagreement

Participants express a range of views on the conceptual understanding of entropy, with no consensus reached on a singular intuitive explanation or historical reasoning for its definition. The discussion remains unresolved regarding the clarity of the formula's implications.

Contextual Notes

Some participants highlight limitations in their understanding of thermodynamics, which may affect their ability to engage deeply with the topic. There is also a recognition that the definition of entropy is context-dependent, particularly in relation to reversible processes.

Who May Find This Useful

This discussion may be of interest to students and enthusiasts of thermodynamics, particularly those seeking to deepen their understanding of entropy and its implications in physical systems.

  • #31
The thermal energy of a classical ideal gas is ##U=f k_{\text{B}} N T/2##, where ##f## is the number of "relevant degrees of freedom" (i.e., ##f=3## for a monatomic, ##f=5## for a diatomic, and ##f=6## for a polyatomic gas), ##k_{\text{B}}## is Boltzmann's constant, ##N## is the number of particles/molecules, and ##T## the (absolute) temperature.

Then you have the ideal-gas law ##p V=N k_{\text{B}} T##, which obviously differs from ##U## only by a factor.
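As a quick numerical check of the two formulas above (the constants are standard values; the one-mole, room-temperature scenario is my own illustration):

```python
# Numerical sketch: thermal energy U and pV for one mole of a
# monatomic ideal gas (f = 3) at T = 300 K.
k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol
f = 3                # degrees of freedom, monatomic gas
N = N_A              # one mole of particles
T = 300.0            # temperature, K

U = f * k_B * N * T / 2   # U = f k_B N T / 2
pV = N * k_B * T          # ideal-gas law: p V = N k_B T

print(U / pV)  # ≈ 1.5, i.e. U = (f/2) p V
```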
 
  • #32
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T stable, if you extract an amount dQ from the system, you increase the order of the molecules. If you add an amount dQ to the system, then you increase the disorder of the molecules (random motion).
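To illustrate why integrating dS = dQ/T takes some care when T is not constant, here is a sketch (my own example, with assumed numbers): heating water with roughly constant specific heat c gives dQ = m c dT, so ΔS = ∫ m c dT / T = m c ln(T_f/T_i).

```python
import math

m = 1.0       # mass of water, kg (assumed)
c = 4186.0    # specific heat of liquid water, J/(kg·K), approximate
T_i = 293.15  # initial temperature, K
T_f = 373.15  # final temperature, K

# Closed-form result of integrating dS = m c dT / T from T_i to T_f
delta_S = m * c * math.log(T_f / T_i)
print(delta_S)  # ≈ 1.0e3 J/K
```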
 
  • #33
binis said:
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T stable, if you extract an amount dQ from the system, you increase the order of the molecules. If you add an amount dQ to the system, then you increase the disorder of the molecules (random motion).
Can you please provide an example for this contention?
 
  • #34
binis said:
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T stable, if you extract an amount dQ from the system, you increase the order of the molecules. If you add an amount dQ to the system, then you increase the disorder of the molecules (random motion).
I don't want to give away too much here, but do please follow up on Chestermiller's suggestion.
How would your statement work out for an ideal gas, for example?
 
  • #35
Chestermiller said:
Can you please provide an example for this contention?
Think of boiling water. Molecules pass from a more "ordered" state (liquid) to a less "ordered" state (gas).
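For a phase change like this, T is constant, so Clausius' dS = dQ/T integrates trivially to ΔS = Q/T with Q = m·L_vap. As a numerical sketch (the mass and latent-heat values are my own, approximate):

```python
m = 1.0         # mass of water, kg (assumed)
L_vap = 2.26e6  # latent heat of vaporization of water, J/kg, approximate
T = 373.15      # boiling point at 1 atm, K

# Entropy change of the water on vaporization: ΔS = Q/T = m L_vap / T
delta_S = m * L_vap / T
print(delta_S)  # ≈ 6.06e3 J/K, large and positive: the gas is less "ordered"
```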
 
  • #36
binis said:
Think of boiling water. Molecules pass from a more "ordered" state (liquid) to a less "ordered" state (gas).
Are you portraying this as a situation where it is difficult to determine the entropy change using the classical equation?
 
  • #37
Philip Koeck said:
I don't want to give away too much here, but do please follow up on Chestermiller's suggestion.
Steam liquefaction is an example.
Philip Koeck said:
How would your statement work out for an ideal gas, for example?
Gas compression. For stable temperature it is ##\Delta S = nR\ln(V_f/V_i)##,
where ##n## is the number of moles, ##V_i## the initial volume, and ##V_f## the final volume.
If you extract an amount of heat, then ##Q<0 \Rightarrow \Delta S<0 \Rightarrow V_f<V_i##.
A smaller volume means a more "ordered" situation.
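A minimal numerical sketch of this compression example (the specific volumes are my own assumption; only their ratio matters):

```python
import math

n = 1.0    # amount of gas, mol (assumed)
R = 8.314  # gas constant, J/(mol·K)
V_i = 2.0  # initial volume, arbitrary units
V_f = 1.0  # final volume: the gas is compressed to half its volume

# Isothermal ideal-gas entropy change: ΔS = n R ln(V_f / V_i)
delta_S = n * R * math.log(V_f / V_i)
print(delta_S)  # ≈ -5.76 J/K: heat is extracted, ΔS < 0, smaller volume
```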
 
  • #38
binis said:
Steam liquefaction is an example.

Gas condensation. For stable temperature it is ##\Delta S = nR\ln(V_f/V_i)##,
where ##n## is the number of moles, ##V_i## the initial volume, and ##V_f## the final volume.
If you extract an amount of heat, then ##Q<0 \Rightarrow \Delta S<0 \Rightarrow V_f<V_i##.
A smaller volume means a more "ordered" situation.
You mean "compression," not "condensation," right? Your equation is only for an ideal gas at constant temperature, not for a real gas.
 
  • #39
binis said:
Gas condensation. For stable temperature it is ##\Delta S = nR\ln(V_f/V_i)##,
where ##n## is the number of moles, ##V_i## the initial volume, and ##V_f## the final volume.
If you extract an amount of heat, then ##Q<0 \Rightarrow \Delta S<0 \Rightarrow V_f<V_i##.
A smaller volume means a more "ordered" situation.
Well, an ideal gas can't really condense. There are no forces between the gas molecules.

Even in the real world we have to be able to discuss other processes than phase changes.

What I actually reacted to was that you specified constant T, but at the same time said that the random motion of the molecules (the average kinetic energy, or the internal energy) increases due to added heat.
For an ideal gas that would be a contradiction, since the internal energy depends only on T.

So, clearly, if you add heat to an ideal gas in an isothermal process, the entropy increase is not due to increased motion of the molecules. There's something else going on.

As Chestermiller has just pointed out, you've stated the expression for the entropy change of an ideal gas at constant T. You could actually derive this starting from the basic formula for entropy change.
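A numerical sketch of that derivation (my own illustration, assuming a reversible isothermal expansion of an ideal gas): since dU = 0 at constant T, dQ_rev = p dV, so ΔS = Q_rev/T should come out equal to nR ln(V_f/V_i).

```python
import math

n, R, T = 1.0, 8.314, 300.0  # moles, gas constant (J/(mol·K)), temperature (K)
V_i, V_f = 1.0, 2.0          # volume doubles (arbitrary units)

# Approximate Q_rev = ∫ p dV with p = nRT/V, using midpoint steps.
steps = 100_000
dV = (V_f - V_i) / steps
Q_rev = sum(n * R * T / (V_i + (k + 0.5) * dV) * dV for k in range(steps))

delta_S_from_Q = Q_rev / T                    # Clausius: ΔS = Q_rev / T
delta_S_formula = n * R * math.log(V_f / V_i) # ΔS = n R ln(V_f/V_i)
print(delta_S_from_Q, delta_S_formula)        # both ≈ 5.76 J/K
```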
 
  • #40
Philip Koeck said:
What I actually reacted to was that you specified constant T, but at the same time said that the random motion of the molecules (the average kinetic energy, or the internal energy) increases due to added heat.
I mean the randomness increases, not the motion.
Philip Koeck said:
So, clearly, if you add heat to an ideal gas in an isothermal process entropy increase is not due to increased motion of the molecules. There's something else going on.
Yes, the motion becomes more random. Note that the entropy concept is aligned with mathematical probability theory.
 
  • #41
binis said:
Yes, the motion becomes more random.
What does it look like when the motion of gas molecules is more random?
 
  • #42
Philip Koeck said:
What does it look like when the motion of gas molecules is more random?
It looks less oriented.
 
  • #43
binis said:
It looks less oriented.
The motion of gas molecules never has a preferred direction when the gas is in equilibrium.
How can it become less oriented?
 
  • #44
Philip Koeck said:
The motion of gas molecules never has a preferred direction when the gas is in equilibrium.
How can it become less oriented?
When motion is limited to a smaller volume, we consider it less random. Anyway, this is my perception of the Clausius entropy. You know that there are also the Boltzmann, Shannon, Rényi, Tsallis and other definitions. There is a large debate about what entropy really is.
 
  • #45
binis said:
When motion is limited to a smaller volume, we consider it less random. Anyway, this is my perception of the Clausius entropy. You know that there are also the Boltzmann, Shannon, Rényi, Tsallis and other definitions. There is a large debate about what entropy really is.
I think we've arrived at an important conclusion.
Analysis of the Carnot process leads to a state function, S, and we can calculate the change of S in a reversible process using dS = dQ / T.
If we apply this to a reversible, isothermal process for an ideal gas it turns out that ΔS is related to a change of volume only, not an increase in random motion, for example.
 
