Why is entropy defined as the ratio of heat to temperature?

SUMMARY

The discussion centers on the definition of entropy as S = Q/T, where Q represents heat and T represents temperature. Participants seek to understand the rationale behind this formula, particularly how it relates to concepts of randomness and probability within thermodynamic systems. The conversation highlights the importance of entropy in distinguishing states of energy distribution and its implications in both physical and economic systems. Key insights include the relationship between heat transfer and entropy change, as well as the historical context of entropy's formulation in thermodynamics.

PREREQUISITES
  • Understanding of basic thermodynamic concepts, including heat (Q) and temperature (T).
  • Familiarity with the laws of thermodynamics, particularly the second law.
  • Knowledge of state functions and reversible processes in thermodynamics.
  • Basic mathematical skills for interpreting differential equations and integrals.
NEXT STEPS
  • Study the derivation of the entropy formula S = Q/T in the context of reversible processes.
  • Explore the implications of the second law of thermodynamics on entropy and energy distribution.
  • Research the relationship between entropy and statistical mechanics to understand randomness in systems.
  • Examine real-world applications of entropy in fields such as chemistry and economics.
USEFUL FOR

Students and professionals in physics, chemistry, and engineering, as well as economists interested in the concept of entropy as a measure of distribution and disorder in systems.

  • #31
The thermal energy of a classical ideal gas is ##U=f k_{\text{B}} N T/2##, where ##f## is the number of "relevant degrees of freedom" (##f=3## for a monatomic, ##f=5## for a diatomic, and ##f=6## for a polyatomic gas), ##k_{\text{B}}## is Boltzmann's constant, ##N## is the number of particles/molecules, and ##T## is the (absolute) temperature.

Then you have the ideal-gas law ##p V=N k_{\text{B}} T##, which differs from ##U## only by the factor ##f/2##, i.e. ##U = (f/2)\,pV##.
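A minimal numerical check of that factor (a sketch assuming one mole of a monatomic ideal gas, ##f = 3##, at an illustrative 300 K; the particular numbers don't matter):

```python
# Minimal check: compare U = f*N*kB*T/2 with p*V = N*kB*T
# for one mole of a monatomic ideal gas (f = 3) at 300 K (illustrative values).
kB = 1.380649e-23    # Boltzmann constant, J/K
NA = 6.02214076e23   # Avogadro constant, 1/mol

f = 3                # degrees of freedom for a monatomic gas
N = 1.0 * NA         # number of particles in one mole
T = 300.0            # absolute temperature, K

U = f * N * kB * T / 2   # thermal energy
pV = N * kB * T          # ideal-gas law: p*V

print(f"U    = {U:.1f} J")
print(f"p*V  = {pV:.1f} J")
print(f"U/pV = {U / pV:.2f}")   # equals f/2 = 1.5
```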
 
  • #32
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T constant, if you extract an amount of heat dQ from the system, you increase the order of the molecules. If you add an amount dQ to the system, then you increase the disorder of the molecules (random motion).
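As a small illustration of why the integration matters, consider heating something whose temperature changes: with ##dQ = m c\, dT## the integral of ##dQ/T## gives ##\Delta S = m c \ln(T_f/T_i)##, not simply the total heat divided by one temperature. A minimal sketch, assuming 1 kg of water with a roughly constant specific heat of 4186 J/(kg·K) heated from 300 K to 350 K (illustrative numbers only):

```python
# Minimal sketch: integrating dS = dQ/T when T changes during heating.
# Assumes 1 kg of water with an (approximately) constant specific heat.
import math

m = 1.0        # mass, kg (assumed)
c = 4186.0     # specific heat of water, J/(kg*K), roughly constant
Ti, Tf = 300.0, 350.0   # initial and final temperatures, K (illustrative)

Q = m * c * (Tf - Ti)              # total heat added
dS = m * c * math.log(Tf / Ti)     # integral of dQ/T with dQ = m*c*dT
naive = Q / (0.5 * (Ti + Tf))      # dividing Q by an average T, for comparison

print(f"Q                  = {Q:.0f} J")
print(f"Delta S (integral) = {dS:.1f} J/K")
print(f"Q / T_average      = {naive:.1f} J/K")
```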
 
  • #33
binis said:
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T constant, if you extract an amount of heat dQ from the system, you increase the order of the molecules. If you add an amount dQ to the system, then you increase the disorder of the molecules (random motion).
Can you please provide an example for this contention?
 
  • #34
binis said:
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T constant, if you extract an amount of heat dQ from the system, you increase the order of the molecules. If you add an amount dQ to the system, then you increase the disorder of the molecules (random motion).
I don't want to give away too much here, but do please follow up on Chestermiller's suggestion.
How would your statement work out for an ideal gas, for example?
 
  • #35
Chestermiller said:
Can you please provide an example for this contention?
Think of boiling water. Molecules transition from a more "ordered" state (liquid) to a less "ordered" state (gas).
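To put rough numbers on this example: because ##T## stays at the boiling point throughout the phase change, the Clausius formula reduces to ##\Delta S = Q/T##. A minimal sketch, assuming 1 kg of water, a latent heat of vaporization of about ##2.26\times 10^6## J/kg and a boiling point of 373 K (approximate values):

```python
# Minimal sketch: entropy of vaporization of water at its boiling point.
m = 1.0       # mass, kg (assumed)
L = 2.26e6    # latent heat of vaporization, J/kg (approximate)
T = 373.0     # boiling point, K (approximate)

Q = m * L     # heat absorbed during the liquid -> gas transition
dS = Q / T    # T is constant, so Delta S = Q/T directly

print(f"Q = {Q:.2e} J, Delta S = {dS:.0f} J/K")   # roughly 6.1e3 J/K
```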
 
  • #36
binis said:
Think of boiling water. Molecules transition from a more "ordered" state (liquid) to a less "ordered" state (gas).
Are you portraying this as a situation where it is difficult to determine the entropy change using the classical equation?
 
  • #37
Philip Koeck said:
I don't want to give away too much here, but do please follow up on Chestermiller's suggestion.
Steam liquefaction is an example.
Philip Koeck said:
How would your statement work out for an ideal gas, for example?
Gas compression. For constant temperature and pressure it is ΔS = nR ln(Vf/Vi),
where n = number of moles, Vi = initial volume, Vf = final volume.
If you extract an amount of heat Q, then Q < 0 ⇒ ΔS < 0 ⇒ Vi > Vf.
A smaller volume means a more "ordered" situation.
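A minimal numerical sketch of that expression, assuming 1 mol of an ideal gas compressed isothermally to half its volume at 300 K (illustrative numbers only):

```python
# Minimal sketch: Delta S = n*R*ln(Vf/Vi) for an isothermal ideal-gas compression.
import math

R = 8.314      # gas constant, J/(mol*K)
n = 1.0        # amount of gas, mol (assumed)
T = 300.0      # constant temperature, K (illustrative)
Vi, Vf = 2.0, 1.0   # initial and final volumes; only the ratio matters

dS = n * R * math.log(Vf / Vi)   # negative, since Vf < Vi
Q = T * dS                       # heat exchanged reversibly at constant T

print(f"Delta S = {dS:.2f} J/K")          # about -5.76 J/K
print(f"Q       = {Q:.0f} J (heat leaves the gas)")
```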
 
  • #38
binis said:
Steam liquefaction is an example.

Gas condensation. For constant temperature and pressure it is ΔS = nR ln(Vf/Vi),
where n = number of moles, Vi = initial volume, Vf = final volume.
If you extract an amount of heat Q, then Q < 0 ⇒ ΔS < 0 ⇒ Vi > Vf.
A smaller volume means a more "ordered" situation.
You mean "compression," not "condensation," right? Your equation is only for an ideal gas at constant temperature, not for a real gas.
 
  • #39
binis said:
Gas condensation. For constant temperature and pressure it is ΔS = nR ln(Vf/Vi),
where n = number of moles, Vi = initial volume, Vf = final volume.
If you extract an amount of heat Q, then Q < 0 ⇒ ΔS < 0 ⇒ Vi > Vf.
A smaller volume means a more "ordered" situation.
Well, an ideal gas can't really condense. There are no forces between the gas molecules.

Even in the real world we have to be able to discuss other processes than phase changes.

What I actually reacted to was that you specified constant T, but at the same time said that the random motion of the molecules (the average kinetic energy or the internal energy) increases due to added heat.
For an ideal gas that would be a contradiction, since the internal energy depends only on T.

So, clearly, if you add heat to an ideal gas in an isothermal process, the entropy increase is not due to increased motion of the molecules. There's something else going on.

As Chestermiller has just pointed out, you've stated the expression for the entropy change of an ideal gas at constant T. You could actually derive this starting from the basic formula for entropy change.
 
  • #40
Philip Koeck said:
What I actually reacted to was that you specified constant T, but at the same time said that the random motion of the molecules (the average kinetic energy or the internal energy) increases due to added heat.
I mean the randomness increases, not the motion.
Philip Koeck said:
So, clearly, if you add heat to an ideal gas in an isothermal process, the entropy increase is not due to increased motion of the molecules. There's something else going on.
Yes, the motion becomes more random. Keep in mind that the entropy concept is closely tied to probability theory.
 
  • #41
binis said:
Yes, the motion becomes more random.
What does it look like when the motion of the gas molecules is more random?
 
  • #42
Philip Koeck said:
What does it look like when the motion of the gas molecules is more random?
It looks less oriented.
 
  • #43
binis said:
It looks less oriented.
The motion of gas molecules never has a preferred direction when the gas is in equilibrium.
How can it become less oriented?
 
  • #44
Philip Koeck said:
The motion of gas molecules never has a preferred direction when the gas is in equilibrium.
How can it become less oriented?
When motion is limited to a smaller volume, we consider it less random. Anyway, this is my perception of Clausius entropy. You know that there are also the Boltzmann, Shannon, Rényi, Tsallis and other definitions. There is a long-running debate about what entropy really is.
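For what it's worth, the Boltzmann definition ##S = k_{\text{B}} \ln W## gives the same volume dependence for an ideal gas at constant ##T##: the momentum part of the microstate count is unchanged, while the positional part scales as ##V^N##. A sketch, ignoring all constant factors:
$$\Delta S = k_{\text{B}} \ln\frac{W_f}{W_i} = N k_{\text{B}} \ln\frac{V_f}{V_i} = n R \ln\frac{V_f}{V_i},$$
which reproduces the Clausius-based expression quoted earlier in the thread.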
 
  • #45
binis said:
When motion is limited to a smaller volume, we consider it less random. Anyway, this is my perception of Clausius entropy. You know that there are also the Boltzmann, Shannon, Rényi, Tsallis and other definitions. There is a long-running debate about what entropy really is.
I think we've arrived at an important conclusion.
Analysis of the Carnot process leads to a state function, S, and we can calculate the change of S in a reversible process using dS = dQ / T.
If we apply this to a reversible, isothermal process for an ideal gas, it turns out that ΔS is related to a change of volume only, not to an increase in random motion, for example.
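Spelling that out as a sketch for a reversible, isothermal process in an ideal gas (where ##dU = 0##, so all the heat added goes into work):
$$\delta Q = p\,dV = \frac{nRT}{V}\,dV \quad\Rightarrow\quad \Delta S = \int \frac{\delta Q}{T} = nR\int_{V_i}^{V_f}\frac{dV}{V} = nR\ln\frac{V_f}{V_i}.$$
At constant ##T## the entropy change depends only on the volume ratio.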
 
