Exploring Entropy in Intro to Statistical Physics by Huang

In summary, the entropy section of Huang's Introduction to Statistical Physics discusses Clausius' theorem and the relationship between reversible and irreversible paths. Choosing the closed cycle in the opposite direction appears to yield a contradictory inequality, but there is a restriction when irreversible paths are involved: an irreversible path cannot simply be traversed in reverse, so the forward and reverse paths are never identical.
  • #1
Haorong Wu
TL;DR Summary
Are there any restrictions on choosing cycles in Clausius' theorem?
Hi, I am currently reading Introduction to Statistical Physics by Huang. In the section on entropy, it reads:

Let ##P## be an arbitrary path from ##A## to ##B##, reversible or not. Let ##R## be a reversible path with the same endpoints. Then the combined process ##P-R## is a closed cycle, and therefore by Clausius' theorem ##\int_{P-R} dQ/T \leq 0##, or
##\int_{P} \frac {dQ} {T} \leq \int_{R} \frac {dQ} {T}##.
Since the right side is the definition of the entropy difference between the final state ##B## and the initial state ##A##, we have ##S \left ( B \right ) - S \left ( A \right ) \geq \int_{A}^{B} \frac {dQ} {T}## where the equality holds if the process is reversible.
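The step from the cycle inequality to the path inequality is worth spelling out: traversing the reversible path ##R## backward flips the sign of its integral, so

[tex]\int_{P-R} \frac{dQ}{T} = \int_{P} \frac{dQ}{T} - \int_{R} \frac{dQ}{T} \leq 0,[/tex]

which rearranges to ##\int_{P} \frac{dQ}{T} \leq \int_{R} \frac{dQ}{T}##. Note that the sign flip is legitimate only because ##R## is reversible.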

But what if I choose ##R-P## as the closed cycle? Then by a similar argument I should have ##\int_{R} \frac {dQ} {T} \leq \int_{P} \frac {dQ} {T}## and ##S \left ( B \right ) - S \left ( A \right ) \leq \int_{A}^{B} \frac {dQ} {T}##, which contradicts the equations above. I am not sure what goes wrong. Maybe there are restrictions on how the closed cycle may be chosen, but I did not find any relevant discussion in the book.
 
  • #2
In your setting you should have
[tex]\int_R \frac{dQ}{T} \leq -\int_{-P}\frac{dQ}{T}[/tex] where
[tex]-\int_{-P} \frac{dQ}{T} \neq \int_{P}\frac{dQ}{T}[/tex]
because P is not a reversible process.
 
  • #3
anuttarasammyak said:
In your setting you should have
[tex]\int_R \frac{dQ}{T} \leq -\int_{-P}\frac{dQ}{T}[/tex] where
[tex]-\int_{-P} \frac{dQ}{T} \neq \int_{P}\frac{dQ}{T}[/tex]
because P is not a reversible process.

Thanks, anuttarasammyak. I understand it now.
 
  • #4
Haorong Wu said:
Summary:: Are there any restrictions on choosing cycles in Clausius' theorem?

Hi, I am currently reading Introduction to Statistical Physics by Huang. In the section on entropy, it reads
But what if I choose ##R-P## as the closed cycle? Then by a similar argument I should have ##\int_{R} \frac {dQ} {T} \leq \int_{P} \frac {dQ} {T}## and ##S \left ( B \right ) - S \left ( A \right ) \leq \int_{A}^{B} \frac {dQ} {T}##, which contradicts the equations above. I am not sure what goes wrong. Maybe there are restrictions on how the closed cycle may be chosen, but I did not find any relevant discussion in the book.
For an irreversible path ##P##, the reverse path cannot be obtained by simply running the forward path backward. If the process occurs spontaneously in the forward direction, it will not occur spontaneously in reverse, and you cannot even force the system to retrace the exact reverse path.
 
  • #5
Chestermiller said:
For an irreversible path ##P##, the reverse path cannot be obtained by simply running the forward path backward. If the process occurs spontaneously in the forward direction, it will not occur spontaneously in reverse, and you cannot even force the system to retrace the exact reverse path.

Thanks, Chestermiller. I am just starting to learn statistical physics, and I appreciate you pointing out this important point.
 

1. What is entropy and why is it important in statistical physics?

Entropy is a measure of the disorder or randomness of a system. In statistical physics, it is used to describe the distribution of energy and particles within a system. It is important because it helps us understand the behavior of complex systems and predict their future states.

2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. As a system becomes more disordered, its entropy increases; in this sense, entropy growth is a measure of the irreversibility of a process.
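As a concrete numerical check (a minimal sketch; the free-expansion setup and particle count are illustrative choices, not from the thread), consider an ideal gas expanding freely into twice its volume: each particle can now be in either half of the container, so the microstate count grows by a factor of 2^N and the entropy change is positive, as the second law requires:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def delta_S_free_expansion(N):
    """Entropy change when N ideal-gas particles expand freely
    into twice the volume: W_final / W_initial = 2**N,
    so dS = k_B * ln(2**N) = N * k_B * ln 2."""
    return N * k_B * math.log(2)


# For one mole of gas (Avogadro's number of particles),
# dS = R * ln 2, about 5.76 J/K -- strictly positive.
N_A = 6.02214076e23
dS = delta_S_free_expansion(N_A)
```

Because the expansion is irreversible, this entropy is generated without any heat flowing in, which is exactly the strict-inequality case of the Clausius relation discussed above.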

3. Can entropy be negative?

No. The Boltzmann entropy S = k ln(W) cannot be negative, because a system always has at least one microstate (W ≥ 1). The entropy of a subsystem can decrease, for example when it releases heat to its surroundings, but the total entropy of an isolated system cannot decrease, by the second law of thermodynamics.

4. How is entropy calculated in statistical physics?

In statistical physics, entropy is calculated using the Boltzmann formula: S = k ln(W), where S is the entropy, k is the Boltzmann constant, and W is the number of microstates (possible arrangements of particles) in a given macrostate (overall state of the system).
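A direct transcription of the Boltzmann formula (the microstate counts used below are arbitrary illustrative values):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def boltzmann_entropy(W):
    """S = k ln(W) for a macrostate with W microstates."""
    if W < 1:
        raise ValueError("W must be at least 1")
    return k_B * math.log(W)


# A macrostate with a single microstate has zero entropy.
S0 = boltzmann_entropy(1)

# Entropy is additive for independent subsystems, because the
# combined microstate count multiplies: W = W1 * W2 implies
# S(W1 * W2) = S(W1) + S(W2).
S_combined = boltzmann_entropy(100)
S_parts = boltzmann_entropy(10) + boltzmann_entropy(10)
```

The logarithm is what makes entropy extensive: multiplying microstate counts of independent subsystems turns into adding their entropies.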

5. How does the concept of entropy apply to real-world systems?

The concept of entropy applies to a wide range of real-world systems, from chemical reactions to the behavior of gases. It helps us understand the direction and efficiency of processes such as heat transfer and chemical reactions. For example, heat flows spontaneously from a hot body to a cold one because that direction increases total entropy, and engines that generate a lot of entropy waste more of their input heat as unusable energy.
