Entropy as log of omega (phase space volume)

In summary, the problem asks one to show that an entropy S which depends only on the volume of the energy shell can be written as a constant times the logarithm of that volume. This follows by considering two subsystems and using the additivity of S together with the multiplicative character of the volume; solving the resulting differential equation shows that S must be a logarithmic function.
  • #1
diegzumillo

Homework Statement


I've seen this problem appear in more than one textbook almost without any changes. It goes like this:
Assume the entropy ##S## depends on the volume ##\bar{\Omega}## inside the energy shell: ##S(\bar{\Omega})=f(\bar{\Omega})##. Show that from the additivity of ##S## and the multiplicative character of ##\bar{\Omega}##, it follows that ##S = \text{const} \times \log \bar{\Omega}##.

Homework Equations

The Attempt at a Solution


I've found a couple of solutions already (one in Pathria), which consist of considering two subsystems and calculating the derivatives of S with respect to the omegas, plus a bunch of assumptions. But from the problem statement I can't help but think there's got to be a simpler way. I'm trying to expand S as a sum of ##f(\Omega_i)##, expand each f as a power series, and then use the fact that omega is the product of the omegas of the subsystems. But that leads nowhere.

Slightly off-topic: I'm taking a grad course on statistical mechanics but my previous knowledge on stat mech is very weak, so I'll probably be on these forums frequently throughout the semester. Is this the appropriate forum for stat mech homework questions?
 
  • #2
I don't see how power series would be easier than the described approach.

Looking at two subsystems together leads to ##f(xy)=f(x)+f(y)## (where x, y are the Ω of two different systems). Differentiating with respect to x gives ##yf'(xy)=f'(x)##; differentiating this with respect to y leads to ##f'(xy)+xyf''(xy)=0##, or (using z=xy) ##f'(z) = -zf''(z)##. This differential equation is solved by ##f'(z)=\frac{c}{z}##, which integrates to ##f(z)=c \log(z)## (the integration constant vanishes, since setting x=y=1 in ##f(xy)=f(x)+f(y)## gives ##f(1)=0##).
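For anyone who wants to check the algebra, here is a minimal sketch (not part of the original thread) using Python's sympy: it solves the same differential equation symbolically and verifies that ##f(z)=c\log(z)## satisfies the functional equation ##f(xy)=f(x)+f(y)##.

```python
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
c = sp.Symbol('c')
f = sp.Function('f')

# ODE obtained in the derivation above: f'(z) + z*f''(z) = 0
ode = sp.Eq(f(z).diff(z) + z * f(z).diff(z, 2), 0)
print(sp.dsolve(ode, f(z)))  # expect something like f(z) = C1 + C2*log(z)

# Verify that f(z) = c*log(z) satisfies f(x*y) = f(x) + f(y)
g = c * sp.log(z)
difference = g.subs(z, x * y) - g.subs(z, x) - g.subs(z, y)
print(sp.simplify(sp.expand_log(difference)))  # 0
```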

diegzumillo said:
Is this the appropriate forum for stat mech homework questions?
It is.
 
  • #3
Thanks :) That makes perfect sense. I solved it in a similar way but made some unnecessary turns here and there and it made things look more complicated.
 

1. What is the definition of entropy as log of omega (phase space volume)?

Entropy as the log of omega (phase space volume) refers to Boltzmann's formula ##S = k_B \ln \Omega##, which quantifies the amount of disorder or randomness in a system. Here ##\Omega## is the number of microstates (or the phase-space volume) accessible to the system, and the entropy is Boltzmann's constant times the logarithm of this number.

2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that the disorder or randomness in the system tends to increase, which is reflected in the growth of the accessible phase space volume and hence of its logarithm.

3. What is the significance of using logarithm in the expression for entropy?

The use of the logarithm in the expression for entropy allows for a more convenient representation of the phase space volume, which can be extremely large. It also makes entropy additive: when two independent systems are combined, their phase-space volumes multiply, and the logarithm turns that product into a sum of the subsystem entropies.
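As a small illustration (not from the thread, with made-up phase-space volumes), here is a numerical check in Python that with ##S = k_B \ln \Omega## the entropy of a combined system equals the sum of the subsystem entropies:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def entropy(omega: float) -> float:
    """Boltzmann entropy S = k_B * ln(omega) for a phase-space volume omega."""
    return k_B * math.log(omega)

# Two illustrative (made-up) phase-space volumes for independent subsystems
omega_1, omega_2 = 1e20, 3e25

S_combined = entropy(omega_1 * omega_2)           # volume of the combined system multiplies
S_additive = entropy(omega_1) + entropy(omega_2)  # entropies of the parts add

print(S_combined, S_additive)  # identical up to floating-point rounding
```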

4. Can entropy ever decrease in a system?

According to the second law of thermodynamics, the total entropy of an isolated system will never decrease. However, the entropy of one part of the system can decrease, as long as the entropy of the rest increases by at least as much.

5. How does entropy relate to the concept of disorder?

Entropy and disorder are closely related, as entropy is a measure of the amount of disorder in a system. A high entropy value indicates a high level of disorder, while a low entropy value indicates a more ordered system. This is because a system with more microstates has a higher level of randomness and therefore a higher entropy value.
