Is entropy the volume in phase space of energy E or LESS than E?

AI Thread Summary
The discussion centers on the definition of entropy in statistical mechanics, specifically regarding the microcanonical ensemble. It contrasts two definitions: one where entropy is based on the volume of phase space with energy equal to E (S = k ln Ω) and another where it is based on the volume of phase space with energy less than E (S = k ln Σ(E)). The confusion arises from the apparent conflict between these definitions and their implications for systems where lowering energy increases entropy. A referenced source suggests that both definitions are equivalent up to a constant dependent on the number of particles, which is not clearly understood by the participants. The discussion highlights the complexities in reconciling these definitions within the framework of statistical mechanics.
nonequilibrium
Hello,

I thought the statistical definition of entropy for an isolated system of energy E (i.e. the microcanonical ensemble) was $S = k \ln \Omega$, where $\Omega$ is the volume in phase space of all the microstates with energy $E$.

However, if you take a look here http://en.wikipedia.org/wiki/Equipartition_theorem#The_microcanonical_ensemble
there is the line
"... Similarly, $\Sigma(E)$ is defined to be the total volume of phase space where the energy is less than $E$ ... By the usual definitions of statistical mechanics, the entropy $S$ equals $k_B \log \Sigma(E)$ ..."

so they use the volume of phase space where the energy is less than $E$ instead of the surface where the energy equals $E$. Do these notions coincide? I would think they conflict. And why do they say "by the usual definitions"? I'm confused.
 
I'm finding a source (Huang, Statistical Mechanics, 2nd edition, p. 134) stating that $S = k \log \Omega$ and $S = k \log \Sigma$ are indeed equivalent up to a constant dependent on $N$. I don't follow the reason for that, as the text is a bit too advanced for me at the moment.
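One way to make the equivalence plausible (a numerical sketch, not Huang's actual argument; the function names below are made up for illustration): in very high dimension, almost all of a ball's volume sits in a thin shell just under its surface, so the log-volume of the whole ball and the log-volume of a thin surface shell become indistinguishable as the dimension grows.

```python
import math

def log_ball_volume(n, R=1.0):
    # ln of the volume of an n-dimensional ball of radius R:
    #   V_n(R) = pi^(n/2) * R^n / Gamma(n/2 + 1)
    # lgamma avoids overflow for large n.
    return (n / 2) * math.log(math.pi) + n * math.log(R) - math.lgamma(n / 2 + 1)

def log_shell_volume(n, R=1.0, eps=0.01):
    # ln of the volume of the thin shell R*(1 - eps) < r < R:
    #   V_n(R) - V_n(R*(1-eps)) = V_n(R) * (1 - (1 - eps)^n)
    return log_ball_volume(n, R) + math.log(1 - (1 - eps) ** n)

for n in (3, 30, 300, 3000):
    gap = log_ball_volume(n) - log_shell_volume(n)
    # The gap shrinks toward zero as the dimension n grows:
    # the shell carries essentially all of the ball's volume.
    print(f"n = {n:5d}   ln(ball) - ln(shell) = {gap:.6f}")
```

For $n = 3$ the shell holds only a few percent of the volume, so the logs differ noticeably; by $n = 3000$ (a modest particle number, with phase-space dimension $3N$) the difference is numerically zero, which is the geometric reason "volume below $E$" and "shell at $E$" can give the same entropy per particle.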

In a way I'm willing to accept the equivalence (it would clear up my problem), but one thing bothers me. Take a state of a system such that lowering the energy raises the entropy (think of a system with bounded energy). Doesn't the $\Sigma(E)$ definition (the volume in phase space where the energy is less than $E$) make this behavior impossible, since by definition $E_1 < E_2 \Rightarrow \Sigma(E_1) < \Sigma(E_2) \Rightarrow S(E_1) < S(E_2)$?
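For what it's worth, the standard ideal-gas computation (a textbook sanity check, not from this thread) makes the size of the discrepancy between the two definitions explicit. For $N$ free particles of mass $m$ in a volume $V$,
\[
  \Sigma(E) \;=\; \frac{V^N}{N!\,h^{3N}}\,
  \frac{\pi^{3N/2}}{\Gamma\!\left(\tfrac{3N}{2}+1\right)}\,(2mE)^{3N/2}
  \;\propto\; E^{3N/2},
\]
while the shell of width $\Delta$ at energy $E$ has volume
\[
  \Omega(E) \;=\; \frac{\partial \Sigma}{\partial E}\,\Delta
  \;\propto\; \tfrac{3N}{2}\,E^{3N/2 - 1}\,\Delta ,
\]
so that
\[
  k \ln \Sigma(E) - k \ln \Omega(E) \;=\; k \ln \frac{2E}{3N\Delta},
\]
which is of order $k \ln N$ and hence negligible against the extensive entropy $S \sim Nk$.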

What am I overlooking?
 