How Does Isothermal Expansion Increase Entropy Despite Quantized Energy Levels?

  • Thread starter: gemma786
  • Tags: Entropy
AI Thread Summary
Entropy is defined as a measure of randomness or disorder in a system, and its relationship to probability is central to understanding it. The original poster's intuition is that a less probable state should have higher entropy, which seems to conflict with the statistical-mechanics formula S = -K∑ [Pi log Pi]. The discussion clarifies that the most likely compound state is the one that maximizes the product of microstate counts, which motivates the definition S = log(N) so that entropies add. A follow-up question asks how isothermal expansion of a gas into vacuum can make new energy levels available, and thus increase entropy, if molecular motional energies are quantized.
gemma786
Hi,
I want to know a concrete qualitative definition of entropy.
If we define it as a measure of randomness (disorder) in a system, then intuitively a system in a less probable microstate should have greater entropy. But according to statistical mechanics,
S = -K∑ [Pi log Pi]
which seems to say that a less probable system has less entropy, doesn't it?
And how is it then possible to rewrite the above equation as
S = K log N?
Please help me get out of this confusion,
thanks.
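(Editor's aside: the two formulas in the question agree when all microstates are equally likely. A minimal Python check, with K set to 1 purely for illustration:)

```python
import math

# For a uniform distribution over N microstates, p_i = 1/N,
# the Gibbs entropy S = -K * sum(p_i * log(p_i)) reduces to K * log(N).
K = 1.0   # Boltzmann's constant, set to 1 for illustration
N = 1000  # made-up number of microstates
p = [1.0 / N] * N

S_gibbs = -K * sum(pi * math.log(pi) for pi in p)
S_boltzmann = K * math.log(N)

assert math.isclose(S_gibbs, S_boltzmann)
```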
 
First let me explain why entropy is related to probability. If you have two systems, then each of them will be in some macrostate, X1 and X2 respectively. For a single isolated system, entropy in this sense makes no difference.
These two macrostates have some number of microstate realizations, N1(X1) and N2(X2). So the compound state (X1, X2) has N1*N2 realizations, which is also proportional to the probability p(X1, X2) of the compound state.
So the most likely state is the one with N1*N2 -> max.
To turn the product into a sum we introduce S = log(N), and now our condition is S_1 + S_2 -> max (which is no more than saying we want the most likely compound state).
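(Editor's aside: the point that multiplicities multiply while entropies add can be checked numerically; the counts N1 and N2 below are made-up illustrative values:)

```python
import math

# Hypothetical microstate counts for two macrostates X1 and X2
N1 = 120   # realizations of macrostate X1
N2 = 3500  # realizations of macrostate X2

# The compound state (X1, X2) has N1 * N2 realizations,
# so defining S = log(N) turns the product into a sum:
S1 = math.log(N1)
S2 = math.log(N2)
S_compound = math.log(N1 * N2)

assert math.isclose(S_compound, S1 + S2)
```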

In our model we have (energy) microstates k1, k2, ..., kn for each system. One macrostate X specifies how many particles are in each of these states. So
X = a1 particles in state k1; a2 particles in state k2; ...
There are (a1+a2+...+an)!/(a1! a2! ... an!) ways to distribute the particles like this (the multinomial coefficient).
If you now define p_i = a_i/(∑ a_i) and use the approximation a! ≈ a^a (Stirling), then you get the equation for entropy
S ∝ -∑_i p_i ln(p_i)

You see it's all connected and derives from pure probability theory.
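(Editor's aside: the Stirling step above can be checked numerically: for large occupation numbers, the log of the multinomial count approaches -N ∑ p_i ln p_i. A sketch with made-up occupation numbers a_i:)

```python
import math

# Hypothetical occupation numbers a_i (how many particles sit in each level k_i)
a = [400, 300, 200, 100]
N = sum(a)

# Exact log of the number of arrangements: log( N! / (a1! a2! ... an!) )
# lgamma(x + 1) = log(x!) avoids overflow for large factorials
log_multinomial = math.lgamma(N + 1) - sum(math.lgamma(ai + 1) for ai in a)

# Stirling approximation a! ~ a^a gives S ≈ -N * sum(p_i * ln p_i)
p = [ai / N for ai in a]
S_approx = -N * sum(pi * math.log(pi) for pi in p)

# The two agree up to lower-order Stirling corrections (within ~1% here)
print(log_multinomial, S_approx)
```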
 
Thanks Gerenuk.
I have another question, which is as follows:
In statistical mechanics it is said that molecular motional energies are quantized, and that in a given microstate of a macrostate the molecules have a definite arrangement among the energy levels consistent with the total energy of that macrostate. If molecular motional energies are quantized, that means the energy levels are also quantized, isn't it? Then how is it possible that isothermal expansion of a gas into vacuum adds new energy levels, and thus causes an increase in entropy, in spite of the quantization of energy levels?
 