How Does Isothermal Expansion Increase Entropy Despite Quantized Energy Levels?

  • Context: Graduate
  • Thread starter: gemma786
  • Tags: Entropy
SUMMARY

This discussion clarifies the relationship between entropy, probability, and quantized energy levels in statistical mechanics. Starting from the Gibbs formula S = -k ∑ p_i log p_i, the thread addresses the original poster's worry that the formula seems to assign less entropy to less probable states, and shows how entropy derives from counting microstates: taking the logarithm of the microstate count, S = log(N), turns the most probable compound state into the one that maximizes the sum of the subsystem entropies. The thread also addresses the apparent contradiction of entropy increasing during isothermal expansion despite quantized energy levels, emphasizing that expansion makes new energy states accessible.

PREREQUISITES
  • Understanding of statistical mechanics principles
  • Familiarity with the concept of entropy and its mathematical representation
  • Knowledge of microstates and macrostates in thermodynamics
  • Basic grasp of quantum mechanics and quantized energy levels
NEXT STEPS
  • Study the derivation of the entropy formula S = -k ∑ p_i log p_i
  • Explore the implications of the Boltzmann distribution in statistical mechanics
  • Investigate the concept of isothermal processes in thermodynamics
  • Learn about the relationship between quantum mechanics and thermodynamic properties
USEFUL FOR

Students and professionals in physics, particularly those focusing on thermodynamics, statistical mechanics, and quantum mechanics, will benefit from this discussion.

gemma786
Hi,
I want to know a concrete qualitative definition of entropy.
If we define it as a measure of randomness (disorder) in a system, then intuition suggests that a system in a less probable microstate should have greater entropy. But statistical mechanics gives
S = -k ∑ p_i log p_i
Doesn't that mean a system with lower probability has less entropy?
And how can the above equation be reduced to
S = k log N?
Please help me out of this confusion.
Thanks.
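The two formulas are consistent: when all N microstates are equally likely (p_i = 1/N), the Gibbs sum reduces to +k log N, with a positive sign. A quick numerical sketch (Boltzmann's constant set to 1 for illustration):

```python
import math

# For N equally likely microstates, p_i = 1/N for each i, and the
# Gibbs formula S = -k * sum(p_i * log(p_i)) reduces to S = k * log(N).
# (Boltzmann's constant set to 1 here.)
N = 1000
p = [1.0 / N] * N

S_gibbs = -sum(pi * math.log(pi) for pi in p)
S_boltzmann = math.log(N)

print(S_gibbs, S_boltzmann)  # both ≈ 6.907755
```

Note that the overall minus sign in the Gibbs formula cancels the negative log of each probability, so the result is positive.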
 
First let me explain why entropy is related to probability. Suppose you have two systems, each in an unknown macrostate, X1 and X2 respectively. (For a single isolated system this comparison makes no sense.)
These two macrostates have some number of microstate realizations, N1(X1) and N2(X2). The compound state (X1, X2) therefore has N1*N2 realizations, which is also proportional to the probability of the compound state p(X1, X2).
So the most likely state is the one with N1*N2 -> max.
To avoid the multiplication we introduce S = log(N), and the condition becomes S_1 + S_2 -> max (which is no more than saying we want the most likely compound state).
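A small sketch of that equivalence, with made-up macrostate labels and microstate counts: because log is monotonic and log(a*b) = log(a) + log(b), maximizing the product N1*N2 picks out the same compound state as maximizing S1 + S2.

```python
import math

# Two subsystems with hypothetical macrostates and microstate counts N1(X), N2(X).
# Maximizing the product N1*N2 is equivalent to maximizing S1+S2 = log N1 + log N2,
# since log is monotonic and log(a*b) = log(a) + log(b).
N1 = {"A": 10, "B": 250, "C": 40}
N2 = {"D": 5, "E": 120}

best_by_product = max(((x1, x2) for x1 in N1 for x2 in N2),
                      key=lambda pair: N1[pair[0]] * N2[pair[1]])
best_by_entropy = max(((x1, x2) for x1 in N1 for x2 in N2),
                      key=lambda pair: math.log(N1[pair[0]]) + math.log(N2[pair[1]]))

print(best_by_product, best_by_entropy)  # both ('B', 'E')
```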

In our model each system has (energy) microstates k1, k2, ..., kn. One macrostate X specifies how many particles are in each of these states. So
X = a1 particles in state k1; a2 particles in state k2; ...
There are (a1+a2+...+an)!/(a1! a2! ... an!) ways to distribute the particles this way (the multinomial coefficient).
If you now define p_i = a_i/(sum a_i), take S as the log of that count, and use Stirling's approximation [itex]\ln a!\approx a\ln a - a[/itex], then you get the equation for entropy
[tex]S\propto -\sum_i p_i\ln(p_i)[/tex]

You see it's all connected and derives from pure probability theory.
 
Thanks Gerenuk.
I have another question:
In statistical mechanics it is said that molecular motional energies are quantised, and that in a given microstate of a macrostate the molecules have a definite arrangement among energy levels consistent with the total energy of that macrostate. If molecular motional energies are quantised, that means the energy levels themselves are quantised, isn't it? Then how is it possible that isothermal expansion of a gas into vacuum adds new energy levels, and thus increases entropy, in spite of the quantisation of the energy levels?
 
