How Does Isothermal Expansion Increase Entropy Despite Quantized Energy Levels?

  • Thread starter gemma786
  • Tags
    Entropy
In summary, statistical mechanics treats molecular motional energies as quantised, so the energy levels are quantised too. Nevertheless, when a gas expands isothermally into a vacuum the level spacing shrinks as the volume grows, so more levels become accessible and the entropy increases.
  • #1
gemma786
Hi,
I want to know a concrete qualitative definition of entropy.
If we define it as a measure of randomness (disorder) in a system, then intuitively a system that is less probable to be found in a given microstate should have greater entropy. But according to statistical mechanics
[tex]S=-k\sum_i p_i \ln p_i[/tex]
which seems to say that a less probable system has less entropy, doesn't it?
And how can the above equation be reduced to
[tex]S=k\ln N[/tex]?
Please help me get out of this confusion,
thanks.
 
  • #2
First let me explain why entropy is related to probability. If you have two systems, each will be in some unknown macrostate, X1 and X2 respectively. (For a single isolated system this maximization argument makes no sense.)
These two macrostates have some number of microstate realizations, N1(X1) and N2(X2), so the compound state (X1, X2) has N1*N2 realizations, which is also proportional to the probability p(X1, X2) of the compound state.
So the most likely compound state is the one with N1*N2 -> max.
To turn the product into a sum we define S = log(N), and the condition becomes S_1 + S_2 -> max (which is no more than saying we want the most likely compound state).

In our model each system has (energy) microstates k1, k2, ..., kn. One macrostate X specifies how many particles are in each of these states, so
X = a1 particles in state k1; a2 particles in state k2; ...
The number of ways to distribute the (a1+a2+...+an) particles like this is (sum a_i)!/(a1!a2!...an!) (the multinomial coefficient).
If you now define p_i = a_i/(sum a_i) and use Stirling's approximation [itex]\ln a!\approx a\ln a - a[/itex], you get the equation for entropy
[tex]S\propto -\sum_i p_i\ln(p_i)[/tex]

You see it's all connected and derives from pure probability theory.
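The Stirling step in that derivation is easy to check numerically. Here is a minimal sketch (not from the thread; the function names are mine) comparing the exact log of the multinomial coefficient, computed with `math.lgamma`, against the approximation N * (-sum p_i ln p_i):

```python
import math

def log_multinomial(counts):
    """Exact ln[(sum a_i)! / (a1! a2! ... an!)] via the log-gamma function."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(a + 1) for a in counts)

def stirling_entropy(counts):
    """Stirling approximation: N * (-sum p_i ln p_i) with p_i = a_i / N."""
    n = sum(counts)
    return -n * sum((a / n) * math.log(a / n) for a in counts if a > 0)

counts = [400, 300, 200, 100]  # a_i: number of particles in each energy level
exact = log_multinomial(counts)
approx = stirling_entropy(counts)
print(exact, approx)  # the two agree to within about 1% for counts this large
```

The relative error shrinks as the occupation numbers grow, which is why the approximation is harmless for macroscopic particle numbers.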
 
  • #3
Thanks Gerenuk.
I have another question:
In statistical mechanics it is said that molecular motional energies are quantised, and in a given microstate of a macrostate the particles have a definite arrangement among the energy levels consistent with the total energy of that macrostate. If molecular motional energies are quantised, then the energy levels are quantised too, aren't they? So how can the isothermal expansion of a gas into a vacuum add new energy levels, and thereby increase the entropy, despite the quantisation of the energy levels?
 
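One way to probe this question numerically (a sketch with made-up demo values, not part of the original thread): for a particle in a one-dimensional box of length L, the levels are E_n = n^2 h^2 / (8 m L^2), so doubling L lowers every level by a factor of four and, in 1-D, roughly doubles the number of levels that fit below any fixed energy. The levels are not "new"; they just become denser as the box expands.

```python
import math

H = 6.626e-34  # Planck constant, J s
M = 6.6e-27    # mass of a helium atom, kg (illustrative choice)

def levels_below(e_max, box_length):
    """Number of 1-D particle-in-a-box levels with E_n <= e_max.

    From E_n = n^2 h^2 / (8 m L^2):  n_max = L * sqrt(8 m E_max) / h.
    """
    return int(math.sqrt(8 * M * e_max) * box_length / H)

E = 1e-21                     # a thermal-scale energy, J
n1 = levels_below(E, 1e-9)    # 1 nm box
n2 = levels_below(E, 2e-9)    # box expanded to 2 nm
print(n1, n2)  # roughly twice as many accessible levels after doubling L
```

In three dimensions the count below a fixed energy scales with the volume, which is what ties the expansion to the entropy increase.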

What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a fundamental concept in thermodynamics and statistical mechanics, and is often used to describe the direction and efficiency of energy transfer and transformation.

How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system will never decrease over time. This means that in any natural process, the entropy of the universe will always increase, or at best remain constant.

What is the difference between macroscopic and microscopic entropy?

Macroscopic entropy refers to the overall disorder or randomness of a large-scale system, such as a room or a chemical reaction. Microscopic entropy, on the other hand, refers to the disorder or randomness at the molecular or atomic level. While the total entropy tends to increase over time, the entropy of a subsystem can decrease in certain cases, for example when heat flows out of it, provided the entropy of the surroundings increases by at least as much.

How is entropy calculated?

The exact calculation of entropy depends on the system and the specific variables being considered. For a reversible process, the entropy change is the heat transferred divided by the temperature at which the transfer occurs (ΔS = Q_rev/T). It can also be calculated with statistical methods, by counting the number of possible arrangements or microstates of a system.
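Both routes can be sketched in a few lines. This is an illustration with made-up numbers, not a general-purpose calculator: the thermodynamic route uses ΔS = Q_rev/T, and the statistical route uses Boltzmann's S = k ln W for W equally likely microstates.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def delta_s_thermo(q_rev, temperature):
    """Thermodynamic route: dS = Q_rev / T for reversible heat transfer at temperature T."""
    return q_rev / temperature

def s_boltzmann(num_microstates):
    """Statistical route: S = k ln W for W equally likely microstates."""
    return K_B * math.log(num_microstates)

print(delta_s_thermo(300.0, 300.0))  # 300 J absorbed reversibly at 300 K -> 1.0 J/K
print(s_boltzmann(10**6))            # entropy of a toy system with a million microstates
```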

What are some real-world examples of entropy?

Some common examples of entropy include the diffusion of gases, the melting of ice into liquid water, and the expansion of a gas into a larger volume. These processes all involve an increase in disorder or randomness, and therefore an increase in entropy. Additionally, the decay of radioactive atoms and the mixing of different substances are also examples of entropy in action.
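The gas-expansion example above can be made quantitative: for an ideal gas expanding isothermally, ΔS = nR ln(V2/V1). A short sketch with illustrative values (the function name is mine):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_s_isothermal(n_moles, v_initial, v_final):
    """Entropy change of an ideal gas in isothermal expansion: dS = n R ln(V2/V1)."""
    return n_moles * R * math.log(v_final / v_initial)

ds = delta_s_isothermal(1.0, 1.0, 2.0)  # one mole doubling its volume
print(ds)  # R ln 2, about 5.76 J/K
```

Note the sign: compression (V2 < V1) gives a negative ΔS for the gas, consistent with heat flowing out of it.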
