Entropy in thermodynamics

In summary: in the microcanonical ensemble entropy is S = k ln Ω; in other ensembles (canonical, grand canonical, etc.) the expression for entropy changes, but the underlying idea is the same. In information theory, entropy is a measure of the uncertainty and randomness in a system and can be calculated using the Shannon entropy formula. In both cases, entropy quantifies the level of disorder or randomness in a system.
  • #1
aaaa202
Entropy is in my book defined as:

S = k ln Ω, where Ω is the multiplicity of the microstates.

Now I have several questions regarding this definition (please try to answer them all! :) )

1) Why do you take the log of the multiplicity? Is it just that additivity, together with the fact that S stays a small number even for a lot of particles, makes this definition much more convenient to work with?
At first I thought so, but ln increases less per step the farther out you are on the x axis. Wouldn't that cause problems?

2) I understand that thermodynamics is described more fundamentally by statistical mechanics. Is entropy on that deeper level still defined as above, or is that just something that also happens to hold? I'm asking because the factor of k seems to indicate that there's more to it than meets the eye at first glance.
I have also seen a few videos discussing entropy in information theory, and there it seems that entropy is something deeper than just an expression for the multiplicity of a system.

That covers all. Thanks :)
 
  • #2
(1) As you already said, the natural log guarantees the additivity that is expected of an extensive quantity such as entropy. More generally, however, you can define an entropy for any probability distribution. In the microcanonical ensemble (fixed N, V, E), the probability of any state with energy E is just 1/Ω(E). For this distribution, S has to be ln Ω. The extra factor of kB was added to give entropy its thermodynamic dimensions (energy per kelvin).
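To make the additivity point concrete, here is a minimal Python sketch (the multiplicity values are arbitrary, chosen only for illustration): the microstates of two independent subsystems combine multiplicatively, so the log turns the product into a sum and the entropies add.

[code]
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega: float) -> float:
    """S = k_B * ln(Omega) for a system with multiplicity Omega."""
    return k_B * math.log(omega)

# Two independent subsystems: the combined system has
# Omega_total = Omega_1 * Omega_2 microstates, so the log
# makes entropy additive, as expected for an extensive quantity.
omega_1, omega_2 = 1e20, 1e30  # illustrative multiplicities
S_combined = entropy(omega_1 * omega_2)
S_sum = entropy(omega_1) + entropy(omega_2)
print(math.isclose(S_combined, S_sum))  # True
[/code]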

(2) Up to a constant, the thermodynamic entropy, the Shannon entropy of information theory, the statistical-mechanical entropy, and the H in Boltzmann's H-theorem are all equivalent.
In statistical mechanics, the entropy in the microcanonical ensemble is given by the expression you wrote.
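As a quick illustration of that equivalence, here is a small Python sketch (the number of microstates is arbitrary): the Shannon entropy of a uniform distribution over Ω equally likely states reduces to ln Ω, which is exactly the microcanonical result in units of kB.

[code]
import math

def shannon_entropy(p):
    """H = -sum(p_i * ln(p_i)), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

omega = 1000                   # number of equally likely microstates
uniform = [1 / omega] * omega  # microcanonical: p_i = 1/Omega

print(shannon_entropy(uniform))  # ~6.9078, i.e. ln(1000)
print(math.log(omega))           # same value, up to float error
[/code]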
 

1. What is entropy in thermodynamics?

Entropy is a measure of the disorder or randomness in a system. In thermodynamics, it is a measure of the amount of energy in a system that is unavailable to do work.

2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that the disorder or randomness within the system tends to increase, and energy becomes more evenly distributed.

3. Can entropy be reversed?

Not for an isolated system as a whole. The second law of thermodynamics states that the total entropy of an isolated system can only increase or stay the same. Local decreases in entropy are possible (a refrigerator cools its interior, for example), but they are always paid for by a larger entropy increase elsewhere, so the overall trend is towards increasing entropy.

4. How does entropy affect energy efficiency?

Entropy limits the amount of energy available to do work. Any heat engine must reject some heat to a cold reservoir so that total entropy does not decrease; as entropy increases, the fraction of energy that can be converted into useful work shrinks.
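For a concrete sense of this limit, here is a short Python sketch of the Carnot bound (the reservoir temperatures are just example values). The bound follows from requiring that the total entropy of the two reservoirs does not decrease.

[code]
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of input heat convertible to work.

    Requiring dS_total >= 0 forces the engine to dump at least
    Q_hot * (t_cold / t_hot) into the cold reservoir, leaving
    at most (1 - t_cold / t_hot) of the heat available as work.
    """
    return 1.0 - t_cold / t_hot

# Example: an engine running between 600 K and 300 K reservoirs
print(carnot_efficiency(600.0, 300.0))  # 0.5 -> at most half becomes work
[/code]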

5. What is the relationship between entropy and temperature?

When heat is added to a system, its temperature and its entropy both rise: higher temperatures mean more molecular motion and more accessible microstates, and hence more disorder. The precise link is the thermodynamic relation dS = δQ_rev/T for a reversible transfer of heat δQ_rev at temperature T.
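As a small worked example in Python (assuming a constant specific heat, using water's roughly 4186 J/(kg·K), and illustrative temperatures), integrating dQ/T for heating from T1 to T2 gives ΔS = m c ln(T2/T1).

[code]
import math

def delta_S_heating(mass_kg: float, c: float, t1: float, t2: float) -> float:
    """Entropy change of a substance with constant specific heat c,
    heated reversibly from t1 to t2: integral of dQ/T = m*c*dT/T."""
    return mass_kg * c * math.log(t2 / t1)

# Example: 1 kg of water (c ~ 4186 J/(kg K)) heated from 300 K to 350 K
print(delta_S_heating(1.0, 4186.0, 300.0, 350.0))  # ~645 J/K
[/code]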
