Does entropy increase with improbability?

SUMMARY

Entropy, as defined by the Gibbs entropy equation \( S = -k \sum_{i=1}^{l} p_{i} \ln p_{i} \), quantifies how spread out a probability distribution is over a system's states. The entropy reaches its minimum value of zero for a delta-function distribution (one state certain, all others impossible) and its maximum for the uniform distribution. A system with many equally probable states therefore has higher entropy than one concentrated in a few probable states. In that sense entropy does increase with improbability: the more widely the probability is spread, the less probable any individual state and the higher the entropy.
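Evaluating the two limiting cases explicitly (a standard computation) makes the minimum and maximum concrete:
$$S_{\text{delta}} = -k\,(1 \cdot \ln 1) = 0, \qquad S_{\text{uniform}} = -k \sum_{i=1}^{l} \frac{1}{l} \ln\frac{1}{l} = k \ln l.$$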

PREREQUISITES
  • Understanding of the Gibbs entropy equation
  • Familiarity with probability distributions
  • Basic knowledge of statistical mechanics
  • Concept of equilibrium in thermodynamic systems
NEXT STEPS
  • Research the implications of the Gibbs entropy equation in statistical mechanics
  • Explore different types of probability distributions and their characteristics
  • Study the relationship between entropy and thermodynamic equilibrium
  • Investigate applications of entropy in information theory
USEFUL FOR

Physicists, statisticians, and anyone interested in the foundational principles of thermodynamics and statistical mechanics.

MeneMestre
Well, maybe that's a simple question, but it has been pissing me off for some time... Does entropy increase with improbability?
 
What is the definition of entropy (the real one that has an equation in it, not the vague English-language bit about "disorder")? The answer to that question will suggest an answer to your question.
 
MeneMestre said:
Does entropy increase with improbability?
Entropy is defined for all systems (whether or not they are at equilibrium) by the Gibbs entropy equation
$$S=-k\sum_{i=1}^{l}p_{i}\ln p_{i}$$
where ##p_{i}## is the probability for the system to be in the ##i##th state. Notice that the entropy has a minimum value of zero for the delta function distribution ##p_{i}=\delta_{ij}## (for some fixed ##j##) and a maximum value of ##k\ln l## for the uniform distribution ##p_{i}=1/l##. Thus the entropy measures how spread out the probability distribution ##p_{i}## is: a system with many equally probable states is at higher entropy than a system with only a few probable states.
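A minimal numerical sketch of the two limiting cases, in Python; here ##k = 1## and ##l = 4## are illustrative choices, not values from the thread:

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs entropy S = -k * sum_i p_i ln p_i.

    States with p_i = 0 are skipped, since p ln p -> 0 as p -> 0.
    """
    return k * sum(-p * math.log(p) for p in probs if p > 0)

l = 4                            # number of accessible states (illustrative)
delta = [1.0] + [0.0] * (l - 1)  # delta distribution: one certain state
uniform = [1.0 / l] * l          # uniform distribution over l states

print(gibbs_entropy(delta))      # 0.0 -- the minimum
print(gibbs_entropy(uniform))    # 1.3862... = ln(4) -- the maximum for l = 4
```

The uniform case reproduces ##k \ln l##, consistent with the limits stated above.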
 
