Prove that Entropy is Extensive

  • Thread starter Tsar_183
In summary: For independent systems the number of microstates multiplies, Ω(A,B) = Ω(A)Ω(B). Taking the logarithm and multiplying by the Boltzmann constant k gives S(A,B) = S(A) + S(B): the entropy is extensive because the logarithm turns the product of microstate counts into a sum.
  • #1
Tsar_183

Homework Statement



Show explicitly that Entropy as defined by the Gibbs Entropy Formula is extensive. That is, for two independent (noninteracting) systems A and B,

S(A,B) = S(A) + S(B)

where S(A,B) is the entropy of A and B considered as part of a larger system.

Homework Equations



S = -k [itex]\sum_i[/itex] p<sub>i</sub> ln(p<sub>i</sub>)

The Attempt at a Solution



I honestly have no idea where to start! I tried letting p<sub>i</sub> = 1/Ω, to obtain

S = k [itex]\sum_i[/itex] (1/Ω) ln(Ω), and then tried summing S(A) and S(B) together to obtain S(A,B), but it didn't work out. I also tried just summing up S(A) and S(B) without writing them in terms of Ω...that didn't work either. I then tried,

S = -k [itex]\sum_i[/itex] p<sub>i</sub> ln(p<sub>i</sub>) ==>
S = k [itex]\sum_i[/itex] (1/Ω) ln(Ω) ==>
S = k (1/Ω) ln(Ω) [itex]\sum_i[/itex] 1, where [itex]\sum_i[/itex] 1 = Ω ==>
S = k (1/Ω) ln(Ω) Ω ==>
S = k ln(Ω)

and then I summed up S(A) and S(B), which WORKED:

S(A,B) = k ln(Ω(A)) + k ln(Ω(B)) = k ln(Ω(A)Ω(B)) = k ln(Ω(A,B)),

but I don't think this argument works. Plus the prof derived the Gibbs entropy formula from k ln Ω... so I don't think I'm even on the right track! Any ideas or suggestions? Thanks!
 
  • #2
Tsar_183 said:
S(A,B) = k ln(Ω(A)) + k ln(Ω(B)) = k ln(Ω(A)Ω(B)) = k ln(Ω(A,B)), but I don't think this argument works.
This looks fine to me. The number of microstates for two noninteracting systems, Ω(A,B), is the product of the numbers of microstates for each system individually, Ω(A)Ω(B). Note, though, that your argument assumes all microstates are equally probable (p<sub>i</sub> = 1/Ω); to prove it directly from the Gibbs formula you want to use the fact that the joint probabilities of independent systems factorize.
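To spell that out: a sketch of the general proof from the Gibbs formula, writing p<sub>i</sub> for the probabilities of system A's microstates and q<sub>j</sub> for system B's, so that independence means the joint probability of microstate (i,j) is p<sub>i</sub>q<sub>j</sub>:

[tex]
S(A,B) = -k \sum_{i,j} p_i q_j \ln(p_i q_j)
= -k \sum_{i,j} p_i q_j \left( \ln p_i + \ln q_j \right)
= -k \Big( \sum_j q_j \Big) \sum_i p_i \ln p_i \;-\; k \Big( \sum_i p_i \Big) \sum_j q_j \ln q_j
= S(A) + S(B),
[/tex]

where the last step uses normalization, [itex]\sum_i p_i = \sum_j q_j = 1[/itex]. No assumption of equal probabilities is needed.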
 

1. What is entropy?

Entropy is a thermodynamic property that measures the disorder or randomness of a system. It is often described as a measure of the energy in a system that is unavailable for doing useful work.

2. How is entropy related to extensive properties?

Extensive properties are those that depend on the size or amount of a system, while intensive properties do not. Entropy is considered an extensive property because it scales with the size or amount of the system.

3. Can you provide an example of how entropy is extensive?

One example of how entropy is extensive is in a gas system. As the volume of the gas increases, so does the number of arrangements of particles within that volume, leading to an increase in entropy.

4. How is the extensivity of entropy mathematically proven?

The mathematical proof of the extensivity of entropy involves showing that the entropy of a composite system equals the sum of the entropies of its individual components. In statistical mechanics this follows from the fact that the joint probability distribution of independent subsystems factorizes, so the logarithm in the entropy formula splits the entropy into a sum over subsystems.
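This factorization argument is easy to check numerically. A minimal sketch (the distributions below are arbitrary illustrative choices, and k is set to 1 for convenience):

```python
import math

def gibbs_entropy(p, k=1.0):
    """Gibbs entropy S = -k * sum(p_i * ln(p_i)) for a discrete distribution."""
    return -k * sum(pi * math.log(pi) for pi in p)

# Two independent systems with arbitrary normalized distributions.
p = [0.2, 0.5, 0.3]   # system A
q = [0.6, 0.4]        # system B

# Joint distribution of the composite system: p_ij = p_i * q_j.
joint = [pi * qj for pi in p for qj in q]

# Extensivity: S(A,B) = S(A) + S(B), up to floating-point error.
print(abs(gibbs_entropy(joint) - (gibbs_entropy(p) + gibbs_entropy(q))) < 1e-12)  # True
```

The check works for any pair of normalized distributions, equal-probability or not, which is exactly what the general proof says.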

5. Why is the extensivity of entropy important?

The extensivity of entropy is important because it allows us to accurately describe and predict the behavior of larger systems based on the properties of their smaller components. It also plays a crucial role in understanding and studying the second law of thermodynamics.
