Absolute Entropy (Statistical)

In summary: the thread examines a derivation of entropy in terms of absolute probabilities, S = -k\sum P_i\ln P_i, questions the minus sign in dU = \sum E_i\,\mathrm{d}P_i - \sum P_i\,\mathrm{d}E_i, and shows by the product rule that \sum\ln P_i\,\mathrm{d}P_i = \mathrm{d}(\sum P_i\ln P_i), since \sum\mathrm{d}P_i = 0, and hence \mathrm{d}S = -k\,\mathrm{d}(\sum P_i\ln P_i).
  • #1
Master J
I was looking at a derivation of entropy expressed as an absolute probability:

S = -k. SUM P.lnP
(What is the name of this by the way?)

In the derivation, it makes the following statements which I really just don't get!

U = SUM E.P

so therefore dU = SUM E.dP - SUM P.dE

Where does the minus sign come from? Should it not be a plus??

Then, it goes from dS = -k. SUM lnP.dP to dS = -k.d(SUM P.lnP)

How is that true?? First it was just P that was a differential element, now its the whole expression in the bracket??
 
  • #2
How does the whole derivation go?

Master J said:
U = SUM E.P
so therefore dU = SUM E.dP - SUM P.dE
Where does the minus sign come from? Should it not be a plus??
I agree.

Master J said:
Then, it goes from dS = -k. SUM lnP.dP to dS = -k.d(SUM P.lnP)

How is that true?? First it was just P that was a differential element, now its the whole expression in the bracket??
Check what the whole expression is equal to. By the product rule:
[tex]\mathrm{d}(\sum P_i\ln P_i)=\sum(\ln P_i\mathrm{d}P_i+P_i\mathrm{d}\ln P_i)=\sum\ln P_i\mathrm{d}P_i+\sum P_i\frac{\mathrm{d}P_i}{P_i}=\sum\ln P_i\mathrm{d}P_i+\sum\mathrm{d}P_i[/tex]
The last term is zero since
[tex]\sum P_i=1[/tex]
and hence
[tex]\sum \mathrm{d}P_i=\mathrm{d}(\sum P_i)=\mathrm{d}1=0 [/tex]
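The identity can also be checked numerically. A minimal sketch (the distribution and perturbation are made-up values, not from the thread): perturb a normalized distribution by a zero-sum amount dP and compare the actual change in SUM P.lnP against SUM lnP.dP — they agree to first order precisely because SUM dP = 0.

```python
import math

P = [0.5, 0.3, 0.2]            # a normalized distribution: sum(P) == 1
dP = [1e-6, -0.4e-6, -0.6e-6]  # sums to zero, so sum(P + dP) stays 1

f = lambda p: sum(x * math.log(x) for x in p)  # sum P ln P

# actual change d(sum P ln P) vs the first-order term sum lnP dP
exact = f([p + d for p, d in zip(P, dP)]) - f(P)
first_order = sum(math.log(p) * d for p, d in zip(P, dP))

print(exact, first_order)  # equal up to second-order terms in dP
```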
 
  • #3



The formula you are referring to, S = -k SUM P.lnP, is known as the Gibbs entropy (the same functional form, up to the constant, is the Shannon entropy of information theory). It is a statistical definition of entropy based on the probabilities of a system's microstates, and it is the standard starting point in statistical mechanics for describing the behavior of systems at the microscopic level.

Regarding your question about the minus sign in dU = SUM E.dP - SUM P.dE: your instinct is correct. The product rule applied to U = SUM E.P gives a plus sign, dU = SUM E.dP + SUM P.dE. The minus is most likely a typo in the text; note, though, that many textbooks later identify the second term with the negative of the work done by the system (SUM P.dE = -δW, since the energy levels shift when, say, the volume changes), and a minus sign enters at that stage of the argument, not in the differentiation itself.

As for the step from dS = -k. SUM lnP.dP to dS = -k.d(SUM P.lnP), this is the product rule combined with normalization, as shown in the previous post. Expanding gives d(SUM P.lnP) = SUM lnP.dP + SUM dP, and because the probabilities always sum to 1, SUM dP = d(1) = 0. The extra term vanishes, so the two expressions are exactly equal.

I hope this helps clarify your questions about the derivation of absolute entropy. It is a complex concept, but it is a fundamental aspect of understanding the behavior of systems at the microscopic level.
 

What is absolute entropy?

Absolute entropy, also known as statistical entropy, is a measure of the amount of disorder or randomness in a system at a specific temperature. It is a thermodynamic quantity that describes the number of possible arrangements of a system's particles or molecules.

How is absolute entropy different from other types of entropy?

Absolute entropy is different from the entropy changes (ΔS) that classical thermodynamics usually deals with: it assigns an actual value to a single state by counting the total number of microstates or arrangements of a system, rather than tracking only differences between states. The third law of thermodynamics, which sets S = 0 for a perfect crystal at absolute zero, is what fixes the reference point and makes "absolute" values meaningful.

What is the relationship between absolute entropy and temperature?

Absolute entropy generally increases with temperature: heating a system makes more microstates energetically accessible, so the number of possible arrangements, and with it the entropy, grows. Quantitatively, for a reversible process dS = δQ/T, so entropy rises as heat is absorbed.
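This temperature dependence can be illustrated with a small sketch (the two-level system, its energy gap, and the function name are assumptions for illustration, not from the thread): the Gibbs entropy of a Boltzmann distribution over two levels grows as T rises, because the excited state becomes more probable and the distribution spreads out.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_two_level(dE, T):
    """S = -k sum p ln p for levels at energies 0 and dE, at temperature T."""
    x = math.exp(-dE / (k_B * T))      # Boltzmann factor of the upper level
    p0, p1 = 1 / (1 + x), x / (1 + x)  # normalized occupation probabilities
    return -k_B * (p0 * math.log(p0) + p1 * math.log(p1))

dE = 1e-21  # J, assumed energy gap
for T in (50.0, 150.0, 300.0):
    print(T, entropy_two_level(dE, T))  # entropy increases with T
```

As T grows without bound the two probabilities approach 1/2 each and the entropy approaches its maximum, k ln 2.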

How is absolute entropy calculated?

Absolute entropy can be calculated using statistical mechanics, which involves counting the number of possible microstates of a system and using the Boltzmann equation to convert this into a value of entropy. It can also be measured experimentally using techniques such as calorimetry.
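As a concrete sketch of the statistical calculation (the function names and the two-level example are my own, not from the thread), the Gibbs formula S = -k SUM p ln p can be evaluated directly once the microstate probabilities are known, for instance from a Boltzmann distribution:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs, k=k_B):
    """S = -k sum p ln p; terms with p == 0 contribute nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_probs(energies, T, k=k_B):
    """Equilibrium probabilities p_i = exp(-E_i/kT) / Z."""
    weights = [math.exp(-E / (k * T)) for E in energies]
    Z = sum(weights)                # partition function
    return [w / Z for w in weights]

# Two levels separated by 1e-21 J at room temperature
probs = boltzmann_probs([0.0, 1e-21], T=300.0)
print(gibbs_entropy(probs))
```

A useful sanity check: a uniform two-state distribution gives S = k ln 2, the maximum for two states, matching the Boltzmann counting formula S = k ln W with W = 2.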

Why is absolute entropy important in thermodynamics?

Absolute entropy is important in thermodynamics because it is a fundamental property of a system that allows us to understand and predict how it will behave. It is used in calculations involving energy transfer, chemical reactions, and phase transitions, and plays a crucial role in determining the direction and efficiency of processes in nature.
