# Absolute Entropy (Statistical)

1. Jan 22, 2010

### Master J

I was looking at a derivation of entropy expressed in terms of probabilities:

$$S = -k\sum_i P_i \ln P_i$$

(What is the name of this, by the way?)
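For concreteness, here is a small numerical sketch of that formula in plain Python (my own illustration, with the constant k set to 1 and the distribution chosen arbitrarily):

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum(p * ln p), skipping zero-probability states (p ln p -> 0)."""
    assert abs(sum(probs) - 1.0) < 1e-12, "probabilities must sum to 1"
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over N states gives the maximum value, k * ln(N).
print(gibbs_entropy([0.25] * 4))  # ln(4) ≈ 1.386
```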

In the derivation, it makes the following statements which I really just don't get!

$$U = \sum_i E_i P_i$$

so therefore

$$\mathrm{d}U = \sum_i E_i\,\mathrm{d}P_i - \sum_i P_i\,\mathrm{d}E_i$$

Where does the minus sign come from? Should it not be a plus??

Then, it goes from

$$\mathrm{d}S = -k\sum_i \ln P_i\,\mathrm{d}P_i$$

to

$$\mathrm{d}S = -k\,\mathrm{d}\left(\sum_i P_i \ln P_i\right)$$

How is that true?? First only the P carried the differential; now it's the whole expression in the brackets!

2. Jan 23, 2010

### Gerenuk

How does the whole derivation go?

I agree — the product rule gives a plus sign:
$$\mathrm{d}U = \sum_i E_i\,\mathrm{d}P_i + \sum_i P_i\,\mathrm{d}E_i$$
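You can also check the sign numerically with finite differences. A quick sketch (my own, not part of the original derivation; the energies and variations are arbitrary, with the dP chosen to sum to zero so the distribution stays normalized):

```python
# Finite-difference check that d(sum E*P) = sum(E*dP) + sum(P*dE),
# i.e. the product rule gives a plus sign, not a minus.
E  = [1.0, 2.0, 3.0]
P  = [0.2, 0.3, 0.5]
dE = [1e-6, -2e-6, 3e-6]   # small arbitrary variations of the levels
dP = [2e-6, 1e-6, -3e-6]   # sum(dP) = 0, so probabilities stay normalized

U_before = sum(e * p for e, p in zip(E, P))
U_after  = sum((e + de) * (p + dp) for e, de, p, dp in zip(E, dE, P, dP))

lhs = U_after - U_before                                    # actual change dU
rhs = sum(e * dp for e, dp in zip(E, dP)) \
    + sum(p * de for p, de in zip(P, dE))                   # product-rule prediction
print(abs(lhs - rhs))  # nonzero only at second order in the variations
```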

Check what the whole expression is equal to. By the product rule:
$$\mathrm{d}(\sum P_i\ln P_i)=\sum(\ln P_i\mathrm{d}P_i+P_i\mathrm{d}\ln P_i)=\sum\ln P_i\mathrm{d}P_i+\sum P_i\frac{\mathrm{d}P_i}{P_i}=\sum\ln P_i\mathrm{d}P_i+\sum\mathrm{d}P_i$$
The last term is zero since
$$\sum P_i=1$$
and hence
$$\sum \mathrm{d}P_i=\mathrm{d}(\sum P_i)=\mathrm{d}1=0$$
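The identity above can likewise be verified numerically. A minimal sketch (plain Python, probabilities of my own choosing, with the variations summing to zero):

```python
import math

# Check that sum(ln(P) * dP) equals d(sum P ln P) when sum(dP) = 0.
P  = [0.2, 0.3, 0.5]
dP = [2e-6, 1e-6, -3e-6]          # sum(dP) = 0 preserves normalization

f_before = sum(p * math.log(p) for p in P)
f_after  = sum((p + dp) * math.log(p + dp) for p, dp in zip(P, dP))

exact  = f_after - f_before                             # d(sum P ln P)
approx = sum(math.log(p) * dp for p, dp in zip(P, dP))  # sum ln(P) dP
print(abs(exact - approx))  # agrees to second order in dP
```

The extra term sum(dP) from the product rule drops out exactly because normalization is preserved, which is the point of the last step above.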