
Absolute Entropy (Statistical)

  1. Jan 22, 2010 #1
    I was looking at a derivation of entropy expressed as an absolute probability:

[tex]S=-k\sum_i P_i\ln P_i[/tex]
    (What is the name of this, by the way?)
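Here is a quick numerical sketch of the formula (a minimal check only; the function name is mine, and I set k = 1 for simplicity):

```python
import math

# Gibbs entropy S = -k * sum(P * ln P) for a discrete distribution.
# k = 1 here for simplicity (physically, k is Boltzmann's constant).
def gibbs_entropy(probs, k=1.0):
    assert abs(sum(probs) - 1.0) < 1e-12, "probabilities must sum to 1"
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Sanity check: a uniform distribution over N states gives S = k * ln N,
# recovering Boltzmann's S = k ln W as a special case.
N = 4
print(gibbs_entropy([1.0 / N] * N))  # ~ ln 4 ≈ 1.386
```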

    In the derivation, it makes the following statements, which I really just don't get!

[tex]U=\sum_i E_iP_i[/tex]

    so therefore [tex]\mathrm{d}U=\sum_i E_i\,\mathrm{d}P_i-\sum_i P_i\,\mathrm{d}E_i[/tex]

    Where does the minus sign come from? Shouldn't it be a plus?
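To convince myself the plus sign is right, I tried a quick finite-difference check (a sketch only; all names and values below are my own, not from the derivation):

```python
# Finite-difference check that dU = sum(E*dP) + sum(P*dE) (plus sign)
# for U = sum(E*P), by perturbing E and P by small amounts.
E  = [1.0, 2.0, 3.0]
P  = [0.5, 0.3, 0.2]
dE = [1e-6, -2e-6, 3e-6]   # small arbitrary energy shifts
dP = [1e-6, 1e-6, -2e-6]   # chosen so sum(dP) = 0 (normalization preserved)

U = lambda E, P: sum(e * p for e, p in zip(E, P))

dU_exact   = U([e + de for e, de in zip(E, dE)],
               [p + dp for p, dp in zip(P, dP)]) - U(E, P)
dU_formula = sum(e * dp for e, dp in zip(E, dP)) \
           + sum(p * de for p, de in zip(P, dE))

# The two agree to first order (the residual is the second-order sum(dE*dP)).
print(abs(dU_exact - dU_formula) < 1e-9)
```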

    Then, it goes from [tex]\mathrm{d}S=-k\sum_i\ln P_i\,\mathrm{d}P_i[/tex] to [tex]\mathrm{d}S=-k\,\mathrm{d}(\sum_i P_i\ln P_i)[/tex]

    How is that true? First only P was a differential element; now it's the whole expression in the brackets?
     
  3. Jan 23, 2010 #2
    How does the whole derivation go?

    I agree.

    Check what the whole expression is equal to. By the product rule:
    [tex]\mathrm{d}(\sum P_i\ln P_i)=\sum(\ln P_i\mathrm{d}P_i+P_i\mathrm{d}\ln P_i)=\sum\ln P_i\mathrm{d}P_i+\sum P_i\frac{\mathrm{d}P_i}{P_i}=\sum\ln P_i\mathrm{d}P_i+\sum\mathrm{d}P_i[/tex]
    The last term is zero since
    [tex]\sum P_i=1[/tex]
    and hence
    [tex]\sum \mathrm{d}P_i=\mathrm{d}(\sum P_i)=\mathrm{d}1=0 [/tex]
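A quick numerical sketch of this identity (names and values are mine; P is perturbed by a dP that sums to zero, so normalization is preserved):

```python
import math

# Check that d(sum(P*ln P)) = sum(ln P * dP) to first order
# when sum(dP) = 0, as shown by the product rule above.
P  = [0.5, 0.3, 0.2]
dP = [1e-6, 1e-6, -2e-6]   # sums to zero: probabilities stay normalized

f = lambda P: sum(p * math.log(p) for p in P)

lhs = f([p + dp for p, dp in zip(P, dP)]) - f(P)     # d(sum P ln P)
rhs = sum(math.log(p) * dp for p, dp in zip(P, dP))  # sum(ln P * dP)

# Agreement up to second-order terms in dP.
print(abs(lhs - rhs) < 1e-9)
```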
     