
Boltzmann vs. Gibbs entropy, negative energy

  1. Jan 6, 2014 #1

    tom.stoer

    Science Advisor

  3. Jan 6, 2014 #2

    atyy

    Science Advisor

    Last edited by a moderator: May 6, 2017
  4. Jan 6, 2014 #3

    tom.stoer

    Science Advisor

Hm. There are many lecture notes discussing why Gibbs' definition encompasses Boltzmann's definition. But I agree, some lectures (and even textbooks) miss these facts.
     
  5. Jan 6, 2014 #4

    atyy

    Science Advisor

Yeah, eye-opening for me. Does the Gibbs definition have a quantum version?
     
  6. Jan 7, 2014 #5

    tom.stoer

    Science Advisor

The von Neumann entropy

    ##S_N = -k\,\text{tr}(\rho\,\ln\rho)##

    with a density operator ρ is the basis for a quantum version of the Gibbs entropy. Note that no temperature has been introduced so far.

The Gibbs entropy follows for a special (normalized) density operator,

    ##\rho=\frac{e^{-\beta H}}{Z}, \quad Z=\text{tr}\,e^{-\beta H}##
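
    For concreteness, here is a minimal numerical sketch (the two-level Hamiltonian, ##\beta=1##, ##k_B=1## and the use of numpy/scipy are illustrative assumptions, not anything from the paper) showing that the von Neumann entropy of this canonical ρ reproduces the Gibbs form ##\ln Z + \beta\langle H\rangle##:

    [code]
    # Sketch: von Neumann entropy of a canonical density operator (toy example).
    # Assumptions: a 2x2 toy Hamiltonian, beta = 1, k_B = 1, numpy/scipy available.
    import numpy as np
    from scipy.linalg import expm, logm

    H = np.diag([0.0, 1.0])       # toy two-level Hamiltonian
    beta = 1.0

    rho = expm(-beta * H)
    Z = np.trace(rho).real
    rho /= Z                      # normalization: tr(rho) = 1

    S_vN = -np.trace(rho @ logm(rho)).real               # S = -tr(rho ln rho)
    S_Gibbs = np.log(Z) + beta * np.trace(rho @ H).real  # ln Z + beta <H>

    print(S_vN, S_Gibbs)          # both ~ 0.582
    [/code]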
     
  7. Jan 7, 2014 #6

    atyy

    Science Advisor

    But isn't that only for the canonical ensemble? For the microcanonical ensemble, doesn't one use

    ##\rho=\frac{\delta(H-E)}{\omega(E)}##
     
  8. Jan 7, 2014 #7

    tom.stoer

    Science Advisor

But this is equivalent to a ρ to which only the states of fixed E contribute; the general expression introduced by von Neumann is still valid. (I agree that the second formula holds only for the canonical ensemble.)

    You have

    ##\rho = \omega^{-1}(E) \, \delta(H-E) = P_E \to \sum_n p_n |n\rangle\langle n| = \omega^{-1}(E) \, \sum_{n; E_n = E} |n\rangle\langle n| ##
     
  9. Jan 7, 2014 #8

    atyy

    Science Advisor

Yes, I agree the standard method for the microcanonical ensemble is to use the microcanonical density matrix in the von Neumann entropy (which I usually think of as the generalization of the classical Gibbs entropy). I'm not sure this is correct, but it seems that if I work in the energy basis, and there are ##N## discrete states of energy ##E##, then the microcanonical ensemble gives ##p_n=1/N## for each of them, and the von Neumann entropy is ##\sum_{n=1}^{N} - \frac{1}{N}\ln{\frac{1}{N}} = \ln{N}##. So the microcanonical entropy is something like log(number of states).
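
    A quick numerical check of this (the 5-level toy Hamiltonian with a 3-fold degenerate level is an illustrative assumption; numpy, ##k_B=1##):

    [code]
    # Sketch: von Neumann entropy of the microcanonical density matrix = ln(omega(E)).
    # Assumptions: toy 5x5 Hamiltonian with a 3-fold degenerate level E = 1, numpy, k_B = 1.
    import numpy as np

    E = 1.0
    H = np.diag([0.0, 1.0, 1.0, 1.0, 2.0])     # toy spectrum; three states with E_n = E

    eigvals, eigvecs = np.linalg.eigh(H)
    mask = np.isclose(eigvals, E)
    N = mask.sum()                             # omega(E): dimension of the E-eigenspace

    P = eigvecs[:, mask] @ eigvecs[:, mask].T  # projector onto the E-eigenspace
    rho = P / N                                # microcanonical density matrix

    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    S = -np.sum(p * np.log(p))                 # von Neumann entropy

    print(S, np.log(N))                        # both equal ln 3
    [/code]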

But that seems closer to Eq 5 in http://arxiv.org/abs/1304.2066, which they call the "Boltzmann entropy". Their Eq 6, which they call the "Gibbs entropy", seems different from what one would get from the microcanonical ensemble and the von Neumann entropy, or from what is usually called the Gibbs entropy.
     
    Last edited: Jan 7, 2014
  10. Jan 7, 2014 #9

    vanhees71

    Science Advisor
    2016 Award

The equilibrium distribution always maximizes the entropy under the given constraints; in quantum statistical mechanics the entropy is defined as the von Neumann entropy. This can be grounded in information-theoretic ideas going back to Shannon and Jaynes.

Anyway, assume that the correct entropy is given by the von Neumann entropy. The constraint for the microcanonical ensemble, applicable to a closed system, is that the energy is precisely constant, i.e., the statistical operator must be given by
[tex]\hat{R}_{\text{mc}}=\sum_{j} P_j |E,j \rangle \langle E,j|,[/tex]
    where for simplicity I've assumed that the energy eigenspace at the given eigenvalue [itex]E[/itex] of the Hamiltonian is spanned by a discrete set of orthonormalized eigenvectors.

    The von Neumann entropy is then given by
    [tex]S=-\mathrm{Tr} \; (\hat{R} \ln \hat{R})=-\sum_{j} P_j \ln P_j.[/tex]
    To maximize the entropy under the constraint that
    [tex]\sum_j P_j=1[/tex]
we must have (with [itex]\lambda[/itex] as the corresponding Lagrange multiplier)
    [tex]\frac{\partial S}{\partial P_j}=-\ln P_j -1-\lambda=0.[/tex]
    From this we get
    [tex]P_j=\text{const}[/tex]
    and from the constraint
    [tex]P_j=\frac{1}{\omega(E)},[/tex]
where [itex]\omega(E)[/itex] is the dimension of the eigenspace of the Hamiltonian with eigenvalue [itex]E[/itex], which is what we wanted to prove.
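
    A quick numerical sanity check of this maximization (illustrative only: ##\omega=5## levels, random trial distributions via numpy, ##k_B=1##):

    [code]
    # Sketch: the uniform distribution P_j = 1/omega maximizes S = -sum_j P_j ln P_j
    # under the sole constraint sum_j P_j = 1.  Assumptions: omega = 5, numpy, k_B = 1.
    import numpy as np

    rng = np.random.default_rng(0)
    omega = 5

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    S_uniform = entropy(np.full(omega, 1.0 / omega))    # = ln(omega)

    # random normalized trial distributions never beat the uniform one
    S_trials = max(entropy(rng.dirichlet(np.ones(omega))) for _ in range(10000))

    print(S_uniform, S_trials)    # ln 5 ~ 1.609 vs. something strictly smaller
    [/code]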
     
  11. Jan 7, 2014 #10

    atyy

    Science Advisor

    @vanhees71, I agree (except the part about Jaynes :tongue2:), but I think what people usually call the Boltzmann-Gibbs-Shannon entropy, and for which the von Neumann entropy is the quantum generalization, isn't what the authors of http://arxiv.org/abs/1304.2066 are calling the "Gibbs entropy" (Eq 6).
     
  12. Jan 7, 2014 #11

    tom.stoer

    Science Advisor

I absolutely agree with vanhees71; this is the standard method to define the microcanonical ensemble, and eq. (1) expresses exactly this statement: the density operator of the microcanonical ensemble is

    ##\rho = \omega^{-1}(E) \, \delta(H-E)##

    where δ is the projector to the energy eigenspace and ω is the dimension.

What I do not see is how and why from this unique (!) definition of ρ two inequivalent "definitions" ##S_B## and ##S_G## can be "derived".
     
  13. Jan 7, 2014 #12

    vanhees71

    Science Advisor
    2016 Award

The Boltzmann entropy belongs to the canonical ensemble, which applies to a system that can exchange energy with a reservoir; then only the mean energy is fixed. The maximum-entropy principle (which can also be derived from kinetic theory, which in turn follows from many-body quantum (field) theory by applying appropriate gradient-expansion approximations to the Kadanoff-Baym equations from the 2PI Baym functional formalism, if you don't like the information-theoretical approach à la Jaynes) then gives the corresponding canonical equilibrium statistical operator
    [tex]\hat{R}_{\text{can}}=\frac{1}{Z} \exp(-\beta \hat{H}), \quad Z=\mathrm{Tr} \; \exp(-\beta \hat{H}).[/tex]
In the grand canonical ensemble the system can also exchange one or more conserved charge-like quantities (or, in the non-relativistic case, the particle number), which leads to the statistical operator
    [tex]\hat{R}_{\text{gc}}=\frac{1}{Z} \exp(-\beta \hat{H} + \alpha \hat{Q} ).[/tex]
The Nature paper by Dunkel and Hilbert shows that the canonical (or grand canonical) ensemble doesn't give a good description for small systems or for systems with bounded Hamiltonians allowing occupation inversion. I think it's a nice, very clearly written paper, but I don't think that this is something very new. It's clear that the canonical and grand canonical ensembles work for really large macroscopic open systems, where the fluctuations of the relevant quantities (like energy and other conserved quantities for equilibrium states) are small compared to their averages (see, e.g., Landau+Lifshitz vol. 5 and Sommerfeld vol. 5).
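
    To illustrate that last point, a minimal sketch (illustrative assumptions: N independent two-level systems with level splitting 1, ##\beta=1##, ##k_B=1##, numpy) of how the relative energy fluctuation in the canonical ensemble shrinks like ##1/\sqrt{N}##:

    [code]
    # Sketch: canonical energy fluctuations for N independent two-level systems.
    # Assumptions: level splitting 1, beta = 1, k_B = 1, numpy available.
    import numpy as np

    beta = 1.0
    p1 = np.exp(-beta) / (1.0 + np.exp(-beta))   # canonical occupation of the upper level

    for N in (10, 1000, 100000):
        mean_E = N * p1                          # <E> for N independent units
        var_E = N * p1 * (1.0 - p1)              # energy variance (binomial statistics)
        print(N, np.sqrt(var_E) / mean_E)        # relative fluctuation ~ 1/sqrt(N)
    [/code]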
     
  14. Jan 7, 2014 #13

    atyy

    Science Advisor

Yes, it's not new that the microcanonical ensemble with the usual Boltzmann-Gibbs-Shannon-von Neumann entropy has problems for small systems or systems with a finite number of states. I think what they claim is that a different definition of entropy, which they call the "Gibbs entropy" or "Hertz entropy" (Eq 6), does work for the microcanonical ensemble even for small systems or systems with a finite number of states, and is more general than the usual BGSvN entropy for the microcanonical ensemble.

    So the usual picture is that we use the BGSvN entropy in both the microcanonical and canonical ensembles. I think they propose that the BGSvN entropy still be used in the canonical ensemble, but that their "Gibbs entropy" or "Hertz entropy" should be used for the microcanonical ensemble.
     
    Last edited: Jan 7, 2014
  15. Jan 7, 2014 #14

    dextercioby

    Science Advisor
    Homework Helper

So this Jaynes is the same Jaynes who wrote papers rebutting QED, I presume. But if he really came up with the axiomatic formulation of equilibrium statistical mechanics (which uses a variational principle for the statistical entropy), then he should get his name cleared.
     
  16. Jan 7, 2014 #15

    tom.stoer

    Science Advisor

    This is clear.

    What is not clear to me is how something like (5) can be "defined" or "derived" from (1). I don't understand that.

In the following they show that quantities derived from (5) are partially physically wrong or inconsistent.

    On page 8 they write

    This was clear to me since my first course in statistical mechanics.

    So my summary is that they write down (5) which I don't understand and which I cannot relate to the density operator (1), and that they derive inconsistencies which do not come as a surprise.
     
  17. Jan 7, 2014 #16

    atyy

    Science Advisor

I believe their Eq 5 is something like what you get from the microcanonical ensemble and the von Neumann entropy. I posted a rough sketch in post #8 showing that the microcanonical ensemble and the von Neumann entropy give an entropy that is log(number of states). Their Eq 5 is log(sum over states with energy E), so it's essentially the same. So their Eq 5 is what you and vanhees71 and most people call the Gibbs entropy or the Boltzmann-Gibbs-Shannon-von Neumann entropy (##-\int p\ln p##).

    Their Eq 6 is the unconventional proposal. Although they call it the "Gibbs entropy", it is not what you call the Gibbs entropy.
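
    To make the contrast concrete, a toy illustration (the 5-level spectrum below is an illustrative assumption, not from the paper; numpy, ##k_B=1##) of an Eq 5-style count over states with ##E_n = E## versus an Eq 6-style count over states with ##E_n \le E##:

    [code]
    # Sketch: "Boltzmann-like" S_B = ln omega(E) (states with E_n = E) versus
    # "Gibbs/Hertz-like" S_G = ln Omega(E) (states with E_n <= E) for a toy
    # bounded spectrum.  Assumptions: spin-like degeneracies below, numpy, k_B = 1.
    import numpy as np

    levels = np.array([0, 1, 2, 3, 4])          # toy bounded spectrum
    degeneracy = np.array([1, 4, 6, 4, 1])      # degeneracy shrinks again at the top

    omega = degeneracy                          # number of states with E_n = E
    Omega = np.cumsum(degeneracy)               # number of states with E_n <= E

    S_B = np.log(omega)                         # decreases again above E = 2
    S_G = np.log(Omega)                         # monotonically increasing in E

    for E, sb, sg in zip(levels, S_B, S_G):
        print(E, sb, sg)
    [/code]

    With a bounded spectrum like this, ##S_B## turns around at the top of the spectrum, while ##S_G## keeps increasing with E; that difference is essentially what the paper's argument turns on.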
     
  18. Jan 7, 2014 #17
I know this comment was tongue-in-cheek, but for the record Jaynes' work on QED led him to propose the now extremely widely used Jaynes-Cummings model, which is "of great interest in atomic physics, quantum optics, and solid-state quantum information circuits, both experimentally and theoretically" (quote from http://en.wikipedia.org/wiki/Jaynes-Cummings_model).

    Edit: If you're interested Jaynes' papers on statistical mechanics and information theory can be downloaded here http://bayes.wustl.edu/etj/node1.html (the original papers are items 9 and 10 on the page linked to).
     
    Last edited: Jan 7, 2014
  19. Jan 7, 2014 #18

    tom.stoer

    Science Advisor

atyy, if I understand you correctly, then for fixed E both (1) and (5) contain a sum

    ##\sum_{n,E_n = E}\ldots##

which agrees with (1) and which therefore agrees with the standard definitions of both ρ and S.

If I understand (6) correctly, then due to the step function the sum becomes

    ##\sum_{n,E_n \le E}\ldots##

    But then I don't understand the following:

ρ from (1) restricts the sum to the energy-E subspace; therefore the sentence "Given the MCE operator (1), one can find two competing ..." cannot be correct. (6) cannot be derived from (1), simply because the subspace in the sum is different.
     
  20. Jan 8, 2014 #19

    atyy

    Science Advisor

Yes, that is my understanding too - Eq (6) cannot be derived from (1) and the von Neumann entropy. Eq (6) is their proposal for a new definition of entropy in the microcanonical ensemble, which they suggest will rectify the failure of the standard von Neumann entropy in the microcanonical ensemble for small systems or systems with a finite range of energy.
     
  21. Jan 8, 2014 #20

    tom.stoer

    Science Advisor

OK, I think I got it; it was essentially confusion regarding the Gibbs-Boltzmann-von-Neumann-... entropies ;-)
     