
Boltzmann interpretation of entropy

  1. Aug 19, 2010 #1
    We have been learning statistical physics for the last month and the lecturer has still not explained what entropy is, other than to say "this is the Boltzmann interpretation of entropy".
    All the info I find on the web seems to say different things for different situations, so I was hoping someone could explain this concept to me, because I still don't have a clue what it is, and most of the calculations we are doing involve entropy.
     
  3. Aug 19, 2010 #2

    diazona

    Homework Helper

    Re: entropy

    Well, entropy is kind of a tricky thing to really understand. But to get you started: it's a measure of the "disorder" of a system. What that really means is, entropy is related to the number of configurations a system could have given its energy, temperature, pressure, and volume.

    Very artificial example: two coins sitting on a table. Each one can be heads up or tails up, so there are a total of four possibilities (HH, HT, TH, TT) for the two coins. In this case the system's entropy would be basically log(4). (Of course the coins could also be standing on their side, or spinning, but that would require more energy, so those states don't count towards the entropy.) If you had three coins, there would be 8 possible states with the same energy, so the entropy would be basically log(8). And so on.

    Realistic example: you have a balloon filled with air. The balloon, along with the air inside it, has a certain temperature, a certain pressure, a certain volume; all of these things can be measured. For a gas, a "state" is defined by the positions, momenta, and angular momenta of all the individual particles (molecules). Obviously there are a large* number of ways the molecules could arrange themselves inside the balloon, and a large* number of possibilities for how they could be moving in there, so there are a very large* number of possible states of the gas. You could (theoretically) calculate the temperature, pressure, and volume (and energy) that would be produced by each of those states, and throw out all the ones that don't match the measured properties of the balloon, and count how many are left. That number is called the multiplicity, denoted Ω. And the entropy is basically log(Ω).

    *"large" = "UNIMAGINABLY INCOMPREHENSIBLY HUGE" if you want to get technical :wink:
     
  4. Aug 20, 2010 #3
    Re: entropy

    Actually, entropy is simple. There is a one-to-one relation to the probability of the state (provided you have waited long enough for the system to cycle through all intermediate states).
     
  5. Aug 20, 2010 #4
    Re: entropy

    There are two basic, related definitions of entropy. I'm probably about to post a related question myself, so I'm no expert, but I hope you find the following helpful.

    The term was first used in the field of classical thermodynamics. Quick summary:
    A system doesn't possess "work" or "heat": these are labels that we assign to different methods of transferring energy into or out of a system. This means that if some body undergoes a cyclic process, so that at the end of the process it's in exactly the same state it started out from, then you can add up all the heat that's transferred into and out of it and not necessarily get zero; any net energy that flowed into the system as heat might have been spent doing work, leaving the body with the same total energy at the end. Mathematically speaking, the integral of the increments dQ around a closed loop can be non-zero.

    However, it turns out that for reversible processes the integral [tex]\oint\frac{dQ}{T}=0[/tex] identically, where T is the temperature at which the heat is transferred. This means that we can define a total differential [tex]dS=\frac{dQ}{T}[/tex] of some quantity S, called the entropy, which is a function of the state of the system; in plain English, a system "has" a definite value of this 'entropy' the same way it has a definite energy.
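
    A standard concrete check of that statement (my own illustrative example, not from the original post): in a reversible Carnot cycle, heat Q_h is absorbed at temperature T_h and heat Q_c is rejected at T_c, with Q_h/T_h = Q_c/T_c, so

    [tex]\oint\frac{dQ}{T}=\frac{Q_h}{T_h}-\frac{Q_c}{T_c}=0,[/tex]

    consistent with dS = dQ/T being the differential of a state function.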

    I never really understood entropy in a purely thermodynamic context, and still don't. It makes more sense (to me, at least) from the viewpoint of statistical mechanics.

    To a first approximation, it's helpful to start out with the idea that diazona posted above: the entropy is the logarithm of the number of "states" of the system, by which we mean "microstates". What you mean by this precisely depends on whether you're talking about classical or quantum mechanics. Classically, a system is specified by a point in phase space (if you've encountered the concept?); quantum mechanically, by a specification of the components of the state vector with respect to some basis (or, at a simpler level, by a wavefunction). So you fix the energy of your system, and see how many ways you can configure the constituents of that system so that in total it has that energy.

    At this simple level, however, it doesn't make any sense for a system to "maximise its entropy": the paragraph above counts the total number of ways it's physically possible to share out a given amount of energy among the components of a system, and that's a fixed number. Instead, we have to think about the distinction between a "microstate" (a complete specification of the system) and a "macrostate" (a specification of all the numbers like temperature, pressure, volume etc. that we can actually measure).

    To understand the relation between the two, think about a sequence of 100 coin tosses. A complete specification of the sequence would consist of the entire sequence of heads and tails; a "macrostate" would consist of saying how many were heads and how many tails. If you actually sat and tossed a coin 100 times, the odds are slim that you'd get 100 heads, even though this sequence is no less likely than any given sequence HTHTHTTH...HT that contains 50 heads. The odds are extremely high, in fact, that you'd get 50 heads, plus or minus a few, simply because there are more of those sequences. The essential point is that you're picking out the macrostate by looking for the one that corresponds to the greatest number of microstates. The number of microstates corresponding to that macrostate is what diazona called the multiplicity: the number [tex]\Omega[/tex] whose logarithm is the entropy.
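
    To see the numbers, here is a quick sketch of my own (assuming fair, independent tosses): the multiplicity of the macrostate "k heads out of 100" is the binomial coefficient C(100, k), which is overwhelmingly concentrated near k = 50.

    [code]
    from math import comb, log

    N = 100
    total = 2 ** N                 # total number of 100-toss sequences (microstates)

    for k in (0, 25, 50):          # macrostate: "k heads out of N tosses"
        omega = comb(N, k)         # multiplicity of that macrostate
        print(k, omega / total)    # fraction of all sequences in this macrostate
        print("S =", log(omega))   # entropy of the macrostate, with k_B = 1
    [/code]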

    Finally, it's worthwhile commenting on why we take the logarithm. It means that if we have two systems, with multiplicities [tex]\Omega_1[/tex] and [tex]\Omega_2[/tex], then the multiplicity of the combined system is [tex]\Omega_1\cdot \Omega_2[/tex], so that the total entropy is the sum of the entropies of the two systems.

    Sorry for the lengthy post, but I hope it helped!
     
  6. Aug 21, 2010 #5

    WHT


    Re: entropy

    Informational entropy is a mathematical book-keeping approach that allows one to reason about the amount of disorder in the system. As stated this is closely related to statistical mechanics and Boltzmann.

    Thermodynamic entropy is a way of evaluating how much of some energetic behavior adds to the disorder based on notions of the temperature of the system. This reduces statistical mechanics to some simple rules that mechanical engineers and others can apply.

    I agree that the latter is much harder to wrap one's mind around. Look up the publicly available works of E.T. Jaynes on entropy as he tries to reconcile the two approaches, and bridges the gap to uses of entropy in other disciplines, such as Shannon's information theory.
     
  7. Aug 22, 2010 #6
    Re: entropy

    Shannon's entropy is:

    [tex]S = -\sum_{\text{states}} P_{\text{state}}\,\ln P_{\text{state}}[/tex]

    Boltzmann's entropy is:

    [tex]S = k_B \ln(\text{number of states})[/tex]


    And Shannon's reduces to Boltzmann's for a uniform probability distribution P_state...
    is that not so? I.e., Shannon's is an extension of Boltzmann's to non-uniform
    probability distributions. This seems to be the connection.
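
    A quick numerical check of that claim (a sketch of my own, with Boltzmann's constant set to 1 so the two formulas are directly comparable):

    [code]
    from math import log

    def shannon_entropy(probs):
        # S = -sum_i p_i ln(p_i); terms with p_i = 0 contribute nothing
        return -sum(p * log(p) for p in probs if p > 0)

    n_states = 8
    uniform = [1.0 / n_states] * n_states

    print(shannon_entropy(uniform))   # 2.079... = ln(8)
    print(log(n_states))              # Boltzmann entropy of 8 equally likely states
    [/code]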
     
  8. Aug 22, 2010 #7
    Re: entropy

    I thought the same, but when I looked at the details of the definitions, I found out that this notion is conceptually wrong.
    It's because Boltzmann entropy is the logarithm of [the number of microstates within one particular macrostate].

    So if you have macrostates i=1...N and each macrostate has O_i microstates, then the entropy of one particular macrostate is S_i=ln(O_i)
    The probabilities are p_i=O_i/O where O=sum O_i

    What you probably were thinking of is that for a uniform distribution p_i=1/O and therefore S=ln O, but you see that's not the definition of Boltzmann entropy, and O and O_i are different things. So this "proof" doesn't make sense.

    In fact, it's the other way round! The Shannon entropy is a special case of the Boltzmann entropy, if you restrict yourself to multinomial distributions.
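
    To make the distinction concrete, here is a toy example of my own (the multiplicities are made up): three macrostates with O_i = 1, 3, 6 microstates, so O = 10 in total.

    [code]
    from math import log

    O_i = [1, 3, 6]          # multiplicities of three hypothetical macrostates
    O = sum(O_i)             # total number of microstates, here 10

    # Boltzmann entropy of each individual macrostate: S_i = ln(O_i)
    S_macro = [log(o) for o in O_i]
    print(S_macro)           # [0.0, 1.098..., 1.791...]

    # Probability of finding the system in each macrostate: p_i = O_i / O
    p = [o / O for o in O_i]

    # Shannon entropy of the distribution over macrostates (a different quantity)
    S_shannon = -sum(pi * log(pi) for pi in p)
    print(S_shannon)         # 0.898..., equal to neither ln(O) = 2.302... nor any S_i
    [/code]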
     
  9. Aug 22, 2010 #8

    WHT


    Re: entropy

    As defined, the Boltzmann entropy can't really be maximized, since it just keeps growing with additional states. The utility of the Shannon form is that you can apply constraints to the probability distribution in terms of moments and limits, and out pops the Maximum Entropy Principle. This allows you to reason about all sorts of physics, which is the point that Jaynes tried to address.
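
    For concreteness, the usual textbook version of that maximization (my summary, not a quote from Jaynes): maximize [tex]S=-\sum_i p_i\ln p_i[/tex] subject to [tex]\sum_i p_i=1[/tex] and a fixed mean energy [tex]\sum_i p_i E_i=\langle E\rangle[/tex]. Setting the derivative of the Lagrangian to zero gives

    [tex]-\ln p_i - 1 - \alpha - \beta E_i = 0 \quad\Rightarrow\quad p_i \propto e^{-\beta E_i},[/tex]

    i.e. the Boltzmann distribution drops out of the constrained maximization.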

    Perhaps Boltzmann entropy is just too general to be practically useful?
     
  10. Aug 22, 2010 #9
    Re: entropy

    I'm not sure that's right. The index i in the Gibbs entropy formula indexes the microstates belonging to a macrostate, not the macrostates themselves.
     
  11. Aug 22, 2010 #10
    Re: entropy

    So what do you call microstate and what do you call macrostate then? Maybe I'm mixing up the names. Let's take an example:
    A gas has a certain volume V (macrostate) and each volume has many different realizations (microstates). I use the equation to calculate the entropy for one particular volume V.

    So do you sum over realizations for one volume or do you sum over all volumes? In the latter case your result does not depend on volume, so what is it good for?
     
  12. Aug 22, 2010 #11

    WHT


    Re: entropy

    Isn't that one of the fundamental issues? We want entropy to be an intensive property of matter, but the definitions allow it to show extensive traits. I think that is one of the reasons some scientists are fiddling with other, nonextensive formulations such as the Tsallis entropy.
     
  13. Aug 22, 2010 #12
    Re: entropy

    Why do we want it to be intensive? It should rather be extensive. The corresponding intensive quantity is temperature.

    I'd rather say it's because the Shannon entropy is just a special case. But sometimes more general approaches fit experimental data better. In any case, the Boltzmann definition is the most general and so simple that it cannot be wrong.
     
  14. Aug 22, 2010 #13
    Re: entropy

    OK that sounds right

    You'd sum over realizations for one volume. The probabilities are probabilities of finding a particular realization of the volume V, i.e. they all correspond to the same volume. A microstate might be a list of positions of all the particles: {x_1 ... x_N}, then the Gibbs entropy is

    [tex]S(V) = -\sum_{\{x_1 ... x_N\}} p(\{x_1 ... x_N\}) \log{ p(\{x_1 ... x_N\}) }[/tex]
    Where the sum runs over all possible lists {x_1 ... x_N} such that [itex]vol(\{x_1 ... x_N\})=V[/itex]. Although I'm not even sure you need that, you could just say that if a particular list has [itex]vol(\{x_1 ... x_N\}) \neq V[/itex], then [itex]p(\{x_1 ... x_N\})=0[/itex] and use 0log0 = 0.

    The microcanonical ensemble is then [itex]p(\{x_1 ... x_N\}) = 1/O_V[/itex] for all lists with volume V, which gets you back to the Boltzmann entropy.
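
    Spelling out that last step (with k_B = 1): if each of the [itex]O_V[/itex] allowed lists has the same probability [itex]1/O_V[/itex], the Gibbs sum collapses to

    [tex]S(V) = -\sum_{\{x_1 ... x_N\}} \frac{1}{O_V}\ln\frac{1}{O_V} = O_V\cdot\frac{1}{O_V}\,\ln O_V = \ln O_V,[/tex]

    which is the Boltzmann entropy of the volume-V macrostate.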
     
  15. Aug 22, 2010 #14
    Re: entropy

    Still looks like Boltzmann's entropy is Shannon's (Gibbs') entropy for a uniform
    distribution of probabilities... sometimes dyslexia is a good thing.
     
  16. Aug 22, 2010 #15

    WHT


    Re: entropy

    Entropy composes as
    S(A+B) = S(A) + S(B)
    which indeed does make it perfectly extensive. Tsallis proposed composition as
    S(A+B) = S(A) + S(B) +(1-q)S(A)S(B)
    This makes it nonextensive for q ≠ 1, which, now that I think about it, is not the same as making it intensive.
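
    A toy numerical check of the two composition rules (my own sketch; the entropy values and q below are arbitrary):

    [code]
    def tsallis_compose(s_a, s_b, q):
        # Tsallis composition rule: S(A+B) = S(A) + S(B) + (1-q) S(A) S(B)
        return s_a + s_b + (1 - q) * s_a * s_b

    S_A, S_B = 1.5, 2.0                        # entropies of two independent subsystems

    print(S_A + S_B)                           # 3.5, ordinary extensive composition
    print(tsallis_compose(S_A, S_B, q=1.0))    # 3.5, reduces to the extensive case
    print(tsallis_compose(S_A, S_B, q=0.8))    # 4.1, nonextensive for q != 1
    [/code]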
     
  17. Aug 22, 2010 #16
    Re: entropy

    Thanks for that. Some of his papers are available from his wikipedia page; they look to be interesting reading.
     
  18. Aug 31, 2010 #17
    Re: entropy

    I thought about it for a while and I think you are using an inappropriate interpretation of p_i.
    For example, compare it to the derivation of the Bose distribution. At the beginning one writes [tex]-\sum_i p_i\ln p_i \rightarrow \max[/tex], where p_i is the ratio of particles in state i for that *one* particular microstate. That's conceptually something different from what you have for p_i.
    With your definition, the Fermi and Bose distribution derivations wouldn't work out?!
     
  19. Aug 31, 2010 #18
    Re: entropy

    Oh, on second thought, you were actually right. I confused the p_i.

    Maybe another way of counting microstates is better. You could say they have a multiplicity proportional to their probabilities. This view would make the second law a simple probabilistic statement.
     
  20. Feb 7, 2011 #19
    Re: entropy

    Can anyone please explain why

    [tex]S=- \ln\sum_{i=1}^N P_i^n[/tex] for some integer n

    (for the microcanonical ensemble). Thanks.
     
  21. Feb 7, 2011 #20

    A. Neumaier

    Science Advisor

    Re: entropy

    See also Sections 7.6 and 7.7 of the online book

    A. Neumaier and D. Westra,
    Classical and Quantum Mechanics via Lie algebras.
    http://de.arxiv.org/abs/0810.1019
     