Is entropy real or just a construct?

  1. Nov 2, 2016 #1
    When we talk about entropy, we say it comes from our inability to completely describe the state of a system. We also say it is a property of the system (like enthalpy). That confuses me a lot. If entropy is a property, how can it come from our inability to describe the system? Or is it just a construct, something we use to describe reality?

    Let's say we had some entity (like Laplace's demon) that could completely describe the state of a system, knowing exactly the position and velocity of every particle within it. Would there then be no entropy?

    I just need some clarification about this.
     
  2. Nov 2, 2016 #2

    phinds

    Gold Member
    2016 Award

    Entropy is a measure of the number of ways a system can be rearranged microscopically (loosely, its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?
     
  3. Nov 2, 2016 #3
    Ok, I understand that. The greater the number of ways the system can be rearranged, the greater the entropy.

    So it's impossible to completely describe the state of a system. I'm having some trouble grasping what that really means. Does it come from quantum mechanics (the Heisenberg uncertainty principle), or is it something else?
     
  4. Nov 2, 2016 #4

    phinds

    Gold Member
    2016 Award

    How would you create a complete description of, say, the Earth? Things just get too complicated.
     
  5. Nov 2, 2016 #5

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award

    What does "completely describe the state of a system" mean?

    Do you think the concept of "temperature" or "heat" is sufficient to "completely describe the state of a system"? After all, you made no complaints about those, so presumably you are happy with those concepts. Yet they are thermodynamic/statistical concepts built on the SAME foundation as entropy.

    Zz.
     
  6. Nov 2, 2016 #6

    russ_watters


    Staff: Mentor

    My first thought on reading your first post was that you were thinking of the HUP (Heisenberg uncertainty principle); no, entropy is not due to the limitations of the HUP.

    As an engineer, the statistical description doesn't speak to me. The classical picture may work better for you too: entropy is tied to the thermal energy that is unavailable to do work. A simple manifestation is that when the temperature difference between two reservoirs is lowered, your ability to extract work drops by an even larger proportion. The difference is the loss associated with entropy.

    To put it another way: you are better off having a little bit of hot water than a lot of lukewarm water, even if the lukewarm batch holds more total thermal energy. That is why this is also referred to as energy "quality".
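
    A rough numerical sketch of that point (my own illustration rather than anything quoted from this thread; max_work is a throwaway helper, water's specific heat is taken as a constant 4186 J/(kg K), and extraction is assumed Carnot-limited):

    [code]
    import math

    def max_work(mass_kg, T, T0, c=4186.0):
        """Maximum work extractable while cooling water from T down to
        ambient T0 (temperatures in kelvin), from integrating the Carnot
        limit dW = (1 - T0/T) dQ with dQ = m*c*dT:
            W = m*c*[(T - T0) - T0*ln(T/T0)]
        """
        return mass_kg * c * ((T - T0) - T0 * math.log(T / T0))

    T0 = 293.15  # 20 C ambient

    # 1 kg of hot water at 90 C vs 10 kg of lukewarm water at 30 C.
    # The lukewarm batch holds MORE thermal energy above ambient
    # (~419 kJ vs ~293 kJ) but yields far LESS work:
    print(max_work(1.0, 363.15, T0))   # ~30,000 J from the little hot batch
    print(max_work(10.0, 303.15, T0))  # ~7,000 J from the big lukewarm batch
    [/code]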
     
  7. Nov 2, 2016 #7

    stevendaryl

    Staff Emeritus
    Science Advisor

    The statistical notion of entropy involves two different notions of the state of a system: (1) the microstate, which is the complete description, and (2) the macrostate, which is the "coarse-grained" description. There can be many microstates that correspond to the same macrostate.

    For example, suppose you flip 100 coins. The microstate might be the exact record of which coins came up heads and which came up tails. The macrostate might be just the count of heads and tails. Then the entropy for a macrostate with [itex]H[/itex] heads and [itex]T[/itex] tails would be [itex]k \log W(H,T,100)[/itex], where [itex]W(H,T,100)[/itex] is the number of ways to arrange 100 coins so that [itex]H[/itex] of them are heads and [itex]T[/itex] of them are tails. The formula for that is the binomial count [itex]W(H,T,100) = \frac{100!}{H! \, T!}[/itex]. The zero-entropy states are [itex]H=100, T=0[/itex] and [itex]H=0, T=100[/itex]; the highest-entropy state is [itex]H=50, T=50[/itex].
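
    A few lines of Python make the counting concrete (my own quick sketch, using the throwaway names W and S, taking [itex]k = 1[/itex]; math.comb needs Python 3.8+):

    [code]
    import math

    # Number of microstates with H heads and T tails out of H + T coins:
    # W(H, T) = (H + T)! / (H! T!)
    def W(H, T):
        return math.comb(H + T, H)

    # Boltzmann entropy S = k log W, with k = 1
    def S(H, T):
        return math.log(W(H, T))

    print(S(100, 0))  # 0.0   -- all heads: only one arrangement, zero entropy
    print(S(0, 100))  # 0.0   -- all tails: likewise
    print(S(50, 50))  # ~66.8 -- the 50/50 macrostate has the most arrangements
    [/code]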

    But the statistical notion of entropy is relative to a particular choice of macrostate. In thermodynamics, the usual macrostate of a gas of identical particles is specified by the total energy, the number of particles, and the volume.
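
    (For reference, carrying out that counting for an ideal monatomic gas, a standard textbook result I'm adding here rather than one quoted in the thread, gives the Sackur-Tetrode equation

    [tex]S = N k \left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right) + \frac{5}{2}\right][/tex]

    which is an explicit function of exactly those macrostate variables [itex]E[/itex], [itex]N[/itex], and [itex]V[/itex].)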
     