I If entropy increases, where does the information come from?

  1. Jul 26, 2018 #1
    For example, if two particles close to each other require n bits of information to describe them, why should it take more than n bits to describe them when they are far apart? Shouldn't the information content of the macrosystem be the same?
     
  3. Jul 26, 2018 #2

    PeterDonis

    Staff: Mentor

    Can you give a specific example?
     
  4. Jul 26, 2018 #3
    You ask for a specific example a lot... What do you not understand? I laid the question out pretty concretely.
     
  5. Jul 26, 2018 #4
    For instance, entropy increases if two particles spread out. Why does that mean information has increased, when the positions (x, y, z) take only three coordinates per particle in both the former and the latter case?
     
  6. Jul 26, 2018 #5

    Grinkle

    Gold Member

    Your question is not as concrete as you believe it to be. You are being coached to fill in the blanks.

    Google the definition of entropy, digest how one calculates a specific number for entropy of a specific system, and see if that helps you to fill in more of the blanks.
     
  7. Jul 26, 2018 #6
    My question is, since Shannon entropy is considered information, where does this new information come from?

    [Mentor's note: some unnecessary and off-topic argumentation has been removed from this post]
     
    Last edited by a moderator: Jul 26, 2018
  8. Jul 26, 2018 #7

    PeterDonis

    Staff: Mentor

    You may think you did, but you didn't. And I strongly suspect that is because you do not have a good understanding of the topic you're asking about.

    What do you think the Shannon entropy is of a system of two particles close together? What do you think the Shannon entropy is of a system of two particles far apart? Please show your work.
     
  9. Jul 26, 2018 #8

    Nugatory

    Staff: Mentor

    I am still not clear on why you are saying that the Shannon entropy of a system with two nearby particles is different from the Shannon entropy when they are more widely separated. Can you show the calculation of the entropy under both conditions?
     
  10. Jul 26, 2018 #9
    If a gas expands, its entropy increases. So how would a simplified gas of just two particles not follow the same entropy increase?
     
  11. Jul 26, 2018 #10

    PeterDonis

    Staff: Mentor

    Because a system of two particles is not a gas.
     
  12. Jul 27, 2018 #11

    Demystifier

    Science Advisor

    Entropy is not a property of particles. Entropy is a property of a gas as a whole, when its detailed particle content is ignored. More generally, entropy is something you can say about a big system when you don't know all the details of the small constituents comprising it. If physicists were superbeings who knew all the details about everything, then they would not need the concept of entropy. Metaphorically, if particles are made by God, then entropy is made by men.

    That being said, the answer to your question is now easy. Two particles do not have an entropy in the way a gas does because, even if you can know every detail about 2 particles, you cannot know every detail about ##10^{23}## particles.

    For more details, I highly recommend
    https://www.amazon.com/Entropy-Demystified-Second-Reduced-Common/dp/9813100125
     
    Last edited: Jul 27, 2018
  13. Jul 27, 2018 #12

    Grinkle

    Gold Member

    Consider a one-dimensional, bounded two-particle system (not a gas) at equilibrium, with 8 slots; each particle is a "1".

    State 1: 00000011
    State 2: 10000001

    Do you consider State 1 to have a different entropy than State 2, given that going from State 1 to State 2 does not change the equilibrium of the system?

    I hope I have properly posed the example.

    Now let the system change and come to a new equilibrium.

    State 3: 010000001

    I added one slot; the "volume", if you will, of my 2-particle system is larger.

    Do you consider State 3 to have a different entropy than States 1 or 2?
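    To put rough numbers on this, here is a minimal sketch in Python (my own illustration, not from the thread). It assumes the macrostate is just "two indistinguishable particles somewhere among the slots" and that every arrangement is equally likely, so the entropy is the logarithm of the number of arrangements:

    ```python
    from math import comb, log2

    def slot_entropy_bits(n_slots, n_particles=2):
        """Entropy (in bits) of the macrostate 'n_particles indistinguishable
        particles somewhere in n_slots', assuming every arrangement is
        equally likely."""
        microstates = comb(n_slots, n_particles)  # number of possible arrangements
        return log2(microstates)

    print(slot_entropy_bits(8))  # log2(C(8,2)) = log2(28) ~ 4.81 bits (States 1 and 2)
    print(slot_entropy_bits(9))  # log2(C(9,2)) = log2(36) ~ 5.17 bits (State 3)
    ```

    On this counting, States 1 and 2 are just two different microstates of the same 8-slot macrostate, so they have the same entropy; only adding the ninth slot changes the count.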
     
  14. Jul 27, 2018 #13

    Demystifier

    Science Advisor

    Suppose that someone gives you a closed book and tells you that this book contains only one letter, say the letter "A", written on one of the pages of the book. All the other pages are empty. How much information does that book contain?

    At first sight, not much. However, it really contains more information than it seems at first. The letter "A" is written at some definite place on some definite page, and the information about the exact place of the letter is information too. If the book contains ##N## pages, then the Shannon information about the page on which the letter is written is ##\ln N##. So the bigger the book, the more information it contains, even when it contains only one letter.

    Expansion of a gas with a fixed number of particles is similar to an "expansion" of a book (that is, increasing the number of pages) with a fixed number of letters.
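    As a rough illustration of the book analogy (my own sketch, assuming all ##N## pages are equally likely to hold the letter):

    ```python
    from math import log, log2

    def page_information(n_pages):
        """Information needed to specify which of n_pages equally likely
        pages holds the single letter: ln(n_pages) nats, log2(n_pages) bits."""
        return log(n_pages), log2(n_pages)

    for n in (100, 200, 400):  # "expanding" the book while keeping one letter
        nats, bits = page_information(n)
        print(f"{n} pages: {nats:.2f} nats = {bits:.2f} bits")
    # Each doubling of the page count adds ln 2 ~ 0.69 nats, i.e. exactly one bit.
    ```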
     
  15. Jul 27, 2018 #14
    So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?
    What is the cutoff point, Mr. Adonis? Sounds like a massive paradox to me.
     
  16. Jul 27, 2018 #15
    Again, why is there a cutoff in not being able to know about n particles? Also, you should be able to know exactly as much about each particle in a ##10^{23}##-particle mass as you can about each particle in a 2-particle mass. The HUP doesn't care how many particles you are looking at. And you can't know everything about a particle, again because of the HUP.
     
  17. Jul 27, 2018 #16

    PeterDonis

    Staff: Mentor

    There is no hard cutoff (see below), but the extremes are certainly easily distinguished. A box of 1 liter of hydrogen gas at room temperature has about ##10^{22}## atoms. Your intuitions about how gases work are based on collections of that many atoms or more. Claiming that things should work exactly the same for a system of 2 atoms shows a huge failure to understand the issue.

    Not a paradox, but a failure to realize that the term "gas", like most terms, does not have crisp, precise boundaries.

    https://en.wikipedia.org/wiki/Sorites_paradox

    To use the example given in that article, two grains of sand do not make a heap, but ##10^{22}## grains certainly would. Yet there is no precise number of grains of sand at which you transition between "heap" and "not heap".
     
  18. Jul 27, 2018 #17
    So is there a slight disconnect between the concept of a system's information content and its entropy? What I don't get is that if a system gains information, it should take more to describe it. But it seems like, apart from each particle's intrinsic values, each particle can be described by an x, y, z position that doesn't need more information to describe.

    I have read a college professor's post saying that negative entanglement entropy cancels out the increase in standard entropy, which keeps the information content the same.

    Is information in the sense of an expanding gas actually growing?
    I think my concept of information might need a different definition than Shannon entropy.

    Leonard Susskind believed that the idea of information being destroyed was an abomination. Likewise, I'm sure he would say the creation of information from nothing is an equal abomination, because entropy decreases are feasible (although improbable), and an entropy decrease would mean information destruction if Shannon entropy truly means information.
     
  19. Jul 27, 2018 #18

    PeterDonis

    Staff: Mentor

    This is a classical model, not a quantum model. In a quantum model, the particle is described by a state vector in a Hilbert space; the x, y, and z components of position are parameters that pick out which particular state vector it is. But the Hilbert space itself is not the 3-dimensional space of the x, y, z position vector.

    This is a somewhat different sense of the word "information" from the one you're using when you ask about the relationship between information and entropy. When Susskind talks about information not being destroyed, he is referring to quantum unitary evolution; basically he is claiming that unitary evolution can never be violated. But that just means that, as far as the quantum state of an entire system is concerned, its evolution is deterministic: if you know the state at one instant, you know it for all time. And if you know the system's exact state for all time, its entropy is always zero, by definition.

    However, if the entire system contains multiple subsystems (such as multiple particles), then it might be possible to assign a nonzero entropy to the individual subsystems, because the subsystems might not have definite states due to entanglement. This is the sort of case the professor you mentioned was talking about. For example, suppose we have a two-electron system in the singlet spin state (i.e., total spin zero); for simplicity we'll ignore their positions (if it matters, consider them to be in some bound state like an atomic orbital with no transition possible). The total entropy of this system is zero, because we know its exact state. But each individual electron has a nonzero positive entropy, because it doesn't have a definite state; its spin could turn out to be in any direction when measured. However, there is also a negative entropy due to entanglement; this is because the electron spins must be opposite, so once we have measured one electron, we know the directions of both electrons' spins. So the total entropy is still zero for the system as a whole.
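    A numerical sketch of that last paragraph (my own illustration using NumPy, not from the thread; the singlet state and the von Neumann entropy ##S = -\sum_i p_i \ln p_i## are standard):

    ```python
    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -sum_i p_i ln p_i over the eigenvalues of the density matrix."""
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]  # drop (numerically) zero eigenvalues, since 0 ln 0 = 0
        return float(-np.sum(p * np.log(p)))

    # Singlet state |psi> = (|01> - |10>) / sqrt(2) in the basis {|00>, |01>, |10>, |11>}
    psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())  # density matrix of the full two-spin system

    # Reduced state of one electron: partial trace over the other spin
    rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

    print(von_neumann_entropy(rho))    # ~0     : the whole system is in a known pure state
    print(von_neumann_entropy(rho_A))  # ~ln 2  : one electron alone looks maximally mixed
    ```

    The total entropy is zero because the two-electron state is exactly known, while each electron by itself carries ln 2 (one bit) of entropy; the correlation due to entanglement accounts for the difference.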
     
  20. Jul 27, 2018 #19

    Grinkle

    Gold Member

    If one doubles the pages in @Demystifier's book, then there are twice as many possible configurations for the book to be in. The letter is still only on a single page, but it now requires more information to say which page, because there are twice as many pages to consider.

    Two particles being close or far does not necessarily change system entropy; the two microstates (particles close and particles farther apart) might both yield the same macro level thermodynamic state for the system.
     
  21. Jul 27, 2018 #20

    PeterDonis

    Staff: Mentor

    It's not so much the macro level thermodynamic state (a system of two particles isn't usefully viewed using the thermodynamic approximation) as the fact that the set of possible two-particle position states is the same whether the particles are close together or far apart. So the "number of pages in the book" stays the same, all that changes is which of the "pages" each "letter" (particle) is on.
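    In terms of the 8-slot toy model from earlier in the thread (again my own sketch; the particular slot positions are just illustrative):

    ```python
    from itertools import combinations
    from math import log2

    N_SLOTS = 8
    # Every possible placement of 2 indistinguishable particles in 8 slots
    microstates = list(combinations(range(N_SLOTS), 2))

    close = (6, 7)  # "00000011" - particles adjacent
    far = (0, 7)    # "10000001" - particles at opposite ends

    # Both configurations are single members of the same set of possibilities,
    # so specifying either one costs the same number of bits.
    print(len(microstates))                          # 28 possible "pages"
    print(close in microstates, far in microstates)  # True True
    print(log2(len(microstates)))                    # ~4.81 bits either way
    ```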
     