For example, if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart? Shouldn't the information content be the same for the macrosystem?
Wolfenstein3d said:if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart?
Wolfenstein3d said:What do you not understand? I laid the question out pretty concretely
Wolfenstein3d said:I laid the question out pretty concretely
Wolfenstein3d said:since Shannon entropy is considered information, where does this new information come from?
I am still not clear on why you are saying that the Shannon entropy of a system with two nearby particles is different from the Shannon entropy when they are more widely separated. Can you show the calculation of the entropy under both conditions?
Wolfenstein3d said:My question is, since Shannon entropy is considered information, where does this new information come from?
Wolfenstein3d said:If a gas expands its entropy increases. So how would a simplification of two gas particles not follow the same entropy increase?
Entropy is not a property of particles. Entropy is a property of a gas as a whole, when its detailed particle content is ignored. More generally, entropy is something you can say about a big system when you don't know all the details of the small constituents comprising the big system. If physicists were superbeings who knew all the details about everything, then they would not need the concept of entropy. Metaphorically, if particles are made by God, then entropy is made by men.
Wolfenstein3d said:If a gas expands its entropy increases. So how would a simplification of two gas particles not follow the same entropy increase?
Wolfenstein3d said:So how would a simplification of two gas particles not follow the same entropy increase?
Suppose that someone gives you a closed book and tells you that it contains only one letter, say the letter "A", written on one of the pages. All the other pages are empty. How much information does that book contain?
Wolfenstein3d said:My question is, since Shannon entropy is considered information, where does this new information come from?
Demystifier said:Entropy is not a property of particles. Entropy is a property of a gas as a whole, when its detailed particle content is ignored. More generally, entropy is something you can say about a big system when you don't know all the details of the small constituents comprising the big system. If physicists were superbeings who knew all the details about everything, then they would not need the concept of entropy. Metaphorically, if particles are made by God, then entropy is made by men.
That being said, now the answer to your question is easy. Two particles do not have an entropy in the way a gas does because, even if you can know every detail about 2 particles, you cannot know every detail about ##10^{23}## particles.
For more details, I highly recommend
https://www.amazon.com/dp/9813100125/?tag=pfamazon01-20
Wolfenstein3d said:So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?
Wolfenstein3d said:What is the cutoff point, Mr. Adonis? Sounds like a massive paradox to me.
Demystifier said:Suppose that someone gives you a closed book and tells you that it contains only one letter, say the letter "A", written on one of the pages. All the other pages are empty. How much information does that book contain?
At first sight, not much. However, it really contains more information than it seems at first. The letter "A" is written at some definite place on some definite page, and the information about the exact place of the letter is information too. If the book contains ##N## pages, then the Shannon information about the page on which the letter is written is ##\ln N##. So the bigger the book, the more information it contains, even when it contains only one letter.
Expansion of a gas with a fixed number of particles is similar to an "expansion" of a book (that is, increasing the number of pages) with a fixed number of letters.
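The book analogy lends itself to a quick numerical check. Below is a toy sketch (the function name is my own, not from the thread): specifying which of ##N## equally likely pages holds the letter takes ##\log_2 N## bits, so doubling the number of pages always adds exactly one bit.

```python
import math

def letter_position_info_bits(num_pages):
    """Shannon information (in bits) needed to specify which of
    num_pages equally likely pages holds the single letter."""
    return math.log2(num_pages)

# Doubling the page count adds exactly one bit, regardless of
# how big the book already is -- the analog of doubling a gas's volume.
delta = letter_position_info_bits(200) - letter_position_info_bits(100)
```

The same one-bit-per-doubling behavior is what shows up later in the thread as the ##N k \ln 2## entropy increase of a gas expanding into twice the volume.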
Wolfenstein3d said:each particle could be described by an xyz position
Wolfenstein3d said:Leonard Susskind believed that the idea of information being destroyed was an abomination.
Wolfenstein3d said:each particle could be described by an xyz position that doesn't need more info to describe.
Grinkle said:the two microstates (particles close and particles farther apart) might both yield the same macro level thermodynamic state for the system
PeterDonis said:This is a classical model, not a quantum model. In a quantum model, the particle is described by a state vector in a Hilbert space; the x, y, and z components of position are parameters that pick out which particular state vector it is. But the Hilbert space itself is not the 3-dimensional space of the x, y, z position vector.
This is a somewhat different sense of the word "information" from the one you're using when you ask about the relationship between information and entropy. When Susskind talks about information not being destroyed, he is referring to quantum unitary evolution; basically he is claiming that unitary evolution can never be violated. But that just means that, as far as the quantum state of an entire system is concerned, its evolution is deterministic: if you know the state at one instant, you know it for all time. And if you know the system's exact state for all time, its entropy is always zero, by definition.
However, if the entire system contains multiple subsystems (such as multiple particles), then it might be possible to assign a nonzero entropy to the individual subsystems, because the subsystems might not have definite states due to entanglement. This is the sort of case the professor you mentioned was talking about. For example, suppose we have a two-electron system in the singlet spin state (i.e., total spin zero); for simplicity we'll ignore their positions (if it matters, consider them to be in some bound state like an atomic orbital with no transition possible). The total entropy of this system is zero, because we know its exact state. But each individual electron has a nonzero positive entropy, because it doesn't have a definite state; its spin could turn out to be in any direction when measured. However, there is also a negative entropy due to entanglement; this is because the electron spins must be opposite, so once we have measured one electron, we know the directions of both electrons' spins. So the total entropy is still zero for the system as a whole.
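The singlet example above can be verified with a small density-matrix calculation. This is a sketch assuming NumPy is available (the helper names are mine): the pair as a whole is in a known pure state, so its von Neumann entropy is zero, while each electron alone carries exactly 1 bit of entropy.

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2) in the basis |00>,|01>,|10>,|11>
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho = np.outer(psi, psi)                 # density matrix of the two-electron pair

def von_neumann_entropy_bits(r):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(r)
    evals = evals[evals > 1e-12]         # drop zero eigenvalues (0*log 0 = 0)
    return float(-np.sum(evals * np.log2(evals)))

# Partial trace over the second electron gives the reduced state of the first
rho4 = rho.reshape(2, 2, 2, 2)           # indices [a, b, a', b']
rho_A = np.einsum('abcb->ac', rho4)      # sum over the second electron's index

total_S = von_neumann_entropy_bits(rho)    # 0: the pair's state is known exactly
single_S = von_neumann_entropy_bits(rho_A) # 1 bit: one electron alone looks random
```

The reduced state `rho_A` comes out as the maximally mixed state ##I/2##, which is exactly the "its spin could turn out to be in any direction" statement in numerical form.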
Wolfenstein3d said:Could you elaborate a bit more on why Susskind called it information?
We'd have to ask him for a definitive answer... but it sure looks as if he needed a word, so he chose the word in the English language that came closest to what he meant.
Wolfenstein3d said:Interesting. Could you elaborate a bit more on why Susskind called it information?
PeterDonis said:There is no hard cutoff (see below), but the extremes are certainly easily distinguished...
Not a paradox, but a failure to realize that the term "gas", like most terms, does not have crisp, precise boundaries.
https://en.wikipedia.org/wiki/Sorites_paradox
To use the example given in that article, two grains of sand does not make a heap, but ##10^{22}## grains certainly would. Yet there is no precise number of grains of sand where you transition between "heap" and "not heap".
Wolfenstein3d said:For ex. if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart?
It has nothing to do with any particular number of particles. Rather, it is a matter of how the observer chooses to group microstates into macrostates.
Wolfenstein3d said:So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?
If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room left over, because a "floating point number" contains an infinite amount of information.
Philip Koeck said:we get N k ln 2. We can also get the same result for the entropy change using Boltzmann's formula for statistical entropy.
If we now assume that we can describe this gas (as suggested by Wolfenstein3d) by a list of 6 floating point numbers per particle (for position and velocity), the information needed to describe the gas doesn't change when the gas expands by a factor of 2.
SlowThinker said:a "floating point number" contains an infinite amount of information.
PeterDonis said:No, it doesn't. If it did, it couldn't be represented by a finite number of bits, which our computers do all the time.
SlowThinker said:I'm not sure what to take from the second part of your post. Internal energy cannot decrease if you're not taking energy from the gas, but the temperature of a non-ideal gas can. See the Joule-Thomson effect.
This might be the answer I'm looking for. If I understand correctly, you are saying that whenever I've described the particles with a limited number of bits, I will always need 1 more bit per particle when the volume is suddenly doubled, since I need to state whether the particle is at some position in the original half of the container or at a corresponding position in the new half. So a doubling of the volume changes the information needed to describe the whole gas by N bits, in agreement with the increase of entropy. This should be true no matter whether the particles are distinguishable or not.
SlowThinker said:If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room left over, because a "floating point number" contains an infinite amount of information.
You should use rational or integer numbers, or any other numbers with fixed resolution. In that case, for each particle, you're adding exactly 1 bit of information: is the particle in the original volume, or in the added volume? After that, the information required to describe the position, speed, vibrations etc. is the same as before. So you get an entropy increase of ##N k \ln 2## ("unfortunately" entropy is not measured in bits).
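The bit-counting argument above translates directly into the thermodynamic ##\Delta S = N k \ln 2##. A minimal sketch (the constant and function name are my own choices, not from the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def doubling_entropy_increase(n_particles):
    """Entropy increase when n_particles expand into twice the volume.

    Each particle needs exactly one extra bit (original half or new half),
    and one bit corresponds to k_B * ln 2 of thermodynamic entropy.
    """
    extra_bits = n_particles                    # 1 bit per particle
    delta_s = n_particles * k_B * math.log(2)   # Delta S = N k ln 2
    return extra_bits, delta_s

# For roughly a mole of gas this should give about 5.76 J/K,
# the familiar R ln 2 of free expansion into double the volume.
bits, dS = doubling_entropy_increase(6.022e23)
```

The conversion factor ##k \ln 2## per bit is the only step separating "information measured in bits" from "entropy measured in J/K".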
I would say the second part (starting with "To me it seems ...") of my post quoted above is wrong. The extra information needed when doubling the volume should actually be 1 bit per particle.
Philip Koeck said:What I actually meant by floating point was 4-byte fp as it is stored in the computer. Obviously those numbers are always rounded and have finite resolution. My argument is that if you can describe each particle by six such numbers at a certain time, you have all the information you need to predict the positions and velocities of all particles at a later time. Obviously this prediction will be inaccurate due to the rounding. To me it seems that this information doesn't change drastically when the volume is suddenly doubled. At least for not too distant times the prediction will be almost as accurate. Clearly this way of measuring information depends on the distinguishability and trackability of the particles. It looks like information (measured as described) doesn't change a lot, whereas entropy does (according to experiment). The conclusion could be that describing gas particles by 6 numbers each doesn't make sense, because they are indistinguishable.
I might have an entirely classical argument that the amount of information needed to describe a collection of particles does increase when the available volume increases. The important point is how you measure the information needed to describe the particles. In classical mechanics you would say that you need six numbers (three coordinates and three velocity components) to completely specify one particle. For N particles you obviously need 6N numbers, and it doesn't matter how big N is! If only elastic collisions happen, you have all the information needed to predict where every particle will be and how it will move for the rest of time, assuming that your numbers have infinite accuracy. The latter means, of course, that you have infinite information.
Wolfenstein3d said:Like entropy increases if two particles spread out. Why does that mean information has increased when the positions (xyz) only take 3 coordinates in both former and latter cases?
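The fixed-resolution bookkeeping discussed in this thread has one more noteworthy feature, sketched below (a toy model; the names are mine): the cost of doubling the volume is 1 bit per particle regardless of how fine a resolution you choose, so the ##N k \ln 2## result doesn't depend on the arbitrary cutoff.

```python
import math

def position_bits(volume, cell_volume):
    """Bits needed to specify which of volume/cell_volume grid cells a
    particle occupies. A toy fixed-resolution model; velocity would add
    a separate, volume-independent cost per particle."""
    return math.log2(volume / cell_volume)

# The extra cost of doubling the volume is exactly 1 bit per particle,
# for a coarse grid or an absurdly fine one alike.
extra_coarse = position_bits(2.0, 1e-6) - position_bits(1.0, 1e-6)
extra_fine = position_bits(2.0, 1e-30) - position_bits(1.0, 1e-30)
```

This is why only entropy *differences* are well defined classically: the absolute bit count diverges as the resolution goes to zero (the "infinite information" above), but the difference on doubling the volume stays at one bit per particle.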