If entropy increases, where does the information come from?

  • Context: Graduate
  • Thread starter: Wolfenstein3d
  • Tags: Entropy, Information
SUMMARY

The discussion centers on the relationship between entropy and information, specifically in the context of two particles and their spatial configurations. Participants debate whether the Shannon entropy of a system changes when two particles are close together versus when they are far apart. It is established that entropy is a property of a system as a whole, rather than individual particles, and that the information content can increase with the expansion of a system, similar to how a book with more pages contains more information about the location of a letter. The conversation highlights the complexities of defining information in relation to entropy, particularly in quantum mechanics.

PREREQUISITES
  • Understanding of Shannon entropy and its calculation
  • Familiarity with classical thermodynamics and the concept of microstates
  • Basic knowledge of quantum mechanics and state vectors
  • Concept of entanglement and its impact on entropy
NEXT STEPS
  • Research the calculation of Shannon entropy for different systems
  • Explore the relationship between entropy and information in quantum mechanics
  • Study classical thermodynamics, focusing on microstates and macrostates
  • Investigate the implications of entanglement on the entropy of quantum systems
USEFUL FOR

Physicists, students of thermodynamics and quantum mechanics, and anyone interested in the foundational concepts of information theory and entropy.

  • #31
SlowThinker said:
I'm not sure what to take from the second part of your post. Internal energy cannot decrease if you're not taking energy from the gas, but the temperature of a non-ideal gas can. See the Joule-Thomson effect.

Yes, of course. I didn't see that. For an ideal gas the temperature has to be constant during free expansion, since the internal energy is constant. That would mean that a free expansion with constant entropy would contradict the first law.
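As a quick numerical check (a sketch, not from the thread, assuming one mole of an ideal gas): the first law gives ΔU = 0 for free expansion (no heat exchanged, no work done), so T stays constant, and the entropy change ΔS = nR ln(V2/V1) is strictly positive for any expansion, so the entropy cannot stay constant.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def free_expansion_entropy_change(n_moles, V1, V2):
    """Entropy change of an ideal gas in free expansion:
    Delta U = 0 (no heat, no work), T constant, dS = n R ln(V2/V1)."""
    return n_moles * R * math.log(V2 / V1)

# Doubling the volume of one mole (hypothetical numbers):
print(free_expansion_entropy_change(1.0, 1.0, 2.0))  # ~5.76 J/K > 0
```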
 
  • #32
SlowThinker said:
If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room left over, because a "floating point number" contains an infinite amount of information.
You should use rational or integer numbers, or any other numbers with fixed resolution. In that case, for each particle, you're adding exactly 1 bit of information: is the particle in the original volume, or in the added volume? After that, the information required to describe the position, speed, vibrations etc. is the same as before. So you get N k ln 2 of entropy increase ("unfortunately" entropy is not measured in bits).
This might be the answer I'm looking for. If I understand correctly, you are saying that whenever I've described the particles with a limited number of bits, I will always need 1 more bit per particle when the volume is suddenly doubled, since I need to state whether the particle is at some position in the original half of the container or at the corresponding position in the new half. So a doubling of the volume changes the information needed to describe the whole gas by N bits, in agreement with the increase of entropy. This should be true no matter whether the particles are distinguishable or not.
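To make the bookkeeping explicit (a minimal sketch, not from the thread): one extra bit for each of N particles multiplies the number of microstates by 2^N, so with S = k ln W the entropy increase is ΔS = N k ln 2, the same as the thermodynamic ΔS = N k ln(V2/V1) with V2 = 2V1.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_from_extra_bits(N, bits_per_particle=1):
    """One extra bit per particle: W -> W * 2**(N * bits),
    so dS = k_B * ln(2**(N * bits)) = N * bits * k_B * ln 2."""
    return N * bits_per_particle * k_B * math.log(2)

def entropy_from_volume_change(N, V1, V2):
    """Ideal gas at constant T: dS = N * k_B * ln(V2 / V1)."""
    return N * k_B * math.log(V2 / V1)

N = 6.022e23  # one mole of particles (hypothetical choice)
print(entropy_from_extra_bits(N))               # ~5.76 J/K
print(entropy_from_volume_change(N, 1.0, 2.0))  # same value: N k_B ln 2
```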
 
  • Likes: SlowThinker
  • #33
Philip Koeck said:
What I actually meant by floating point was 4-byte fp as it is stored in a computer. Obviously those numbers are always rounded and have finite resolution. My argument is that if you can describe each particle by six such numbers at a certain time, you have all the information you need to predict the positions and velocities of all particles at a later time. Obviously this prediction will be inaccurate due to the rounding. To me it seems that this information doesn't change drastically when the volume is suddenly doubled. At least for not too distant times the prediction will be almost as accurate. Clearly this way of measuring information depends on the distinguishability and trackability of the particles. It looks like information (measured as described) doesn't change a lot, whereas entropy does (according to experiment). The conclusion could be that describing gas particles by 6 numbers each doesn't make sense, because they are indistinguishable.
I would say the second part (starting with "To me it seems ...") of my post quoted above is wrong. The extra information needed when doubling the volume should actually be 1 bit per particle.
 
  • #34
Wolfenstein3d said:
For example, entropy increases if two particles spread out. Why does that mean information has increased, when the positions (x, y, z) take only 3 coordinates each in both the former and the latter case?
I might have an entirely classical argument that the amount of information needed to describe a collection of particles does increase when the available volume increases. The important point is how you measure the information needed to describe the particles. In classical mechanics you would say that you need six numbers (three coordinates and three velocity components) to completely specify one particle. For N particles you obviously need 6N numbers, no matter how big N is! If only elastic collisions happen, you have all the information needed to predict where every particle will be and how it will move for the rest of time, assuming that your numbers have infinite accuracy. The latter means, of course, that you have infinite information.
Now let's make it more realistic and assume that every coordinate and velocity component has limited accuracy. If we want to specify, for example, the x-coordinate of a particle in a box of size L in the x-direction, we can start by saying whether it's in the left or the right half. That requires 1 bit of information. To improve the accuracy we can add 1 bit which specifies whether the particle is in the left or the right half of the half specified by the first bit. We continue adding bits until we've reached the required accuracy. The number of bits we needed is the information. Note that every number we can store in a computer has a built-in, finite accuracy.
Now comes the key thought: if we now double the volume by making the length of the box 2L, we need one extra bit in order to specify the x-coordinate of one particle with the same accuracy. That means the total information needed to specify the gas (or collection of particles) increases by N bits when the volume is doubled.
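A small sketch of this bit-counting (hypothetical box length, accuracy and particle number, not from the thread): the number of bits needed to pin down the x-coordinate to within Δx in a box of length L by repeated halving is ceil(log2(L/Δx)), and doubling L adds exactly one bit per particle, N bits in total.

```python
import math

def bits_for_position(L, dx):
    """Bits needed to locate a coordinate in a box of length L to within dx
    by repeatedly halving the interval, as described above."""
    return math.ceil(math.log2(L / dx))

L, dx = 1.0, 1e-9        # hypothetical box length and accuracy (metres)
b1 = bits_for_position(L, dx)       # 30 bits
b2 = bits_for_position(2 * L, dx)   # 31 bits
print(b1, b2, b2 - b1)              # doubling the box adds exactly 1 bit

N = 1000                 # hypothetical number of particles
print(N * (b2 - b1))     # total extra information: N bits when the volume is doubled
```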
 
Last edited:
