If entropy increases, where does the information come from?

  • #26
haruspex
For example, if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart?
So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?
It is not to do with any particular number of particles. Rather, it is how the observer chooses to group microstates into macrostates.
Suppose you have a box 1000 cells in each direction, and two particles that can independently be in any of the 10^9 cells.
In general, there are 10^18 possible states (if the particles are distinct), but it takes less information to describe an arrangement in which the two particles are within 10 cells of each other, in each dimension, because you can choose to describe it in terms of that block of 1000 cells (10^9 possibilities) and their relative positions within the block (10^6 possibilities). [Yes, I know that's grossly overcounting - it's more like 2×10^12 all up.]
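A quick way to see the saving is to compare the two description lengths in bits (my own sketch using the numbers above; the clustered count keeps the same deliberate overcount):

```python
import math

cells_per_dim = 1000
total_cells = cells_per_dim ** 3            # 10^9 cells in the box

# Any arrangement of two distinct particles:
arbitrary_states = total_cells ** 2         # 10^18 possible states
bits_arbitrary = math.log2(arbitrary_states)

# "Close together" description: pick a 10x10x10 block (its origin can be
# roughly any cell, ~10^9 choices) plus both particles' positions among
# the block's 1000 cells (1000^2 = 10^6 choices):
clustered_states = total_cells * 1000 ** 2  # ~10^15 (an overcount, as noted)
bits_clustered = math.log2(clustered_states)

print(round(bits_arbitrary, 1), round(bits_clustered, 1))  # clustered needs ~10 bits fewer
```

The exact factors matter less than the point: the clustered macrostate covers fewer microstates, so specifying one of them takes fewer bits.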
 
  • #27
The original post got me thinking about the free expansion of an ideal gas. If an ideal gas of N atoms that's thermally isolated from the surroundings expands freely to twice the starting volume, the temperature stays the same. That means we can calculate the thermodynamic entropy change as if the expansion were reversible and isothermal, and we get N k ln 2. We can also get the same result for the entropy change using Boltzmann's formula for statistical entropy.
If we now assume that we can describe this gas (as suggested by Wolfenstein3d) by a list of 6 floating point numbers per particle (for position and velocity) the information needed to describe the gas doesn't change when the gas expands by a factor of 2.
What is the conclusion from this?
A: Describing a gas in this classical way is not correct (because the particles are indistinguishable, quantum mechanical, non-classical ...).
B: There is no connection between statistical entropy and information.
Other suggestions?
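For concreteness, the agreement of the two routes can be put side by side in a short sketch (my own illustration; one mole of gas assumed):

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
N = 6.022e23          # one mole of atoms (illustrative)

# Thermodynamic route: treat the free expansion V -> 2V as if it were
# reversible and isothermal: dS = N k ln(V2/V1) = N k ln 2.
dS_thermo = N * k * math.log(2)

# Statistical route: each particle's accessible volume doubles, so the
# microstate count gains a factor 2^N; S = k ln(Omega) then gives
# dS = k ln(2^N) = N k ln 2 (the exponent comes down as a factor N).
dS_stat = k * N * math.log(2)

print(round(dS_thermo, 3))   # about 5.763 J/K, i.e. R ln 2 for one mole
```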

Now let's assume a gas of particles that can be described by 6N numbers just like in classical mechanics. Think of super-heavy, hypothetical atoms or atom clusters consisting of millions of protons and neutrons. It seems to me that something has to change now: the information needed to describe this system is clearly independent of the volume. If that means that the entropy is also constant during free expansion, then the temperature cannot be constant. The free expansion would have to be equivalent to a reversible adiabatic expansion, and the temperature would decrease.
Are there any systems like this where a free expansion leads to a temperature decrease?
 
  • #28
we get N k ln 2. We can also get the same result for the entropy change using Boltzmann's formula for statistical entropy.
If we now assume that we can describe this gas (as suggested by Wolfenstein3d) by a list of 6 floating point numbers per particle (for position and velocity) the information needed to describe the gas doesn't change when the gas expands by a factor of 2.
If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room in there, because a "floating point number" contains an infinite amount of information.
You should use rational or integer numbers or any other numbers with fixed resolution. In that case, for each particle, you're adding exactly 1 bit of information: is the particle in the original, or in the added volume? After that, the information required to describe the position, speed, vibrations etc. is the same as before. So you get N k ln 2 of entropy increase ("unfortunately" entropy is not measured in bits).
The Sackur-Tetrode formula may be of interest.

I'm not sure what to take from the second part of your post. Internal energy cannot decrease if you're not taking energy from the gas, but temperature of a non-ideal gas can. See Joule-Thomson effect.
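To make the Sackur-Tetrode pointer concrete, here is a sketch of the formula for a monatomic ideal gas, checking that doubling V at fixed U gives exactly N k ln 2 (the helium numbers are merely illustrative):

```python
import math

k = 1.380649e-23       # Boltzmann constant, J/K
h = 6.62607015e-34     # Planck constant, J s

def sackur_tetrode(N, V, U, m):
    """Entropy of a monatomic ideal gas (Sackur-Tetrode formula), in J/K."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h * h)) ** 1.5) + 2.5)

# One mole of helium at roughly room temperature (illustrative numbers):
N = 6.022e23
m = 6.64e-27            # kg, mass of one He atom
U = 1.5 * N * k * 300   # U = (3/2) N k T at T = 300 K
V = 0.0245              # m^3, roughly the molar volume at 300 K

dS = sackur_tetrode(N, 2 * V, U, m) - sackur_tetrode(N, V, U, m)
print(dS / (N * k * math.log(2)))   # ratio is 1 (up to rounding): dS = N k ln 2
```

Since U (and hence T) is unchanged in free expansion, only the V/N factor inside the logarithm changes, and the volume doubling contributes exactly ln 2 per particle.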
 
  • #29
PeterDonis
a "floating point number" contains an infinite amount of information.

No, it doesn't. If it did, it couldn't be represented by a finite number of bits, which our computers do all the time.

A floating point number contains some finite number of bits describing the mantissa, and some finite number of bits describing the exponent. That's a finite amount of information. The fact that the exponent varies the "resolution" doesn't mean there is infinite information in the number.

What you might be trying to describe here is real numbers, which do have "infinite resolution"; but real numbers are not the same as floating point numbers.
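This is easy to check directly; a minimal sketch unpacking the 32 bits of a 4-byte IEEE 754 float:

```python
import struct

# A 4-byte float is exactly 32 bits: 1 sign bit, 8 exponent bits,
# 23 mantissa bits -- a finite amount of information.
raw = struct.pack('>f', 3.14159)             # big-endian float32, 4 bytes
bits = ''.join(f'{b:08b}' for b in raw)      # the 32 bits as a string

sign, exponent, mantissa = bits[0], bits[1:9], bits[9:]
print(len(bits))                             # 32
print(sign, exponent, mantissa)
```

The varying exponent shifts where the 23 mantissa bits of resolution sit on the number line, but the total information is fixed at 32 bits per number.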
 
  • #30
No, it doesn't. If it did, it couldn't be represented by a finite number of bits, which our computers do all the time.

What I actually meant by floating point was 4-byte floating point as it is stored in a computer. Obviously those numbers are always rounded and have finite resolution. My argument is that if you can describe each particle by six such numbers at a certain time, you have all the information you need to predict the positions and velocities of all particles at a later time. Obviously this prediction will be inaccurate due to the rounding.
To me it seems that this information doesn't change drastically when the volume is suddenly doubled. At least for not-too-distant times the prediction will be almost as accurate. Clearly this way of measuring information depends on the distinguishability and trackability of the particles. It looks like information (measured as described) doesn't change a lot, whereas entropy does (according to experiment). The conclusion could be that describing gas particles by 6 numbers each doesn't make sense, because they are indistinguishable.
 
  • #31
I'm not sure what to take from the second part of your post. Internal energy cannot decrease if you're not taking energy from the gas, but temperature of a non-ideal gas can. See Joule-Thomson effect.

Yes, of course. I didn't see that. For an ideal gas the temperature has to be constant during free expansion, since the internal energy is constant. That would mean that a free expansion with constant entropy contradicts the first law.
 
  • #32
If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room in there, because a "floating point number" contains an infinite amount of information.
You should use rational or integer numbers or any other numbers with fixed resolution. In that case, for each particle, you're adding exactly 1 bit of information: is the particle in the original, or in the added volume? After that, the information required to describe the position, speed, vibrations etc. is the same as before. So you get N k ln 2 of entropy increase ("unfortunately" entropy is not measured in bits).
This might be the answer I'm looking for. If I understand correctly you are saying that whenever I've described the particles with a limited number of bits I will always need 1 more bit per particle when the volume is suddenly doubled since I need to state whether the particle is at some position in the original half of the container or in a corresponding position in the new half. So a doubling of the volume changes the information needed to describe the whole gas by N bits, in agreement with the increase of entropy. This should be true no matter whether particles are distinguishable or not.
 
  • #33
What I actually meant by floating point was 4-byte floating point as it is stored in a computer. Obviously those numbers are always rounded and have finite resolution. My argument is that if you can describe each particle by six such numbers at a certain time, you have all the information you need to predict the positions and velocities of all particles at a later time. Obviously this prediction will be inaccurate due to the rounding.
To me it seems that this information doesn't change drastically when the volume is suddenly doubled. At least for not-too-distant times the prediction will be almost as accurate. Clearly this way of measuring information depends on the distinguishability and trackability of the particles. It looks like information (measured as described) doesn't change a lot, whereas entropy does (according to experiment). The conclusion could be that describing gas particles by 6 numbers each doesn't make sense, because they are indistinguishable.
I would say the second part (starting with "To me it seems ...") of my post quoted above is wrong. The extra information needed when doubling the volume should actually be 1 bit per particle.
 
  • #34
Like entropy increases if two particles spread out. Why does that mean information has increased when the positions (xyz) only take 3 coordinates in both former and latter cases?
I might have an entirely classical argument that the amount of information needed to describe a collection of particles does increase when the available volume increases. The important point is how you measure the information needed to describe the particles. In classical mechanics you would say that you need six numbers (three coordinates and three velocity components) to completely specify one particle. For N particles you obviously need 6N numbers, and it doesn't matter how big N is! If only elastic collisions happen, you have all the information needed to predict where every particle will be and how it will move for the rest of time, assuming that your numbers have infinite accuracy. The latter means, of course, that you have infinite information.
Now let's make it more realistic and assume that every coordinate and velocity component has limited accuracy. If we want to specify for example the x-coordinate of a particle in a box of size L in the x-direction we can start by saying whether it's in the left or the right half. That requires 1 bit of information. To improve the accuracy we can add 1 bit which specifies whether the particle is in the left or the right half of the half specified by the first bit. We continue adding bits until we've reached the required accuracy. The number of bits we needed is the information. Note that every number we can store in a computer has an inbuilt accuracy.
Now comes the key thought: If we now double the volume by making the length of the box 2L we need one extra bit in order to specify the x-coordinate of one particle with the same accuracy. That means the total information needed to specify the gas (or collection of particles) increases by N bits when the volume is doubled.
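That halving procedure can be checked with a few lines (a sketch; the box size and accuracy are arbitrary illustrative values):

```python
import math

def bits_for_position(L, accuracy):
    """Bits needed to localize one coordinate in a box of length L to
    within the given accuracy, by repeatedly halving the interval."""
    return math.ceil(math.log2(L / accuracy))

L, acc = 1.0, 1e-6        # a 1 m box located to micrometre accuracy
b1 = bits_for_position(L, acc)
b2 = bits_for_position(2 * L, acc)

print(b1, b2, b2 - b1)    # doubling the box costs exactly 1 extra bit
```

With N particles that is N extra bits for the x-coordinates when the box length doubles, matching the N k ln 2 entropy increase of the free expansion.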
 