Information stored in the initial condition of an ideal gas

  • #1
CY_Leung

Homework Statement


1 mm³ of gas at normal pressure and temperature contains about 10¹⁵ particles. Considering the particles as point-like and classical, provide a rough, conservative estimate of how many hard drives would be necessary to store the initial conditions of all the gas particles. (As of 2013, a normal hard drive can store about 5 TB of data.)

Homework Equations


This is a first course in statistical mechanics, and we have not actually covered information theory in class, beyond the mathematical expression for the information entropy. The following is all I know:
1 bit (1 Shannon) is defined as the amount of information of an event that occurs with a probability of 1/2.
Also, 1 byte = 8 bits, 1 KB = 1000 bytes, 1 MB = 1000 KB, 1 GB = 1000 MB, 1 TB = 1000 GB.
For an event with probability p, the corresponding amount of information is -log₂ p bits.
The expected number of bits over a sample space is then -Σᵢ pᵢ log₂ pᵢ, which is the information entropy.
For a continuous probability distribution, one may write the information entropy as:
-∫ p(x) log₂ p(x) dx
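For concreteness, here is a minimal numerical sketch of these two definitions (the fair and biased coins are just example distributions, not part of the problem):
[code]
import math

# Information content of a single event with probability p, in bits: -log2(p)
def info_bits(p):
    return -math.log2(p)

# Shannon entropy of a discrete distribution, in bits: -sum(p_i * log2(p_i))
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(info_bits(0.5))            # 1.0 bit, matching the definition of 1 Shannon
print(entropy_bits([0.5, 0.5]))  # 1.0 bit for a fair coin
print(entropy_bits([0.9, 0.1]))  # ~0.47 bits for a biased coin
[/code]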

The Attempt at a Solution


Each classical, point-like particle can be fully described by (x, v) (or (x, p); it doesn't really matter here). My interpretation of the question is: calculate the information entropy of the gas using logarithms to base 2.
Since information entropy is additive, i.e. for independent events the entropies simply add,
I(p₁p₂) = I(p₁) + I(p₂) (read that from Wikipedia)
Thus velocities and positions of the particles can be considered separately, and each individual particle can also be considered separately.
First consider the velocity of a particle. What immediately comes to mind is the Maxwell-Boltzmann speed distribution. For ease of expression, it can be written as:
p(v) = Cv²e^(-βv²), where C is the normalization constant and β = m/(2kT).
Do the integration:
[tex]I(v)=-C\int_{0}^{\infty}v^{2}e^{-\beta v^{2}}\log_{2}\left(v^{2}e^{-\beta v^{2}}\right)dv
=-\frac{C}{\ln 2}\int_{0}^{\infty}v^{2}e^{-\beta v^{2}}\left(2\ln v-\beta v^{2}\right)dv[/tex]
Since the gas is under normal conditions, the velocities of the particles are not too large and we may omit the [tex]\ln v[/tex] term. With this approximation the integral comes out to 3/(2 ln 2) according to my computation.
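As a quick numerical check of that value, here is a sketch of the truncated integral (β is set to 1 for convenience; after dropping the ln v term the result is independent of β):
[code]
import numpy as np
from scipy.integrate import quad

beta = 1.0
C = 4 * beta**1.5 / np.sqrt(np.pi)  # normalization of p(v) = C v^2 exp(-beta v^2) on [0, inf)

# Integrand with the 2 ln v term dropped: (C / ln 2) * beta * v^4 * exp(-beta v^2)
integrand = lambda v: (C / np.log(2)) * beta * v**4 * np.exp(-beta * v**2)

value, _ = quad(integrand, 0, np.inf)
print(value, 3 / (2 * np.log(2)))  # both ~ 2.164 bits
[/code]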
But I have no idea how to do this for particle positions. I attempted in the following way:
[tex]I(\overrightarrow{x})=-\int_{V} p(\overrightarrow{x})\log_{2} p(\overrightarrow{x})\, d\overrightarrow{x}[/tex]
where [tex]V[/tex] is the volume occupied by the gas and [tex]p(\overrightarrow{x})[/tex] is the uniform probability density. But now comes the problem: how should I take the logarithm of the uniform probability density, which is not dimensionless and depends on the unit we use for volume?
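Explicitly, with the uniform density [tex]p(\overrightarrow{x})=1/V[/tex] the integral gives
[tex]I(\overrightarrow{x})=-\int_{V}\frac{1}{V}\log_{2}\frac{1}{V}\, d\overrightarrow{x}=\log_{2}V,[/tex]
and since V carries units of volume, the value of log₂ V shifts by an additive constant whenever the unit (mm³, m³, ...) is changed.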
Or is the above approach correct at all? Thanks.
 
  • #2
DrClaude
My feeling is that you are overthinking it. I would take a much simpler approach: how much information is needed to store the state of one particle times the number of particles.
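In that spirit, a rough back-of-envelope sketch (the choice of 6 coordinates per particle and 8 bytes per coordinate is an assumption, not something fixed by the problem):
[code]
# Rough estimate: (bytes per particle) x (number of particles) / (bytes per drive)

n_particles = 1e15        # from the problem statement
coords_per_particle = 6   # assumed: x, y, z, vx, vy, vz
bytes_per_coord = 8       # assumed: one double-precision float per coordinate

total_bytes = n_particles * coords_per_particle * bytes_per_coord  # 4.8e16 bytes
drive_capacity = 5e12     # 5 TB per hard drive, with 1 TB = 10^12 bytes

print(total_bytes / drive_capacity)  # ~ 9600 drives
[/code]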
 
  • #3
CY_Leung
DrClaude said:
My feeling is that you are overthinking it. I would take a much simpler approach: how much information is needed to store the state of one particle times the number of particles.
This is actually also my approach, but since the positions and velocities of the particles are subject to specific probability distributions, and the number of bits is related to the probability of an event, I used the probability distributions to estimate the amount of information.
 
  • #4
CY_Leung
I mean, I could also take an approach based on the fact that, e.g., a fixed number of bytes is needed to store a floating-point number, but I think the information entropy specifies the lower bound on the amount of information, and the way a computer stores a floating-point number is only one particular encoding, which may not be the most efficient compared with the theoretical minimum?
I am not familiar with this stuff, so please point out my mistakes if you spot them.
 
  • #5
DrClaude
CY_Leung said:
This is actually also my approach, but since the positions and velocities of the particles are subject to specific probability distributions, and the number of bits is related to the probability of an event, I used the probability distributions to estimate the amount of information.
But the initial condition of the gas is not probabilistic. All particles have definite values. It is because of our inability to even list this initial condition that statistics are needed.
 
  • #6
CY_Leung
DrClaude said:
But the initial condition of the gas is not probabilistic. All particles have definite values. It is because of our inability to even list this initial condition that statistics are needed.
Okay, I see what you mean. I interpreted the question as follows: the initial state of the gas is unknown, so if I have some way to measure all 6N values, how much information would I obtain (i.e. the decrease in information entropy, which equals the original information entropy of the system, since there is no uncertainty left about the gas)? That is where probability comes in.
But if I change the question a little, say, how much information do I obtain when I measure the full state of the gas at an arbitrary time, does that make my approach correct?
 

1. What is the initial condition of an ideal gas?

The initial condition of an ideal gas refers to the state of the gas at the beginning of a process or experiment. This includes the gas's pressure, volume, temperature, and number of moles.

2. How is information stored in the initial condition of an ideal gas?

The initial condition of an ideal gas stores information about the gas's properties, such as its pressure, volume, and temperature, which can be used to calculate other variables, such as its internal energy and entropy.

3. What can be learned from the information stored in the initial condition of an ideal gas?

The information stored in the initial condition of an ideal gas can be used to understand the behavior and characteristics of the gas, such as how it responds to changes in temperature and pressure, and to make predictions about its future states.

4. How is the information stored in the initial condition of an ideal gas relevant to real-world applications?

The information stored in the initial condition of an ideal gas is relevant to many real-world applications, such as in the design and operation of engines, refrigeration systems, and industrial processes. It can also help in understanding and predicting the behavior of gases in different environments, such as in the Earth's atmosphere or in outer space.

5. How does the information stored in the initial condition of an ideal gas relate to the laws of thermodynamics?

The information stored in the initial condition of an ideal gas is governed by the laws of thermodynamics, specifically the first and second laws. These laws dictate how the properties of the gas change during a process and provide insights into the energy and entropy changes that occur. The initial condition of the gas is crucial in determining the starting point for these changes.
