# B Calculating entropy for a simple scenario

1. Jan 14, 2017

### zrek

I'd like to create a simple model that demonstrates the basic thermodynamic quantities of an ideal gas. I begin with two rooms with several molecules in them. All of the data for every individual molecule are given (position, mass, velocity, etc.), so I can easily calculate the total energy, pressure, and temperature. Now I open the small door between the two rooms, and I can easily model how the molecules mix, causing the two rooms to eventually balance their temperatures and pressures.
Is there a formula for the entropy of such a system? (If it is necessary, we can say that the available positions of the molecules are limited.)

2. Jan 14, 2017

### Stephen Tashi

The concept of entropy can be related to probability, which in turn, applies to probability spaces defined over the "possible" microstates of a system. If you are simulating a deterministic system to the level of detail of simulating the position and velocity of each particle, you haven't introduced any idea of probability. At a given time in the simulation, the system would be in one particular microstate, so this does not account for any concept of it having a probability for being in each of several "possible" microstates.

You need to introduce the idea of probability. You might be able to do this by running many simulations that begin with some random variation in the initial conditions. Then at time T, the system might be in different microstates on different runs. You could estimate the probability that the system is in a certain microstate by the frequency with which it appears in that microstate across different runs of the simulation.
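As a rough illustration, this frequency-estimation idea could be sketched in Python like the following. The door-crossing dynamics here is a toy stand-in (an Ehrenfest-urn style rule), not a real molecular simulation; the function names are my own:

```python
import random
from collections import Counter

def run_simulation(n_particles=10, n_steps=100, seed=None):
    """Toy stand-in for one deterministic run with randomized initial
    conditions. The recorded 'state' is the number of particles found
    in chamber 1 at the final time."""
    rng = random.Random(seed)
    left = n_particles  # all particles start in chamber 1
    for _ in range(n_steps):
        # one particle crosses the open door each step; the direction
        # depends on how many particles sit on each side
        if rng.random() < left / n_particles:
            left -= 1
        else:
            left += 1
    return left

def estimate_state_probabilities(n_runs=5000):
    """Estimate P(state at final time) by its relative frequency
    across many independent runs."""
    counts = Counter(run_simulation(seed=i) for i in range(n_runs))
    return {state: c / n_runs for state, c in counts.items()}
```

Running `estimate_state_probabilities()` gives an empirical distribution over the coarse states, which is exactly the kind of non-trivial probability distribution an entropy calculation needs.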

3. Jan 16, 2017

### zrek

As far as I know, probability calculations are (in a nutshell) about the ratio of the desired outcomes to all possible states. This can be done even in deterministic scenarios: I don't have to run many simulations if I can enumerate all of the possible ones. This is why I mentioned some limit on the molecule positions (and the other property values). If we assume that this enumeration is possible for a scenario, can you suggest a formula that gives a concrete number for the entropy?

4. Jan 16, 2017

### Stephen Tashi

That only works if all possible states have the same probability.

You describe doing a deterministic simulation down to the level of individual gas molecules and apparently you want to compute entropy as a function of time. That is not a typical textbook problem in thermodynamics (as far as I know). The textbook problem would only ask us to find the change in entropy between the initial state and the state where the gases have reached equilibrium.

You are describing a system that is not in equilibrium. If Wikipedia ( https://en.wikipedia.org/wiki/Non-equilibrium_thermodynamics ) can be believed, entropy is among the quantities that classical thermodynamics does not define for such states.

So we may be out of luck. Perhaps we can define entropy in microscopic terms if we introduce probability into the model. I'll have to think about it.

5. Jan 17, 2017

### Staff: Mentor

Let me understand your question. You have two chambers, with a partition between them. You have the same gas in each chamber. You know the volumes of the two chambers. And you know the initial temperatures and pressures of the gas in each of the two chambers. Now you remove the partition and let the system re-equilibrate adiabatically. You want to know the change in entropy from the initial state of the system to the final equilibrated state of the system. Is this correct? Are you allowed to use classical thermodynamics?

6. Jan 17, 2017

### malemdk

The net increase in entropy will be zero: since the intermixing of the molecules is done under adiabatic conditions, and all of the positions, velocities, etc. are known, the sum of the heat transfer will be zero, so the change in entropy will be zero.

7. Jan 17, 2017

### Staff: Mentor

Is this supposed to be the answer to the problem that I posed in post #6? If so, it is totally incorrect. Here is a link to a Physics Forums Insights article I wrote on how to correctly determine the entropy change for a process imposed on a closed system: https://www.physicsforums.com/insights/grandpa-chets-entropy-recipe/ See if you can figure out how to apply the recipe to this problem.

8. Jan 17, 2017

### zrek

Yes, you described correctly what I want to do. I'd like to create a starting state with no equilibrium, with different temperatures and pressures in the chambers, and watch what happens after I open the door. During the process, while the temperatures balance, I'd like to draw a graph of the key properties: pressure, temperature... and entropy. As far as I know, the entropy should increase. Since I can calculate the temperature and the pressure directly from the data of the particles, I'd like to do the same with the entropy. But I have no idea how to calculate it. Thank you for your help!

9. Jan 17, 2017

### zrek

Then is it not possible to calculate the entropy during a process in a simulated environment? Is there no concept at all for problems like this?
Thank you! I'd be happy if you have a good idea. A concrete model with a simulation would help anyone understand entropy better, I think.

10. Jan 17, 2017

### Staff: Mentor

Are you willing to go to a continuum model? If so, you can get the evolution of the entropy during the transition.

So you are able to get the entropy in the final state, correct?

11. Jan 17, 2017

### Stephen Tashi

For the sake of your other advisors, let's be clear that you want to

1) Model the gas as a finite number of particles
2) Compute entropy as a function of time - i.e. not merely compute the 2 numbers that give the entropy before the gases mix and the entropy after the gases have mixed and reached an equilibrium state.

For now, I leave the difficulties of 1) to you. (They include finding algorithms to determine when particles collide in the simulation.)

As far as I know, current textbooks do not provide any standard definition for the entropy of a system that applies to times when the system is going through states of non-equilibrium. It wouldn't surprise me if people have written papers proposing various definitions in that case. We'd have to search for such papers.

So, yes, the first difficulty of 2) is to define entropy for your simulation.

There are two basic approaches to defining entropy. One approach is "Shannon entropy", also known as "information entropy". This approach requires having a probability distribution. In thermodynamics, the entropy of a system can be defined in terms of Shannon entropy by using a probability distribution over the system's possible microstates. However, at a given time t, your simulation is in a single microstate - namely the microstate that specifies the positions and momenta of each of the particles. So you don't have a non-trivial probability distribution over several different microstates.
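To make this point concrete, here is a minimal sketch (my own illustration, not part of the original post): a distribution concentrated on one known microstate has zero Shannon entropy, while a uniform distribution over W microstates has entropy ln W.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i ln(p_i), in nats.
    Zero-probability entries contribute nothing."""
    return sum(-p * math.log(p) for p in probs if p > 0)

# A deterministic snapshot (one microstate with probability 1) has H = 0:
h_single = shannon_entropy([1.0])          # 0.0
# A uniform distribution over W = 8 microstates has H = ln 8:
h_uniform = shannon_entropy([1 / 8] * 8)   # ln 8, about 2.079
```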

The other approach is thermodynamic entropy, which defines it by a differential equation.
e.g.
$t$: time
$U(t)$: total energy of the system at time t
$T(t)$: temperature of the system at time t
$P(t)$: pressure of the system at time t
$V(t)$: volume of the system at time t

The defining equation for entropy $S(t)$ is $U'(t) = T(t) S'(t) - P(t) V'(t)$

However, there are difficulties in defining some of these quantities for non-equilibrium situations. Consider the concept of "volume of the system at time t". Take an extreme case where the system consists of 3 particles. Suppose they are within a cube that is 1 meter on a side.

What is the "Volume of the system"? At a given time, you could fit the 3 particles in a triangular shaped thin box with each particle in one corner of the box. Is the "Volume of the system at time t" equal to the volume of this thin box? - Or is the "Volume of the system" equal to the volume of the 1 meter cube?

The concept that a 1 meter cube is the "Volume of the system" makes sense if the system consists of particles uniformly distributed throughout the cube. So "Volume of the system" becomes unambiguous when we are considering an equilibrium situation.
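If $U(t)$, $T(t)$, $P(t)$, $V(t)$ can all be given meaning along the path (which, as noted, is exactly the problematic part out of equilibrium), the defining equation can be integrated numerically. Here is a minimal sketch, assuming a quasi-static process sampled at discrete times; the function name and the sanity check are my own:

```python
import math

def entropy_change(U, T, P, V):
    """Numerically integrate dS = (dU + P dV) / T along a sampled process.

    U, T, P, V are equal-length lists of internal energy, temperature,
    pressure, and volume sampled along the path; midpoint values of T
    and P are used for each step (trapezoid-style)."""
    dS = 0.0
    for i in range(1, len(V)):
        dU = U[i] - U[i - 1]
        dV = V[i] - V[i - 1]
        T_mid = 0.5 * (T[i] + T[i - 1])
        P_mid = 0.5 * (P[i] + P[i - 1])
        dS += (dU + P_mid * dV) / T_mid
    return dS

# Sanity check: isothermal expansion of 1 mol of ideal gas from V to 2V
# should give dS = n R ln 2 (a classical textbook result).
R = 8.314   # J/(mol K)
T0 = 300.0
V = [1.0 + i / 1000.0 for i in range(1001)]  # volume goes 1 -> 2 m^3
U = [0.0] * len(V)                           # U constant at fixed T
T = [T0] * len(V)
P = [R * T0 / v for v in V]                  # ideal gas law, n = 1 mol
dS = entropy_change(U, T, P, V)              # close to R ln 2, ~5.76 J/K
```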

12. Jan 17, 2017

### malemdk

Consider the two rooms filled with a few particles - let's say 10. Then how will we calculate the entropy?

13. Jan 19, 2017

### zrek

What do you mean by "continuum model" exactly? The Holtsmark continuum model, for example? Or do you mean the approach of continuum mechanics? I guess I'd prefer a way to calculate directly from the data of the given, discrete particles. But if there is an easy and correct way to map the states of the particles to a continuum model, then I'd accept that also, if it leads to a numeric representation of entropy.

14. Jan 19, 2017

### zrek

From your words I assume that you know how to calculate "the 2 numbers that give the entropy" from the given finite number of particles. If this is true, I would also be happy to know this.

I like your approach; the simple formula may work. I think that, in the case of ideal gases, $U(t)$ is constant. $V(t)$ is also constant; gases do not occupy a fixed volume but expand to fill whatever space they can, don't they? However, from the formula $U'(t) = T(t) S'(t) - P(t) V'(t)$ I can't get a concrete value for $S(t)$, only its rate of change, right?

The probability is easy to calculate from the following concept:
With closed doors, there are several particles in the first chamber and others in the second. Their macrostate is represented by all of their possible microstates in this situation, given that their exact positions within the actual chamber are irrelevant. Now if we open the door, we have to count all of the possible positions of the particles, and compare this number to the number of states in which the same numbers of particles are in the very same chambers. This gives the probability distribution of microstates for a given macrostate. But how do I calculate a concrete number for entropy from this probability (ratio)?

15. Jan 19, 2017

### Staff: Mentor

The latter.
So you are talking about using molecular dynamics, right?

In your original post, I got the feeling you were looking for a simple example of a problem that could illustrate the evolution of the entropy vs. time for a system experiencing an irreversible change. Is this correct? If so, would a simpler system be acceptable, or does it have to be the ideal gas example?

16. Jan 19, 2017

### Stephen Tashi

No. I'm referring to the 2 entropies that can be calculated from macroscopic parameters. Similar to Example 3 of https://www.physicsforums.com/insights/grandpa-chets-entropy-recipe/

They don't expand to fill the space instantaneously. If you want to compute entropy as a function of time in the problem you described, you are dealing with times before the gases have "filled whatever space they can".

No. Counting the possibilities is unrelated to computing probability unless you can define the microstates so each microstate has the same probability of being occupied. How do you intend to define the microstates?

17. Jan 20, 2017

### Useful nucleus

From a pure classical thermodynamic point of view, there is no answer to your question because entropy is defined for equilibrium states only. The situation you describe is a succession of non-equilibrium states and classical thermodynamics cannot address it.

Statistical mechanics can allow you to calculate the difference in free energy (and hence also the difference in entropy) between the initial and final states even if the path between them is a non-equilibrium one. The mixing process you describe is a non-equilibrium path. Using many molecular dynamics simulations and the Jarzynski equality, you can compute the difference in free energy and consequently the entropy. But you need to run many simulations to get a good ensemble average.
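For reference, the Jarzynski equality states that exp(-ΔF/kT) equals the ensemble average of exp(-W/kT) over the work values W measured in repeated nonequilibrium runs. A minimal sketch of the estimator (in practice the work samples would come from the MD runs; the values below are only toy numbers):

```python
import math

def jarzynski_free_energy(work_samples, kT):
    """Estimate the free-energy difference dF from nonequilibrium work
    samples via the Jarzynski equality:
        exp(-dF/kT) = <exp(-W/kT)>   =>   dF = -kT ln <exp(-W/kT)>."""
    avg = sum(math.exp(-w / kT) for w in work_samples) / len(work_samples)
    return -kT * math.log(avg)

# If every run happened to do the same work W0, the estimate is dF = W0;
# with scattered work values, dF comes out below the mean work, as the
# second law requires (dF <= <W>):
dF = jarzynski_free_energy([1.0, 3.0], kT=1.0)  # below the mean work 2.0
```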

In order to get a time-dependent evolution of entropy, one needs a non-equilibrium definition of entropy. I think you can find such definitions in literature, but I'm not familiar with them and I'm not sure how useful they can be.

18. Jan 20, 2017

### zrek

I don't mind if the expansion is not instantaneous. It is fine if it is not instantaneous, as long as it is much faster than my time-resolution scale requires. In a computer simulation I can't give exact numbers for the other quantities at every single moment either, but I can calculate them with approximations, for example even with simple linear interpolation.
For example, as several particles move around, they sometimes hit the wall of the chamber. I can't calculate the pressure at every moment (or step), but I can continuously count how many particles hit a given area of the chamber's surface during the last time interval, so on a coarser time scale, with linear interpolation, I can get a pretty good result for the pressure. By this method I can state that the movement of the particles is so quick that by the next analyzed time step the gas clearly fills the space, and this is a very good approximation.
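This wall-hit bookkeeping is essentially the kinetic definition of pressure: total momentum transferred to a wall patch, divided by its area and the averaging window. A minimal sketch (the names and the toy one-particle check are my own additions):

```python
def pressure_from_wall_hits(impulses, area, dt):
    """Estimate pressure over a time window of length dt from recorded
    wall collisions: P ~ (total momentum transferred) / (area * dt).
    `impulses` holds one |delta p| value per recorded hit."""
    return sum(impulses) / (area * dt)

# Toy check: one particle (mass m, speed v) bouncing in a 1D box of
# length L hits a given wall every 2L/v and transfers 2mv each time,
# so kinetic theory predicts P = m v^2 / (L * A) on a wall of area A.
m, v, L, A = 1.0, 2.0, 1.0, 1.0
dt = 100.0
n_hits = int(dt / (2 * L / v))
P = pressure_from_wall_hits([2 * m * v] * n_hits, A, dt)  # 4.0
```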

I can map any actual configuration of the particles (a microstate) to a macrostate. I can also define which, and how many, microstate configurations correspond to any macrostate. Isn't this enough? For example, I can count how many particles are in one chamber and how many are in the other. I can also calculate how many possible microstate configurations result in the same macrostate. I can clearly give a probability percentage for the current state compared to all possible microstate configurations. For example, I can calculate that "the current state has a 35% chance to exist", and I'll surely get the maximum percentage when there are the same number of particles in the two chambers. Isn't this percentage convertible to entropy somehow?
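Such a count can indeed be converted to an entropy: Boltzmann's formula S = k ln W uses the number W of microstates compatible with the macrostate. (Using the probability instead of the count only shifts S by a constant, since the total number of configurations is fixed.) A small sketch, assuming equal-volume chambers and distinguishable particles:

```python
import math

def boltzmann_entropy(n_left, n_total, k=1.0):
    """Boltzmann entropy S = k ln W of the macrostate 'n_left particles
    in the left chamber', where W = C(n_total, n_left) counts its
    microstates. Use k = 1.380649e-23 J/K for SI units; k = 1 gives
    entropy in nats."""
    return k * math.log(math.comb(n_total, n_left))

# S is maximal at the even split, matching the most probable macrostate:
s_even = boltzmann_entropy(5, 10)        # ln 252, about 5.53
s_all_left = boltzmann_entropy(10, 10)   # ln 1 = 0
```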

19. Jan 20, 2017

### zrek

This is correct; the simplest system is acceptable as long as it can be modelled from deterministic microstates, since I'd like to run my model on a computer. I'd like to see a number for the entropy during the process, just like for the temperature and pressure. I hadn't realized it, but you got it right: it is my intention to create a spectacular demonstration of an irreversible change. (Thank you for making my intention clearer even to me :-) )

20. Jan 20, 2017

### zrek

This sounds interesting. Could you give me a very simple example of the connection between free energy and entropy?