How can entropy increase in a deterministic universe

In a deterministic universe, the concept of entropy becomes subtle: if classical physics suggests the microstates are not all equally probable, why should entropy increase rather than decrease? The discussion highlights that while deterministic laws govern particle behavior, the vast number of particles makes precise predictions impossible, so we are left with a coarse-grained description of the system. This coarse-graining creates an appearance of randomness and disorder, even within a deterministic framework. Two main explanations for entropy increase are proposed: one focuses on imprecise knowledge of the system's state, and the other on how deterministic evolution typically carries a system into macrostates compatible with many more microstates. Ultimately, even in a deterministic system, the mixing of states tends to lead to increased disorder.
  • #31
stevendaryl said:
It's only one state at a time, but all that we know about that state is that it is a yellow state or a blue state. If it's yellow, then the system is in one of 100 possible microstates. If it's blue, it's in one of 9900 possible microstates.

In the statistical mechanics notion of entropy, there are two kinds of states:
  1. The microstate, which in my drawing would be represented by a pixel in the picture.
  2. The macrostate, which is an observable property of the microstate. In my example, it would be the color of the pixel.

So far, that's clear.

The entropy of a microstate is proportional to the log of the number of microstates with the same macrostate. So in my example, the entropy of a yellow state would be the log of the area that is colored yellow. The entropy of a blue state would be the log of the area that is colored blue.
Then "Entropy" as a property of a state doesn't change as a function of time, correct? A state is a state. It doesn't become a different state as time passes.

A system can change from one state to another, so we can define the entropy "of a system" at time t as the entropy of the state it is in at time t.

With the numbers that I made up, the entropy of a yellow state is proportional to log(100) = 2 (using base 10). The entropy of a blue state is proportional to log(9900) = 3.996. So blue states have about twice the entropy of yellow states.
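
As a side check, here is a minimal Python sketch of that arithmetic, using the made-up 100/9900 split and the base-10 logs quoted above:

```python
import math

# Made-up example from the discussion: 100 yellow microstates, 9900 blue ones.
n_yellow = 100
n_blue = 9900

# Entropy of a macrostate ~ log of the number of microstates sharing that
# macrostate (base-10 logs, to match the numbers quoted above).
s_yellow = math.log10(n_yellow)   # 2.0
s_blue = math.log10(n_blue)       # ~3.996

print(s_yellow, s_blue, s_blue / s_yellow)   # blue states have about twice the entropy
```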

That clarifies the interpretation of the picture. But it isn't clear how the picture shows that the entropy of a system never decreases with time. That would say that if a system in the yellow patch ever moves into the blue then it must stay there. If the intended meaning is not that this be strictly true, but that it be true "most of the time", then we are introducing probability or time averages into the model, aren't we? We are saying something like "Over a long period of time, a system that has moved into the blue area will spend most of its time in the blue area?"
 
  • #32
Stephen Tashi said:
As I said, I don't object to a probabilistic analysis. But, this only answers the OP in an empirical sense. You are saying that, in practice, we cannot do a deterministic analysis, so we resort to probability.

The empirical treatment of the problem is not a solution to the theoretical question of why deterministic laws would imply that entropy does not decrease with time. As far as I can see, there is no guarantee that arbitrarily selected deterministic transition laws will have such a result.

Can we show that deterministic transition laws that have particular properties imply that entropy increases with time? I think stevendaryl is attempting to show that it is sufficient to require that the transition laws are reversible.

I'm not talking about arbitrarily selected deterministic laws. I'm talking about the laws of motion and particle collisions as they exist in our universe.
 
  • #33
Stephen Tashi said:
Then "Entropy" as a property of a state doesn't change as a function of time, correct? A state is a state. It doesn't become a different state as time passes.

I'm not sure I understand the question, but let's imagine a machine that has two internal counters, i and j, each ranging from 1 to 100. The machine has an internal representation of my drawing. It behaves this way: every time one of the counters changes, it looks up the color of the pixel with coordinates (i,j) and turns on the yellow light or the blue light as appropriate.

The state is the pair (i,j), and it changes with time according to some unspecified rule. The mapping from coordinates to colors is recorded in the drawing, and that mapping is not changing with time.
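
A minimal sketch of that machine, for illustration only: the color lookup plays the role of the fixed drawing, and the transition rule `step` is a hypothetical placeholder for the unspecified rule.

```python
# Toy version of the counter machine described above.

def color(i, j):
    # Fixed mapping from microstate (i, j) to macrostate (the "drawing"):
    # a 10x10 patch is yellow (100 states), the remaining 9900 states are blue.
    return "yellow" if i <= 10 and j <= 10 else "blue"

def step(i, j):
    # Hypothetical deterministic, reversible rule (a bijection on the
    # 100 x 100 state space): advance both counters, wrapping at 100.
    return (i % 100) + 1, (j % 100) + 1

i, j = 5, 7
for _ in range(5):
    print((i, j), color(i, j))   # which light is on at this step
    i, j = step(i, j)
```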

A system can change from one state to another, so we can define the entropy "of a system" at time t as the entropy of the state it is in at time t.

Yes.

That clarifies the interpretation of the picture. But it isn't clear how the picture shows that the entropy of a system never decreases with time.

It's not necessarily true that the entropy never decreases. But in my example, with 100 yellow states and 9900 blue states, if the evolution is reversible, then at most 100 blue states out of 9900 can experience decreasing entropy: a reversible rule is one-to-one, so no more than 100 states in total can be mapped onto the 100 yellow states. So entropy-decreasing transitions are rare.

In a more realistic situation, such as one involving 10^{23} particles, the number of states that will experience decreasing entropy will be microscopically tiny compared to the total number of states.
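
A quick numerical sketch of that counting argument, using a random permutation of the 10,000 states as a stand-in for some reversible (one-to-one) transition rule:

```python
import random

random.seed(0)
n_yellow, n_blue = 100, 9900
states = range(n_yellow + n_blue)        # 0..99 yellow, 100..9999 blue

# Any reversible deterministic rule is a bijection on the state space;
# a random permutation is just one arbitrary example of such a rule.
successor = list(states)
random.shuffle(successor)

def is_yellow(s):
    return s < n_yellow

# Blue states whose successor is yellow, i.e. entropy-decreasing transitions.
decreasing = sum(1 for s in states if not is_yellow(s) and is_yellow(successor[s]))
print(decreasing, "of", n_blue)   # never more than 100, since only 100 states can map onto yellow
```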

That would say that if a system in the yellow patch ever moves into the blue then it must stay there.

No, the rule that entropy always increases is a statistical statement: for the vast majority of states, it's true. There may be (and will be) a small number of states that will violate this rule.

If the intended meaning is not that this be strictly true, but that it be true "most of the time" then we are introducing probability or time averages into the model, aren't we?

Yes, in the statistical mechanics interpretation of entropy, it necessarily involves uncertainty about what the actual (microscopic) state is. So probability is involved. But the probability does not reflect nondeterminism in the laws of physics.
 
  • #34
stevendaryl said:
The state is the pair (i,j), and it changes with time according to some unspecified rule.
I agree that the state being measured is changing. That's because the thing being measured is changing states, not because a state itself is being redefined as a different state. It is the same vocabulary distinction we'd make in saying that a person's mass changed from 80 kg to 85 kg: it's the person who changed, not the definition of an 80 kg mass. In the diagram, a point remains fixed and we imagine the system moving from one point to another. The entropy of a point (e.g., a point in the blue area) is constant with respect to time.

It's not necessarily true that the entropy never decreases.

I agree. So part of answering the OP's question:
Then the question arises again: why does entropy increase and not decrease, if the microstates are not all equally likely?
is to say that entropy may sometimes decrease.

( I think the OP is implying that deterministic transition laws might result in microstates not being equally likely.)

Another part of answering the OP is to determine what "equally likely" or "not equally likely" would mean. These phrases only make sense if we are considering a model that involves probability, or a model where we can interpret "equally likely" to mean "an equal fraction of the time" or "an equal number of things out of the total number of things".
But in my example, with 100 yellow states and 9900 blue states, if the evolution is reversible, at most 100 blue states out of 9900 will experience decreasing entropy. So entropy-decreasing transitions are rare.

In what sense "rare"? Since we are not using a probabilistic model, "rare" with respect to a single given system must refer to some sort of frequency of occurrence of its states (and entropies) measured over an interval of time - correct? And "rare" for with respect to a state (or a color of a state) would refer to a small fraction of states out of the total number of states.

You explained that reversibility of transitions implies that at most 100 blue states can change to yellow states in a "step" of the transition process. So such instances are rare with respect to a fraction of states. What we wish to show is that entropy decrease is rare for a system. So to connect "rare for a system" with "rare for a state", it seems to me that we must consider the behavior of a system over time, as it passes through different states.

Unless we stipulate that a system can pass through each of the states in the diagram, we also have to consider more than one system in our definition of "rare for a system".

The argument based on the diagram depends on making the yellow area smaller than the blue area - which we interpret as making the number of yellow states smaller than the number of blue states. How do we relate this to the physical definition of entropy? Crudely put, why would it be necessary to color the microstates ( i.e. the points) so the areas are different?

In some sense microstates are "equally likely" - that is a requirement enforced by how microstates are defined, isn't it? The problem is to say what "equally likely" means in a completely deterministic scenario. As a fraction of states, each unique microstate is 1/(total number of microstates). That would hold no matter how we define the microstates.

If we visualize the diagram as discrete points instead of a continuum, then we impose the restriction that in one "step" of time a system must either move from one point to another, or stay on the point where it is and remain there forever after. E.g., it is impossible for a system to remain at a point for 3 steps and then move off of it. There are no sub-microstates within the microstate represented by a point. I think the implementation of microstates in physics is such that this concept is a good approximation. However, what technical part of the definition of a microstate guarantees this?


Yes, in the statistical mechanics interpretation of entropy, it necessarily involves uncertainty about what the actual (microscopic) state is. So probability is involved. But the probability does not reflect nondeterminism in the laws of physics.

It seems to me that probability can be avoided if entropy can be defined in terms of numbers of states. One may then introduce probability by saying "Suppose we pick a microstate at random, giving each microstate an equal probability of being selected". I agree that introducing probability in this manner does not require that the population being sampled was generated by some random process.
 
  • #35
Stephen Tashi said:
In what sense "rare"? Since we are not using a probabilistic model, "rare" with respect to a single given system must refer to some sort of frequency of occurrence of its states (and entropies) measured over an interval of time - correct?

Well in systems that are "ergodic", the system visits every state (or gets arbitrarily close to every state), so the entropy does reflect the amount of time spent in each macroscopic state.

But I wasn't talking about time, I was talking about uncertainty. If all you know is that you're in some blue state, then there is a 1/9900 chance that you're in any particular blue state. That's subjective probability.

You explained that reversibility of transitions implies that at most 100 blue states can change to yellow states in a "step" of the transition process. So such instances are rare with respect to a fraction of states. What we wish to show is that entropy decrease is rare for a system. So to connect "rare for a system" with "rare for a state", it seems to me that we must consider the behavior of a system over time, as it passes through different states.

I just meant that it is improbable (using a subjective notion of probability) that the next state will be lower entropy, if all we know about the current state is that it is a blue state. The statement about the fraction of time spent in blue versus yellow states may also be true, but we would have to make more complicated assumptions about the nature of the transition relation to make that conclusion.

The argument based on the diagram depends on making the yellow area smaller than the blue area - which we interpret as making the number of yellow states smaller than the number of blue states. How do we relate this to the physical definition of entropy? Crudely put, why would it be necessary to color the microstates ( i.e. the points) so the areas are different?

I'm assuming that the color is a macroscopically observable property of the state. The point of using an example where the two colors corresponded to different numbers of states is just that otherwise, the entropy would always be constant. Entropy is only a useful concept when it differs from state to state.

In some sense microstates are "equally likely" - that is a requirement enforced by how microstates are defined, isn't it? The problem is to say what "equally likely" means in a completely deterministic scenario.

It's a subjective notion of probability. If somebody shuffles a deck of cards and you pick a card from within the deck, there is a subjective probability of 1/13 that your card will be an ace. You have no basis for assuming anything else. It's possible that someone who has studied shuffling and has studied your preferences in picking a card could make a better prediction, but given no more information than that the deck has been shuffled and you have picked a card, there is no basis for any choice other than 1/13.

If we visualize the diagram as discrete points instead of a continuum then we impose the restriction that in one "step" of time, a system must move from one point to another or stay on the point where it is and remain there forever after. E.g. it is impossible for a system to remain at a point for 3 steps and then move off of it.

Right, in a deterministic system where the next state is a function of the current state.

There are no sub-microstates within the microstate represented by a point. I think the implementation of microstates in physics is such that this concept is a good approximation. However, what technical part of the definition of a microstate guarantees this?

I'm not sure what you mean by "sub-microstate". Do you mean that the microstate itself may actually be a macrostate, with even more microscopic details?

It seems to me that probability can be avoided if entropy can be defined in terms of numbers of states. One may then introduce probability by saying "Suppose we pick a microstate at random, giving each microstate an equal probability of being selected". I agree that introducing probability in this manner does not require that the population being sampled was generated by some random process.

Boltzmann introduced the purely discrete notion of entropy: ##S = k \log W##, where ##S## is the entropy of a macrostate, ##k## is Boltzmann's constant, and ##W## is the number of microstates corresponding to the same macrostate.
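
For concreteness, a short sketch of that formula applied to the toy numbers from earlier in the thread (Boltzmann's constant in J/K; for a real system W would be astronomically larger):

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k log(W), with W the number of microstates sharing the macrostate.
    return k_B * math.log(W)

print(boltzmann_entropy(100))    # "yellow" macrostate of the toy example
print(boltzmann_entropy(9900))   # "blue" macrostate of the toy example
```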
 
  • #36
stevendaryl said:
But I wasn't talking about time, I was talking about uncertainty. If all you know is that you're in some blue state, then there is a 1/9900 chance that you're in any particular blue state. That's subjective probability.

Being a Bayesian by preference, I approve of subjective probability.

I just meant that it is improbable (using a subjective notion of probability) that the next state will be lower entropy, if all we know about the current state is that it is a blue state. The statement about the fraction of time spent in blue versus yellow states may also be true, but we would have to make more complicated assumptions about the nature of the transition relation to make that conclusion.

In order to make a statement about the subjective probability that a system makes a transition from a state of one color to a state of another color at a randomly selected time, I think we must stipulate that the system spends an equal time "during" each microstate transition and "on" each microstate, if we are modeling a "rest" interval between transitions.

What's the realistic situation for a laboratory experiment? I can conceive of preparing a system so it has a definite state and then measuring its state repeatedly at evenly spaced discrete intervals over a 1-hour period. That would allow us to say what fraction "of the time" the system made transitions of a certain type (yellow to blue, etc.).

I'm assuming that the color is a macroscopically observable property of the state. The point of using an example where the two colors corresponded to different numbers of states is just that otherwise, the entropy would always be constant. Entropy is only a useful concept when it differs from state to state.
That's a very important observation!
I'm not sure what you mean by "sub-microstate". Do you mean that the microstate itself may actually be a macrostate, with even more microscopic details?

Yes.

My general line of thinking is this: When the physical state of a system is described by a vector of values ##(x_1,x_2,x_3,\dots,x_n)## where each ##x_i## may take values in a continuous range of real numbers, then approximating the vector as a discrete microstate presumably involves defining how a set of these vectors is to be regarded as the same microstate. For example, if we want microstates to be "boxes" we could define a microstate ##m_k## to be the set ##\{(x_1,x_2,x_3,\dots,x_n) : a_{k1} < x_1 \le b_{k1},\; a_{k2} < x_2 \le b_{k2},\; \dots,\; a_{kn} < x_n \le b_{kn}\}##.

If we wish to make a direct connection between "fraction of the number of states that are so-and-so" and "fraction of the time the system is in a state that is so-and-so", then we cannot pick the boundaries ##a_{k1}, a_{k2}, \dots, b_{k1}, b_{k2}, \dots, b_{kn}## arbitrarily. We need to pick them so that the dynamical law that governs the system's trajectory through the states ##(x_1,x_2,\dots,x_n)## implies that the system spends approximately the same time in each microstate ##m_k##.
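
A minimal sketch of that "box" construction, assuming (purely for illustration) a uniform bin width in every coordinate; as noted above, a physically sensible coarse-graining would have to choose the boundaries so the system spends roughly equal time in each box, which this sketch does not attempt:

```python
import math

def microstate_index(x, lower, width):
    """Map a continuous state vector x = (x_1, ..., x_n) to a tuple of box
    indices, i.e. to a single discrete microstate m_k.  `lower` is the lower
    edge of the first box in each coordinate and `width` is the (uniform) box
    width; both are illustrative choices, not dictated by the dynamics."""
    return tuple(math.floor((xi - lower) / width) for xi in x)

x = [0.31, -1.72, 2.05]          # a point in a 3-dimensional continuous state space
print(microstate_index(x, lower=-5.0, width=0.1))   # (53, 32, 70)
```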
 
