Does the total entropy of the universe remain constant?

In summary, the conversation discusses the application of the 2nd law of thermodynamics to the universe as a whole and its implications. While entropy increases in finite subsystems, some participants argue that the total entropy of the universe remains constant. The idea that gravity is an emergent entropic effect is also raised, along with its potential to explain the expansion and acceleration of the universe. The conservation of entropy in quantum mechanics is discussed, as well as its relationship with energy. The conversation ends with a discussion of entropy increase in a closed system and its connection to Liouville's theorem.
  • #1
robheus
Applying the 2nd law of thermodynamics to the whole universe is tricky: if the entropy of the whole universe were always increasing, the universe would be running down, which would imply some kind of beginning state and end state. That can't be the case.

So in my view the total entropy of the whole universe is constant. Nevertheless, for any finite subsystem entropy most likely increases, while at the same time the entropy of the rest of the universe decreases by the same amount.

Yet, I have never heard of a physics law that states that.

Interestingly, the idea has emerged that gravity is not a fundamental force but an emergent entropic effect. That could explain why it never fits into grand unification schemes.
In a small system a gas definitely tends toward a higher-entropy state, yet on very large scales the opposite occurs, since matter tends to cluster due to gravitational interaction, as in large stellar gas clouds forming star systems, etc.

If we were given photographs of such a system at different times, without knowing the scale or the temporal order, what would we conclude? If the system happened to be large-scale and we saw it becoming more clustered, we would conclude the order was forward in time; but if the scale was small, we would have to conclude the order was backward in time.

A theory in which gravity emerges as a result of entropy could probably also do away with "dark matter" and "dark energy", and that would be one more puzzle solved.

Could it be that the universe, which is expanding and accelerating at the scales we can observe, is at the same time contracting at still larger scales?
 
  • #2
Yes, total entropy is conserved, but it's a very complicated subject.

The simplest illustration comes from the Many Worlds Interpretation (MWI). The total entropy across the many-worlds ensemble is conserved. The entropy in every individual world increases.

Edit: Because this is something that really isn't often discussed, I suspect somebody is going to ask why I claim entropy to be a conserved quantity in quantum mechanics. If you look up the definition of the von Neumann entropy, S = -Tr[ρ ln ρ], you will notice that it is invariant under unitary transformations. Since the time evolution of ρ is unitary under a time-independent Hamiltonian, the quantity S(ρ) does not change in time. In canonical QM, entropy can ONLY increase when collapse of the wave function takes place, which makes sense. However, since the mechanism of collapse is explained about as well as magic dust, it does not seem like a convincing explanation of entropy increase either. Both decoherence and MWI take care of it, but the MWI approach is easier to illustrate. And since all of the interpretations are equivalent anyhow, one might as well go with the one that makes the most sense in this particular matter.
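
Here is a minimal numerical sketch of that invariance. The 2x2 density matrix and Hamiltonian below are arbitrary toy choices of my own, just to show that S(ρ) is unchanged by unitary evolution:

```python
# Hedged sketch (toy 2x2 example, my own choice of rho and H):
# the von Neumann entropy S = -Tr[rho ln rho] is unchanged by
# unitary time evolution rho -> U rho U^dagger.
import numpy as np
from scipy.linalg import expm, logm

def von_neumann_entropy(rho):
    """S = -Tr[rho ln rho], computed via the matrix logarithm."""
    return -np.trace(rho @ logm(rho)).real

rho = np.array([[0.7, 0.2], [0.2, 0.3]])   # a mixed state (Hermitian, trace 1)
H = np.array([[1.0, 0.5], [0.5, -1.0]])    # arbitrary time-independent Hamiltonian
U = expm(-1j * H * 2.0)                    # evolution for t = 2 (hbar = 1)

rho_t = U @ rho @ U.conj().T               # unitary time evolution
print(von_neumann_entropy(rho), von_neumann_entropy(rho_t))  # equal up to rounding
```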

P.S. There is an equivalent classical-mechanics theorem, but I'm not really familiar enough with classical entropy to comment.
 
  • #3
Energy is conserved, entropy is not. The entropy of a system increases until its energy distribution reaches equilibrium. Perhaps you are thinking of some application of non-equilibrium thermodynamics of which I am not aware (very possible), but that's the textbook answer.
 
  • #4
Why don't you describe for us a closed system in which entropy actually increases?

The "textbook answer" is given for an open system. The universe is not an open system.
 
  • #5
Hmm. I'm almost sure I don't understand the question, because it's too easy. I've been recently reviewing my thermodynamics and that's one of the reasons I was interested in this thread. So I am certainly willing to read more.

But isn't a standard example, like mixing ice and water in a thermally isolated calorimeter, an example of increasing entropy? The heat flow is the same for both, but it flows at different temperatures, so the entropy increase of the ice exceeds the entropy decrease of the water.
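
A rough numerical check of that claim. The numbers below (1 g of ice melting, water held near 300 K) are my own illustrative assumptions, not anything from the thread:

```python
# Hedged sketch: net entropy change when heat Q leaves warm water at T_hot
# and melts ice at T_cold. Treating T_hot as constant is an approximation
# (a large water reservoir); the numbers are illustrative only.
Q = 334.0        # J, heat to melt 1 g of ice (latent heat of fusion)
T_cold = 273.15  # K, temperature at which the ice absorbs the heat
T_hot = 300.0    # K, temperature of the warm water giving up the heat

dS_ice   = +Q / T_cold   # entropy gained by the ice
dS_water = -Q / T_hot    # entropy lost by the water

print(dS_ice, dS_water, dS_ice + dS_water)  # net change is positive
```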
 
  • #6
Not if you consider all possible states of the water molecules, both in the ice crystal and free in the solution. It's only when you say "Here, I have a state with ice and water versus a state with just water; which one has more possible microscopic states?" that the entropy changes, but that's cheating.

The very fact that with some infinitesimal probability the water molecules in liquid water might gather up spontaneously in one place and form a block of ice means that you have to consider these configurations when computing entropy. And then the entropy of the overall state is preserved.

More generally, consider what Liouville's theorem tells you about entropy. Obviously, entropy applies to a collection of "similar" states, not to a single state. The entropy of any fully specified state is precisely zero, always (there is only one possible microscopic state). So take a neighborhood in phase space and call it your initial system. By Liouville's theorem, after time evolution the collection of possible states will have the same measure. In other words, the number of possible microscopic states does not change. Entropy is conserved.
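
As a minimal sketch of that phase-space volume conservation (my own example, not from the thread): for a 1-D harmonic oscillator the exact evolution map is linear, and its Jacobian determinant is 1, so any phase-space volume, and hence the entropy of a distribution carried along by the flow, is preserved:

```python
# Hedged sketch: the harmonic-oscillator flow map (q0, p0) -> (q(t), p(t))
# is linear with unit Jacobian determinant, i.e. it preserves phase-space volume.
import numpy as np

m, omega, t = 1.0, 2.0, 0.7   # illustrative parameters

M = np.array([
    [np.cos(omega * t),              np.sin(omega * t) / (m * omega)],
    [-m * omega * np.sin(omega * t), np.cos(omega * t)],
])

print(np.linalg.det(M))  # ~1.0: phase-space volume is conserved
```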

Back to the melting ice example. If you describe water with and without ice as distinct macroscopic states (of which you'll have a continuum, depending on how much ice you have), then you have an entropy increase. But this is essentially because along every step in the dynamics you redefine what you consider to be a local neighborhood of "similar" states.

The exact same thing happens under MWI every time an observer makes an observation. A number of possible, but unlikely, states are discarded, and new "similar" states are added. As a result, the number of possible states of the system is increasing.

Still, for the overall system, whatever its initial states are, there is a classical theorem (Liouville's theorem) and a quantum one (as I outlined in my previous post) stating that the overall entropy is conserved.

This is why I said from the very beginning that this is not a simple subject, and it is entirely ignored in most thermodynamics texts.

I think Landau and Lifshitz might address these problems, or at least mention them. I'll look it up and try to find you some references that go into more detail.
 
  • #7
dulrich said:
Energy is conserved, entropy is not. The entropy of a system increases until its energy distribution reaches equilibrium. Perhaps you are thinking of some application of non-equilibrium thermodynamics of which I am not aware (very possible), but that's the textbook answer.

The *big* issue is how one can conceive of the entropy of the whole universe going up without arriving at the difficult, if not impossible, philosophical position that the universe seemingly had a *start* condition (lowest entropy) and will have an *end* condition (highest entropy).
That is why I made the *grand* assumption that the 2nd law does apply to any finite subsystem (e.g. the observable universe), but that for the whole universe the total entropy is a constant.
 
  • #8
K^2 said:
The universe is not an open system.

We have to be careful about what we mean by an open or a closed system in the thermodynamic sense, since IMHO these notions only apply to finite-sized subsystems.

The universe as a whole is neither open (the universe is not in contact with any system *outside* itself) nor closed (the universe does not have a boundary), and that is why I assume the 2nd law applies only to finite-sized subsystems (of any size), but explicitly *not* to the whole universe.
 
  • #9
I can pretty much guarantee you that the universe is finite and closed in the spatial dimensions and open and infinite in time dimension. I'd also bet good money on it being an S3R or T3R. But that's more of a hunch.
 
  • #10
K^2 said:
I can pretty much guarantee you that the universe is finite and closed in the spatial dimensions and open and infinite in time dimension. I'd also bet good money on it being an S3R or T3R. But that's more of a hunch.

It is speculation, but I see no reason why space would be finite. The observable universe surely is finite, but that does not imply anything about space as a whole. Why do you assume space to be finite while time is infinite?

The only thing cosmologists agree on, AFAIK, is the no-boundary proposal: neither space nor time has edges or boundaries.
 
  • #11
It could, in principle, be infinite, but that would suggest a very strange topology as we roll it back to the Big Bang. Keep in mind that our inflation model requires that it is not just matter flying out into space, but the universe itself, the space itself, inflating from a point. So you'd have all that infinite space collapsing into a point at the Big Bang. You'd have to come up with a heck of a model to explain that one.

On the other hand, if we assume that the universe is finite and closes on itself, we don't have that problem. This seems to be consistent with all observations and with accelerated expansion.
 
  • #12
K^2 said:
It could, in principle, be infinite, but that would suggest a very strange topology as we roll it back to the Big Bang. Keep in mind that our inflation model requires that it is not just matter flying out into space, but the universe itself, the space itself, inflating from a point. So you'd have all that infinite space collapsing into a point at the Big Bang. You'd have to come up with a heck of a model to explain that one.

It's only the observable universe that traces back to that small dot at the beginning of inflation, but who says that that dot was all there is/was?

In inflationary scenarios, inflation keeps going on in different parts of the universe; it is in fact an eternal process that, once started, will never end, and in principle it could be past-eternal too (i.e., it never started at all).

K^2 said:
On the other hand, if we assume that the universe is finite and closes on itself, we don't have that problem. This seems to be consistent with all observations and with accelerated expansion.

Not my cup of tea; it raises the question of what is "outside" that finite bubble.
 
  • #13
K^2:
As to the issue of global topology, in my point of view there is no global topology; we can only define and know a local topology for the universe.
 
  • #14
There is no outside. That's the part that you are missing. Our current understanding of cosmology pretty much requires the universe to be closed on itself. If you believe otherwise, you are operating outside of the standard model. In that case, I don't know, and nobody else does either. But all signs point to it being the other way around.

robheus said:
As to the issue of global topology, in my point of view there is no global topology; we can only define and know a local topology for the universe.

Do you understand what topology means? And do you understand the connection between gravity and the topology of space-time? Because at this point, you are really just making things up.
 
  • #15
K^2 said:
The total entropy across the many-worlds ensemble is conserved. The entropy in every individual world increases.

Maybe I missed something, but aren't these two statements contradicting each other? Isn't the total entropy of the many-worlds ensemble simply the sum of the entropies of the individual worlds? It seems to me that if every component has increasing entropy, then the total entropy must also increase.
 
  • #16
No. The entropies of two independent subsystems of a larger system are additive, because the total number of states is N*M and ln(N*M) = ln(N) + ln(M). The number of states of the many-worlds ensemble is not the product of the numbers of states in the individual worlds, so there is no such additivity.

If we go back to classical subsystems, imagine that some of the states counted in N are also counted in M. Then the total number is no longer N*M. Since the worlds of the many-worlds ensemble are just projections of the total state onto the state of an observer, there are many such intersections. The most basic example is your ability to read what I'm writing, despite the fact that both of us are distinct observers in the many-worlds picture. There is an intersection of states, and an ability to increase entropy as a result.
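
A toy numerical illustration of that additivity point, and of how shared states break it. The state counts below are my own arbitrary choices:

```python
# Hedged sketch: entropy additivity for independent subsystems,
# and how it fails when the joint state space is smaller than N*M.
import math

N, M = 1000, 500                      # microstate counts of two subsystems
S_N, S_M = math.log(N), math.log(M)   # entropies in units of k_B

# Independent subsystems: joint count is N*M, so entropies add.
print(math.isclose(math.log(N * M), S_N + S_M))   # True

# If correlations remove some joint states, additivity fails.
joint_states = N * M // 10
print(math.log(joint_states) < S_N + S_M)         # True: sub-additive
```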
 
  • #17
"There are 2 things in the world which are infinite - the universe and human stupidity, but I am not sure about the first one."
- Albert Einstein
 

1. What is "Total Entropy = constant" and why is it important in science?

"Total entropy = constant" is a principle of thermodynamics stating that the total entropy of an isolated system undergoing only reversible processes remains constant over time. It is important because it helps us understand the direction and magnitude of spontaneous processes and plays a crucial role in determining the efficiency of energy conversion.

2. How is "Total Entropy = constant" related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. "Total entropy = constant" is the limiting case of this law: in an isolated system, where no energy or matter can be exchanged with the surroundings, the total entropy stays constant when all processes are reversible and increases otherwise.

3. Can the total entropy of a system ever decrease?

No. According to the second law of thermodynamics, the total entropy of an isolated system can never decrease; it can only remain constant or increase. This is due to the spontaneous and irreversible nature of natural processes.

4. How is "Total Entropy = constant" measured in a system?

The total entropy of a system can be calculated using Boltzmann's equation S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of microstates consistent with the system's macroscopic state. This equation relates the microscopic properties of a system to its macroscopic entropy.
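
As a small illustrative calculation (a toy system of N two-state particles, my own example rather than anything from the FAQ):

```python
# Hedged sketch: Boltzmann entropy S = k ln W for N independent
# two-state spins, where the number of microstates is W = 2**N.
import math

k_B = 1.380649e-23   # J/K, Boltzmann constant
N = 100              # number of two-state particles (illustrative)
W = 2 ** N           # number of equally likely microstates

S = k_B * math.log(W)   # equals k_B * N * ln 2
print(S)                # ~9.6e-22 J/K
```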

5. What are some real-world examples of "Total Entropy = constant"?

Everyday processes such as heat transfer, chemical reactions, and natural processes like weathering and erosion illustrate the principle: in each case the total entropy of the system plus its surroundings remains constant (for idealized reversible processes) or increases, in accordance with the second law of thermodynamics.
