Could we picture the universe without entropy (in it)?

Summary
Entropy describes the tendency of systems to evolve towards states of higher disorder, because there are vastly more microstates corresponding to mixed states than to separated states. The discussion points out that while individual configurations, such as specific arrangements of gas molecules or cards, are all equally likely, the sheer number of configurations corresponding to higher entropy makes those macrostates overwhelmingly more probable. The original poster argues that singling out particular patterns is subjective and anthropocentric, since any specific arrangement is as likely as any other; the replies stress that the probability of reaching a specific low-entropy macrostate is vastly lower than that of a high-entropy macrostate because far fewer microstates realize it. This counting is the essence of entropy in physical systems.
entropy1
It is sometimes said that a system is "unlikely" to return to the "pattern" it started from. For instance: if we have a vat with blue gas molecules and white gas molecules separated by a partition, and we remove the partition, the blue and white molecules will mingle and are unlikely to return to their separated state spontaneously.

So, it would have to be equally unlikely for the molecules to form a picture of a horse's silhouette or a locomotive's silhouette, for example.

But that leaves me wondering if a set of cards "Ace of hearts", "Ace of spades", "Ace of clubs", "Ace of diamonds" is more 'special' than a set of cards "King of spades", "Seven of hearts", "Jack of spades", "Nine of clubs", for example.

It seems to me a little anthropomorphic: if we humans see a pattern, there is a pattern, and otherwise not!

For instance: if we lie on our backs looking at the clouds, which are subject to entropy, we see patterns in them anyway!

It seems to me that the blue and white gas molecules could indeed return to their separated state. But the chance that they do is the same as for any other particular pattern! It is just that we don't like those patterns as much!

If we weigh all patterns equally, would there still be entropy?

So I wonder if we could indeed picture the universe without entropy.
 
entropy1 said:
But the chance that they do is the same as for any other particular pattern! It is just that we don't like those patterns as much!
This is wrong. The point is that there are many (many!) more microstates that satisfy the half-and-half condition than the all-separated condition. It is therefore much (much!) more likely that the system will remain in its mixed state.

Let’s do the counting in the case of two blue and two white particles. For the separated state there are only two possible microstates: both blue on the left and both white on the right, or vice versa. For the fully mixed state there are two independent possibilities for each color: blue1 left & blue2 right, or blue1 right & blue2 left, and the same for white. This leads to a total of 4 possible microstates that are all-mixed: twice the number of the all-separated case, and we only had four particles! The effect is magnified many (many!) times as you increase the number of particles.
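Here is a minimal sketch of the same counting for larger numbers of particles (Python, assuming the simple model above in which each particle independently sits in the left or right half; the `microstates` helper name is just illustrative):

```python
from math import comb

# N blue and N white particles; each particle sits in the left or right half.
# A macrostate is labelled by (blues on the left, whites on the left), and the
# number of microstates realising it is C(N, b) * C(N, w).

def microstates(N: int, b: int, w: int) -> int:
    return comb(N, b) * comb(N, w)

for N in (2, 10, 50):
    separated = microstates(N, N, 0) + microstates(N, 0, N)  # all blue on one side, all white on the other
    mixed = microstates(N, N // 2, N // 2)                    # evenly mixed, half of each color on each side
    print(f"N = {N:2d}: separated = {separated}, evenly mixed = {mixed}")
```

For N = 2 this reproduces the 2-versus-4 count above; already for N = 50 the evenly mixed macrostate is realised by roughly ##10^{28}## microstates, against only 2 separated ones.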
 
Orodruin said:
This is wrong. The point is that there are many (many!) more microstates that satisfy the half-and-half condition than the all-separated condition. It is therefore much (much!) more likely that the system will remain in its mixed state.
Ok. Let's suppose there are N states, each state with probability 1/N of occurring. Now take one of those states, call it M1. Let's call the separated state S1. The probability of getting S1 is equal to the probability of getting M1. Of course, there are many mixed states that are similar to M1 (M2, M3, ...). But mixed state M1 is just as likely as separated state S1, as state M2, as state M3, et cetera.

So if I wanted to wait until M1 had occurred, I would on the average have to wait just as long as for S1. Or for M2 or M3 for that matter.
 
But that is just the point you are missing. That each microstate is equally probable does not mean that each macrostate is. There are just more microstates corresponding to the macrostate of higher entropy. This is the definition of entropy, ##S = k \ln\Omega##, where ##\Omega## is the number of microstates compatible with the macrostate.
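To make this concrete with the four-particle counting above: since every microstate is equally probable, the probability of a macrostate is proportional to its ##\Omega##, so
$$\frac{P(\text{mixed})}{P(\text{separated})} = \frac{\Omega_\text{mixed}}{\Omega_\text{separated}} = \frac{4}{2} = 2, \qquad S_\text{mixed} - S_\text{separated} = k\ln 4 - k\ln 2 = k\ln 2.$$
For a macroscopic number of particles the same ratio is astronomically large, which is why the separated state effectively never recurs.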
 