Understanding Entropy: Micro vs. Macro States of a Cracked Egg

In summary: the entropy assigned to the egg depends on how its macrostate is defined. A macrostate described only as "cracked" corresponds to far more microstates than a whole egg, but a macrostate specified as "cracked in exactly this fashion" corresponds to very few.
  • #1
mviswanathan
I need some clarification on the definition of entropy. According to the simple example of the egg in the kitchen, the entropy of the whole egg is lower than that of a cracked egg. This, it is said, is because there are more ways for an egg to be broken than to be whole. But, if one talks of a particular macrostate, it must refer to the macrostate of the ‘particular cracked egg’ – the egg cracked exactly in a specific fashion. In such a case, how can it be claimed that there are more microstates corresponding to the cracked egg than to a whole egg?

Or, one could be comparing a whole egg with a ‘not-whole’ egg. Then, of course, there are more ways of being ‘not whole’ than of being whole. Then again, we can compare an egg ‘cracked in a specific fashion’ with one ‘not cracked in that specific fashion’. In this case there are more possibilities for the latter than the former, and the latter could be a whole egg. Does it mean that, in this case, the whole egg has more entropy than the cracked one? The same question could also apply to the fallen and broken pieces of a cup – broken and scattered in that particular fashion.

Obviously, I am going wrong somewhere. Please help.
 
  • #2
Entropy is "not knowing." It's not correct to equate a general cracked egg with an egg that has a perfectly specified crack; their entropies are different.

I'm not accustomed to the egg analogy; I'm more used to thinking about it in terms of a deck of cards. A randomly shuffled deck has a higher entropy than an ordered deck, even though it's the same deck: you don't know the arrangement of the cards. However, if you took a specific randomly shuffled deck and called its specific sequence the "mviswanathan sequence," then any deck with the mviswanathan sequence would have a low entropy--the same as an ordered deck--because you know the position of each card with no uncertainty.

This is why I disagree with the statement

mviswanathan said:
But, if one talks of a particular macrostate, it must refer to the macrostate of the ‘particular cracked egg’ – the egg cracked exactly in a specific fashion.

because when you focus on an "egg cracked exactly in a specific fashion," you're not looking at the same system but at a different, lower-entropy system. I hope that helps resolve the apparent contradiction.
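To put rough numbers on the deck-of-cards comparison above, here is a small sketch (my own illustration, not part of the original post) that computes the entropy, in units of Boltzmann's constant, as the logarithm of the number of orderings compatible with what you know about the deck:

[code]
# Entropy (in units of k_B) as ln(Omega), where Omega is the number of
# orderings (microstates) compatible with what we know about the deck.
import math

# "Some shuffled deck, order unknown": any of 52! orderings is possible.
omega_unknown = math.factorial(52)

# "A deck known to be in one specific sequence": only 1 ordering is possible.
omega_specified = 1

print("S/k_B, order unknown:  ", math.log(omega_unknown))    # about 156.4
print("S/k_B, order specified:", math.log(omega_specified))  # exactly 0.0
[/code]

Knowing the exact sequence collapses the count of compatible microstates to one, which is why a deck known to be in the "mviswanathan sequence" has the same (zero) entropy as an ordered deck.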
 
  • #3
Mapes is correct - the broken egg/shuffled deck/broken cup has a high entropy because many microstates (the specific states of the broken egg/cup) correspond to the same macrostate - a state that is an average over certain thermodynamic variables.

It's the same reason why a gas in a box, if specified to exist at a certain temperature T (one macrostate), consists of an extremely large number of microstates - the specific positions and velocities of each molecule in the box.
 
  • #4
It really depends on how you define "macro" vs. "micro" states. Different possible ways of cracking the egg are distinguishable on visual inspection at a macroscopic level, so I suppose you could define them all as different macrostates. The cracked egg vs. whole egg is really just meant as a conceptual analogy, though; I don't think there's any "standard" way to define the macrostates for such a system. Normally thermodynamics deals with more macroscopically uniform systems like gas-filled boxes, where macrostates are defined in terms of the values of macro-parameters like temperature and pressure.
 
  • #5
Mapes said:
Entropy is "not knowing." ...

because when you focus on an "egg cracked exactly in a specific fashion," you're not looking at the same system but at a different, lower-entropy system. I hope that helps resolve the apparent contradiction.

I did not understand the first statement.

In the end, does it mean that there is no absolute value of entropy, and that it depends on where/what you are looking from? Then how can one say that the entropy of a closed system keeps going up?

Maybe it has to do with what a "system" means. Still on the egg story: do the whole egg and the broken egg with all its pieces not refer to the same system, since the broken egg has exactly the same components as the original?

Or maybe there is some difference, considering the binding forces/stresses in the original egg that get released (maybe I am not using the right words) when the egg breaks?
 
  • #6
mviswanathan said:
In the end, does it mean that there is no absolute value of entropy, and that it depends on where/what you are looking from? Then how can one say that the entropy of a closed system keeps going up?

This is an excellent point. We do have an absolute value for entropy because of the conventions we adopt. The eggs, decks of cards, etc. are analogies, so let's go to the better example of a gas in a box that Andy brought up.

Our convention is to measure the bulk temperature or total energy, which is relatively easy to do. (These are macrostate variables.) We cannot determine the momentum and position of each atom. But there are many possible values of momentum and position (i.e., possible microstates) that could produce a given temperature or total energy. We will never know which microstate we're in, and it changes every instant anyway. This is the "not knowing."

We take the entropy to be proportional to the logarithm of the number of microstates, and we set a reference point of zero (or so close to zero as to be unmeasurable) at a temperature of absolute zero for a system in equilibrium. These conventions are what allow us to talk quantitatively about entropy and changes in entropy.
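As a concrete (if highly simplified) illustration of the counting described above, here is a sketch (my own, not part of the post) using an "Einstein solid" toy model: N oscillators sharing q indistinguishable energy quanta. The macrostate is just the total energy (the number of quanta), the microstates are all the ways of distributing those quanta among the oscillators, and the entropy is taken as the logarithm of that count, in units of k_B:

[code]
# Toy "Einstein solid": q energy quanta shared among N oscillators.
# Macrostate = total energy (q); microstates = ways to distribute the quanta.
# Omega = C(q + N - 1, q), and S/k_B = ln(Omega).
from math import comb, log

def entropy_over_kB(num_oscillators, num_quanta):
    omega = comb(num_quanta + num_oscillators - 1, num_quanta)
    return log(omega)

# One macrostate (one value of the total energy) hides an enormous number
# of microstates, and the count grows rapidly with the energy:
for q in (10, 100, 1000):
    print(f"N = 300 oscillators, q = {q:4d} quanta -> S/k_B = "
          f"{entropy_over_kB(300, q):.1f}")
[/code]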
 
  • #7
mviswanathan said:
In the end, does it mean that there is no absolute value of entropy, and that it depends on where/what you are looking from? Then how can one say that the entropy of a closed system keeps going up?
Because regardless of how you choose to define your macrostates (what physicists would call your choice of 'coarse-graining' for the states of the system), the entropy of a given macrostate is always defined in terms of the logarithm of the number of microstates associated with that macrostate, and it can be proved that if we start with a randomly chosen microstate from whatever the initial macrostate is, the underlying dynamics of the system are always more likely to take the system to future macrostates that have a greater number of microstates associated with them.

I believe you only need a few basic assumptions about the dynamics governing the system to prove this, such as the assumption that the dynamics are such that Liouville's theorem (http://en.wikipedia.org/wiki/Liouville's_theorem_(Hamiltonian)) is respected (in classical statistical mechanics anyway; quantum statistical mechanics might require some different assumptions). I haven't studied this stuff in a while, but my understanding was that, at a conceptual level, one way of putting Liouville's theorem is this: pick any region of "phase space" (an abstract space where every point represents a particular microstate, and macrostates would be volumes containing many points) and assume the system is equally likely to occupy any point in that region; if you then evolve all these points forward to see the region of phase space that this set of systems occupies at some later time, the volume of the later region will be the same. It can be seen as a kind of conservation law for phase-space volume over time, and you are free to start from a volume of phase space representing a macrostate that is far from equilibrium.

One can see intuitively why this sort of thing would be helpful in proving the second law, since it means there can't be any small volumes of phase space that larger volumes are being "attracted" to, and we know that lower-entropy macrostates represent a much smaller proportion of the total volume of phase space than higher-entropy macrostates. See the discussion here, for example. Roger Penrose also has a good discussion of this stuff on pages 228-238 and 309-317 of his book The Emperor's New Mind (if you read that book, keep in mind that most mathematicians would disagree with his idiosyncratic ideas about human mathematical ability being noncomputable, and most physicists would disagree with his speculations about quantum gravity; his discussions of mainstream physics ideas are quite good, though).
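A tiny simulation can illustrate the tendency described above without any of the phase-space machinery. The following sketch (my own illustration, not from the post) uses the Ehrenfest urn model: N labelled particles sit in the left or right half of a box, and at each step one randomly chosen particle hops to the other half. Taking the coarse-grained macrostate to be just the number on the left, with ln C(N, n) as its entropy in units of k_B, a system started in the lowest-entropy macrostate almost always drifts toward the macrostates with the most microstates:

[code]
# Ehrenfest urn model: N particles, each in the left or right half of a box.
# Each step, one particle chosen uniformly at random hops to the other side.
# Coarse-grained macrostate: n_left, with Omega(n_left) = C(N, n_left) and
# S/k_B = ln(Omega).  Starting from n_left = N (lowest entropy), the entropy
# of the occupied macrostate almost always rises toward its maximum at N/2.
import math
import random

N = 100
n_left = N  # low-entropy start: all particles on the left

for step in range(2001):
    if step % 500 == 0:
        s = math.log(math.comb(N, n_left))
        print(f"step {step:4d}: n_left = {n_left:3d}, S/k_B = {s:6.2f}")
    if random.randrange(N) < n_left:   # the chosen particle was on the left
        n_left -= 1
    else:                              # it was on the right
        n_left += 1
[/code]

Each individual hop is just as reversible as its opposite; the one-way drift comes entirely from the fact that there are overwhelmingly more microstates near n_left = N/2 than near n_left = N.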
 
  • #9
Thanks everyone. I just joined this forum and frankly did not expect such active participation. I have decided to go through the information and links suggested (and come back if I still have some doubts) :)
 
  • #10

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. In simpler terms, it is the measure of how spread out or disorganized the particles in a system are.
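Quantitatively (this relation is standard statistical mechanics, not part of the original answer), the entropy of a macrostate is S = k_B ln Ω, where Ω is the number of microstates consistent with that macrostate and k_B is Boltzmann's constant.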

2. How does entropy relate to a cracked egg?

When an egg is cracked, it goes from a state of low entropy (the yolk and whites are contained within the shell) to a state of high entropy (the yolk and whites are spread out and mixed together). The entropy increases because, once the shell is broken, vastly more microscopic arrangements of the contents are consistent with the "cracked" description than were consistent with the "whole" one.

3. What is the difference between micro and macro states in relation to entropy?

A microstate specifies the exact configuration of every particle or molecule in a system, while a macrostate describes the overall state or behavior of the system as a whole in terms of a few bulk properties. In the case of a cracked egg, a microstate would be one particular arrangement of the individual molecules of yolk and whites, while the macrostate would be the overall condition of the egg (for example, "whole" or "cracked").

4. Can entropy be reversed?

In most cases, entropy cannot be reversed in practice. Once a system has reached a state of high entropy, it is overwhelmingly unlikely to return spontaneously to a state of low entropy, because the disordered macrostates correspond to vastly more microstates than the ordered ones.

5. How does understanding entropy help in scientific research?

Understanding entropy is crucial in various fields of science, including thermodynamics, chemistry, and biology. It helps scientists understand and predict the behavior and changes in complex systems, and also plays a role in understanding the direction of chemical and physical reactions.
