Entropy & Order: A Puzzling Paradox

  • Thread starter: JJRittenhouse
  • Tags: Entropy
In summary: Entropy is a measure of how disordered a system is. It goes up as things become more chaotically mixed, and in a closed system it can only increase or stay the same; it never decreases on its own.
  • #1
JJRittenhouse
Something that has confused me for some time is how the entropy of the universe was lower in the past than it is now, even though the universe seems to be getting more ordered.

I know that entropy must increase in any closed system. If the universe is finite, is it then a closed system?

I know also, for example, that if my room were messy and I cleaned it up, bringing more order to it, the act of cleaning would expend energy, increasing the entropy of the system.

Does this act, along with the new state of the room, increase the entropy more than if the room were left messy?
 
  • #2
JJRittenhouse said:
Something that has confused me for some time is how the entropy of the universe was lower in the past than it is now, even though the universe seems to be getting more ordered.

How does it seem to be getting more ordered? And how do you define order (although this may be somewhat off topic)?

Although this might fit better in the cosmology sub-forum, I believe the Big Bang theory describes the universe's initial condition as a 'low-entropy initial condition'. The Cosmic Microwave Background (CMB) maps show how homogeneous the early universe was.

Personally speaking, I tend to think of entropy as a measure of our uncertainty about the system's microstate. I think this is less confusing than ascribing entropy to 'order'.

JJRittenhouse said:
I know also, for example, that if my room were messy and I cleaned it up, bringing more order to it, the act of cleaning would expend energy, increasing the entropy of the system.

Does this act, along with the new state of the room, increase the entropy more than if the room were left messy?

Yes. This is mandated by the second law.
 
  • #3
Firstly, you have it in reverse: entropy is supposed to increase (become less ordered) in a "closed" system. But "inverse of order" is not the best intuition about entropy. The entropy of a system is a measure of its "randomness", which isn't a perfect antonym of "order" but rather of "certainty". A low-entropy system may be a "mess", but it is a particular "mess" as opposed to an unknown one of many possible "messes". Since there are typically more ways to be "messy" than "ordered", we must reduce entropy to impose order on a system.

It seems wrong to think of entropy as a measure of ignorance until you look closely at what knowledge means in science. A known system is an observed system and observations are physical acts. So entropy is closely tied to how we observe and constrain systems to keep their variables known. We begin to smell the distinct aroma of quantum theory here.

Another issue is the "closed" business. Taking classical mechanics as a large-scale approximation of quantum mechanics, a "decoherence" assumption is necessary to preserve a classical description of an evolving system, and this decoherence implies some interaction with the environment (in the form of interactions with measuring devices and thermodynamic boundaries). Classically, thermodynamic isolation means no exchange of energy or material, as when considering a gas in an insulated box. There are still the interactions that keep the gas bounded in that box (and thus "measure" the locations of the gas molecules to be within the box).

I have found it is possible to "define" the entropy of the whole universe (assuming a finite closed universe) to be constant (namely = 0) without paradox if you look at it quantum mechanically. One way to look at the entropy of a given system is as a measure of its entanglement with its environment (the rest of the universe), and thus the degree to which the system cannot be maximally described in isolation.

The interesting thing in QM is that you can have a system with two components A and B in which the entropy of the whole is 0, but, due to entanglement, the subsystems must be described as having non-zero entropies when considered separately. It is in fact the very definition of entanglement that this is the case, and we can use entropy as a quantifier of entanglement.
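
A minimal numerical sketch of that claim (my addition, not part of the original post), using a Bell pair: the joint two-qubit state is pure, so its von Neumann entropy is zero, yet each qubit on its own is maximally mixed with entropy ln 2.

[code]
import numpy as np

# Bell state |psi> = (|00> + |11>)/sqrt(2): the joint system AB is pure.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())  # 4x4 density matrix of the whole system

# Reduced density matrix of subsystem A: partial trace over B.
rho_A = np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))

def von_neumann_entropy(r):
    """S(rho) = -Tr(rho ln rho), summed over nonzero eigenvalues."""
    p = np.linalg.eigvalsh(r)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

print(von_neumann_entropy(rho))    # ~0.0: the whole has zero entropy
print(von_neumann_entropy(rho_A))  # ~0.693 = ln 2: the part is maximally mixed
[/code]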

I discussed this in a thread: https://www.physicsforums.com/showthread.php?t=170493
 
  • #4
Maybe this is out of line, but if you say the entropy of the universe is zero, and must stay zero, then that implies the evolution of the universe is reversible, does it not?
 
  • #5
Hopefully this clears entropy up for you :) If not, let's just admit it's damn catchy.

Entropy, how can I explain it? I'll take it frame by frame it,
to have you all jumping, shouting saying it.
Let's just say that it's a measure of disorder,
in a system that is closed, like with a border.
It's sorta, like a, well a measurement of randomness,
proposed in 1850 by a German, but wait I digress.
"What the **** is entropy?", I hear the people still exclaiming,
it seems I got to start the explaining.

You ever drop an egg and on the floor you see it break?
You go and get a mop so you can clean up your mistake.
But did you ever stop to ponder why we know it's true,
if you drop a broken egg you will not get an egg that's new.

That's entropy or E-N-T-R-O to the P to the Y,
the reason why the sun will one day all burn out and die.
Order from disorder is a scientific rarity,
allow me to explain it with a little bit more clarity.
Did I say rarity? I meant impossibility,
at least in a closed system there will always be more entropy.
That's entropy and I hope that you're all down with it,
if you are here's your membership.

Defining entropy as disorder's not complete,
'cause disorder as a definition doesn't cover heat.
So my first definition I would now like to withdraw,
and offer one that fits thermodynamics second law.
First we need to understand that entropy is energy,
energy that can't be used to state it more specifically.
In a closed system entropy always goes up,
that's the second law, now you know what's up.

You can't win, you can't break even, you can't leave the game,
'cause entropy will take it all 'though it seems a shame.
The second law, as we now know, is quite clear to state,
that entropy must increase and not dissipate.

Creationists always try to use the second law,
to disprove evolution, but their theory has a flaw.
The second law is quite precise about where it applies,
only in a closed system must the entropy count rise.
The Earth's not a closed system, it's powered by the sun,
so **** the damn creationists, Doomsday get my gun!
That, in a nutshell, is what entropy's about,
you're now down with a discount.
 
  • #6
jambaugh said:
It seems wrong to think of entropy as a measure of ignorance until you look closely at what knowledge means in science. A known system is an observed system and observations are physical acts. So entropy is closely tied to how we observe and constrain systems to keep their variables known. We begin to smell the distinct aroma of quantum theory here.

Interesting you would talk about knowledge. I read Murray Gell-Mann's "The Quark and the Jaguar" a few years back, and he talks about how entropy and order fit neatly into information theory. According to Gell-Mann, you can't get more ordered without some kind of record being written. It doesn't have to be a sentient record, just a mark that is kept and doesn't get destroyed. This record, both the writing of it and its continued existence, adds to the entropy of the system immediately, so that thermodynamics isn't violated even temporarily. And then when the information breaks down, so does the system that relies on it, increasing entropy further.
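
As an aside (my illustration, not Gell-Mann's own example), the information-theoretic entropy in question is Shannon's: the average number of bits needed to record a system's state. An orderly record is predictable and cheap to store; a disordered one costs more bits.

[code]
import math
from collections import Counter

def shannon_entropy_bits(data):
    """H = -sum p_i log2 p_i: average bits per symbol needed to record 'data'."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

print(shannon_entropy_bits("AAAAAAAAAA"))  # 0.0 bits: perfectly ordered record
print(shannon_entropy_bits("ABCDABCDAB"))  # ~1.97 bits: more disorder, more bits
[/code]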


Does this mean that organizing something is actually just shifting entropy into different forms? Does organization generate entropy more efficiently, considering everything it takes to maintain it?

I had considered that it might have something to do with the energy available to and being used in a system, like how if you had more energy, things would be more organized, but then I realized that didn't make much sense. If you had the energy of the surface of the sun, there is a lot of complex organization that is NOT going to happen there, like life. But if you don't have enough, like a company starving for resources, then the organization starts to break down. It has also been shown that investments of resources yield diminishing returns in most systems (money or otherwise).

So... I don't know, maybe that is all off topic for entropy and organization, but it does seem to me that each form of organization has a specific range of energy that will support it; too much or too little either overwhelms or starves the system.

I've been wondering more and more about entropy, as there are theories linking entropy to gravity, to information theory, to just about everything, and you mentioned QM now... so how far off base am I? And which way actually generates more total entropy: leaving the room messy or cleaning it up?
 
  • #7
JJRittenhouse said:
I had considered that it might have something to do with the energy available to and being used in a system, like how if you had more energy, things would be more organized, but then I realized that didn't make much sense. If you had the energy of the surface of the sun, there is a lot of complex organization that is NOT going to happen there, like life. But if you don't have enough, like a company starving for resources, then the organization starts to break down. It has also been shown that investments of resources yield diminishing returns in most systems (money or otherwise).

It seems that you are taking great liberty in interpreting scientific definitions in any way you like, which is not a helpful thing in science. You are completely distorting the classical thermodynamic definition of entropy. Entropy was defined in a thermodynamic sense as a measure of the amount of energy (heat) that CANNOT be converted into useful work. This definition, along with the second law of thermodynamics, explained why many thermodynamic processes, like internal combustion engines, have low efficiency, thus putting a theoretical obstacle in the way of perpetual motion machines. This has nothing to do with what you're saying about the energy of the sun and companies starving for resources. Also, you said "if you had more energy, things would be more organized"; this is not true and is not a valid interpretation of the definition of entropy in classical thermodynamics.

JJRittenhouse said:
I've been wondering more and more about entropy, as there are theories linking entropy to gravity, to information theory, to just about everything, and you mentioned QM now... so how far off base am I? And which way actually generates more total entropy: leaving the room messy or cleaning it up?

I find it essential to put the definitions in their theoretical frameworks and historical perspective in order to have a clearer understanding of what physics has to say about entropy.

Let me have a shot at it.

The first definition of entropy was Clausius' thermodynamic definition: entropy is a measure of the amount of heat that cannot do useful work. (BTW, work has a very strict definition in thermodynamics as well, so 'life' as per your previous interpretation cannot fit into that definition.) It was also described as a measure of energy dispersal, which can loosely be related to disorder. So far entropy had nothing to do with any phenomenon outside of thermodynamic processes. It also had nothing to do with the 'randomness' of particles; indeed, at that time scientists weren't even sure that fluids consist of particles.

That had to wait until the statistical-mechanics definition of entropy was introduced. In statistical mechanics, entropy is defined as the log of the number of microstates that correspond to a certain macrostate. In other words, entropy is a measure of how many ways there are of arranging your microstates (particles, for example) so as to realize one particular macrostate (the volume, temperature, and energy of the whole system). This can be thought of as the fundamental definition of entropy: any other definition (including the classical thermodynamic one) can be derived from it, but not always vice versa. This definition related entropy to the amount of randomness or disorder of the particles (although there are some criticisms of that interpretation). It also extended the definition of entropy outside the realm of thermodynamics, which enabled its use in information theory, gravitation, and black-hole thermodynamics.
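
To make the counting definition concrete, here is a toy sketch (mine, not part of the reply above): take 100 two-state particles, e.g. spins. The macrostate "all up" is realized by exactly one microstate, while "50 up, 50 down" is realized by C(100, 50) of them, so [itex]S = k_B \ln W[/itex] is enormously larger for the mixed macrostate.

[code]
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(W):
    """Statistical entropy S = k_B ln W of a macrostate with W microstates."""
    return k_B * math.log(W)

print(boltzmann_entropy(1))                   # 0.0 J/K: "all up" is unique
print(boltzmann_entropy(math.comb(100, 50)))  # ~9.2e-22 J/K: "50/50" dominates
[/code]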
 
  • #8
HossamCFD said:
It seems that you are taking great liberty in interpreting scientific definitions in any way you like, which is not a helpful thing in science. You are completely distorting the classical thermodynamic definition of entropy.

...probably because my only real exposure to physics is layman's books. I'll be taking physics soon, but I have to finish trig first. Please take my questions as just questions. I came here because I knew people who know more about physics than I do would be here.
 
  • #9
JJRittenhouse
Sorry if my reply sounded a bit rude. I just wanted to warn you that Entropy can easily confuse you if you read too much into it without confining yourself to the mainstream definitions of it.

Back to your question about your messy room: the second law says that it is highly unlikely that your room will get more ordered by itself as time passes. It is far more likely to get more disordered (hence increasing the entropy). Now, if you interrupt this naturally occurring process and clean the room up, you will definitely decrease its entropy, but only at the expense of doing work on it; it won't happen by itself. However, this does not violate the second law, because the law says that entropy can only increase in CLOSED systems. Your room is not a closed system, because work has been done on it by an external agent (you). So you have to include yourself in the system for it to be closed. The work you did to clean the room resulted in increasing the entropy (you dispersed mechanical energy and heat during the cleaning that were otherwise well ordered in the food calories you ate and the oxygen you breathed). So whenever you close the system, you will find that entropy can increase or remain constant, but never decreases.
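
To put rough, purely illustrative numbers on that bookkeeping (my addition, not part of the reply above): suppose cleaning the room dissipates about [itex]10^5[/itex] J of metabolic heat into air at roughly 300 K. The entropy generated is then

[tex]\Delta S \approx \frac{Q}{T} \approx \frac{10^5\ \mathrm{J}}{300\ \mathrm{K}} \approx 330\ \mathrm{J/K},[/tex]

while the configurational entropy removed from the room, [itex]k_B \ln W[/itex], is only of order [itex]10^{-21}[/itex] J/K even for an astronomical number of arrangements (say [itex]W = 10^{100}[/itex]). Once you close the system (room plus cleaner), the total entropy rises by a huge margin.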
 
  • #10
[itex]S = k \log W[/itex]

It has little to do with your perception of whether things seem to be more or less ordered.
 
  • #11
The simple Laws of Thermodynamics cannot be applied to the Universe as a whole. Have you studied Thermodynamics in curved space-time?
 
  • #12
I have not, nor do I ever think I will. Please explain.
 
  • #13
For one thing, the space density of a physical quantity [itex]u[/itex] is not an invariant under general coordinate transformations. I think it needs to be divided by the square root of the 00 component of the metric tensor; in other words, [itex]u/\sqrt{g_{00}}[/itex] is such an invariant.

Secondly, the generalization of the law of conservation of energy to 4-dimensional space-time, namely the continuity equation for the stress-energy tensor, is a simple consequence of Einstein's field equations and is not a law of conservation of anything in the strict sense of the word. One needs to define a so-called stress-energy pseudotensor as a proper description of the energy-momentum contained in the gravitational field itself. In other words, the gravitational field carries energy, and gravitational waves can carry energy and momentum out of a system.

Thirdly, the equation of state for the Universe cannot be derived from Einstein's equations alone. Therefore, it is not derivable solely by general thermodynamic arguments (just as you cannot derive the ideal gas law within ordinary thermodynamics).
 
  • #14
That's all good to know. You have clearly outdone me in your knowledge of general relativity. What does that mean as far as the OP?
 
  • #15
The OP juggles unscientific notions of scientific terms. Until they pose their question in a well-formulated manner, this thread will be a soap-box podium for people making unrelated claims.
 

1. What is entropy?

Entropy is a scientific concept that refers to the amount of disorder or randomness in a system. It is determined by the number of possible arrangements or states the system can have and the likelihood of each arrangement occurring.

2. How does entropy relate to order?

Entropy and order are often seen as opposites, with entropy representing disorder and order representing structure. The counting definition makes this precise: a deck of cards sorted into one specific pattern corresponds to a single arrangement and so has low entropy, while a shuffled deck can be in any of 52! possible arrangements and so has far higher entropy.
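
A quick check of that count (my illustration, not part of the original answer): the entropy of a fully shuffled deck, in units of [itex]k_B[/itex], is [itex]\ln 52! \approx 156.4[/itex], versus [itex]\ln 1 = 0[/itex] for the sorted deck.

[code]
import math

# ln(52!): shuffled-deck entropy in units of k_B; ln(1) = 0 for the sorted deck.
print(math.lgamma(53))               # ~156.36 via the log-gamma function
print(math.log(math.factorial(52)))  # same value via the exact integer
[/code]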

3. Is there a universal law of entropy?

Yes. The second law of thermodynamics states that the total entropy of a closed system can never decrease over time, and in any irreversible process it increases. This means that closed systems tend to move toward more disordered states.

4. How does entropy play a role in our daily lives?

Entropy is a fundamental concept in many fields, including physics, chemistry, biology, and information theory. In our daily lives, we can observe entropy in the natural decay and degradation of objects, the mixing of substances, and the movement of energy.

5. Can entropy be reversed?

While the second law of thermodynamics states that the total entropy of a closed system never decreases, local decreases in entropy are possible. For example, a plant can use energy from the sun to grow and become more ordered, but this local decrease is more than offset by an overall increase in the entropy of the universe.
