Entropy could decrease with time

In summary, the conversation discusses the concept of entropy and whether it can decrease over time. It is clarified that the increase of entropy is a statistical law rather than an absolute physical law, and that while entropy can decrease locally, the decrease must be balanced by an increase in the entropy of the environment. The rarity of a spontaneous entropy decrease is illustrated with an analogy using a bottle of air. It is also noted that entropy is a measure of disorder, although the definitions of order and disorder require care. Overall, it is concluded that the Second Law of Thermodynamics states that entropy tends to increase in closed systems, although small, short-lived decreases have been observed.
  • #1
Jack
Is it possible, even though highly improbable, that entropy could decrease with time i.e. particles randomly form a more ordered state to 'unbreak' something. If so I realize that they are incredibly unlikely to do so and it only is hypothetical but since this is a law of physics (I think) then it should not be broken at all.
 
  • #2
"Entropy increases" is NOT a "law of physics" in the sense in which you seem to mean it. It is a statistical law.
 
  • #3
Well, I don't know; perhaps in the U.S. it is not a law of physics, but in Europe it is!
The Second Law of Thermodynamics cannot be broken at all, neither statistically nor physically. But first you have to know what it actually says: the entropy of the universe has to increase in an irreversible (natural) process. The law can appear to be broken locally. The entropy of a system can decrease, but that decrease has to be overcome by an increase in the entropy of the environment. Statistically, the entropy of the universe as a whole can never decrease.

There are some processes in the universe (in a cosmological sense) that appear to break this law, but the decrease has to be balanced in another part of the universe.
 
  • #4
This is not a very well posed question. Entropy can and does increase in a system with a net input of energy; consider the surface of the Earth and the formation of life. The trend to higher states of organization shows an INCREASE in entropy, but we have a steady input of energy from the Sun. If you consider the Earth-Sun system, the entropy decreases; if you consider the Earth alone, entropy increases.

To speak meaningfully about entropy you must specify the system under consideration.
 
  • #5
If you have, for example, a bottle of air, it is "possible" (ahem) that most of the gas molecules would head over to one side of the bottle, and temporarily you would have a system that went to a lower state of entropy.

If you used a colored gas and waited for all eternity, you could see this occur, making this an exciting science project!

Better to play the odds and invest your savings in lottery tickets.
 
  • #6
it is "possible" (ahem),
Quite right. But to get a feel for how slim the possibility is, let us say we have an anthropomorphic-sized bottle, 10 cm across, at room temperature. How many molecules should we allow in to have an even chance of seeing them all on one side at least once in our lifetime?

Ans: 43.

That's right, 43.

How many to have an even chance of such an event during the life of the universe?

Ans: 70.

In such a container we typically have on the order of 10^20 particles. That's why discussions of spontaneous changes in entropy are not productive: for macroscopic numbers of particles we use statistical mechanics, and all other cases involve so few particles that we can trace them individually.
 
  • #7
Originally posted by Integral
This is not a very well posed question. Entropy can and does increase in a system with a net input of energy; consider the surface of the Earth and the formation of life. The trend to higher states of organization shows an INCREASE in entropy, but we have a steady input of energy from the Sun. If you consider the Earth-Sun system, the entropy decreases; if you consider the Earth alone, entropy increases.

To speak meaningfully about entropy you must specify the system under consideration.
Late night? Since entropy is a measure of disorder, an increase in order is a decrease in entropy.:wink:
 
  • #8
Originally posted by krab
Quite right. But to get a feel for how slim the possibility is, let us say we have an anthropomorphic-sized bottle, 10 cm across, at room temperature. How many molecules should we allow in to have an even chance of seeing them all on one side at least once in our lifetime?

Ans: 43.
Sounds reasonable, but I'd love to know where you got that from.
 
  • #9
Originally posted by russ_watters
Late night? Since entropy is a measure of disorder, an increase in order is a decrease in entropy.:wink:
I don't think this is technically true, is it? My understanding was that entropy has to do with heat and efficiency, not with order specifically.
 
  • #10
The 2nd law should properly be stated as: entropy tends to increase in closed systems. On small scales, over short time periods, entropy has been observed to decrease, due to its statistical nature.

Entropy is a measure of disorder, but you have to be very careful exactly what you define as order and disorder, as it relates more to the number of possible states than to any concept favoured by you humans.
 
  • #11
Originally posted by jcsd
The 2nd law should properly be stated as: entropy tends to increase in closed systems. On small scales, over short time periods, entropy has been observed to decrease, due to its statistical nature.

Entropy is a measure of disorder, but you have to be very careful exactly what you define as order and disorder, as it relates more to the number of possible states than to any concept favoured by you humans.
Isn't a law of thermodynamics supposed to be about heat?
 
  • #12
Originally posted by Integral
This is not a very well posed question. Entropy can and does increase in a system with a net input of energy; consider the surface of the Earth and the formation of life. The trend to higher states of organization shows an INCREASE in entropy, but we have a steady input of energy from the Sun. If you consider the Earth-Sun system, the entropy decreases; if you consider the Earth alone, entropy increases.

To speak meaningfully about entropy you must specify the system under consideration.
You are completely right here. The more strongly an object communicates with other objects, i.e. the more interactions it has, the greater its order and, accordingly, the lower its entropy. Isolating an object and restricting its interactions leads to an increase of its entropy, down to full destruction. This phenomenon is a cause of evolution.
 
  • #13
There are two relatively independent notions of entropy, thermodynamic entropy and informational entropy. The two are conveniently confused when suitable, and they are interchangeable most of the time.

Th-entropy is a measure of the energy not available to do work. A fancy definition of something that is not there; where the energy goes is left as an exercise for the reader, but a uniform spreading out of energy, or bound energy, are examples.

I-entropy is a measure of the number of possible states a system can take.
Condensation of gases, ice formation and other phase transitions are examples that look counterintuitive for I-entropy, as the number of states is reduced. But here comes the saver: th-entropy increases as energy is bound up and becomes 'unavailable'. Thus we can safely say that entropy increases anyway.

I'm no authority, but I've got the impression that practically anything that has the capacity to spontaneously form more complex systems from environmental noise is basically a local decrease of I-entropy. Interestingly, this is about the only scenario in which I-entropy can decrease. Any kind of forced change to a system results in more destruction than what forms as a result. Molecular bonds and planets orbiting suns are examples of spontaneous decreases of I-entropy. Similarly, the formation of life.
Th-entropy seems to be a bookkeeping measure. Everything costs energy, although energy doesn't disappear. It gets locked up, or spread out evenly.

I'm sure I'll be corrected if that's not sane.
 
  • #14
Originally posted by russ_watters
Sounds reasonable, but I'd love to know where you got that from.

Calculate the average speed of a particle from the temperature. From the container size, this gives the time for a particle to traverse the container. Consider this to be the time required for the system to change state (states being defined by which particles are in which half of the container), and so calculate how many state changes can occur in the given time period. Compare this with 2^N.
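A back-of-the-envelope sketch of that estimate in Python follows; the bottle width, temperature, molecule mass and "lifetime" are my own assumed values, not krab's exact inputs:

```python
import math

# Rough sketch of the estimate described above.
# All input numbers below are assumptions for illustration.
k_B = 1.38e-23           # Boltzmann constant, J/K
m = 4.65e-26             # mass of an N2 molecule, kg (approx.)
T = 300.0                # room temperature, K
L = 0.10                 # bottle width, m
lifetime = 80 * 3.15e7   # ~80 years, in seconds (assumed)

v_mean = math.sqrt(8 * k_B * T / (math.pi * m))  # mean molecular speed, m/s
t_state = L / v_mean                             # traversal time = time per "new state"
n_trials = lifetime / t_state                    # independent looks during a lifetime

# Probability that all N molecules sit in one specified half is 2**(-N).
# For an even chance of seeing it at least once: n_trials * 2**(-N) ~ 0.5.
N = math.log2(2 * n_trials)
print(f"N = {N:.0f} molecules")                  # lands in the low-to-mid 40s
```

The exact answer shifts by a molecule or two with the assumed lifetime and speed, but it comes out close to the 43 quoted above, and since N grows only logarithmically with the observation time, the answer for the lifetime of the universe is still only about 70.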
 
  • #15
"Isn't a law of thermodynamics supposed to be about heat?"

I like your signature, Zero. Otherwise I would not answer this silly question.

Thermodynamics is the study of transformations of energy. It covers much territory other than heat. Work, for example, or electrode potentials, or the direction of chemical reactions.

First Law:
If you put a hot block and a cold block in a Styrofoam cooler, the first law says the total internal energy inside the cooler will not change (assuming perfect insulation) unless you open it up and add more heat or work (or more material--usually not considered).

Or, (the change in internal energy of a system) = (the energy added as heat) + (the energy added as work).

Second Law:
However, how do you know what happens if you slide the cold block next to the hot block? The first law just says that the energy inside the Styrofoam box remains the same.

However, if you define a mathematical fudge factor as (the change in heat)/temperature, you have a property, entropy, that will tell you what happens when you put the cold block against the hot one.

If the hot block loses an amount of heat energy to the cold block, it is doing so at a relatively high temperature, so its entropy decreases by a small amount, since temperature is in the denominator. The cold block gains heat at a lower temperature, so its entropy increases by a larger amount. Thus the total entropy for the system inside the box increases when the hot block heats the cold block. If heat flowed from the cold block to the hot block, system entropy would decrease, so we can predict this is not the direction the process would take.

In general, spontaneous changes are accompanied by an increase in entropy. This can be used to determine the direction of chemical reactions, for example.
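For concreteness, here is a quick worked example; the temperatures (400 K and 300 K) and the 1 J of heat are my own illustrative numbers. If 1 J of heat flows from the hot block to the cold block,

$$
\Delta S_{\text{hot}} = \frac{-1\ \text{J}}{400\ \text{K}} = -0.0025\ \tfrac{\text{J}}{\text{K}}, \qquad
\Delta S_{\text{cold}} = \frac{+1\ \text{J}}{300\ \text{K}} \approx +0.0033\ \tfrac{\text{J}}{\text{K}},
$$

$$
\Delta S_{\text{total}} \approx +0.0008\ \tfrac{\text{J}}{\text{K}} > 0,
$$

so the hot-to-cold direction raises the total entropy, while the reverse flow would lower it, which is why it is not observed.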
 
  • #16
Originally posted by Zero
Isn't a law of thermodynamics supposed to be about heat?

Yes, really it should be, but it does usually correspond to the number of trivial arrangements something can have. For example, the entropy of a black hole can be thought of both in terms of its temperature and also, in more advanced speculative theories, in terms of the number of different arrangements at a fundamental level.
 
  • #17
The entropy law says entropy as a whole increases in an isolated system.
Consider the process of star formation from a nebula (assuming it to be an isolated system). Matter is diffuse at first, so the nebula begins in a high-entropy state. Somewhere in this nebula there is a region of higher mass density, creating a gravitational gradient. Matter from the rest of the nebula is attracted towards this region. Thus clumping occurs, and soon most of the matter that was present in the nebula is concentrated in this region of space, creating a new star. This is how all stars, planets and even galaxies came into being. Does this not mean that, due to the gravitational force, a system spontaneously evolves from a state of higher entropy to a lower one (as order increases while matter converges towards one particular region of the system)? Then how can the second law be valid here?
 
  • #18
When you factor in gravity, entropy gets more complex but the law still holds. I'd say remember that the matter in the nebula is losing gravitational potential energy (GPE).
 
  • #19
Originally posted by sage
does this not mean that due to gravitational force a system is spontaneously evolving from a state of higher entropy to a lower one(as order is increasing as matter is converging towards one particular region of the system)?

The matter becomes hotter as the gravitational force accelerates it and collisions force molecules and atoms into higher speeds (and greater disorder), so I suspect the decrease in entropy due to increasing localization of matter is more than offset by the increasing disorder of atoms from the increasing temperature.

In this case we have gravitational forces doing work on matter to force it into a more localized state. I'm not sure if this should be analyzed as a work problem or an entropy problem. Have to think about that. Any help?
 
  • #20
Absolute order and zero entropy are possible only at absolute zero temperature. But can matter exist in this condition? Since the atom consists of wave structures, and a wave is a fluctuation, the answer to this question is no.
The value of the temperature defines the behaviour of atoms and macroscopic objects; it is the key that starts the corresponding program of functioning. As a result of interactions, new objects are created. The evolution of the universe can be connected directly with the change of temperature; the Earth is an obvious example.
I do not think that life was brought to the Earth from other planets. As it cooled down, the heated sphere developed as a complex system whose order increased in inverse relation to its temperature.
 
  • #21
And don't you think our meeting here to write about this goes contrary to the 2nd law? We are here, in a more ordered state, talking about this, instead of walking around out there with a pretty girl, in a more disordered state.
 
  • #22
Originally posted by jcsd
When you factor in gravity, entropy gets more complex but the law still holds. I'd say remember that the matter in the nebula is losing gravitational potential energy (GPE).
Interesting. Does increasing entropy mean increasing GPE?
 
  • #23
No, but what it does mean is that you're going to have to do a lot of work to take the star and put it back into its nebula state, more work than it took to form the star.
 
  • #24
Originally posted by Zero
Isn't a law of thermodynamics supposed to be about heat?
Heat is kinda the manifestation of entropy in thermodynamics; it's a specific definition. "Disorder" is the more general definition and can be applied virtually anywhere. The dictionary does a pretty good job with it:

1. Symbol S For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
2. A measure of the disorder or randomness in a closed system.
3. A measure of the loss of information in a transmitted message.
4. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
5. Inevitable and steady deterioration of a system or society.
 
  • #25
In this case we have gravitational forces doing work on matter to force it into a more localized state. I'm not sure if this should be analyzed as a work problem or an entropy problem. Have to think about that. Any help?
You have two states: initially randomly distributed particles, and a final sphere with the same mass.
You have to find out whether the heat generated by the formation of the sphere -- Q_electric -- is greater than the work required to make the sphere -- W_gravitational.
 
  • #26
Originally posted by Zero
I don't think this is technically true, is it? My understanding was that entropy has to do with heat and efficiency, not with order specifically.

The Boltzmann definition of entropy is S = k ln W, where k is Boltzmann's constant and W is the number of microstates (distinct arrangements) available to the system. More available microstates means more disorder.

Entropy itself has nothing to do with heat and efficiency. But since the entropy change during a thermodynamic process must increase or stay the same (2nd law of thermodynamics), there are the familiar consequences that engine efficiencies must be < 1, heat flows from hotter to colder objects, etc.
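As a small illustration tying this back to the bottle-of-air example earlier in the thread (a sketch of my own, not part of the original post): if each of N molecules is restricted to one half of the container, the number of accessible positions per molecule is halved, so

$$
\Delta S = k \ln\!\frac{W_{\text{half}}}{W_{\text{full}}} = k \ln 2^{-N} = -N k \ln 2 ,
$$

a decrease that becomes utterly improbable to produce by chance once N is macroscopic, which is the statistical content of the 2nd law.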
 

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a thermodynamic property that describes the amount of energy in a system that is unavailable for work.

2. Can entropy decrease with time?

In principle entropy can fluctuate downward in an isolated system, but for macroscopic systems this is overwhelmingly unlikely. The second law states that in any isolated system the total entropy will, on average, increase or stay the same over time, never decrease.

3. How does the second law of thermodynamics relate to entropy?

The second law of thermodynamics states that in any isolated system, the total entropy will always increase or stay the same over time. This means that the overall disorder or randomness in a system will always increase or remain constant, and it is impossible for it to decrease without external intervention.

4. Are there any exceptions to the second law of thermodynamics?

There are some rare cases where it may appear that the second law of thermodynamics is violated, such as when a system becomes more ordered over time. However, these cases usually involve energy being input into the system from an external source, which can create the appearance of decreasing entropy.

5. How does the concept of entropy apply to the universe?

The concept of entropy is closely related to the idea of the universe's eventual heat death. As the universe continues to expand and dissipate energy, entropy will continue to increase until all of the energy is evenly distributed and no work can be done. This state of maximum entropy is also known as the heat death of the universe.
