# Resolving the Paradox of Entropy and Energy

Cosmossos
We know that when the energy drops, the entropy increases. But when the energy of a gas drops, the gas will eventually become a solid, which has fewer degrees of freedom and therefore the entropy should decrease, not increase.
Moreover, high temperature means more entropy (a gas, for example), and as the temperature increases the energy increases, and so does the entropy. But the entropy should DECREASE AS LONG AS THE ENERGY INCREASES.
Where is my mistake?

Cosmossos said:
we know that when the energy drops the entropy increases.
You have that backwards. dS = dQ/T. When energy is removed from a system, the entropy decreases.
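Doc Al's point can be checked with a one-line calculation. This is a sketch with made-up numbers, not values from the thread:

```python
# Sketch of dS = dQ/T for a system at (approximately) constant temperature.
# The numbers are illustrative only.

def entropy_change(q_into_system, temperature):
    """Entropy change dS = dQ/T for heat q_into_system (J) absorbed at temperature (K)."""
    return q_into_system / temperature

# Remove 300 J of heat from a gas held near 300 K:
dS = entropy_change(-300.0, 300.0)
print(dS)  # -1.0 J/K: this system's entropy decreases
```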

Entropy never decreases!

phywjc said:
Entropy never decreases!
Total entropy including everything never decreases. But the entropy of some particular system can certainly decrease. (If one system loses entropy, some other system must gain at least that much.)
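The parenthetical can be checked with a two-reservoir sketch: when heat Q flows from a hot body to a cold one, the hot side loses Q/T_hot of entropy while the cold side gains the larger amount Q/T_cold. The numbers below are assumed for illustration:

```python
# Heat Q flows spontaneously from a hot reservoir to a cold one.
Q, T_hot, T_cold = 1000.0, 400.0, 300.0  # J and K, illustrative values

dS_hot = -Q / T_hot     # hot reservoir's entropy drops:  -2.5 J/K
dS_cold = Q / T_cold    # cold reservoir's entropy rises: +3.33 J/K
dS_total = dS_hot + dS_cold

print(dS_total)  # positive: the gain exceeds the loss
```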

I don't understand. It is known that if I have some gas and its energy drops, then the entropy should increase. It's written in every textbook.
But in that case, the gas will eventually become a solid, which has fewer degrees of freedom and therefore less entropy...

you can look here http://en.wikipedia.org/wiki/Entropy
under classical thermo.

Cosmossos said:
I don't understand. It is known that if I have some gas and its energy drops, then the entropy should increase. It's written in every textbook.
Give one example.

you can look here http://en.wikipedia.org/wiki/Entropy
under classical thermo.
Where does it say anywhere that removing energy from a gas causes its entropy to increase?

It says that if the energy drops/decreases, the entropy increases.
So a gas is an example.

Cosmossos said:
It says that if the energy drops/decreases, the entropy increases.
Where does it say this? Cut and paste an exact quote.

From wikipedia on entropy: "An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), involved in the operation of the air conditioner, will always make a bigger contribution to the entropy of the environment than will the decrease of the entropy of the air of that system. Thus, the total of entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics."

If you decrease the entropy of a system (in this case the air in the room) then you MUST increase it in another system (In this case, the air outside).
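The air-conditioner bookkeeping can be sketched numerically. The temperatures and coefficient of performance below are assumptions chosen for illustration, not values from the Wikipedia article:

```python
# An A/C pumps heat Q_cold out of a room at T_room, consuming work W,
# and dumps Q_cold + W into the outside air at T_outside.
T_room, T_outside = 293.0, 308.0   # K (20 C inside, 35 C outside, assumed)
Q_cold = 1000.0                    # J removed from the room
COP = 3.0                          # coefficient of performance (assumed)

W = Q_cold / COP                   # electrical work consumed
Q_hot = Q_cold + W                 # total heat expelled outside

dS_room = -Q_cold / T_room         # the room's entropy decreases...
dS_outside = Q_hot / T_outside     # ...but the outside gains more
print(dS_room + dS_outside)        # positive, as the second law requires
```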

Ok thanks
and I have another question.
Entropy is a non-conserved state function, and in isolated systems it can't decrease, so dS >= 0.
In the case dS = 0 it is conserved, because it doesn't change. How can it be a non-conserved state function and still allow a case where dS = 0?

Here is a paper written by an old physics professor of mine. He really does a great job of explaining all of this, but at the same time, it is more in the scope of a freshman physics course than wikipedia. Read the paper and take notes! After that you will feel much better about entropy.
If you like this paper, then you should check out a few others. He goes through Force, Work/Energy, Rotational Dynamics, Collections of Particles.. and does a paper on Thermodynamics in Chemistry.
http://orchard.wccnet.org/~gkapp/

#### Attachments

• Exploring Thermo-Physics (1).pdf
Cosmossos said:
Ok thanks
and I have another question.
Entropy is a non-conserved state function, and in isolated systems it can't decrease, so dS >= 0.
In the case dS = 0 it is conserved, because it doesn't change. How can it be a non-conserved state function and still allow a case where dS = 0?
Entropy is a state function. This means it depends only upon the thermodynamic state not on the path taken between thermodynamic states.

Entropy is non-conserved. If the sum of all entropy changes were always 0, entropy would be a conserved quantity. But it isn't. The sum of all changes in entropy of a system and its surroundings can theoretically be as close to 0 as you wish to make it (provided you have enough time to carry out the process). But that special case does not mean it is always conserved. In real life, the entropy of the universe (system + surroundings) increases during any process.

AM
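AM's point that the total entropy change can be made arbitrarily small (given enough time) can be illustrated by cooling a body through many intermediate reservoirs instead of one big step. This is a sketch with assumed units (heat capacity C = 1):

```python
import math

def entropy_generated(T_start, T_end, n_steps, C=1.0):
    """Total entropy generated when a body of heat capacity C is cooled
    from T_start to T_end (K) by touching n_steps successively colder
    reservoirs. More steps = closer to reversible = less generation."""
    temps = [T_start - (T_start - T_end) * k / n_steps
             for k in range(n_steps + 1)]
    total = 0.0
    for T_prev, T_res in zip(temps, temps[1:]):
        dS_body = C * math.log(T_res / T_prev)  # body cools to T_res
        dS_res = C * (T_prev - T_res) / T_res   # reservoir absorbs the heat
        total += dS_body + dS_res
    return total

print(entropy_generated(400.0, 300.0, 1))     # one big irreversible step
print(entropy_generated(400.0, 300.0, 1000))  # nearly reversible: close to 0
```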

I am confused. Wiki says that entropy increases as substances "disgregate." What is disgregating in the a/c example? Clouds of relatively dense cold air?

From the second law of thermodynamics, there is no contradiction in a spontaneous process in which energy drops while entropy decreases.

lionelwang said:
From the second law of thermodynamics, there is no contradiction in a spontaneous process in which energy drops while entropy decreases.

Don't you have to specify some empirical/contextual parameters to make this statement testable?

brainstorm said:
I am confused. Wiki says that entropy increases as substances "disgregate." What is disgregating in the a/c example? Clouds of relatively dense cold air?

The heat is transferred from the air in the room and ends up outside. The outside air disgregates, as it is heated up by the transfer of heat from the inside of the room.

Drakkith said:
The heat is transferred from the air in the room and ends up outside. The outside air disgregates, as it is heated up by the transfer of heat from the inside of the room.

Isn't the a/c itself reducing entropy by compressing air and removing heat from it in that way? Then, logically, both the hot output of the evaporator and the cold output of the condenser would be introducing aggregated pockets of air into the surrounding air. At that point, the surrounding air goes to work increasing system entropy by mixing and exchanging energy with the hot/cold pocket of air. So there are several different systems going on with different forms of entropy/disagregation, no?

brainstorm said:
Isn't the a/c itself reducing entropy by compressing air and removing heat from it in that way? Then, logically, both the hot output of the evaporator and the cold output of the condenser would be introducing aggregated pockets of air into the surrounding air. At that point, the surrounding air goes to work increasing system entropy by mixing and exchanging energy with the hot/cold pocket of air. So there are several different systems going on with different forms of entropy/disagregation, no?

I think I can say yes to this. The AC uses work to remove the heat from the air inside the room and transfer it to the outside air. Entropy inside the room decreases, and entropy outside increases. Yes, there are several systems going on, such as the transfer of heat from the outside to the walls of the room and then into the room, gradually heating the room and increasing its entropy at the same time that the AC is cooling the room and reducing it.

Drakkith said:
I think I can say yes to this. The AC uses work to remove the heat from the air inside the room and transfer it to the outside air. Entropy inside the room decreases, and entropy outside increases. Yes, there are several systems going on, such as the transfer of heat from the outside to the walls of the room and then into the room, gradually heating the room and increasing its entropy at the same time that the AC is cooling the room and reducing it.

It sounds like you are talking about a decrease in entropy as the heat is aggregated outside. Disgregation would take place if the a/c was turned off and the heat was allowed to seep in from outside. When temperature equilibrium is reached between outside and inside, that's when you've reached maximum entropy, correct? But that only considers inside & outside together as a single system. If you were just looking at the inside system alone, I think you'd be dealing with disgregation of heat from the air to the condenser or something like that.

brainstorm said:
It sounds like you are talking about a decrease in entropy as the heat is aggregated outside. Disgregation would take place if the a/c was turned off and the heat was allowed to seep in from outside. When temperature equilibrium is reached between outside and inside, that's when you've reached maximum entropy, correct? But that only considers inside & outside together as a single system. If you were just looking at the inside system alone, I think you'd be dealing with disgregation of heat from the air to the condenser or something like that.
Just a little helpful advice: Disgregation is an archaic term and is no longer used in thermodynamics. So disregard disgregation. Stick to entropy and its thermodynamic meaning:

$$\Delta S_{a \to b} = \Delta S_{surr: a \to b} + \Delta S_{sys: a \to b} = \int_a^b \frac{dQ_{surr}}{T_{surr}} + \int_a^b \frac{dQ_{sys}}{T_{sys}}$$

where the integral is measured along the reversible path from a to b. If you measure this quantity, you will have a positive number. That is the second law of thermodynamics. If you get a negative number, you are missing part of the system or surroundings.

AM

Andrew Mason said:
Just a little helpful advice: Disgregation is an archaic term and is no longer used in thermodynamics. So disregard disgregation. Stick to entropy and its thermodynamic meaning:

$$\Delta S_{a \to b} = \Delta S_{surr: a \to b} + \Delta S_{sys: a \to b} = \int_a^b \frac{dQ_{surr}}{T_{surr}} + \int_a^b \frac{dQ_{sys}}{T_{sys}}$$

where the integral is measured along the reversible path from a to b. If you measure this quantity, you will have a positive number. That is the second law of thermodynamics. If you get a negative number, you are missing part of the system or surroundings.

AM
How does plugging in numbers to this algorithm and getting output help you understand what entropy is in a concrete sense?

Disgregation is a useful term because it describes the progression from relative order to relative disorder. Isn't that the basic meaning of entropy?

brainstorm said:
How does plugging in numbers to this algorithm and getting output help you understand what entropy is in a concrete sense?

Disgregation is a useful term because it describes the progression from relative order to relative disorder. Isn't that the basic meaning of entropy?

The equations represent the effect of entropy and present a useful and meaningful way of calculating it. What exactly are you having trouble understanding again?

Drakkith said:
The equations represent the effect of entropy and present a useful and meaningful way of calculating it. What exactly are you having trouble understanding again?

I never said the equations aren't useful for calculations. I said that they don't express what entropy actually is. You said the term "disgregation" is antiquated but it is a useful term for describing the opposite of aggregation, insofar as relative order in a system increases with aggregation and decreases with disgregation. Earlier posts were about entropy due to air conditioning. I had asked what was disgregating in order to clarify what kind of entropy would be going on in the system.

brainstorm said:
I never said the equations aren't useful for calculations. I said that they don't express what entropy actually is. You said the term "disgregation" is antiquated but it is a useful term for describing the opposite of aggregation, insofar as relative order in a system increases with aggregation and decreases with disgregation. Earlier posts were about entropy due to air conditioning. I had asked what was disgregating in order to clarify what kind of entropy would be going on in the system.

Are you simply asking "What is entropy"?

I can't say anything about disgregation, as I don't know anything about it.

Edit: Ok, I found your first post. The air inside the room transfers its heat to the AC unit. I guess the coolant inside the AC would be disgregating.

Drakkith said:
Are you simply asking "What is entropy"?
I know that entropy is the amount of disorder in a system. The issue is applying the general concept to a specific situation, in this case air conditioning.

I can't say anything about disgregation, as I don't know anything about it.
I don't either, but the word itself seems to simply mean the opposite of aggregation, which involves generating a concentration of differentiated elements in a system. So, for example, an a/c compressor aggregates heat by compressing gas, which causes it to dissipate into the surrounding air. Then the cooled coolant goes to the condenser (I think I'm naming the right part) where it absorbs heat from adjacent air, which cools that air to aggregate cool air. The warm and cool air pockets in the room then disaggregate into each other to reach a more homogeneous temperature.

Edit: Ok, I found your first post. The air inside the room transfers its heat to the AC unit. I guess the coolant inside the AC would be disgregating.
Disgregate is not the same thing as dissipate. Aggregate and disgregate seem to refer to relative concentration within a mixture. Dissipate just refers to the concentrated substance losing concentration by flowing away. They are similar, admittedly, but not the same. Heat can dissipate but not disgregate, I don't think, because it's not a substance. Molecules can disgregate, but heat dissipates. I think this is because molecules are discrete entities and heat is kinetic energy of motion.

brainstorm said:
I never said the equations aren't useful for calculations. I said that they don't express what entropy actually is. You said the term "disgregation" is antiquated but it is a useful term for describing the opposite of aggregation, insofar as relative order in a system increases with aggregation and decreases with disgregation. Earlier posts were about entropy due to air conditioning. I had asked what was disgregating in order to clarify what kind of entropy would be going on in the system.
Entropy does not really have to be thought of as a tangible physical property. It can be simply thought of as a mathematical relationship that is a useful tool when dealing with thermodynamic systems.

One could say that work is the integral of Force x distance $\int F\cdot ds$ over a certain path. What does that mean? At one level, it means that work is the ability to usefully move matter. Kinetic and potential energy can be viewed as (it is actually defined as) the ability to do work. In thermodynamics, kinetic and potential energy could be viewed as the lowest entropy form of energy. But at the end of the day, work means $\int F\cdot ds$ over a path and energy is just the ability to do work.

Carnot showed that the maximum amount of useful work that can be produced from a system operating between two temperatures occurs when the system is arbitrarily close to equilibrium with its surroundings at all times. The further a system is from equilibrium with its surroundings during the process, the lower the efficiency (the work output per unit of input energy in heat flow). Where the system is arbitrarily close to equilibrium during a process, the direction of the process can be reversed with an arbitrarily small change in conditions. He called this a reversible process.

Carnot proved that if one calculated the quantity $\int dQ/T$ between two states for both the system and surroundings for a reversible process, the result was always 0. He also proved that where the beginning and end states were not achieved by a reversible process, the quantity (i.e. the value of the above integral) was always greater than 0. He further proved that the amount of work that could be produced from the system for a given input heat flow (the efficiency) was inversely related to this quantity: the greater the number, the lower the efficiency. It can be shown, consequently, that this quantity is also a measure of the amount of work that will have to be done to reverse the process.

It became apparent that this quantity could be used as a tool for analysing thermodynamic processes. The value of the integral was called "entropy". What does it measure? It measures how far from equilibrium a thermodynamic process occurred.

What greater significance in the universe does this have? There are several ways of looking at it. If you look at it from an energy perspective, it is a measure of thermodynamic potential - that is a measure of how much energy you can usefully extract (work) from a thermodynamic system.

If you look at it from a statistical mechanics perspective, entropy can be viewed as a measure of the number of equivalent microstates that a system or the universe can have for a given thermodynamic state. The greatest number of microstates for the universe occurs when everything in the universe is in complete thermodynamic equilibrium.

But these are simply attempts to give a physical significance to the value of the integral of dQ/T over a path between two thermodynamic states.

AM
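Carnot's result, and the remark that irreversibility lowers efficiency, can be put into numbers. The reservoir temperatures and the amount of generated entropy below are assumptions for illustration; the lost-work relation W_lost = T_cold * S_gen is the standard Gouy-Stodola form:

```python
# Maximum work from heat Q_hot drawn at T_hot, rejecting waste heat at T_cold.
T_hot, T_cold, Q_hot = 500.0, 300.0, 1000.0  # K, K, J (illustrative)

eta_carnot = 1.0 - T_cold / T_hot   # reversible (maximum) efficiency: 0.4
W_max = eta_carnot * Q_hot          # 400 J from a reversible engine

# An irreversible engine generating entropy S_gen delivers less work:
S_gen = 0.5                         # J/K, assumed irreversibility
W_actual = W_max - T_cold * S_gen   # 400 - 150 = 250 J
print(eta_carnot, W_max, W_actual)
```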

Andrew Mason said:
Entropy does not really have to be thought of as a tangible physical property. It can be simply thought of as a mathematical relationship that is a useful tool when dealing with thermodynamic systems.
I prefer to think in concrete terms.

One could say that work is the integral of Force x distance $\int F\cdot ds$ over a certain path. What does that mean? At one level, it means that work is the ability to usefully move matter. Kinetic and potential energy can be viewed as (it is actually defined as) the ability to do work. In thermodynamics, kinetic and potential energy could be viewed as the lowest entropy form of energy. But at the end of the day, work means $\int F\cdot ds$ over a path and energy is just the ability to do work.
This makes sense to me insofar as there is a parallel between a battery with an electric field due to charge disequilibrium and a thermal system with heat disequilibrium.

Carnot showed that the maximum amount of useful work that can be produced from a system operating between two temperatures occurs when the system is arbitrarily close to equilibrium with its surroundings at all times. The further a system is from equilibrium with its surroundings during the process, the lower the efficiency (the work output per unit of input energy in heat flow). Where the system is arbitrarily close to equilibrium during a process, the direction of the process can be reversed with an arbitrarily small change in conditions. He called this a reversible process.
I don't get the reversibility. Also, don't you mean that the further the system is from equilibrium the more potential to do work it has?

Carnot proved that if one calculated the quantity $\int dQ/T$ between two states for both the system and surroundings for a reversible process, the result was always 0. He also proved that where the beginning and end states were not achieved by a reversible process, the quantity (i.e. the value of the above integral) was always greater than 0. He further proved that the amount of work that could be produced from the system for a given input heat flow (the efficiency) was inversely related to this quantity: the greater the number, the lower the efficiency. It can be shown, consequently, that this quantity is also a measure of the amount of work that will have to be done to reverse the process.
This would be easier to grasp with an example, I think.

It became apparent that this quantity could be used as a tool for analysing thermodynamic processes. The value of the integral was called "entropy". What does it measure? It measures how far from equilibrium a thermodynamic process occurred.
I thought entropy is a measure of disorder, where equilibrium is the ultimate state of disorder and disequilibrium is relative order. An ice cube in water has the potential to draw heat from the surrounding water, but once it melts the system is in equilibrium and no transfer takes place. I thought melted ice was the maximum-entropy state for the ice-cubes-in-water system.

If you look at it from a statistical mechanics perspective, entropy can be viewed as a measure of the number of equivalent microstates that a system or the universe can have for a given thermodynamic state. The greatest number of microstates for the universe occurs when everything in the universe is in complete thermodynamic equilibrium.
What is meant by "microstates"? Do you mean that water frozen in ice, for example, has fewer possible microstates because the molecules are locked in a solid formation? What about when you pour cold water into hot water? Does the evenly lukewarm mixture eventually reached have more microstates than the initial combination of hot water with cold-water pockets in it?

But these are simply attempts to give a physical significance to the value of the integral of dQ/T over a path between two thermodynamic states.
and usefully so, imo.
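The microstate questions above can be explored with a toy model. The sketch below uses two Einstein solids (a standard textbook model, not something from this thread, with assumed toy sizes) and counts microstates for each way of splitting the energy between them; the even split wins, which is the sense in which equilibrium has the most microstates:

```python
from math import comb

def multiplicity(N, q):
    """Microstates of an Einstein solid: q energy quanta among N oscillators."""
    return comb(q + N - 1, q)

N = 10        # oscillators per block (assumed toy size)
q_total = 20  # total quanta shared by two identical blocks

# Combined microstates for each split of the energy between the blocks:
omegas = {qA: multiplicity(N, qA) * multiplicity(N, q_total - qA)
          for qA in range(q_total + 1)}

best_split = max(omegas, key=omegas.get)
print(best_split)  # 10: the even (equilibrium) split has the most microstates
```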

Drakkith said:
From wikipedia on entropy: "An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), involved in the operation of the air conditioner, will always make a bigger contribution to the entropy of the environment than will the decrease of the entropy of the air of that system. Thus, the total of entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics."

If you decrease the entropy of a system (in this case the air in the room) then you MUST increase it in another system (In this case, the air outside).

If the air conditioner is a reversible engine but not a Carnot engine, will the total entropy still increase? I think YES, but my college physics text says total entropy remains constant for a reversible process.

The word equilibrium means a state of balance. In an equilibrium state, there are no unbalanced potentials (or driving forces) within the system. A system that is in equilibrium experiences no changes when it is isolated from its surroundings.

Equilibrium isn't the ultimate state of disorder. If I take a system in thermodynamic equilibrium and add energy to it, its entropy and disorder increase. You can't talk about a maximum amount of disorder unless it's an isolated system, in which case the maximum is reached when the entirety of the energy is spread throughout the system, which is when it reaches equilibrium.

Drakkith said:
The word equilibrium means a state of balance. In an equilibrium state, there are no unbalanced potentials (or driving forces) within the system. A system that is in equilibrium experiences no changes when it is isolated from its surroundings.

Equilibrium isn't the ultimate state of disorder. If I take a system in thermodynamic equilibrium and add energy to it, its entropy and disorder increase. You can't talk about a maximum amount of disorder unless it's an isolated system, in which case the maximum is reached when the entirety of the energy is spread throughout the system, which is when it reaches equilibrium.

I thought entropy was relative to the ultimate state of disorder in a given system with isolated inputs. So, if a system reaches thermal equilibrium at one temperature, it has reached maximum entropy because no more disgregation of heat will take place. If heat is added unevenly, entropy can decrease insofar as the heat is concentrated/aggregated within some subset(s) of the system. In that case, the heat will dissipate and eventually cause the system to reach thermal equilibrium, but this time at a higher temperature.

Is this an incorrect description/example of thermal-equilibrium progress as increasing entropy?

kntsy said:
If the air conditioner is a reversible engine but not a Carnot engine, will the total entropy still increase? I think YES, but my college physics text says total entropy remains constant for a reversible process.

brainstorm said:
I thought entropy was relative to the ultimate state of disorder in a given system with isolated inputs. So, if a system reaches thermal equilibrium at one temperature, it has reached maximum entropy because no more disgregation of heat will take place. If heat is added unevenly, entropy can decrease insofar as the heat is concentrated/aggregated within some subset(s) of the system. In that case, the heat will dissipate and eventually cause the system to reach thermal equilibrium, but this time at a higher temperature.

Is this an incorrect description/example of thermal-equilibrium progress as increasing entropy?

Yes, that looks correct.
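That picture (uneven heat relaxing to a warmer equilibrium while entropy rises monotonically) can be simulated with a crude two-body model; the linear heat-flow rate and unit heat capacity are assumptions of this sketch:

```python
import math

# Two equal bodies (heat capacity C = 1) exchange heat until temperatures equalize.
def total_entropy(T1, T2):
    return math.log(T1) + math.log(T2)  # S = C ln T per body, up to a constant

T1, T2 = 350.0, 250.0                   # uneven initial heating (K)
S_prev = total_entropy(T1, T2)
for _ in range(100):
    dQ = 0.05 * (T1 - T2)               # crude linear heat-flow model (assumed)
    T1, T2 = T1 - dQ, T2 + dQ
    S = total_entropy(T1, T2)
    assert S >= S_prev                  # entropy never decreases on the way
    S_prev = S
print(round(T1), round(T2))             # both approach the 300 K equilibrium
```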

Cosmossos said:
we know that when the energy drops the entropy increases

but the entropy should DECREASE AS LONG AS THE ENERGY INCREASES.
where is my mistake?

How did you come to these conclusions? Where do we know this from?

brainstorm said:
I thought entropy was relative to the ultimate state of disorder in a given system with isolated inputs. So, if a system reaches thermal equilibrium at on temperature, it has reached maximum entropy because no more disgregation of heat will take place. If heat is added unevenly, entropy can decrease insofar as the heat is concentrated/aggregated within some subset(s) of the system. In that case, the heat will dissipate and eventually cause the system to reach thermal equilibrium, but this time at a higher temperature.

Is this an incorrect description/example of thermal-equilibrium progress as increasing entropy?
Entropy has meaning only in terms of the difference in entropy between two states. As heat flows into the gas, there is a positive change in entropy of the gas (and a smaller negative change in the entropy of the surroundings).

"Disorder" is not really a very accurate explanation for entropy. First of all, it is not clear what "disorder" means. Consider 2 moles of gas in equilibrium at temperature T. Then consider one mole of the same gas at temperature T + $\Delta T$ and the other mole at T - $\Delta T$. Which of these two states has the most disorder? Why?

Second, a concept of "disorder" is misleading. Consider a mole of He gas and a mole of Argon gas each at state (P,V,T) in its own compartment insulated from their surroundings separated by a common insulated wall. Then consider the situation where the wall is removed and the gases mix. Is there a change in entropy? Do both states represent the same amount of disorder?

AM
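The mixing question above has a standard answer (the entropy of mixing); the sketch below evaluates it for the one-mole-each example. For two different gases each doubling its volume, the change is 2R ln 2; for identical gases, removing the wall changes nothing macroscopically:

```python
import math

R = 8.314  # J/(mol K), molar gas constant

# 1 mol He + 1 mol Ar, each at the same (P, V, T); wall removed, gases mix.
# Each distinguishable gas expands into twice the volume: dS = n R ln 2 each.
n_He, n_Ar = 1.0, 1.0
dS_mix = (n_He + n_Ar) * R * math.log(2.0)
print(round(dS_mix, 2))  # about 11.53 J/K: mixing different gases raises entropy

# Same gas on both sides: macroscopically nothing changes, dS = 0
# (the Gibbs "paradox").
```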

## 1. What is the paradox of entropy and energy?

The paradox of entropy and energy is the apparent contradiction between the second law of thermodynamics, which states that the entropy (or disorder) of an isolated system never decreases, and the conservation of energy, which states that energy cannot be created or destroyed but only converted from one form to another.

## 2. How is the paradox resolved?

The paradox is resolved by understanding that the second law applies to the total entropy. The entropy of a particular system can decrease as energy is removed from it, but the entropy of its surroundings must then increase by at least as much, so the total entropy of the universe stays constant or increases over time.

## 3. What role does energy play in the paradox?

Energy is essential to resolving the paradox because energy transfer is the driving force behind all physical processes. Without energy flows, there would be no movement or change in the universe, and the concept of entropy would not apply.

## 4. Can entropy ever decrease in a closed system?

No. For an isolated system (one that exchanges neither heat nor work with its surroundings), the second law of thermodynamics says the entropy can never decrease: the total disorder or randomness will always increase or remain constant over time. A system that does exchange heat with its surroundings, like the air-conditioned room discussed above, can lose entropy, but only at the cost of a larger gain elsewhere.

## 5. How does the resolution of this paradox impact our understanding of the universe?

The resolution of this paradox has significant implications for our understanding of the universe. It helps us to better understand the fundamental laws of thermodynamics and the role of energy in shaping the world around us. It also has practical applications in fields such as engineering, chemistry, and biology.
