Resolving the Paradox of Entropy and Energy

  • Context: Graduate
  • Thread starter: Cosmossos
  • Tags: Entropy Paradox

Discussion Overview

The discussion centers around the relationship between entropy and energy, particularly in the context of thermodynamics. Participants explore how changes in energy affect entropy in various systems, including gases and air conditioning units. The conversation includes theoretical implications, textbook references, and examples from classical thermodynamics.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • Some participants assert that when energy drops, entropy increases, while others argue that this is incorrect, stating that removing energy from a system decreases its entropy.
  • One participant emphasizes that total entropy never decreases, but the entropy of individual systems can decrease if another system compensates by increasing its entropy.
  • There is confusion regarding the relationship between energy loss in gases and entropy, with some claiming that a gas's entropy should increase as energy decreases, while others challenge this view by pointing out that solidification leads to lower entropy.
  • Participants reference external sources, such as Wikipedia, to support their claims about entropy and energy, leading to requests for specific quotes and examples.
  • Questions arise about the nature of entropy as a non-conserved state function, with discussions about scenarios where entropy changes may appear to be conserved.
  • Some participants discuss the role of air conditioning in entropy changes, noting that while the system cools the air inside a room (decreasing entropy), it simultaneously increases the entropy of the surrounding environment.
  • There is a suggestion that multiple systems are at play in the context of air conditioning, leading to complex interactions of entropy changes.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the relationship between energy and entropy. Multiple competing views are presented, leading to ongoing debate and clarification attempts.

Contextual Notes

Limitations include varying interpretations of thermodynamic principles, reliance on different sources for definitions, and the complexity of systems involved in entropy changes.

Cosmossos
We know that when the energy drops, the entropy increases. But when the energy of a gas drops, the gas will eventually become solid, which has fewer degrees of freedom, and therefore the entropy should decrease, not increase.
Moreover, high temperature means more entropy (a gas, for example), and as the temperature increases, the energy increases and so does the entropy. But the entropy should DECREASE AS THE ENERGY INCREASES.
Where is my mistake?
 
Cosmossos said:
We know that when the energy drops, the entropy increases.
You have that backwards. dS = dQ/T. When energy is removed from a system, the entropy decreases.
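The sign convention in dS = dQ/T can be checked directly. A minimal sketch (the 1000 J and 300 K figures are illustrative assumptions, not from the thread):

```python
# Entropy change for a small heat transfer at (approximately) constant T:
# dS = dQ / T. Removing heat (dQ < 0) gives dS < 0: the system's entropy drops.

def entropy_change(heat_in_joules: float, temperature_kelvin: float) -> float:
    """Approximate dS = dQ/T for a transfer small enough that T stays constant."""
    return heat_in_joules / temperature_kelvin

# Remove 1000 J from a system held near 300 K (numbers are illustrative):
dS = entropy_change(-1000.0, 300.0)
print(f"dS = {dS:.3f} J/K")  # negative: the system's entropy decreases
```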
 
entropy never decreases!
 
phywjc said:
entropy never decreases!
Total entropy including everything never decreases. But the entropy of some particular system can certainly decrease. (If one system loses entropy, some other system must gain at least that much.)
 
I don't understand. It is known that if I have some gas and its energy drops, then the entropy should increase. It's written in every textbook.
But in that case, the gas will eventually become solid, which has fewer degrees of freedom and therefore less entropy...

you can look here http://en.wikipedia.org/wiki/Entropy
under classical thermo.
 
Cosmossos said:
I don't understand. It is known that if I have some gas and its energy drops, then the entropy should increase. It's written in every textbook.
Give one example.

you can look here http://en.wikipedia.org/wiki/Entropy
under classical thermo.
Where does it say anywhere that removing energy from a gas causes its entropy to increase?
 
It says that if the energy drops/decreases, the entropy increases.
So a gas is an example.
 
Cosmossos said:
It says that if the energy drops/decreases, the entropy increases.
Where does it say this? Cut and paste an exact quote.
 
From wikipedia on entropy: "An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), involved in the operation of the air conditioner, will always make a bigger contribution to the entropy of the environment than will the decrease of the entropy of the air of that system. Thus, the total of entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics."

If you decrease the entropy of a system (in this case the air in the room) then you MUST increase it in another system (In this case, the air outside).
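The bookkeeping in that quote can be made concrete. A sketch with made-up numbers (all values here are assumptions for illustration): an air conditioner pumps heat Q_c out of a room at T_room and dumps Q_c plus the compressor work W outside at T_out.

```python
# Entropy bookkeeping for an air conditioner (illustrative numbers).
# The room loses heat Q_c at T_room; the outside receives Q_c + W at T_out.
Q_c = 1000.0     # J, heat pumped out of the room (assumed)
W = 300.0        # J, compressor work (assumed)
T_room = 293.0   # K (20 C)
T_out = 308.0    # K (35 C)

dS_room = -Q_c / T_room          # the room's entropy decreases
dS_outside = (Q_c + W) / T_out   # the environment's entropy increases by more
dS_total = dS_room + dS_outside  # net change is positive (second law)

print(f"room: {dS_room:+.3f} J/K, outside: {dS_outside:+.3f} J/K, "
      f"total: {dS_total:+.3f} J/K")
```

With these numbers the room loses about 3.4 J/K while the environment gains about 4.2 J/K, so the total still rises, exactly as the Wikipedia passage says.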
 
  • #10
Ok thanks
and I have another question.
Entropy is a non-conserved state function, and in an isolated system it can't decrease, so dS >= 0.
In the case dS = 0, it is conserved, because it doesn't change. How can it be a non-conserved state function and yet allow a case where dS = 0?
 
  • #11
Here is a paper written by an old physics professor of mine. He does a great job of explaining all of this, and it is pitched at the level of a freshman physics course rather than Wikipedia. Read the paper and take notes! After that you will feel much better about entropy.
If you like this paper, you should check out a few of his others: he goes through force, work/energy, rotational dynamics, and collections of particles, and has a paper on thermodynamics in chemistry.
http://orchard.wccnet.org/~gkapp/
 


  • #12
Cosmossos said:
Ok thanks
and I have another question.
Entropy is a non-conserved state function, and in an isolated system it can't decrease, so dS >= 0.
In the case dS = 0, it is conserved, because it doesn't change. How can it be a non-conserved state function and yet allow a case where dS = 0?
Entropy is a state function. This means it depends only upon the thermodynamic state not on the path taken between thermodynamic states.

Entropy is non-conserved. If the sum of all entropy changes was always 0, entropy would be a conserved quantity. But it isn't. The sum of all changes in entropy of a system and its surroundings can theoretically be as close to 0 as you wish to make it (provided you have enough time to carry out a process). But that special case does not mean it is always conserved. In real life, entropy of the universe (system + surroundings) increases during any process.
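The "as close to 0 as you wish" point can be illustrated numerically: when heat Q leaks from a hot reservoir to a cold one, the total entropy change is Q/T_c - Q/T_h, which is always positive but shrinks toward zero as the temperatures approach each other. A sketch (reservoir temperatures and Q are assumed):

```python
# Total entropy change when heat Q flows from a reservoir at T_hot
# to one at T_cold: dS_total = Q/T_cold - Q/T_hot > 0, but it shrinks
# toward 0 as the two temperatures approach each other (a nearly
# reversible transfer).

def total_entropy_change(Q: float, T_hot: float, T_cold: float) -> float:
    return Q / T_cold - Q / T_hot

Q = 100.0  # J (illustrative)
for T_cold in (200.0, 290.0, 299.0, 299.9):
    dS = total_entropy_change(Q, 300.0, T_cold)
    print(f"T_cold = {T_cold:6.1f} K -> dS_total = {dS:.6f} J/K")
```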

AM
 
  • #13
I am confused. Wiki says that entropy increases as substances "disgregate." What is disgregating in the a/c example? Clouds of relatively dense cold air?
 
  • #14
From the second law of thermodynamics, there is no contradiction for a spontaneous process in which energy drops while entropy decreases.
 
  • #15
lionelwang said:
From the second law of thermodynamics, there is no contradiction for a spontaneous process in which energy drops while entropy decreases.

Don't you have to specify some empirical/contextual parameters to make this statement testable?
 
  • #16
brainstorm said:
I am confused. Wiki says that entropy increases as substances "disgregate." What is disgregating in the a/c example? Clouds of relatively dense cold air?

The heat is transferred from the air in the room and ends up outside. The outside air disgregates, as it is heated up by the transfer of heat from the inside of the room.
 
  • #17
Drakkith said:
The heat is transferred from the air in the room and ends up outside. The outside air disgregates, as it is heated up by the transfer of heat from the inside of the room.

Isn't the a/c itself reducing entropy by compressing air and removing heat from it in that way? Then, logically, both the hot output of the evaporator and the cold output of the condenser would be introducing aggregated pockets of air into the surrounding air. At that point, the surrounding air goes to work increasing system entropy by mixing and exchanging energy with the hot/cold pocket of air. So there are several different systems going on with different forms of entropy/disagregation, no?
 
  • #18
brainstorm said:
Isn't the a/c itself reducing entropy by compressing air and removing heat from it in that way? Then, logically, both the hot output of the evaporator and the cold output of the condenser would be introducing aggregated pockets of air into the surrounding air. At that point, the surrounding air goes to work increasing system entropy by mixing and exchanging energy with the hot/cold pocket of air. So there are several different systems going on with different forms of entropy/disagregation, no?

I think I can say yes to this. The AC uses work to remove the heat from the air inside the room and transfer it to the outside air. Entropy inside the room decreases, and entropy outside increases. Yes, there are several systems involved, such as the transfer of heat from the outside to the walls of the room and then into the room, gradually heating the room and increasing its entropy at the same time that the AC is cooling the room and reducing it.
 
  • #19
Drakkith said:
I think I can say yes to this. The AC uses work to remove the heat from the air inside the room and transfer it to the outside air. Entropy inside the room decreases, and entropy outside increases. Yes, there are several systems involved, such as the transfer of heat from the outside to the walls of the room and then into the room, gradually heating the room and increasing its entropy at the same time that the AC is cooling the room and reducing it.

It sounds like you are talking about a decrease in entropy as the heat is aggregated outside. Disgregation would take place if the a/c was turned off and the heat was allowed to seep in from outside. When temperature equilibrium is reached between outside and inside, that's when you've reached maximum entropy, correct? But that only considers inside and outside together as a single system. If you were just looking at the inside system alone, I think you'd be dealing with disgregation of heat from the air to the condenser or something like that.
 
  • #20
brainstorm said:
It sounds like you are talking about a decrease in entropy as the heat is aggregated outside. Disgregation would take place if the a/c was turned off and the heat was allowed to seep in from outside. When temperature equilibrium is reached between outside and inside, that's when you've reached maximum entropy, correct? But that only considers inside and outside together as a single system. If you were just looking at the inside system alone, I think you'd be dealing with disgregation of heat from the air to the condenser or something like that.
Just a little helpful advice: Disgregation is an archaic term and is no longer used in thermodynamics. So disregard disgregation. Stick to entropy and its thermodynamic meaning:

[tex]\Delta S_{a\to b} = \Delta S_{surr:\,a\to b} + \Delta S_{sys:\,a\to b} = \int_a^b \frac{dQ_{surr}}{T_{surr}} + \int_a^b \frac{dQ_{sys}}{T_{sys}}[/tex]

where the integral is measured along the reversible path from a to b. If you measure this quantity, you will have a positive number. That is the second law of thermodynamics. If you get a negative number, you are missing part of the system or surroundings.
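Both integrals can be evaluated for a concrete case. A sketch under assumed conditions (1 kg of water with c = 4186 J/(kg K) cooling from 350 K into large surroundings held at 300 K; the system integral along a reversible path is m c ln(T2/T1), and the surroundings receive the heat at constant temperature):

```python
import math

# Evaluate both entropy integrals for a body cooling in large surroundings.
m, c = 1.0, 4186.0              # kg, J/(kg.K): 1 kg of water (assumed)
T1, T2, T_surr = 350.0, 300.0, 300.0  # K

# System: integral of dQ/T along a reversible path = m c ln(T2/T1) (negative).
dS_sys = m * c * math.log(T2 / T1)
# Surroundings: receive Q = m c (T1 - T2) at the constant temperature T_surr.
dS_surr = m * c * (T1 - T2) / T_surr

dS_total = dS_sys + dS_surr      # positive, as the second law requires
print(f"system: {dS_sys:+.1f} J/K, surroundings: {dS_surr:+.1f} J/K, "
      f"total: {dS_total:+.1f} J/K")
```

The system's entropy falls by about 645 J/K while the surroundings gain about 698 J/K, so the sum comes out positive, matching the statement above.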

AM
 
  • #21
Andrew Mason said:
Just a little helpful advice: Disgregation is an archaic term and is no longer used in thermodynamics. So disregard disgregation. Stick to entropy and its thermodynamic meaning:

[tex]\Delta S_{a\to b} = \Delta S_{surr:\,a\to b} + \Delta S_{sys:\,a\to b} = \int_a^b \frac{dQ_{surr}}{T_{surr}} + \int_a^b \frac{dQ_{sys}}{T_{sys}}[/tex]

where the integral is measured along the reversible path from a to b. If you measure this quantity, you will have a positive number. That is the second law of thermodynamics. If you get a negative number, you are missing part of the system or surroundings.

AM
How does plugging numbers into this formula and getting a result help you understand what entropy is in a concrete sense?

Disgregation is a useful term because it describes the progression from relative order to relative disorder. Isn't that the basic meaning of entropy?
 
  • #22
brainstorm said:
How does plugging numbers into this formula and getting a result help you understand what entropy is in a concrete sense?

Disgregation is a useful term because it describes the progression from relative order to relative disorder. Isn't that the basic meaning of entropy?

The equations represent the effect of entropy and provide a useful, meaningful way of calculating it. What exactly are you having trouble understanding?
 
  • #23
Drakkith said:
The equations represent the effect of entropy and provide a useful, meaningful way of calculating it. What exactly are you having trouble understanding?

I never said the equations aren't useful for calculations. I said that they don't express what entropy actually is. You said the term "disgregation" is antiquated but it is a useful term for describing the opposite of aggregation, insofar as relative order in a system increases with aggregation and decreases with disgregation. Earlier posts were about entropy due to air conditioning. I had asked what was disgregating in order to clarify what kind of entropy would be going on in the system.
 
  • #24
brainstorm said:
I never said the equations aren't useful for calculations. I said that they don't express what entropy actually is. You said the term "disgregation" is antiquated but it is a useful term for describing the opposite of aggregation, insofar as relative order in a system increases with aggregation and decreases with disgregation. Earlier posts were about entropy due to air conditioning. I had asked what was disgregating in order to clarify what kind of entropy would be going on in the system.

Are you simply asking "What is entropy"?

I can't say anything about disgregation, as i don't know anything about it.

Edit: OK, I found your first post. The air inside the room transfers its heat to the AC unit. I guess the coolant inside the AC would be disgregating.
 
  • #25
Drakkith said:
Are you simply asking "What is entropy"?
I know that entropy is the amount of disorder in a system. The issue is applying the general concept to a specific situation, in this case air conditioning.

I can't say anything about disgregation, as i don't know anything about it.
I don't either, but the word itself seems to simply mean the opposite of aggregation, which involves generating a concentration of differentiated elements in a system. So, for example, an a/c compressor aggregates heat by compressing gas, which causes the heat to dissipate into the surrounding air. Then the cooled coolant goes to the condenser (I think I'm naming the right part), where it absorbs heat from the adjacent air, which cools that air to aggregate cool air. The warm and cool air pockets in the room then disaggregate into each other to reach a more homogeneous temperature.

Edit: OK, I found your first post. The air inside the room transfers its heat to the AC unit. I guess the coolant inside the AC would be disgregating.
Disgregate is not the same thing as dissipate. Aggregate and disgregate seem to refer to relative concentration within a mixture. Dissipate just refers to the concentrated substance losing concentration by flowing away. They are similar, admittedly, but not the same. Heat can dissipate but not disgregate, I don't think, because it's not a substance. Molecules can disgregate, but heat dissipates. I think this is because molecules are discrete entities and heat is kinetic energy of motion.
 
  • #26
brainstorm said:
I never said the equations aren't useful for calculations. I said that they don't express what entropy actually is. You said the term "disgregation" is antiquated but it is a useful term for describing the opposite of aggregation, insofar as relative order in a system increases with aggregation and decreases with disgregation. Earlier posts were about entropy due to air conditioning. I had asked what was disgregating in order to clarify what kind of entropy would be going on in the system.
Entropy does not really have to be thought of as a tangible physical property. It can be simply thought of as a mathematical relationship that is a useful tool when dealing with thermodynamic systems.

One could say that work is the integral of force over distance, [itex]\int F\cdot ds[/itex], over a certain path. What does that mean? At one level, it means that work is the ability to usefully move matter. Kinetic and potential energy can be viewed as (indeed, are defined as) the ability to do work. In thermodynamics, kinetic and potential energy could be viewed as the lowest-entropy form of energy. But at the end of the day, work means [itex]\int F\cdot ds[/itex] over a path, and energy is just the ability to do work.

Carnot showed that the maximum amount of useful work that can be produced from a system operating between two temperatures occurs when the system is arbitrarily close to equilibrium with its surroundings at all times. The further a system is from equilibrium with its surroundings during the process, the lower the efficiency (the work output per unit of input energy in heat flow). Where the system is arbitrarily close to equilibrium during a process, the direction of the process can be reversed with an arbitrarily small change in conditions. He called this a reversible process.

Carnot proved that if one calculated the quantity [itex]\int dQ/T[/itex] between two states for both the system and surroundings for a reversible process, the result was always 0. He also proved that where the beginning and end states were not achieved by a reversible process, this quantity (i.e. the value of the above integral) was always greater than 0. He further proved that the amount of work that could be produced from the system for a given input heat flow (the efficiency) was inversely related to this quantity: the greater the number, the lower the efficiency. It can be shown, consequently, that this quantity is also a measure of the amount of work that will have to be done to reverse the process.
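Both cases can be checked with reservoir arithmetic. A sketch with assumed numbers: a reversible engine between T_h and T_c rejects exactly Q_c = Q_h T_c/T_h, making the reservoirs' total entropy change zero; an irreversible engine rejects more, making it positive and lowering the efficiency.

```python
# Entropy change of the two reservoirs for an engine absorbing Q_h at T_h
# and rejecting Q_c at T_c. Reversible (Carnot) case: Q_c = Q_h * T_c / T_h,
# so -Q_h/T_h + Q_c/T_c = 0. Extra rejected heat makes the sum positive
# and the efficiency lower.
T_h, T_c = 500.0, 300.0
Q_h = 1000.0                    # J absorbed from the hot reservoir (assumed)

Q_c_rev = Q_h * T_c / T_h       # reversible rejection: 600 J
Q_c_irrev = Q_c_rev + 100.0     # assumed extra rejected heat (irreversibility)

for label, Q_c in (("reversible", Q_c_rev), ("irreversible", Q_c_irrev)):
    dS_univ = -Q_h / T_h + Q_c / T_c  # total entropy change of the reservoirs
    eff = 1.0 - Q_c / Q_h             # work out per unit heat in
    print(f"{label:12s}: dS = {dS_univ:+.4f} J/K, efficiency = {eff:.2f}")
```

With these numbers the reversible case gives dS = 0 at 40% efficiency (the Carnot limit for 500 K/300 K), while the irreversible case gives dS > 0 and only 30%.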

It became apparent that this quantity could be used as a tool for analysing thermodynamic processes. The value of the integral was called "entropy". What does it measure? It measures how far from equilibrium a thermodynamic process occurred.

What greater significance in the universe does this have? There are several ways of looking at it. If you look at it from an energy perspective, it is a measure of thermodynamic potential - that is, a measure of how much energy you can usefully extract as work from a thermodynamic system.

If you look at it from a statistical mechanics perspective, entropy can be viewed as a measure of the number of equivalent microstates that a system or the universe can have for a given thermodynamic state. The greatest number of microstates for the universe occurs when everything in the universe is in complete thermodynamic equilibrium.
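The microstate-counting picture can be sketched with a standard toy model (an assumption for illustration, not part of the post): two small "Einstein solids" sharing energy quanta among their oscillators. The multiplicity of one solid with N oscillators and q quanta is C(q+N-1, q), and the product of the two multiplicities peaks when the energy is shared evenly, which is the equilibrium macrostate.

```python
from math import comb

# Two "Einstein solids" A and B, N oscillators each, sharing q_total energy
# quanta. Multiplicity of one solid: omega(N, q) = C(q + N - 1, q).
# The combined count omega_A * omega_B is largest at the even split:
# that macrostate has the most microstates, i.e. it is equilibrium.
def omega(N: int, q: int) -> int:
    return comb(q + N - 1, q)

N = 10          # oscillators per solid (illustrative)
q_total = 20    # total energy quanta (illustrative)
best = max(range(q_total + 1),
           key=lambda qA: omega(N, qA) * omega(N, q_total - qA))
print(f"most probable split: {best} quanta in A, {q_total - best} in B")
```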

But these are simply attempts to give a physical significance to the value of the integral of dQ/T over a path between two thermodynamic states.

AM
 
  • #27
Andrew Mason said:
Entropy does not really have to be thought of as a tangible physical property. It can be simply thought of as a mathematical relationship that is a useful tool when dealing with thermodynamic systems.
I prefer to think in concrete terms.

One could say that work is the integral of force over distance, [itex]\int F\cdot ds[/itex], over a certain path. What does that mean? At one level, it means that work is the ability to usefully move matter. Kinetic and potential energy can be viewed as (indeed, are defined as) the ability to do work. In thermodynamics, kinetic and potential energy could be viewed as the lowest-entropy form of energy. But at the end of the day, work means [itex]\int F\cdot ds[/itex] over a path, and energy is just the ability to do work.
This makes sense to me insofar as there is a parallel between a battery with an electric field due to charge disequilibrium and a thermal system with heat disequilibrium.

Carnot showed that the maximum amount of useful work that can be produced from a system operating between two temperatures occurs when the system is arbitrarily close to equilibrium with its surroundings at all times. The further a system is from equilibrium with its surroundings during the process, the lower the efficiency (the work output per unit of input energy in heat flow). Where the system is arbitrarily close to equilibrium during a process, the direction of the process can be reversed with an arbitrarily small change in conditions. He called this a reversible process.
I don't get the reversibility. Also, don't you mean that the further the system is from equilibrium the more potential to do work it has?


Carnot proved that if one calculated the quantity [itex]\int dQ/T[/itex] between two states for both the system and surroundings for a reversible process, the result was always 0. He also proved that where the beginning and end states were not achieved by a reversible process, this quantity (i.e. the value of the above integral) was always greater than 0. He further proved that the amount of work that could be produced from the system for a given input heat flow (the efficiency) was inversely related to this quantity: the greater the number, the lower the efficiency. It can be shown, consequently, that this quantity is also a measure of the amount of work that will have to be done to reverse the process.
This would be easier to grasp with an example, I think.

It became apparent that this quantity could be used as a tool for analysing thermodynamic processes. The value of the integral was called "entropy". What does it measure? It measures how far from equilibrium a thermodynamic process occurred.
I thought entropy is a measure of disorder, where equilibrium is the ultimate state of disorder and disequilibrium is relative order. An ice cube in water has the potential to draw heat from the surrounding water, but once it melts the system is in equilibrium and no transfer takes place. I thought melted ice was the maximum state of entropy for the ice-cubes-in-water system.

If you look at it from a statistical mechanics perspective, entropy can be viewed as a measure of the number of equivalent microstates that a system or the universe can have for a given thermodynamic state. The greatest number of microstates for the universe occurs when everything in the universe is in complete thermodynamic equilibrium.
What is meant by "microstates"? Do you mean that water frozen in ice, for example, has fewer possible microstates because the molecules are locked in a solid formation? What about when you pour cold water into hot water? Does the evenly lukewarm mixture eventually reached have more microstates than the initial combination of hot water with cold water pockets in it?

But these are simply attempts to give a physical significance to the value of the integral of dQ/T over a path between two thermodynamic states.
and usefully so, imo.
 
  • #28
Drakkith said:
From wikipedia on entropy: "An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), involved in the operation of the air conditioner, will always make a bigger contribution to the entropy of the environment than will the decrease of the entropy of the air of that system. Thus, the total of entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics."

If you decrease the entropy of a system (in this case the air in the room) then you MUST increase it in another system (In this case, the air outside).

If the air conditioner is a reversible engine but not a Carnot engine, will the total entropy still increase? I think YES, but my college physics text says total entropy remains constant for a reversible process?
 
  • #29
The word equilibrium means a state of balance. In an equilibrium state, there are no unbalanced potentials (or driving forces) within the system. A system that is in equilibrium experiences no changes when it is isolated from its surroundings.

Equilibrium isn't the ultimate state of disorder. If I take a system in thermodynamic equilibrium and add energy to it, its entropy and disorder increase. You can't talk about a maximum amount of disorder unless it's an isolated system, in which case the maximum is reached when the entirety of the energy is spread throughout the system, which is when it reaches equilibrium.
 
  • #30
Drakkith said:
The word equilibrium means a state of balance. In an equilibrium state, there are no unbalanced potentials (or driving forces) within the system. A system that is in equilibrium experiences no changes when it is isolated from its surroundings.

Equilibrium isn't the ultimate state of disorder. If I take a system in thermodynamic equilibrium and add energy to it, its entropy and disorder increase. You can't talk about a maximum amount of disorder unless it's an isolated system, in which case the maximum is reached when the entirety of the energy is spread throughout the system, which is when it reaches equilibrium.

I thought entropy was relative to the ultimate state of disorder in a given system with isolated inputs. So, if a system reaches thermal equilibrium at one temperature, it has reached maximum entropy because no more disgregation of heat will take place. If heat is added unevenly, entropy can decrease insofar as the heat is concentrated/aggregated within some subset(s) of the system. In that case, the heat will dissipate and eventually cause the system to reach thermal equilibrium, but this time at a higher temperature.

Is this an incorrect description/example of thermal-equilibrium progress as increasing entropy?
 
