Understanding Entropy and the 2nd Law of Thermodynamics - comments

In summary: Hi Jimster. You ask great questions. Why don't you introduce this in a separate thread, and we'll work through it together? First we'll consider the spring/damper system to get an idea of how a difference between a rapid deformation and a very slow deformation (between the same initial and final states) plays out in terms of mechanical energy dissipated in the damper and work done. The idea is for you to get a feel for how this works. In summary, the reversible paths minimize the dissipation of mechanical energy to thermal energy, and maximize the ability of temperature differences to be converted into mechanical energy.
  • #1
Chestermiller
Chestermiller submitted a new PF Insights post

Understanding Entropy and the 2nd Law of Thermodynamics

Continue reading the Original PF Insights Post.
 
  • #3
Thanks Chester.
Yes. I really did find that clear.

Which is not to say I understood it...

What is special about the reversible paths? Are all the other paths, the non-reversible ones, the same, or do some integrate to different values ≤ ΔS than others?
 
Last edited:
  • #4
Kudos Chestermiller. That was clear and understandable. The historical perspective really helped. I look forward to the day when Chestermiller makes it similarly easy to understand why this second law implies that "the entropy of the universe tends to a maximum", and how it relates to the kind of information debated in the Hawking/Susskind "black hole wars."
 
  • #5
Thanks Jimster.

Jimster41 said:
Thanks Chester.
Yes. I really did find that clear.

Which is not to say I understood it...

What is special about the reversible paths?
Reversible paths minimize the dissipation of mechanical energy to thermal energy, and maximize the ability of temperature differences to be converted into mechanical energy. In reversible paths, the pressure exerted by the surroundings at the interface with the system is only slightly higher or lower than the pressure throughout the system, and the temperature at the interface with the surroundings is only slightly higher or lower than the temperature throughout the system. This situation is maintained over the entire path from the initial to the final equilibrium state of the system.

For irreversible paths, the dissipation of mechanical energy to thermal energy is the result of viscous dissipation. The same thing happens if you compress a combination of a spring and (viscous) damper connected in parallel. If you compress the combination very rapidly from an initial length to a final length, you generate lots of heat in the damper (since the force carried by the damper is proportional to the velocity difference across the damper). On the other hand, if you compress the combination very slowly, the force carried by the damper is much less, and you generate much less heat. The amount of work you need to do in the latter case to bring about the compression is also much less. This is a very close analogy to what happens when you cause a gas in a cylinder to compress.
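Chet's fast-versus-slow comparison can be put in numbers. A minimal sketch (the spring constant, damper coefficient, and compression distance below are my own illustrative values, not from the thread): for a spring and damper in parallel compressed at constant speed v = X/T, the force is F = kx + cv, so the damper dissipates c·v·X as heat while the spring stores (1/2)kX² regardless of how fast you go.

```python
# Sketch (illustrative numbers, assumed): spring + viscous damper in
# parallel, compressed by X at constant speed over a time T.
k = 100.0   # spring constant, N/m (assumed)
c = 10.0    # damper coefficient, N*s/m (assumed)
X = 0.5     # total compression, m (assumed)

def compress(T):
    """Return (work done on the assembly, heat dissipated) for compression time T."""
    v = X / T                        # constant compression speed
    spring_energy = 0.5 * k * X**2   # recoverable; independent of T
    heat = c * v * X                 # dissipated in the damper; -> 0 as T -> infinity
    return spring_energy + heat, heat

for T in (0.1, 1.0, 100.0):
    work, heat = compress(T)
    print(f"T = {T:6.1f} s: work = {work:7.3f} J, dissipated = {heat:7.3f} J")
```

The slow compression approaches the reversible limit, where the work done equals the spring energy alone and almost nothing is dissipated, which is the analogy to slowly compressing a gas in a cylinder.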
Are all the other paths, the non-reversible ones, the same, or do some integrate to different values ≤ ΔS than others?
They integrate to different values < ΔS. The equal sign does not apply to irreversible paths. They are all less.

Chet
 
  • #6
Thanks Chet, I hope it's okay if I keep asking you questions. It really is my favorite way to learn, I can't get enough of the second law, and I'm sure I will learn something - not the least of which will be precision of terms.

In the case of a gas in a cylinder with a piston (aka "the damper"), why does the amount of dissipation vary with the amount of force per unit time? What does the time rate of force have to do with the efficiency of conversion to mechanical energy? Why does the difference at any given time between the system and surroundings dictate the reversibility, as opposed to, say, the amount of energy transferred altogether?
 
Last edited:
  • Like
Likes Greg Bernhardt
  • #7
Jimster41 said:
Thanks Chet, I hope it's okay if I keep asking you questions. It really is my favorite way to learn, I can't get enough of the second law, and I'm sure I will learn something - not the least of which will be precision of terms.

In the case of a gas in a cylinder with a piston (or damper), why does the amount of dissipation vary with the amount of force per unit time? What does the time rate of force have to do with the efficiency of conversion to mechanical energy? Why does the difference at any given time between the system and surroundings dictate the reversibility, as opposed to, say, the amount of energy transferred altogether?
Hi Jimster. You ask great questions.

Why don't you introduce this in a separate thread, and we'll work through it together? First we'll consider the spring/damper system to get an idea of how a difference between a rapid deformation and a very slow deformation (between the same initial and final states) plays out in terms of mechanical energy dissipated in the damper and work done. The idea is for you to get a feel for how this works.

Chet
 
  • #8
Thanks Chet. Your explanation of the "Clausius Inequality" and your answer on the difference between reversible and non-reversible paths were helpful and lucid, and it means a lot to know they are at least sensible questions.

I don't suppose @techmologist and I could get your help reading an old Gavin Crooks paper from 1999 on the "generalized fluctuation theorem"? We've got a thread going in the cosmology forum. PeterDonis has been helping us (humoring us, more like it). It's under @techmologist's question "why are there heat engines?" It's pretty rambly at this point, so I would be more than happy to restart it focused back on Crooks' paper and its handful of equations, and drill in with your guidance.
 
  • #9
Jimster41 said:
Thanks Chet. Your explanation of the "Clausius Inequality" and your answer on the difference between reversible and non-reversible paths were helpful and lucid, and it means a lot to know they are at least sensible questions.

I don't suppose @techmologist and I could get your help reading an old Gavin Crooks paper from 1999 on the "generalized fluctuation theorem"? We've got a thread going in the cosmology forum. PeterDonis has been helping us (humoring us, more like it). It's under @techmologist's question "why are there heat engines?" It's pretty rambly at this point, so I would be more than happy to restart it focused back on Crooks' paper and its handful of equations, and drill in with your guidance.
I'll take a look and see if I can contribute. There are lots of pages and lots of posts, so it may take me a while to come up to speed. No guarantees.

Chet
 
  • #10
Hello Chestermiller.

"There have been nearly as many formulations of the second law as there have been discussions of it."

~P. W. Bridgman

Entropy and the Second Law of Thermodynamics is not exactly an intuitive concept. While I think your article is basically a good one, it is obviously somewhat limited in scope, and my only critique is that you did not cover some of the most important aspects of entropy.

I agree that most people have a very hard time grasping entropy and the second law of thermodynamics. But I am not sure I understand why your article keeps referring to reversible processes and adiabatic idealizations. In natural systems, the entropy production rate of every process is always positive (ΔS > 0) or zero (ΔS = 0). But only idealized adiabatic (perfectly insulated) and isentropic (frictionless, non-viscous, pressure-volume work only) processes actually have an entropy production rate of zero. Heat is produced, but not entropy. In nature, this ideal can only be an approximation, because it requires an infinite amount of time and no dissipation.

You hardly mention irreversible processes. An irreversible process degrades the performance of a thermodynamic system, and results in entropy production. Thus, irreversible processes have an entropy production rate greater than zero (ΔS > 0), and that is really what the second law is all about (beyond the second law analysis of machines or devices). Every naturally occurring process, whether adiabatic or not, is irreversible (ΔS > 0), since friction and viscosity are always present.

Here is my favorite example of an irreversible thermodynamic process, the Entropy Rate Balance Equation for Control Volumes:

[itex]\frac{dS_{cv}}{dt} = \sum_j \frac{\dot{Q}_j}{T_j} + \sum_i \dot{m}_i s_i - \sum_e \dot{m}_e s_e + \dot{\sigma}_{cv}[/itex]


And here are a couple of other important things you did not mention about entropy:

1) Entropy is a measure of molecular disorder in a system. According to Kelvin, a pure substance at absolute zero temperature is in perfect order, and its entropy is zero. This is the less commonly known Third Law of Thermodynamics.

2) "A system will select the path or assemblage of paths out of available paths that minimizes the potential or maximizes the entropy at the fastest rate given the constraints." This is known as the Law of Maximum Entropy Production. "The Law of Maximum Entropy Production thus has deep implications for evolutionary theory, culture theory, macroeconomics, human globalization, and more generally the time-dependent development of the Earth as a ecological planetary system as a whole." http://www.lawofmaximumentropyproduction.com/

And apparently, I just got another trophy since this is my first post!
 
Last edited:
  • #11
No problem Chet, don't feel obliged. Might be just as well if you were to do something you thought would be most helpful, rather than follow us down a rabbit hole. This is the Crooks paper

http://arxiv.org/abs/cond-mat/9901352

The Entropy Production Fluctuation Theorem and the Nonequilibrium Work Relation for Free Energy Differences
Gavin E. Crooks
(Submitted on 29 Jan 1999 (v1), last revised 29 Jul 1999 (this version, v4))
There are only a very few known relations in statistical dynamics that are valid for systems driven arbitrarily far-from-equilibrium. One of these is the fluctuation theorem, which places conditions on the entropy production probability distribution of nonequilibrium systems. Another recently discovered far-from-equilibrium expression relates nonequilibrium measurements of the work done on a system to equilibrium free energy differences. In this paper, we derive a generalized version of the fluctuation theorem for stochastic, microscopically reversible dynamics. Invoking this generalized theorem provides a succinct proof of the nonequilibrium work relation.
I'm interested in the fluctuation theorem, just understanding it really, it seems to underpin the second law? What I liked about Crooks formulation is that I thought I could see more how "entropy" path selection and work are related. But I have little confidence I understand it.
 
  • #12
INFO-MAN said:
Hello Chestermiller.

Entropy and the Second Law of Thermodynamics is not exactly an intuitive concept. While I think your article is basically a good one, it is obviously somewhat limited in scope, and my only critique is that you did not cover some of the most important aspects of entropy.
Thanks INFO-MAN. It's nice to be appreciated.

Yes. You are correct. I deliberately limited the scope. Possibly you misconstrued my objective. It was definitely not to write a treatise on entropy and the 2nd law. I was merely trying to give beginning thermodynamics students who are struggling with the basic concepts the minimum understanding they need just to do their homework. As someone relatively new to Physics Forums, you may not be aware of the kinds of questions we get from novices. A typical recurring question is: How come the entropy change is not zero for an irreversible adiabatic process if the change in entropy is equal to the integral of dq/T and dq = 0? Homework problems frequently involve irreversible adiabatic expansion or compression of an ideal gas in a cylinder with a piston. Students are often asked to determine the final equilibrium state of the system, and the change in entropy. You can see how, if they are asking questions like the previous one, they would have trouble doing a homework problem like this.
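As an illustration of the kind of homework problem described above (the numbers are my own, not from the article): suddenly compress 1 mol of an ideal monatomic gas adiabatically against a constant external pressure until it equilibrates. The energy balance fixes the final state, and ΔS, computed over a reversible path between the same two states, comes out positive even though q = 0.

```python
import math

# Worked example (illustrative numbers, assumed): irreversible adiabatic
# compression of 1 mol of ideal monatomic gas, T1 = 300 K, P1 = 1 bar,
# against a constant external pressure Pext = 2 bar.
R = 8.314          # J/(mol K)
Cv = 1.5 * R       # monatomic ideal gas
T1, P1, Pext = 300.0, 1.0e5, 2.0e5

# Energy balance with q = 0:  Cv*(T2 - T1) = -Pext*(V2 - V1),
# with V = R*T/P and final pressure P2 = Pext.  Solving for T2:
T2 = T1 * (Cv + R * Pext / P1) / (Cv + R)

# Entropy change evaluated over a reversible path between the same states:
V_ratio = (T2 / Pext) / (T1 / P1)
dS = Cv * math.log(T2 / T1) + R * math.log(V_ratio)
print(T2, dS)   # dS > 0 even though dq = 0 along the actual path
```

This is exactly why the naive "ΔS = ∫dq/T = 0" reasoning fails: the integral of dq/T along the irreversible path is not the entropy change.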

My original introduction to the tutorial was somewhat longer than in the present version, and spelled out the objectives more clearly. However, the Physics Forums guidelines set a goal of about 400 words for the Insight articles, and the present version of my article is already well over 1000 words. Here is the introductory text that I cut out:

In this author's judgement, the primary cause of the (students') confusion is the poor manner in which these concepts are taught in textbooks and courses.

The standard approach is to present the chronological development of the subject in a straight line from beginning to end. Although this is the way that the subject had developed historically, it is not necessarily the best way to teach the subject. It is much more important for the students to gain a solid understanding of the material by whatever means possible than to adhere to a totally accurate account of the chronological sequence. Therefore, in the present document, we have created a somewhat fictionalized account of the historical sequence of events in order to minimize the historical discussion, focus more intently on the scientific findings, and make the concepts clearer and less confusing to students.

Another shortcoming of existing developments is that the physical situations they discuss are not specified precisely enough, and the mathematical relationships likewise lack proper constraints on their applicability and limitations (particularly the so-called Clausius Inequality). There is also a lack of a concise mathematical statement of the second law of thermodynamics expressed in such a way that it can be confidently applied to practical situations and problem solving. In the present development, we have endeavored to overcome these shortcomings.

I agree that most people have a very hard time grasping entropy and the second law of thermodynamics. But I am not sure I understand why your article keeps referring to reversible processes and adiabatic idealizations. In natural systems, the entropy production rate of every process is always positive (ΔS > 0) or zero (ΔS = 0). But only idealized adiabatic (perfectly insulated) and isentropic (frictionless, non-viscous, pressure-volume work only) processes actually have an entropy production rate of zero. Heat is produced, but not entropy. In nature, this ideal can only be an approximation, because it requires an infinite amount of time and no dissipation.
This is an example of one of those instances I was referring to in which the constraints on the equations are not spelled out clearly enough, and, as a result, confusion can ensue. The situation you are referring to here with the inequality (ΔS > 0) and equality (ΔS = 0) applies to the combination of the system and the surroundings, and not just to a closed system. Without this qualification, the student might get the idea that for a closed system, ΔS ≥ 0 always, which is, of course, not the case.

Even though reversible processes are an idealization, there is still a need for beginners to understand them. First of all they provide an important limiting case with which irreversible processes can be compared. In geometry, there is no such thing as a perfect circle, a perfect rectangle, a perfect square, etc., but yet we still study them and apply their concepts in our work and lives. Secondly, some of the processes that occur in nature and especially in industry can approach ideal reversible behavior. Finally, and most importantly, reversible processes are the only vehicle we have for determining the change in entropy between two thermodynamic equilibrium states of a system or material.
You hardly mention irreversible processes. An irreversible process degrades the performance of a thermodynamic system, and results in entropy production. Thus, irreversible processes have an entropy production rate greater than zero (ΔS > 0), and that is really what the second law is all about (beyond the second law analysis of machines or devices). Every naturally occurring process, whether adiabatic or not, is irreversible (ΔS > 0), since friction and viscosity are always present.
I'm sorry that impression came through to you, because that was not my intention. I feel that it is very important for students to understand the distinction between real irreversible process paths and ideal reversible process paths. Irreversible process paths are what really happens. But reversible process paths are what we need to use to get the change in entropy for a real irreversible process path.
Here is my favorite example of an irreversible thermodynamic process, the Entropy Rate Balance Equation for Control Volumes:

[itex]\frac{dS_{cv}}{dt} = \sum_j \frac{\dot{Q}_j}{T_j} + \sum_i \dot{m}_i s_i - \sum_e \dot{m}_e s_e + \dot{\sigma}_{cv}[/itex]
This equation applies to the more general case of an open system for which mass is entering and exiting, and I was trying to keep things simple by restricting the discussion to closed systems. Also, entropy generation can be learned by the struggling students at a later stage.

And here are a couple of other important things you did not mention about entropy:

1) Entropy is a measure of molecular disorder in a system. According to Kelvin, a pure substance at absolute zero temperature is in perfect order, and its entropy is zero. This is the less commonly known Third Law of Thermodynamics.

2) "A system will select the path or assemblage of paths out of available paths that minimizes the potential or maximizes the entropy at the fastest rate given the constraints." This is known as the Law of Maximum Entropy Production. "The Law of Maximum Entropy Production thus has deep implications for evolutionary theory, culture theory, macroeconomics, human globalization, and more generally the time-dependent development of the Earth as a ecological planetary system as a whole." http://www.lawofmaximumentropyproduction.com/

As I said above, I was trying to limit the scope exclusively to what the beginning students needed to understand in order to do their homework.

Chet
 
  • #13
That was great Chet. It helps to know the purpose and scope. Hey, can you explain to a confused student why the change in entropy in a closed system is not always greater than or equal to 0? I think I know (Poincaré recurrence?) but I also think I'm probably wrong.
 
  • #14
Jimster41 said:
That was great Chet. It helps to know the purpose and scope. Hey, can you explain to a confused student why the change in entropy in a closed system is not always greater than or equal to 0? I think I know (Poincaré recurrence?) but I also think I'm probably wrong.
Suppose you compress a gas isothermally and reversibly in a closed system. To hold the temperature constant, do you have to add heat or remove heat? After you compress the gas to a smaller volume at the same temperature, are the number of quantum states available to it greater or fewer?

You are aware that, in thermodynamics, there is a difference between a closed system and an isolated system, correct?

Chet
 
Last edited:
  • #15
Chestermiller said:
Suppose you compress a gas isothermally and reversibly in a closed system. To hold the temperature constant, do you have to add heat or remove heat? After you compress the gas to a smaller volume at the same temperature, are the number of quantum states available to it greater or fewer?

You are aware that, in thermodynamics, there is a difference between a closed system and an isolated system, correct?

Chet
No sir, I was not clear on that precise difference of terms! Now I am. I believe you need to remove heat. Hmm, the quantum states. That one really makes me think, with great confusion, which is not good, since the answer should probably be obvious. In the closed system that has been isothermally compressed (heat removed), I would say the number of states is fewer? But it's basically a guess. I don't know how to decompose that question with any confidence. I think I know something about the parts, but probably have way too many questions and misconceptions tangled up in it. Please do illuminate!

I say fewer because the volume is less, and so the available "locations" are reduced. But this does not seem very satisfactory, right, or clear.
 
Last edited:
  • #16
Jimster41 said:
No sir, I was not clear on that precise difference of terms! Now I am. I believe you need to remove heat. Hmm, the quantum states. That one really makes me think, with great confusion, which is not good, since the answer should probably be obvious. In the closed system that has been isothermally compressed (heat removed), I would say the number of states is fewer? But it's basically a guess. I don't know how to decompose that question with any confidence. I think I know something about the parts, but probably have way too many questions and misconceptions tangled up in it. Please do illuminate!

I say fewer because the volume is less, and so the available "locations" are reduced. But this does not seem very satisfactory, right, or clear.
Both your answers are correct. You remove heat from the system in an isothermal reversible compression, so ΔS < 0 (q is negative). The number of states available to the system is fewer, so, by that criterion also, ΔS < 0.
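A quick numerical check of this (my own illustrative values, assuming an ideal gas): for a reversible isothermal compression, the heat added is q_rev = nRT ln(V2/V1), and ΔS = q_rev/T = nR ln(V2/V1), which is negative whenever V2 < V1.

```python
import math

# Reversible isothermal compression of n = 1 mol of ideal gas at T = 300 K,
# halving the volume (illustrative numbers; only the volume ratio matters).
R, n, T = 8.314, 1.0, 300.0
V1, V2 = 2.0, 1.0

q_rev = n * R * T * math.log(V2 / V1)  # heat added; negative => heat removed
dS = q_rev / T                         # = n*R*ln(V2/V1), about -5.76 J/K
print(q_rev, dS)                       # both negative
```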

A closed system is one that cannot exchange mass with its surroundings, but it can exchange heat and mechanical energy (work W). An isolated system is one that can exchange neither mass, heat, nor work.

Chet
 
  • #17
I find your "temperature at the interface with the surroundings" confusing in that to me it implies an average temperature between the system and the surroundings at that point. Would it be more clear to say "temperature of the surroundings at the interface" or am I missing something?
 
  • #18
insightful said:
I find your "temperature at the interface with the surroundings" confusing in that to me it implies an average temperature between the system and the surroundings at that point. Would it be more clear to say "temperature of the surroundings at the interface" or am I missing something?
What you're missing is that, at the interface, the local temperature of the system matches the temperature of the surroundings. There is no discontinuity in temperature (or in force per unit area) at the interface. However, in an irreversible process, the temperature within the system varies with distance from the interface.

Chet
 
Last edited:
  • #19
So you're saying there is a temperature gradient between the "bulk" system and the interface, but no temperature gradient between the "bulk" surroundings and the interface?
 
  • #20
insightful said:
So you're saying there is a temperature gradient between the "bulk" system and the interface
Yes. With an irreversible process, there is a temperature difference between the average temperature in the system and the temperature at the interface. However, at the very interface, the local system temperature matches the local surroundings temperature.
, but no temperature gradient between the "bulk" surroundings and the interface?
Not necessarily. I've tried to get us focused primarily on the system. I'm assuming that we are not concerning ourselves with the details of what is happening within the surroundings, except at the interface, where we are assuming that either the heat flux or the temperature is specified. (Of course, more complicated boundary conditions can also be imposed, and are included within the framework of our methodology). Thus, the "boundary conditions" for work and heat flow on the system are applied at the interface.

Chet
 
  • #21
Chestermiller said:
Both your answers are correct. You remove heat from the system in an isothermal reversible compression, so ΔS < 0 (q is negative). The number of states available to the system is fewer, so, by that criterion also, ΔS < 0.

A closed system is one that cannot exchange mass with its surroundings, but it can exchange heat and mechanical energy (work W). An isolated system is one that can exchange neither mass, heat, nor work.

Chet
I realize why the quantum states question confuses me. Probably it is an issue of specific terms.

If I picture the piston and cylinder made of graph paper cells containing 1's and 0's, with the volume of the uncompressed cylinder as an area of 0's mixed with 1's (representing the uncompressed gas in the cylinder), this area then surrounded by some more 1's representing the boundaries of the cylinder, including the piston. If I then compress the gas by changing some of the cylinder volume cells to 1's, I haven't changed the number of states in the system (the graph paper hasn't shrunk or lost cells); I have just added information assigning some of the cells of the cylinder volume specific values. So I guess by "available QM states" you mean those that are uncertain, or "free" to be randomly set to 1 or 0.

Maybe it's a bad metaphor, because I get even more confused when I think that to expand the "cylinder" I still have to add information, changing a set of "fixed cells" to be "free".
 
Last edited:
  • #22
Jimster41 said:
I realize why the quantum states question confuses me. Probably it is an issue of specific terms.

If I picture the piston and cylinder made of graph paper cells containing 1's and 0's, with the volume of the uncompressed cylinder as an area of 0's mixed with 1's (representing the uncompressed gas in the cylinder), this area then surrounded by some more 1's representing the boundaries of the cylinder, including the piston. If I then compress the gas by changing some of the cylinder volume cells to 1's, I haven't changed the number of states in the system (the graph paper hasn't shrunk or lost cells); I have just added information assigning some of the cells of the cylinder volume specific values. So I guess by "available QM states" you mean those that are uncertain, or "free" to be randomly set to 1 or 0.

Maybe it's a bad metaphor, because I get even more confused when I think that to expand the "cylinder" I still have to add information, changing a set of "fixed cells" to be "free".
My goal was to emphasize the classical approach to entropy in my development, and to generally skip the statistical thermodynamic perspective.

Chet
 
  • #23
Thanks Chet. Didn't mean to take you off track. Just trying to leverage my confusion opportunity, to figure out what I'm getting wrong when I think of it. Maybe your next one. Or I could take it to a new thread?
 
  • #24
What does the ∫ symbol in the thermodynamics equations mean? (i know what + - * / mean :) Oh and brackets)
 
  • #25
nitsuj said:
What does the ∫ symbol in the thermodynamics equations mean?
It is an integral sign. Apparently, you haven't had calculus yet. You are not going to be able to understand and apply much of thermodynamics without the basic tool of calculus.
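In case it helps connect the symbol to the thermodynamics: ∫dq/T just means adding up many tiny dq/T contributions along a path. A sketch (my own example, not from the thread, assuming a constant specific heat): reversibly heating 1 kg of water from 300 K to 350 K, where each small step adds dq = m·c·dT.

```python
import math

# Summing many small dq/T pieces versus the closed-form integral.
m, c = 1.0, 4186.0          # 1 kg of water, J/(kg K) (assumed constant)
T1, T2, N = 300.0, 350.0, 100_000

dT = (T2 - T1) / N
dS_sum = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(N))

# The integral sign denotes exactly this sum in the limit of tiny steps:
dS_exact = m * c * math.log(T2 / T1)
print(dS_sum, dS_exact)     # both about 645.3 J/K
```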
(i know what + - * / mean :) Oh and brackets)
Brackets are a kind of parenthesis.

Chet
 
  • #26
Chestermiller said:
It is an integral sign. Apparently, you haven't had calculus yet. You are not going to be able to understand and apply much of thermodynamics without the basic tool of calculus.

Brackets are a kind of parenthesis.

Chet

thanks Chet, so ∫ is more complicated than it looks :smile:
 
  • #27
nitsuj said:
thanks Chet, so ∫ is more complicated than it looks :smile:
Hopefully not to those who have had calculus.:smile:
 
  • #28
I thought it was a lower bound, not an upper bound and the lower bound is zero. If you go from one state to another with a perfectly reversible process then the entropy generated is zero.
 
  • #29
wvphysicist said:
I thought it was a lower bound, not an upper bound and the lower bound is zero. If you go from one state to another with a perfectly reversible process then the entropy generated is zero.
Nope, for a closed system undergoing a reversible change, the entropy change of the system is an upper bound. And for a perfectly reversible process, the entropy change for the system (which is clearly stated as the focus of my article) is not necessarily zero; in fact, it can even be less than zero.

For the combination of system and surroundings, the entropy generated in a reversible process is zero, provided that the surroundings are also handled reversibly.

Chet
 
  • #30
OK, I understand a little more and accept the last sentence. I think primarily about heat engines.

I have two questions about closed systems. Consider two closed systems, both have a chemical reaction area which releases a small amount of heat and are initially at the freezing point. One has water and no ice and the other has ice. I expect after the chemical reaction the water system will absorb the heat with a tiny change in temperature and the other will convert a small amount of ice to water. Is there any difference in the increase of energy? Suppose I choose masses to enable the delta T of the water system to go toward zero. Is there any difference?

I have another problem with entropy. Some folks say it involves information. I have maintained that only energy is involved. Consider a system containing two gases. The atoms are identical except half are red and the others are blue. Initially the red and blue are separated by a card in the center of the container. The card is removed and the atoms mix. How can there be a change in entropy?

Oh, one more please. Can you show an example where the entropy change is negative like you were saying?
 
  • #31
wvphysicist said:
I have two questions about closed systems. Consider two closed systems, both have a chemical reaction area which releases a small amount of heat and are initially at the freezing point. One has water and no ice and the other has ice. I expect after the chemical reaction the water system will absorb the heat with a tiny change in temperature and the other will convert a small amount of ice to water. Is there any difference in the increase of energy? Suppose I choose masses to enable the delta T of the water system to go toward zero. Is there any difference?
I don't have a clear idea of what this question is asking. Let me try to articulate my understanding, and you can then correct it. You have an isolated system containing an exothermic chemical reaction vessel in contact with a cold bath. In one case, the bath contains ice floating in water at 0 C. In the other case, the bath contains only water at 0 C. Is there a difference in the energy transferred from the reaction vessel to the bath in the two cases? How is this affected if the amount of water in the second case is increased? (Are you also asking about the entropy changes in these cases?)

I have another problem with entropy. Some folks say it involves information. I have maintained that only energy is involved. Consider a system containing two gases. The atoms are identical except half are red and the others are blue. Initially the red and blue are separated by a card in the center of the container. The card is removed and the atoms mix. How can there be a change in entropy?
This question goes beyond the scope of what I was trying to cover in my article. It involves the thermodynamics of mixtures. I'm trying to decide whether to answer this in the present Comments or write an introductory Insight article on the subject. I need time to think about what I want to do. Meanwhile, I can tell you that there is an entropy increase for the change that you described, and that the entropy change can be worked out using the integral of dq_rev/T.
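For what it's worth, the standard ideal-gas result this alludes to (a sketch, not from the article): when the card is removed, each distinguishable gas expands into the full volume, giving a mixing entropy ΔS = −R Σ nᵢ ln xᵢ, where xᵢ is the mole fraction of species i.

```python
import math

R = 8.314  # J/(mol K)

def mixing_entropy(moles):
    """Ideal-gas entropy of mixing, dS = -R * sum(n_i * ln(x_i))."""
    n_total = sum(moles)
    return -R * sum(n * math.log(n / n_total) for n in moles)

# One mole of "red" atoms mixing with one mole of "blue" atoms:
print(mixing_entropy([1.0, 1.0]))  # 2*R*ln(2), about 11.5 J/K
```

Note that the result is positive only because red and blue are distinguishable; if the atoms were truly identical, removing the card would change nothing (the Gibbs paradox).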
Oh, one more please. Can you show an example where the entropy change is negative like you were saying?
This one is easy. Just consider a closed system in which you bring about the isothermal reversible compression of an ideal gas, so that the final temperature is equal to the initial temperature, the final volume is less than the initial volume, and the final pressure is higher than the initial pressure.
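As a quick numerical check of that example (my own sketch; the specific amounts are arbitrary): for an ideal gas compressed reversibly and isothermally, [itex]\Delta S = nR\ln(V_2/V_1)[/itex], which is negative whenever [itex]V_2 < V_1[/itex].

```python
import math

R = 8.314  # J/(mol K), ideal gas constant

def delta_S_isothermal(n, V1, V2):
    """Entropy change of n moles of ideal gas taken reversibly and
    isothermally from volume V1 to volume V2: dS = n R ln(V2/V1)."""
    return n * R * math.log(V2 / V1)

# Compress 1 mol from 2 L down to 1 L at constant temperature:
dS = delta_S_isothermal(1.0, 2.0e-3, 1.0e-3)
print(dS)  # about -5.76 J/K: the system's entropy decreases
```

The surroundings, which absorb the rejected heat, gain at least as much entropy, so the total for system plus surroundings still does not decrease.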

Chet
 
  • #32
To make sure I understand the end of your article clearly:

(1) A system can go from an initial equilibrium state to a final equilibrium state through a reversible or irreversible process.
(2) Whichever process it undergoes, its change in entropy will be the same.
(3) That change in entropy can be determined by evaluating the following integral over any reversible process path the system could have gone through: [itex]\int_{t_i}^{t_f} {\frac{\dot{q}_{rev}(t)}{T(t)}dt}[/itex].
(4) If the system goes through an irreversible process path, this integral: [itex]\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_{Int}(t)}dt}[/itex] will yield a lesser value than the reversible path integral, but the change in entropy would still be equal to the (greater) evaluation of the reversible path integral.

Is that right?
 
Last edited:
  • #33
DocZaius said:
To make sure I understand the end of your article clearly:

(1) A system can go from an initial equilibrium state to a final equilibrium state through a reversible or irreversible process.
(2) Whichever process it undergoes, its change in entropy will be the same.
(3) That change in entropy can be determined by evaluating the following integral over any reversible process path the system could have gone through: [itex]\int_{t_i}^{t_f} {\frac{\dot{q}_{rev}(t)}{T(t)}dt}[/itex].
(4) If the system goes through an irreversible process path, this integral: [itex]\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_{Int}(t)}dt}[/itex] will yield a lesser value than the reversible path integral, but the change in entropy would still be equal to the (greater) evaluation of the reversible path integral.

Is that right?
Perfect.
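The inequality in point (4) can be illustrated with the classic extreme case of a free expansion of an ideal gas into vacuum (a sketch I'm adding for illustration, not part of the original exchange): the actual irreversible process exchanges no heat, so the Clausius integral along it is zero, while [itex]\Delta S[/itex] is fixed by a reversible isothermal path between the same end states.

```python
import math

R = 8.314  # J/(mol K), ideal gas constant

n, V1, V2, T = 1.0, 1.0e-3, 2.0e-3, 300.0  # 1 mol doubling its volume at 300 K

# Reversible isothermal expansion between the same end states:
# q_rev = n R T ln(V2/V1), absorbed at the single temperature T.
q_rev = n * R * T * math.log(V2 / V1)
delta_S = q_rev / T  # = n R ln(V2/V1), positive for an expansion

# Along the actual irreversible path (free expansion into vacuum),
# no heat crosses the boundary, so the integral of dq/T_int is zero.
clausius_integral = 0.0

print(delta_S, clausius_integral)  # the irreversible integral falls short of dS
```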
 
  • #34
I have two questions about closed systems. Consider two closed systems, both have a chemical reaction area which releases a small amount of heat and are initially at the freezing point. One has water and no ice and the other has ice. I expect after the chemical reaction the water system will absorb the heat with a tiny change in temperature and the other will convert a small amount of ice to water. Is there any difference in the increase of energy? Suppose I choose masses to enable the delta T of the water system to go toward zero. Is there any difference?

I don't have a clear idea of what this question is about. Let me try to articulate my understanding, and you can then correct it. You have an isolated system containing an exothermic chemical reaction vessel in contact with a cold bath. In one case, the bath contains ice floating in water at 0 C. In the other case, the bath contains only water at 0 C. Is there a difference in the energy transferred from the reaction vessel to the bath in the two cases? How is this affected if the amount of water in the second case is increased? (Are you also asking about the entropy changes in these cases?)

Using identical reaction vessels, the energy transferred is the same in both cases. The question is about the entropy change: will heating the water by a tiny delta T, or melting a little ice, result in the same entropy change?
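One way to see the limiting behavior this question asks about is numerically (my own sketch, using standard property values; the masses and heat load are made up for illustration): the ice bath absorbs Q at the fixed temperature T0, so its entropy gain is Q/T0, while the water-only bath gains m c ln(T1/T0), which approaches Q/T0 as the mass grows and delta T goes to zero.

```python
import math

T0 = 273.15        # K, freezing point of water
c_water = 4186.0   # J/(kg K), specific heat of liquid water (approximate)
Q = 1000.0         # J, heat released by the reaction into the bath

# Ice-water bath: heat absorbed at the constant temperature T0,
# melting a little ice, so dS = Q / T0.
dS_ice_bath = Q / T0

# Water-only bath of mass m: the temperature rises by dT = Q / (m c),
# and dS = m c ln(T1 / T0).
for m in (1.0, 10.0, 100.0):  # kg
    T1 = T0 + Q / (m * c_water)
    dS_water_bath = m * c_water * math.log(T1 / T0)
    print(m, dS_water_bath, dS_ice_bath)

# As m grows (delta T -> 0), the water bath's entropy gain approaches Q/T0,
# the same value as for the melting ice.
```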
 
  • #35
This question goes beyond the scope of what I was trying to cover in my article. It involves the thermodynamics of mixtures. I'm trying to decide whether to answer this in the present Comments or write an introductory Insight article on the subject. I need time to think about what I want to do. Meanwhile, I can tell you that there is an entropy increase for the change that you described, and that the entropy change can be worked out using the integral of [itex]dq_{rev}/T[/itex].
I cannot understand the symbols in the integral.
 
1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a thermodynamic property that describes the distribution of energy within a system.

2. How does entropy relate to the 2nd law of thermodynamics?

The 2nd law of thermodynamics states that the total entropy of a closed system will always increase over time. This means that systems tend to move towards a state of higher disorder or randomness.

3. Can entropy be reversed?

In isolated systems, entropy can never decrease. However, in open systems, energy can be added or removed, causing the system's entropy to increase or decrease, respectively.

4. What are some real-world examples of entropy?

Some examples of entropy in everyday life include a cup of hot coffee cooling down, a car engine producing heat and sound, and a room becoming messy over time.

5. How does understanding entropy and the 2nd law of thermodynamics benefit us?

Understanding entropy and the 2nd law of thermodynamics allows us to predict and control the behavior of physical systems. It also helps us understand why certain processes occur in nature and how to improve the efficiency of energy transformations.
