Understanding Entropy and the 2nd Law of Thermodynamics - comments

Chestermiller's post on entropy and the second law of thermodynamics aims to clarify these complex concepts for beginners. Key discussions focus on the differences between reversible and irreversible processes, emphasizing that reversible paths minimize energy dissipation, while irreversible paths result in greater entropy production. Participants express appreciation for the clarity of the explanations but also seek deeper understanding, particularly regarding the implications of the second law and entropy's role in natural processes. The conversation highlights the need for further exploration of topics like the fluctuation theorem and its relationship to entropy. Overall, the thread encourages ongoing questions and discussions to enhance comprehension of thermodynamic principles.
  • #31
wvphysicist said:
I have two questions about closed systems. Consider two closed systems, both containing a chemical reaction area that releases a small amount of heat, and both initially at the freezing point. One has water and no ice, and the other has ice. I expect that after the chemical reaction the water system will absorb the heat with a tiny change in temperature, and the other will convert a small amount of ice to water. Is there any difference in the increase in energy? Suppose I choose masses so that the delta T of the water system goes toward zero. Is there any difference?
I don't have a clear idea of what this question is about. Let me try to articulate my understanding, and you can then correct it. You have an isolated system containing an exothermic chemical reaction vessel in contact with a cold bath. In one case, the bath contains ice floating in water at 0 °C. In the other case, the bath contains only water at 0 °C. Is there a difference in the energy transferred from the reaction vessel to the bath in the two cases? How is this affected if the amount of water in the second case is increased? (Are you also asking about the entropy changes in these cases?)

I have another problem with entropy. Some folks say it involves information. I have maintained that only energy is involved. Consider a system containing two gases. The atoms are identical except that half are red and the other half are blue. Initially, the red and blue atoms are separated by a card in the center of the container. The card is removed and the atoms mix. How can there be a change in entropy?
This question goes beyond the scope of what I was trying to cover in my article. It involves the thermodynamics of mixtures. I'm trying to decide whether to answer this in the present Comments or write an introductory Insight article on the subject. I need time to think about what I want to do. Meanwhile, I can tell you that there is an entropy increase for the change that you described and that the entropy change can be worked out using energy with the integral of ##dq_{rev}/T##.
Oh, one more please. Can you show an example where the entropy change is negative like you were saying?
This one is easy. Just consider a closed system in which you bring about the isothermal reversible compression of an ideal gas, so that the final temperature is equal to the initial temperature, the final volume is less than the initial volume, and the final pressure is higher than the initial pressure.
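For concreteness, here is a minimal Python sketch of this example (the numbers are arbitrary; it assumes the standard ideal-gas result that integrating ##dq_{rev}/T## along a reversible isotherm gives ##\Delta S = nR\ln(V_2/V_1)##):

import math

R = 8.314  # J/(mol K), universal gas constant

def delta_S_isothermal(n_mol, V_initial, V_final):
    # Reversible isothermal change of an ideal gas:
    # integrating dq_rev/T gives n*R*ln(V_final/V_initial).
    return n_mol * R * math.log(V_final / V_initial)

# Compress 1 mol from 2 L to 1 L at constant temperature:
print(delta_S_isothermal(1.0, 2.0, 1.0))  # about -5.76 J/K: negative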

Chet
 
  • #32
To make sure I understand the end of your article clearly:

(1) A system can go from an initial equilibrium state to a final equilibrium state through a reversible or irreversible process.
(2) Whichever process it undergoes, its change in entropy will be the same.
(3) That change in entropy can be determined by evaluating the following integral over any reversible process path the system could have gone through: ##\int_{t_i}^{t_f} \frac{\dot{q}_{rev}(t)}{T(t)}\,dt##.
(4) If the system goes through an irreversible process path, this integral: ##\int_{t_i}^{t_f} \frac{\dot{q}(t)}{T_{Int}(t)}\,dt## will yield a lesser value than the reversible path integral, but the change in entropy would still be equal to the (greater) evaluation of the reversible path integral.

Is that right?
 
  • #33
DocZaius said:
To make sure I understand the end of your article clearly:

(1) A system can go from an initial equilibrium state to a final equilibrium state through a reversible or irreversible process.
(2) Whichever process it undergoes, its change in entropy will be the same.
(3) That change in entropy can be determined by evaluating the following integral over any reversible process path the system could have gone through: ##\int_{t_i}^{t_f} \frac{\dot{q}_{rev}(t)}{T(t)}\,dt##.
(4) If the system goes through an irreversible process path, this integral: ##\int_{t_i}^{t_f} \frac{\dot{q}(t)}{T_{Int}(t)}\,dt## will yield a lesser value than the reversible path integral, but the change in entropy would still be equal to the (greater) evaluation of the reversible path integral.

Is that right?
Perfect.
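For concreteness, here is a minimal Python sketch of point (4), assuming an irreversible isothermal compression of an ideal gas against a constant external pressure equal to the final pressure, with the boundary held at the bath temperature (an illustrative example, not one from the article):

import math

R = 8.314                 # J/(mol K)
n, T = 1.0, 300.0         # mol, K
V1, V2 = 2.0e-3, 1.0e-3   # m^3: compress to half the volume

# Isothermal ideal gas: dU = 0, so Q equals the work done by the gas
# against the constant external pressure P2 = n*R*T/V2.
Q = (n * R * T / V2) * (V2 - V1)   # negative; heat leaves the gas
irreversible_integral = Q / T      # boundary temperature is constant

delta_S = n * R * math.log(V2 / V1)  # the state-function change

print(irreversible_integral)  # about -8.31 J/K
print(delta_S)                # about -5.76 J/K, the greater value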
 
  • #34
I have two questions about closed systems. Consider two closed systems, both containing a chemical reaction area that releases a small amount of heat, and both initially at the freezing point. One has water and no ice, and the other has ice. I expect that after the chemical reaction the water system will absorb the heat with a tiny change in temperature, and the other will convert a small amount of ice to water. Is there any difference in the increase in energy? Suppose I choose masses so that the delta T of the water system goes toward zero. Is there any difference?

I don't have a clear idea of what this question is about. Let me try to articulate my understanding, and you can then correct it. You have an isolated system containing an exothermic chemical reaction vessel in contact with a cold bath. In one case, the bath contains ice floating in water at 0 °C. In the other case, the bath contains only water at 0 °C. Is there a difference in the energy transferred from the reaction vessel to the bath in the two cases? How is this affected if the amount of water in the second case is increased? (Are you also asking about the entropy changes in these cases?)

Using identical reaction vessels, the energy transferred is the same in both cases. The question is about the entropy change. Will heating water by a tiny delta T or melting ice result in the same entropy change?
 
  • #35
This question goes beyond the scope of what I was trying to cover in my article. It involves the thermodynamics of mixtures. I'm trying to decide whether to answer this in the present Comments or write an introductory Insight article on the subject. I need time to think about what I want to do. Meanwhile, I can tell you that there is an entropy increase for the change that you described and that the entropy change can be worked out using energy with the integral of ##dq_{rev}/T##.
I cannot understand the symbols in the integral.
 
  • #36
wvphysicist said:
I have two questions about closed systems. Consider two closed systems, both containing a chemical reaction area that releases a small amount of heat, and both initially at the freezing point. One has water and no ice, and the other has ice. I expect that after the chemical reaction the water system will absorb the heat with a tiny change in temperature, and the other will convert a small amount of ice to water. Is there any difference in the increase in energy? Suppose I choose masses so that the delta T of the water system goes toward zero. Is there any difference?

I don't have a clear idea of what this question is about. Let me try to articulate my understanding, and you can then correct it. You have an isolated system containing an exothermic chemical reaction vessel in contact with a cold bath. In one case, the bath contains ice floating in water at 0 °C. In the other case, the bath contains only water at 0 °C. Is there a difference in the energy transferred from the reaction vessel to the bath in the two cases? How is this affected if the amount of water in the second case is increased? (Are you also asking about the entropy changes in these cases?)

Using identical reaction vessels, the energy transferred is the same in both cases. The question is about the entropy change. Will heating water by a tiny delta T or melting ice result in the same entropy change?
Yes, provided the delta T of the water is virtually zero. Otherwise, the final reactor temperature will not be the same in the two cases. So the reactor would have a different entropy change.
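As a numeric check, here is a minimal Python sketch of the two cases (the numbers are arbitrary, and it assumes the textbook value of roughly 4186 J/(kg K) for the heat capacity of liquid water):

import math

Q = 1000.0    # J, heat released by the reaction (arbitrary)
T0 = 273.15   # K, the freezing point

# Case 1: the heat melts a little ice at constant temperature T0.
dS_ice = Q / T0

# Case 2: the heat warms a large mass m of water from T0 to T0 + dT.
c_p = 4186.0                 # J/(kg K), heat capacity of water
m = 1000.0                   # kg; a large mass makes dT tiny
dT = Q / (m * c_p)           # about 0.00024 K
dS_water = m * c_p * math.log((T0 + dT) / T0)

print(dS_ice, dS_water)      # both about 3.66 J/K as dT -> 0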

Chet
 
  • #37
wvphysicist said:
This question goes beyond the scope of what I was trying to cover in my article. It involves the thermodynamics of mixtures. I'm trying to decide whether to answer this in the present Comments or write an introductory Insight article on the subject. I need time to think about what I want to do. Meanwhile, I can tell you that there is an entropy increase for the change that you described and that the entropy change can be worked out using energy with the integral of ##dq_{rev}/T##.
I cannot understand the symbols in the integral.
I don't understand what you are saying here. There are some basic concepts that need to be developed to describe mixtures. The starting point for mixtures goes back again to ideal gases, and discusses the thermodynamic properties of entropy, enthalpy, free energy, and volume of ideal gas mixtures, based on "Gibbs Theorem." This development enables you to determine the change in entropy in going from two pure gases, each at a certain pressure, to a mixture of the two gases at the same pressure. The partial pressures of the gases in the mixture are lower than their original pressures, so, to get to their final partial pressures, you would have to increase their volumes reversibly at constant temperature. This would give rise to an entropy increase for each of the gases. I'm uncomfortable going into it in more detail than this, because, to do it right, you need more extensive discussion. If you want to find out more about this, see Chapter 10 of Introduction to Chemical Engineering Thermodynamics by Smith and van Ness.
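Meanwhile, for the equimolar case in the question, here is a minimal Python sketch of the standard ideal-gas mixing result (a sketch only; the full development is in the reference above):

import math

R = 8.314  # J/(mol K)

def entropy_of_mixing(moles):
    # Ideal-gas mixing at constant T and total P: each gas expands
    # reversibly from P to its partial pressure x_i*P, contributing
    # -n_i*R*ln(x_i), so dS = -R * sum(n_i * ln(x_i)) > 0.
    n_total = sum(moles)
    return -R * sum(n * math.log(n / n_total) for n in moles)

# Equal amounts of "red" and "blue" gas, 1 mol each:
print(entropy_of_mixing([1.0, 1.0]))  # about +11.5 J/K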

Chet
 
  • #38
I have always thought that a reversible process would give the minimum change in entropy, i.e., a lower bound for the integral. Isn't it the case that the higher the entropy change, the more energy is dissipated by irreversibilities? In other words, why exactly is the integral always less than or equal to, and not greater than or equal to?
 
  • #39
nothingkwt said:
I have always thought that a reversible process would give the minimum change in entropy, i.e., a lower bound for the integral. Isn't it the case that the higher the entropy change, the more energy is dissipated by irreversibilities? In other words, why exactly is the integral always less than or equal to, and not greater than or equal to?
This is discussed in posts #28 and 29. As I clearly said in my article, I am referring to the entropy of a (closed) system, not to the entropy of the combination of system and surroundings (which constitutes an isolated system). Have you never seen the Clausius Inequality in the form that I presented it before?

All you need to do to convince yourself that what I said is correct is do a few sample problems for irreversible processes, where you compare the integral of the heat flow at the boundary divided by the temperature at the boundary with the entropy change of the system between the same two initial and final equilibrium states. In an isolated system, the integral is zero (since no heat is passing across the boundary of an isolated system), but the entropy change is greater than zero (for an irreversible change).
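The classic sample problem of this kind is the Joule free expansion; here is a minimal Python sketch (an illustrative example, using the ideal-gas entropy formula for the matching reversible path):

import math

R = 8.314  # J/(mol K)

# Free expansion: 1 mol of ideal gas in an isolated, rigid container
# doubles its volume into a vacuum. No heat crosses the boundary,
# so the integral of dq/T at the boundary is zero.
boundary_integral = 0.0

# Entropy change from a reversible isothermal path between the same
# two equilibrium states (T is unchanged for an ideal gas):
delta_S = 1.0 * R * math.log(2.0)

print(boundary_integral, delta_S)  # 0.0 vs. about +5.76 J/K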

Chet
 
  • #40
Thanks for your definitive take on thermodynamics one and two.
 
  • #41
I fear that one may get from this article the impression that the concept of entropy can only be introduced under very restricting assumptions. Here are some rhetorical questions: Are the systems for which we can introduce entropy really restricted to those describable in terms of only T and P? How about chemical processes or magnetization? Can heat and work only enter through the boundaries? How about warming a glass of milk in the microwave, then? In this case, the pressure is constant, but we can't assign a unique temperature to the system, not even at the boundary, as the distribution of energy over the internal states of the molecules is out of equilibrium.
This already shows that the Clausius inequality is of restricted value, as the integrals aren't defined for most irreversible processes.

In fact, we don't need to rack our brains over the complicated structure of non-equilibrium states. The point is that we can calculate entropy by integrating over a sequence of equilibrium states. It plays no role whether we can approximate this integral by an actual quasistatic process.
 
  • #42
DrDu said:
I fear that one may get from this article the impression that the concept of entropy can only be introduced under very restricting assumptions. Here are some rhetorical questions: Are the systems for which we can introduce entropy really restricted to those describable in terms of only T and P? How about chemical processes or magnetization? Can heat and work only enter through the boundaries? How about warming a glass of milk in the microwave, then? In this case, the pressure is constant, but we can't assign a unique temperature to the system, not even at the boundary, as the distribution of energy over the internal states of the molecules is out of equilibrium.
This already shows that the Clausius inequality is of restricted value, as the integrals aren't defined for most irreversible processes.
Thanks for your comment DrDu.

I fear that, perhaps, you have not read all my responses to the comments that have been made so far. I tried to make it clear what my motivation was for writing this article, particularly with regard to its limited scope. See, in particular, posts #12 and #31. To reiterate: My target audience was beginning thermodynamics students who are being exposed to the 1st and 2nd laws for the first time, but who, due to the poor manner in which the material is presented in most texts and courses, are so confused that they are unable to do their homework. I tried to keep the article "short and sweet" so that the students would not lose interest and stop reading before they reached the end. I did not want the article to be a complete treatise on thermodynamics. If you would like to expand on what I have written, you are welcome to write an Insights article of your own. I'm sure it would be very well received.

Along these same lines, I might mention that I am currently preparing another Insights article that focuses on the work done in reversible versus irreversible gas expansion/compression processes, and quantitatively identifies the fundamental mechanism by which the work in the two situations differs.
In fact, we don't need to rack our brains over the complicated structure of non-equilibrium states. The point is that we can calculate entropy by integrating over a sequence of equilibrium states. It plays no role whether we can approximate this integral by an actual quasistatic process.
I thought that I had covered this in my article when I wrote: "Note that the reversible process path used to determine the entropy change does not necessarily need to bear any resemblance to the actual process path."

Chet
 
  • #43
But you claimed your formulation of the Clausius inequality to be ##\emph{mathematically precise}##, didn't you?
 
  • #44
DrDu said:
But you claimed your formulation of the Clausius inequality to be ##emph{mathematically precise}##, didn't you?
Should read "mathematically precise".
 
  • #45
DrDu said:
Should read "mathematically precise".
In my judgement, this is sufficiently precise mathematically to address the students' needs at their introductory level (i.e., giving them the ability to understand and do their homework). As with any subject, additional refinement can be introduced at a later stage. For example, when we first learn about heat capacity, we are told that it is defined in terms of the path-dependent heat flow Q = C ΔT, but later learn that, more precisely, heat capacity is a function of state (not path), defined in terms of the partial derivatives of internal energy U or enthalpy H with respect to temperature. In my opinion, my judgement call is a considerably less blatant use of literary license than this.
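In symbols, those more precise state-function definitions are:
$$C_V=\left(\frac{\partial U}{\partial T}\right)_V,\qquad C_P=\left(\frac{\partial H}{\partial T}\right)_P$$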

Chet
 
  • #46
I think your article may be helpful to students, but it would be good to put some more disclaimers in places where it simplifies a lot.

For example, you wrote

The time variation of ##\dot{q}(t)## and ##\dot{w}(t)## between the initial and final states uniquely defines the so-called process path


I think this is true for a simple system whose thermodynamic state is determined by two numbers, say entropy and internal energy. But there may be more complicated situations, when one has magnetic and electric work in addition to volume work, and then the two numbers ##\dot{q}## and ##\dot{w}## are not sufficient to determine the path through the state space.
 
  • #47
This is a good explanation, but personally I feel like the classical description of thermodynamics which defines entropy as some maximum value of an integral should be deprecated in light of our increasing knowledge of physics. The statistical mechanics definition of entropy is far superior. The classical definition is inherently confusing because entropy is a state variable, yet it is defined in terms of paths. Each path gives you a different integral. Experimentally, how do you determine what the maximum is out of an infinite number of possible paths? If the system is opened up in a way such that more paths become available, can the entropy increase?

The statistical mechanics definition (and the related information theory definition) makes it clear why it is a state variable, because it only depends on the states. The paths are irrelevant.
 
  • #48
Khashishi said:
This is a good explanation, but personally I feel like the classical description of thermodynamics which defines entropy as some maximum value of an integral should be deprecated in light of our increasing knowledge of physics. The statistical mechanics definition of entropy is far superior. The classical definition is inherently confusing because entropy is a state variable, yet it is defined in terms of paths. Each path gives you a different integral. Experimentally, how do you determine what the maximum is out of an infinite number of possible paths? If the system is opened up in a way such that more paths become available, can the entropy increase?

The statistical mechanics definition (and the related information theory definition) makes it clear why it is a state variable, because it only depends on the states. The paths are irrelevant.

There is no one entropy to be defined by some optimal definition. In thermodynamics, (Clausius) entropy is defined through integrals. There is nothing confusing about it; to understand entropy in thermodynamics, the paths and integrals are necessary. It is hardly a disadvantage of the definition, since processes and integrals are very important things to understand while learning thermodynamics.

In statistical physics, there are several concepts that are also called entropy, but none of these is the same concept as Clausius entropy. *Sometimes* the statistical concept has a similar functional dependence on the macroscopic variables to thermodynamic entropy. But it is not the same concept as thermodynamic entropy. Any use of statistical physics for explanation of thermodynamics is based on the *assumption* that statistical physics applies to thermodynamic systems. It does not replace thermodynamics in any way.
 
  • #49
Jano L. said:
I think your article may be helpful to students, but it would be good to put some more disclaimers in places where it simplifies a lot.

For example, you wrote

The time variation of ##\dot{q}(t)## and ##\dot{w}(t)## between the initial and final states uniquely defines the so-called process path


I think this is true for a simple system whose thermodynamic state is determined by two numbers, say entropy and internal energy. But there may be more complicated situations, when one has magnetic and electric work in addition to volume work, and then the two numbers ##\dot{q}## and ##\dot{w}## are not sufficient to determine the path through the state space.
Thanks Jano L.

I toyed with the idea of mentioning that there are other forms of work that might need to be considered also, but in the end made the judgement call not to. You read my motivation for the article in some of my responses to the comments and in the article itself. I just wanted to include the bare minimum to give the students what they needed to do most of their homework problems. I felt that, if I made the article too long and comprehensive, they would stop reading before they had a chance to benefit from the article. There are many other things that I might have included as well, such as the more general form of the first law, which also includes changes in kinetic and potential energy of the system.

I invite you to consider writing a supplement to my article in which you flesh things out more completely. Thanks for your comment.

Chet
 
  • #50
Khashishi said:
This is a good explanation, but personally I feel like the classical description of thermodynamics which defines entropy as some maximum value of an integral should be deprecated in light of our increasing knowledge of physics. The statistical mechanics definition of entropy is far superior.
That's your opinion.
The classical definition is inherently confusing because entropy is a state variable, yet it is defined in terms of paths. Each path gives you a different integral. Experimentally, how do you determine what the maximum is out of an infinite number of possible paths? If the system is opened up in a way such that more paths become available, can the entropy increase?
Did you not read the article? I made it perfectly clear that:
  1. Entropy is a function of state
  2. There are an infinite number of process paths between the initial and final equilibrium states of the system
  3. The integral of dq/T over all these possible paths has a maximum value, and is thus a function of state
  4. All reversible paths between the initial and final equilibrium states of the system give exactly the same (maximum) value of the integral, so you don't need to evaluate all possible paths
  5. To get the change in entropy between the initial and final equilibrium states of the system, one needs only to conceive of a single convenient reversible path between the two states and integrate dq/T for that path (see the sketch below).
Try using the statistical mechanical definition of entropy to calculate the change in entropy of a real gas between two thermodynamic equilibrium states.
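To make point 5 concrete, here is a minimal Python sketch for an ideal gas (an illustration only; it assumes a monatomic gas with ##C_v = \frac{3}{2}R## and one convenient two-step reversible path):

import math

R = 8.314       # J/(mol K)
Cv = 1.5 * R    # molar heat capacity, monatomic ideal gas (assumed)

def delta_S_ideal_gas(n, T1, V1, T2, V2):
    # Convenient reversible path between the two equilibrium states:
    # (1) heat at constant volume from T1 to T2: n*Cv*ln(T2/T1)
    # (2) change volume isothermally at T2:      n*R*ln(V2/V1)
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# Any other reversible path between the same end states gives the same value:
print(delta_S_ideal_gas(1.0, 300.0, 1.0e-3, 450.0, 2.0e-3))  # about +10.8 J/K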

Chet
 
  • #51
You have taken internal pressure times change in volume in the work equation with a positive convention. In the YouTube lectures I am learning from, they take work as external pressure times change in volume with a negative convention; in this way work done on the system becomes positive, but J. M. Smith takes work done by the system as positive. So can you please explain some logical reason behind these conventions, and also the roles of the internal and external pressures in the work equation?
Thanks
 
  • #52
Chestermiller said:
That's your opinion.

Did you not read the article? I made it perfectly clear that:
  1. Entropy is a function of state
  2. There are an infinite number of process paths between the initial and final equilibrium states of the system
  3. The integral of dq/T over all these possible paths has a maximum value, and is thus a function of state
  4. All reversible paths between the initial and final equilibrium states of the system give exactly the same (maximum) value of the integral, so you don't need to evaluate all possible paths
  5. To get the change in entropy between the initial and final equilibrium states of the system, one needs only to conceive of a single convenient reversible path between the two states and integrate dq/T for that path.
Try using the statistical mechanical definition of entropy to calculate the change in entropy of a real gas between two thermodynamic equilibrium states.

Chet
In "dq/T", does "q" stand for, as it did in my statistical mechanics courses, the thermal energy of the system in question, call it "SYS"? If so, then dq/T is dS, the change in entropy of SYS, so in, for example, a process such as slow compression of a gas in a cylinder by a piston, which we will call "SYS", which would be a reversible process, the environment external to SYS, which supplies the mechanical energy for compression, can have zero entropy change, so dq and so dS must be zero (assuming T>0), otherwise the entropy change of SYS together with the environment would be non-zero, so positive--it could hardly be negative--so the process wouldn't be reversible. The point of this is that you said that in all reversible paths between the initial and final equilibrium states of a system give exactly the same (maximum) value of the integral of dq/T, which is the total entropy change of SYS together with the environment in our example, so all other paths between the initial and final states must give less than or equal to zero entropy change, and a change less than zero would violate the second law, so all paths must give zero entropy change, so no paths which increase the entropy can exist, which is obviously false. Was there a typo in both 3. and 4. in your article, and it should have been (minimum)? The only other likely possibility that I can think of right now is that your dq = my -dq.
 
  • #53
Tahira Firdous said:
You have taken internal pressure times change in volume in the work equation with a positive convention. In the YouTube lectures I am learning from, they take work as external pressure times change in volume with a negative convention; in this way work done on the system becomes positive, but J. M. Smith takes work done by the system as positive. So can you please explain some logical reason behind these conventions, and also the roles of the internal and external pressures in the work equation?
Thanks
Some people take work done on the system by the surroundings as positive, and some people take work done by the system on the surroundings as positive. Of course, this results in different signs for the work term in the expression of the first law. In engineering, we take work done by the system on the surroundings as positive. Chemists often take work done on the system by the surroundings as positive.
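A minimal Python sketch of how the two bookkeeping conventions describe the same physics (the numbers are arbitrary, chosen for an isothermal ideal-gas step where ##\Delta U = 0##):

# Work done BY the system (engineering / J. M. Smith convention):
Q = 500.0                          # J, heat added to the system
W_by_system = 500.0                # J
dU_engineering = Q - W_by_system   # first law: dU = Q - W

# Work done ON the system (common chemistry convention):
W_on_system = -W_by_system
dU_chemistry = Q + W_on_system     # first law: dU = Q + W

print(dU_engineering, dU_chemistry)  # 0.0 0.0 -- identical result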
 
  • #54
fox26 said:
In "dq/T", does "q" stand for, as it did in my statistical mechanics courses, the thermal energy of the system in question, call it "SYS"?
I stated very clearly in the article that q represents the heat flowing into the system across its boundary, from the surroundings to the system. The T in the equation is the temperature at this boundary.

What you are calling the thermal energy of the system, I would refer to as its internal energy. But, the internal energy is not what appears in the definition of the entropy change.

If so, then dq/T is dS, the change in entropy of SYS. Consider, for example, a process such as slow compression of a gas in a cylinder by a piston, with the gas as "SYS"; this would be a reversible process. The environment external to SYS, which supplies the mechanical energy for compression, can have zero entropy change, so dq, and hence dS, must be zero (assuming T > 0); otherwise the entropy change of SYS together with the environment would be non-zero, so positive--it could hardly be negative--and the process wouldn't be reversible. The point of this is that you said all reversible paths between the initial and final equilibrium states of a system give exactly the same (maximum) value of the integral of dq/T, which in our example is the total entropy change of SYS together with the environment. So all other paths between the initial and final states must give less than or equal to zero entropy change, and a change less than zero would violate the second law, so all paths must give zero entropy change, so no paths which increase the entropy can exist, which is obviously false. Was there a typo in both 3. and 4. in your article, and should it have been (minimum)? The only other likely possibility that I can think of right now is that your dq = my -dq.

I don't understand what you are trying to say here. Maybe it would help to give a specific focus problem to illustrate what you are asking.
 
  • #55
Chestermiller said:
I stated very clearly in the article that q represents the heat flowing into the system across its boundary, from the surroundings to the system. The T in the equation is the temperature at this boundary.

What you are calling the thermal energy of the system, I would refer to as its internal energy. But, the internal energy is not what appears in the definition of the entropy change.
I don't understand what you are trying to say here. Maybe it would help to give a specific focus problem to illustrate what you are asking.

Chet,
I didn't read your article before entering my post, just read your 5 listed points in reply to Khashishi, but a little while ago I did look at its first part. I found, of course, that what I called "thermal energy", q, is not, despite what you said, what you called the "internal energy", which according to your introduction includes both what I called thermal (heat) energy and mechanical energy, as it usually is meant to include. Also I found that you defined "equilibrium" so that even the atmosphere of Earth at a uniform temperature and completely still would not be in equilibrium, because of the pressure variation with altitude. The situation that I was concerned with is stated in the first sentence of my previous post. For my last statement of what your 5 points implied, instead of "entropy", I should have had "the integral of dq/T". The main point where I disagreed with you, apparently, is in the definition of "equilibrium", and so of "state" of the system. About the only thing you would consider to be a system in equilibrium is a sealed container with a gas, absolutely still, at uniform pressure and temperature, floating in space in free fall, with the state of the system, for a given composition and amount of gas, specified completely by its temperature and pressure, as DrDu said. Then its entropy, given the gas, is determined by that temperature and pressure, and I am willing to believe that Clausius did, by calculating many examples, almost show, except for paths involving such things as mechanical shock excitation of the gas, what you claimed he did show, for such a system.

However, defining entropy from just changes in entropy isn't possible; a starting point whose entropy is known is necessary. Can this be, say, empty space, with zero entropy (classically)? Also, if the second law, that the entropy of a closed system (by this I, and most other people, mean what you mean by an "isolated system") never (except extremely rarely, for macroscopic systems) decreases, is to have universal applicability, it must be possible to define "the entropy" of any system, even ones far from equilibrium, in your or more general senses of "equilibrium". How can this be done? In particular, why the entropy of the entire universe, or very large essentially closed portions of it, always increases, or at least never decreases, is now a fairly hot topic. Do you think this is a meaningful question?
 
  • #56
fox26 said:
Chet,
I didn't read your article before entering my post, just read your 5 listed points in reply to Khashishi, but a little while ago I did look at its first part. I found, of course, that what I called "thermal energy", q, is not, despite what you said, what you called the "internal energy", which according to your introduction includes both what I called thermal (heat) energy and mechanical energy, as it usually is meant to include.
Not so. Internal energy is a physical property of a material (independent of process path, heat and work), and heat and work depend on process path.
Also I found that you defined "equilibrium" so that even the atmosphere of Earth at a uniform temperature and completely still would not be in equilibrium, because of the pressure variation with altitude.
Not correct. The atmosphere at a uniform temperature and completely still would be in equilibrium even with pressure variation. The form of the first law equation I gave, for simplicity, omitted the change in potential energy of the system.

The situation that I was concerned with is stated in the first sentence of my previous post. For my last statement of what your 5 points implied, instead of "entropy", I should have had "the integral of dq/T". The main point where I disagreed with you, apparently, is in the definition of "equilibrium", and so of "state" of the system. About the only thing you would consider to be a system in equilibrium is a sealed container with a gas, absolutely still, at uniform pressure and temperature, floating in space in free fall, with the state of the system, for a given composition and amount of gas, specified completely by its temperature and pressure, as DrDu said. Then its entropy, given the gas, is determined by that temperature and pressure, and I am willing to believe that Clausius did, by calculating many examples, almost show, except for paths involving such things as mechanical shock excitation of the gas, what you claimed he did show, for such a system.

However, defining entropy from just changes in entropy isn't possible; a starting point whose entropy is known is necessary. Can this be, say, empty space, with zero entropy (classically)? Also, if the second law, that the entropy of a closed system (by this I, and most other people, mean what you mean by an "isolated system") never (except extremely rarely, for macroscopic systems) decreases, is to have universal applicability, it must be possible to define "the entropy" of any system, even ones far from equilibrium, in your or more general senses of "equilibrium". How can this be done? In particular, why the entropy of the entire universe, or very large essentially closed portions of it, always increases, or at least never decreases, is now a fairly hot topic. Do you think this is a meaningful question?
I still don't understand what you are asking or saying. Why don't you define a specific problem that we can both solve together? Define a problem that you believe would illustrate what you are asking. Otherwise, I don't think I can help you, and we will just have to agree to disagree.
 
  • #57
Chestermiller said:
Not so. Internal energy is a physical property of a material (independent of process path, heat and work), and heat and work depend on process path.

Not correct. The atmosphere at a uniform temperature and completely still would be in equilibrium even with pressure variation. The form of the first law equation I gave, for simplicity, omitted the change in potential energy of the system. I still don't understand what you are asking or saying. Why don't you define a specific problem that we can both solve together? Define a problem that you believe would illustrate what you are asking. Otherwise, I don't think I can help you, and we will just have to agree to disagree.

Is this not possible (classically, ignoring the internal energy of atoms and molecules and the relativistic rest-mass equivalent ##E = mc^2## energy): Total internal energy E of a closed (in my sense) system, in the center-of-mass frame = mechanical (macroscopic, including macroscopic kinetic and internal potential energy) energy + thermal (microscopic kinetic) energy? That is what I meant and, when I wrote my first comment, thought you meant, by "mechanical" and "thermal" energy. (I used "heat", non-precisely, in parentheses after "thermal" in my second comment to try to indicate the meaning of "thermal" just because you seemed, in your reply to my first comment, to think my "thermal energy" meant the total internal energy of the system, which of course it didn't.) My comment that the atmosphere of the Earth would not be a system in equilibrium under my stated conditions, according to your definition of "thermodynamic equilibrium state", follows from your definition of that in the first sentence after the second bold subheading "First Law of Thermodynamics" in your article. I agreed, in the last sentence of the first paragraph of my second comment (this is my third comment), with your statements 3 and 4, of 5 total, in your reply to Khashishi ("and" in 3 should be "which"). Two other specific problems are stated in the second and last paragraphs of my second comment.
 
  • #58
fox26 said:
Is this not possible (classically, ignoring the internal energy of atoms and molecules and the relativistic rest-mass equivalent ##E = mc^2## energy): Total internal energy E of a closed (in my sense) system, in the center-of-mass frame = mechanical (macroscopic, including macroscopic kinetic and internal potential energy) energy + thermal (microscopic kinetic) energy? That is what I meant and, when I wrote my first comment, thought you meant, by "mechanical" and "thermal" energy. (I used "heat", non-precisely, in parentheses after "thermal" in my second comment to try to indicate the meaning of "thermal" just because you seemed, in your reply to my first comment, to think my "thermal energy" meant the total internal energy of the system, which of course it didn't.) My comment that the atmosphere of the Earth would not be a system in equilibrium under my stated conditions, according to your definition of "thermodynamic equilibrium state", follows from your definition of that in the first sentence after the second bold subheading "First Law of Thermodynamics" in your article. I agreed, in the last sentence of the first paragraph of my second comment (this is my third comment), with your statements 3 and 4, of 5 total, in your reply to Khashishi ("and" in 3 should be "which"). Two other specific problems are stated in the second and last paragraphs of my second comment.
Huh? From what you have written, I don't even really know whether we are disagreeing about anything. Are we?

By a specific problem, what I was asking for was not something general, such as systems you have only alluded to, but for a problem with actual numbers for temperatures, pressures, masses, volumes, forces, stresses, strains, etc. Do you think you can do that? If not, then we're done here. I'm on the verge of closing this thread.
 
  • #59
Chestermiller said:
Huh? From what you have written, I don't even really know whether we are disagreeing about anything. Are we?

By a specific problem, what I was asking for was not something general, such as systems you have only alluded to, but for a problem with actual numbers for temperatures, pressures, masses, volumes, forces, stresses, strains, etc. Do you think you can do that? If not, then we're done here. I'm on the verge of closing this thread.

I asked general questions because those were what I was interested in, not a specific calculation. You mostly made general statements, instead of specific calculations, in your article and answers to replies, which often were themselves general. However, if you won't answer general questions from me, here's a specific one, even though it's a particular case of the first general question in the last paragraph of my last previous reply:

Suppose a closed (in your sense) system SYS in state S1 consists of a gas of one kilogram of hydrogen molecules in equilibrium at 400 K in a cubical container one meter on a side; I leave it to you to calculate its internal pressure approximately, if you wish, using the ideal gas law. How can its entropy be calculated? Integrating dq/T over the path of a reversible process going from some other state S2 of SYS to S1 can give the change of entropy Δentropy(S2,S1) caused by the process, and entropy(S1) = entropy(S2) + Δentropy(S2,S1), but what is entropy(S2), and how can that be determined by thermodynamic considerations alone, without invoking statistical mechanical ones? You did state somewhere that some important person in thermodynamics, I don't remember who, so call him "X" (maybe it was Clausius), had determined that the entropy of any system consisting of matter (in equilibrium) at absolute zero would be zero, so letting S2 = SYS at absolute zero, we would have entropy(S2) = 0, so the problem would be solved, except for the question of how X had determined that entropy(S2), or any other system at absolute zero, = 0, using only thermodynamic considerations. You wrote "determined", so I assume he didn't do this just by taking entropy(any system at absolute zero) = 0 as an additional law of thermodynamics, or part of the thermodynamic definition of "entropy", but instead calculated it. How? It can be done by statistical mechanical considerations (for the SM idea of entropy), but you presumably would want to do it by thermodynamics alone.
 
  • #60
fox26 said:
I asked general questions because those were what I was interested in, not a specific calculation. You mostly made general statements, instead of specific calculations, in your article and answers to replies, which often were themselves general. However, if you won't answer general questions from me, here's a specific one, even though it's a particular case of the first general question in the last paragraph of my last previous reply:

Suppose a closed (in your sense) system SYS in state S1 consists of a gas of one kilogram of hydrogen molecules in equilibrium at 400 K in a cubical container one meter on a side; I leave it to you to calculate its internal pressure approximately, if you wish, using the ideal gas law. How can its entropy be calculated? Integrating dq/T over the path of a reversible process going from some other state S2 of SYS to S1 can give the change of entropy Δentropy(S2,S1) caused by the process, and entropy(S1) = entropy(S2) + Δentropy(S2,S1), but what is entropy(S2), and how can that be determined by thermodynamic considerations alone, without invoking statistical mechanical ones? You did state somewhere that some important person in thermodynamics, I don't remember who, so call him "X" (maybe it was Clausius), had determined that the entropy of any system consisting of matter (in equilibrium) at absolute zero would be zero, so letting S2 = SYS at absolute zero, we would have entropy(S2) = 0, so the problem would be solved, except for the question of how X had determined that entropy(S2), or any other system at absolute zero, = 0, using only thermodynamic considerations. You wrote "determined", so I assume he didn't do this just by taking entropy(any system at absolute zero) = 0 as an additional law of thermodynamics, or part of the thermodynamic definition of "entropy", but instead calculated it. How? It can be done by statistical mechanical considerations (for the SM idea of entropy), but you presumably would want to do it by thermodynamics alone.
Wow. Thank you for finally clarifying your question.

You are asking how the absolute entropy of a system can be determined. This is covered by the 3rd Law of Thermodynamics. I never mentioned the 3rd Law of Thermodynamics in my article. You indicated that, in my article, I said that "some important person in thermodynamics, I don't remember who, so call him "X" (maybe it was Clausius), had determined that the entropy of any system consisting of matter (in equilibrium) at absolute zero would be zero, so letting S2 = SYS at absolute zero, we would have entropy(S2) = 0." I never said this in my article or in any of my comments. If you think so, please point out where. My article only deals with relative changes in entropy from one thermodynamic equilibrium state to another.
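For reference, the standard third-law route to an absolute entropy (not something the article relies on) integrates the heat capacity up from absolute zero and adds a term for each phase transition along the way:
$$S(T)=\int_0^T \frac{C_P(T')}{T'}\,dT'+\sum_{\text{transitions}}\frac{\Delta H_{trans}}{T_{trans}}$$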
 
