- #1 (Yashbhatt, thread starter)

In summary: in thermodynamics, the change in entropy is defined as the integral of dq/T over a reversible path, and entropy is related to the amount of randomness in a system. The entropy increase is proportional to the amount of heat added to the system and inversely proportional to the temperature at which it is added. The first law of thermodynamics states that the change in internal energy of a system equals the heat added minus the work done by the system. The second law of thermodynamics introduces a third integral, involving the heat transfer rate, to provide a precise mathematical statement: the value of that integral over any process path is bounded above by the entropy change.


- #2 (Bystander)

How is entropy __defined__ in your textbook?

- #3 (Yashbhatt)

Bystander said: How is entropy defined in your textbook?

The standard way: the amount of randomness of a system. Then it goes on to say that if we add more heat, the particles will fly around more quickly and thus their randomness will increase, so S is proportional to Q. And then it says, "Heat added to a system at a lower temperature causes a higher entropy increase than heat added to the same system at a higher temperature," so S is inversely proportional to T.

- #4 (Bystander)

Yashbhatt said: The standard way: the amount of randomness of a system. . . .

Entropy is defined as the integral of dq/T from absolute zero to the temperature of the system; it was defined that way nearly two centuries ago, and it will remain that way for a long time to come. For exchange of heat Q between a system and its surroundings at some constant temperature T, the entropy change is simply Q/T.
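When the heat is added over a range of temperatures rather than at constant T, the integral has to be evaluated. Here is a minimal numerical sketch (the substance, mass, and constant specific heat are illustrative assumptions, not from this thread): for dq = m c dT, the numerical integral of dq/T matches the closed form m c ln(T2/T1).

```python
import math

def entropy_change(m, c, T1, T2, steps=100000):
    """Numerically integrate dS = m*c*dT/T for heating from T1 to T2 (kelvin)."""
    dT = (T2 - T1) / steps
    S = 0.0
    for i in range(steps):
        T = T1 + (i + 0.5) * dT  # midpoint rule
        S += m * c * dT / T
    return S

# Heating 1 kg of water (c taken as 4186 J/(kg K)) from 300 K to 350 K:
numeric = entropy_change(1.0, 4186.0, 300.0, 350.0)
closed_form = 1.0 * 4186.0 * math.log(350.0 / 300.0)  # m*c*ln(T2/T1)
print(numeric, closed_form)  # the two agree to high precision
```

Note how the same dq contributes less to the sum when T is larger, which is exactly the "inversely proportional to T" statement from the textbook.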

- #5 (Chestermiller)

How does the statistical thermo relationship for entropy (or entropy change) reconcile with the classical thermo relationship ∫dq/T?

Chet

- #6 (anorlunda)

- #7 (Bystander)

Chestermiller said: "how does the statistical thermo relationship for entropy (or entropy change) reconcile with the classical thermo relationship ∫dq/T?"

Had there been mention of "partition function(s)" rather than the word "randomness," I might have pursued that line of discussion. Being something of a "thermosaur," it struck me as more useful to stick to the original definition and attempt to clarify it for the OP.

- #8 (Yashbhatt)

I am still confused.

- #9 (Chestermiller)

Yashbhatt said: I am still confused.

Maybe this write-up will help:

FIRST LAW OF THERMODYNAMICS

Suppose that we have a closed system that at initial time t_i is in an initial equilibrium state, with internal energy U_i, and at a later time t_f is in a new equilibrium state with internal energy U_f. The transition from the initial equilibrium state to the final equilibrium state is brought about by imposing a time-dependent heat flow across the interface between the system and the surroundings, and a time-dependent rate of doing work at that interface. Let [itex]\dot{q}(t)[/itex] represent the rate of heat addition across the interface at time t, and let [itex]\dot{w}(t)[/itex] represent the rate at which the system does work on the surroundings at the interface at time t. According to the first law (basically conservation of energy),

[tex]\Delta U=U_f-U_i=\int_{t_i}^{t_f}{(\dot{q}(t)-\dot{w}(t))dt}=Q-W[/tex]

where Q is the total amount of heat added and W is the total amount of work done by the system on the surroundings at the interface.

The time variation of [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] between the initial and final states uniquely characterizes the so-called process path. There are an infinite number of possible process paths that can take the system from the initial to the final equilibrium state. The only constraint is that Q-W must be the same for all of them.
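The constraint that Q - W is the same for all process paths can be illustrated with a toy sketch (the Q and W values below are made-up numbers, not from any specific system):

```python
# Two hypothetical process paths between the same pair of equilibrium states.
# Q and W differ from path to path, but Q - W (the change in internal energy,
# a state function) must be identical for both.
path_a = {"Q": 500.0, "W": 200.0}   # more heat in, more work out (joules)
path_b = {"Q": 350.0, "W": 50.0}    # less heat in, less work out

dU_a = path_a["Q"] - path_a["W"]
dU_b = path_b["Q"] - path_b["W"]
print(dU_a, dU_b)  # both 300.0: delta-U is path-independent, Q and W are not
```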

If a process path is irreversible, then the temperature and pressure within the system are inhomogeneous (i.e., non-uniform, varying with spatial position), and one cannot define a unique pressure or temperature for the system (except at the initial and the final equilibrium states). However, the pressure and temperature at the interface can be measured and controlled, using the surroundings to impose the temperature and pressure boundary conditions that we desire. Thus, T_I(t) and P_I(t) can be used to impose the process path that we desire. Alternately, and even more fundamentally, we can directly control, by well-established methods, the rate of heat flow and the rate of doing work at the interface ([itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex]).

Both for reversible and irreversible process paths, the rate at which the system does work on the surroundings is given by:

[tex]\dot{w}(t)=P_I(t)\dot{V}(t)[/tex]

where [itex]\dot{V}(t)[/itex] is the rate of change of system volume at time t. However, if the process path is reversible, the pressure P within the system is uniform, and

[itex]P_I(t)=P(t)[/itex] (reversible process path)

Therefore, [itex]\dot{w}(t)=P(t)\dot{V}(t)[/itex] (reversible process path)

Another feature of reversible process paths is that they are carried out very slowly, so that [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] are both very close to zero over the entire process path. However, the amount of time between the initial equilibrium state and the final equilibrium state (t_f - t_i) becomes exceedingly large. In this way, Q - W remains constant and finite.

SECOND LAW OF THERMODYNAMICS

In the previous section, we focused on the infinite number of process paths that are capable of taking a closed thermodynamic system from an initial equilibrium state to a final equilibrium state. Each of these process paths is uniquely determined by specifying the heat transfer rate [itex]\dot{q}(t)[/itex] and the rate of doing work [itex]\dot{w}(t)[/itex] as functions of time at the interface between the system and the surroundings. We noted that the cumulative amount of heat transfer and the cumulative amount of work done over an entire process path are given by the two integrals:

[tex]Q=\int_{t_i}^{t_f}{\dot{q}(t)dt}[/tex]

[tex]W=\int_{t_i}^{t_f}{\dot{w}(t)dt}[/tex]

In the present section, we will be introducing a third integral of this type (involving the heat transfer rate [itex]\dot{q}(t)[/itex]) to provide a basis for establishing a precise mathematical statement of the Second Law of Thermodynamics.

The discovery of the Second Law came about in the 19th century, and involved contributions by many brilliant scientists. There have been many statements of the Second Law over the years, couched in complicated language and multi-word sentences, typically involving heat reservoirs, Carnot engines, and the like. These statements have been a source of unending confusion for students of thermodynamics for over a hundred years. What has been sorely needed is a precise mathematical definition of the Second Law that avoids all the complicated rhetoric. The sad part about all this is that such a precise definition has existed all along. The definition was formulated by Clausius back in the 1800's.

Clausius wondered what would happen if he evaluated the following integral over each of the possible process paths between the initial and final equilibrium states of a closed system:

[tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}[/tex]

where T_I(t) is the temperature at the interface with the surroundings at time t. He carried out extensive calculations on many systems undergoing a variety of both reversible and irreversible paths and discovered something astonishing. He found that, for any closed system, the values calculated for the integral over all the possible reversible and irreversible paths (between the initial and final equilibrium states) were not arbitrary; instead, there was a unique upper bound (maximum) to the value of the integral. Clausius also found that this result was consistent with all the "word definitions" of the Second Law.

Clearly, if there was an upper bound for this integral, this upper bound had to depend only on the two equilibrium states, and not on the path between them. It must therefore be regarded as a point function of state. Clausius named this point function Entropy.

But how could the value of this point function be determined without evaluating the integral over every possible process path between the initial and final equilibrium states to find the maximum? Clausius made another discovery. He determined that, out of the infinite number of possible process paths, there existed a well-defined subset, each member of which gave the same maximum value for the integral. This subset consists of what we today call the reversible process paths. So, to determine the change in entropy between two equilibrium states, one must first conceive of a reversible path between the states and then evaluate the integral. Any other process path will give a value for the integral lower than the entropy change.

So, mathematically, we can now state the Second Law as follows:

[tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}\leq\Delta S=\int_{t_i}^{t_f} {\frac{\dot{q}_{rev}(t)}{T(t)}dt}[/tex]

where [itex]\dot{q}_{rev}(t)[/itex] is the heat transfer rate for any of the reversible paths between the initial and final equilibrium states, and T(t) is the system temperature at time t (which, for a reversible path, is equal to the temperature at the interface with the surroundings). This constitutes a precise mathematical statement of the Second Law of Thermodynamics.
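The inequality can be checked numerically for a concrete case, an ideal gas expanding isothermally (the mole number, temperature, and volumes below are illustrative assumptions): a reversible isothermal expansion gives ΔS = nR ln(V2/V1), while an irreversible expansion against constant external pressure, between the same two end states, transfers less heat, so its integral falls short of ΔS.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def clausius_check(n=1.0, T=300.0, V1=0.010, V2=0.020):
    # Reversible isothermal expansion: dS = dq_rev/T integrates to n*R*ln(V2/V1)
    dS = n * R * math.log(V2 / V1)

    # Irreversible path: expand against constant external pressure equal to the
    # final pressure. Isothermal ideal gas, so Q = W = P_ext*(V2 - V1), and the
    # Clausius integral is just Q/T (the interface is held at T throughout).
    P_ext = n * R * T / V2
    I_irrev = P_ext * (V2 - V1) / T

    return I_irrev, dS

I_irrev, dS = clausius_check()
print(I_irrev <= dS)  # True: the irreversible path's integral is below delta-S
```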

- #10 (Yashbhatt)

Chestermiller said: Maybe this write-up will help: . . . [the FIRST LAW / SECOND LAW write-up from post #9, quoted in full]

Thanks for the detailed post. But was entropy defined this way just because someone found something interesting while evaluating an integral? I want to know intuitively why the entropy increase is inversely proportional to the temperature at which the heat is added.

- #11 (anorlunda)

Bystander said: Entropy is defined as the integral of dq/T . . .

Yashbhatt, I believe that the source of your confusion is that you envision increasing temperature while holding everything else constant. That would make entropy decrease. But putting heat in increases both Q and T (making dQ positive), and it is the integral of the ratio dQ to T that is entropy.

- #12 (Chestermiller)

Yashbhatt said: But was entropy defined this way just because someone found something interesting while evaluating an integral? I want to know intuitively why the entropy increase is inversely proportional to the temperature at which the heat is added.

I don't know how to answer this except to say that the Second Law evolved as a consequence of a huge body of empirical observations. The thermodynamic definition of entropy followed directly from these observations (so that they could be captured in a concise mathematical form) and from the recognition that a point thermodynamic function of this form must exist.

Also note that, if dQ/T is being used to evaluate entropy change between two thermodynamic states, the path between these two states must be reversible. Not just any path will do.

Chet

- #13 (Yashbhatt)

anorlunda said: . . . it is the integral of the ratio dQ to T that is entropy.

But what if it is defined normally, without the integral? That's the way it is defined in my book.

My teacher told me something about phase change. She said that if heat is added to water at a higher temperature, it will vaporize, and this causes a smaller entropy increase than if the heat were added in such a way that the state remained the same.

Is that relevant?

- #14 (Khashishi)

##S = k_B \log n##, where ##n## is the number of microscopic quantum states available to a system with known bulk properties (such as total energy, volume), and ##k_B## is just a constant used to match the definition to the classical thermodynamics definition.

Basically, with a macroscopic sized system, (say a gas in a cylinder), you typically have the total energy relatively constant, but you don't know the energy of each particle since they are constantly colliding with each other and trading energy. The entropy tells you how many configurations of particle energies there are for a given total energy (and volume, etc...).

You also need the definition of temperature, which is less obvious than you think. In thermodynamics, the definition is

##\frac{1}{T} = \left(\frac{\partial{S}}{\partial{U}}\right)_{V,N}##

Using this definition, the answer to your original question is quite obvious. You might have come across a definition of temperature as an average energy per degree of freedom, such that ##\left<E\right>= k_B T/2##. This can be derived from above using kinetic theory, but this definition is not as fundamental and requires some idealized assumptions.
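This can be made concrete with a toy statistical model, an Einstein solid (a standard textbook example, not anything discussed in this thread): counting microstates directly shows that one added quantum of energy raises S less when the system already holds a lot of energy, i.e., when it is hotter. This is the microscopic version of "heat added at higher T produces a smaller entropy increase."

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(N, q):
    # Number of ways to distribute q identical energy quanta among N oscillators
    # (stars-and-bars count for an Einstein solid)
    return math.comb(q + N - 1, q)

def entropy(N, q):
    # Boltzmann entropy S = kB * ln(Omega)
    return kB * math.log(multiplicity(N, q))

N = 50  # oscillators (arbitrary illustrative size)
for q in (10, 100, 1000):
    dS = entropy(N, q + 1) - entropy(N, q)  # entropy gain from one extra quantum
    print(q, dS)
```

The printed dS shrinks steadily as q grows: the same quantum of "heat" buys less entropy in an already energetic (hot) system, consistent with 1/T = (∂S/∂U).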

- #15 (Yashbhatt)

anorlunda said: Yashbhatt, I believe that the source of your confusion is that you envision increasing temperature while holding everything else constant. . . .

Chestermiller said: I don't know how to answer this except to say that the Second Law evolved as a consequence of a huge body of empirical observations. . . .

Khashishi said: ##S = k_B \log n##, where ##n## is the number of microscopic quantum states available to a system with known bulk properties . . .

Now, I have got some vague idea about this after reading this answer on Quora (http://www.quora.com/Why-does-entropy-depend-on-the-temperature-at-which-heat-is-added-to-the-system):

Suppose entropy were defined just in terms of heat; then heat would flow whenever there were a heat difference, rather than a temperature difference. But that is not so, because substances have different heat capacities, and so there needs to be a temperature difference. So, we need a term that involves T.

Moreover, the heat capacity (or specific heat) of a substance increases with temperature, and so the temperature rise when heat is added at a higher temperature is smaller than when the same heat is added at a lower temperature.

Is this correct?


- #16 (Chestermiller)

Yashbhatt said: Now, I have got some vague idea about this after reading this answer on Quora . . . Is this correct?

Personally, to me, it doesn't make sense.

Chet

