Why is entropy defined as a ratio of heat to temperature?

  • #1
Ale_Rodo
Can we make sense of the formula for entropy the way we do for density ("quantity of mass per unit volume")? What's the sense of Q/T? Couldn't it be something else?

Of course it's probably a 'me-problem', but I haven't studied thermodynamics deeply yet and was wondering what entropy actually means. It is often described as a quantity defining how much randomness a system is subject to, how probable it is for a system to evolve in a certain way, or something in between these two, but how do you get from "Q/T" to randomness or probability? Is it possible to give the most general example of why Q/T makes sense?

I am looking for a demonstration of how it came into existence in that form, or just an immediate logical example, the kind professors usually give when explaining what mass density is, if possible.

I think you already got what I'm trying to say at this point, but I'm going to clarify by using this very example of density: ρ = m/V indicates how much mass there is per unit volume, and it's easy to imagine; in fact, we could roughly say that the denser something is, the more atoms per unit volume we'll have.

Can we do the same thing for S = Q/T? I have some ideas of my own, but they don't really have anything to do with randomness or probability, and of course I'm far from being an expert in thermodynamics.

Thank you very much in advance.
 
  • Like
Likes Delta2
  • #2
If Q = TS, then by differentiation dQ = T dS + S dT.
But the definition of entropy is dQ = T dS, i.e. dS = dQ/T, not S = Q/T.
So the inverse of temperature is given by
[tex]\frac{1}{T}=\frac{dS}{dQ}[/tex]
The RHS is the rate of increase of entropy per unit of heat energy given to the system; 1/T is the parameter that expresses this ratio.

Say two systems 1 and 2 are brought into contact and exchange heat energy. By energy conservation,
[tex]dQ_1+dQ_2=0[/tex]
The total entropy of the systems changes by
[tex]d(S_1+S_2)=\frac{1}{T_1}\,dQ_1+\frac{1}{T_2}\,dQ_2 =\left(\frac{1}{T_1}-\frac{1}{T_2}\right)dQ_1 > 0[/tex]
by the second law of thermodynamics.
When ##T_1 > T_2##: ##dQ_1 <0, \ dQ_2>0##.
When ##T_1 < T_2##: ##dQ_1 >0,\ dQ_2<0##.
So heat flows from the high-temperature system to the low-temperature system, and in equilibrium
##T_1=T_2, \ dQ_1=-dQ_2=0##.
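To put rough numbers on this argument, here is a minimal numeric sketch (my own illustration with assumed temperatures, not part of the derivation above):

[code]
# Entropy produced when a small amount of heat dQ leaks from a hot
# body at T1 to a cold body at T2 (dQ1 = -dQ, dQ2 = +dQ).
T1 = 400.0   # K, hot body (assumed)
T2 = 300.0   # K, cold body (assumed)
dQ = 1.0     # J of heat transferred

dS_total = (-dQ) / T1 + dQ / T2   # d(S1+S2) = dQ1/T1 + dQ2/T2
print(dS_total)  # ~ +8.3e-4 J/K, positive as the second law requires
[/code]

Flip the sign of dQ (heat flowing cold to hot) and the total entropy change comes out negative, which is exactly what the second law forbids.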
 
  • Informative
Likes Delta2
  • #3
Your answer is great, but I feel like it's a bit far from what I was trying to achieve.

For instance,
anuttarasammyak said:
If Q = TS, then by differentiation dQ = T dS + S dT.
but I was trying to imagine what brought scientists to define entropy in the first place. This way, you're starting from a point just after entropy has been defined, and hence don't really question the 'shape' of its formula.

anuttarasammyak said:
The RHS is the rate of increase of entropy per unit of heat energy given to the system; 1/T is the parameter that expresses this ratio.
This part intrigues me, and I feel this is the route I was trying to walk to discover why entropy is defined that way. Still, it starts as if dS were something that has always existed in physics or that is easily understandable, like mass, which from a certain point of view doesn't need much explanation (at least at a simpler level).

So, in the end, if we imagine that we have to come up with the definition of entropy (we don't even have it clear yet, nor do we know it's going to be called 'S'), why would we choose it to be dS = dQ/T?

I know it is a very strange question and I hope I conveyed the message; it would be a blessing if anybody has this kind of answer.

I'm thankful for the answer by the way.
 
  • #4
From the differential form we get the definition of entropy,
[tex]S_2-S_1 =\int_1^2 \frac{dQ}{T}[/tex]
where the integration path from state 1 to state 2 is any reversible process.

Take the example of a high-temperature body and a low-temperature body brought into contact: after the heat exchange, all the bodies have the same temperature. The initial and the final state have the same amount of energy. Do you observe any difference between the state before contact and the state after contact? If yes, entropy is the parameter found to distinguish these states.

Countries A and B have the same amount of national wealth. In country A, 1% of the citizens own all the national wealth and 99% own nothing. In country B, all the citizens own an equal amount of the national wealth. If you were an economist, wouldn't you observe the difference between the countries and want to express it with some parameter?
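For the two-body example, the integral can be evaluated along a reversible path for each body separately. A minimal sketch (my own numbers; constant heat capacity C is an assumption):

[code]
import math

# Two identical bodies with constant heat capacity C equilibrate at
# T_f = (T_h + T_c)/2. Along a reversible path dS = C dT / T for each
# body, so S2 - S1 = C * ln(T2/T1).
C   = 1000.0   # J/K per body (assumed)
T_h = 400.0    # K (assumed)
T_c = 200.0    # K (assumed)
T_f = (T_h + T_c) / 2.0

dS_hot  = C * math.log(T_f / T_h)   # negative: the hot body loses entropy
dS_cold = C * math.log(T_f / T_c)   # positive: the cold body gains more
print(dS_hot + dS_cold)             # ~ +117.8 J/K
[/code]

The total comes out strictly positive even though the energy is unchanged, which is precisely the difference between the two states that entropy quantifies.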
 
  • Like
Likes Delta2
  • #6
S=Q/T

Q means how many joules, and T means how cold the place is where those joules are.

Let's say we have built a car that runs on the thermal energy of the atmosphere and of very cold water. The engine is ideal.

Charging this car so that it can run 1000 miles requires energy from the electric socket, and that energy is used by an ideal freezer. When the freezer is sucking joules out of water that is already very cold, many joules of electricity have to be used to suck out one joule of heat. So removing a lot of joules from a cold place requires a lot of energy. In this case, removing a lot of entropy requires a lot of energy.

Just in case anyone is skeptical: letting many joules of heat energy go through an ideal motor, which dumps into a very cool heat sink just the small amount of heat that was not converted to mechanical work, is the inverse of the aforementioned process of using a lot of energy to cool a very cool thing.
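To attach numbers to the freezer picture: an ideal (Carnot) refrigerator extracting heat Q_c from a cold reservoir at T_c and rejecting it to surroundings at T_h needs work W = Q_c(T_h − T_c)/T_c. A quick sketch with assumed temperatures:

[code]
# Work needed per joule of heat removed by an ideal (Carnot) refrigerator.
T_h = 300.0   # K, surroundings (assumed)
for T_c in (250.0, 100.0, 10.0):   # progressively colder reservoirs
    work_per_joule = (T_h - T_c) / T_c
    print(T_c, work_per_joule)
# 250 K -> 0.2 J of work per joule removed; 10 K -> 29 J per joule.
# The colder the place, the more it costs to pull each joule out.
[/code]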
 
  • #7
Ale_Rodo said:
Can we make sense of the formula for entropy the way we do for density ("quantity of mass per unit volume")? What's the sense of Q/T? Couldn't it be something else?

Of course it's probably a 'me-problem', but I haven't studied thermodynamics deeply yet and was wondering what entropy actually means. It is often described as a quantity defining how much randomness a system is subject to, how probable it is for a system to evolve in a certain way, or something in between these two, but how do you get from "Q/T" to randomness or probability? Is it possible to give the most general example of why Q/T makes sense?

I am looking for a demonstration of how it came into existence in that form, or just an immediate logical example, the kind professors usually give when explaining what mass density is, if possible.

I think you already got what I'm trying to say at this point, but I'm going to clarify by using this very example of density: ρ = m/V indicates how much mass there is per unit volume, and it's easy to imagine; in fact, we could roughly say that the denser something is, the more atoms per unit volume we'll have.

Can we do the same thing for S = Q/T? I have some ideas of my own, but they don't really have anything to do with randomness or probability, and of course I'm far from being an expert in thermodynamics.

Thank you very much in advance.
Actually you should write ΔS = Q/T or dS = dQ/T.

I've written a short text on exactly this topic, and I also recommend Chestermiller's article.

Anyway here's a link to mine:
https://www.researchgate.net/publication/338555515_Carnot_reversibility_and_entropy

The main points are that S is a state function and that dS = dQ/T for any reversible process.
 
  • #8
Philip Koeck said:
Actually you should write ΔS = Q/T or dS = dQ/T.
Of course you are right. Unfortunately I have this very odd problem: if I don't get the meaning of a formula, I tend to write it from memory until I finally wrap my head around it, which usually causes this kind of mistake.

By the way, I have given a very superficial look at your article (unfortunately I lack time these days), which, as far as I can judge as an undergrad, seems great. The only problem is that I haven't studied thermodynamics yet, as I said, so I can barely follow.

Although in a few weeks it might appear much clearer, at the moment I still have problems when I read statements such as:

" We now define a quantity (in the system) called entropy, S, that cannot be measured directly. We can only measure its change, which we define, for reversible processes only, as 𝑑𝑆 = 𝑑𝑄 𝑇 .
For a larger change during a process at constant temperature this gives: 𝛥𝑆 = 𝑄𝑇 . " .

Why do we have to define it, and how would anyone know, if he/she were to experience that cycle for the first time? I can't imagine someone seeing a full Carnot cycle and saying "yep, dS = dQ/T is exactly what I need".
As I'm trying to convey, I get why we needed entropy (for instance in chemistry: not all reactions happen just because systems tend to evolve towards lower energy), but I don't get its mathematical form.
Did someone notice dQ/T appearing too many times in too many thermodynamic equations? Why dQ/T and not, to say an absurdity, dQ·T or dQ·T²? Does it have a history, or did someone wake up one day and say, "I think this random function I dreamt of (dQ/T) is called 'entropy'. It is the 'guide' which leads systems to evolve in one way rather than another"?

I hope this doesn't sound as crazy as I think I sound myself. Thanks for your patience; I'm really struggling with this one.
 
  • #9
jartsa said:
S=Q/T

Q means how many joules, and T means how cold the place is where those joules are.
This is interesting! My brain automatically found some examples but I'm now asking you if the ideas I came up with make sense, so please let me know.

So, let's say we know entropy is a measure of how much disorder has been caused in the system (I think that's indeed how it's usually first presented).
Let's also say, hoping to be correct, that if the system examined were the whole universe, then the whole universe tends to a state of maximum disorder.
In this case, it would mean that no usable energy is available; matter is still there, but the stars went out and the big-enough masses that could light one last star are too far away from each other.
Now, without radiation coming from stars or any other source, the temperature should be at or near absolute zero.

If somehow we were still there and could measure the total entropy change ΔS from, for example, today to that far time when the universe is just darkness, we would have to know the amount of heat exchanged during this time.

Let's say we know this amount Q for unspecified reasons. Now, if we compare that Q with the temperature measured at the end of time, we would get an incredibly high number (for convenience take T = 1 K, so that we won't have definition problems at T = 0 K).
Would ΔS = Q/T give us a measure of how much the exchanged heat influenced the universe's temperature?
And in general:
- highest ΔS means "OK, you exchanged heat, but you can't really do anything anymore"
- lowest ΔS means "a few joules are enough to suddenly raise the temperature"?

It is an incredibly rough idea, but it could partially satisfy my question if it happens to be right-ish.
It's indeed an odd one, but did I at least grasp the concept, in your opinion?
 
  • #10
anuttarasammyak said:
From the differential form we get the definition of entropy,
[tex]S_2-S_1 =\int_1^2 \frac{dQ}{T}[/tex]
where the integration path from state 1 to state 2 is any reversible process.

Take the example of a high-temperature body and a low-temperature body brought into contact: after the heat exchange, all the bodies have the same temperature. The initial and the final state have the same amount of energy. Do you observe any difference between the state before contact and the state after contact? If yes, entropy is the parameter found to distinguish these states.

Countries A and B have the same amount of national wealth. In country A, 1% of the citizens own all the national wealth and 99% own nothing. In country B, all the citizens own an equal amount of the national wealth. If you were an economist, wouldn't you observe the difference between the countries and want to express it with some parameter?
Indeed, it starts to make much more sense than before. Thanks for that! Could I find another name for entropy, such as 'heat distribution', then? Would that make sense, or did I aim in the wrong direction?
 
  • Like
Likes anuttarasammyak
  • #12
Ale_Rodo said:
So, let's say we know entropy is a measure of how much disorder has been caused in the system (I think that's indeed how it's usually first presented).
Let's also say, hoping to be correct, that if the system examined were the whole universe, then the whole universe tends to a state of maximum disorder.
In this case, it would mean that no usable energy is available; matter is still there, but the stars went out and the big-enough masses that could light one last star are too far away from each other.
Now, without radiation coming from stars or any other source, the temperature should be at or near absolute zero.

If somehow we were still there and could measure the total entropy change ΔS from, for example, today to that far time when the universe is just darkness, we would have to know the amount of heat exchanged during this time.

Let's say we know this amount Q for unspecified reasons. Now, if we compare that Q with the temperature measured at the end of time, we would get an incredibly high number (for convenience take T = 1 K, so that we won't have definition problems at T = 0 K).
Would ΔS = Q/T give us a measure of how much the exchanged heat influenced the universe's temperature?
And in general:
- highest ΔS means "OK, you exchanged heat, but you can't really do anything anymore"
- lowest ΔS means "a few joules are enough to suddenly raise the temperature"?

As a non-cosmologist I must change your example to one with a very hot rock and an extremely cold rock, and a bag into which the two rocks are put and kept for a time long enough for the rocks to reach the same temperature.

Originally there were approximately no heat joules in the cold rock, as we purposely made it extremely cold. Afterwards, the heat joules that were originally in the hot rock are in a less hot rock.

Let's say the absolute temperature of the hot rock halved.

Before: S = Q/T
After: S = Q/(T/2)

Entropy doubled (because the number of joules stayed the same and the temperature of those joules halved).

Alternative wording of the story above: the hot rock produced random photons, the cold rock measured those photons, and the amount of information doubled during that process, or the amount of disorder, if we call random information disorder.
 
  • #13
Ale_Rodo said:
By the way, I have given a very superficial look at your article (unfortunately I lack time these days), which, as far as I can judge as an undergrad, seems great. The only problem is that I haven't studied thermodynamics yet, as I said, so I can barely follow.

Although in a few weeks it might appear much clearer, at the moment I still have problems when I read statements such as:

" We now define a quantity (in the system) called entropy, S, that cannot be measured directly. We can only measure its change, which we define, for reversible processes only, as 𝑑𝑆 = 𝑑𝑄 𝑇 .
For a larger change during a process at constant temperature this gives: 𝛥𝑆 = 𝑄𝑇 . " .

Why do we have to define it, and how would anyone know, if he/she were to experience that cycle for the first time? I can't imagine someone seeing a full Carnot cycle and saying "yep, dS = dQ/T is exactly what I need".
Ale_Rodo said:
As I'm trying to convey, I get why we needed entropy (for instance in chemistry: not all reactions happen just because systems tend to evolve towards lower energy), but I don't get its mathematical form.
Did someone notice dQ/T appearing too many times in too many thermodynamic equations? Why dQ/T and not, to say an absurdity, dQ·T or dQ·T²? Does it have a history, or did someone wake up one day and say, "I think this random function I dreamt of (dQ/T) is called 'entropy'. It is the 'guide' which leads systems to evolve in one way rather than another"?
I don't think you need a complete thermodynamics course to understand my text, but you will have to spend some time on it. If you do, I'm pretty sure it will help you.

The only background you need is the first law (I use the form Q = ΔU +W) and the efficiency of a heat engine.

I'm not so good at the history of science, but I think Carnot did something very much like you describe.
He looked at the efficiency of the Carnot process (equation 1 in my text) and said something like: "S&&t (pardon my French), there's a state function hidden in there."

Equation 1 is a dead giveaway: you can read it as ##Q_H/T_H+Q_C/T_C=0##.
All the changes of a state function in a cyclic process add up to zero!
Get it?

None of the alternative definitions of entropy change that you suggest adds up to zero in the Carnot process.
There is one other possibility: you could define ΔS as T/Q, but temperatures don't add up the way heats do, and what would dS (for a small change) be then?
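Here is a quick check of the "adds up to zero" point with assumed numbers for an ideal-gas Carnot cycle (my own sketch, not from the article; on the two isotherms Q = nRT ln r, and the adiabats force the same volume ratio r on both):

[code]
import math

n, R = 1.0, 8.314                  # mol, J/(mol K)
T_H, T_C, r = 500.0, 300.0, 2.0    # assumed temperatures and volume ratio

Q_H = n * R * T_H * math.log(r)    # heat absorbed at T_H (positive)
Q_C = -n * R * T_C * math.log(r)   # heat rejected at T_C (negative)

print(Q_H / T_H + Q_C / T_C)   # ~0 (to rounding): dQ/T closes the cycle
print(Q_H * T_H + Q_C * T_C)   # nonzero: Q*T does not behave like a state function
[/code]

Only the combination Q/T sums to zero around the cycle, which is the hint hidden in Carnot's efficiency formula.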

Read the text first, and whatever you need from a very basic textbook, and then send another message if you still need help.
 
  • Like
Likes vanhees71
  • #14
jartsa said:
Alternative wording of the story above: the hot rock produced random photons, the cold rock measured those photons, and the amount of information doubled during that process, or the amount of disorder, if we call random information disorder.
I get this one, and I think it's very helpful, but I still can't see why I should compare Q with T.
 
  • #15
Philip Koeck said:
I don't think you need a complete thermodynamics course to understand my text, but you will have to spend some time on it. If you do, I'm pretty sure it will help you.
It surely will, but at the moment I don't even have that knowledge unfortunately.

Philip Koeck said:
I'm not so good at the history of science, but I think Carnot did something very much like you describe.
He looked at the efficiency of the Carnot process (equation 1 in my text) and said something like: "S&&t (pardon my French), there's a state function hidden in there."
I guess it's the only way I can explain to myself how entropy was born, but I'll get what I need, read what I need to, and eventually come back. Thanks for the resources!
 
  • #16
Ale_Rodo said:
I get this one, and I think it's very helpful, but I still can't get why I should compare Q with T.

Well, maybe we can think that a large Q means that the hot rock emits many photons, and a small T means that the photons get measured with high accuracy by the cold rock. These two factors together mean a large amount of information, or entropy, gets created.

What we can do with the knowledge that the thing called entropy is large ... that is another good question. There are some laws about entropy; I guess that's why it's important.
 
  • #17
jartsa said:
As a non-cosmologist I must change your example to one with a very hot rock and an extremely cold rock, and a bag into which the two rocks are put and kept for a time long enough for the rocks to reach the same temperature.

Originally there were approximately no heat joules in the cold rock, as we purposely made it extremely cold. Afterwards, the heat joules that were originally in the hot rock are in a less hot rock.

Let's say the absolute temperature of the hot rock halved.

Before: S = Q/T
After: S = Q/(T/2)

Entropy doubled (because the number of joules stayed the same and the temperature of those joules halved).

Alternative wording of the story above: the hot rock produced random photons, the cold rock measured those photons, and the amount of information doubled during that process, or the amount of disorder, if we call random information disorder.
This response doesn't make any sense to me.

If I have two bodies of mass M and heat capacity C, with one at absolute temperature ##T_H## and the other at absolute temperature ##T_C##, and I allow them to equilibrate, then the final temperature will be ##T_F=(T_H+T_C)/2##, the amount of heat transferred from the hot body to the cold body will be $$Q=MC(T_H-T_F)=MC\frac{(T_H-T_C)}{2}$$ and the entropy change will be $$\Delta S=MC\ln{(T_F/T_H)}+MC\ln{(T_F/T_C)}=2MC\ln{\left[\frac{(T_H+T_C)/2}{\sqrt{T_HT_C}}\right]}=\frac{4Q}{(T_H-T_C)}\ln{\left[\frac{(T_H+T_C)/2}{\sqrt{T_HT_C}}\right]}$$
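For what it's worth, here is a numerical check of this closed form (my own sketch with assumed values for MC, T_H and T_C):

[code]
import math

MC  = 1000.0              # J/K, mass times heat capacity of each body (assumed)
T_H, T_C = 500.0, 300.0   # K (assumed)
T_F = (T_H + T_C) / 2.0
Q   = MC * (T_H - T_C) / 2.0   # heat transferred hot -> cold

dS_direct  = MC * math.log(T_F / T_H) + MC * math.log(T_F / T_C)
dS_formula = (4.0 * Q / (T_H - T_C)) * math.log(T_F / math.sqrt(T_H * T_C))

print(dS_direct, dS_formula)   # both ~ +64.5 J/K, and positive as required
[/code]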
 
  • Like
Likes Ale_Rodo and 256bits
  • #18
Ale_Rodo said:
This part intrigues me, and I feel this is the route I was trying to walk to discover why entropy is defined that way. Still, it starts as if dS were something that has always existed in physics or that is easily understandable, like mass, which from a certain point of view doesn't need much explanation (at least at a simpler level).
Maybe this will confuse you even more: what is temperature T? Surely we know how to measure it and make use of it in daily life, but what is the concept of temperature, and why do we need it?

I would say that behind temperature there exists a hidden quantity S, entropy, which affects us through the behavior of T via the relation
[tex]\frac{1}{T}=\frac{dS}{dQ}[/tex]
or
[tex]T=\frac{dE}{dS}|_V[/tex]

By virtue of 1/T we know the ratio of entropy increase per unit of heat energy added. The second law of thermodynamics teaches us in which direction heat energy flows, i.e. from high to low, as I noted in my previous post; this is why we find temperature so convenient in daily life.

All of T (or kT, to make it clearer), Q and E have the physical dimension of energy, so S has no physical dimension. It is a dimensionless number and has not a physical but a mathematical or informational nature, which reminds us of counting the number of cases in mathematics or statistics, e.g. permutations and combinations. More profoundly, the fact that the number S works suggests that matter consists of atoms of the same nature, and that the properties of matter should be explained by the configuration of identical pieces, i.e. atoms.
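In statistical mechanics this counting picture becomes explicit through Boltzmann's relation (a standard result, quoted here only as a pointer):
[tex]S = k_{\text{B}} \ln W[/tex]
where ##W## is the number of microscopic configurations compatible with the macroscopic state. Measuring temperature in energy units (kT) indeed leaves ##S/k_{\text{B}} = \ln W## as the pure number alluded to above.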
 
  • #19
Chestermiller said:
This response doesn't make any sense to me.

Saying "entropy doubled" only makes sense if we are talking about absolute entropy. So I must have been sloppily talking about absolute entropy. Also my "rocks" must have been perfect diamonds, as only those have zero absolute entropy at zero absolute temperature.

So I failed to articulate my idea.

It's also possible that my idea is wrong. Hmm ... the absolute entropy of an ideal gas is a quite simple thing, I hope. So maybe I could do some calculation that hopefully shows that halving the absolute temperature of an ideal gas, while keeping its thermal energy constant, results in a doubling of the absolute entropy of the gas.

Like this: the ideal gas expands in a cylinder, keeping its entropy constant and doing work, until its temperature is halved. Then more gas at the same temperature as the gas in the cylinder is added, so that the thermal energy inside the cylinder becomes the original thermal energy, which means doubling the amount of gas. There is now double the original amount of entropy in the cylinder.
 
  • #20
A very good book on thermodynamics and statistics, using the approach with processes like the Carnot process, is

R. Becker, Theory of Heat, Springer (1967)
 
  • Like
Likes Ale_Rodo and Lord Jestocost
  • #21
jartsa said:
Saying "entropy doubled" only makes sense if we are talking about absolute entropy. So I must have been sloppily talking about absolute entropy. Also my "rocks" must have been perfect diamonds, as only those have zero absolute entropy at zero absolute temperature.

So I failed to articulate my idea.

It's also possible that my idea is wrong. Hmm ... the absolute entropy of an ideal gas is a quite simple thing, I hope. So maybe I could do some calculation that hopefully shows that halving the absolute temperature of an ideal gas, while keeping its thermal energy constant, results in a doubling of the absolute entropy of the gas.

Like this: the ideal gas expands in a cylinder, keeping its entropy constant and doing work, until its temperature is halved. Then more gas at the same temperature as the gas in the cylinder is added, so that the thermal energy inside the cylinder becomes the original thermal energy, which means doubling the amount of gas. There is now double the original amount of entropy in the cylinder.
Two comments about all your postings in this thread:

You mention several times that an object contains a certain amount of heat.
Heat can only be transferred, not contained! You should check this in a basic text on thermodynamics.

You also state that S is supposed to equal Q / T. The only thing we can say about S, based on an analysis of the Carnot cycle, is that its change dS equals dQ / T for a reversible process.
For a reversible process at constant T involving a larger heat transfer Q this becomes ΔS = Q / T.
(Then we can also learn that S is a state function and a few more things.)
We can't say anything about the absolute value of S from this analysis, I believe.
 
  • Like
Likes vanhees71
  • #22
jartsa said:
Saying "entropy doubled" only makes sense if we are talking about absolute entropy. So I must have been sloppily talking about absolute entropy. Also my "rocks" must have been perfect diamonds, as only those have zero absolute entropy at zero absolute temperature.

So I failed to articulate my idea.

It's also possible that my idea is wrong. Hmm ... the absolute entropy of an ideal gas is a quite simple thing, I hope. So maybe I could do some calculation that hopefully shows that halving the absolute temperature of an ideal gas, while keeping its thermal energy constant, results in a doubling of the absolute entropy of the gas.

Like this: the ideal gas expands in a cylinder, keeping its entropy constant and doing work, until its temperature is halved. Then more gas at the same temperature as the gas in the cylinder is added, so that the thermal energy inside the cylinder becomes the original thermal energy, which means doubling the amount of gas. There is now double the original amount of entropy in the cylinder.
What is your definition of the "thermal energy of a gas?"
 
  • #23
I want to mention two points here:
1. When thermodynamics was invented, it was not quite clear what heat really means.
One of the meanings is the modern concept of heat as a form of energy.
The second meaning was later dubbed entropy. So entropy is a concept of heat which takes into account the temperature at which processes take place.
In cryophysics, even the tiniest amounts of heat (in the energy sense) can destroy your experimental setup, because heat capacities are so low. This is another way of stating that entropy is a temperature-weighted heat.
2. In classical thermodynamics, empirical temperature scales ##\theta## (there are infinitely many of them!) are defined via the zeroth law of thermodynamics, which states the transitivity of thermal equilibrium: if A and B are in equilibrium, and B and C are also, then A and C have to be in equilibrium, too. Hence you can introduce a state variable ##\theta## which is the same for two systems A and B if they are in equilibrium.
The heat change in reversible processes is not a difference (or differential) of a state variable, but one can show that ##\delta Q## (##\delta##, because this is not a total differential of a state variable, but only an infinitesimally small, path-dependent amount of energy) has an integrating factor, which must be a function of the empirical temperature ##\theta## only, i.e.
##dS =\delta Q/f(\theta)##. If the integrating factor ##f## depended on anything other than ##\theta##, we could build a perpetuum mobile.
This integrating factor ##f(\theta)=:T## is what we call absolute temperature. So, to come back to your original question: no, S can't be something other than Q/T, because here we define not only S but also T, as a function of ##\theta##.
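As a concrete check of the integrating-factor claim, one can work through one mole of a monatomic ideal gas (a standard exercise, added here as illustration):
[tex]\delta Q = dU + p\,dV = C_V\,dT + \frac{RT}{V}\,dV[/tex]
This is not an exact differential, since ##\partial C_V/\partial V = 0## while ##\partial (RT/V)/\partial T = R/V##. Dividing by ##T##, however, gives
[tex]\frac{\delta Q}{T} = C_V\,\frac{dT}{T} + R\,\frac{dV}{V} = d\left(C_V \ln T + R \ln V\right)[/tex]
which is exact, so ##1/T## is indeed an integrating factor and the quantity in parentheses is the entropy, up to a constant.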
 
  • Like
Likes Ale_Rodo and vanhees71
  • #24
DrDu said:
I want to mention two points here:
1. When thermodynamics was invented, it was not quite clear what heat really means.
One of the meanings is the modern concept of heat as a form of energy.
The second meaning was later dubbed entropy. So entropy is a concept of heat which takes into account the temperature at which processes take place.
In cryophysics, even the tiniest amounts of heat (in the energy sense) can destroy your experimental setup, because heat capacities are so low. This is another way of stating that entropy is a temperature-weighted heat.
2. In classical thermodynamics, empirical temperature scales ##\theta## (there are infinitely many of them!) are defined via the zeroth law of thermodynamics, which states the transitivity of thermal equilibrium: if A and B are in equilibrium, and B and C are also, then A and C have to be in equilibrium, too. Hence you can introduce a state variable ##\theta## which is the same for two systems A and B if they are in equilibrium.
The heat change in reversible processes is not a difference (or differential) of a state variable, but one can show that ##\delta Q## (##\delta##, because this is not a total differential of a state variable, but only an infinitesimally small, path-dependent amount of energy) has an integrating factor, which must be a function of the empirical temperature ##\theta## only, i.e.
##dS =\delta Q/f(\theta)##. If the integrating factor ##f## depended on anything other than ##\theta##, we could build a perpetuum mobile.
This integrating factor ##f(\theta)=:T## is what we call absolute temperature. So, to come back to your original question: no, S can't be something other than Q/T, because here we define not only S but also T, as a function of ##\theta##.
I don't regard entropy and entropy change as directly related to heat at all. The entropy of a substance or system can change (increase) as a result of many mechanisms, only one of which is related to heat:
1. finite viscous dissipation of mechanical energy
2. finite conductive heat transfer within the system and heat transfer across the external boundary of the system
3. finite molecular diffusion
4. chemical reaction

In classical thermodynamics, the only way we have of determining the entropy change of a system experiencing an irreversible process is to devise an alternative reversible path between the same initial and final thermodynamic equilibrium states as the irreversible change, and to calculate the integral of dQ/T for that alternative reversible path.

I will add the following to contribute to the basic picture on entropy:
There are only two ways in which the entropy of a closed system can change:

1. Entropy transfer across the boundary of the system with its surroundings by heat flow dQ across the boundary, at the boundary temperature ##T_B## (at which the heat flow occurs). The magnitude of this entropy change is equal to the integral of ##dQ/T_B##. This mechanism can be present in both reversible and irreversible processes.

2. Entropy generation within the system as a result of process irreversibility (which can be the result of viscous dissipation, chemical reaction, or diffusion just as easily as of conductive heat transfer). This mechanism is present only in an irreversible process, and the amount of entropy generated can only be positive.
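These two contributions are commonly collected into a single entropy balance for a closed system (standard form, stated here for reference):
[tex]\Delta S = \int \frac{dQ}{T_B} + \sigma, \qquad \sigma \ge 0[/tex]
where the integral is the transfer term (mechanism 1) and ##\sigma## is the entropy generated by irreversibility (mechanism 2). ##\sigma = 0## holds only for a reversible process, which recovers the equality ##\Delta S = \int dQ/T##.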
 
  • Like
Likes Ale_Rodo
  • #25
Chestermiller said:
I don't regard entropy and entropy change as directly related to heat at all. The entropy of a substance or system can change (increase) as a result of many mechanisms, only one of which is related to heat:
I don't doubt this, as I was talking about the founders of thermodynamics.
I just looked up the history in the wonderful book by Friedrich Hund, "Geschichte der physikalischen Begriffe" (History of Physical Concepts). It was Rudolf Claudius who coined the term "entropy" in 1865, after having called it "Verwandlungswert der Wärme" (heat conversion value) in 1862. The importance of entropy in chemical reactions was recognized around 1880 by Gibbs, van't Hoff and Helmholtz.
 
  • Like
Likes Chestermiller and Ale_Rodo
  • #26
DrDu said:
I don't doubt this, as I was talking about the founders of thermodynamics.
I just looked up the history in the wonderful book by Friedrich Hund, "Geschichte der physikalischen Begriffe" (History of Physical Concepts). It was Rudolf Claudius who coined the term "entropy" in 1865, after having called it "Verwandlungswert der Wärme" (heat conversion value) in 1862. The importance of entropy in chemical reactions was recognized around 1880 by Gibbs, van't Hoff and Helmholtz.
The German expression "Verwandlungswert der Wärme" is actually very good. I would translate it as something like "transformation value of heat". Adding heat at high T has little effect, whereas adding the same heat at low T has a much bigger effect.
 
  • #27
You mean Rudolf Clausius. I find the introduction of entropy among the most challenging didactic tasks in (theoretical) physics teaching. For me the most convincing concept is the information-theoretical approach. Together with that, I think the clearest way from fundamental physics (classical and quantum mechanics) to many-body physics is kinetic theory, using the information-theoretical definition of entropy to motivate its use in deriving the H-theorem, and in this way arriving at equilibrium distributions and hydrodynamics more naturally than through the very abstract treatment in terms of phenomenological thermodynamics, which unfortunately has a long tradition. If you choose this route, I think the idea of heat engines is quite intuitive.

With the kinetic approach it becomes clear that heat is a form of energy, namely the part of the energy of a system that is due to (thermal) fluctuations, i.e., the random motion of the molecules on top of the macroscopic average motion of the medium.

With the kinetic approach you also have linear-response theory at hand, by linearizing the Boltzmann equation for a situation close to (local) thermal equilibrium, and with this the general theory of transport coefficients becomes clear. It shows that dissipation is the transfer of energy from other forms into heat and is always related to diffusion of the corresponding quantities, the two together driving the system from small deviations from local thermal equilibrium into local, or even global, thermal equilibrium. E.g., viscosity is the transport coefficient derived from the response to a macroscopic flow, dissipating the kinetic energy of this flow into the random motion of the molecules, i.e., heat. Then you also have heat conduction due to temperature gradients, diffusion due to density/chemical-potential gradients, electric conductivity due to external electromagnetic fields, etc.
 
  • #28
I found both the information-theoretical and the phenomenological derivations very interesting as a student. Phenomenological reasoning has sometimes led to enormously valuable insights, e.g. the Ginzburg-Landau theory of superconductivity, and I got the impression that this line of thought is not as present in teaching as it may deserve.
 
  • Like
Likes vanhees71
  • #29
I think it's good to couple a kinetic approach with information-theoretical reasoning for the introduction of entropy. This works already within classical physics, starting from the Liouville equation for probability distributions and then moving to the reduced, coarse-grained description, "throwing away the information contained in higher-order correlations" by truncating the BBGKY hierarchy, though you need to borrow some arguments from quantum theory concerning the "natural phase-space measure". Clearer still is the quantum-statistical approach using QFT (both non-relativistic and relativistic are possible), and there, of course, mean-field theories such as Ginzburg-Landau theory arise very naturally.
 
  • #30

Chestermiller said:
What is your definition of the "thermal energy of a gas?"

The thermal energy of an ideal gas is the kinetic energy of the random motion of the gas molecules.

Or: the thermal energy of an ideal gas is pressure times volume.
 
  • #31
The thermal energy of a classical ideal gas is ##U=f k_{\text{B}} N T/2##, where ##f## is the number of "relevant degrees of freedom" (i.e., ##f=3## for a monatomic, ##f=5## for a diatomic, and ##f=6## for a polyatomic gas), ##k_{\text{B}}## is Boltzmann's constant, ##N## is the number of particles/molecules, and ##T## the (absolute) temperature.

Then you have the ideal-gas law ##p V=N k_{\text{B}} T##, which obviously differs from ##U## by a factor.
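A quick numeric comparison of the two (my own sketch with an assumed sample, one mole of a monatomic gas):

[code]
k_B = 1.380649e-23   # J/K, Boltzmann's constant
N   = 6.022e23       # particles, about one mole (assumed)
T   = 300.0          # K (assumed)

pV = N * k_B * T          # ideal-gas law
U  = 1.5 * N * k_B * T    # monatomic gas, f = 3
print(pV, U, U / pV)      # U/pV = 1.5: the factor mentioned above
[/code]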
 
  • #32
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T stable, if you extract an amount dQ from the system, you increase the order of the molecules. If you put an amount dQ into the system, you increase the disorder of the molecules (random motion).
 
  • Skeptical
  • Like
Likes Philip Koeck, Motore and Ale_Rodo
  • #33
binis said:
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T stable, if you extract an amount dQ from the system, you increase the order of the molecules. If you put an amount dQ into the system, you increase the disorder of the molecules (random motion).
Can you please provide an example for this contention?
 
  • Like
Likes binis and Philip Koeck
  • #34
binis said:
Clausius' definition is differential: dS = dQ/T, and it's not always easy to integrate. Keeping T stable, if you extract an amount dQ from the system, you increase the order of the molecules. If you put an amount dQ into the system, you increase the disorder of the molecules (random motion).
I don't want to give away too much here, but do please follow up on Chestermiller's suggestion.
How would your statement work out for an ideal gas, for example?
 
  • Like
Likes binis
  • #35
Chestermiller said:
Can you please provide an example for this contention?
Think of boiling water: molecules transition from a more "ordered" state (liquid) to a less "ordered" state (gas).
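To attach a number to this example (my own sketch using the textbook latent heat of vaporization of water):

[code]
m = 1.0        # kg of water
L = 2.26e6     # J/kg, latent heat of vaporization (approximate)
T = 373.15     # K, boiling point at 1 atm

dS = m * L / T   # reversible phase change at constant T: delta_S = Q/T
print(dS)        # ~ 6.1e3 J/K gained going liquid -> gas
[/code]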
 
