Why is Entropy defined as a fraction of heat over temperature?

  • Thread starter Ale_Rodo
  • #1
Ale_Rodo
Can we make sense out of the formula of entropy like we do for density (like "quantity of mass per unit volume")? What's the sense of Q/T? Couldn't it be something else?

Of course it probably is a 'me-problem', but I haven't studied Thermodynamics deeply yet and was wondering what Entropy actually means. It is often described as a quantity defining how much randomness a system is subject to, or how probable it is for a system to evolve in a certain way, or something in between these two, but how do you get from "Q/T" to randomness or probability? Is it possible to make the most general example of why Q/T makes sense?

I am looking for a demonstration of how it came into existence in that form, or just an immediate logical example, as professors usually give when explaining what density of mass is, if possible.

I think you already got what I'm trying to say at this point, but I'm going to clarify by using this very example of density: ρ = m/V indicates how much mass there is per unit volume, and it's easy to imagine: we could say, approximately, that the denser something is, the more atoms per unit volume we'll have.

Can we do the same thing for S = Q/T? I have some ideas of my own, but they don't really have anything to do with randomness or probability, and of course I'm about as far as one can get from being an expert in Thermodynamics.

Thank you very much in advance.
 

Answers and Replies

  • #2
anuttarasammyak
Gold Member
If Q=TS, then by differentiation dQ=TdS+SdT.
But the definition of entropy is dQ=TdS, or dS=dQ/T, not S=Q/T.
So we know the inverse of temperature is given as
[tex]\frac{1}{T}=\frac{dS}{dQ}[/tex]
The RHS is the rate of increase of entropy per unit of heat energy given to the system; 1/T is the parameter that expresses this ratio.

Say two systems 1 and 2 are brought into contact and exchange heat energy; by energy conservation
[tex]dQ_1+dQ_2=0[/tex]
The total entropy of the systems changes as
[tex]d(S_1+S_2)=\frac{dQ_1}{T_1}+\frac{dQ_2}{T_2}=\left(\frac{1}{T_1}-\frac{1}{T_2}\right)dQ_1 > 0[/tex]
by the second law of thermodynamics.
When ##T_1 > T_2, \ dQ_1 <0, \ dQ_2>0##
When ##T_1 < T_2, \ dQ_1 >0,\ dQ_2<0##
So we know heat flows from the high-temperature system to the low-temperature system, and in equilibrium
##T_1=T_2, \ dQ_1=-dQ_2=0##
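
To make this concrete, here is a minimal numerical sketch (my own illustration; the heat capacities and temperatures are assumed values, not from the derivation above). It moves heat in small steps from the hotter body to the colder one, accumulates dS = dQ/T for each body, and shows the total entropy rising until the temperatures are equal:

[code]
# Two finite bodies with equal heat capacity C exchange heat in small steps.
C = 1000.0               # heat capacity of each body, J/K (assumed value)
T1, T2 = 400.0, 200.0    # initial temperatures, K (assumed values)
dQ = 1.0                 # heat moved from body 1 to body 2 per step, J
S_total = 0.0

while T1 - T2 > 1e-3:
    S_total += dQ / T2 - dQ / T1   # dS = dQ_2/T_2 + dQ_1/T_1 with dQ_1 = -dQ_2
    T1 -= dQ / C                   # body 1 cools
    T2 += dQ / C                   # body 2 warms

print(f"final T ~ {T1:.2f} K, entropy generated ~ {S_total:.2f} J/K")
# Prints roughly 117.8 J/K, matching the exact result 2*C*ln(300/sqrt(400*200)).
[/code]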
 
  • #3
Ale_Rodo
Your answer is great, but I feel like it's a bit far from what I was trying to achieve.

For instance,
anuttarasammyak said:
If Q=TS, then by differentiation dQ=TdS+SdT
but I was trying to imagine what brought scientists to define entropy in the first place. This way, you're starting from a point just after the definition of entropy, and hence you don't really question the 'shape' of its formula.

anuttarasammyak said:
The RHS is the rate of increase of entropy per unit of heat energy given to the system; 1/T is the parameter that expresses this ratio.
This part intrigues me, and I feel this is the route I was trying to walk to discover why entropy is defined that way, but it still starts as if dS were something that has always existed in physics, or that is easily understandable, something like mass, which from a certain point of view doesn't need much explanation (at least at a simpler level).

So, in the end, if we imagine that we have to come up with the definition of entropy ourselves (we don't even have it clear yet, nor do we know it's going to be called 'S'), why would we choose it to be dS = dQ/T?

I know it is a very strange question and I hope I conveyed the message; it would be a blessing if anybody has this kind of answer.

I'm thankful for the answer by the way.
 
  • #4
anuttarasammyak
Gold Member
From the differential form we get the definition of entropy
[tex]S_2-S_1 =\int_1^2 \frac{dQ}{T}[/tex]
where the integration path from state 1 to state 2 is any reversible process.
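
As a numerical sanity check of this integral (my own sketch; the mass, specific heat and temperatures are assumed values): heat 1 kg of water reversibly from 300 K to 350 K, so dQ = mc dT at each step, and compare the sum of dQ/T with the closed form mc ln(T2/T1):

[code]
import math

m, c = 1.0, 4186.0            # 1 kg of water, specific heat in J/(kg*K)
T1, T2, steps = 300.0, 350.0, 100000
dT = (T2 - T1) / steps

# midpoint-rule approximation of the integral of (m*c*dT)/T
dS = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print(f"numerical: {dS:.3f} J/K")
print(f"closed form m*c*ln(T2/T1): {m * c * math.log(T2 / T1):.3f} J/K")
# Both print ~645.3 J/K.
[/code]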

Take the example of contact between a high-temperature body and a low-temperature body: after the heat exchange, all the bodies have the same temperature. The initial and the final state have the same amount of energy. Do you observe any difference between the state before contact and the state after contact? If yes, entropy is the parameter found to distinguish these states.

Countries A and B have the same amount of national wealth. In country A, 1% of the citizens own all the national wealth and 99% own nothing. In country B, all the citizens own an equal share of the national wealth. If you were an economist, wouldn't you observe the difference between the countries and want to express it by some parameter?
 
  • #6
S = Q/T

Q means how many joules, and T means how cold the place is where those joules sit?


Let's say we have built a car that runs on the thermal energy of the atmosphere and very cold water. The engine is ideal.

Charging this car so that it can run 1000 miles requires energy from the electric socket; that energy drives an ideal freezer. When the freezer is sucking joules out of water that is already very cold, many joules of electricity have to be spent to suck out one joule of heat. So removing a lot of joules from the cold place requires a lot of energy. Removing a lot of entropy requires a lot of energy in this case.


Just in case anyone is skeptical: letting many joules of heat energy run through an ideal motor, which dumps into a very cool heat sink just the small fraction of heat that was not converted to mechanical work, is the inverse of the aforementioned process of using a lot of energy to cool a very cold thing.
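
Here is a rough numeric sketch of that point (mine, with assumed temperatures): for a Carnot refrigerator, the work needed to pull one joule of heat out of a reservoir at T_cold, rejecting it at T_hot, is W/Q = (T_hot - T_cold)/T_cold:

[code]
T_hot = 300.0                      # room temperature, K (assumed)
for T_cold in (250.0, 50.0, 1.0):  # progressively colder water
    work_per_joule = (T_hot - T_cold) / T_cold
    print(f"T_cold = {T_cold:6.1f} K -> {work_per_joule:7.1f} J of work per J of heat removed")
# At 1 K it costs ~299 J of work per joule of heat removed: pulling heat
# (and the entropy that rides with it) out of a very cold place is expensive.
[/code]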
 
  • #7
Philip Koeck
Ale_Rodo said:
Can we make sense out of the formula of entropy like we do for density (like "quantity of mass per unit volume")? What's the sense of Q/T? Couldn't it be something else? [...] Can we do the same thing for S = Q/T?
Actually you should write ΔS = Q/T or dS = dQ/T.

I've written a short text on exactly this and I also recommend Chestermiller's article.

Anyway here's a link to mine:
https://www.researchgate.net/publication/338555515_Carnot_reversibility_and_entropy

The main points are that S is a state function and that dS = dQ/T for any reversible process.
 
  • #8
Ale_Rodo
Philip Koeck said:
Actually you should write ΔS = Q/T or dS = dQ/T.
Of course you are right. Unfortunately I have this very odd problem: if I don't get the meaning of a formula, I tend to write it from memory until I finally wrap my head around it, which usually causes these kinds of mistakes.

By the way, I have given your article a very superficial look (unfortunately I lack time these days) and, for what I can judge as an undergrad, it seems great. The only problem is that I haven't studied Thermodynamics yet, as I said, so I can barely follow.

Although in a few weeks it might appear much clearer, at the moment I still have problems when I read statements such as

" We now define a quantity (in the system) called entropy, S, that cannot be measured directly. We can only measure its change, which we define, for reversible processes only, as 𝑑𝑆 = 𝑑𝑄 𝑇 .
For a larger change during a process at constant temperature this gives: 𝛥𝑆 = 𝑄𝑇 . " .

Why do we have to define it, and how would anyone know, if he/she were to experience that cycle for the first time? I can't imagine someone seeing a full Carnot cycle and saying "yep, dS = dQ/T is exactly what I need."
As I'm trying to convey, I get why we needed entropy (for instance in Chemistry, not all reactions happen just because systems tend to evolve in the direction of lower energy), but I don't get its mathematical form.
Did someone notice dQ/T was appearing too many times in too many thermodynamic equations? Why dQ/T and not, to say an absurdity, dQ·T or dQ·T²? Does it have a history, or did someone wake up one day and say, "I think this random function I dreamt of (dQ/T) is called 'entropy'. It is the 'guide' which leads systems to evolve in a certain way rather than another."?

I hope this doesn't sound as crazy as I think I sound myself. Thanks for the patience, I'm really struggling on this one.
 
  • #9
Ale_Rodo
S = Q/T

Q means how many joules, and T means how cold the place is where those joules sit?
This is interesting! My brain automatically found some examples but I'm now asking you if the ideas I came up with make sense, so please let me know.

So, let's say we know entropy is a measure of how much disorder has been caused in the system (I think it's indeed how it's usually first presented).
Let's also say, hoping to be correct, that if the system examined were to be the whole universe, then the whole universe tends to a state of maximum disorder.
In this case, it would mean that no usable energy is available; matter is still there, but the stars have gone out and the big-enough masses that could light one last star are too far away from each other.
Now, without radiation coming from stars or any other sources, the temperature should be at or near absolute zero.

If somehow we were still there and could measure the total entropy change ΔS from, for example, today to that far time where the universe is just darkness, we would have to know the amount of heat exchanged during this time.

Let's say we know this amount Q for unspecified reasons; now, if we compare that Q with the temperature measured at the end of time, we would get an incredibly high amount (for convenience T = 1 K, so that we won't have definition problems at T = 0 K).
Would ΔS = Q/T give us a measure of how much the exchanged heat influenced the universe's temperature?
And in general:
- highest ΔS means "OK, you exchanged heat but can't really do anything anymore"
- lowest ΔS means "a few joules are enough to suddenly raise the temperature"?

It is an incredibly rough idea but it could partially satisfy my question if it happens to be right-ish.
It's indeed an odd one, but did I at least grasp the concept, in your opinion?
 
  • #10
Ale_Rodo
anuttarasammyak said:
If yes, entropy is the parameter found to distinguish these states. [...] If you were an economist, wouldn't you observe the difference between the countries and want to express it by some parameter?
Indeed, it starts to make much more sense than before. Thanks for that! Could I find another name for entropy such as 'heat distribution' then? Would it make sense or did I aim in the wrong direction?
 
  • #12
Ale_Rodo said:
Would ΔS = Q/T give us a measure of how much the exchanged heat influenced the universe's temperature?
And in general:
- highest ΔS means "OK, you exchanged heat but can't really do anything anymore"
- lowest ΔS means "a few joules are enough to suddenly raise the temperature"?

As a non-cosmologist I must change your example to one with a very hot rock and an extremely cold rock, and a bag into which the two rocks are put and kept for a long enough time for the rocks to reach the same temperature.

Originally there were approximately no heat joules in the cold rock, as we purposely made it extremely cold. Afterwards, the heat joules that originally were in the hot rock are in a less hot rock.

Let's say the absolute temperature of the hot rock halved.

Before: S = Q/T
After: S = Q/(T/2)

Entropy doubled. (Because the number of joules stayed the same and the temperature of those joules halved.)


Alternative wording of the story above: the hot rock produced random photons, the cold rock measured those photons, and the amount of information doubled during that process, or the amount of disorder, if we call random information disorder.
 
  • #13
Philip Koeck
Ale_Rodo said:
The only problem is that I haven't studied Thermodynamics yet, as I said, so I can barely follow. [...] Why dQ/T and not, to say an absurdity, dQ·T or dQ·T²? Does it have a history, or did someone wake up one day and say, "I think this random function I dreamt of (dQ/T) is called 'entropy'"?
I don't think you need a complete thermodynamics course to understand my text, but you will have to spend some time on it. If you do, I'm pretty sure it will help you.

The only background you need is the first law (I use the form Q = ΔU + W) and the efficiency of a heat engine.

I'm not so good at history of science, but I think Carnot did something very much like you describe.
He looked at the efficiency of the Carnot process (equation 1 in my text) and said something like: "S&&t (pardon my French), there's a state function hidden in there."

Equation 1 is a dead giveaway: you can read it as ##Q_H/T_H + Q_C/T_C = 0##.
All the changes of a state function in a cyclic process add up to zero!
Get it?

None of the alternative definitions of entropy change that you suggest will add up to zero in the Carnot process.
There is one other possibility: you could define ΔS as T/Q, but temperatures don't add up the way heats do, and what would dS (for a small change) be then?
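
A quick numeric check of this point (my own sketch; the mole number, temperatures and volumes are assumed values): for an ideal-gas Carnot cycle the adiabats fix the compression ratios so that Q_H/T_H + Q_C/T_C sums to zero, while the alternative combinations suggested above do not:

[code]
import math

n, R = 1.0, 8.314            # one mole of ideal gas (assumed)
T_H, T_C = 500.0, 300.0      # reservoir temperatures, K
V1, V2 = 1.0, 3.0            # isothermal expansion volumes at T_H, m^3

Q_H = n * R * T_H * math.log(V2 / V1)    # heat absorbed at T_H
Q_C = -n * R * T_C * math.log(V2 / V1)   # heat rejected at T_C (same volume ratio)

print("sum of Q/T   :", Q_H / T_H + Q_C / T_C)        # 0: change of a state function
print("sum of Q*T   :", Q_H * T_H + Q_C * T_C)        # nonzero
print("sum of Q*T^2 :", Q_H * T_H**2 + Q_C * T_C**2)  # nonzero
[/code]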

Read the text first and whatever you need from a very basic textbook and then send another message if you still need help.
 
  • #14
Ale_Rodo
Alternative wording of the story above: the hot rock produced random photons, the cold rock measured those photons, and the amount of information doubled during that process, or the amount of disorder, if we call random information disorder.
I get this one, and I think it's very helpful, but I still can't get why I should compare Q with T.
 
  • #15
Ale_Rodo
Philip Koeck said:
I don't think you need a complete thermodynamics course to understand my text, but you will have to spend some time on it. If you do, I'm pretty sure it will help you.
It surely will, but at the moment I don't even have that knowledge unfortunately.

Philip Koeck said:
I'm not so good at history of science, but I think Carnot did something very much like you describe.
He looked at the efficiency of the Carnot process (equation 1 in my text) and said something like: "S&&t (pardon my French), there's a state function hidden in there."
I guess that's the only way I can explain to myself how entropy was born, but I'll get what I need, read what I need to, and eventually come back. Thanks for the resources!
 
  • #16
Ale_Rodo said:
I get this one, and I think it's very helpful, but I still can't get why I should compare Q with T.

Well, maybe we can think that large Q means that the hot rock emits many photons, and small T means that the photons get measured with high accuracy by the cold rock. These two factors together mean that a large amount of information, or entropy, gets created.

What we can do with the knowledge that the thing called entropy is large ... that is another good question. There are some laws about entropy; I guess that's why it's important.
 
  • #17
Chestermiller
Let's say the absolute temperature of the hot rock halved.

Before: S = Q/T
After: S = Q/(T/2)

Entropy doubled. (Because the number of joules stayed the same and the temperature of those joules halved.)
This response doesn't make any sense to me.

If I have two bodies of mass M and heat capacity C, with one at absolute temperature ##T_H## and the other at absolute temperature ##T_C##, and I allow them to equilibrate, then the final temperature will be ##T_F=(T_H+T_C)/2##, the amount of heat transferred from the hot body to the cold body will be $$Q=MC(T_H-T_F)=MC\frac{(T_H-T_C)}{2}$$ and the entropy change will be $$\Delta S=MC\ln{(T_F/T_H)}+MC\ln{(T_F/T_C)}=2MC\ln{\left[\frac{(T_H+T_C)/2}{\sqrt{T_HT_C}}\right]}=\frac{4Q}{(T_H-T_C)}\ln{\left[\frac{(T_H+T_C)/2}{\sqrt{T_HT_C}}\right]}$$
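
A numeric check of these formulas (my own sketch; M, C and the temperatures are assumed values): the sum of the two logarithmic terms is positive and matches the closed form written in terms of Q:

[code]
import math

M, C = 1.0, 1000.0            # mass (kg) and heat capacity (J/(kg*K)), assumed
T_H, T_C = 400.0, 200.0       # initial temperatures, K
T_F = (T_H + T_C) / 2
Q = M * C * (T_H - T_C) / 2   # heat passed from the hot body to the cold body

dS = M * C * (math.log(T_F / T_H) + math.log(T_F / T_C))
dS_via_Q = (4 * Q / (T_H - T_C)) * math.log(T_F / math.sqrt(T_H * T_C))

print(f"dS = {dS:.2f} J/K, via Q: {dS_via_Q:.2f} J/K")  # both ~ +117.8 J/K
[/code]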
 
  • #18
anuttarasammyak
Gold Member
Ale_Rodo said:
This part intrigues me, and I feel this is the route I was trying to walk to discover why entropy is defined that way, but it still starts as if dS were something that has always existed in physics, or that is easily understandable, something like mass, which from a certain point of view doesn't need much explanation (at least at a simpler level).
Maybe it will confuse you even more, but: what is temperature T? Surely we know how to measure it and make use of it in daily scenes, but what is the concept of temperature and why do we need it?

I would say that behind temperature there exists a hidden quantity S, or entropy, which affects us through the behavior of T via the relation
[tex]\frac{1}{T}=\frac{dS}{dQ}[/tex]
or
[tex]T=\frac{dE}{dS}|_V[/tex]

By virtue of 1/T we know the rate of entropy increase per unit of heat energy added. The second law of thermodynamics teaches us in which direction heat energy flows, i.e. from high to low, as I noted in my previous post; this is why we find temperature so convenient in our daily life.

T (or rather kT, to make it clearer), Q and E all have the physical dimension of energy, so S has no physical dimension. It is a dimensionless number, and its nature is not so much physical as mathematical or informational, which reminds us of counting the numbers of cases in mathematics or statistics, e.g. permutations and combinations. More profoundly, the fact that the number S works suggests that matter consists of atoms of the same nature, and that the properties of matter should be explained by configurations of identical pieces, i.e. atoms.
 
  • #19
Chestermiller said:
This response doesn't make any sense to me.

Saying "entropy doubled" only makes sense if we are talking about absolute entropy. So I must have been sloppily talking about absolute entropy. Also my "rocks" must have been perfect diamonds, as only those have zero absolute entropy at zero absolute temperature.

So I failed to articulate my idea.

It's also possible that my idea is wrong. Hmm ... the absolute entropy of an ideal gas is a quite simple thing, I hope. So maybe I could do some calculation that hopefully shows that halving the absolute temperature of an ideal gas, while keeping the thermal energy of the gas constant, results in a doubling of the absolute entropy of the gas.

Like this: an ideal gas expands in a cylinder, keeping its entropy constant and doing work, until its temperature is halved. Then more gas at the same temperature as the gas in the cylinder is added to the cylinder, so that the thermal energy inside the cylinder returns to the original thermal energy, which means doubling the amount of gas. There is now double the original amount of entropy in the cylinder.
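
One way to check this scenario (my own sketch, for a monatomic ideal gas with assumed values, using the Sackur-Tetrode expression for the absolute entropy): an isentropic expansion that halves T multiplies V by 2^(3/2) and leaves S unchanged; doubling N and V at the halved T then restores U = (3/2)NkT and, entropy being extensive, doubles S:

[code]
import math

k = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607e-34    # Planck constant, J*s
m = 6.63e-26       # mass of one atom (roughly argon), kg (assumed)

def sackur_tetrode(N, V, T):
    lam = h / math.sqrt(2 * math.pi * m * k * T)   # thermal de Broglie wavelength
    return N * k * (math.log(V / (N * lam**3)) + 2.5)

N0, V0, T0 = 1e22, 1e-3, 300.0                        # assumed initial state
S0 = sackur_tetrode(N0, V0, T0)
S1 = sackur_tetrode(N0, V0 * 2**1.5, T0 / 2)          # after the isentropic expansion
S2 = sackur_tetrode(2 * N0, 2 * V0 * 2**1.5, T0 / 2)  # after doubling the gas

print(f"S1/S0 = {S1/S0:.4f} (unchanged)")   # 1.0000
print(f"S2/S0 = {S2/S0:.4f} (doubled)")     # 2.0000
[/code]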
 
  • #20
vanhees71
Science Advisor
Insights Author
Gold Member
2021 Award
A very good book on thermodynamics and statistics, using the approach via processes like the Carnot process, is

R. Becker, Theory of Heat, Springer (1967)
 
  • #21
Philip Koeck
Saying "entropy doubled" only makes sense if we are talking about absolute entropy. So I must have been sloppily talking about absolute entropy. [...] There is now double the original amount of entropy in the cylinder.
Two comments about all your postings in this thread:

You mention several times that an object contains a certain amount of heat.
Heat can only be transferred, not contained! You should check this in a basic text on thermodynamics.

You also state that S is supposed to equal Q / T. The only thing we can say about S, based on an analysis of the Carnot cycle, is that its change dS equals dQ / T for a reversible process.
For a reversible process at constant T involving a larger heat transfer Q this becomes ΔS = Q / T.
(Then we can also learn that S is a state function and a few more things.)
We can't say anything about the absolute value of S from this analysis, I believe.
 
  • #22
Chestermiller
So maybe I could do some calculation that hopefully shows that halving the absolute temperature of an ideal gas, while keeping the thermal energy of the gas constant, results in a doubling of the absolute entropy of the gas. [...] There is now double the original amount of entropy in the cylinder.
What is your definition of the "thermal energy of a gas?"
 
  • #23
DrDu
Science Advisor
I want to mention 2 points here:
1. When thermodynamics was invented, it was not quite clear what heat really meant.
One of the meanings is the modern concept of heat as a form of energy.
The second meaning was later dubbed entropy. So entropy is a concept of heat which takes into account the temperature at which processes take place.
In cryophysics, even the tiniest amounts of heat (in the energy sense) can destroy your experimental setup, because heat capacities are so low. This is another way of stating that entropy is a temperature-weighted heat.
2. In classical thermodynamics, empirical temperature scales ##\theta## (there are infinitely many of them!) are defined via the zeroth law of thermodynamics, which states the transitivity of thermal equilibrium: if A and B are in equilibrium and B and C are also, then A and C have to be in equilibrium, too. Hence you can introduce a state variable ##\theta## which is the same for two systems A and B if they are in equilibrium.
The heat change in reversible processes is not a difference (or differential) of a state variable, but one can show that ##\delta Q## (##\delta##, because this is not a total differential of a state variable, but only an infinitesimally small path-dependent amount of energy) has an integrating factor, which must be a function of the empirical temperature ##\theta## only, i.e.
##dS =\delta Q/f(\theta)##. If the integrating factor ##f## depended on anything other than ##\theta##, we could build a perpetuum mobile.
This integrating factor ##f(\theta)=:T## is what we call absolute temperature. So to come back to your original question: no, S can't be something different from Q/T, because here we define not only S but also T as a function of ##\theta##.
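
The integrating-factor claim can be checked symbolically for the simplest case (my own sketch, for an ideal gas; the symbols n, C_v, R are assumptions of the example, not from the post): the heat form δQ = nC_v dT + (nRT/V) dV is not an exact differential, but dividing it by T makes it exact, so dS = δQ/T is the differential of a state function:

[code]
import sympy as sp

T, V = sp.symbols('T V', positive=True)
n, Cv, R = sp.symbols('n C_v R', positive=True)

# delta_Q = M dT + N dV along a path in the (T, V) plane for an ideal gas
M = n * Cv          # coefficient of dT (internal energy change)
N = n * R * T / V   # coefficient of dV (the pressure, from p V = n R T)

# Exactness test dM/dV == dN/dT fails for delta_Q itself:
print(sp.diff(M, V), "vs", sp.diff(N, T))   # 0 vs n*R/V -> not exact

# ... but holds after dividing by T:
print(sp.simplify(sp.diff(M / T, V) - sp.diff(N / T, T)))  # 0 -> delta_Q/T is exact
[/code]

Indeed, δQ/T integrates to S = nC_v ln T + nR ln V + const, which is a state function.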
 
  • #24
Chestermiller
DrDu said:
So entropy is a concept of heat which takes into account the temperature at which processes take place. [...] This integrating factor ##f(\theta)=:T## is what we call absolute temperature. So to come back to your original question: no, S can't be something different from Q/T, because here we define not only S but also T as a function of ##\theta##.
I don't regard entropy and entropy change as directly related to heat at all. The entropy of a substance or system can change (increase) as a result of many mechanisms, only one of which is related to heat:
1. finite viscous dissipation of mechanical energy
2. finite-rate conductive heat transfer within a system and heat transfer across the external boundary of a system
3. finite molecular diffusion
4. chemical reaction

In classical thermodynamics, the only way we have of determining the entropy change of a system experiencing an irreversible process is to devise an alternative reversible path between the same initial and final thermodynamic equilibrium states as the irreversible change, and to calculate the integral of dQ/T for that alternative reversible path.

I will add the following to contribute to the basic picture on entropy:
There are only two ways in which the entropy of a closed system can change:

1. Entropy transfer across the boundary of the system with its surroundings by heat flow dQ across the boundary, at the boundary temperature ##T_B## (at which the heat flow occurs). The magnitude of this entropy change is equal to the integral of ##dQ/T_B##. This mechanism can be present in both reversible and irreversible processes.

2. Entropy generation within the system as a result of process irreversibility (which can result from viscous dissipation, chemical reaction, and diffusion just as easily as from conductive heat transfer). This mechanism is present only in an irreversible process, and the amount of entropy generated can only be positive.
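
A small numeric illustration of this bookkeeping (my own sketch; the mass, heat capacity and temperatures are assumed values): a body at 400 K is quenched in surroundings held at a boundary temperature T_B = 300 K. The body's entropy change splits into transfer across the boundary (the integral of dQ/T_B) plus a positive generated part:

[code]
import math

M, C = 1.0, 1000.0         # mass (kg) and heat capacity (J/(kg*K)), assumed
T0, T_B = 400.0, 300.0     # initial body temperature and boundary temperature, K

dS_body = M * C * math.log(T_B / T0)   # state-function change (negative here)
Q_in = -M * C * (T0 - T_B)             # heat flowing INTO the body (negative: it cools)
dS_transfer = Q_in / T_B               # entropy crossing the boundary at T_B
sigma = dS_body - dS_transfer          # generated inside the system; must be >= 0

print(f"dS_body     = {dS_body:.2f} J/K")      # -287.68 J/K
print(f"transferred = {dS_transfer:.2f} J/K")  # -333.33 J/K
print(f"generated   = {sigma:.2f} J/K")        # +45.65 J/K
[/code]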
 
  • #25
DrDu
Science Advisor
Chestermiller said:
I don't regard entropy and entropy change as directly related to heat at all. The entropy of a substance or system can change (increase) as a result of many mechanisms, only one of which is related to heat.
I don't doubt this, as I was talking about the founders of thermodynamics.
I just looked up the history in the wonderful book by Friedrich Hund, "Geschichte der physikalischen Begriffe" (History of Physical Concepts). It was Rudolf Clausius who coined the term "entropy" in 1865, after having called it "Verwandlungswert der Wärme" (heat conversion value) in 1862. The importance of entropy in chemical reactions was recognized around 1880 by Gibbs, van't Hoff and Helmholtz.
 
