Exploring Entropy: Understanding its Intuitive Meaning and Calculation

In summary: in this conversation, entropy is defined, it is shown that the total entropy always increases in a spontaneous process, and it is discussed how Cp/T varies across the different phases of a substance. Lastly, it is asked whether the entropy after two aluminum blocks reach thermal equilibrium will be S > 28.3 J/K.mol or 23 J/K.mol < S < 28.3 J/K.mol.
  • #1
PainterGuy
Hi,

Could you please help me to clarify a few points to understand entropy intuitively?

Entropy is defined as:
$$S = \int_0^T \frac{C_p}{T}\,dT$$


Please have a look at the attachment, "entropy111".
Source of attachment: http://faculty.chem.queensu.ca/people/faculty/mombourquette/chem221/4_secondthirdlaws/SecondLaw.asp

The attachment shows how entropy is calculated for a substance. Figure 1 shows the plot of Cp/T against T, and Figure 2 shows the plot of the integral ∫(Cp/T)dT from 0 K to T, i.e. ΔS, the entropy. I would say that the plots shown relate closely to those of many real substances.
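If it helps to see the recipe behind Figure 2 numerically, here is a minimal Python sketch of the same idea: integrate Cp/T over temperature and add a ΔH/T jump at each phase transition. The Cp(T) data and transition values below are made up purely for illustration; they are not taken from the attachment.

```python
import numpy as np

def absolute_entropy(T, Cp, transitions):
    """S(T_max) = integral of (Cp/T) dT from the lowest tabulated T up to T_max,
    plus dH/T at each phase transition (melting, boiling, ...).

    T, Cp       : arrays of temperature (K) and heat capacity (J/(mol.K))
    transitions : list of (T_transition in K, dH_transition in J/mol)
    """
    y = Cp / T
    S = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(T))   # trapezoidal rule for the integral of (Cp/T) dT
    S += sum(dH / Tt for Tt, dH in transitions)       # entropy jumps at Tf, Tb, ...
    return S

# Purely illustrative numbers (not a real substance):
T = np.linspace(5.0, 400.0, 1000)        # start near 0 K to avoid dividing by zero
Cp = 10.0 + 0.05 * T                      # pretend Cp(T), J/(mol.K)
S = absolute_entropy(T, Cp, transitions=[(200.0, 4.0e3), (350.0, 30.0e3)])
print(f"S(400 K) ≈ {S:.1f} J/(mol.K)")
```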

Question 1:
In Figure 1, the heat capacity Cp is lower at point 'A' than at point 'B', while the values of Cp at points 'B' and 'C' are almost equal.

i: Why does Cp become almost constant between 'B' and 'C'?
ii: Between Tf and Tb or between 'D' and 'E', Cp decreases. What could be the reason for this?
iii: Between Tb and T, Cp again decreases drastically. It simply means that less energy is required to raise the temperature by 1 K.

Question 2:
In Figure 2, the entropy is changing at a faster rate at point 'G' than at point 'H'. Do I have it correct?

Question 3:
I have tried to find the entropy of air at 25 °C and 1 atm without any success. The closest I could get was "Entropy of air at 0°C and 1 bar: 0.1100 kJ/mol K = 3.796 kJ/kg K". Could you please help me with it?

Thank you!
 

Attachments

  • entropy_121.jpg
  • entropy111.jpg
  • #2
Q1: What is the y-axis of Figure 1?
Q2: Yes
Q3: Can you find Cp of air? Does this help you?
 
  • #3
Thank you!

mjc123 said:
Q1: What is the y-axis of Figure 1?
It's Cp/T, where 'T' is a constant; therefore Cp should vary, i.e. it's a variable, as the plot shows.

Actually, in simple terms, my question was what factors could affect Cp in the different phases of a substance, as shown.

mjc123 said:
Q3: Can you find Cp of air? Does this help you?
Cp for air at 300 K is 1.005 kJ/kg.K.
Entropy of air at 0°C and 1 bar: 0.1100 kJ/mol K = 3.796 kJ/kg K. Source: https://www.engineeringtoolbox.com/air-properties-d_156.html

This question can be ignored for now; it wouldn't help me much. But please help me with the question below.

Question:
It is said that the total entropy always increases in a spontaneous process, and I wanted to know if my understanding is correct.

The standard entropy of aluminum is 28.3 J/K.mol at 25 °C and 1 atm; I found this value in a book. I don't know the entropy of aluminum at 0 °C and 1 atm, but I'm sure it must be less than 28.3 J/K.mol; let's assume it's 23 J/K.mol.

Suppose that we have two solid blocks of aluminum of the same size sitting separately in an isolated environment. One block is at a temperature of 25 °C and the other is at 0 °C. The two blocks are put together, the temperature equalizes between them, and it ends up at 0 °C < T < 25 °C. What about the entropy, S, after the temperatures equalize? Is it going to be S > 28.3 J/K.mol, or 23 J/K.mol < S < 28.3 J/K.mol?

Thanks a lot!
 
  • #4
Is T a constant? T is the x-axis variable, so it is varying.
Cp/T = dS/dT, so you integrate the top curve to get the bottom one (including ΔS for phase transitions).
For your latest question - why don't you work it out? You have the information (including the guess of 23 J/K.mol at 0°C) and the necessary equations. Assume Cp is constant over the temperature range. You can if necessary calculate Cp from the value of ΔS between 0 and 25°C. Or you could look up Cp and calculate the value for S at 0°C.
What is the equilibrium temperature reached by the two blocks? What is the increase in the entropy of the cold block, and the decrease in entropy of the hot block?
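Following that suggestion, here is a minimal Python sketch of the second route (look up Cp and compute S at 0 °C). It assumes a constant molar heat capacity of about 24.3 J/(mol.K) for aluminum, i.e. 0.900 J/(g.K) × ~27 g/mol; that value is an assumption, not something given above.

```python
import math

S_298 = 28.3        # J/(K.mol), standard entropy of Al at 25 °C (value quoted above)
Cp = 24.3           # J/(mol.K), assumed constant: 0.900 J/(g.K) * ~27 g/mol

T_hot, T_cold = 298.15, 273.15   # K (25 °C and 0 °C)

# For constant Cp:  S(T2) - S(T1) = integral of (Cp/T) dT = Cp * ln(T2/T1)
S_273 = S_298 - Cp * math.log(T_hot / T_cold)
print(f"S(Al, 0 °C) ≈ {S_273:.1f} J/(K.mol)")      # ≈ 26.2, somewhat above the guessed 23

# Equal masses and constant Cp => the blocks meet halfway between 0 °C and 25 °C
T_eq = 0.5 * (T_hot + T_cold)
print(f"Equilibrium temperature ≈ {T_eq - 273.15:.2f} °C")   # 12.50 °C
```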
 
  • #5
PainterGuy said:
Suppose that we have two solid blocks of aluminum of the same size sitting separately in an isolated environment. One block is at a temperature of 25 °C and the other is at 0 °C. The two blocks are put together, the temperature equalizes between them, and it ends up at 0 °C < T < 25 °C. What about the entropy, S, after the temperatures equalize? Is it going to be S > 28.3 J/K.mol, or 23 J/K.mol < S < 28.3 J/K.mol?

Thanks a lot!
It is going to be option B.
 
  • #7
Thank you!

mjc123 said:
Is T a constant? T is the x-axis variable, so it is varying.

I had always thought that 'T' is the temperature at which the entropy is being found, and therefore a constant value. If we assume that entropy is actually found the way shown in the plot, then calculating the entropy at 25 °C would make T = 25 °C. So please correct me. For example, I had always thought that 'T' in the highlighted part of attachment 'entropy141' represents the temperature at which the heat transfer process starts. (For better resolution, please see this link: https://imageshack.com/a/img923/665/Wse71G.jpg). But obviously, as the transfer of heat takes place, with every quantity of heat dQ there would be a corresponding rise or fall dT in temperature.

Source for attachment: https://en.wikipedia.org/wiki/History_of_entropy

For my other question, I tried to work it out.

Specific heat capacity of aluminum at 25 °C: 0.900 J/g⋅°C
Mass of each block: 27 grams
mass in grams = number of moles × molar mass; the atomic mass of aluminum is 27, so 1 mol × 27 g/mol = 27 g
Q = Cp⋅ΔT

[Attachment entropy777: my attempted calculation of Q and ΔS]


I think that ΔS part is wrong because I didn't really know what I was doing!

Chestermiller said:
It is going to be option B.

This one "23 J/K.mol < S < 28.3 J/K.mol"?

Thanks a lot for your help.
 

Attachments

  • entropy141.jpg
  • entropy777.jpg
  • #8
Nothing changes from 25 to 0°C. One block changes from 25 to 12.5°, and the other from 0 to 12.5°. You have to calculate the change of entropy for each separately.
 
  • #9
If the temperature is changing during the transition from the initial state to the final state, then you don't use the initial temperature; you integrate over the variable temperature change. So please see my tutorial again for the correct way to do this.
 
  • #10
Thank you!

mjc123 said:
Nothing changes from 25 to 0°C. One block changes from 25 to 12.5°, and the other from 0 to 12.5°. You have to calculate the change of entropy for each separately.

Block 1 changes in temperature from 25 °C to 12.5 °C.
Block 2 changes in temperature from 0 °C to 12.5 °C.

Do I have it correct?
[Attachment entropy321: my attempted entropy calculation using Q(1/T2 − 1/T1)]


Chestermiller said:
If the temperature is changing during the transition from the initial state to the final state, then you don't use the initial temperature; you integrate over the variable temperature change. So please see my tutorial again for the correct way to do this.

Sorry, I had missed your tutorial post. I'll have a look at it.

You earlier said "option B". Which one was it, "23 J/K.mol < S < 28.3 J/K.mol"? Thanks.
 

Attachments

  • entropy321.jpg
  • #11
T1 and T2 are wrong for the cold block. In any case your equation is the wrong one. Q(1/T2 - 1/T1) applies to transferring an amount of heat Q from a body at a constant temperature of T1 to one at a constant temperature T2. In your example the temperatures of the blocks are changing. Therefore you need to integrate the equation dS = (Cp/T)dT just as in the very first equation you posted (except from T1 to T2, not 0 to T).
 
  • #12
PainterGuy said:
Thank you!
Block 1 changes in temperature from 25 °C to 12.5 °C.
Block 2 changes in temperature from 0 °C to 12.5 °C.

Do I have it correct?
No. The correct exact result is $$\Delta S_1=(27)(0.9)\ln{(285.65/298.15)}=-1.0408$$
$$\Delta S_2=(27)(0.9)\ln{(285.65/273.15)}=+1.0873$$
So the combined change in entropy is 0.0466 J/K.

The average temperatures for the hot and cold blocks would have been 18.75 °C and 6.25 °C. If you had approximated the entropy changes using these average temperatures, you would have obtained: $$\Delta S_1=-\frac{303.75}{291.9}=-1.0406$$ $$\Delta S_2=+\frac{303.75}{279.4}=+1.0872$$ So, with this approximation, the combined entropy change would be 0.0466 J/K. In this case, the approximation using the average temperature comes out very close to the exact result.
PainterGuy said:
You earlier said "option B". Which one was it, "23 J/K.mol < S < 28.3 J/K.mol"? Thanks.
Yes.
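For anyone who wants to check these numbers, here is a short Python sketch of the same arithmetic. It assumes, as above, that each block is one mole of aluminum with m·c = 27 g × 0.900 J/(g.K) = 24.3 J/K, and it compares the exact ΔS = m·c·ln(T_final/T_initial) with the average-temperature approximation ΔS ≈ ±Q/T_avg.

```python
import math

m, c = 27.0, 0.900                              # g and J/(g.K); m*c = 24.3 J/K per block
T_hot, T_cold, T_eq = 298.15, 273.15, 285.65    # K (25 °C, 0 °C, 12.5 °C)

# Exact: dS = (m*c/T) dT  =>  delta_S = m*c*ln(T_final/T_initial)
dS_hot  = m * c * math.log(T_eq / T_hot)        # ≈ -1.0408 J/K
dS_cold = m * c * math.log(T_eq / T_cold)       # ≈ +1.0873 J/K
print(dS_hot, dS_cold, dS_hot + dS_cold)        # total ≈ +0.0466 J/K

# Average-temperature approximation: delta_S ≈ ±Q/T_avg, with Q = m*c*dT = 303.75 J
Q = m * c * 12.5
print(-Q / 291.9, Q / 279.4, Q / 279.4 - Q / 291.9)   # ≈ -1.0406, +1.0872, +0.0466 J/K
```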
 
  • #13
Thank you so much, both of you!

mjc123 said:
T1 and T2 are wrong for the cold block.

I corrected it.
[Attachment ent222: the corrected values of T1 and T2]


Anyway, the correct calculation is shown below.
[Attachment ent333: the corrected entropy calculation for the two blocks]


mjc123 said:
Q(1/T2 - 1/T1) applies to transferring an amount of heat Q from a body at a constant temperature of T1 to one at a constant temperature T2.

I understand that it would be an isothermal process, but I can't think of any particular one. Could you please give an example of one?

@Chestermiller, thanks a lot for showing your calculation; the one with average temperature was helpful to see it differently.

Also, I found the entropy values at 0 °C and 12.5 °C.

@0 °C
[Attachment ent273_15: entropy of aluminum at 0 °C]


@12.5 °C
[Attachment ent285_65: entropy of aluminum at 12.5 °C]


This shows how entropy values at three different temperatures relate to each other.
[Attachment ent666: comparison of the entropy values at the three temperatures and the net entropy change]


The net entropy calculation confirms that the net entropy increases in every spontaneous process.

Important:
If entropy is given in J/K.mol, make sure you use the molar heat capacity. For conversion help, see here: https://sciencing.com/calculate-molar-heat-capacity-6184868.html
Also, Cp changes with temperature, so for exact calculations a constant value shouldn't be used.
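The conversion in that note is just a multiplication by the molar mass; a one-step sketch (the 26.98 g/mol molar mass of aluminum is a standard table value, not something quoted above):

```python
specific_heat = 0.900        # J/(g.K), aluminum near room temperature
molar_mass = 26.98           # g/mol, aluminum (standard table value)

molar_heat_capacity = specific_heat * molar_mass
print(f"Cp ≈ {molar_heat_capacity:.1f} J/(mol.K)")   # ≈ 24.3 J/(mol.K)
```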

Thanks.
 

Attachments

  • ent222.jpg
  • ent333.jpg
  • ent273_15.jpg
  • ent285_65.jpg
  • ent666.jpg
  • #14
PainterGuy said:
I understand that it would be an isothermic process but I can't think of any particular one. Could you please give an example of one?
What you are trying to do here is determine the change in entropy of the hot block from 25 C to 12.5 C. To make this calculation, you must devise a reversible path between the initial state and the final state, and calculate the integral of dq/T along that path. This can't be done with just a single constant temperature reservoir. You need to use a sequence of reservoirs at temperatures running from 25 C to 12.5 C. Then you transfer a tiny amount of the heat from the block to each of the reservoirs. As the number of reservoirs becomes infinite, you approach the integral of ##C_pdT/T##.

You do a similar thing for the cold block, with reservoirs running from 0 °C to 12.5 °C.
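If it helps, here is a small Python sketch of that limiting argument for the hot block: replace the single final reservoir by N evenly spaced reservoirs between 25 °C and 12.5 °C, hand each one a small amount of heat, and sum δq/T. As N grows, the reservoirs' entropy gain approaches Cp·ln(298.15/285.65), the magnitude of the block's entropy change. The constant Cp = 24.3 J/K for one mole of aluminum is the same assumption used earlier in the thread.

```python
import math

def reservoir_sequence(Cp, T_start, T_end, N):
    """Entropy gained by N reservoirs spaced evenly between T_start and T_end.

    At each step the block temperature changes by a small dT while in contact with
    the reservoir at that step's temperature; the reservoir receives dq = -Cp*dT.
    """
    dT = (T_end - T_start) / N          # negative here (the block is cooling)
    total = 0.0
    for k in range(1, N + 1):
        T_res = T_start + k * dT        # temperature of the k-th reservoir
        total += -Cp * dT / T_res       # dq/T received by that reservoir
    return total

Cp = 24.3                                # J/K, one mole of Al, assumed constant
exact = Cp * math.log(298.15 / 285.65)   # ≈ 1.0408 J/K
for N in (1, 10, 100, 10_000):
    print(N, round(reservoir_sequence(Cp, 298.15, 285.65, N), 5))
print("limit:", round(exact, 5))         # the sum converges to Cp*ln(T1/T2)
```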
 
  • #15
Thank you!

By the way, in this attachment, ent666, from my previous post, there should have been a "-" sign in front of "1.041".

Question 1:
I understand that the laws of thermodynamics were originally discovered in more of an 'empirical' sense, i.e. based on a phenomenological model. I wonder how they came to the conclusion that the net entropy change would always be positive for a spontaneous process. Did they use it as a kind of postulate or axiom derived from experimental results?

Question 2:
I'm going to state my confusion using a few examples. I don't see how entropy plays such an important role when most phenomena can be explained independently of entropy. The inclusion of entropy in the explanation seems kind of superfluous. Perhaps it's needed to make the mathematical model more formal and systematic.

It is said that although the melting of ice is an endothermic process, it still happens because it results in an increase of entropy. But would this melting happen if the temperature of the surroundings were 0 °C or less? No. Melting requires energy, which is provided by the surroundings at any temperature above 0 °C. We don't need entropy talk to explain this.

As another example, it is said that the ammonia reaction is spontaneous at 25 °C. The enthalpy change of the reaction is determined experimentally, and the reaction is found to be exothermic because the enthalpy change is negative at 25 °C. At some temperature lower than 0 °C, I'm sure the enthalpy change would be positive and the reaction wouldn't be exothermic. We don't need entropy to explain the synthesis of ammonia.

N2 (g) + 3H2 (g) → 2NH3 (g), ΔH°=-92.6 kJ

Could you please help me to understand why entropy is important, possibly using some example(s)? Thanks a lot.
 
  • #16
PainterGuy said:

Question 1:
I understand that the laws of thermodynamics were originally discovered in more of an 'empirical' sense, i.e. based on a phenomenological model. I wonder how they came to the conclusion that the net entropy change would always be positive for a spontaneous process. Did they use it as a kind of postulate or axiom derived from experimental results?
That's my understanding.
Question 2:

Could you please help me to understand why entropy is important, possibly using some example(s)? Thanks a lot.
Entropy is very important in chemical thermodynamics and in mechanical engineering thermodynamics of power cycles and refrigeration cycles. In the latter, many process steps in a cycle take place nearly adiabatically and reversibly (i.e., at nearly constant entropy). If we are using thermodynamic tables (e.g., steam tables) to quantify the behavior of the working fluid in these cycles, we need to have entropy of the substance included in these tables (to establish the final states of these process steps). You are aware that entropy is a physical property of the substance being processed, correct?

In chemical thermodynamics, entropy is extremely important. It helps us quantify multiphase and/or multicomponent physical equilibrium and chemical equilibrium. Virtually all processes in industry involve these. An example is determining the equilibrium between a multicomponent solution and its vapor in terms of the concentrations of the various species in both phases. Without this, we could not design distillation columns, absorbers, flashers, etc. Also, crystallizers could not be quantified in terms of concentration of species in both phases. For the reaction you cited, entropy comes into play in predicting the equilibrium constant for the reaction and the effect of temperature and pressure on the equilibrium. Without entropy, we would have to measure all that for every reaction of interest. With it, that is not necessary if we know (from other measurements) the "entropy of formation" of the various species involved in a reaction and their heat capacities. So, in quantifying chemical reaction equilibrium, entropy plays a major role.
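To make that last point about the cited reaction concrete, here is a rough Python sketch of how entropy enters the equilibrium constant via ΔG° = ΔH° − TΔS° and K = exp(−ΔG°/RT). The ΔH° = −92.6 kJ is the value quoted earlier in the thread; the ΔS° ≈ −199 J/K is an approximate literature value I am assuming here, and both are treated as roughly temperature independent.

```python
import math

R = 8.314          # J/(mol.K), gas constant
dH = -92.6e3       # J, from the thread: N2 + 3H2 -> 2NH3
dS = -199.0        # J/K, approximate literature value (assumed, not from the thread)

def equilibrium_constant(T):
    dG = dH - T * dS                   # dG = dH - T*dS
    return math.exp(-dG / (R * T))     # K = exp(-dG/(R*T))

for T in (298.15, 500.0, 800.0):
    print(f"T = {T:6.1f} K   K ≈ {equilibrium_constant(T):.2e}")
# K is huge near room temperature but falls off quickly as T rises, because the
# -T*dS term grows; predicting that temperature dependence is exactly the kind
# of thing entropy lets you do without measuring the equilibrium directly.
```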

All you have learned so far about entropy is only the tip of the iceberg.
 
  • #17
Thank you!

Chestermiller said:
Entropy is very important in chemical thermodynamics and in mechanical engineering thermodynamics of power cycles and refrigeration cycles. [...] All you have learned so far about entropy is only the tip of the iceberg.

I understand that it must be really important, and thanks for making it clearer. The point I was trying to make with those examples was that the 'standard' examples used almost everywhere to introduce the concept of entropy are not really good ones, because they make the entropy concept seem a little superfluous, in my opinion at least; the reason being that all those phenomena could easily be explained independently of entropy. Do you know of any other example(s) which really emphasize the importance of entropy by showing how the explanation of a phenomenon becomes somewhat difficult and clumsy without the use of entropy? Thanks for your help!
 
  • #18
PainterGuy said:
Do you know of any other example(s) which really emphasize the importance of entropy by showing how the explanation of a phenomenon becomes somewhat difficult and clumsy without the use of entropy?
I'm only able to cite examples from my area of expertise. In my area, it would have been very difficult to establish the importance of entropy to chemical equilibrium thermodynamics before introducing its more elementary derivation. It wasn't until we learned about multicomponent phase and chemical equilibrium that its real power revealed itself to me.
 

1. What is entropy and why is it important?

Entropy is a measure of the disorder or randomness in a system. It is important because it helps us understand the behavior of physical and chemical systems and allows us to predict how they will change over time.

2. How is entropy calculated?

Statistically, entropy is calculated using the equation S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of microstates or possible arrangements of a system. Thermodynamically, entropy changes are calculated from measured heat capacities and heats of transition, e.g. ΔS = ∫(Cp/T)dT, as in the discussion above.
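As a tiny numerical illustration of S = k ln W, take a system whose microstates are easy to count: N independent two-state "coins", for which W = 2^N (this example is an addition for illustration, not from the thread).

```python
import math

k_B = 1.380649e-23          # J/K, Boltzmann constant

N = 6.022e23                # roughly one mole of two-state "coins"
lnW = N * math.log(2)       # W = 2^N is far too large to store, so work with ln W
S = k_B * lnW               # S = k ln W
print(f"S ≈ {S:.2f} J/K")   # ≈ 5.76 J/K, i.e. R*ln(2)
```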

3. What factors affect entropy?

Entropy is affected by the number of particles in a system, the volume of the system, the temperature, and the phase of the system. Increasing the number of particles, the volume, or the temperature generally increases the entropy, and entropy also increases on going from the solid to the liquid to the gas phase.

4. Can entropy decrease?

In an isolated system, the total entropy will always remain the same or increase. However, it is possible for the entropy of one part of the system to decrease as long as there is a corresponding increase in entropy elsewhere in the system.

5. How does entropy relate to the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of an isolated system never decreases over time. As energy is transferred and transformed within a system, some of it is inevitably dissipated as heat, increasing the disorder and entropy of the system.
