Understanding the Units of Entropy in Thermodynamics

  • Thread starter nobahar
In summary, the units of entropy, kJ mol⁻¹ K⁻¹, derive from its thermodynamic definition as the change in heat divided by temperature. In real processes, the total change in entropy is always greater than zero. The number of microstates available to a system is a function of its temperature and volume, so a loss of energy decreases the entropy of a system unless it occurs through a reversible adiabatic expansion. When entropy is viewed as a count of microstates with the same thermodynamic properties, its units are largely a matter of convenience.
  • #1
nobahar
Hello!
I have a question on entropy.
Having read other posts, it can be somewhat difficult to grasp conceptually. I thought I understood the concept, until I tried to understand Gibbs Free Energy, and also when I considered the units for entropy.
Firstly, why are the units of entropy kJ mol⁻¹ K⁻¹? If entropy is a measure of the number of microstates open to a particular system, then I do not understand the units. I thought perhaps they related indirectly to this concept. I read that a system requires a certain amount of energy for rotational/translational/chemical/etc. energy. So when defining a system, it is attributed certain characteristics, including the kinds of configurations open to it in terms of rotational/kinetic/chemical/etc. energy and the number of molecules that can be in these various configurations BASED ON THE AMOUNT OF ENERGY IN THE SYSTEM. If, say, there wasn't sufficient energy, not all the molecules could travel with a velocity V1. So, when a system is attributed a certain number of microstates, it has to have a minimum amount of energy to be able to satisfy the POTENTIAL to be in any one of those microstates. Is this where the kJ in the units comes from? Put a little more succinctly, does the entropy state variable identify the amount of energy required to have a certain entropy (i.e. access to a certain number of microstates)?
I apologise if that was long and convoluted!
I considered posting the second part as another question, but I think it may perhaps relate to the above.
Gibbs free energy is derived from the relation:
[tex]\Delta S_{Univ} = \Delta S_{Sys} + \Delta S_{Surr}[/tex]
If this is done at constant temperature and pressure, then:
[tex]\Delta S_{Surr} = \frac{-q}{T_{Surr}}[/tex]
From this, I would conclude:
[tex]\Delta S_{Sys} = \frac{q}{T_{Sys}}[/tex]
Is this correct?
If the T of the system = T of the surroundings, then the entropy change of the universe is zero.
When an exothermic chemical reaction takes place, heat is generated in the system, and all of this heat is then transferred to the surroundings, which act like a heat reservoir in an isothermal process. In that case all the heat liberated in the reaction is transferred away from the system. Is this correct?
I apologise for this being quite long. From reading other threads I have noticed that entropy troubles many people!
Many thanks,
Nobahar.
 
  • #2
The units of entropy derive from its thermodynamic definition: dS = dQ/T (T being the temperature in kelvins). Since Q has dimensions of energy, entropy has dimensions of energy/temperature, i.e. J/K (or kJ mol⁻¹ K⁻¹ when expressed per mole).

The change in entropy of the universe in a process is zero only if it is reversible. The change in entropy of the surroundings + change in entropy of the system = change in entropy of the universe and cannot be < 0.

To determine the change in entropy of the surroundings you take the beginning and end states of the surroundings and calculate [itex]\int dQ/T[/itex] over the reversible path between those two states (of the surroundings). To determine the change in entropy of the system you take the beginning and end states of the system and calculate [itex]\int dQ/T[/itex] over the reversible path between those two states.

In real processes, the beginning and end states are such that the total change in entropy is always > 0.
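To make the definition dS = dQ/T concrete, here is a minimal numerical sketch (all numbers are illustrative, not from the thread) of [itex]\int dQ/T[/itex] along a reversible constant-volume heating path, where dQ = nC_V dT:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def delta_S_heating(n, Cv, T1, T2):
    # Reversible constant-volume heating: dQ = n*Cv*dT, so
    # dS = dQ/T integrates to n*Cv*ln(T2/T1).  Units come out in J/K.
    return n * Cv * math.log(T2 / T1)

# One mole of a hypothetical monatomic ideal gas, Cv = 3R/2,
# heated from 300 K to 600 K:
dS = delta_S_heating(1.0, 1.5 * R, 300.0, 600.0)  # ≈ 8.64 J/K
```

Since Q carries the energy units and T the kelvins, the result naturally has units of J/K, which is exactly the point being made above.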

AM
 
  • #3
Thanks Andrew.
May I ask what is probably a silly question? If the change in entropy is equal to Q/T, how is it possible for a system to give up energy and yet increase in entropy? Surely that extra energy permits the system the potential to occupy more microstates?
 
  • #4
nobahar said:
Thanks Andrew.
May I ask what is probably a silly question? If the change in entropy is equal to Q/T, how is it possible for a system to give up energy and yet increase in entropy? Surely that extra energy permits the system the potential to occupy more microstates?
A system can give up energy by doing work or by heat flowing out of it. If it does work by undergoing a reversible adiabatic expansion, there is no change in entropy. If there is a negative heat flow, i.e. heat flow out of the system, the entropy decreases. That is just by definition: dS = dQ/T < 0 if dQ < 0.

Now, if you want to talk about the number of microstates that are available, which is a more complicated way to think of entropy and usually not a very illuminating one, you have to think of all the possible states that all the molecules in the system could have which are thermodynamically equivalent. This is a function of the temperature of the system and the volume that it occupies. The negative heat flow will decrease the range of kinetic energies that the molecules would otherwise have (i.e. a decrease in temperature). A decrease in volume reduces the number of different positions the molecules of the system can otherwise have. So the number of potential, thermodynamically equivalent microstates decreases.

So in answer to your question, the loss of energy will decrease the entropy of the system, unless it is a reversible adiabatic loss of energy (reversible adiabatic expansion) in which case there is no change in entropy.

AM
 
  • #5
Andrew Mason said:
That is just by definition: dS = dQ/T < 0 if dQ < 0.

Just to be more exact, [itex]dS=dQ_{rev}/T[/itex], where the subscript "rev" refers to a reversible path. Q, the heat transferred, is not a state function, while entropy is. (You can bring a system from state A to state B by different processes for which Q, and even the integral of dQ/T, are different, but the entropy change between the two states is always the same.) It means that to calculate the entropy change between two states, you need to calculate the heat transferred during a reversible process, and use that heat in the integral.
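The state-function point can be illustrated with a quick sketch (an assumed ideal-gas example, not from the thread): a reversible isothermal expansion and a free expansion connect the same two states with very different Q, yet the same ΔS:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def dS_isothermal(n, V1, V2):
    # Entropy change of an ideal gas between states (T, V1) and (T, V2),
    # computed along the reversible isothermal path, where Q_rev = nRT ln(V2/V1)
    # and therefore dS = Q_rev/T = nR ln(V2/V1).
    return n * R * math.log(V2 / V1)

n, T, V1, V2 = 1.0, 300.0, 1.0, 2.0
Q_rev = n * R * T * math.log(V2 / V1)  # heat absorbed along the reversible path
Q_free = 0.0                           # free expansion into vacuum: Q = 0, W = 0
# Both processes connect the same two states, so the entropy change is identical:
dS = dS_isothermal(n, V1, V2)          # ≈ 5.76 J/K for either process
```

The free expansion transfers no heat at all, yet ΔS is still nR ln(V2/V1), because ΔS must be evaluated along the reversible path between the same endpoints.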

If you take the view that entropy is the number of microstates giving the same thermodynamic properties, then the units are in some sense a matter of convenience. The definition of entropy in this case is:
[tex]S=k\log{\Omega}[/tex]
where [itex]\Omega[/itex] is the number of microstates in the so-called microcanonical ensemble (the set of states giving the same total energy and volume). The factor k ("Boltzmann's constant") has units of J/K, since in statistical thermodynamics you don't usually deal in moles; to get the usual molar units you would use R, the gas constant from PV=nRT.
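As a sketch of that definition (illustrative only): Ω is usually far too large to compute directly, so one works with ln Ω. For a hypothetical mole of independent two-state particles, Ω = 2^N_A, giving S = R ln 2 per mole:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, mol^-1
R = k_B * N_A        # gas constant, J mol^-1 K^-1

def boltzmann_entropy(omega):
    # S = k ln(Omega), where Omega counts the thermodynamically
    # equivalent microstates (only practical for small Omega).
    return k_B * math.log(omega)

# A single microstate means zero entropy:
S_single = boltzmann_entropy(1)  # 0.0 J/K

# For a mole of two-state particles, Omega = 2^N_A is far too large to
# evaluate directly, so use ln(Omega) = N_A * ln 2:
S_per_mole = k_B * N_A * math.log(2.0)  # = R ln 2 ≈ 5.76 J mol^-1 K^-1
```

Note how multiplying by k_B N_A = R converts the dimensionless ln Ω into the familiar molar units, which is exactly the sense in which the units are "a matter of convenience".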

If you got rid of the factor of "k" in the above definition, entropy would retain many of its important properties, like the fact that the entropy of the universe never decreases. The factor of "k" is still useful, though, because it connects back to the classical thermodynamic definition, and you can talk about the conditions for a system at constant pressure and temperature, where the Gibbs free energy, G = H - TS = U + PV - TS, can never increase. So from a statistical point of view, the units don't obviously need to be there; they just arise from a desire to compare the effects of entropy and energy in a convenient form.
 
  • #6
LeonhardEuler said:
...and you can talk about the conditions for a system at constant pressure and temperature, where the Gibbs free energy, G = H - TS = U + PV - TS, can never increase.

Thank you for the responses, Andrew and Euler. Using the above explanations, I am trying to understand Gibbs energy, which, if I have understood correctly, appears simply to be a way of identifying what the change in the entropy of the universe would be if the reaction were to take place (with the second law dictating whether or not the reaction could occur).
With a reaction A -> B, the entropies of A and B both depend on the temperature of the system, and I assume that they do not necessarily change at the same rate. (If, say, S(A) = 2T+5 and S(B) = 2T, the difference between them would always be the same regardless of temperature, but in general this need not hold.) If that is so, then how can the standard entropies be used to compute the change in Gibbs energy? The T(Sfinal - Sinitial) component makes no sense, as the entropy values of the reactants and products would be different at different temperatures, and therefore the change would be different.
Initially, I came to the conclusion that the change in entropy for a process A -> B would have a specific value, regardless of temperature. If the reaction took place at different temperatures, then different amounts of energy would be liberated or taken in to achieve that specific entropy change. For example, at high temperatures, a given amount of energy added to a system induces a smaller change in entropy than the same amount of energy added at lower temperatures. If a reaction requires a certain entropy change to occur, the amount of energy that must be liberated or taken in for that entropy change will depend on the temperature of the system. This was my reasoning, and so I tried to convince myself that the use of standard entropies could be explained by it. To clarify: a reaction A -> B under standard conditions has a certain entropy change; a certain amount of energy is taken in or liberated at standard temperature to achieve this; and since ΔS is q/T, it is the energy in or out for each kelvin. The use of T(Sfinal - Sinitial) can then be interpreted as the T 'scaling up': if the system were at temperature T, how much energy would be required to achieve the same entropy change? Division of the resulting number by the same T would give the same energy per kelvin as is necessary under standard-state conditions.
Any further help appreciated,
Nobahar.
 
  • #7
nobahar said:
With a reaction A -> B, the entropies of A and B both depend on the temperature of the system, and I assume that they do not necessarily change at the same rate. (If, say, S(A) = 2T+5 and S(B) = 2T, the difference between them would always be the same regardless of temperature, but in general this need not hold.) If that is so, then how can the standard entropies be used to compute the change in Gibbs energy?
Standard entropy per mole for a reaction is the change in entropy per mole at standard conditions (standard temperature and pressure).

AM
 
  • #8
Thanks for the response, Andrew. This is where my confusion arises. How can the standard entropies be used in the Gibbs energy equation when the experiment is conducted at a different temperature?
 
  • #9
nobahar said:
... Gibbs energy, which, if I have understood correctly, appears simply to be a way of identifying what the change in the entropy of the universe would be if the reaction were to take place (with the second law dictating whether or not the reaction could occur).
You might have this right, but the wording is a little unclear. The second law of thermodynamics requires entropy never to decrease for an isolated system. In such a system, total energy and volume are constant. It is often desirable to study systems at constant temperature and pressure, which will not be isolated from the environment, because changes in volume and the release or absorption of heat require interactions with the environment to keep T and P fixed during a reaction. Gibbs free energy is a way of treating those interactions implicitly, so that the entropy change of the universe as a result of the process is taken into account by looking at the properties of the system only. It must never increase for a system at constant T and P.
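A small numerical sketch of that bookkeeping (all figures below are made up): at constant T and P, ΔS_universe = -ΔG/T, so the sign of ΔG for the system alone decides whether the process can occur:

```python
# At constant T and P, dS_universe = -dG/T, so the sign of dG for the
# system alone decides whether the process can occur spontaneously.
dH = -92.0e3  # enthalpy change of a hypothetical exothermic reaction, J/mol
dS = -199.0   # entropy change of the system (hypothetical), J mol^-1 K^-1
T = 298.15    # temperature, K

dG = dH - T * dS        # Gibbs free energy change, J/mol
dS_universe = -dG / T   # entropy change of the universe, J mol^-1 K^-1
spontaneous = dG < 0    # equivalent to dS_universe > 0
```

Here the system's entropy actually decreases, but enough heat is released to the surroundings that the universe's entropy still rises, which is precisely what the negative ΔG records.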

nobahar said:
With a reaction A -> B, the entropies of A and B both depend on the temperature of the system, and I assume that they do not necessarily change at the same rate. (If, say, S(A) = 2T+5 and S(B) = 2T, the difference between them would always be the same regardless of temperature, but in general this need not hold.) If that is so, then how can the standard entropies be used to compute the change in Gibbs energy? The T(Sfinal - Sinitial) component makes no sense, as the entropy values of the reactants and products would be different at different temperatures, and therefore the change would be different.
Initially, I came to the conclusion that the change in entropy for a process A -> B would have a specific value, regardless of temperature. If the reaction took place at different temperatures, then different amounts of energy would be liberated or taken in to achieve that specific entropy change. For example, at high temperatures, a given amount of energy added to a system induces a smaller change in entropy than the same amount of energy added at lower temperatures. If a reaction requires a certain entropy change to occur, the amount of energy that must be liberated or taken in for that entropy change will depend on the temperature of the system. This was my reasoning, and so I tried to convince myself that the use of standard entropies could be explained by it. To clarify: a reaction A -> B under standard conditions has a certain entropy change; a certain amount of energy is taken in or liberated at standard temperature to achieve this; and since ΔS is q/T, it is the energy in or out for each kelvin. The use of T(Sfinal - Sinitial) can then be interpreted as the T 'scaling up': if the system were at temperature T, how much energy would be required to achieve the same entropy change? Division of the resulting number by the same T would give the same energy per kelvin as is necessary under standard-state conditions.
Any further help appreciated,
Nobahar.

As I see it, you are curious about how the entropy of the reactants in a reaction as a function of temperature is related to the entropy of the products as a function of T, and how the [itex]\Delta S[/itex] for the reaction changes, so that everything is consistent and the entropy change does not depend on the path.

There is a lot right about the reasoning you are doing. If you consider that there must not be a contradiction here, you can derive a relation between the entropy change of a reaction as a function of T and the entropy changes of the reactants and products as a function of T. Take an especially simple case: The reactants and products are ideal gases, A,B and C, and we have
[tex]A+B\rightarrow C[/tex]
Since they are ideal gases, we know that for each species, the entropy change as a function of temperature for species "i" is:
[tex]\Delta S_{i}=n_{i}C_{V,i}\ln{\frac{T_2}{T_1}}[/tex]
Suppose we carry out a reaction at [itex]T_R[/itex]. The entropy change for the reaction at this temperature, which we will call [itex]\Delta S_{R}(T_{R})[/itex], can also be calculated by supposing the reactants are first brought to the standard temperature, [itex]T^{\circ}[/itex], then the reaction proceeds at constant T to completion, and finally the product is brought back to the original temperature. The entropy changes for these three steps are:
Step 1:
[tex]\Delta S_{1}= n_{A}C_{V,A}\ln{\frac{T^{\circ}}{T_R}} + n_{B}C_{V,B}\ln{\frac{T^{\circ}}{T_R}} = (n_{A}C_{V,A}+n_{B}C_{V,B})\ln{\frac{T^{\circ}}{T_R}}[/tex]
Step 2:
[tex]\Delta S_{2}=\Delta S_{R}(T^{\circ})[/tex]
Step 3:
[tex]\Delta S_{3}=n_{C}C_{V,C}\ln{\frac{T_R}{T^{\circ}}}[/tex]
We know this needs to give the same answer as the direct route, [itex]\Delta S_{R}(T_{R})[/itex], so adding all the steps together and equating gives:
[tex]\Delta S_{R}(T_{R}) = \Delta S_{R}(T^{\circ}) + (n_{C}C_{V,C} - n_{A}C_{V,A}-n_{B}C_{V,B})\ln{\frac{T_R}{T^{\circ}}}[/tex]
This is the relation that must hold in this case to avoid the problems you were talking about.
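The three-step argument can be checked numerically. Here is a sketch with made-up heat capacities and a made-up standard-state ΔS, verifying that the stepwise path reproduces the closed-form relation:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical moles and heat capacities (J mol^-1 K^-1) for A + B -> C:
n_A, n_B, n_C = 1.0, 1.0, 1.0
Cv_A, Cv_B, Cv_C = 1.5 * R, 2.5 * R, 3.0 * R
dS_std = -50.0            # assumed entropy change of the reaction at T° (J/K)
T_std, T_R = 298.15, 400.0

# Three-step path: bring reactants to T°, react at T°, return product to T_R.
step1 = (n_A * Cv_A + n_B * Cv_B) * math.log(T_std / T_R)
step2 = dS_std
step3 = n_C * Cv_C * math.log(T_R / T_std)
dS_path = step1 + step2 + step3

# The closed-form relation must give the same answer as the stepwise path:
dS_relation = dS_std + (n_C * Cv_C - n_A * Cv_A - n_B * Cv_B) * math.log(T_R / T_std)
```

Because entropy is a state function, the two routes between the same endpoints must agree for any choice of temperatures and heat capacities.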
 
  • #10
LeonhardEuler said:
[tex]\Delta S_{R}(T_{R}) = \Delta S_{R}(T^{\circ}) + (n_{C}C_{V,C} - n_{A}C_{V,A}-n_{B}C_{V,B})\ln{\frac{T_R}{T^{\circ}}}[/tex]
This is the relation that must hold in this case to avoid the problems you were talking about.

Thank you for the response, and apologies for my delay in replying - I have had exams recently.

If this relation is to hold, then it must also be possible to extend it:
[tex]\Delta S_{R}(T_{R}) = \Delta S_{R}(T^{\circ}) + (n_{C}C_{V,C} - n_{A}C_{V,A}-n_{B}C_{V,B})\ln{\frac{T_R}{T^{\circ}}} = T\Delta S^{\circ}[/tex]
Is this so? As I thought this term in the Gibb's free energy was to 'correct' for temperature.
Apologies if I am completely mistaken, which I suspect is the case.
Many thanks.
 
  • #11
nobahar said:
Thank you for the response, and apologies for my delay in replying - I have had exams recently.

If this relation is to hold, then it must also be possible to extend it:
[tex]\Delta S_{R}(T_{R}) = \Delta S_{R}(T^{\circ}) + (n_{C}C_{V,C} - n_{A}C_{V,A}-n_{B}C_{V,B})\ln{\frac{T_R}{T^{\circ}}} = T\Delta S^{\circ}[/tex]
Is this so? As I thought this term in the Gibb's free energy was to 'correct' for temperature.
Apologies if I am completely mistaken, which I suspect is the case.
Many thanks.

To be clear, when I wrote
[tex]\Delta S_{R}(T_{R})[/tex]
I meant to refer to delta S of the reaction as a function of T_R, the temperature of the reaction, not delta S of the reaction times the temperature of the reaction. As I meant it, we couldn't have
[tex]\Delta S_{R}(T_{R}) = T\Delta S^{\circ}[/tex]
because that doesn't work dimensionally. Hopefully you can reread my post and see what I meant to say. Feel free to ask if anything is unclear.
 
  • #12
Thanks for all the help so far.
I was aware that it referred to S as a function of T; I was trying to reconcile the result with the Gibbs free energy term. I just didn't realize it couldn't work.
Apologies for the delay in responding. I am still working on entropy and will get back to this thread. I just didn't want to ask too many questions without doing some of the legwork myself. I added this post so that you do not think I am not thankful for all of your help so far.
 

1. What is entropy?

Entropy is a measure of the amount of disorder or randomness in a system. It is a concept in thermodynamics that describes the tendency of a system to move towards a state of maximum disorder.

2. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that the disorder or randomness of the system tends to increase, as systems move towards a state of maximum entropy.

3. What are some examples of entropy in everyday life?

Some examples of entropy in everyday life include the melting of ice cubes, the rusting of metal, and the mixing of hot and cold water. In each of these cases, a system moves towards a state of maximum disorder.

4. How is entropy calculated?

Entropy is calculated using the equation S = k ln W, where S represents entropy, k is the Boltzmann constant, and W is the number of possible microstates or arrangements that a system can have. The higher the number of possible microstates, the higher the entropy.

5. Can entropy ever decrease?

While the total entropy of an isolated system never decreases, it is possible for the entropy of a specific part of the system to decrease. This is because energy can be transferred from one part of the system to another, resulting in a decrease in entropy in one part and an increase in another.
