GravitatisVis
I'm teaching myself thermodynamics and I'm having trouble understanding Helmholtz and Gibbs free energy.
My understanding of enthalpy (H) is pretty solid, I think. I understand that enthalpy is a state variable, and that its change tells you how much heat (Q) is absorbed or released by a system at constant pressure (ΔH = Q).
I also understand that if the change in enthalpy is negative (ΔH < 0), for example, the process is exothermic, which means the system's temperature increases, and this increase in temperature causes heat to spontaneously flow from the system to the surroundings in accordance with the first law of thermodynamics. This also means that, in accordance with the second law, the entropy (S) of the system decreases (since it loses heat) and the entropy of the surroundings increases, but the increase outweighs the decrease.
I'm really having trouble, however, understanding why Gibbs free energy is defined as G = H - TS and ΔG = ΔH - TΔS. What's the deal with this TS term? What wasn't enthalpy telling us before? In this case the temperature and pressure are held fixed. I know that if a process is quasistatic, Q = TΔS. And since ΔH = Q, wouldn't the Qs cancel out? I'm pretty sure this isn't the case. And since T is constant, how can heat be released/absorbed at all? Heat only flows in the context of a temperature gradient.
I understand that Gibbs free energy tells you two things: 1) the spontaneity of the reaction and 2) the energy available to do useful work. What I'm having difficulty understanding is the nature of the TS term that's slapped onto enthalpy. Since T is constant, is there heat flow between the system and surroundings at all?
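To make my question concrete, here's a quick numerical sketch (just an illustration, using rough textbook values for the vaporization of water at its normal boiling point). Plugging into ΔG = ΔH - TΔS, the TΔS term almost exactly cancels ΔH, which is what I mean by wondering what the TS term is "doing":

```python
# Illustrative sketch: Gibbs free energy change for vaporizing one mole of
# water at its normal boiling point, using approximate textbook values.
dH = 40.7e3   # J/mol, approximate enthalpy of vaporization of water
dS = 109.0    # J/(mol*K), approximate entropy of vaporization
T = 373.15    # K, normal boiling point at 1 atm

dG = dH - T * dS  # Gibbs free energy change at constant T and P
print(dG)         # close to zero: T*dS nearly cancels dH at the phase boundary
```

So at the boiling point the large positive ΔH is almost entirely "paid for" by the TΔS term, and ΔG ≈ 0, which is apparently why the two phases coexist there.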
Same with the Helmholtz free energy: F = U - TS. The change in F is defined as ΔF = ΔU - TΔS. What is the motivation for introducing the TS term? In this case the temperature and volume are held fixed.
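Again, just to put numbers on it (an assumed illustrative case, not from my problem): for 1 mol of ideal gas doubling its volume isothermally at 300 K, ΔU = 0 but ΔF = -TΔS is negative, and -ΔF is supposed to equal the maximum work extractable:

```python
import math

# Illustrative sketch: Helmholtz free energy change for an isothermal,
# reversible expansion of an ideal gas (assumed values).
n, R, T = 1.0, 8.314, 300.0    # mol, J/(mol*K), K
V_ratio = 2.0                  # final volume / initial volume

dU = 0.0                           # ideal gas at constant T: U depends only on T
dS = n * R * math.log(V_ratio)     # entropy change of the gas
dF = dU - T * dS                   # Helmholtz free energy change

print(dF)  # about -1729 J; -dF is the maximum work the gas can do
```

Here ΔF is negative even though ΔU is zero, so the TS term seems to be where the "free" part of free energy lives, which is the piece I'm trying to understand.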
Thanks everyone.