Ok, I'll be pretty straightforward.
So, the usual assumptions:
Energy given to the system: > 0
Energy taken from the system: < 0
Enthalpy is defined as:
ΔH = ΔU + pΔV, so
ΔU = ΔH - pΔV
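This rearrangement just follows from the definition H = U + pV, assuming constant pressure:
$$\Delta H = \Delta(U + pV) = \Delta U + p\,\Delta V \quad\Rightarrow\quad \Delta U = \Delta H - p\,\Delta V$$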
Let's say that during some chemical reaction heat is produced (Energy taken from the system):
ΔH = -49 kJ
... And a gas was compressed (Energy given to the system):
W = 5 kJ
To calculate the change in internal energy:
ΔU = ΔH - pΔV
ΔU = -49 kJ - 5 kJ = -54 kJ
However, this does not make sense to me.
Shouldn't it be -44 kJ by common sense?
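By common sense I mean: the gas is compressed, so ΔV < 0, and if I take the 5 kJ of work done on the gas to mean pΔV = -5 kJ, then
$$\Delta U = \Delta H - p\,\Delta V = -49\ \mathrm{kJ} - (-5\ \mathrm{kJ}) = -44\ \mathrm{kJ}$$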
Or is the sign convention intentionally switched in the first term?
Thanks.