I have an exothermic reaction which evolves gas. The change in enthalpy of the reaction is -837.02 kJ per 3 moles of gas evolved (about -279 kJ/mol). The reaction takes place inside a sealed container with a constant volume [say volume = 4.5E-3 m^3]. We will assume the temperature of the reaction to be a constant 80 °C (far hotter than the temperature outside the container) so as to prevent heat energy flowing into the system from the surroundings.

As the reaction proceeds, the gas evolved must be expanded into the atmosphere above at an ever-increasing pressure, a process which takes energy. If we assume all the energy comes from the ambient heat inside the system (which in turn comes from the change in enthalpy of the reaction), then as the pressure above the reaction grows, eventually the rate of reaction will drop to zero (the reaction will stop), since it would take more energy to expand the gas than the reaction releases. I am trying to solve for the maximum pressure at which the reaction will still take place.

The work done by the system to expand the gas is

W = ∆P * V

The reaction should stop when the work to expand the gas equals the change in enthalpy of the reaction:

∆H = ∆P * V
279,000 J = ∆P * (4.5E-3 m^3)
∆P = 6.2E7 Pa ≈ 612 atm

This would mean that the reaction would proceed until the "atmospheric" pressure was greater than or equal to about 612 atmospheres! This is an extremely high pressure, far greater than what I would expect, which leads me to believe I made an error in one of the steps. Does anyone see where I might have gone wrong?
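For what it's worth, here is the arithmetic checked in a short Python snippet (the variable names are mine; the values are the ones assumed in the question):

```python
# Check the stopping-pressure estimate dP = dH / V from the question.
dH = 279_000.0   # J, enthalpy released per mole of gas evolved (~279 kJ/mol)
V = 4.5e-3       # m^3, container volume
ATM = 101_325.0  # Pa, one standard atmosphere

# Pressure rise at which the expansion work dP * V equals dH
dP = dH / V
print(f"dP = {dP:.3g} Pa = {dP / ATM:.0f} atm")  # -> dP = 6.2e+07 Pa = 612 atm
```

So the arithmetic itself is consistent; if the 612 atm figure is surprising, the issue is more likely in the physical setup (e.g. treating constant-volume work as ∆P * V) than in the numbers.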