A bomb calorimeter is used to measure the overall heat output. It is calibrated by burning 1.00 g of methanol (enthalpy of combustion ΔcH = −715 kJ mol⁻¹) in O2, which produces a temperature rise of 8.40 K. Use this information to determine the heat capacity of the calorimeter. Why must the calorimeter be saturated with water vapour?
I was going over past papers and came across this question. I'm just a little confused about how to do it. The previous heat capacity questions I've done gave the power and time, which I multiplied to get q and then divided by the temperature change to get the heat capacity.
I understand that to get q, you multiply the enthalpy of combustion by the number of moles.
Then plug it into C = q/ΔT,
which gives a heat capacity for the calorimeter of −2.66 kJ/K.
But that doesn't make sense to me. Isn't heat capacity defined as the amount of energy needed to raise the temperature by 1 K?
So a negative answer can't be right?
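For reference, here's a sketch of the arithmetic in Python. It assumes the usual sign convention, where the heat released by the (exothermic) reaction is absorbed by the calorimeter, so q_cal = −q_rxn; the molar mass of methanol (≈ 32.04 g/mol) is not given in the question and is assumed here.

```python
# Calibration calculation using the values from the question above.
# Assumption: heat released by combustion (q_rxn, negative) is absorbed
# by the calorimeter (q_cal = -q_rxn, positive).

M_methanol = 32.04   # g/mol, molar mass of CH3OH (assumed, not in the question)
delta_H_c = -715.0   # kJ/mol, enthalpy of combustion (exothermic, so negative)
mass = 1.00          # g of methanol burned
delta_T = 8.40       # K, observed temperature rise

n = mass / M_methanol    # moles of methanol burned
q_rxn = n * delta_H_c    # heat change of the reaction (negative)
q_cal = -q_rxn           # heat absorbed by the calorimeter (positive)
C_cal = q_cal / delta_T  # heat capacity of the calorimeter, kJ/K

print(f"n     = {n:.4f} mol")
print(f"q_cal = {q_cal:.2f} kJ")
print(f"C_cal = {C_cal:.3f} kJ/K")
```

Run this way, C_cal comes out around +2.66 kJ/K, i.e. the same magnitude as the negative answer above but with a positive sign, consistent with the definition of heat capacity.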