GregC
In a chemical investigation I'm currently undertaking as part of a chemistry course, I'm recording how the temperature changes in several reactions and using the rate of temperature change as a measure of the rate of reaction. I believe this is a valid approach: the rate of reaction is generally measured as the change in the concentration of some reactant or product with respect to time, and since every reaction has a constant ΔH per mole (enthalpy change per mole) and the temperature rise of a substance is directly related to the energy transferred to it, I figured that the rate of change of the concentration of a given reactant with respect to time is directly proportional to the rate of change of temperature with respect to time.
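The proportionality argument above can be sketched numerically. This is a minimal sketch under my own simplifying assumptions (no heat loss, constant heat capacity, a mostly-water mixture); the function name and all numbers are illustrative, not taken from the question:

```python
def rate_from_temperature(dT_dt, mass_g, c_J_per_gK, delta_H_J_per_mol, volume_L):
    """Convert an initial temperature gradient (K/s) into a reaction rate
    (mol L^-1 s^-1) via q = m * c * dT and moles reacted = q / |delta_H|."""
    heat_per_second = mass_g * c_J_per_gK * dT_dt              # J/s absorbed by the mixture
    mol_per_second = heat_per_second / abs(delta_H_J_per_mol)  # mol of reaction per second
    return mol_per_second / volume_L                           # mol L^-1 s^-1

# Illustrative numbers only: 100 g of solution (c = 4.18 J g^-1 K^-1),
# delta_H = -57000 J/mol, 0.10 L of mixture, initial gradient 0.05 K/s
r = rate_from_temperature(0.05, 100.0, 4.18, -57000.0, 0.10)
# r comes out to about 3.7e-3 mol L^-1 s^-1
```

Because mass, heat capacity, ΔH, and volume are all constant within one run, the computed rate is just the temperature gradient multiplied by a fixed factor, which is exactly the proportionality being assumed.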
Now, after doing many runs of several reactions, recording how the temperature changes in each one, and plotting the initial rate of temperature change (i.e. the gradient of a tangent to the temperature-time graph at the start of the reaction) against the initial concentration of one reactant (the initial concentrations of the other reactants were kept constant), it appears that measuring the rate of reaction via the rate of temperature change with respect to time does work. However, I'm also getting quite a few anomalous results (i.e. points I need to remove from the graph so it looks correct), and while plotting the rate of temperature change against the concentration of a reactant gives me a pretty nice straight line (which is what I was expecting), the line doesn't pass through the origin: the line of best fit predicts that the reaction will still occur when the concentration of one of the reactants is 0, intercepting the y-axis at something like 1. Given that the rates of temperature change I'm measuring lie between 1 and 2, this is quite a significant error.
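The intercept check described above can be done with an ordinary least-squares fit. The data points here are made up purely to illustrate the method (chosen to mimic the 1-to-2 range and the roughly-1 intercept in the question); they are not measured values:

```python
# Hypothetical data: initial concentration (mol/L) vs initial
# temperature gradient (K/s). Illustrative numbers only.
conc  = [0.2, 0.4, 0.6, 0.8, 1.0]
dT_dt = [1.1, 1.3, 1.55, 1.7, 1.95]

# Ordinary least-squares fit of dT_dt = slope * conc + intercept
n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(dT_dt) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, dT_dt))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x
# With these numbers the intercept lands near 0.9 K/s, i.e. about half
# the measured range: too large to dismiss as random scatter.
```

A non-zero intercept of this size usually signals a systematic effect (for example heat of mixing or dilution, heat exchange with the surroundings, or thermometer lag) rather than noise, which is worth checking before discarding points as anomalous.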
So my question is: can you measure the rate of a reaction using the rate of temperature change? My results could suggest that you can, but there appears to be a lot of error, and the apparent connection between reaction rate and rate of temperature change could be mere coincidence.