Imagine I am trying to heat water from room temperature to 100 degrees Celsius with a heating element of constant power. At low temperatures, one would expect the temperature-vs-time graph to be linear. When I did the experiment, however, the result was linear only up to about 60 degrees; after that the curve began to flatten, looking more like an exponential approach to a limit. I expect this is due to Newton's law of cooling between the water and the surroundings, plus more complicated effects as the water asymptotically approaches 100 degrees at the boiling point.

How should I model heating water to the boil? The model T = 100 - A·e^(-kt) did not fit my data well. What equations can be used to model this phenomenon from 0 to 100 degrees?

PS. If you need to use multiple equations for different portions of the curve, that's fine by me.
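To illustrate the kind of behavior I'm seeing, here is a minimal sketch of the model I suspect applies below boiling: constant heating power minus Newton's-law losses, m·c·dT/dt = P - h·(T - T_room), whose solution is linear at first and then flattens. The data and parameter values below are synthetic stand-ins, not my actual measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Lumped energy balance: m*c*dT/dt = P - h*(T - T_room).
# With T(0) = T_room the solution is
#   T(t) = T_room + dT_max * (1 - exp(-k*t)),
# where dT_max = P/h (steady-state rise) and k = h/(m*c).
def heating_curve(t, T_room, dT_max, k):
    return T_room + dT_max * (1.0 - np.exp(-k * t))

# Synthetic data standing in for a real measurement run
# (hypothetical parameters: 22 C ambient, 90 C max rise, k = 1/600 per s)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1800.0, 60)                 # time in seconds
T_true = heating_curve(t, 22.0, 90.0, 1.0 / 600.0)
T_meas = T_true + rng.normal(0.0, 0.5, t.size)   # add measurement noise

# Fit the three parameters to the noisy data
popt, _ = curve_fit(heating_curve, t, T_meas, p0=[20.0, 80.0, 1.0 / 300.0])
T_room_fit, dT_max_fit, k_fit = popt
print(f"T_room = {T_room_fit:.1f} C, dT_max = {dT_max_fit:.1f} C, "
      f"k = {k_fit:.5f} 1/s")
```

Note that this model predicts the temperature levels off at T_room + P/h, which need not be 100 degrees; it says nothing about the boiling plateau itself, which is why I expect a separate treatment may be needed near 100 degrees.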