Hello,

When the Celsius temperature scale was first defined, it was defined by saying that 0°C is the freezing point of water and 100°C its boiling point. Now, this is far (infinitely far, actually) from a definition of temperature (this was before the second law or statistical mechanics; they are not important for our discussion). One might intuitively say "well, you've got two points, and two points define a line, so now you can extrapolate the Celsius temperature of any other system" — but surely this is completely wrong? What kind of graph would one be thinking of? The x-axis is temperature, okay, but what is the y-axis? (If these last two sentences confuse you, ignore them and skip to the next paragraph.)

What I'm trying to get at is why all our temperature scales (Fahrenheit, Celsius, Kelvin, ...) are linear with respect to each other, because there seems to be no reason at all why they should be. Or is there?

Anyway, Celsius then defined his scale by saying: if V is the expansion of water from 0°C to 100°C, then each time the water expands by V/100, the temperature has gone up by one degree Celsius. So Celsius defined his temperature to be linear in the expansion of water: this is arbitrary, right? If he had made it linear in the expansion of mercury, then it wouldn't have been linear in the expansion of water? I'm basing this last claim on the following quote from the Feynman Lectures.

I'm not sure what he means by "even", but I assume he means in correlation with the volumetric expansion of the fluid? So with our current definition of T, water and ideal gases expand linearly with the degree, and mercury doesn't? (I suppose the difference is due to... ehm, chemistry? Though one would expect water and mercury to resemble each other more chemically than water and an ideal gas...)

N.B.: So is the linearity of Celsius, Fahrenheit, Kelvin, ... with respect to each other simply because they were all (arbitrarily) made to be linear with the expansion of water?

Thank you very much!
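For concreteness, here is what I mean by "linear to each other": the standard conversion formulas between the scales are all affine maps, y = a·x + b, and composing two affine maps gives another affine map. This little sketch (my own illustration, using only the textbook conversion formulas) just checks that the fixed points of the original Celsius definition come out right:

```python
# The common temperature scales are affine functions of one another.

def c_to_f(c):
    """Celsius -> Fahrenheit: affine map with slope 9/5 and offset 32."""
    return 9 / 5 * c + 32

def c_to_k(c):
    """Celsius -> Kelvin: affine map with slope 1 and offset 273.15."""
    return c + 273.15

def f_to_k(f):
    """Fahrenheit -> Kelvin: the composition of two affine maps, again affine."""
    return c_to_k((f - 32) * 5 / 9)

# Fixed points of the original Celsius definition (freezing and boiling of water):
print(c_to_f(0), c_to_f(100))   # 32.0 212.0
print(c_to_k(0), c_to_k(100))   # 273.15 373.15
```

The nontrivial part of my question is not these formulas, of course, but *why* such affine relations hold between scales that were (apparently) defined via the expansion of different substances.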