
Making sense of the units of entropy: J/K

  1. Dec 29, 2014 #1
    Hi everyone,

    I have a conceptual question about entropy. I understand perfectly why mathematically the units of entropy are energy per temperature (SI: J / K). However, I would like to better understand the significance of these units.

    For example, the SI units for speed/velocity are m / s. Conceptually, this means that a particle with a speed of 3 m/s will travel 3 meters during an elapsed time of 1 second. By this same reasoning, what does it mean for entropy to have units of J / K? If the temperature increases by 1 K, what is the energy that is changing?

    Thank you!
  3. Dec 29, 2014 #2
    As the energy gets randomized, the usable energy of the participating systems decreases.
  4. Dec 29, 2014 #3



    Dimensions don't have much physical meaning. You can just as well work with the dimensionless entropy [itex] \sigma=\frac{S}{k} [/itex]. Also, in units where you set c = 1, mass and energy have the same units, so you could say the unit of entropy is [itex] \frac{kg}{K} [/itex]. Physical insight into a quantity doesn't come from its units, but from using it in different examples.
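To make the dimensionless version concrete, here is a minimal Python sketch. The entropy value 5.74 J/K is just an illustrative number (the standard molar entropy of graphite); any entropy in J/K would do.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K (exact SI value)
S = 5.74             # example entropy in J/K (illustrative value)

# Dividing by k gives a pure number, independent of the unit system:
sigma = S / k_B
print(f"sigma = {sigma:.3e}")  # a huge dimensionless number, ~4.16e23
```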
  5. Dec 30, 2014 #4
    Originally, there were two definitions of entropy: thermodynamic entropy and statistical-mechanical entropy. It was Boltzmann who proved that the two are the same.

    Before the statistical-mechanical definition, the change in thermodynamic entropy was defined as [tex]\Delta S \equiv \int_{E_i}^{E_f} \frac{1}{T} \ d E_{internal}[/tex]
    This measures how the distribution of energy in a thermodynamic system changes.
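The integral definition above can be checked numerically. A minimal Python sketch, assuming an idealized constant heat capacity C for 1 kg of water (so dE = C dT and the integral works out to C ln(T_f/T_i)):

```python
import math

C = 4184.0            # heat capacity of 1 kg of water, J/K (assumed constant)
T_i, T_f = 300.0, 350.0   # example initial and final temperatures, K

# Closed form: with dE = C dT, the integral of (1/T) dE is C * ln(T_f / T_i).
dS_exact = C * math.log(T_f / T_i)

# Numerical check: sum dE / T over many small temperature steps.
n = 100_000
dT = (T_f - T_i) / n
dS_num = sum(C * dT / (T_i + (k + 0.5) * dT) for k in range(n))

print(f"exact: {dS_exact:.2f} J/K, numerical: {dS_num:.2f} J/K")
```

Both come out around 645 J/K, which is why the result of the integral carries units of J/K.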

    Later, Boltzmann devised an equation showing that entropy depends on the number of microstates in a system: [tex]S \equiv k_B \ln{\Omega}[/tex]
    This is called the statistical-mechanical definition of entropy.

    The reason Boltzmann's constant appears is to make sure that statistical-mechanical entropy is one and the same as thermodynamic entropy. Boltzmann's constant is measured in J/K.

    Now we just call this physical quantity entropy so that our tongues won't get into knots.
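The statistical definition above can be illustrated with a short Python sketch; the microstate counts are arbitrary example values:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K (exact SI value)

def S(omega):
    """Boltzmann entropy for a system with `omega` microstates."""
    return k_B * math.log(omega)

# Entropies add when independent systems combine, because microstate
# counts multiply: ln(A * B) = ln(A) + ln(B).
omega_A, omega_B = 10**6, 10**9
print(math.isclose(S(omega_A * omega_B), S(omega_A) + S(omega_B)))

# Doubling the microstate count always adds the same amount, k_B * ln(2):
print(f"{S(2 * omega_A) - S(omega_A):.3e} J/K")
```

Note the J/K comes entirely from the prefactor k_B; the logarithm itself is a pure number.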
  6. Dec 31, 2014 #5
    Entropy is a measure of randomness, right? Heat is a form of energy whose transfer reflects how random a system is, hence the SI unit of heat in the numerator: adding more of it increases the randomness. As for the K in the denominator: for a particular system and a given amount of heat, entropy increases more if the system's temperature is lower. It means that a given amount of heat has more "value" for increasing entropy if the system is at a lower temperature.
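That last point can be made concrete with a small Python sketch, using ΔS = Q/T for an idealized reversible transfer at roughly constant temperature; the values of Q and T are arbitrary examples:

```python
# The same heat Q produces a larger entropy change in a colder system.
Q = 100.0  # joules of heat (arbitrary example value)

for T in (100.0, 300.0, 1000.0):  # temperatures in kelvin
    print(f"T = {T:6.1f} K -> dS = Q/T = {Q / T:.3f} J/K")
```

The colder the system, the more J/K the same 100 J of heat buys.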