# Making sense of the units of entropy: J/K

1. Dec 29, 2014

### stimulus

Hi everyone,

I have a conceptual question about entropy. I understand perfectly why, mathematically, the units of entropy are energy per temperature (SI: J/K). However, I would like to better understand the significance of these units.

For example, the SI units for speed/velocity are m / s. Conceptually, this means that a particle with a speed of 3 m/s will travel 3 meters during an elapsed time of 1 second. By this same reasoning, what does it mean for entropy to have units of J / K? If the temperature increases by 1 K, what is the energy that is changing?

Thank you!

2. Dec 29, 2014

### gianeshwar

As the energy of the participating systems becomes more randomized, the energy available to do useful work decreases.

3. Dec 29, 2014

### ShayanJ

Dimensions don't have much physical meaning. You can just as well work with the dimensionless entropy $\sigma=\frac{S}{k}$. Also, in units where you set c = 1, mass and energy have the same units, so you could say the unit of entropy is $\frac{kg}{K}$. Physical insight into a quantity doesn't come from its units, but from using it in different examples.
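To make the dimensionless-entropy point concrete, here is a minimal sketch (the function name and sample value are illustrative, not from the thread) that converts an entropy in J/K into $\sigma = S/k$, which is a pure number:

```python
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def dimensionless_entropy(s_joules_per_kelvin):
    """Return sigma = S / k_B, a pure number with no units attached."""
    return s_joules_per_kelvin / K_B

# An entropy of exactly one k_B corresponds to sigma = 1:
sigma = dimensionless_entropy(K_B)
print(sigma)  # 1.0
```

The units (J/K) only appear because we chose to measure temperature in kelvin rather than in energy units; divide them out and the same physics remains.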

4. Dec 30, 2014

### Joshua L

Originally, there were two definitions of entropy: thermodynamic entropy and statistical-mechanical entropy. It was Boltzmann who showed that the two are equivalent.

Before the statistical-mechanical definition, the change in thermodynamic entropy was defined as $$\Delta S \equiv \int_{i}^{f} \frac{\delta Q_{rev}}{T}$$ where $\delta Q_{rev}$ is the heat exchanged reversibly. This measures how the distribution of energy in a thermodynamic system changes.

Later, Boltzmann devised an equation showing that entropy depends on the number of microstates $\Omega$ of a system: $$S \equiv k_B \ln{\Omega}$$
This is called the statistical-mechanical definition of entropy.

The reason Boltzmann's constant appears is to ensure that statistical-mechanical entropy is one and the same as thermodynamic entropy. Boltzmann's constant is measured in J/K.

Now we just call this physical quantity entropy so that our tongues won't get into knots.
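As a quick numerical sketch of Boltzmann's formula (the two-state-spin system and the function name here are my own illustrative choices, not from the thread): for $N$ independent two-state particles, $\Omega = 2^N$, so $\ln\Omega = N\ln 2$ and the entropy comes out in J/K because of $k_B$:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

def boltzmann_entropy_two_state(n_particles):
    """S = k_B * ln(Omega) for Omega = 2**N independent two-state particles.

    Using ln(2**N) = N * ln(2) avoids computing the astronomically large
    Omega directly.
    """
    return K_B * n_particles * math.log(2)

# One mole of two-state particles: S = R * ln(2), roughly 5.76 J/K
s_mole = boltzmann_entropy_two_state(N_A)
```

Note that $k_B \ln\Omega$ turns an enormous dimensionless count of microstates into a laboratory-scale number of joules per kelvin.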

5. Dec 31, 2014

### <sHoRtFuSe>

Entropy is a measure of randomness, right? Heat is a form of energy that reflects how random a system is, hence the SI unit of energy in the numerator: adding more heat increases the randomness. As for the K in the denominator: for a given amount of heat, entropy increases more if the system's temperature is lower. In other words, a given amount of heat has more "value" for increasing entropy when the system is at a lower temperature.
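The "heat is worth more at low temperature" point can be sketched numerically with $\Delta S = Q/T$ for heat added reversibly at (approximately) constant temperature; the numbers below are illustrative, not from the thread:

```python
def entropy_change(q_joules, t_kelvin):
    """Delta S = Q / T for heat Q added reversibly at constant temperature T."""
    return q_joules / t_kelvin

# The same 100 J of heat added to a cold system vs. a hot one:
ds_cold = entropy_change(100.0, 100.0)  # 1.0 J/K
ds_hot = entropy_change(100.0, 400.0)   # 0.25 J/K
```

The same quantity of heat raises the entropy of the 100 K system four times as much as that of the 400 K system, which is exactly the "value" argument above.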