Making sense of the units of entropy: J/K

  • Thread starter: stimulus
  • Tags: entropy, units
AI Thread Summary
Entropy is measured in units of energy per temperature (J/K), indicating that entropy tracks heat transferred per unit of absolute temperature. The concept encompasses both the thermodynamic and the statistical-mechanics definitions, with Boltzmann's equation linking the two by relating entropy to the number of microstates. Higher temperatures reduce the "value" of added heat in increasing entropy: the same amount of energy contributes less to the randomness of a system that is already hot. The significance of entropy lies in its ability to quantify the randomness, or disorder, within a system. Understanding entropy comes from applying these concepts in varied contexts rather than from focusing solely on its units.
stimulus
Hi everyone,

I have a conceptual question about entropy. I understand perfectly well why, mathematically, the units of entropy are energy per temperature (SI: J/K). However, I would like to better understand the significance of these units.

For example, the SI units for speed/velocity are m/s. Conceptually, this means that a particle with a speed of 3 m/s will travel 3 meters during an elapsed time of 1 second. By the same reasoning, what does it mean for entropy to have units of J/K? If the temperature increases by 1 K, what is the energy that is changing?

Thank you!
 
As energy gets randomized, the free (usable) energy of the participating systems decreases.
 
Dimensions don't have much physical meaning. You can just as well work with the dimensionless entropy \sigma = \frac{S}{k}. Likewise, in units where you set c = 1, mass and energy have the same units, so you could say the unit of entropy is \frac{kg}{K}. Physical insight into a quantity doesn't come from its units but from using it in different examples.
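To make the rescaling concrete, here is a minimal Python sketch (the entropy value is arbitrary, purely illustrative) of converting an S in J/K to the dimensionless \sigma = S/k:

```python
# Minimal sketch: rescale an entropy in J/K by Boltzmann's constant
# to get the dimensionless sigma = S / k_B. The value of S is arbitrary.
k_B = 1.380649e-23   # J/K, Boltzmann's constant (exact SI value)

S = 1.0              # J/K, an arbitrary example entropy
sigma = S / k_B      # a pure number: the units cancel
print(f"S = {S} J/K  ->  sigma = {sigma:.3e} (dimensionless)")
```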
 
Originally, there were two definitions of entropy: thermodynamic entropy and statistical-mechanics entropy. It was Boltzmann who showed that the two are the same.

Before the statistical-mechanics definition, the change in thermodynamic entropy was defined as \Delta S \equiv \int_{E_i}^{E_f} \frac{1}{T} \, dE_{internal}
This measures the change in the distribution of energy in a thermodynamic system.
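As a concrete illustration (my own example, with assumed values), here is how that integral plays out for heating water with a roughly constant heat capacity, where dE \approx C \, dT:

```python
# Sketch: ΔS = ∫ dE/T with dE ≈ C dT for ~1 kg of water heated
# from 20 °C to 80 °C. C is assumed constant over this range.
import math

C = 4186.0                  # J/K, heat capacity of ~1 kg of water (assumed)
T_i, T_f = 293.15, 353.15   # K

# Analytic result of the integral: ΔS = C * ln(T_f / T_i)
dS_analytic = C * math.log(T_f / T_i)

# Crude midpoint-rule integration of ∫ (C/T) dT as a check
N = 100_000
dT = (T_f - T_i) / N
dS_numeric = sum(C / (T_i + (i + 0.5) * dT) * dT for i in range(N))

print(f"analytic  ΔS = {dS_analytic:.2f} J/K")
print(f"numerical ΔS = {dS_numeric:.2f} J/K")
```

Note how the same joule of heat contributes less to \Delta S when it enters at a higher T, which is exactly where the J/K units earn their keep.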

Later, Boltzmann devised an equation showing that entropy depends on the number of microstates \Omega of a system: S \equiv k_B \ln{\Omega}
This is called the statistical-mechanics definition of entropy.
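For a feel of how that microstate counting works, here is a short sketch (a toy example of my own, not from the post) for N two-state particles, where \Omega is a binomial coefficient:

```python
# Sketch: S = k_B ln(Ω) for a toy system of N two-state spins,
# where Ω = C(N, n) counts the microstates with n spins "up".
import math

k_B = 1.380649e-23  # J/K, Boltzmann's constant

def entropy(N, n):
    """Entropy of the macrostate with n of N spins up."""
    omega = math.comb(N, n)      # number of microstates
    return k_B * math.log(omega)

N = 100
for n in (0, 25, 50):            # balanced macrostates have more microstates
    print(f"n = {n:3d}: Ω = {math.comb(N, n):.3e}, S = {entropy(N, n):.3e} J/K")
```

The constant k_B is what carries the J/K; the microstate count itself is a pure number.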

Boltzmann's constant appears in that equation to ensure that statistical-mechanics entropy is one and the same as thermodynamic entropy. Boltzmann's constant is measured in J/K.

Now we just call this physical quantity entropy so that our tongues won't get into knots.
 
Entropy is a measure of randomness, right? Heat is a form of energy whose flow reflects how random a system is, hence the SI unit of energy sits in the numerator: adding more heat increases the randomness. As for the K in the denominator: for a particular system and a given amount of heat, entropy increases more if the system's temperature is lower. In other words, a given amount of heat has more "value" for increasing entropy when the system is at a lower temperature.
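A quick numerical sketch of that "value" idea (Q and the temperatures are assumed values), using \Delta S \approx Q/T for a small transfer at roughly constant temperature:

```python
# Sketch: the same heat Q produces a larger ΔS at lower temperature,
# since ΔS ≈ Q / T for a small transfer at approximately constant T.
Q = 100.0                        # J of heat added (assumed)

for T in (100.0, 300.0, 1000.0): # K
    print(f"T = {T:6.1f} K  ->  ΔS = Q/T = {Q / T:.3f} J/K")
```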
 