Making sense of the units of entropy: J/K

  • Context: Undergrad
  • Thread starter: stimulus
  • Tags: entropy, units

Discussion Overview

The discussion revolves around the conceptual understanding of the units of entropy, specifically why entropy is expressed in joules per kelvin (J/K). Participants explore the significance of these units in relation to energy and temperature, as well as the implications for understanding entropy in thermodynamic and statistical mechanics contexts.

Discussion Character

  • Conceptual clarification
  • Debate/contested
  • Technical explanation

Main Points Raised

  • One participant questions the physical significance of entropy's units, seeking to understand what it means for entropy to have units of J/K and how this relates to changes in energy with temperature.
  • Another participant suggests that dimensions may not hold much physical meaning and proposes the use of dimensionless entropy, indicating that physical insight comes from practical applications rather than units.
  • A historical perspective is provided, noting the two definitions of entropy (thermodynamic and statistical) and how Boltzmann's work unified them, emphasizing the role of Boltzmann's constant in relating the two definitions.
  • One participant asserts that entropy measures randomness and connects this to heat as a form of energy, explaining that the impact of heat on increasing entropy is greater at lower temperatures.

Areas of Agreement / Disagreement

Participants express differing views on the significance of entropy's units and the interpretation of entropy itself. There is no consensus on a singular understanding of the implications of the units of entropy.

Contextual Notes

Some discussions involve assumptions about the relationship between energy, temperature, and entropy that may not be universally accepted. The interpretations of entropy's significance and its units vary among participants.

stimulus
Hi everyone,

I have a conceptual question about entropy. I understand why, mathematically, the units of entropy are energy per temperature (SI: J/K). However, I would like to better understand the significance of these units.

For example, the SI units for speed/velocity are m / s. Conceptually, this means that a particle with a speed of 3 m/s will travel 3 meters during an elapsed time of 1 second. By this same reasoning, what does it mean for entropy to have units of J / K? If the temperature increases by 1 K, what is the energy that is changing?

Thank you!
 
As the energy gets randomized, the energy available to do work in the participating systems decreases.
 
Dimensions don't have much physical meaning. You can just as well work with the dimensionless entropy \sigma=\frac{S}{k}. Likewise, in units where you set c=1, mass and energy have the same units, so you could say the unit of entropy is \frac{\text{kg}}{\text{K}}. Physical insight into a quantity doesn't come from its units, but from using it in different examples.
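To make the dimensionless-entropy idea concrete, here is a minimal Python sketch: stripping the J/K off an entropy value is just division by Boltzmann's constant. (The ~22 J/K figure for melting one mole of ice is an approximate textbook value, used only for illustration.)

```python
k_B = 1.380649e-23  # Boltzmann constant in J/K (an exact defined value since the 2019 SI redefinition)

def dimensionless_entropy(S):
    """Convert an entropy S given in J/K to the dimensionless sigma = S / k_B."""
    return S / k_B

# Melting one mole of ice at 273 K raises the entropy by roughly 22 J/K;
# as a pure number that is on the order of 1.6e24.
sigma = dimensionless_entropy(22.0)
```

The pure number is enormous precisely because k_B is tiny: per-particle entropies are a few k_B, and a mole contains ~6e23 particles.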
 
Originally, there were two definitions of entropy: thermodynamic entropy and statistical-mechanical entropy. It was Boltzmann who showed that the two are the same.

Before the statistical-mechanical definition, the change of thermodynamic entropy was defined as \Delta S \equiv \int_{E_i}^{E_f} \frac{1}{T} \, d E_{\text{internal}}
This measures the change in the distribution of energy in a thermodynamic system.

Later, Boltzmann devised an equation showing that entropy depends on the number of microstates of a system: S \equiv k_B \ln{\Omega}
This is called the statistical-mechanical definition of entropy.

The reason Boltzmann's constant appears there is to make sure that statistical-mechanical entropy is one and the same as thermodynamic entropy. Boltzmann's constant is measured in J/K.

Now we just call this physical quantity entropy so that our tongues won't get into knots.
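A quick numerical sketch of the statistical definition (not tied to any particular physical system): doubling the number of microstates adds the same k_B ln 2 to S no matter how large Omega already is, which is the logarithmic additivity that lets the statistical definition match the thermodynamic one.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """Statistical entropy S = k_B * ln(Omega), with Omega the number of accessible microstates."""
    return k_B * math.log(omega)

# Doubling Omega adds k_B * ln 2, independent of the starting value:
delta_small = boltzmann_entropy(2 * 10) - boltzmann_entropy(10)
delta_large = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
# both equal k_B * ln 2 (about 9.6e-24 J/K), up to floating-point rounding
```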
 
Entropy is a measure of randomness, right? Heat is a form of energy that reflects how random a system is, hence the SI unit of heat in the numerator: adding more of it increases randomness. As for the K in the denominator: for a given system and a given amount of heat, entropy increases more if the system's temperature is lower. In other words, a given amount of heat has more "value" for increasing entropy when the system is at a lower temperature.
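That "value" of heat can be made concrete with the reversible relation ΔS = Q/T, here in a minimal sketch that assumes the temperature stays roughly constant while the heat is added:

```python
def entropy_change(Q, T):
    """Reversible entropy change dS = Q / T, for heat Q in joules added at (constant) temperature T in kelvin."""
    return Q / T

# The same 100 J of heat produces four times the entropy increase
# in a 100 K system as in a 400 K system:
cold = entropy_change(100.0, 100.0)  # 1.00 J/K
hot = entropy_change(100.0, 400.0)   # 0.25 J/K
```

This is also one direct answer to the original question: an entropy of 1 J/K means that transferring 1 J of heat reversibly at 1 K would produce that entropy change.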
 