
Now we know that Boltzmann's constant relates the average kinetic energy of one molecule of an ideal gas to its temperature T, via <E> = (3/2) kT.
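To put a number on that relation, here is a quick check (my own illustration; the temperature 300 K is just a room-temperature example):

```python
# Average kinetic energy of one ideal-gas molecule: <E> = (3/2) k T.
k = 1.380649e-23   # Boltzmann's constant in J/K (exact by SI definition)
T = 300.0          # roughly room temperature, in kelvin (chosen for illustration)
E_avg = 1.5 * k * T
print(E_avg)       # about 6.21e-21 J per molecule
```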

The German scientist Clausius defined the entropy change of a substance as the amount of energy dispersed reversibly at a specific temperature T:

dS = dQ_rev/T

What seems weird to me is that, according to this formula and the thermodynamic definition of entropy, a small amount of heat transferred at or near absolute zero corresponds to a large or even infinite change in entropy. In other words, as we get closer to T = 0 the entropy change skyrockets, but this doesn't sound logical.
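The divergence I mean is easy to see numerically (my own sketch; Q = 1 mJ is a hypothetical fixed amount of heat):

```python
# For a fixed reversible heat Q, the entropy change dS = Q / T grows
# without bound as the temperature at which it is transferred -> 0.
Q = 1e-3  # J, a hypothetical small amount of heat
for T in (300.0, 1.0, 0.01, 1e-6):
    print(T, Q / T)  # Q/T gets larger as T shrinks
```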

Also, an ideal gas has a greater heat capacity at higher temperatures than near absolute zero, as do most liquids, gases, and even solids. So where is the misunderstanding here? Or maybe the problem is with how we define absolute zero: is it a true zero temperature, or simply the lowest possible energy/temperature state?
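To make the heat-capacity point concrete, here is a small comparison I worked out (the coefficients C and a are hypothetical, chosen only for illustration). Integrating dS = C(T)/T dT from T1 up to T2 gives closed forms in two simple cases: a constant heat capacity makes the entropy change diverge logarithmically as T1 -> 0, while a Debye-like solid with C = a*T^3 keeps it finite:

```python
import math

# Entropy change ΔS = ∫ C(T)/T dT from T1 to T2, in two simple cases:
#   constant C:      ΔS = C * ln(T2/T1)        -> diverges as T1 -> 0
#   Debye-like C=aT³: ΔS = (a/3)(T2³ - T1³)    -> stays finite as T1 -> 0
C = 1.0     # J/K, hypothetical constant heat capacity
a = 1e-6    # J/K^4, hypothetical Debye coefficient
T2 = 300.0  # K, upper limit of the integration

for T1 in (1.0, 0.1, 0.01, 0.001):
    dS_const = C * math.log(T2 / T1)
    dS_debye = (a / 3.0) * (T2**3 - T1**3)
    print(f"T1={T1:>6}: constant-C ΔS={dS_const:8.3f}, Debye ΔS={dS_debye:.6f}")
```

The constant-C column keeps growing as T1 shrinks, while the Debye column settles to a finite value, which is exactly why real heat capacities must vanish as T -> 0 for the entropy to stay finite.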

I hope my question is clear. Thank you.