Intuitive understanding of entropy

  • Thread starter: pardesi
  • Tags: Entropy
AI Thread Summary
Entropy is discussed from two primary viewpoints: statistical entropy, which counts the number of microstates consistent with a macrostate, and informational entropy, linked to Shannon's information theory. When two systems are brought into contact, they exchange energy, volume, and particles until they reach equilibrium, and entropy can be viewed as the quantity exchanged to equalize temperature. The thread also asks whether the integral \int dq/T or \int dq_{rev}/T is independent of the path, and points to textbooks and educational resources for further study. A side question raises whether dark matter and dark energy have implications for entropy in cosmology. Overall, entropy remains a complex yet fundamental concept in thermodynamics and statistical mechanics.
pardesi
Can someone please define entropy intuitively for me?
Also, which of these is independent of the path: \int_{i}^{f}\frac{dq}{T} or \int_{i}^{f}\frac{dq_{rev}}{T}?
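A quick numerical check of the second question (a sketch in Python, assuming one mole of a monatomic ideal gas and two reversible paths between the same pair of states): the total heat \int dq comes out different along the two paths, while \int\frac{dq_{rev}}{T} agrees between them and matches the closed form \Delta S = nC_V\ln(T_2/T_1) + nR\ln(V_2/V_1).

Code:
import numpy as np

# Sketch: one mole of a monatomic ideal gas (all values assumed for illustration).
# Along a reversible path, dq_rev = n*Cv*dT + (n*R*T/V)*dV.
n, R = 1.0, 8.314            # mol, J/(mol K)
Cv = 1.5 * R                 # monatomic ideal gas
T1, V1 = 300.0, 0.010        # initial state (K, m^3)
T2, V2 = 600.0, 0.030        # final state

def leg(Ts, Vs):
    """Return (integral of dq_rev, integral of dq_rev/T) along one discretized leg."""
    dT, dV = np.diff(Ts), np.diff(Vs)
    Tm = (Ts[:-1] + Ts[1:]) / 2          # midpoint values for the integrand
    Vm = (Vs[:-1] + Vs[1:]) / 2
    dq = n * Cv * dT + n * R * Tm / Vm * dV
    return dq.sum(), (dq / Tm).sum()

N = 100_000
# Path A: heat at constant V1, then expand isothermally at T2.
qA1, sA1 = leg(np.linspace(T1, T2, N), np.full(N, V1))
qA2, sA2 = leg(np.full(N, T2), np.linspace(V1, V2, N))
# Path B: expand isothermally at T1, then heat at constant V2.
qB1, sB1 = leg(np.full(N, T1), np.linspace(V1, V2, N))
qB2, sB2 = leg(np.linspace(T1, T2, N), np.full(N, V2))

print("q along A vs B:", qA1 + qA2, qB1 + qB2)      # differ: heat is path dependent
print("S along A vs B:", sA1 + sA2, sB1 + sB2)      # agree: this is the state function
print("closed form:", n*Cv*np.log(T2/T1) + n*R*np.log(V2/V1))

So \int\frac{dq_{rev}}{T} behaves as a state function (that is the Clausius definition of \Delta S), while \int\frac{dq}{T} taken along an arbitrary irreversible path does depend on the path.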
 
Have you tried looking in a textbook, or in your course notes? These should be the first port of call, since your teacher knows the level of knowledge that you have, and so can scale the answer suitably.

Here's a link to a good physics website which may help you: http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/seclaw.html#c4
 
Entropy is - to me still - a strange concept. There are two ways to look at it.

The first is from an intuitive point of view. We (at least I, hopefully you too) know that the infinitesimal change in energy of a system is
dE = T dS - p dV + \mu dN
with E energy, T temperature, S entropy, p pressure, V volume, \mu chemical potential and N number of particles.

If you bring two systems into contact, they will exchange energy, volume, and particles until the temperature, pressure and chemical potential are equal. To make the pressures equal, they exchange volume (when allowed; for example, if you have a box with a movable wall, the wall moves to the position where the pressure in both parts is equal). To make the chemical potentials equal, they exchange particles (when allowed; for example, through a permeable membrane). To make the temperatures equal, heat flows from one system into the other; you could consider entropy the quantity (whatever it is) that is exchanged to reach equilibrium.
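To make that last point concrete, here is a minimal sketch (assuming two identical blocks with the same constant heat capacity C; the numbers are made up): heat flows from hot to cold until the temperatures meet, each block's entropy changes by C\ln(T_f/T_i), and the total change is positive.

Code:
import math

# Sketch: two identical blocks with constant heat capacity (assumed values).
C = 100.0                     # J/K per block
T_hot, T_cold = 400.0, 200.0  # K

# Energy conservation fixes the final temperature for equal heat capacities.
T_f = (T_hot + T_cold) / 2

# Integrating dS = dq/T = C dT/T for each block gives dS = C ln(T_f/T_i).
dS_hot = C * math.log(T_f / T_hot)    # negative: this block gave up heat
dS_cold = C * math.log(T_f / T_cold)  # positive, and larger in magnitude

print("dS hot   =", dS_hot)              # about -28.8 J/K
print("dS cold  =", dS_cold)             # about +40.5 J/K
print("dS total =", dS_hot + dS_cold)    # about +11.8 J/K > 0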

Yet another way to introduce entropy is through the microcanonical ensemble. What you actually do is, given the energy of a system, count the number of microstates g it can be in. As these counts get very large, we introduce entropy as the logarithm of this number, so that a count of order 10^{23} becomes a value of order 23, which is more manageable.
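A tiny sketch of that counting (using N two-state spins as an assumed model system): the number of microstates with exactly k up-spins is the binomial coefficient \binom{N}{k}, which grows astronomically, while its logarithm stays of modest size.

Code:
import math

# Sketch: N independent two-state spins; the "macrostate" is the number k
# of up-spins, and g = C(N, k) counts the microstates realizing it.
N, k = 100, 50
g = math.comb(N, k)        # about 1.01e29 microstates
S = math.log(g)            # entropy in units of k_B: about 66.8
print("g =", g, " S = ln g =", S)

# For much larger N, writing g out is impractical, but ln g is still easy:
# ln C(N, k) = lnGamma(N+1) - lnGamma(k+1) - lnGamma(N-k+1).
N, k = 10**6, 5 * 10**5
lng = math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
print("ln g for N = 1e6:", lng)   # about 6.93e5, i.e. S close to N ln 2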

By the way, a book I can recommend to you is "Thermal Physics" by Charles Kittel / Herbert Kroemer. It's a very gentle and intuitive introduction to thermal physics.
 
Don't dark matter and dark energy (assuming these exist) mess up the concept of entropy? For example, the fact that galaxies appear to be accelerating away from each other (rather than decelerating)?
 
Dude

Entropy is, dimensionally, a measure of energy transfer per unit temperature: dS = \frac{dq_{rev}}{T}.
 
There are two definitions of entropy as I understand it: statistical entropy and entropy with regard to information theory.

Statistical entropy (from memory) is an expression of the number of micro-states consistent with a particular macro-state. Entropy with regard to information theory is the inverse of another abstract quantity, information, and is tied in with Shannon's information theory. The relation between statistical entropy and "informational" entropy is a subject of debate.
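For concreteness, here is a small sketch of both expressions; they share the form -\sum_i p_i \log p_i, differing only in the base of the logarithm and a factor of k_B (the probabilities below are made up for illustration).

Code:
import math

def entropy(p, base=2):
    """H = -sum of p_i * log_base(p_i); bits for base 2, nats for base e."""
    return -sum(x * math.log(x, base) for x in p if x > 0)

# Shannon entropy: a fair coin carries 1 bit per toss, a biased coin less.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # about 0.469 bits

# The Gibbs entropy of statistical mechanics is the same sum with the
# natural log and a factor of k_B: S = -k_B * sum of p_i ln p_i.
k_B = 1.380649e-23           # J/K
print(k_B * entropy([0.5, 0.5], base=math.e))  # about 9.57e-24 J/K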

Claude.
 