# Intuitive understanding of entropy

1. Jul 2, 2007

### pardesi

Can someone please define entropy intuitively for me?
Also, which of these is independent of the path: $$\int_{i}^{f}\frac{dq}{T}$$ or
$$\int_{i}^{f}\frac{dq_{rev}}{T}$$ ?

2. Jul 2, 2007

### cristo

Staff Emeritus
Have you tried looking in a text book, or in your course notes? These should be the first port of call, since your teacher knows the level of knowledge that you have, and so can scale the answer suitably.

3. Jul 2, 2007

### CompuChip

Entropy is - to me still - a strange concept. There are two ways to look at it.

The first is from an intuitive point of view. We (at least I, hopefully you too) know that the infinitesimal change in energy of a system is
$$dE = T\,dS - p\,dV + \mu\,dN$$
with E energy, T temperature, S entropy, p pressure, V volume, $\mu$ chemical potential and N number of particles.

If you bring two systems into contact, they will exchange quantities until the temperature, pressure and chemical potential are equal. To equalize the pressure, they exchange volume (when allowed; for example, if you have a box with a movable wall, the wall moves to the position where the pressure in both parts is equal). To equalize the chemical potential, they exchange particles (when allowed; for example, through a permeable membrane). To equalize the temperature, heat flows from one system into the other. You could consider entropy the quantity (whatever it is) that is exchanged to reach thermal equilibrium.
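To make this picture concrete, here is a toy sketch (not from the thread; the heat capacity, parcel size and temperatures are all assumed for illustration): two blocks with equal heat capacity exchange heat in small parcels until their temperatures match. Each parcel of heat $dq$ changes the total entropy by $dq/T_{cold} - dq/T_{hot}$, which is positive as long as the hot block is hotter.

```python
# Toy sketch, assumed parameters: two blocks with equal heat capacity c
# exchange heat in parcels of size dq until their temperatures match.
def equilibrate(t_hot, t_cold, c=1.0, dq=0.01):
    ds_total = 0.0
    while t_hot - t_cold > 1e-9:
        ds_total += dq / t_cold - dq / t_hot  # entropy produced this step
        t_hot -= dq / c                       # hot block loses heat
        t_cold += dq / c                      # cold block gains heat
    return t_hot, t_cold, ds_total

t_hot, t_cold, ds = equilibrate(400.0, 300.0)
print(t_hot, t_cold, ds)  # both temperatures end near 350; ds is positive
```

For these particular numbers the discrete sum approaches the analytic result $C\ln\frac{T_f^2}{T_h T_c} = \ln\frac{350^2}{400 \cdot 300} \approx 0.0206$, and it is positive, as the second law demands.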

Yet another way to introduce entropy is via the microcanonical ensemble. What you actually do is, given the energy of a system, count the number of microstates g it can be in. As these numbers often get very large, we introduce entropy as the logarithm of this number, so that numbers on the order of $10^{23}$ become order 23, which is more manageable.
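As a hypothetical mini-example of this counting (the spin system and numbers are mine, not from the post): for $N$ two-state spins with exactly $n$ spins up, the number of microstates is the binomial coefficient $g = \binom{N}{n}$, and the dimensionless entropy is simply $\log g$.

```python
import math

# Assumed toy model: N two-state spins, energy fixed by the number of
# "up" spins, so the microstate count is a binomial coefficient.
def microcanonical_entropy(n_spins, n_up):
    g = math.comb(n_spins, n_up)  # number of microstates at this "energy"
    return math.log(g)

# g for 100 spins with 50 up is about 1e29, but ln g is a manageable ~66.8
print(microcanonical_entropy(100, 50))
```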

By the way, a book I can recommend to you is "Thermal Physics" by Charles Kittel / Herbert Kroemer. It's a very gentle and intuitive introduction to thermal physics.

Last edited: Jul 2, 2007
4. Jul 3, 2007

### rcgldr

Don't dark matter and dark energy (assuming these exist) mess up the concept of entropy? For example, the fact that galaxies appear to be accelerating away from each other (rather than decelerating)?

5. Jul 3, 2007

### PhillipKP

Dude

Entropy change is a measure of reversible heat transfer per unit temperature.

6. Jul 3, 2007

### Claude Bile

There are two definitions of entropy as I understand it: statistical entropy, and entropy in the sense of information theory.

Statistical entropy (from memory) is an expression of the number of microstates compatible with a particular macrostate. Entropy in information theory is the counterpart of another abstract quantity, information, and is tied in with Shannon's information theory. The relation between statistical entropy and "informational" entropy is a subject of debate.

Claude.
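The information-theoretic entropy mentioned above can be sketched in a few lines (a minimal illustration, assuming the standard Shannon definition $H = -\sum_i p_i \log_2 p_i$ over a discrete probability distribution): a fair coin carries 1 bit of entropy, while a certain outcome carries none.

```python
import math

# Shannon entropy of a discrete distribution, in bits.
# Terms with p = 0 contribute nothing (lim p*log p -> 0), so skip them.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits
print(shannon_entropy([0.25] * 4))  # four equal outcomes: 2.0 bits
```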