
Intuitive understanding of entropy

  1. Jul 2, 2007 #1
    Can someone please define entropy intuitively for me?
    Also, is [tex]\int_{i}^{f}\frac{dq}{T}[/tex] independent of the path or not?
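    The path-independence question can be checked numerically. The sketch below (my own illustration, not from the thread; the state values are made up) evaluates [tex]\int\frac{dq}{T}[/tex] for a monatomic ideal gas along two different reversible paths between the same two states, using the closed-form results for isothermal and constant-volume steps.

```python
import math

# Sketch: check that the integral of dq/T between two states of a
# monatomic ideal gas is the same along two different reversible paths.
R = 8.314          # gas constant, J/(mol K)
Cv = 1.5 * R       # molar heat capacity at constant volume, monatomic
n = 1.0            # moles

T1, V1 = 300.0, 0.010   # initial state (K, m^3)
T2, V2 = 600.0, 0.030   # final state

def dS_isothermal(Va, Vb):
    # reversible isothermal step: dq = p dV = nRT/V dV,
    # so the integral of dq/T is nR ln(Vb/Va), independent of T
    return n * R * math.log(Vb / Va)

def dS_isochoric(Ta, Tb):
    # reversible constant-volume heating: dq = n Cv dT,
    # so the integral of dq/T is n Cv ln(Tb/Ta)
    return n * Cv * math.log(Tb / Ta)

# Path A: expand isothermally at T1, then heat at constant volume V2
path_A = dS_isothermal(V1, V2) + dS_isochoric(T1, T2)
# Path B: heat at constant volume V1, then expand isothermally at T2
path_B = dS_isochoric(T1, T2) + dS_isothermal(V1, V2)

print(path_A, path_B)   # the two sums agree: dq/T integrates to the same value
```

    Both paths give the same total, which is why ∫dq/T (for reversible paths) defines a state function, the entropy.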
  3. Jul 2, 2007 #2
    Staff Emeritus
    Science Advisor
    Have you tried looking in a textbook, or in your course notes? These should be the first port of call, since your teacher knows the level of knowledge that you have, and so can scale the answer suitably.

    Here's a link to a good physics website which may help you: http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/seclaw.html#c4
  4. Jul 2, 2007 #3
    Science Advisor
    Homework Helper

    Entropy is - to me still - a strange concept. There are two ways to look at it.

    The first is from an intuitive point of view. We (at least I, hopefully you too) know that the infinitesimal change in energy of a system is
    dE = T dS - p dV + [itex]\mu[/itex] dN
    with E energy, T temperature, S entropy, p pressure, V volume, [itex]\mu[/itex] chemical potential and N number of particles.

    If you bring two systems into contact, quantities will flow between them until the temperature, pressure and chemical potential are equal. To get the pressure equal, they exchange volume (when allowed; for example, if you have a box with a movable wall, the wall will move to the position where the pressure in both parts is equal). To get the chemical potential equal, they exchange particles (when allowed; for example, through a permeable membrane). To get the temperature equal, heat flows from one system into the other; you could consider entropy the quantity (whatever it is) that is exchanged to reach thermal equilibrium.
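    A minimal worked example of that picture (my own illustration, with made-up heat capacities and temperatures): bring a hot and a cold block into thermal contact and let heat flow until the temperatures are equal. The hot block loses entropy, the cold block gains more, and the total entropy increases.

```python
import math

# Two blocks with constant heat capacities C1, C2 (J/K) at T1, T2 (K)
C1, T1 = 100.0, 400.0   # small hot block
C2, T2 = 300.0, 300.0   # large cold block

# Energy conservation fixes the common final temperature
Tf = (C1 * T1 + C2 * T2) / (C1 + C2)

# For constant C, the entropy change is the integral of C dT / T = C ln(Tf/Ti)
dS1 = C1 * math.log(Tf / T1)   # negative: the hot block loses entropy
dS2 = C2 * math.log(Tf / T2)   # positive, and larger in magnitude

print(Tf, dS1 + dS2)           # total entropy change is positive
```

    Heat flowing "downhill" in temperature always produces a net entropy increase; that is the second law in this simple setting.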

    Yet another way to introduce entropy is through the microcanonical ensemble. What you actually do is, given the energy of a system, count the number of microstates g it can be in. As these numbers often get very large, we can introduce entropy as the logarithm of this number, so that numbers of the order of [itex]10^{23}[/itex] become of order 23, which is more manageable.

    By the way, a book I can recommend to you is "Thermal Physics" by Charles Kittel / Herbert Kroemer. It's a very gentle and intuitive introduction to thermal physics.
    Last edited: Jul 2, 2007
  5. Jul 3, 2007 #4


    User Avatar
    Homework Helper

    Don't dark matter and dark energy (assuming these exist) mess up the concept of entropy? For example, the fact that galaxies appear to be accelerating away from each other (rather than decelerating)?
  6. Jul 3, 2007 #5

    Entropy is a measure of energy transfer per unit temperature: dS = dq/T.
  7. Jul 3, 2007 #6

    Claude Bile
    Science Advisor

    There are two definitions of entropy as I understand it: statistical entropy and entropy with regard to information theory.

    Statistical entropy (from memory) is an expression of the number of micro-states that correspond to a particular macro-state. Entropy with regard to information theory is the counterpart of another abstract quantity, information, and is tied in with Shannon's work. The relation between statistical entropy and "informational" entropy is a subject of debate.
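    The information-theoretic definition is easy to state concretely (my own illustration): the Shannon entropy of a probability distribution is H = -Σ p log₂ p, measured in bits.

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits; the 0 * log 0 terms are taken as 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1 bit of uncertainty
print(shannon_entropy([1.0]))         # certain outcome: 0 bits
print(shannon_entropy([0.25] * 4))    # uniform over 4 outcomes: 2 bits
```

    For a uniform distribution over g equally likely microstates this reduces to log g, which is the same form as the statistical entropy above; the debated question is how far that formal correspondence should be read physically.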
