
About the 2nd law of thermodynamics

  1. Jul 18, 2013 #1
    This is going to be a newbie question....

    I can't understand why a system in thermodynamic equilibrium must have maximum entropy.


    And is it 100% certain that within an isolated system entropy never decreases?


    Thank you in advance for your reply...
     
  2. Jul 19, 2013 #2

    CWatters

    Science Advisor
    Homework Helper

    https://en.wikipedia.org/wiki/Entropy

    Imagine a nice regular array of unconstrained particles of some sort. If they are vibrating (aka hot), it's not difficult to imagine how they could bounce off each other and fly about until the nice regular structure is gone and disorder rules.

    It's much harder to imagine how a randomly distributed group of vibrating particles could somehow form a nice regular array. Clearly it's possible for crystals to form, but when a crystal forms, energy is released (so the system is not isolated).
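    To put a rough number on that intuition, here is a small Python sketch (my own illustration, not part of the original post): count the microstates of N particles that can each sit in the left or right half of a box. The ordered "all on one side" arrangement is a single microstate, while an even split corresponds to an astronomical number of them, so random motion essentially never wanders back to the ordered state.

        from math import comb

        # N particles, each independently in the left or right half of a box.
        # A microstate assigns each particle a side; a macrostate only records
        # how many particles are on the left.
        N = 100

        ordered = comb(N, 0)          # all particles on the left: 1 microstate
        even_split = comb(N, N // 2)  # 50/50 split: astronomically many

        print(f"all {N} on the left: {ordered} microstate(s)")
        print(f"50/50 split:        {even_split:.3e} microstates")
        # The ratio grows like 2^N / sqrt(N), so collisions that scramble the
        # array are overwhelmingly more likely than ones that restore it.
        print(f"ratio: {even_split / ordered:.3e}")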
     
  3. Jul 19, 2013 #3
    There are two perspectives on entropy: the thermodynamic perspective and the statistical perspective. In the thermodynamic picture, the existence of entropy is an empirical fact that follows from the observation that certain processes in nature are irreversible, while others are "nearly" reversible.

    In the classic example of a piston filled with an ideal gas, we observe that free expansion (breaking a diaphragm and letting the gas expand to a further wall) is an irreversible process, in the sense that we cannot restore the system to its original state while leaving the rest of the universe unchanged. The existence of irreversible processes establishes something like an "ordering" on the states of a thermodynamic system. Thus, for each closed system we can construct an entropy function S(X) such that S(X) increases under irreversible processes.

    The axioms of thermodynamics are then put in place so that the familiar thermodynamic manipulations can be done. For example, thermodynamic energy is introduced as a measurable parameter, and entropy is assumed to be additive (and can be chosen to be differentiable), allowing temperature to be defined through energy transfer to equilibrium in the absence of work. Note that the concept of equilibrium is also purely empirical in thermodynamics: the properties of equilibrium allow the division of physical systems into equivalence classes, and absolute temperature (as defined in any introductory thermodynamics textbook) is just a convenient way of labeling those equivalence classes.
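    A minimal numerical sketch of the piston example (the formula is the standard ideal-gas result; the numbers are mine, purely illustrative): for free expansion of n moles of an ideal gas from volume V1 to V2, the entropy change is ΔS = nR ln(V2/V1) > 0, and restoring the original state would require ΔS < 0 in an isolated system, which is exactly what irreversibility rules out.

        from math import log

        # Entropy change for the free expansion of an ideal gas (Joule expansion).
        # Temperature is unchanged in free expansion, so dS = n*R*ln(V2/V1).
        R = 8.314           # gas constant, J/(mol K)
        n = 1.0             # moles of gas (illustrative)
        V1, V2 = 1.0, 2.0   # volumes before/after the diaphragm breaks (only the ratio matters)

        dS = n * R * log(V2 / V1)
        print(f"dS = {dS:+.2f} J/K")  # +5.76 J/K: positive, hence irreversible
        # The reverse process would need dS = -5.76 J/K in an isolated system,
        # which the second law forbids.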

    The concept of entropy in statistical mechanics is stronger, in the sense that it carries a significantly greater amount of predictive power. Instead of just classifying processes as reversible or irreversible, statistical mechanics postulates that the entropy of a system at a certain energy is proportional to the logarithm of the number of ways in which the system can assume that energy (in classical physics, this means the size of a constant-energy surface in phase space). This establishes the functional relationship between entropy and energy, which is assumed already to be a function of the observable parameters such as volume, magnetization, etc.

    It is important to note that entropy does not have an instantaneous meaning in the way that energy does. In principle you could measure the energy of a system at any time and get an answer that fluctuates around the mean value. Entropy, on the other hand, requires a distribution in order to be meaningful. It is possible to define a limiting notion of entropy as an average over a finite interval of time, and it is certainly possible for this notion of entropy to decrease in time. However, the "real" entropy discussed in statistical mechanics is the limit as the averaging time approaches infinity, and thus acquires a time-independent meaning. (If you know information theory, you may notice that the assumption of equal a priori probabilities provides an upper bound on the statistical entropy of a system with a conservative Hamiltonian.)
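    As a toy illustration of that statistical definition, S = k_B ln Ω (the two-state model below is my own choice, not from the post): take N two-state particles, n of which are excited. The number of microstates of that macrostate is the binomial coefficient C(N, n), and the entropy peaks at the most mixed macrostate. That is precisely why the equilibrium (most probable) macrostate is also the maximum-entropy one, which answers the original question.

        from math import comb, log

        K_B = 1.380649e-23  # Boltzmann constant, J/K

        def boltzmann_entropy(omega: int) -> float:
            """S = k_B * ln(Omega): the statistical definition of entropy."""
            return K_B * log(omega)

        # Toy macrostates: N two-state particles with n of them excited.
        # Omega = C(N, n) counts the microstates compatible with each macrostate.
        N = 1000
        for n in (0, 100, 250, 500):
            omega = comb(N, n)
            print(f"n = {n:4d}: Omega = C({N},{n}), S = {boltzmann_entropy(omega):.3e} J/K")
        # S is largest at n = N/2, the most probable macrostate: equilibrium
        # and maximum entropy coincide.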
     



