
Quantifying Entropy - an analogy

  1. Aug 10, 2008 #1
    My friend and I were playing cards, and he was shuffling the deck. He claimed that the longer he shuffled, the 'more random' the arrangement of the cards would become. I argued that at a certain point the cards would be sufficiently disorganised such that they were in a state with no pattern whatsoever, or at least they would tend asymptotically towards some limiting value of disorganisation - logistic growth of disorganisation.

    I compared a deck of cards to a thermodynamic system and noticed that the entropy - i.e. the disorganisation of the cards - increases with time. No matter how much you shuffle, the organisation will not increase, which I noticed is remarkably similar to the 2nd law of thermodynamics. Upon research I confirmed this was not an original analogy (Lambert, 1999).

    This led me to question: how does one quantify entropy? In states approaching the limiting value of disorganisation, is it possible to become 'more disordered'? (I argued not, stating that shuffling for two hours instead of a couple of minutes will not make the game any fairer - i.e. will not make the allocation of cards any more 'random'.) Can I state that, at a certain point, a thermodynamic system, or my deck of cards, is 'random' / fully disordered, etc.?

    Is a deck of cards a reasonable analogy to a thermodynamic system?
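    To make the 'levelling off' idea concrete, here is a toy simulation I put together (just a sketch: the riffle model is the Gilbert-Shannon-Reeds one, and the helper names and trial count are arbitrary choices of mine). It estimates the Shannon entropy, in bits, of where the card that started on top ends up after repeated riffle shuffles:

[code]
import math
import random
from collections import Counter

def riffle(deck):
    """One riffle shuffle under the Gilbert-Shannon-Reeds model."""
    n = len(deck)
    cut = sum(random.random() < 0.5 for _ in range(n))       # binomial cut point
    left, right = list(deck[:cut]), list(deck[cut:])
    out = []
    while left or right:
        # Drop the next card from a packet with probability proportional to its size.
        if random.random() < len(left) / (len(left) + len(right)):
            out.append(left.pop(0))
        else:
            out.append(right.pop(0))
    return out

def top_card_entropy(num_shuffles, trials=4000):
    """Shannon entropy (bits) of the final position of the card that started on top."""
    counts = Counter()
    for _ in range(trials):
        deck = list(range(52))
        for _ in range(num_shuffles):
            deck = riffle(deck)
        counts[deck.index(0)] += 1
    return -sum((c / trials) * math.log2(c / trials) for c in counts.values())

for k in (1, 3, 5, 7, 9):
    print(k, round(top_card_entropy(k), 2))   # climbs towards log2(52) ~ 5.70, then flattens
[/code]

    Tracking a single card is only a crude proxy for the randomness of the whole deck, but the numbers climb towards log2(52) ≈ 5.7 and then flatten out, which is the asymptotic behaviour I was describing.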
     
  3. Aug 10, 2008 #2
    Although I note Lambert treats entropy literally, not as an analogy.
     
  4. Aug 10, 2008 #3
    Yup... it is. Of course, one thing worth looking at is what is doing the shuffling. Whether or not the shuffle is mechanistic - that is, the shuffler makes no subjective choices throughout the process - does affect the entropy of the system. So human freedom does seem to have a bit of an effect on entropy.
     
  5. Aug 10, 2008 #4
    Yes, I assume it'd be non-subjective shuffling.
     
  6. Aug 10, 2008 #5

    Cthugha

    Science Advisor

    No, you have to distinguish between information entropy as defined by Shannon and entropy in the sense of thermodynamics and QM.

    In information theory one quantifies the uncertainty associated with a random variable. This general formulation can be applied to card decks (and in some formulations also to thermodynamics).

    In thermodynamics, changes in entropy are usually associated with irreversible processes, which are of course absent if the only defining property of your system is the ordering of macroscopic objects like a card deck.
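
    As a small numerical sketch of the information-theoretic side (shannon_entropy is just an ad-hoc helper; the deck figure uses the fact that a uniform distribution over N outcomes has entropy log2 N):

[code]
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))        # 1.0

# For a uniform distribution over N outcomes, H reduces to log2(N).
# A well-shuffled 52-card deck has 52! equally likely orderings:
print(math.log2(math.factorial(52)))      # about 225.6 bits
[/code]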
     
  7. Aug 10, 2008 #6

    Ygggdrasil

    Science Advisor
    2015 Award

    I like to use a different analogy to explain how entropy is quantified. First, the statistical mechanical definition of entropy:

    [tex]S = k \ln \Gamma[/tex]

    where Γ is the number of ways in which a system can be arranged. As you can see, because S depends only on the logarithm of Γ, the increase in entropy levels off as Γ increases.

    How do we think of Γ? Well, consider a two-state system: a coin that can be either heads or tails. So, let's take a collection of ten coins in a box. Remember that S is a property of an ensemble of particles (e.g. a deck of cards, a mole of atoms) and not a property of a single particle.

    Let's say all of these coins are initially showing heads, but you shake the box. As you would expect, you should get something near 5 heads and 5 tails after shaking the box up. The coins tend toward 5 heads and 5 tails because this represents an increase in entropy.

    Here's how it works. There is only one arrangement of coins that gives 10 heads, so S = 0. However, there are many arrangements of coins that can give 5 heads, 5 tails, for example: HHHHHTTTTT, HHHHTHTTTT, HTHTHTHTHT, TTHHTTHHTH, etc. In fact there are [tex]\Gamma = \frac{10!}{(5!)(5!)} = 252[/tex] such arrangements. You can also see that the entropy for 5 heads & 5 tails is higher than the entropy for other combinations (e.g. for 4 heads, 6 tails, Γ = 210). Of course, in this example with just 10 coins you will see considerable fluctuation around the point of maximum entropy (because 252 is not so much larger than 210), but with a system of 10^23 particles you will see very little fluctuation.
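
    If you want to check these multiplicities numerically, here is a quick sketch (multiplicity is just an ad-hoc helper name, and k is the Boltzmann constant):

[code]
import math

k_B = 1.380649e-23   # Boltzmann constant k, in J/K

def multiplicity(n_coins, n_heads):
    """Number of distinct arrangements (microstates) with exactly n_heads heads."""
    return math.comb(n_coins, n_heads)

for heads in range(11):
    gamma = multiplicity(10, heads)
    entropy = k_B * math.log(gamma)       # S = k ln(Gamma)
    print(f"{heads:2d} heads: Gamma = {gamma:3d}, S = {entropy:.2e} J/K")
# Gamma peaks at 252 for 5 heads / 5 tails (210 for 4 heads); for 10 heads Gamma = 1, so S = 0.
[/code]

    The absolute values of S are tiny because k is tiny; it is the shape of the distribution over Γ, not the magnitude of S, that drives the coin argument.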
     