Quantifying Entropy - an analogy

AI Thread Summary
The discussion explores the analogy between a deck of cards and thermodynamic systems in relation to entropy. It highlights that while shuffling increases the disorder of the cards, there is a limit to how much randomness can be achieved, paralleling the second law of thermodynamics. The conversation distinguishes between information entropy and thermodynamic entropy, emphasizing that the latter is linked to irreversible processes. The statistical mechanical definition of entropy is introduced, illustrating how arrangements of particles (or cards) determine entropy levels. Ultimately, the analogy holds, but the nuances of shuffling and entropy quantification are critical to understanding the concept.
quark1005
My friend and I were playing cards, and he was shuffling the deck. He claimed that the longer he shuffled, the 'more random' the arrangement of the cards would become. I argued that at a certain point the cards would be sufficiently disorganised that they were in a state with no pattern whatsoever, or at least that they would tend asymptotically towards some limiting value of disorganisation - a kind of logistic growth of disorganisation.

I compared a deck of cards to a thermodynamic system and noticed that the entropy - i.e. the disorganisation of the cards - increases with time. No matter how much you shuffle, organisation will not increase, which I noticed is remarkably similar to the 2nd law of thermodynamics. Upon research I confirmed this was not an original analogy (http://www.jce.divched.org/Journal/issues/1999/oct/abs1385.html)

This led me to question: how does one quantify entropy? In states approaching the limiting value of disorganisation, is it possible to become 'more disordered'? (I argued not, stating that shuffling for two hours instead of a couple of minutes will not make the game any fairer - i.e. will not make the allocation of cards any more 'random'.) Can I state that, at a certain point, a thermodynamic system, or my deck of cards, is 'random' / fully disordered?

Is a deck of cards a reasonable analogy to a thermodynamic system?
 
Although I note that Lambert looks at entropy literally, not as an analogy.
 
quark1005 said:
Is a deck of cards a reasonable analogy to a thermodynamic system?

Yup...it is. Of course, one thing you might have to look at is what is doing the shuffling. Whether the shuffle is mechanistic - that is, whether the shuffler makes no subjective choices during the process - does affect the entropy of the system. So human freedom does seem to have a bit of an effect on entropy.
 
Gear300 said:
Yup...it is. Of course, one thing you might have to look at is what is doing the shuffling. Whether the shuffle is mechanistic - that is, whether the shuffler makes no subjective choices during the process - does affect the entropy of the system. So human freedom does seem to have a bit of an effect on entropy.

Yes, I assume it'd be non-subjective shuffling.
 
quark1005 said:
Is a deck of cards a reasonable analogy to a thermodynamic system?

No, you have to distinguish between information entropy as defined by Shannon and entropy in the sense of thermodynamics and QM.

In information theory one quantifies the uncertainty associated with a random variable. This general formulation can be applied to card decks (and in some formulations also to thermodynamics).
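
As a rough illustration of the information-theoretic side (my own sketch, not part of the post above): if the deck is perfectly shuffled, every one of the 52! orderings is equally likely, and the Shannon entropy of that uniform distribution is log2(52!) bits.

```python
import math

# Shannon entropy (in bits) of a uniform distribution over all
# 52! equally likely orderings of a standard deck: H = log2(52!).
H = math.log2(math.factorial(52))
print(f"H = {H:.1f} bits")  # ~225.6 bits
```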

In thermodynamics, changes in entropy are usually associated with irreversible processes, which are of course absent if the only defining property of your system is the ordering of macroscopic objects like a deck of cards.
 
I like to use a different analogy to explain how entropy is quantified. First, the statistical mechanical definition of entropy:

S = k ln Γ

where Γ is the number of ways in which the system can be arranged (the number of microstates). Because S depends only logarithmically on Γ, the increase in entropy levels off as Γ grows.

How do we think of Γ? Well, consider a two-state system: a coin that can be either heads or tails. So, let's take a collection of ten coins in a box. Remember that S is a property of an ensemble of particles (e.g. a deck of cards, a mole of atoms) and not a property of a single particle.

Let's say all of these coins are initially showing heads, but you shake the box. As you would expect, you should get something near 5 heads and 5 tails after shaking the box up. The coins tend toward 5 heads and 5 tails because this represents an increase in entropy.

Here's how it works. There is only one arrangement of coins that gives 10 heads, so S = 0. However, there are many arrangements of coins that give 5 heads and 5 tails, for example: HHHHHTTTTT, HHHHTHTTTT, HTHTHTHTHT, TTHHTTHHTH, etc. In fact there are \Gamma = \frac{10!}{(5!)(5!)} = 252 such arrangements. You can also see that the entropy for 5 heads & 5 tails is higher than the entropy for other combinations (e.g. for 4 heads, 6 tails, Γ = 210). Of course, in this example with just 10 coins, you will see considerable fluctuation around the point of maximum entropy (because 252 is not so much larger than 210), but with a system of 10^23 particles you will see very little fluctuation.
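
Here is a quick sketch of that counting in Python (my own illustration, not from the post above), taking k_B = 1 so that S = ln Γ:

```python
from math import comb, log

# Count the microstates Gamma = C(10, k) for each number of heads k,
# and compute the entropy S = ln(Gamma) (units where k_B = 1).
n = 10
for k in range(n + 1):
    gamma = comb(n, k)  # arrangements of n coins with exactly k heads
    print(f"{k:2d} heads: Gamma = {gamma:3d}, S = {log(gamma):.2f}")

# Gamma peaks at k = 5 (252 arrangements), the maximum-entropy
# macrostate; all heads (k = 10) has Gamma = 1 and hence S = 0.
```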
 