Quantifying Entropy - an analogy

  • #1
quark1005
My friend and I were playing cards, and he was shuffling the deck. He claimed that the longer he shuffled, the 'more random' the arrangement of the cards would become. I argued that at a certain point the cards would be sufficiently disorganised such that they were in a state with no pattern whatsoever, or at least they would tend asymptotically towards some limiting value of disorganisation - logistic growth of disorganisation.

I compared a deck of cards to a thermodynamic system and noticed that the entropy - i.e. the disorganisation of the cards - increases with time. No matter how much you shuffle, organisation will not increase, which I noticed is remarkably similar to the 2nd law of thermodynamics. Upon research I confirmed this was not an original analogy (http://www.jce.divched.org/Journal/issues/1999/oct/abs1385.html)

This led me to question: how does one quantify entropy? In states approaching the limiting value for disorganisation, is it possible to become 'more disordered'? (I argued not, stating that shuffling for two hours instead of a couple of minutes will not make the game any fairer - i.e. will not make the allocation of cards any more 'random'.) Can I state that at a certain point a thermodynamic system, or my deck of cards, is 'random' / fully disordered, etc.?

Is a deck of cards a reasonable analogy to a thermodynamic system?
 
  • #2
Although I note Lambert looks at entropy literally, not as an analogy.
 
  • #3
quark1005 said:
Is a deck of cards a reasonable analogy to a thermodynamic system?

Yup...it is. Of course, one thing you might have to look at a bit is what is shuffling the cards. Whether or not the shuffle is mechanistic - the shuffler acting non-subjectively throughout the process - does affect the entropy of the system. So human freedom does seem to have a bit of an effect on entropy.
 
  • #4
Gear300 said:
Yup...it is. Of course, one thing you might have to look at a bit is what is shuffling the cards. Whether or not the shuffle is mechanistic - the shuffler acting non-subjectively throughout the process - does affect the entropy of the system. So human freedom does seem to have a bit of an effect on entropy.

Yes, I assume it'd be non-subjective shuffling.
 
  • #5
quark1005 said:
Is a deck of cards a reasonable analogy to a thermodynamic system?

No, you have to distinguish between information entropy as defined by Shannon and entropy in the sense of thermodynamics and QM.

In information theory one quantifies the uncertainty associated with a random variable. This general formulation can be applied to card decks (and in some formulations also to thermodynamics).

In thermodynamics, changes in entropy are usually associated with irreversible processes, which are of course absent if the only defining property of your system is the order of macroscopic objects like a card deck.
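To put rough numbers on the information-theory side: for a uniform distribution, Shannon's H = -Σ p·log2(p) reduces to log2 of the number of outcomes, so a perfectly shuffled deck carries log2(52!) ≈ 226 bits of uncertainty, and no amount of extra shuffling can push it higher. A minimal Python sketch (the function name and the example numbers are only illustrative):

[code]
import math

def shannon_entropy_bits(probabilities):
    # Shannon entropy H = -sum(p * log2(p)) in bits;
    # zero-probability outcomes contribute nothing
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit of uncertainty
print(shannon_entropy_bits([0.5, 0.5]))      # 1.0

# Fully shuffled deck: 52! equally likely orderings.
# For a uniform distribution H reduces to log2(number of outcomes),
# so there is no need to list all 52! probabilities explicitly.
print(math.log2(math.factorial(52)))         # ~225.6 bits
[/code]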
 
  • #6
I like to use a different analogy to explain how entropy is quantified. First, the statistical mechanical definition of entropy:

[tex]S = k \ln \Gamma[/tex]

where Γ is the number of ways in which the system can be arranged. Because of the logarithm, the increase in entropy slows down as Γ grows, so for a finite system the entropy levels off at a maximum value.

How do we think of Γ? Well, consider a two-state system: a coin that can be either heads or tails. So, let's take a collection of ten coins in a box. Remember that S is a property of an ensemble of particles (e.g. a deck of cards, a mole of atoms) and not a property of a single particle.

Let's say all of these coins are initially showing heads, but you shake the box. As you would expect, you should get something near 5 heads and 5 tails after shaking the box up. The coins tend toward 5 heads and 5 tails because this represents an increase in entropy.

Here's how it works. There is only one arrangement of coins that gives 10 heads, so S = k ln 1 = 0. However, there are many arrangements of coins that can give 5 heads, 5 tails, for example: HHHHHTTTTT, HHHHTHTTTT, HTHTHTHTHT, TTHHTTHHTH, etc. In fact there are [tex]\Gamma = \frac{10!}{(5!)(5!)} = 252[/tex] such arrangements. You can also see that the entropy for 5 heads & 5 tails is higher than the entropy for any other combination (e.g. for 4 heads, 6 tails, Γ = 210). Of course, in this example with just 10 coins you will see considerable fluctuation around the point of maximum entropy (because 252 is not so much larger than 210), but with a system of 10^23 particles you will see very little fluctuation.
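A quick numerical check of those counts (a minimal Python sketch; assumes Python 3.8+ for math.comb):

[code]
from math import comb, log

# Gamma = number of arrangements (microstates) with k heads out of 10 coins,
# and the corresponding dimensionless entropy S/k = ln(Gamma)
for k in range(11):
    gamma = comb(10, k)
    print(k, gamma, round(log(gamma), 3))

# k = 5 gives Gamma = 252 (the maximum), k = 4 or 6 give 210,
# and k = 0 or 10 give Gamma = 1, i.e. S = 0.
[/code]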
 

FAQ

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a concept in thermodynamics that describes the amount of energy that is no longer available to do work.

2. How is entropy quantified?

Entropy is quantified using the equation S = k ln W, where S represents the entropy, k is the Boltzmann constant, and W is the number of microstates or possible arrangements of a system.
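As a sketch of how that formula is used numerically (the helper name is arbitrary; the constant is the standard Boltzmann constant in J/K):

[code]
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    # S = k * ln(W) for W equally probable microstates
    return K_B * math.log(w)

print(boltzmann_entropy(1))    # 0.0       -> a single arrangement has zero entropy
print(boltzmann_entropy(252))  # ~7.6e-23  -> the 10-coin, 5-heads example above
[/code]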

3. Can you provide an analogy to explain entropy?

One analogy for entropy is a deck of cards. When the deck is in order, it has low entropy because there is only one possible arrangement. As the cards are shuffled and become more disordered, the entropy increases because there are more possible arrangements.
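As a rough worked number (treating every ordering of a standard 52-card deck as an equally accessible arrangement, which is an idealisation): the ordered deck has W = 1, so S = k ln 1 = 0, while a fully shuffled deck has W = 52! ≈ 8.07 × 10^67, giving

[tex]S = k \ln(52!) \approx 156\,k \approx 2.2 \times 10^{-21}\ \mathrm{J/K}[/tex]

Once all 52! orderings are equally likely this value cannot grow any further, which is why shuffling for longer does not make the deck any 'more random'.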

4. How is entropy related to the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of an isolated system can never decrease over time. This means that such systems tend to move towards a state of maximum disorder or entropy.

5. Can entropy be reversed or decreased?

In isolated systems, entropy can only increase or remain constant. However, in open systems where energy and matter can be exchanged with the surroundings, local decreases in entropy are possible but will always result in an overall increase in the entropy of the universe.
