
Is there any way to measure how random something is?

  1. Jun 17, 2009 #1
    Odd question that at first seemed to have an obvious answer: no.

    It all started when I realized that my friend is horrible at shuffling cards. The cards fall in large packets, so the deck ends up in layers of unshuffled cards rather than being thoroughly mixed. Then I thought to myself: no matter how they are shuffled, aren't they still just as random as if they'd been shuffled 100 times?

    Is there any way to measure how "well shuffled" cards are, or more generally, how random something is?
  4. Jun 17, 2009 #3
    You are correct that no matter what order the cards end up in, you can't tell from that order alone whether they are random. However, there are statistical tests designed to determine whether pseudorandom number generators (http://en.wikipedia.org/wiki/Pseudorandom_number_generator) are producing decent results. It is possible for a truly random source to produce a sequence that fails these tests, but that would be extremely unlikely. Conversely, it's possible for a sequence to pass the tests and still have a complex underlying pattern.

    It's been a while since I messed with this stuff, but Diehard is more or less the industry standard; I found some newer suites that also worked well. The problem with these tests is that they are generally intended for computer-based PRNGs, so they expect input in binary form and typically require millions of bits of data. A single deck of cards carries only about 226 bits (log2 of 52! ≈ 225.6), so your friend would need to do a lot of shuffling to produce enough data for those tests.

  5. Jun 17, 2009 #4
    I agree with what the other posters have said about random number tests, but looking at what the OP is actually trying to quantify, I think Boltzmann entropy would be a better measure.

    Boltzmann entropy for a deck of cards D, with respect to a particular game G, could be defined as:

    S = ln(number of distinct decks D' equivalent to D for the purposes of a round of G)

    From this definition it follows that the maximum-entropy states are those where the arrangement of the cards is 'most typical' with respect to G, while low-entropy states represent an atypical shuffling of the deck. There is a related information-theoretic analysis at http://www.cs.unm.edu/~saia/infotheory.html.
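    To make the definition concrete, here is a minimal sketch for a hypothetical game in which only the identity and order of the top k cards affect a round: every deck agreeing on those k cards is then equivalent to D, so the equivalence class has (52 - k)! members and S = ln((52 - k)!). The game, function names, and parameters are illustrative assumptions, not anything from the thread.

    ```python
    import math

    def ln_factorial(n):
        # ln(n!) via the log-gamma function, avoiding huge integers.
        return math.lgamma(n + 1)

    def entropy_top_k_game(deck_size=52, k=5):
        """S = ln(# of decks equivalent to D) for a hypothetical game where
        only the top k cards matter: the remaining deck_size - k cards can
        be permuted freely without changing the round."""
        return ln_factorial(deck_size - k)
    ```

    At k = 0 every deck is equivalent and S = ln(52!) ≈ 156.4 nats; at k = 52 each deck is its own class and S = 0. Note that in this particular toy game every equivalence class has the same size, so S is the same for all decks D; a more realistic G would give 'typical' shuffles larger classes than conspicuously ordered ones.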

    In my experience the problem with bad shuffling is that it causes card combinations similar to those in the previous round of the game to reappear, which is repetitive and therefore boring. It should be pretty simple to generate statistics about these kinds of repeats just by counting them, say something like: "Your shuffling has caused X incidents of repetition in the last Y rounds of the game, while with proper shuffling the probability of this is negligible." Working out exactly what "negligible" means here would be a worthy homework problem in combinatorics.
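    The counting idea above can be sketched directly. One crude "repeat" statistic is the number of adjacent card pairs that survive a shuffle intact: a lazy packet-style shuffle preserves nearly all of them, while a uniformly random shuffle preserves fewer than one on average (each of the 51 ordered pairs survives with probability 1/52). The `packet_shuffle` model and the packet size are illustrative assumptions of mine, not anything from the thread.

    ```python
    import random

    def preserved_adjacencies(before, after):
        """Count ordered pairs of cards adjacent in `before` that are
        still adjacent, in the same order, in `after`."""
        pairs = {(before[i], before[i + 1]) for i in range(len(before) - 1)}
        return sum((after[i], after[i + 1]) in pairs
                   for i in range(len(after) - 1))

    def packet_shuffle(deck, packet_size=10):
        """A lazy shuffle: cut the deck into consecutive packets and
        reorder the packets; cards inside each packet keep their order."""
        packets = [deck[i:i + packet_size]
                   for i in range(0, len(deck), packet_size)]
        random.shuffle(packets)
        return [card for p in packets for card in p]
    ```

    Comparing the statistic for a packet shuffle against a full `random.shuffle` of the same deck makes the "X incidents of repetition" report above easy to generate.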