
The Deal or No Deal dilemma

  1. May 14, 2007 #1
The "Deal or No Deal" dilemma

    The problem below combines the familiar quiz show with a dash of Monty Hall.

    Deal or No Deal.
    22 boxes hiding money prizes.
    11 blue (low). 11 red (high).
    You choose one box at random.
    Obviously you want it to be red.
    The other boxes are revealed one by one

    The game follows the normal rules of "Deal or No Deal", with the following caveat:

    With every box you open, the host will always reveal one of the opposite colour. He has been told which boxes have reds and which have blues, but he does not know the values.

    Thus, at every stage of the game (after the host opens a box) the number of blues and reds unopened is exactly equal.

    You've just played to the end, refusing all deals.
    Twenty boxes have been opened.
    There are two remaining.
    For the sake of drama, let's say these are 1c and $1,000,000.
    The banker has offered you a swap.
    Do you accept it?

    Does it matter which boxes you or the host opened?
    Or are the odds 50/50 regardless?

    Typical example:

    Say, in the course of the game you opened 7 blues and 3 reds.
    The host complemented each occasion with 7 reds and 3 blues.

    Should you stick or swap?
    Or does it make no difference?


    Simon
     
  3. May 15, 2007 #2

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    What do you mean by swap? You never get to switch which box you open in that game... do you mean take the deal? It depends on what he offers you, obviously. You have no way of telling which box is 1c and which is $1,000,000.
     
  4. May 15, 2007 #3

    matt grime

    Science Advisor
    Homework Helper

    Deal or no deal differs in 2 fundamental ways from Monty Hall.

    1. You don't swap. You are only ever offered a total to duck out at any given point. That total is always less than your expected winnings at that point in time. The point is that you only get to play the game once, so there is no way to 'play the averages'.

    2. The boxes that are opened in that game are opened at random, thus it is perfectly possible that at each reveal you see all the high money disappearing. Monty Hall is predicated on the knowledge that the host only shows you one of the booby prizes. If we change the Monty dilemma so that Monty opens the door at random, then that is a different question.
     
  5. May 15, 2007 #4
    In the UK version, you do. The banker almost always offers you the opportunity to swap boxes at the end. I figured this was universal. Anyway, that is exactly what swap means.

    This puzzle is influenced by "Deal or no Deal". That does not mean it is "Deal or no Deal".

    As an exercise in probability, it should simply be taken as it is written here. The fact that the actual show differs in any way is a red herring.

    For this version:

    Assume that you turn down all the banker's money offers.

    Assume that for each box you randomly open, the host makes sure that one of the opposite colour is revealed. So every pair of boxes opened always consists of a red and a blue, in any order.

    Assume that you are allowed to swap at the end when there are only two boxes left - your own box and one other. You know that one is blue and the other is red.

    The question is:

    Are the odds always 50/50, whether you take the swap or not?
    Or can the order in which each previous pair of boxes was eliminated affect the odds of where the red is most likely to be?

    Bear in mind, there are always 10 rounds before you get to the end and make your decision. A round always consists of you opening one box followed by the host opening one of the opposite colour.

    Typical example, restated:

    In 7 rounds, a blue prize was revealed followed by a red.
    In 3 rounds, a red prize was revealed followed by a blue.

    Is this relevant to whether your box or the other is more likely to contain the red prize? Or are the odds 50/50?
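    One way to get a feel for the question is to simulate the game as restated above. The sketch below is my own illustration (not from the show): it uses 6 boxes rather than 22 so that any particular reveal sequence comes up often, and estimates the chance that the player's own box is red, conditioned on the player revealing blue first and then red.

    ```python
    import random

    def play(n, rng):
        """Play one game with 2n boxes (n red, n blue).  Returns the
        colour sequence of the player's own reveals and the colour of
        the player's hidden box."""
        boxes = ["R"] * n + ["B"] * n
        rng.shuffle(boxes)
        own = boxes.pop()  # the box the player keeps
        reveals = []
        while len(boxes) > 1:
            # the player opens a box at random ...
            c = boxes.pop(rng.randrange(len(boxes)))
            reveals.append(c)
            # ... and the host knowingly opens one of the opposite colour
            boxes.remove("B" if c == "R" else "R")
        return "".join(reveals), own

    # With 6 boxes, condition on the player revealing blue then red.
    rng = random.Random(1)
    wins = games = 0
    for _ in range(200_000):
        seq, own = play(3, rng)
        if seq == "BR":
            games += 1
            wins += own == "R"
    print(wins / games)  # close to 3/7, noticeably different from 0.5
    ```

    The estimate already suggests the answer below is not a flat 50/50: the order of the reveals carries information.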



    Simon
     
  6. May 15, 2007 #5

    matt grime

    Science Advisor
    Homework Helper

    So, at the end, you have two boxes. You get to pick either of them, then. There is no gain, nor loss, in switching, and it is not like the Monty Hall problem. (Why do you bother invoking Deal or No Deal at all?) The reveals have given you no more information about where things are. I mean, the whole point about Monty Hall is that the reveals are actually irrelevant: you're being given the opportunity to choose either 1 door or 2 doors, in effect.

    Of course there might be some hidden stuff going on. If you want to think about it more then why bother with 10 rounds of choosing - it's less confusing to think about 1 round of choosing, and is precisely the same problem.
     
    Last edited: May 15, 2007
  7. May 15, 2007 #6

    I watched the first two or three, and Howie did offer to swap cases in one of those first shows to one person.
     
  8. May 15, 2007 #7
    No it isn't. :smile:

    You're assuming that 50/50 odds apply, no matter what. You're wrong about this.

    With multiple rounds, the probability of what's in the last two boxes can vary. Ten rounds illustrate this well, which was one reason to invoke "Deal or No Deal". Another was people's familiarity with the format. However, the same principle applies with any number of rounds.

    The solution is:

    The odds for the last two boxes are only 50/50 when you and the host both open an equal number of blues and reds. In every other scenario, the odds are never 50/50. The probability of which box contains the red is determined by who revealed the most of what.

    The law of probability here is:

    Whatever colour the host revealed more of is more likely to be in your box.
    Whatever colour you revealed more of is more likely to be in the other box.

    If the host revealed more reds, you should stick.
    If you revealed more reds, you should swap.

    In the example given, where the host opened 7 reds and 3 blues, your box is the one that probably contains a red. The odds are exactly 2/3 that you will win by sticking.

    Though I've given the answer, I haven't yet explained why this is so or how I arrive at the precise probability.

    I'll do that in the next post, unless someone else agrees with me and works it out. :wink:
     
    Last edited: May 15, 2007
  9. May 15, 2007 #8
    This is pretty interesting. Suppose there are 4 cases. Let B_1 be my chosen case, and let B_1 = 1 denote the event that the case is red and B_1 = 0 the event that the case is blue. Once I choose my case, I will then reveal a case from the remaining three. Call this B_2. Then

    P(B_1 = 1 | B_2 = 1) = P(B_1 = 1, B_2 = 1)/P(B_2 = 1)
    = P(B_1 = 1, B_2 = 1)/[P(B_1 = 0, B_2 = 1) + P(B_1 = 1, B_2 = 1)]
    = (1/2)(1/3)/[(1/2)(2/3) + (1/2)(1/3)]
    = 1/3.

    So if I reveal a red case, then the probability my own case is red is only 1/3. The host will subsequently reveal a blue case, but that does not affect this probability, so I should switch.
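    The 4-case figure can be double-checked by brute-force enumeration. This small Python sketch (mine, exact arithmetic) lists every arrangement of 2 red and 2 blue cases, with position 0 as the kept case and position 1 as the first case revealed; each distinct arrangement is equally likely:

    ```python
    from fractions import Fraction
    from itertools import permutations

    # Enumerate every distinct ordering of 2 red (R) and 2 blue (B) cases.
    both = 0        # my case is red AND my first reveal is red
    reveal_red = 0  # my first reveal is red
    for arr in set(permutations("RRBB")):
        if arr[1] == "R":
            reveal_red += 1
            if arr[0] == "R":
                both += 1

    print(Fraction(both, reveal_red))  # 1/3
    ```

    Of the six distinct arrangements, three have a red first reveal and only one of those also has a red kept case, reproducing the 1/3.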

    However, things get more complicated with 6 cases. Again, let B_1 be my chosen case. Let B_2 be the first case I reveal from the remaining five. The host then reveals a case of opposite color. Let B_3 be the case I reveal after that. Then

    P(B_1 = 1 | B_2 = 0, B_3 = 1)
    = P(B_1 = 1, B_2 = 0, B_3 = 1)/[P(B_1 = 0, B_2 = 0, B_3 = 1) + P(B_1 = 1, B_2 = 0, B_3 = 1)].

    In order to get B_1 = 1, B_2 = 0, and B_3 = 1, I must first choose a red case (prob. 1/2), then reveal a blue case (prob. 3/5). The host then reveals a red case, leaving one red and two blue in the field. I must then choose a red (prob. 1/3). So

    P(B_1 = 1, B_2 = 0, B_3 = 1) = (1/2)(3/5)(1/3) = 1/10.

    Similarly,

    P(B_1 = 0, B_2 = 0, B_3 = 1) = (1/2)(2/5)(2/3) = 2/15.

    So

    P(B_1 = 1 | B_2 = 0, B_3 = 1) = (1/10)/(2/15 + 1/10) = 3/7.

    On the other hand, similar calculations show that

    P(B_1 = 1, B_2 = 1, B_3 = 0) = (1/2)(2/5)(2/3) = 2/15.
    P(B_1 = 0, B_2 = 1, B_3 = 0) = (1/2)(3/5)(1/3) = 1/10.
    P(B_1 = 1 | B_2 = 1, B_3 = 0) = (2/15)/(1/10 + 2/15) = 4/7.

    In summary, in the 6 cases scenario, if the first case I reveal is blue and the second is red, then my case is red with probability 3/7 and I should swap. But if the first case I reveal is red and the second is blue, then my case is red with probability 4/7 and I should keep it. In other words, it is not enough to know how many red cases I revealed. I must also know the order in which they were revealed.
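    These case-by-case calculations generalise neatly: given the colour sequence of my own reveals, the likelihood under each hypothesis about my case is a product of sequential draw probabilities, because the host removes one case of the opposite colour after each of my reveals (so the field loses one red and one blue every round). A short Python sketch with exact arithmetic (the function names are mine):

    ```python
    from fractions import Fraction

    def seq_prob(reveals, reds, blues):
        """Probability that I reveal this colour sequence, given the
        field starts with `reds` red and `blues` blue cases.  After
        each of my reveals the host removes one of the opposite colour,
        so one red and one blue leave the field every round."""
        p = Fraction(1)
        r, b = reds, blues
        for colour in reveals:
            p *= Fraction(r if colour == "R" else b, r + b)
            r -= 1
            b -= 1
        return p

    def p_my_case_red(reveals, n):
        """Posterior that my case is red, with 2n cases in total."""
        # If my case is red the field starts with n-1 reds and n blues;
        # if blue, with n reds and n-1 blues.  Prior is 1/2 either way.
        like_red = seq_prob(reveals, n - 1, n)
        like_blue = seq_prob(reveals, n, n - 1)
        return like_red / (like_red + like_blue)

    print(p_my_case_red("R", 2))   # 1/3  (the 4-case scenario)
    print(p_my_case_red("BR", 3))  # 3/7
    print(p_my_case_red("RB", 3))  # 4/7
    ```

    This reproduces all three probabilities computed above, and extends directly to the 22-box game by calling `p_my_case_red` with n = 11 and a 10-letter reveal sequence.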
     
  10. May 15, 2007 #9
    I'm carefully analysing this, Jason.

    In the six box scenario, I expected to find that the odds were 50/50 whenever you revealed one of each colour.

    But you're right. The odds are indeed 3/7 or 4/7, depending on which you reveal first.

    I assume this must affect my answer in the 22 box scenario?


    Simon
     
  11. May 15, 2007 #10
    My guess is that it does. It would be interesting to work out the general solution and find some elegant and compact way of representing it.
     
  12. May 15, 2007 #11
    Ok everyone. Exact answers subject to analysis. :smile:

    However, you are getting information about which of the two remaining boxes is more likely to be red.

    I'm now wondering:

    Regardless of how many boxes you start with, is there any scenario - with the host revealing the opposite colour of what you revealed - where the probability is 1/2 that your box is red at the end?

    We must assume an even number of boxes, with an equal number of reds and blues. This ensures that your chosen box always begins with a 1/2 chance of being red or blue.

    With 4 boxes, where you randomly revealed one colour and the host knowingly revealed the other, the odds for the remaining 2 boxes are never 50/50. That I already knew.

    Confounding my expectation, Jason's equations proved that with 6 boxes, the odds are also never 50/50. If you revealed one of each colour and the host mirrored you, the odds are either 3/7 or 4/7 that your box is red - depending on what you revealed first. If you revealed two of the same colour and the host mirrored you, the odds are 1/4 or 3/4 that your box is red - depending on what colour you revealed.

    Does the same pattern continue with n boxes?

    Simon
     
    Last edited: May 15, 2007
  13. May 16, 2007 #12

    NateTG

    Science Advisor
    Homework Helper

    Let's say we have [itex]2n[/itex] boxes with [itex]n\geq1[/itex], and play proceeds as follows:

    First the player chooses one box, and puts it on the table.


    From there forward, as long as there is more than 1 box left,
    the player chooses a box and it is revealed, and then the host
    chooses and reveals a box of the other color.
    When there is one box left, the player may choose to switch.

    Then, the probability that the ith box revealed by the player matches the player's hidden box is:
    [tex]\frac{n-i}{2(n-i)+1}[/tex]
    and the probability that the ith revealed box does not match is:
    [tex]\frac{n-i+1}{2(n-i)+1}[/tex]

    Since the denominators match for the initial-red and initial-blue scenarios, the relative probabilities are determined by the numerators.

    Now, the original post gives an example with 3 of the 10 player-revealed boxes being one colour. We know that order matters, and there are 10 choose 3 = 120 possible scenarios, which is more than I want to go through by hand - so I'll just do the extreme ones here:

    In the 'most blue' scenario, all 7 of the blue boxes revealed by the player come first, so the relative probabilities will be:
    [tex]10\times9\times8\times7\times6\times5\times4\times4\times 3\times 2:11 \times 10 \times 9 \times 8 \times 7 \times 6 \times 5 \times 3 \times 2 \times 1[/tex]
    Using some cancellation this works out to
    [tex]16:11[/tex]
    so the chance that the player's box is blue would be [itex]\frac{16}{27}[/itex]

    If, on the other hand, the last 7 boxes are blue, it's
    [tex]11 \times 10 \times 9 \times 7 \times 6 \times 5 \times 4 \times 3 \times 2 \times 1 : 10 \times 9 \times 8 \times 8 \times 7 \times 6 \times 5 \times 4 \times 3 \times 2[/tex]
    [tex]11 : 64[/tex]
    so the chance that the player's box is blue is [itex]\frac{11}{75}[/itex]

    Since we're not given significant information that the player would have, the natural reaction is to suggest that the initial post should be more specific.

    It shouldn't be too hard to work out that the only scenario where, provided with the actual sequence of reveals, the odds are actually 1:1 will be when [itex]n=1[/itex].

    ...
    Now, because the initial post does not actually give us the ordering, it's also possible to generate probabilities for all 120 scenarios involving those reveals, and then determine whether the player is more likely to have a red or blue box based on that. In this type of scenario, Simon 6's notion of the player's box being the complement of the colour the player most frequently revealed will hold, with a 50/50 split for even red/blue reveal counts.
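    NateTG's numerator products are easy to mechanise. The sketch below (my own phrasing, exact arithmetic) keeps only the numerators, since the per-round denominators 2(n-i)+1 are the same under either hypothesis, and reproduces both extreme scenarios for the 22-box game:

    ```python
    from fractions import Fraction

    def p_player_box_blue(reveals):
        """Posterior that the player's box is blue, given the colour
        sequence of the player's own reveals (the host mirrors each
        one with the opposite colour).  With 2n boxes there are n-1
        player reveals; a reveal that matches the hidden box has
        per-round probability (n-i)/(2(n-i)+1), a mismatch
        (n-i+1)/(2(n-i)+1), and the denominators cancel."""
        n = len(reveals) + 1
        num_blue = num_red = 1
        for i, colour in enumerate(reveals, start=1):
            if colour == "B":
                num_blue *= n - i      # matches a hidden blue box
                num_red *= n - i + 1   # mismatches a hidden red box
            else:
                num_blue *= n - i + 1
                num_red *= n - i
        return Fraction(num_blue, num_blue + num_red)

    print(p_player_box_blue("B" * 7 + "R" * 3))  # 16/27
    print(p_player_box_blue("R" * 3 + "B" * 7))  # 11/75
    ```

    The same function gives 4/7 for the 6-box "blue then red" sequence, i.e. a 3/7 chance the player's box is red, agreeing with the earlier post-by-post calculation.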
     