The Deal or No Deal dilemma

  • Thread starter Simon 6
  • #1
Simon 6
The "Deal or No Deal" dilemma

The problem below combines the familiar quiz show with a dash of Monty Hall.

Deal or No Deal.
22 boxes hiding money prizes.
11 blue (low). 11 red (high).
You choose one box at random.
Obviously you want it to be red.
The other boxes are revealed one by one

The game follows the normal rules of "Deal or No Deal", with the following caveat:

With every box you open, the host will always reveal one of the opposite colour. He has been told which boxes have reds and which have blues, but he does not know the values.

Thus, at every stage of the game (after the host opens a box) the number of blues and reds unopened is exactly equal.

You've just played to the end, refusing all deals.
Twenty boxes have been opened.
There are two remaining.
For the sake of drama, let's say these are 1c and $1,000,000.
The banker has offered you a swap.
Do you accept it?

Does it matter which boxes you or the host opened?
Or are the odds 50/50 regardless?

Typical example:

Say, in the course of the game you opened 7 blues and 3 reds.
The host complimented each occasion with 7 reds and 3 blues.

Should you stick or swap?
Or does it make no difference?


Simon
 
  • #2
What do you mean by swap? You never get to switch which box you open in that game... do you mean take the deal? It depends on what he offers you obviously. You have no way of telling which box is 1c and which box is 1,000,000 dollars
 
  • #3
Deal or No Deal differs in two fundamental ways from Monty Hall.

1. You don't swap. You are only ever offered a total to duck out at any given point. That total is always less than your expected winnings at that point in time. The point is that you only get to play the game once, so there is no way to 'play the averages'.

2. The boxes that are opened in that game are done at random, thus it is perfectly possible that at each reveal you see all the high money disappearing. Monty Hall is predicated on the knowledge that the host only shows you one of the booby prizes. If we change the Monty dilemma so that Monty opens the door at random, then that is a different question.
 
  • #4
You never get to switch which box you open in that game

In the UK version, you do. The banker almost always offers you the opportunity to swap boxes at the end. I figured this was universal. Anyway, that is exactly what swap means.

This puzzle is influenced by "Deal or no Deal". That does not mean it is "Deal or no Deal".

As an exercise in probability, it should simply be taken as written. The fact that the actual show differs in any way is a red herring.

For this version:

Assume that you turn down all the banker's money offers.

Assume that for each box you randomly open, the host makes sure that one of the opposite colour is revealed. So every pair of boxes opened always consists of a red and a blue, in any order.

Assume that you are allowed to swap at the end when there are only two boxes left - your own box and one other. You know that one is blue and the other is red.

The question is:

Are the odds always 50/50, whether you take the swap or not?
Or can the order in which each previous pair of boxes was eliminated affect the odds of where the red is most likely to be?

Bear in mind, there are always 10 rounds before you get to the end and make your decision. A round always consists of you opening one box followed by the host opening one of the opposite colour.

Typical example, restated:

In 7 rounds, a blue prize was revealed followed by a red.
In 3 rounds, a red prize was revealed followed by a blue.

Is this relevant to whether your box or the other is more likely to contain the red prize? Or are the odds 50/50?



Simon
 
  • #5
So, at the end, you have two boxes. You get to pick either of them. There is no gain, nor loss, in switching, and it is not like the Monty Hall problem. (Why do you bother invoking Deal or No Deal at all?) The reveals have given you no more information about where things are. I mean, the whole point about Monty Hall is that the reveals are actually irrelevant: you're being given the opportunity to choose either 1 door or 2 doors in effect.

Of course there might be some hidden stuff going on. If you want to think about it more then why bother with 10 rounds of choosing - it's less confusing to think about 1 round of choosing, and is precisely the same problem.
 
Last edited:
  • #6
Office_Shredder said:
What do you mean by swap? You never get to switch which box you open in that game... do you mean take the deal? It depends on what he offers you obviously. You have no way of telling which box is 1c and which box is 1,000,000 dollars


I watched the first two or three, and in one of those first shows Howie did offer one person the chance to swap cases.
 
  • #7
why bother with 10 rounds of choosing - it's less confusing to think about 1 round of choosing, and is precisely the same problem.

No it isn't. :smile:

You're assuming that 50/50 odds apply, no matter what. You're wrong about this.

With multiple rounds, the probability of what's in the last two boxes can vary. Ten rounds illustrate this well, which was one reason to invoke "Deal or No Deal". Another was people's familiarity with the format. However, the same principle applies with any number of rounds.

The solution is:

The odds for the last two boxes are only 50/50 when you and the host both open an equal number of blues and reds. In every other scenario, the odds are never 50/50. The probability of which box contains the red is determined by who revealed the most of what.

The law of probability here is:

Whatever colour the host revealed more of is more likely to be in your box.
Whatever colour you revealed more of is more likely to be in the other box.

If the host revealed more reds, you should stick.
If you revealed more reds, you should swap.

In the example given, where the host opened 7 reds and 3 blues, your box is the one that probably contains a red. The odds are exactly 2/3 that you will win by sticking.

Though I've given the answer, I haven't yet explained why this is so or how I arrive at the precise probability.

I'll do that in the next post, unless someone else agrees with me and works it out. :wink:
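Before the explanation, the stick/swap rule is easy to stress-test with a quick simulation. This is a sketch of my own (the names `play_once`, `hits` and so on are mine, not from the thread): it plays the 22-box game many times, conditions only on the player having revealed exactly 7 blues and 3 reds in any order, and estimates the order-averaged probability that the player's box is red.

```python
import random

def play_once(rng, n=11):
    """One 2n-box game: the player hides one box, then each round the
    player reveals a random field box and the host (who knows the
    colours) reveals one of the opposite colour.
    Returns (player's box colour, list of colours the player revealed)."""
    boxes = ['r'] * n + ['b'] * n
    rng.shuffle(boxes)
    mine, field = boxes[0], boxes[1:]
    revealed = []
    while len(field) > 1:                             # 10 rounds for n = 11
        pick = field.pop(rng.randrange(len(field)))   # player's random pick
        revealed.append(pick)
        field.remove('b' if pick == 'r' else 'r')     # host's forced reveal
    return mine, revealed

rng = random.Random(1)
hits = total = 0
for _ in range(200_000):
    mine, revealed = play_once(rng)
    if revealed.count('b') == 7:          # condition on 7 blues / 3 reds
        total += 1
        hits += (mine == 'r')
print(hits / total)   # noticeably above 1/2: stick when the host showed more reds
```

This only checks the direction of the rule when you condition on the counts alone; it says nothing about any particular order of reveals.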
 
Last edited:
  • #8
This is pretty interesting. Suppose there are 4 cases. Let B_1 be my chosen case, and let B_1 = 1 denote the event that the case is red and B_1 = 0 the event that the case is blue. Once I choose my case, I will then reveal a case from the remaining three. Call this B_2. Then

P(B_1 = 1 | B_2 = 1) = P(B_1 = 1, B_2 = 1)/P(B_2 = 1)
= P(B_1 = 1, B_2 = 1)/[P(B_1 = 0, B_2 = 1) + P(B_1 = 1, B_2 = 1)]
= (1/2)(1/3)/[(1/2)(2/3) + (1/2)(1/3)]
= 1/3.

So if I reveal a red case, then the probability my own case is red is only 1/3. The host will subsequently reveal a blue case, but that does not affect this probability, so I should switch.

However, things get more complicated with 6 cases. Again, let B_1 be my chosen case. Let B_2 be the first case I reveal from the remaining five. The host then reveals a case of opposite color. Let B_3 be the case I reveal after that. Then

P(B_1 = 1 | B_2 = 0, B_3 = 1)
= P(B_1 = 1, B_2 = 0, B_3 = 1)/[P(B_1 = 0, B_2 = 0, B_3 = 1) + P(B_1 = 1, B_2 = 0, B_3 = 1)].

In order to get B_1 = 1, B_2 = 0, and B_3 = 1, I must first choose a red case (prob. 1/2), then reveal a blue case (prob. 3/5). The host then reveals a red case, leaving one red and two blue in the field. I must then choose a red (prob. 1/3). So

P(B_1 = 1, B_2 = 0, B_3 = 1) = (1/2)(3/5)(1/3) = 1/10.

Similarly,

P(B_1 = 0, B_2 = 0, B_3 = 1) = (1/2)(2/5)(2/3) = 2/15.

So

P(B_1 = 1 | B_2 = 0, B_3 = 1) = (1/10)/(2/15 + 1/10) = 3/7.

On the other hand, similar calculations show that

P(B_1 = 1, B_2 = 1, B_3 = 0) = (1/2)(2/5)(2/3) = 2/15.
P(B_1 = 0, B_2 = 1, B_3 = 0) = (1/2)(3/5)(1/3) = 1/10.
P(B_1 = 1 | B_2 = 1, B_3 = 0) = (2/15)/(1/10 + 2/15) = 4/7.

In summary, in the 6 cases scenario, if the first case I reveal is blue and the second is red, then my case is red with probability 3/7 and I should swap. But if the first case I reveal is red and the second is blue, then my case is red with probability 4/7 and I should keep it. In other words, it is not enough to know how many red cases I revealed. I must also know the order in which they were revealed.
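These hand calculations are easy to check mechanically with exact arithmetic. Here is a small sketch of my own (the function name `p_player_red` and the `'r'`/`'b'` encoding are mine) that walks the same conditional-probability chain for any sequence of player reveals:

```python
from fractions import Fraction

def p_player_red(reveals):
    """P(player's hidden box is red | the player revealed `reveals`),
    where `reveals` is a string of 'r'/'b' and the host answers every
    player reveal with a box of the opposite colour.  The game starts
    with n red and n blue boxes, n = len(reveals) + 1."""
    n = len(reveals) + 1
    weight = {}
    for colour in ('r', 'b'):              # colour of the hidden box
        red = n - (colour == 'r')          # reds left in the field
        blue = n - (colour == 'b')         # blues left in the field
        w = Fraction(1, 2)                 # prior on the hidden box
        for rev in reveals:
            if rev == 'r':
                w *= Fraction(red, red + blue)   # player draws a red
            else:
                w *= Fraction(blue, red + blue)  # player draws a blue
            red -= 1                       # each round removes one red and
            blue -= 1                      # one blue (player's + host's box)
        weight[colour] = w
    return weight['r'] / (weight['r'] + weight['b'])

print(p_player_red('r'))    # 4 boxes, revealed a red  -> 1/3
print(p_player_red('br'))   # 6 boxes, blue then red   -> 3/7
print(p_player_red('rb'))   # 6 boxes, red then blue   -> 4/7
```

It reproduces the 1/3, 3/7 and 4/7 figures above, and the key simplification is the one used in the post: a completed round always removes exactly one red and one blue from the field, whatever the order.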
 
  • #9
I'm carefully analysing this, Jason.

In the six box scenario, I expected to find that the odds were 50/50 whenever you revealed one of each colour.

But you're right. The odds are indeed 3/7 or 4/7, depending on which you reveal first.

I assume this must affect my answer in the 22 box scenario?


Simon
 
  • #10
Simon 6 said:
I assume this must affect my answer in the 22 box scenario?
My guess is that it does. It would be interesting to work out the general solution and find some elegant and compact way of representing it.
 
  • #11
OK everyone, exact answers are subject to analysis. :smile:

However, you are getting information about which of the two remaining boxes is most likely to be red.

I'm now wondering:

Regardless of how many boxes you start with, is there any scenario - with the host revealing the opposite colour of what you revealed - where the probability is 1/2 that your box is red at the end?

We must assume an even number of boxes, with an equal number of reds and blues. This ensures that your chosen box always begins with a 1/2 chance of being red or blue.

With 4 boxes, where you randomly revealed one colour and the host knowingly revealed the other, the odds for the remaining 2 boxes are never 50/50. That I already knew.

Confounding my expectation, Jason's equations proved that with 6 boxes, the odds are also never 50/50. If you revealed one of each colour and the host mirrored you, the odds are either 3/7 or 4/7 that your box is red - depending on what you revealed first. If you revealed two of the same colour and the host mirrored you, the odds are 1/4 or 3/4 that your box is red - depending on what colour you revealed.

Does the same pattern continue with n boxes?

Simon
 
Last edited:
  • #12
Let's say we have [itex]2n[/itex] boxes with [itex]n\geq1[/itex], and play proceeds as follows:

First the player chooses one box and puts it on the table. From there forward, as long as there is more than one box left, the player chooses a box and it is revealed, and then the host chooses and reveals a box of the other colour. When there is one box left, the player may choose to switch.

Then the probability that the ith box revealed by the player matches the player's hidden box is:
[tex]\frac{n-i}{2(n-i)+1}[/tex]
and the probability that the ith revealed box does not match is:
[tex]\frac{n-i+1}{2(n-i)+1}[/tex]

Since the denominators match for the initial-red and initial-blue scenarios, the relative probabilities are determined by the numerators.

Now, the original post gives an example with 3 of the 10 player-revealed boxes being one colour. We know that order matters, and there are 10 choose 3 = 120 possible scenarios, which is more than I want to go through by hand - so I'll just do the extreme ones here:

In the 'most blue' scenario, all 7 of the blue boxes revealed by the player come first, so the relative probabilities will be:
[tex]10\times9\times8\times7\times6\times5\times4\times4\times 3\times 2:11 \times 10 \times 9 \times 8 \times 7 \times 6 \times 5 \times 3 \times 2 \times 1[/tex]
Using some cancellation this works out to
[tex]16:11[/tex]
so the chance that the player's box is blue would be [itex]\frac{16}{27}[/itex]

If, on the other hand, the last 7 boxes are blue, it's
[tex]11 \times 10 \times 9 \times 7 \times 6 \times 5 \times 4 \times 3 \times 2 \times 1 : 10 \times 9 \times 8 \times 8 \times 7 \times 6 \times 5 \times 4 \times 3 \times 2[/tex]
[tex]11 : 64[/tex]
so the chance that the player's box is blue is [itex]\frac{11}{75}[/itex]

Since we're not given significant information that the player would have, the natural reaction is to suggest that the initial post should be more specific.

It shouldn't be too hard to work out that, given the actual sequence of reveals, the only scenario where the odds are exactly 1:1 is [itex]n=1[/itex].

...
Now, because the initial post does not actually give us the ordering, it's also possible to generate probabilities for all 120 scenarios involving those reveals, and then make a determination of whether the player is more likely to have a red or blue box based on that. In this type of scenario, Simon 6's notion of the player's box being the complement of the colour the player most frequently revealed will hold, with a 50/50 split for even red/blue reveal counts.
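Both the extreme-case ratios and the aggregated claim can be brute-forced over all 120 orderings with exact rational arithmetic. A self-contained sketch of my own (function and variable names are mine):

```python
from fractions import Fraction
from itertools import combinations

def path_weights(reveals):
    """Joint probability of each hidden-box colour with the given player
    reveal sequence ('r'/'b'), the host always answering in the opposite
    colour; n = len(reveals) + 1 boxes of each colour."""
    n = len(reveals) + 1
    out = {}
    for colour in ('r', 'b'):
        red, blue = n - (colour == 'r'), n - (colour == 'b')
        p = Fraction(1, 2)
        for rev in reveals:
            p *= Fraction(red if rev == 'r' else blue, red + blue)
            red, blue = red - 1, blue - 1   # player's box + host's box leave
        out[colour] = p
    return out

def p_red(reveals):
    w = path_weights(reveals)
    return w['r'] / (w['r'] + w['b'])

# The two extremes from the 22-box example (player reveals 7 blues, 3 reds):
print(1 - p_red('bbbbbbbrrr'))   # all blues first -> P(player's box blue) = 16/27
print(1 - p_red('rrrbbbbbbb'))   # all blues last  -> P(player's box blue) = 11/75

# Aggregate over all 120 orderings, weighted by how likely each path is:
wr = wb = Fraction(0)
for red_rounds in combinations(range(10), 3):
    seq = ''.join('r' if i in red_rounds else 'b' for i in range(10))
    w = path_weights(seq)
    wr += w['r']
    wb += w['b']
print(wr / (wr + wb))   # comfortably above 1/2, so the majority rule holds
```

The first two printed values match the 16/27 and 11/75 worked out above, and the aggregate confirms the count-only version of the rule.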
 

