What should you do in the Two-box Monty Hall problem?

  • #1
peteb
In the thread for my post of "The Boys Puzzle", D H, a PF Mentor, wisely mentions the Monty Hall problem as an example of the type of puzzle that people have difficulty dealing with. That is indeed the case. For example, here is a slight modification of the Monty Hall puzzle that has proven to be quite counterintuitive (this version of the problem has generated quite a bit of discussion on the net):

Remember the infamous Three-box, or Monty Hall, problem? This time there are just two boxes. In one of them there is a prize of some value and in the other another prize of twice the value. Pick one of the boxes and open it.
You are now offered the chance to switch your choice to the other box. What should you do?


Let's give that a spin here and see how people solve it. So post your reasoning for whatever you feel is the solution...

Pete B
 
  • #2
If the amounts are x and 2x, then the average winnings for a random choice must be 1.5x.

If you had chosen x, switching will gain you another x. If you had chosen 2x, switching will lose you x. The actual amounts you stand to gain or lose are equal, and the average is 1.5x.

What creates confusion is that when you get the opportunity to switch, it seems that switching gives you a 50% chance of doubling your winnings and a 50% chance of halving them; the average of 2 and 1/2 is 1.25, so it appears that on average you would gain by switching. However, this fails to take into account that the doubling and the halving start from amounts differing by a factor of 2, so it is misleading to compare the average gains and losses as if they were on the same scale.
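One way to sanity-check this argument is a quick simulation of the game (a minimal sketch, with the smaller prize fixed at x = 1; the function name and trial count are illustrative choices):

```python
import random

def play(n_trials=100_000, seed=0):
    """Simulate the two-box game: prizes x and 2x, random first pick.
    Returns the average winnings for the stay and switch strategies."""
    rng = random.Random(seed)
    x = 1.0  # smaller prize; the other box holds 2x
    stay = switch = 0.0
    for _ in range(n_trials):
        boxes = [x, 2 * x]
        pick = rng.randrange(2)       # choose one of the two boxes at random
        stay += boxes[pick]           # strategy 1: keep the first box
        switch += boxes[1 - pick]     # strategy 2: take the other box
    return stay / n_trials, switch / n_trials

avg_stay, avg_switch = play()
# Both averages come out near 1.5x, as the argument above predicts.
```

Note that within every single trial the two strategies' winnings sum to exactly 3x, which is the symmetry argument in miniature.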
 
  • #3
The Monty Hall problem only works with three or more boxes.

I would just switch if I'm unhappy with the first choice.
 
  • #4
The problem is symmetric, so there should be no advantage to switching.

The only new information you gain by opening the box is a better idea of what the prize values may be. If you have some knowledge of the distribution of prize values, you can design a strategy that gives you a greater than 50% chance of getting the larger prize; without knowing anything about the distribution, it seems you should only be able to get the larger prize 50% of the time.
 
  • #5
Say you open the box and $100 is in it. You have two options: switch or don't switch.

If you switch, there are two possible outcomes: you find $200 or you find $50. These are equally probable, so the expected value of choosing to switch is $125.

If you don't switch, the only possible outcome is that you end up with $100.

Switching has the greater expected value, so you should switch. Correct? This is the result Jonathan Scott came up with but then rejected, but I don't see why it should be rejected.

Individuals who are risk-averse, with utility-of-wealth functions that are bounded above, will not trade once the observed prize passes a certain threshold, however. Say you see $1 billion in the box and wealth beyond $1 billion is completely worthless to you. Then you won't trade, because the expected utility of trading is less than that of not trading: the potential wealth gain brings no utility gain, while the potential wealth loss does bring a utility loss.
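As a hedged illustration of this utility argument (the $1 billion cap, the linear-then-flat utility function, and the naive 50/50 double-or-halve model are assumptions for this sketch, not part of the original problem):

```python
CAP = 1_000_000_000  # wealth beyond this brings no extra utility (assumed)

def utility(wealth):
    """Bounded-above utility: linear up to CAP, flat afterwards."""
    return min(wealth, CAP)

observed = CAP  # you open the box and see exactly $1 billion
# Naive 50/50 model of switching: double or halve the observed amount.
eu_switch = 0.5 * utility(2 * observed) + 0.5 * utility(observed / 2)
eu_stay = utility(observed)
# eu_switch = 0.5*CAP + 0.25*CAP = 0.75*CAP < CAP = eu_stay, so don't trade,
# even though the naive expected *wealth* of switching (1.25*CAP) is higher.
```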
 
  • #6
Welcome back LYN! Are you going to be around more?
 
  • #7
I can be.
 
  • #8
loseyourname said:
Say you open the box and $100 is in it. You have two options: switch or don't switch.

If you switch, there are two possible outcomes: you find $200 or you find $50. These are equally probable, so the expected value of choosing to switch is $125.

If you don't switch, the only possible outcome is that you end up with $100.

Switching has the greater expected value, so you should switch. Correct? This is the result Jonathan Scott came up with but then rejected, but I don't see why it should be rejected.

I agree that this seems very plausible and I had some difficulty finding the flaw in it.

If the scenario involved first choosing an amount x then optionally taking a chance on randomly doubling it or halving it with equal probability, then of course you should take it, as that will give you an expected value of 1.25 times the initial value.

However, the amount x was decided in advance, so if you've chosen the box with x then switching would give you 2x, but if you've chosen the box with 2x, switching will give you x, so you actually have an equal probability of gaining or losing the same amount x.

That is, if you have in fact chosen the higher valued box, your first choice is twice as much as if you've chosen the lower valued box, so although the relative amounts you stand to win or lose appear to be unequal relative to your first choice, they are actually equal in value.
 
  • #9
loseyourname said:
If you switch, there are two possible outcomes: you find $200 or you find $50. These are equally probable, so the expected value of choosing to switch is $125.

Bzzzt.

Before you start the experiment, you must make a prediction about the total amount of money that could have been put into the boxes.

It could be something as trivial as "any amount from $1 to $1,000,000 with equal probabilities", but it must exist.

You will only win consistently if you make the right decision, expectation-wise, every time you're asked to switch, AND your prediction function is somewhat correct.

Once you have that prediction function, for any amount X you see in the first box, you can calculate probabilities of finding X/2 and 2X in the other box.

You are talking about the switch outcomes 2X and X/2 as "equally probable", but the problem is that this is not generated by any physically possible prediction function, because any physical prediction function must be nonzero only on a bounded interval and must integrate to 1 over its domain.
 
  • #10
Jonathan Scott said:
If the scenario involved first choosing an amount x then optionally taking a chance on randomly doubling it or halving it with equal probability, then of course you should take it, as that will give you an expected value of 1.25 times the initial value.

However, the amount x was decided in advance, so if you've chosen the box with x then switching would give you 2x, but if you've chosen the box with 2x, switching will give you x, so you actually have an equal probability of gaining or losing the same amount x.

That is, if you have in fact chosen the higher valued box, your first choice is twice as much as if you've chosen the lower valued box, so although the relative amounts you stand to win or lose appear to be unequal relative to your first choice, they are actually equal in value.

If you open the box and see $200, then the value of x is smaller in the case where you hold the 2x amount than in the case where you picked the x amount. The logic doesn't work here.

The real problem with the game is that everyone implicitly assumes the amount of money is chosen uniformly between 0 and infinity, which would make the expected value of the winnings infinite. But whenever you open the box you can only win a finite amount, so you're essentially winning a below-average amount every time, which is why it seems that you should always try to open the other box.

Remove the doubling factor and just play the stripped down game: Two boxes each have a random amount of money (up to but not including an infinite amount). You open up a box. You're then allowed to switch. Well, the other box with probability 1 has more money than the box you just opened up, so of course you switch. But on the other hand you have a fifty/fifty chance of having more money in the first box than in the second box.
 
  • #11
Office_Shredder said:
If you open the box and see $200, if you had the 2x amount, then x is smaller than if you had picked the x amount. The logic doesn't work here.

I disagree. The point is that the amounts x and 2x are fixed in advance and the probabilities relate to which one you have picked, not the possible values of x, so the chance of increasing or decreasing is linked to the relative value of the one you have already picked, and this has to be taken into account when calculating whether you expect to gain or lose by switching.

To avoid all worries about infinities, don't open the first box, as the original problem is not affected by knowing the first amount!

There's nothing special about doubling or halving. If you choose one of two unknown different amounts, and have the option of switching to the other (without even looking at the first), the chance of the second one being larger is obviously 50%. If you call the values x and y, then it would seem that switching gives an expected value of (x/y + y/x)/2 times the original choice, which is always greater than 1 when x and y are different positive amounts, specifically 1 + (x-y)^2/(2xy). However, this is a meaningless value, as it averages percentages of quantities which are known to be different. The correct expected return after switching can be calculated by multiplying these ratios by the corresponding initial amounts, giving (y(x/y) + x(y/x))/2, which is (x+y)/2, the same as before switching.
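The algebra here is easy to check numerically (a small sketch with illustrative amounts x = 100 and y = 200):

```python
def switch_ratios(x, y):
    """Compare the misleading average of the switch *ratios* with the
    true expected values before and after switching."""
    avg_ratio = (x / y + y / x) / 2            # equals 1 + (x - y)**2 / (2*x*y)
    before = (x + y) / 2                       # expected value of the first pick
    after = (y * (x / y) + x * (y / x)) / 2    # ratios weighted by actual amounts
    return avg_ratio, before, after

avg_ratio, before, after = switch_ratios(100.0, 200.0)
# avg_ratio is 1.25 (> 1), yet before == after == 150: switching changes nothing.
```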
 
  • #12
Suppose there are 3N dollars in total in the two boxes. There is a 50% chance that you picked the smaller amount, and a 50% chance that you picked the larger. We can calculate the possible end result in the two cases individually:

Picked the smaller: You have N. Switching gives you 2N.
Picked the larger: You have 2N. Switching gives you N.

So the expected value of switching is (2N + N)/2 = 3N/2, which is the same as the expected value of staying: (N + 2N)/2 = 3N/2. So it doesn't matter.

On the other hand, suppose you open a box and find x dollars. If you have the smaller amount, switching would give you 2x. If you have the larger, switching would give you x/2. The expected value of switching is (2x + x/2)/2 = 1.25x, so you'd switch.

This is essentially the dilemma as I see it. I am fairly sure the second explanation is correct. You should switch.

Suppose the proportion was not 2:1, but k:1 for a positive real number k. Would you switch?
 
  • #13
Jarle said:
...
On the other hand, suppose you open a box and find x dollars. If you have the smaller amount, switching would give you 2x. If you have the larger, switching would give you x/2. The expected value of switching is (2x + x/2)/2 = 1.25x, so you'd switch.

That's just the same fallacy again. By symmetry, switching cannot help.

In any given situation, you have either chosen the higher or lower, and this is totally linked to whether switching will make it lower or higher. If you've chosen the higher, then even though the PERCENTAGE loss for switching relative to the first amount is smaller, the AMOUNT you will lose by switching is the same.

By switching, you stand either to gain 100% of the smaller amount or lose 50% of the larger amount. Although these percentages average to a gain of 25%, that is not a measure of whether you stand to gain or lose on average, because the amounts are different (even though you only see one of them and do not know which one it is).
 
  • #14
You're right. You would never double the larger prize, and never halve the smaller. So switching is not a 50% chance of doubling or halving your picked box; the outcome was determined beforehand.

If a large number of people played the game and always switched, it would be equivalent to their having picked the other box and stayed. The error lies in the fact that we are tricked into believing that the prize in a box is determined to be the smaller or larger one after the box is chosen. We think that we choose either x or 2x, and then a coin flip determines whether the prize will be doubled or halved, but that's not the mechanism. 2x will always turn into x, and x into 2x; 2x will never be 4x, and x will never be x/2.
 
  • #15
Jarle said:
On the other hand, suppose you open a box and find x dollars. If you have the smaller amount, switching would give you 2x. If you have the larger, switching would give you x/2. The expected value of switching is (2x + x/2)/2 = 1.25x, so you'd switch.

You are assuming that the probability of doubling is 50% and the probability of halving is 50%, and that you gain no information by seeing the number in the first box.

But, in any realistic implementation of the experiment, that is not true. The actual probabilities depend on the state of mind of the designer. You need a correct assessment of those probabilities if you want to be winning in the long run.

Suppose you repeat the experiment many times, and you see $1,000 in the first box on 100 different occasions. How many times out of 100 will you see $2,000 in the other box? If it's 50, you should switch.

Suppose you repeat the experiment many times, and you see $2,000 in the first box on 100 different occasions. How many times out of 100 will you see $4,000 in the other box? If it's 50, you should switch.

As you repeat ad infinitum, you'll realize one of two things. Either the experimenter has an infinite money supply, exceeding the GDP of the country by an infinite amount, in which case, if you're so unlucky that you see measly $1,000 in the first box, you might as well switch. Or his money is finite, and he can't put more than, say, $2,000 into any box, in which case, seeing more than $1,000 in the first box assures you that you have the larger box and you should stay.
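The bounded-supply point can be made concrete with a small exact enumeration. In this sketch the uniform prior on the smaller prize (between $1 and $1,000) is an assumption introduced for illustration; "stay whenever you see more than $1,000" then beats always switching:

```python
def exact_averages(max_x=1000):
    """Enumerate every equally likely (smaller prize x, first pick) case
    and compute exact average winnings for three strategies."""
    always_stay = always_switch = threshold = 0.0
    count = 0
    for x in range(1, max_x + 1):              # smaller prize, uniform prior (assumed)
        for first, other in ((x, 2 * x), (2 * x, x)):
            always_stay += first
            always_switch += other
            # Stay if the observed amount exceeds max_x: it must be the larger box.
            threshold += first if first > max_x else other
            count += 1
    return always_stay / count, always_switch / count, threshold / count

stay, switch, thresh = exact_averages()
# stay == switch == 750.75 (symmetry), but thresh == 938.375: the threshold
# strategy does strictly better, because seeing the amount carries information.
```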
 
  • #16
Ok, let's discretize the problem. I'm going to stick with the person having infinite money since if he doesn't the infinite expectation issue goes away. Let's suppose that with probability [tex]\frac{1}{2^n}[/tex] the proprietor puts [tex]10^n[/tex] dollars in one box, and [tex]10^{n+1}[/tex] dollars in the other box. (this is not exactly the same problem as in the OP but it makes the numbers work out easier).

You open a box and find 1,000 dollars. We know that with original probability 1/8 the boxes had 1,000 and 10,000 dollars, and with probability 1/4 they had 100 and 1,000 dollars. Since in each of those cases you were as likely to pick the box with 1,000 dollars, our conditional probabilities say that there is a 1/3 chance of the other box having 10,000 dollars, and a 2/3 chance of the box having 100 dollars. Expected value of the other box? 3400 dollars.

The fact that it was 1,000 is not really relevant though. If we find a box with [tex]10^n[/tex] dollars, the other box has [tex]10^{n-1}[/tex] dollars with probability 2/3 and [tex]10^{n+1}[/tex] dollars with probability 1/3. Expected value of the other box? [tex]3.4*10^n[/tex]. So you should always switch.
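These conditional probabilities can be checked directly (a sketch assuming the prior runs over pairs k >= 1, so the formula applies for observed amounts [tex]10^n[/tex] with n >= 2):

```python
def conditional_on_seeing(n):
    """After opening a box with 10**n dollars (n >= 2), compute the probability
    that the other box is the larger one, under the prior P(pair k) = 1/2**k
    where pair k holds 10**k and 10**(k+1) dollars (k >= 1, assumed)."""
    # Seeing 10**n is consistent with pair k = n (we picked its smaller box)
    # or pair k = n - 1 (we picked its larger box); each pick has weight 1/2.
    w_smaller = (0.5 ** n) * 0.5
    w_larger = (0.5 ** (n - 1)) * 0.5
    p_other_larger = w_smaller / (w_smaller + w_larger)   # = 1/3 for every n
    expected_other = (p_other_larger * 10 ** (n + 1)
                      + (1 - p_other_larger) * 10 ** (n - 1))
    return p_other_larger, expected_other

p, ev = conditional_on_seeing(3)   # you open a box and find $1,000
# p is 1/3 and ev is 3400, matching the figures in the post.
```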
 
  • #17
All you've done here is add more complicated numbers to the original fallacy. What you have still missed is the linkage between whether switching will increase or decrease the result and whether the initial amount is the larger or smaller one.
 
  • #18
Jonathan Scott said:
All you've done here is add more complicated numbers to the original fallacy. What you have still missed is the linkage between whether switching will increase or decrease the result and whether the initial amount is the larger or smaller one.

Do you think that seeing the initial amount can give you any information to determine whether it's the larger or the smaller amount?
 

1. What is the Two-box Monty Hall problem?

The Two-box problem is a variation on the classic Monty Hall puzzle, closely related to the two-envelope paradox. There are just two boxes: one contains a prize of some value and the other a prize of twice that value. You pick a box and open it, and are then offered the chance to switch to the other box. The question is whether switching is to your advantage.

2. What should I do in the Two-box Monty Hall problem?

It does not matter. The situation is symmetric: without any knowledge of how the prize amounts were chosen, the expected value of staying equals the expected value of switching. If you do know something about the distribution of prize amounts, the amount you observe in the first box carries information, and a strategy that switches only on small observed amounts can win the larger prize more than 50% of the time.

3. Why does switching seem like the better option in the Two-box Monty Hall problem?

The tempting argument runs: the other box holds either double or half the observed amount with equal probability, so switching has an expected value of 1.25 times what you hold. The flaw is that the amounts x and 2x were fixed in advance. If you hold x, switching gains you x; if you hold 2x, switching loses you x. The gains and losses cancel, and the assumed "equal probability of doubling or halving" cannot arise from any proper prior distribution over prize amounts.

4. Is the Two-box Monty Hall problem a real-life situation?

It is a simplified model of real-life situations involving probability and decision-making. It highlights conditional probability: new information (here, the observed amount, combined with a prior over prize values) can change the probability of an outcome. While the game itself is artificial, the principles behind it apply to real decisions under uncertainty.

5. Are there any variations of the Two-box Monty Hall problem?

Yes. The prize ratio can be k:1 rather than 2:1, the distribution of prize amounts can be bounded or unbounded, and the player may or may not open the first box before deciding. The classic three-door Monty Hall problem is the best-known relative: there the host's act of opening a losing door conveys information, so switching wins 2/3 of the time, whereas in the two-box problem no comparable information is revealed.
