
Thinking Inside The Box

  1. Jul 30, 2010 #1
    In the thread for my post of "The Boys Puzzle", D H, a PF Mentor, wisely mentions the Monty Hall problem as an example of the type of puzzle that people have difficulty dealing with. That is indeed the case. For example, here is a slight modification of that Monty Hall puzzle that has been shown to be quite counterintuitive (this version of the problem has generated quite a bit of discussion on the net):

    Remember the infamous Three-box, or Monty Hall, problem? This time there are just two boxes. In one of them there is a prize of some value and in the other another prize of twice the value. Pick one of the boxes and open it.
    You are now offered the chance to switch your choice to the other box. What should you do?


    Let's give that a spin here and see how people solve it. So post your reasons for arriving at whatever you feel is the solution....

    Pete B
     
  3. Jul 30, 2010 #2

    Jonathan Scott

    Gold Member

    If the amounts are x and 2x, then the average winnings for a random choice must be 1.5x.

    If you had chosen x, switching will gain you another x. If you had chosen 2x, switching will lose you x. The actual amount you stand to gain or lose is equal, and the average is 1.5x.

    What creates confusion is that, when you are offered the switch, it seems you have a 50% chance of doubling your winnings and a 50% chance of halving them; since the average of 2 and 1/2 is 1.25, it appears that on average you would gain by switching. However, this fails to take into account the fact that the doubling and the halving start from amounts that differ by a factor of 2, so it is misleading to average the gains and losses as if they applied to the same amount.
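
    A minimal Python simulation of this point (the prize x = 100 and the trial count are arbitrary choices): the two prizes are fixed in advance, and always switching averages the same 1.5x as always staying.

    [code]
    import random

    def play(trials=100_000, x=100):
        """Two-box game with prizes x and 2x fixed in advance."""
        stay_total = switch_total = 0
        for _ in range(trials):
            first, other = random.sample([x, 2 * x], 2)  # pick one box at random
            stay_total += first      # keep the first box
            switch_total += other    # or always switch to the other box
        return stay_total / trials, switch_total / trials

    print(play())  # both averages come out near 1.5 * x = 150
    [/code]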
     
  4. Jul 30, 2010 #3
    Monty Hall only works with three or more boxes.

    I would just switch if I'm unhappy with the first choice.
     
  5. Jul 31, 2010 #4

    Ygggdrasil

    Science Advisor

    The problem is symmetric, so there should be no advantage to switching.

    The only new information you gain by opening the box is a better idea of what the prize values may be. If you have some knowledge of the distribution of prize values, you can design a strategy that will give you a greater than 50% chance of getting the larger prize, but without knowing anything about the distribution, it seems you should only be able to get the larger prize 50% of the time.
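
    A Python sketch of the first kind of strategy, under an assumed (made-up) prize distribution where the smaller prize is uniform on [1, 100]: keep the opened box when its value is too large to be the smaller prize, otherwise switch. This wins well above 50% of the time.

    [code]
    import random

    def threshold_strategy(trials=200_000):
        """Smaller prize x drawn uniformly from [1, 100] (an assumed prior),
        so the larger prize 2x lies in [2, 200]. Switch unless the opened
        box holds more than 100, in which case it must be the larger prize."""
        wins = 0
        for _ in range(trials):
            x = random.uniform(1, 100)
            first, other = random.sample([x, 2 * x], 2)
            chosen = first if first > 100 else other
            wins += chosen == 2 * x
        return wins / trials

    print(threshold_strategy())  # roughly 0.75, well above 0.5
    [/code]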
     
  6. Aug 14, 2010 #5

    loseyourname

    Staff Emeritus
    Gold Member

    Say you open the box and $100 is in it. You have two options: switch or don't switch.

    If you switch, there are two possible outcomes: you find $200 or you find $50. These are equally probable, so the expected value of choosing to switch is $125.

    If you don't switch, the only possible outcome is that you end up with $100.

    Switching has the greater expected value, so you should switch. Correct? This is the result Jonathan Scott came up with but then rejected, but I don't see why it should be rejected.

    However, individuals who are risk-averse, with utility-of-wealth functions that are bounded above, will not trade once the observed prize passes a certain threshold. Say you observe $1 billion in the box and wealth beyond $1 billion is completely worthless to you. Then you won't trade, because the expected utility of trading is less than that of keeping the box: the potential wealth gain brings no utility gain, while the potential wealth loss still brings a utility loss.
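
    A toy Python sketch of that second point, using a made-up utility function that is linear up to a cap and flat above it, together with the naive 50/50 view of the other box from the first part of the post:

    [code]
    CAP = 1_000_000_000  # wealth beyond this is assumed worthless

    def utility(wealth):
        """Toy bounded-above utility: linear up to CAP, flat beyond it."""
        return min(wealth, CAP)

    def switch_utility(observed):
        """Expected utility of switching under the naive 50/50 double-or-halve view."""
        return 0.5 * utility(2 * observed) + 0.5 * utility(observed / 2)

    for observed in (100, CAP):
        print(observed, "stay:", utility(observed), "switch:", switch_utility(observed))
    # At $100, switching (125.0) beats staying (100);
    # at the $1 billion cap, switching (750 million) loses to staying (1 billion).
    [/code]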
     
  7. Aug 14, 2010 #6

    Evo


    Staff: Mentor

    Welcome back LYN!!! Are you going to be around more?
     
  8. Aug 14, 2010 #7

    loseyourname

    Staff Emeritus
    Gold Member

    I can be.
     
  9. Aug 14, 2010 #8

    Jonathan Scott

    Gold Member

    I agree that this seems very plausible and I had some difficulty finding the flaw in it.

    If the scenario involved first choosing an amount x then optionally taking a chance on randomly doubling it or halving it with equal probability, then of course you should take it, as that will give you an expected value of 1.25 times the initial value.

    However, the amount x was decided in advance, so if you've chosen the box with x then switching would give you 2x, but if you've chosen the box with 2x, switching will give you x, so you actually have an equal probability of gaining or losing the same amount x.

    That is, if you have in fact chosen the higher-valued box, your first choice is worth twice as much as it would be had you chosen the lower-valued box. So although the amounts you stand to win or lose look unequal relative to your first choice, they are actually equal in value.
     
  10. Aug 14, 2010 #9
    Bzzzt.

    Before you start an experiment, you must make a prediction about the total amount of money that could've been put into boxes.

    It could be something as trivial as "any amount from $1 to $1,000,000 with equal probabilities", but it must exist.

    You will only win consistently if you make the right decision, expectation-wise, every time you're asked to switch, AND your prediction function is at least roughly correct.

    Once you have that prediction function, for any amount X you see in the first box, you can calculate probabilities of finding X/2 and 2X in the other box.

    You are treating the switch outcomes 2X and X/2 as "equally probable" for every X, but the problem is that no physically possible prediction function generates that: a real prediction function must integrate to 1 over its domain, so it cannot give every scale of prize equal weight.
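
    A small Python sketch with a concrete (made-up) prediction function, where the smaller prize is a whole-dollar amount drawn uniformly from 1 to 1,000: once you condition on what you see in the first box, the chance that the other box holds double is no longer 50% across the board.

    [code]
    from fractions import Fraction

    MAX_SMALL = 1_000  # assumed prior: smaller prize uniform on 1..1000 dollars

    def p_other_is_double(observed):
        """Posterior probability that the unopened box holds 2*observed.
        The uniform prior weight (1/1000) cancels between the two cases,
        leaving only the 1/2 chance of having opened either box."""
        w_small = Fraction(1, 2) if observed <= MAX_SMALL else Fraction(0)
        w_large = Fraction(1, 2) if observed % 2 == 0 and observed // 2 <= MAX_SMALL else Fraction(0)
        total = w_small + w_large
        return w_small / total if total else None

    print(p_other_is_double(999))    # 1   -- an odd amount must be the smaller prize
    print(p_other_is_double(800))    # 1/2 -- genuinely ambiguous
    print(p_other_is_double(1600))   # 0   -- too large to be the smaller prize
    [/code]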
     
  11. Aug 14, 2010 #10

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    If you open the box and see $200, then the x in the case where you picked the 2x amount is smaller than the x in the case where you picked the x amount, so the two cases are not gains and losses of the same x. The logic doesn't work here.

    The real problem with the game is that everyone assumes the amount of money is chosen uniformly between 0 and infinity. Thus, the expected value of the winnings is infinite. But whenever you open up the box you can only win a finite amount. You're essentially winning a below average amount every time, which is why it seems that you should always try to open up the other box.

    Remove the doubling factor and just play the stripped down game: Two boxes each have a random amount of money (up to but not including an infinite amount). You open up a box. You're then allowed to switch. Well, the other box with probability 1 has more money than the box you just opened up, so of course you switch. But on the other hand you have a fifty/fifty chance of having more money in the first box than in the second box.
     
  12. Aug 15, 2010 #11

    Jonathan Scott

    Gold Member

    I disagree. The point is that the amounts x and 2x are fixed in advance and the probabilities relate to which one you have picked, not the possible values of x, so the chance of increasing or decreasing is linked to the relative value of the one you have already picked, and this has to be taken into account when calculating whether you expect to gain or lose by switching.

    To avoid all worries about infinities, don't open the first box, as the original problem is not affected by knowing the first amount!

    There's nothing special about doubling or halving. If you choose one of two unknown different amounts, and have the option of switching to the other (without even looking at the first), the chance of the second one being larger is obviously 50%. If you call the values x and y, then it would seem that switching would give an expected value of (x/y + y/x)/2 times the original choice, which is always greater than 1 when x and y are different positive amounts, specifically 1 + (x-y)^2/(2xy). However, this is a meaningless value, as it is averaging percentages of quantities which are known to be different. The correct expected return after switching can be calculated by multiplying these ratios by the corresponding initial amounts, giving (y(x/y) + x(y/x))/2 which is (x+y)/2, the same as before switching.
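
    A quick symbolic check of that algebra in Python (using sympy, if available):

    [code]
    import sympy as sp

    x, y = sp.symbols('x y', positive=True)

    # The misleading average of ratios exceeds 1 whenever x != y.
    print(sp.factor((x/y + y/x) / 2 - 1))        # (x - y)**2/(2*x*y), never negative

    # Weighting each ratio by the amount it applies to gives the true expectation.
    print(sp.simplify((y*(x/y) + x*(y/x)) / 2))  # x/2 + y/2, i.e. (x + y)/2
    [/code]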
     
  13. Aug 15, 2010 #12

    disregardthat

    Science Advisor

    Suppose there are 3N dollars in total in the two boxes. There is a 50% chance that you pick the smallest amount, and a 50% chance that you pick the largest. We can calculate the possible end result in the two cases individually:

    Picked the smallest: You have N. Switching gives you 2N.
    Picked the largest: You have 2N. Switching gives you N.

    So the expected value of switching is (2N+N)/2 = 3/2 N, but that is the same as the expected value of staying: (N+2N)/2 = 3/2 N. So it doesn't matter.

    On the other hand, suppose you open a box and find x dollars. If you have the smallest amount, switching would give you 2x. If you have the largest, switching would give you x/2. The expected value of switching is (2x+x/2)/2 = 1.25x, so you'd switch.

    This is essentially the dilemma as I see it. I am fairly sure the second explanation is correct. You should switch.

    Suppose the proportion was not 2:1, but k:1 for a positive real number k. Would you switch?
     
    Last edited: Aug 15, 2010
  14. Aug 15, 2010 #13

    Jonathan Scott

    Gold Member

    That's just the same fallacy again. By symmetry, switching cannot help.

    In any given situation, you have either chosen the higher or lower, and this is totally linked to whether switching will make it lower or higher. If you've chosen the higher, then even though the PERCENTAGE loss for switching relative to the first amount is smaller, the AMOUNT you will lose by switching is the same.

    By switching, you stand either to gain 100% of the smaller amount or lose 50% of the larger amount. Although these percentages average to a gain of 25%, that is not a measure of whether you stand to gain or lose on average, because the amounts are different (even though you only see one of them and do not know which one it is).
     
  15. Aug 15, 2010 #14

    disregardthat

    Science Advisor

    You're right. You would never double the larger prize, and never halve the lesser. So switching is not a 50% chance of doubling or halving your picked box; which of the two happens is determined beforehand.

    If a large number of people played the game and always switched, it would be equivalent to their having picked the other box and stayed. The error lies in the fact that we are tricked into believing that the prize in a box is determined to be the lesser or the larger one after the box is chosen. We think that we choose either x or 2x, and then a coin-flip determines whether the prize will be doubled or halved, but that's not the mechanism. 2x will always turn into x, and x into 2x. 2x will never be 4x, and x will never be x/2.
     
    Last edited: Aug 15, 2010
  16. Aug 15, 2010 #15
    You are assuming that the probability of doubling is 50% and the probability of halving is 50%, and that you gain no information by seeing the number in the first box.

    But, in any realistic implementation of the experiment, that is not true. The actual probabilities depend on the state of mind of the designer. You need a correct assessment of those probabilities if you want to be winning in the long run.

    Suppose you repeat the experiment many times, and you see $1,000 in the first box on 100 different occasions. How many times out of 100 will you see $2,000 in the other box? If it's 50, you should switch.

    Suppose you repeat the experiment many times, and you see $2,000 in the first box on 100 different occasions. How many times out of 100 will you see $4,000 in the other box? If it's 50, you should switch.

    As you repeat ad infinitum, you'll realize one of two things. Either the experimenter has an infinite money supply, exceeding the GDP of the country by an infinite amount, in which case, if you're so unlucky as to see a measly $1,000 in the first box, you might as well switch. Or his money is finite and he can't put more than, say, $2,000 into any box, in which case seeing more than $1,000 in the first box assures you that you have the larger box and you should stay.
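
    A Python sketch of that repeated experiment with an assumed finite bankroll (smaller prize $500 or $1,000, so no box ever holds more than $2,000): tallying how often the other box doubles the one you opened shows the 50% figure only holds in the middle of the range.

    [code]
    import random
    from collections import defaultdict

    def doubling_frequency(trials=400_000):
        """Smaller prize is $500 or $1,000 with equal probability (an assumed
        prior), so no box exceeds $2,000. For each observed amount, tally how
        often the unopened box holds double that amount."""
        doubled, seen = defaultdict(int), defaultdict(int)
        for _ in range(trials):
            small = random.choice([500, 1_000])
            first, other = random.sample([small, 2 * small], 2)
            seen[first] += 1
            doubled[first] += other == 2 * first
        return {amount: round(doubled[amount] / seen[amount], 2) for amount in sorted(seen)}

    print(doubling_frequency())
    # roughly {500: 1.0, 1000: 0.5, 2000: 0.0}: above $1,000 you know you hold the larger box
    [/code]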
     
  17. Aug 19, 2010 #16

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    Ok, let's discretize the problem. I'm going to stick with the person having infinite money since, if he doesn't, the infinite-expectation issue goes away. Let's suppose that with probability [tex]\frac{1}{2^n}[/tex] the proprietor puts [tex]10^n[/tex] dollars in one box, and [tex]10^{n+1}[/tex] dollars in the other box. (This is not exactly the same problem as in the OP, but it makes the numbers work out more easily.)

    You open a box and find 1,000 dollars. We know that with original probability 1/8 the boxes had 1,000 and 10,000 dollars, and with probability 1/4 they had 100 and 1,000 dollars. Since in each of those cases you were as likely to pick the box with 1,000 dollars, our conditional probabilities say that there is a 1/3 chance of the other box having 10,000 dollars, and a 2/3 chance of the box having 100 dollars. Expected value of the other box? 3400 dollars.

    The fact that it was 1,000 is not really relevant though. If we find a box with [tex]10^n[/tex] dollars, the other box has [tex]10^{n-1}[/tex] dollars with probability 2/3 and [tex]10^{n+1}[/tex] dollars with probability 1/3. Expected value of the other box? [tex]3.4*10^n[/tex]. So you should always switch.
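
    A short Python check of those conditional probabilities and the 3.4 factor, using exact fractions (valid for any opened box of 10^k dollars with k >= 2, so that both neighbouring pairs are possible):

    [code]
    from fractions import Fraction

    def other_box_expectation(k):
        """Expected value of the unopened box given the opened box holds 10**k
        dollars, under the prior P(pair n) = 1/2**n with amounts 10**n and 10**(n+1)."""
        # Seeing 10**k is consistent with pair k (we hold the smaller prize)
        # or pair k-1 (we hold the larger one); either box is opened with prob 1/2.
        w_smaller = Fraction(1, 2**k) * Fraction(1, 2)
        w_larger = Fraction(1, 2**(k - 1)) * Fraction(1, 2)
        p_up = w_smaller / (w_smaller + w_larger)   # works out to 1/3
        return p_up * 10**(k + 1) + (1 - p_up) * 10**(k - 1)

    print(other_box_expectation(3))            # 3400, matching the post
    print(other_box_expectation(3) / 10**3)    # 17/5, i.e. 3.4 times the opened box
    [/code]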
     
  18. Aug 20, 2010 #17

    Jonathan Scott

    Gold Member

    All you've done here is add more complicated numbers to the original fallacy. What you have still missed is the linkage between whether switching will increase or decrease the result and whether the initial amount is the larger or smaller one.
     
  19. Aug 20, 2010 #18
    Do you think that seeing the initial amount can give you any information to determine whether it's the larger or the smaller amount?
     