
Can someone explain if this is true?

  1. Jul 9, 2011 #1
    I believe that it is correct, and it concisely represents a concept I have never understood. It seems obvious to me that the chance of getting the higher bill would be 50/50, and if you ran enough simulations then you would do no better by consistently choosing to switch than by choosing to keep the tenner. Yet I have heard this concept of expected payoff repeatedly, and it contradicts the above.

    Can you explain it?

    Suppose you know there are two bills in a hat and that one bill is twice the size of the other bill. You draw one bill from the hat and it is a $10 bill. You are then given the option to exchange your $10 bill for the other bill in the hat…do you make the exchange?

    Based on expected values, you should make the exchange. The expected value of the trade is 0.5*5 + 0.5*20 = 12.50, given that the other bill is a $5 bill half the time and a $20 bill half the time.
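    The expected value above can be checked with a quick simulation (a sketch, assuming the (5,10) and (10,20) hats are equally likely a priori and the draw within a hat is uniform):

    ```python
    import random

    # Assumed setup: the hat is (5, 10) or (10, 20) with equal probability,
    # and we draw one of its two bills uniformly at random.
    random.seed(0)
    trials = 100_000
    other_total = 0
    count = 0
    for _ in range(trials):
        hat = random.choice([(5, 10), (10, 20)])
        drawn = random.choice(hat)
        if drawn == 10:                      # condition on having drawn the $10
            other_total += hat[0] + hat[1] - drawn
            count += 1
    print(other_total / count)               # close to 12.5 = 0.5*5 + 0.5*20
    ```

    Conditioning on the drawn bill being $10, the average value of the other bill comes out near $12.50, matching the expected-value calculation.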
     
  3. Jul 9, 2011 #2

    gb7nash

    Homework Helper

    You're looking at two different things. Expected value is tied to the law of large numbers: if you repeat the experiment a 'large' number of times, the average result will approach the expected value.

    Hopefully this answers your question.
     
    Last edited: Jul 9, 2011
  4. Jul 9, 2011 #3
    I believe that clears most of my confusion.

    Thank you
     
  5. Jul 9, 2011 #4
    Well... this reasoning is valid if you know a priori that (5, 10) and (10, 20) are equally probable. Let's assume that is in fact the case. You are right that if I switch every time, no matter which bill I pull out of the hat, I will not do any better than if I keep the bill I pull out every time. But of course I'm not going to do that. If I get a 5, I'm going to switch. If I get a 20, I'm going to keep it. It is only if I get a 10 that I will switch. So, you see, my actions are influenced by what I draw.

    Now, you are right to say "the chance of getting the higher bill would be 50/50" if I switch on 10. I make the right decision half the time and the wrong decision half the time. But the conclusion you draw, "you would do no better by consistently choosing to switch than by choosing to keep the tenner," doesn't follow. My wrong decisions cost me $5, but my right decisions gain me $10. Since I make wrong and right decisions at equal frequency, and the right decisions gain more than the wrong ones cost, I come out ahead.

    Another case: say you present me with (5,10) and (10,20) with equal frequency, but you trick me, so that my draw is not really random: I get the 10 every time. Then the choice of switching or keeping is just the choice between getting a 10 every time, or getting half 5s and half 20s. Obviously I win by doing the second.

    Finally, suppose you cheat by giving me a (5,10) hat 2/3 of the time and (10,20) 1/3. Now if I switch I lose $5 2/3 of the time and gain $10 1/3 of the time (−$5·2/3 + $10·1/3 = 0), and so I break even.
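    Both cases in this post can be checked at once with a simulation of the "switch on $10" strategy (a sketch; the parameter p_small, the probability of the (5,10) hat, is my own naming):

    ```python
    import random

    def switch_gain(p_small, trials=100_000, seed=2):
        """Average gain per switch on a drawn $10, when the (5,10) hat
        appears with probability p_small and (10,20) otherwise."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(trials):
            if rng.random() < p_small:
                total += 5 - 10    # other bill is $5: lose $5
            else:
                total += 20 - 10   # other bill is $20: gain $10
        return total / trials

    print(switch_gain(0.5))        # near +2.5: switching wins with equal priors
    print(switch_gain(2 / 3))      # near 0: break even under the 2/3 vs 1/3 cheat
    ```

    With equal priors the switcher averages about +$2.50 per trade, and with the 2/3 vs 1/3 cheat the average gain collapses to roughly zero, as argued above.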
     