
EV and Logic: My Brain Hurts

  1. Jul 31, 2014 #1
    So, I'm having a pretty lazy day and decided to do some thinking. And, while this usually turns out to be a splendid idea, today it resulted in me hurting myself. Psychically.

    So yeah. I've been through a lot of toy games trying to figure out a way to reconcile this all in my brain box, but here's the basic gist of it:

    Part 1) I tell you to pull out a $100 bill from your pocket and put it on the table. I'm going to make you a proposition that you can either accept or decline. If you accept, I will flip a fair coin and you will call it in the air. If you win, you trade your $100 for $200 of mine; if you lose, you trade your $100 for $50 of mine. What is the value of accepting?

    Part 2) I have two envelopes, one containing twice the amount of money as the other, and both containing some positive amount of money. I hand you one of them. You can switch anywhere from 0 to 100 times. If you played the game 101 times, choosing a different number of switches each time, in which instance(s) would you do best?

    Part 3) Same as Part 2, but this time you look inside before being given the option to switch. You find $10,000.

    What difference, if any, is there between Parts 1 and 3?
     
  3. Aug 1, 2014 #2

    phinds

    Gold Member
    2016 Award

    Have you made any attempt to solve this yourself? If so, show your work. If not, then go do it and THEN show your work.
     
  4. Aug 1, 2014 #3

    WWGD

    Science Advisor
    Gold Member

    This seems a bit like the Monty Hall problem and the controversy over Marilyn vos Savant's answer (I've heard her called Marilyn "Idiot Savant"), arguably because the problem was ill-posed as she stated it. I think if you look up those names you'll get the background.
     
  5. Aug 1, 2014 #4
    Yeah, you're right.

    I've only worked on this in my head, and the questions posed are essentially that work, i.e. I formulated each question specifically to make earlier versions harder and to raise objections to possible solutions of those earlier versions.

    In Part 1, it seems hard to convince myself that there isn't a gain in accepting using some version of:

    0.5(200) + 0.5(50) - 100 = 25
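
    Spelling that same arithmetic out per trial, as a gain relative to the $100 put on the table:

    [tex]E[\text{gain}] = \tfrac{1}{2}(200 - 100) + \tfrac{1}{2}(50 - 100) = 50 - 25 = 25[/tex]

    i.e. +$25 per trial on average.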

    In Part 2, it seems hard to convince myself that there can be any gain by switching, though I have a harder time proving it to myself.

    In Part 3, it seems hard to convince myself of specifically how the problem differs from either Part 1 or Part 2.
     
  6. Aug 1, 2014 #5

    phinds

    Gold Member
    2016 Award

    Yeah, I agree that if it's a 50/50 bet, then accepting is a no-brainer.

    In part 2, I don't see how any number of switches is meaningful.

    Part 3 seems to be an incomplete problem statement.
     
  7. Aug 1, 2014 #6
    For Part 3, assume you receive the same offer as in Part 2, with the exception that you are allowed to look inside the envelope handed to you before you choose whether or not you want to switch. Inside is some positive dollar amount. Do you want to switch, and if so, how many times?

    For clarification, I'm having a hard time finding meaningful differences between a) looking inside the envelope, seeing $100, and then being given the opportunity to switch, and b) Part 1.
     
  8. Aug 1, 2014 #7

    D H

    Staff Emeritus
    Science Advisor

    Part 1 is a no-brainer. You accept. Your expected gain is +$50 if you accept, zero if you don't.

    Part 2 is the well-known two envelopes problem. Wikipedia link: http://en.wikipedia.org/wiki/Two_envelopes_problem. There is no agreed-upon solution to this problem.

    Part 3 is a specialization of the two envelopes problem.


    There's a huge difference. In Part 1 you put your own $100 at risk; with the two envelopes there's nothing of yours to lose. If you see $X and stay you'll get $X. If you switch, you'll either get $X/2 or $2X. There's no losing in this proposition.


    To me, the resolution of the two envelopes problem depends on two things, my perception of the value of money and my perception of the benefactor.

    My perceived value of money is not convex. This throws out a lot of decision theory that is based on convex value curves. Suppose I open the envelope and see two dollars. If I switch, I might get one dollar, I might get four. One dollar, two dollars, four dollars: they're all paltry sums of money. I might as well switch. Suppose instead I see two thousand dollars. There's a lot I can buy with two thousand dollars that I can't buy with just one thousand. I'll probably stay with that two thousand and say "thank you very much!" Finally, suppose I see two million dollars. If I switch and guess wrong I still get a million. That's a lot of money, but not enough for a decent retirement. Four million: that's enough for a decent retirement. Might as well go for it.

    That's assuming there's a reasonable chance that the benefactor has four million dollars to spare. If I perceive that there's no way the benefactor has four million to spare, I'd be better off staying with the two million I already have.
     
  9. Aug 1, 2014 #8
    I'm guessing you meant +$25 if you accept, since you net $50 per two trials. And yeah, I suppose I'm kinda trying to come to an agreement, at least with myself, or at least convince myself that I can't, but it's obviously kinda difficult :P

    Perhaps it's my gambling for food that has made me take certain things for granted in my thinking, but when you say there's no losing in this proposition, I assume you're using the term 'losing' as synonymous with 'risk,' and my brain has a hard time accepting that statement as true. By agreeing to switch while knowing the outcomes are either {move from X to .5X, move from X to 2X}, it's hard for me to see the flaw in conceptualizing this as risking .5X to win X.
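
    One way to see why blindly switching can't be +EV, despite the "risk .5X to win X" framing, is to simulate the whole game: fix how the pair of amounts is generated, hand over one envelope at random, and compare always-staying with always-switching. This is only a minimal sketch in Python, with a made-up distribution for the smaller amount and illustrative function names:

[code]
import random

def play_once(switch: bool) -> float:
    """Play one round of the two-envelope game and return the amount kept."""
    # Draw the smaller amount from some arbitrary (assumed) distribution;
    # the other envelope then holds twice as much.
    small = random.choice([1, 10, 100, 1000])
    pair = (small, 2 * small)
    # You are handed one of the two envelopes uniformly at random.
    kept, other = random.sample(pair, 2)
    return other if switch else kept

def average(switch: bool, trials: int = 200_000) -> float:
    return sum(play_once(switch) for _ in range(trials)) / trials

if __name__ == "__main__":
    # Both strategies average the same, even though switching always
    # looks like trading X for either X/2 or 2X.
    print("always stay:  ", round(average(False), 2))
    print("always switch:", round(average(True), 2))
[/code]

    Both strategies come out the same on average, because blind switching just changes which of the two envelopes you end up holding; the apparent +0.25X only shows up once you condition on the amount you see, and that's where the question of how the amounts were generated (see the later posts) comes in.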

    This isn't very appealing to me, personally. Having understood the idea of utility, it still seems to me like you're using arbitrary thoughts to make a decision, i.e. your orientation isn't based on anything objective, and thus seems liable to be in any given state at any given time. And if I tell myself I'm fine with that being the case, then I can essentially say anything I choose to do is justifiable, simply because I decided that it was. E.g. you said you'd stick with $2k because you could think of "a lot I can buy with $2k that I can't buy with just $1k," while making the opposite decision when faced with $2MM, even though the same statement applies using $2MM and $1MM.
     
  10. Aug 2, 2014 #9

    disregardthat

    Science Advisor

    I think the reason for this paradox is that the distribution from which the smaller (or equivalently, the larger) envelope amount was drawn is not given. If you think the (positive) amount of money in the smaller envelope is totally random, what sense do we give that? A uniform distribution on the set of integers [itex][1,\infty)[/itex]? You can't have that.
     
  11. Aug 2, 2014 #10
    But if we specify a proper probability measure, it's allowed, right? Is anybody able to show me both why it can't be uniform if the set is infinite and why it (suddenly) becomes doable when the distribution is non-uniform?
     
  12. Aug 2, 2014 #11

    D H

    Staff Emeritus
    Science Advisor

    My bad. You're right; the expected value of accepting is +$25 per trial.

    Per Monty Python, a Norwegian Blue parrot has very beautiful plumage. A dead Norwegian Blue might still have beautiful plumage, but it is nonetheless an ex-parrot. Per Matthew Rabin and Richard H. Thaler (http://harbaugh.uoregon.edu/Readings/Risk/Rabin%202001%20JEP%20Risk%20Aversion%20Anomalies.pdf), expected utility theory similarly has very beautiful plumage, and like the parrot in the Monty Python skit, "expected utility is an ex-hypothesis".

    Expected utility theory has its place (Rabin thinks that place is the trash bin), but it is perhaps a bit too simplistic and definitely is not universal. People do switch from risk seeking to risk aversion, and vice versa, depending on circumstance. Just because someone is risk averse in a moderate setting does not mean they are risk averse everywhere. People oftentimes are risk seeking when risk is small and/or positive outcomes are large.


    Exactly. So if you are playing the game you have to reason under uncertainty. In this case, it's extreme uncertainty because the underlying distribution is a complete unknown.
     
  13. Aug 2, 2014 #12

    WWGD

    Science Advisor
    Gold Member

    In a discrete uniform distribution on a set S, every element has the same probability 1/|S|, where |S| is the size (cardinality) of S.
     
  14. Aug 2, 2014 #13
    Sure. Let's try to have a uniform distribution on the natural numbers. The measure of each number is some constant e, the same for every number, otherwise the distribution fails to be uniform. e must be greater than zero, otherwise the sum of the measures is zero, when it must be one. But for every e greater than zero, the sum of the measures over all the numbers is infinite, so again it fails to be one.
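
    In symbols, writing e for the common measure of each natural number:

    [tex]\sum_{n=1}^{\infty} e = \begin{cases} 0, & e = 0 \\ \infty, & e > 0 \end{cases}[/tex]

    so no choice of e makes the total equal 1.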

    In contrast, set the measure of each number n to [itex]1/2^n[/itex]. The sum of the measures is one.
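
    Once a proper prior like that is fixed, "should I switch given what I see?" actually has a computable answer. Here's a minimal sketch, assuming the smaller envelope holds 2^k dollars with weight 1/2^k (truncated at some K_MAX so everything stays finite); the function names are made up for illustration:

[code]
from fractions import Fraction

K_MAX = 20  # truncation (an assumption) so the prior has finitely many values

def prior_small(amount: int) -> Fraction:
    """Prior probability that the *smaller* envelope holds `amount` dollars.
    Assumed prior: the smaller envelope holds 2^k with weight 1/2^k, k = 1..K_MAX."""
    total = sum(Fraction(1, 2**k) for k in range(1, K_MAX + 1))
    for k in range(1, K_MAX + 1):
        if amount == 2**k:
            return Fraction(1, 2**k) / total
    return Fraction(0)

def expected_if_switch(seen: int) -> Fraction:
    """Conditional expected value of switching, given the amount you see."""
    # You can see `seen` either because it's the smaller amount (the other
    # envelope holds 2*seen) or because it's the larger amount (the other
    # holds seen/2). Each envelope is handed over with probability 1/2, so
    # that factor cancels and the weights are just the prior probabilities.
    w_small = prior_small(seen)
    w_large = prior_small(seen // 2) if seen % 2 == 0 else Fraction(0)
    if w_small + w_large == 0:
        raise ValueError("amount impossible under this prior")
    return (w_small * 2 * seen + w_large * Fraction(seen, 2)) / (w_small + w_large)

if __name__ == "__main__":
    for seen in (2, 8, 1024, 2**(K_MAX + 1)):
        ev = expected_if_switch(seen)
        print(f"see ${seen}: staying keeps {seen}, switching is worth {float(ev):.2f} on average")
[/code]

    With this particular prior the switch is exactly break-even for every amount in the interior, favourable only at the smallest possible amount, and unfavourable only at the largest; change the prior and those break-even points move, which is the sense in which the unspecified distribution drives the whole paradox.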
     
  15. Aug 2, 2014 #14
    He is using an arbitrary and subjective utility function. There is no rule that forbids this.
     