EV and Decision Making: How Utility Functions Impact Our Choices

In summary: I don't. Why not? The reason is that I don't trust the person who gave me the envelope. If they were to give me an envelope with $10,000 in it, I would definitely switch. The reason I don't trust them is that they could have easily substituted a worthless envelope for the $100 one. Given the choice between $100 and nothing, I would rather take nothing. Overall, the poster is having a hard time justifying why they would switch in Part 2, when they would switch in Part 1 if they had more information.
  • #1
grotto
So, I'm having a pretty lazy day and decided to do some thinking. And, while this usually turns out to be a splendid idea, today it resulted in me hurting myself. Psychically.

So yeah. I've been through a lot of toy games trying to figure out a way to reconcile this all in my brain box, but here's the basic gist of it:

Part 1) I tell you to pull out a $100 bill from your pocket and put it on the table. I'm going to make you a proposition, and you can either accept or decline. If you accept, I will flip a fair coin and you will call it in the air. If you win, you trade your $100 for $200 of mine; if you lose, you trade your $100 for $50 of mine. What is the value of accepting?

Part 2) I have two envelopes, one containing twice as much money as the other, and both containing some positive amount of money. I hand you one of them. You can switch anywhere from 0 to 100 times. If you ran the game 101 times, choosing a different number of switches each time, in which instance(s) would you do best?

Part 3) Same as Part 2, but this time you look inside before being given the option to switch. You find $10,000.

What difference, if any, is there between Parts 1 and 3?
 
  • #2
Have you made any attempt to solve this yourself? If so, show your work. If not, then go do it and THEN show your work.
 
  • #3
This seems a bit like the Monty Hall problem and the vos Savant controversy (I've heard her called "Marilyn Idiot Savant"), arguably because the problem was ill-posed by her. I think if you look up the names you'll get the background.
 
  • #4
phinds said:
Have you made any attempt to solve this yourself? If so, show your work. If not, then go do it and THEN show your work.

Yeah, you're right.

I only worked on this in my head, and the questions posed are essentially part of the work I've done, i.e. I formulated them specifically to make earlier versions of the questions harder and to pose problems for possible solutions to those earlier versions.

In Part 1, it seems hard to convince myself that there isn't a gain in accepting using some version of:

.5(200) + .5(50) - 100 = 25

In Part 2, it seems hard to convince myself that there can be any gain by switching, though I have a harder time proving it to myself.

In Part 3, it seems hard to convince myself of specifically how the problem differs from either Part 1 or Part 2.
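
For what it's worth, the Part 1 arithmetic can be sanity-checked with a minimal Python simulation (a sketch assuming a fair coin and the payoffs as stated):

[code]
import random

def part1_gain():
    """One accepted offer: call a fair coin flip; win -> trade $100 for $200, lose -> trade $100 for $50."""
    if random.random() < 0.5:
        return 200 - 100   # win: net +$100
    return 50 - 100        # lose: net -$50

trials = 1_000_000
average = sum(part1_gain() for _ in range(trials)) / trials
print(f"average gain per accepted offer: {average:+.2f}")  # hovers around +25.00
[/code]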
 
  • #5
Yeah, I agree that if it's a 50/50 bet, then accepting is a no-brainer.

In part 2, I don't see how any number of switches is meaningful.

Part 3 seems to be an incomplete problem statement.
 
  • #6
phinds said:
Yeah, I agree that if it's a 50/50 bet, then accepting is a no-brainer.

In part 2, I don't see how any number of switches is meaningful.

Part 3 seems to be an incomplete problem statement.

For Part 3, assume you receive the same offer as in Part 2, with the exception that you are allowed to look inside the envelope handed to you before you choose whether or not you want to switch. Inside is a positive real number. Do you want to switch, and if so, how many times?

For clarification, I'm having a hard time finding meaningful differences between a) looking inside the envelope, seeing $100, and then being given the opportunity to switch, and b) Part 1.
 
  • #7
Part 1 is a no-brainer. You accept. Your expected gain is +$50 if you accept, zero if you don't.

Part 2 is the well-known two envelopes problem. Wikipedia link: http://en.wikipedia.org/wiki/Two_envelopes_problem. There is no agreed-upon solution to this problem.

Part 3 is a specialization of the two envelopes problem.


grotto said:
For clarification, I'm having a hard time finding meaningful differences between a) looking inside the envelope, seeing $100, and then being given the opportunity to switch, and b) Part 1.
There's a huge difference. There's nothing to lose with the two envelopes problem. If you see $X and stay you'll get $X. If you switch, you'll either get $X/2 or $2X. There's no losing in this proposition.


To me, the resolution of the two envelopes problem depends on two things, my perception of the value of money and my perception of the benefactor.

My perceived value of money is not convex. This throws out a lot of decision theory that is based on convex value curves. Suppose I open the envelope and see two dollars. If I switch, I might get one dollar, I might get four. One dollar, two dollars, four dollars: They're all paltry sums of money. I might as well switch. Suppose instead I see two thousand dollars. There's a lot I can buy with two thousand dollars that I can't buy with just one thousand. I'll probably stay with that two thousand and say "thank you very much!" Finally suppose I see two million dollars. If I switch and guessed wrong I still get a million. That's a lot of money, but not enough for a decent retirement. Four million: That's enough for a decent retirement. Might as well go for it.

That's assuming there's a reasonable chance that the benefactor has four million dollars to spare. If I perceive that there's no way that the benefactor has four million to spare, I'd be better off staying with the two million I already have.
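
To make that concrete, here is a toy Python sketch of a non-convex, step-like valuation of money of the kind described above; the utility numbers and the assumption that the benefactor can pay at most $4 million are invented purely for illustration:

[code]
def perceived_value(dollars):
    """Made-up, step-like 'value of money' -- not a smooth utility curve."""
    if dollars < 2_000:
        return 1.0    # paltry-to-modest sums
    if dollars < 1_000_000:
        return 2.0    # "a lot I can buy with $2k that I can't with $1k"
    if dollars < 4_000_000:
        return 3.0    # a million or two: a lot, but not a decent retirement
    return 10.0       # $4M and up: enough for a decent retirement

def should_switch(seen, benefactor_max=4_000_000):
    """Switch if the average perceived value of switching is at least that of staying."""
    doubled = min(2 * seen, benefactor_max)  # assume the benefactor can't pay more than this
    stay = perceived_value(seen)
    switch = 0.5 * perceived_value(seen // 2) + 0.5 * perceived_value(doubled)
    return switch >= stay   # on a tie, "might as well switch"

for seen in (2, 2_000, 2_000_000):
    print(f"${seen:,}: {'switch' if should_switch(seen) else 'stay'}")
[/code]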
 
  • #8
D H said:
Part 1 is a no-brainer. You accept. Your expected gain is +$50 if you accept, zero if you don't.

Part 2 is the well-known two envelopes problem. Wikipedia link: http://en.wikipedia.org/wiki/Two_envelopes_problem. There is no agreed-upon solution to this problem.

I'm guessing you meant +$25 if you accept, as you net $50 per 2 occurrences. And yeah, I suppose I'm kinda trying to come to an agreement, at least with myself, or at least convince myself that I can't, but it's obviously kinda difficult :P

D H said:
Part 3 is a specialization of the two envelopes problem.

There's a huge difference. There's nothing to lose with the two envelopes problem. If you see $X and stay you'll get $X. If you switch, you'll either get $X/2 or $2X. There's no losing in this proposition.

Perhaps it's my gambling for food that has made me take certain things for granted in my thinking, but when you say there's no losing in this proposition, I assume you're using the term 'losing' as synonymous with 'risk,' and my brain has a hard time accepting that statement as true. By agreeing to switch while knowing the outcomes are either {move from X to .5X, move from X to 2X}, it's hard for me to see the flaw in conceptualizing this as risking .5X to win X.
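
For the no-peek game of Part 2, one way to put numbers on the "risking .5X to win X" framing is a small Python simulation with the pair of amounts fixed in advance (a sketch; it assumes the envelope you're handed is chosen by a fair coin flip, which the problem statement doesn't actually specify):

[code]
import random

def play(small=100, switch=False):
    """Envelopes hold `small` and 2*`small`; you are handed one of them at random."""
    envelopes = [small, 2 * small]
    random.shuffle(envelopes)
    mine, other = envelopes
    return other if switch else mine

trials = 500_000
stay_avg   = sum(play(switch=False) for _ in range(trials)) / trials
switch_avg = sum(play(switch=True)  for _ in range(trials)) / trials
print(f"stay: {stay_avg:.1f}   switch: {switch_avg:.1f}")  # both ~150, i.e. 1.5 * small
[/code]

Under those assumptions, staying and switching average the same 1.5 times the smaller amount, so the "risk .5X to win X" step only changes the expectation if you know something about how the amounts were chosen, which is exactly what Part 3 and the peeking question are about.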

D H said:
To me, the resolution of the two envelopes problem depends on two things, my perception of the value of money and my perception of the benefactor.

My perceived value of money is not convex. This throws out a lot of decision theory that is based on convex value curves. Suppose I open the envelope and see two dollars. If I switch, I might get one dollar, I might get four. One dollar, two dollars, four dollars: They're all paltry sums of money. I might as well switch. Suppose instead I see two thousand dollars. There's a lot I can buy with two thousand dollars that I can't buy with just one thousand. I'll probably stay with that two thousand and say "thank you very much!" Finally suppose I see two million dollars. If I switch and guessed wrong I still get a million. That's a lot of money, but not enough for a decent retirement. Four million: That's enough for a decent retirement. Might as well go for it.

This isn't very appealing to me, personally. Having understood the idea of utility, it still seems to me like you're using arbitrary thoughts to make a decision, i.e. your orientation isn't based in anything objective, and thus seems subject to be in any given state at any given time. And if I tell myself I'm fine with that being the case, then I can essentially just say anything I choose to do is justifiable, simply because I decided that it was. E.g. you said you'd stick with $2k because you could think of "a lot I can buy with $2k that I can't buy with just $1k," while making the opposite decision when faced with $2MM, even though the same statement applies using $2MM and $1MM.
 
  • #9
I think the reason for this paradox is that the distribution from which the amount in the smaller (or, equivalently, the larger) envelope was selected is not given. If you think the (positive) amount of money in the smaller envelope is totally random, what sense do we give this? A uniform distribution on the set of integers [itex][1,\infty)[/itex] ? You can't have that.
 
  • #10
disregardthat said:
A uniform distribution on the set of integers [itex][1,\infty)[/itex] ? You can't have that.

But if we set a probability measure, it's allowed, right? Is anybody able to show me both why it can't be uniform if infinite and why it's (suddenly) doable when the distribution is abnormal or whatever?
 
  • #11
grotto said:
I'm guessing you meant +$25 if you accept, as you net $50 per 2 occurrences. And yeah, I suppose I'm kinda trying to come to an agreement, at least with myself, or at least convince myself that I can't, but it's obviously kinda difficult :P
My bad. You're right; the expected value of accepting is +$25 per trial.

grotto said:
This isn't very appealing to me, personally. Having understood the idea of utility, it still seems to me like you're using arbitrary thoughts to make a decision, i.e. your orientation isn't based in anything objective, and thus seems subject to be in any given state at any given time.
Per Monty Python, a Norwegian Blue parrot has very beautiful plumage. A dead Norwegian Blue might still have beautiful plumage, but it is nonetheless an ex-parrot. Per Matthew Rabin and Richard H. Thaler (http://harbaugh.uoregon.edu/Readings/Risk/Rabin%202001%20JEP%20Risk%20Aversion%20Anomalies.pdf), expected utility theory similarly has very beautiful plumage, and like the parrot in the Monty Python skit, "expected utility is an ex-hypothesis".

Expected utility theory has its place (Rabin thinks that place is the trash bin), but it is perhaps a bit too simplistic and definitely is not universal. People do switch from risk seeking to risk aversion, and vice versa, depending on circumstance. Just because someone is risk averse in a moderate setting does not mean they are risk averse everywhere. People oftentimes are risk seeking when risk is small and/or positive outcomes are large.
disregardthat said:
I think the reason for this paradox is that the distribution from which the amount in the smaller (or, equivalently, the larger) envelope was selected is not given. If you think the (positive) amount of money in the smaller envelope is totally random, what sense do we give this? A uniform distribution on the set of integers [itex][1,\infty)[/itex] ? You can't have that.
Exactly. So if you are playing the game you have to reason under uncertainty. In this case, it's extreme uncertainty because the underlying distribution is a complete unknown.
 
  • #12
grotto said:
But if we set a probability measure, it's allowed, right? Is anybody able to show me both why it can't be uniform if infinite and why it's (suddenly) doable when the distribution is abnormal or whatever?

In a discrete uniform distribution on a set [itex]S[/itex], every element has the same probability [itex]1/|S|[/itex], where [itex]|S|[/itex] is the size/cardinality of [itex]S[/itex].
 
  • #13
grotto said:
But if we set a probability measure, it's allowed, right? Is anybody able to show me both why it can't be uniform if infinite and why it's (suddenly) doable when the distribution is abnormal or whatever?

Sure. Let's try to put a uniform distribution on the natural numbers. The measure of each number is some [itex]\epsilon[/itex], the same for every number (otherwise the distribution fails to be uniform), and the measures must sum to one. If [itex]\epsilon = 0[/itex], the sum of the measures is zero. If [itex]\epsilon > 0[/itex], the sum of the measures is infinite. Either way it fails to be one, so no such uniform distribution exists.

In contrast, set the measure of each number [itex]n[/itex] to [itex]1/2^n[/itex]. The sum of the measures is one.
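
Spelled out as a worked sum: [itex]\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1[/itex], a convergent geometric series, whereas [itex]\sum_{n=1}^{\infty} \epsilon[/itex] diverges for every fixed [itex]\epsilon > 0[/itex] and is zero for [itex]\epsilon = 0[/itex], so no constant choice of [itex]\epsilon[/itex] can sum to one.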
 
  • #14
grotto said:
This isn't very appealing to me, personally. Having understood the idea of utility, it still seems to me like you're using arbitrary thoughts to make a decision, i.e. your orientation isn't based in anything objective, and thus seems subject to be in any given state at any given time. And if I tell myself I'm fine with that being the case, then I can essentially just say anything I choose to do is justifiable, simply because I decided that it was. E.g. you said you'd stick with $2k because you could think of "a lot I can buy with $2k that I can't buy with just $1k," while making the opposite decision when faced with $2MM, even though the same statement applies using $2MM and $1MM.

He is using an arbitrary and subjective utility function. There is no rule that forbids this.
 

1. What is EV and Logic?

EV stands for expected value, a concept in probability theory that represents the probability-weighted average outcome of a random variable. Logic, on the other hand, refers to the principles and methods used to reason and draw valid conclusions. The phrase "EV and Logic" describes the practice of combining expected value calculations with logical reasoning.
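
As a worked formula: for a discrete random variable [itex]X[/itex] taking values [itex]x_i[/itex] with probabilities [itex]p_i[/itex], the expected value is [itex]E[X] = \sum_i p_i x_i[/itex]. The Part 1 proposition above is an instance: [itex]0.5(200) + 0.5(50) - 100 = 25[/itex], an expected gain of $25 per accepted offer.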

2. How does EV and Logic relate to decision making?

EV and Logic are crucial components of decision making because they help individuals weigh the potential outcomes of a decision and make rational choices based on those weights. By combining expected value calculations with logical reasoning, individuals can make more informed decisions.

3. Can you give an example of using EV and Logic in real life?

One example of using EV and Logic in real life is deciding whether or not to purchase a lottery ticket. The expected value of a lottery ticket is typically negative, meaning that, on average, the ticket pays out less than it costs. By considering the expected value, individuals can make a more informed decision about whether to spend money on a lottery ticket.
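
As a minimal Python sketch with made-up odds and prizes (a real lottery's numbers would differ), the calculation might look like this:

[code]
# Hypothetical lottery, numbers invented purely for illustration.
ticket_price = 2.00
prizes = [
    (1_000_000, 1 / 10_000_000),  # (prize in dollars, probability of winning it)
    (100,       1 / 50_000),
    (2,         1 / 25),          # "free ticket" tier
]

# Expected value per ticket: probability-weighted payout minus the ticket price.
ev = sum(prize * prob for prize, prob in prizes) - ticket_price
print(f"expected value per ticket: ${ev:.2f}")  # about -$1.82: negative, as noted above
[/code]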

4. How can EV and Logic be used in scientific research?

EV and Logic can be used in scientific research to help determine the potential risks and benefits of a particular experiment or study. By calculating the expected value of different outcomes, researchers can make more informed decisions about the design and execution of their studies.

5. What are some limitations of using EV and Logic?

One limitation of using EV and Logic is that it relies on assumptions and probability, which may not always accurately predict real-world outcomes. Additionally, it can be challenging to assign a numerical value to certain variables, making the expected value calculation less precise. Furthermore, EV and Logic may not account for personal preferences or emotions, which can also influence decision making.
