
Risk Probability Question W/ Expected Value and Standard Deviation

  1. Nov 20, 2012 #1
    Hi everyone. I am currently in a club that prepares students for technical interviews for jobs such as investment banking, private equity and hedge funds. One of our mentors assigned us this question and to be honest I really do not have an idea how to approach it. I'm not sure if I am completely missing something that is obvious. I have not taken a math class in a couple years so I am definitely rusty.

    Consider a purely probabilistic game that you have the opportunity to play. Each time you play there are n potential known outcomes x1, x2, ..., xn (each of which is a specified gain or loss of dollars according to whether xi is positive or negative). These outcomes x1, x2, ..., xn occur with the known probabilities p1, p2, ..., pn respectively (where p1 + p2 + ... + pn = 1.0 and 0 <= pi <= 1 for each i). Furthermore, assume that each play of the game takes up one hour of your time, and that only you can play the game (you can't hire someone to play for you).

    Let E be the game's expected value and S be the game's standard deviation.

    1. In the real world, should a rational player always play this game whenever the
    expected value E is not negative? Why or why not?

    2. Does the standard deviation S do a good job of capturing how risky this game is?
    Why or why not?

    3. If you personally had to decide whether or not to play this game, how would
    you decide?
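Before tackling the three questions, it helps to pin down E and S concretely. Here is a minimal sketch of computing them for a small game; the outcomes and probabilities are made-up example numbers, not anything specified in the problem:

```python
# Expected value and standard deviation of a one-play probabilistic game.
# The outcomes and probabilities are invented for illustration.
import math

outcomes = [100.0, -50.0, 0.0]   # x1..xn, dollar gains/losses
probs    = [0.3,    0.2,  0.5]   # p1..pn, must sum to 1

assert abs(sum(probs) - 1.0) < 1e-12

E = sum(p * x for p, x in zip(probs, outcomes))              # expected value
var = sum(p * (x - E) ** 2 for p, x in zip(probs, outcomes)) # variance
S = math.sqrt(var)                                           # standard deviation

print(E, S)  # for these numbers: E = 20.0, S = sqrt(3100) ≈ 55.7
```

Note that E > 0 here even though the game can lose you $50 on a single play, which is exactly the tension questions 1 and 2 are probing.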

  3. Nov 21, 2012 #2



    To answer a question like this completely you need a utility function. This encapsulates the relationship between how much a loss hurts you and how much a gain profits you - it isn't just money. Taking out insurance against losses that would seriously affect your life makes sense in utility theory, even though the average result is a profit to the insurance company. Under the same analysis, buying lottery tickets is incomprehensible.
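The insurance point can be made numerically. The following sketch uses log utility (a standard concave, risk-averse choice) with invented wealth, loss, and premium figures; it shows expected utility favoring insurance even though the premium exceeds the expected loss:

```python
# Sketch: a concave (risk-averse) utility can make insurance rational even
# though the insurer profits on average. All numbers are hypothetical.
import math

wealth  = 100_000.0
loss    = 80_000.0
p_loss  = 0.01
premium = 1_000.0         # exceeds the expected loss of 800, so the insurer wins on average

def u(w):                 # log utility: an illustrative concave utility function
    return math.log(w)

eu_no_insurance = p_loss * u(wealth - loss) + (1 - p_loss) * u(wealth)
eu_insurance    = u(wealth - premium)       # loss fully covered, premium paid for sure

print(eu_insurance > eu_no_insurance)  # True: insurance preferred despite negative EV
```

Flip the concavity (a risk-seeking utility) and the same arithmetic can rationalize lottery tickets, which is why the choice of utility function carries all the weight here.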
    You could model the game as a random walk. You will have limited assets, and you need an income, so there is an absorbing barrier (bankruptcy) that advances linearly. So the downside of the game is the risk of hitting that barrier in your lifetime. The upside of the game is retiring comfortably. Utility theory gives you the tradeoff between the two, and you then have to compare the result with alternative ways of earning a living (none are risk free).
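The random-walk picture can be explored with a quick Monte Carlo sketch: start with finite capital, play once per "hour", and count how often the absorbing barrier (bankruptcy) is hit. The payoffs below are invented, and the barrier is held fixed at zero rather than advancing, so this is a simplification of the linearly advancing barrier described above:

```python
# Monte Carlo sketch of the random-walk view with a fixed absorbing barrier
# at zero wealth. Game payoffs are hypothetical, with positive expected value.
import random

def ruin_probability(capital, n_plays, n_trials=20_000, seed=0):
    rng = random.Random(seed)
    outcomes = [100.0, -50.0, 0.0]   # invented payoffs, E = +20 per play
    probs    = [0.3,    0.2,  0.5]
    ruined = 0
    for _ in range(n_trials):
        w = capital
        for _ in range(n_plays):
            w += rng.choices(outcomes, weights=probs)[0]
            if w <= 0:               # absorbing barrier: once broke, you stop
                ruined += 1
                break
    return ruined / n_trials

print(ruin_probability(capital=100.0, n_plays=1000))
```

Even with a positive drift, the estimated ruin probability is well above zero when starting capital is small relative to the per-play swings, which is the downside the utility tradeoff has to weigh.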
  4. Nov 22, 2012 #3

    Stephen Tashi


    I'm not in finance, but my effort to mind-read the purpose of that question is as follows:

The most elementary point is that if the game has some possibility of a negative gain, then you have to worry about the problem of "gambler's ruin" even if the expected value of the gain is positive. In other words, you can't base your decisions on the assumption that you can play the game as many times as you want, because you might run out of cash (or whatever the "gain" is being measured in). Various pundits have commented on the recent financial crises in the USA, and one of the points they make is that a weakness of conventional financial forecasting is that it doesn't take into account the possibility of rare events leading to gambler's ruin.

    To decide whether (and how often) to play the game, you would have to evaluate your utility function. (For example, one important question is whether you enjoy the game. Would you be willing to pay (i.e. to lose) a certain amount of money just to play?) Another factor is "opportunity cost". You should compute the expected gain per hour and compare this to what you could earn per hour doing other possible activities.
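The opportunity-cost check is just an hourly comparison. A minimal sketch, with all figures invented (the game's payoffs and the outside wage are placeholders):

```python
# Sketch of the opportunity-cost comparison: the game's expected hourly
# gain versus an outside hourly wage. All numbers are hypothetical.
outcomes = [100.0, -50.0, 0.0]
probs    = [0.3,    0.2,  0.5]

E_per_hour = sum(p * x for p, x in zip(probs, outcomes))  # one play = one hour
outside_wage = 25.0                                       # alternative job, $/hr

if E_per_hour > outside_wage:
    print("game beats the outside option on expectation")
else:
    print("the outside option pays better on average")
```

A full decision would then adjust both sides by your utility function, since the wage is (nearly) certain while the game's hourly gain is not.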

I'm not sure what the question wants you to say about the standard deviation. Let's assume each play of the game is an independent event (i.e. you don't get better with practice). A big standard deviation increases the probability of "gambler's ruin" and also increases the probability of larger-than-expected gains in a given number of plays.

The expected gain in N plays is N times the expected gain in 1 play. The standard deviation of the gain in N plays is [itex] \sqrt{N} [/itex] times the standard deviation of 1 play. So both the expected gain and the standard deviation increase, but the standard deviation increases in a "sub-linear" manner. You can make playing the game many times look attractive (if it has positive expected gain) by phrasing things in terms of ratios. If we look at the expected gain "per hour", it is the same for 1 play as for N plays. But the standard deviation of the gain per hour is smaller for N plays than for 1 play. I don't know what conclusion your financial interviewer wants you to draw from these facts, but you should know the facts themselves.
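Those scaling facts are easy to verify numerically. A short sketch, using arbitrary per-play values for E and S:

```python
# Over N independent plays, the expected total gain scales like N while its
# standard deviation scales like sqrt(N), so the per-hour standard deviation
# shrinks as S/sqrt(N). The per-play E and S here are arbitrary examples.
import math

E, S = 20.0, math.sqrt(3100.0)   # per-play mean and standard deviation

for N in (1, 100, 10_000):
    total_mean  = N * E                  # grows linearly in N
    total_sd    = math.sqrt(N) * S       # grows sub-linearly in N
    per_hour_sd = total_sd / N           # = S / sqrt(N), shrinks with N
    print(N, total_mean, round(total_sd, 1), round(per_hour_sd, 3))
```

The shrinking per-hour standard deviation is the sense in which repetition "diversifies" the game over time, though it does nothing to remove the gambler's-ruin risk along the way.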