
Variance Question

  Oct 4, 2013 #1
    Gridvvk
    X ~ standard uniform random variable
    We toss a fair coin and define
    Y := { X if the coin toss is heads
          { 1 if the coin toss is tails

    The question asks for Var(Y^p) for any p > 0.

    My work:
    Var(Y^p) = E(Y^(2p)) - E(Y^p)^2

    I'm not sure how to go about finding E(Y^p) and E(Y^(2p)). I thought of using moment generating functions, but the preferred method is supposed to utilize conditional probabilities. Any hint on how to compute E(Y^p) would help.

    Thanks
     
  Oct 4, 2013 #2
    chiro (Science Advisor)

    Hey Gridvvk.

    If your coin toss is a Bernoulli random variable C (for coin), then your random variable Y is given by:

    Y = C + (1-C)X where C = 0 corresponds to heads and C = 1 corresponds to tails. You could also switch the values around to give you:

    Y = (1-C) + CX.

    I would suggest that you find the conditional distribution of Y given C first. To do this you have:

    P(Y=1|C=1) = 1, P(Y=y|C=0) = 1, P(C=0) = P(C=1) = 0.5. From this you obtain the joint distribution and get:

    P(A=a|B=b) = P(A=a,B=b)/P(B=b) which implies P(A=a,B=b) = P(A=a|B=b)*P(B=b).

    You then use the joint distribution over all possibilities to get the moments and thus the Var[Y^p].

    Note that you will be mixing integrals and sums together.
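    For instance, with the fair coin above (just a sketch of that mixing, via the law of total expectation):

    [tex] E[Y^p] = P(C=0)\,E[Y^p \mid C=0] + P(C=1)\,E[Y^p \mid C=1] = \frac{1}{2}\int_0^1 y^p\,dy + \frac{1}{2}\cdot 1^p, [/tex]

    where the integral is the continuous (uniform) part and the last term is the point mass at 1.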
     
  Oct 5, 2013 #3
    haruspex (Science Advisor, Homework Helper)

    I'm not at all sure what you mean by that. Rather than work with P(Y|C), how about going straight to E(Y^p|C)?
     
    Last edited: Oct 5, 2013
  Oct 5, 2013 #4
    Ray Vickson (Science Advisor, Homework Helper)

    Besides the other suggestions, you could do it by determining the probability distribution of Y. For example, it is not hard to get ##F(y) = P\{Y \leq y\}## for y ≥ 0, and from that get ##F_p(z) = P\{ Y^p \leq z \}## for z ≥ 0. Since ##Z = Y^p \geq 0## you can get its expected value by using the more-or-less standard expression
    [tex] EZ = \int_0^{\infty} P\{ Z > z \} \, dz[/tex]
    Similarly, you can get ##E Y^{2p}##.
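    A small clarifying sketch of that second step: for ##p > 0## the map ##t \mapsto t^p## is increasing on ##[0, \infty)##, so for ##z \geq 0##

    [tex] F_p(z) = P\{ Y^p \leq z \} = P\{ Y \leq z^{1/p} \} = F(z^{1/p}). [/tex]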
     
  Oct 5, 2013 #5
    chiro (Science Advisor)

    It has a uniform distribution over [0,1] which is P(X=x) = 1 where x is in [0,1].
     
  Oct 6, 2013 #6
    Gridvvk
    Thanks for all the hints and quick replies. I saw them right away, but didn't fully know how to proceed, so I thought I'd come back and look at it later and figure it out, but I'm drawing a blank.

    I really liked the idea of letting Y = C + (1-C)X, where C = 0 corresponds to heads and C = 1 corresponds to tails. But I wasn't sure exactly how that translates into getting the joint distribution, since Y is defined piecewise as a mixed random variable.

    Instead I tried finding E[Y] = 1/2 * E[X] + 1/2 * 1 = 1/2 * 1/2 + 1/2 * 1 = 3/4.
    I thought E[Y^2] = 1/2 * E[X^2] + 1/2 * 1 = 1/2 * 1/3 + 1/2 * 1 = 2/3

    But this seems to give E[Y^2] - E[Y]^2 < 0, which cannot happen, so I must be going about it the wrong way.


    To determine the CDF, don't I need the PDF of Y? Let's suppose I do get it; how can one go from ##F(y) = P\{Y \leq y\}## to ##F_p(z) = P\{ Y^p \leq z \}##?
     
  Oct 6, 2013 #7
    haruspex (Science Advisor, Homework Helper)

    Only if you redefine P(X=x) to mean dP[X<x]/dx.
     
  Oct 6, 2013 #8
    chiro (Science Advisor)

    This is just a conditional distribution where C = 1: what do you find wrong with this?
     
  Oct 6, 2013 #9
    haruspex (Science Advisor, Homework Helper)

    I think you've checked E[Y^2] - E[Y] instead of E[Y^2] - E[Y]^2.
     
  Oct 6, 2013 #10
    haruspex (Science Advisor, Homework Helper)

    X has a uniform distribution, so continuous. The probability that it takes any specific value is zero.
     
  Oct 6, 2013 #11
    Ray Vickson (Science Advisor, Homework Helper)

    If I tell you that ##Y^p \leq z## what can you tell me about Y?
     
  Oct 6, 2013 #12
    Gridvvk
    Oh, that is true, but when I try to generalize the same method for E[Y^p] I experience some unexpected results. But then again, I didn't use any conditional probabilities this way.

    Var[Y^p] = E[Y^(2p)] - E[Y^p]^2

    E[Y^p] = (1/2) * E[X^p] + 1/2 * 1 = 1/2(1 / (p + 1)) + 1/2 = (p + 2) / (2p + 2)
    where I used the power rule for integration to get E[X^p] = 1 / (p + 1).

    Similarly, E[Y^(2p)] = (1/2) * E[X^(2p)] + (1/2) * 1 = 1/2(1 / (2p + 1)) + 1/2 = (p + 1) / (2p + 1)

    Var[Y^p] = (p + 1) / (2p + 1) - [(p + 2) / (2p + 2)]^2 = p^2(2p + 3) / [4(p + 1)^2(2p + 1)]

    However, the limit of Var[Y^p] as p tends to infinity is 1/4, and not 0 as I thought it would be by the law of large numbers.


    I'm not entirely sure, would it be ##Y \leq z## as well?
     
  Oct 6, 2013 #13
    haruspex (Science Advisor, Homework Helper)

    In that limit, Y^p is equally likely to be 1 or 0 (a fair Bernoulli variable, which has variance (1/2)(1/2) = 1/4). Sounds like a var of 1/4 to me.
     
  Oct 6, 2013 #14
    Gridvvk
    Yes, that does make sense. Just to clarify, the reason this doesn't violate the law of large numbers is that the law doesn't necessarily say that any variance dies out?

    Does that mean the method I utilized is correct, even though I did not rely on any conditional probabilities?
     
  Oct 6, 2013 #15
    Ray Vickson (Science Advisor, Homework Helper)

    NO! Think about it. If ##Y \leq 1/2## do you honestly believe that ##Y^{10}## can be as large as 1/2, or does it have to be a lot less than 1/2? Conversely, if you know that ##Y^{10} \leq 1/2## do you really think that Y cannot be larger than 1/2?
     
  Oct 6, 2013 #16
    Gridvvk
    I thought you were referring to a random variable, Y, so I wasn't sure if that property was true for higher moments. But yes, if you keep taking higher powers of a number in (0,1) you approach 0. Conversely, if Y^p is bounded by a z in (0,1), then the larger p gets, the closer to 1 the bound on Y can be.

    I fail to see the connection this has with the problem though.
     
  Oct 6, 2013 #17
    Ray Vickson (Science Advisor, Homework Helper)

    For ##Z = Y^p## you can get a simple, explicit formula for ##P(Z \leq z)## and can use that to get explicit expressions for ##EZ## and ##\text{Var}\,Z##. In other words, if you know the probability distribution of a random variable you can use that to get the mean and variance, etc.
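    For example, under the fair-coin mixture above (just sketching what that explicit formula would look like): for ##0 \leq z < 1##,

    [tex] P(Z \leq z) = P\{ Y \leq z^{1/p} \} = \tfrac{1}{2} z^{1/p}, \qquad P(Z \leq z) = 1 \ \text{for } z \geq 1, [/tex]

    so

    [tex] EZ = \int_0^1 P\{ Z > z \}\, dz = \int_0^1 \left( 1 - \tfrac{1}{2} z^{1/p} \right) dz = 1 - \frac{1}{2}\cdot\frac{p}{p+1} = \frac{p+2}{2p+2}, [/tex]

    which agrees with the conditional-expectation answer obtained earlier in the thread.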
     
  Oct 6, 2013 #18
    Gridvvk
    Var[Z] = E[Z^2] - E[Z]^2

    Doesn't this correspond directly to what I did, without making the substitution ##Z = Y^p##?

    Var[Y^p] = E[Y^(2p)] - E[Y^p]^2

    E[Y^p] = (1/2) * E[X^p] + 1/2 * 1 = 1/2(1 / (p + 1)) + 1/2 = (p + 2) / (2p + 2)
    where I used the power rule for integration to get E[X^p] = 1 / (p + 1).

    Similarly, E[Y^(2p)] = (1/2) * E[X^(2p)] + (1/2) * 1 = 1/2(1 / (2p + 1)) + 1/2 = (p + 1) / (2p + 1)

    Var[Y^p] = (p + 1) / (2p + 1) - [(p + 2) / (2p + 2)]^2 = p^2(2p + 3) / [4(p + 1)^2(2p + 1)]

    Unless my work/answer was wrong, or by letting Z = Y^p you simply streamline the thinking and present it in a better manner.
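    As a numerical sanity check, here is a quick Monte Carlo sketch of that formula (it assumes a fair coin and uses numpy; the variable names are just for illustration):

    [code]
    import numpy as np

    # Check Var[Y^p] = p^2 (2p + 3) / (4 (p + 1)^2 (2p + 1)) by simulation,
    # where Y = X on heads (X ~ Uniform[0,1]) and Y = 1 on tails, fair coin.
    rng = np.random.default_rng(0)
    n = 1_000_000

    x = rng.uniform(0.0, 1.0, size=n)        # the uniform part
    heads = rng.integers(0, 2, size=n) == 1  # fair coin: True means heads
    y = np.where(heads, x, 1.0)              # Y = X on heads, Y = 1 on tails

    for p in [0.5, 1.0, 2.0, 10.0]:
        sim = np.var(y ** p)
        exact = p**2 * (2*p + 3) / (4 * (p + 1)**2 * (2*p + 1))
        print(f"p = {p:5.1f}:  simulated {sim:.5f}   formula {exact:.5f}")
    [/code]

    The simulated values should match the closed form to a few decimal places, including the approach toward 1/4 for large p.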
     
  Oct 6, 2013 #19
    Ray Vickson (Science Advisor, Homework Helper)

    No problem. These are just two (slightly different) ways of doing the same problem. If you prefer one way, go for it.
     
  Oct 6, 2013 #20
    chiro (Science Advisor)

    You are way too anal.

    Stating P(X=x) = 1 for x in [0,1] is a very standard way of describing a probability density function.

    You don't need to bring up measure theory: it's entirely unnecessary for this problem.
     