Change of random variables

  1. Jul 17, 2011 #1
    Hoel's An Introduction to Mathematical Statistics introduces the following formulas for expectation, where the density f is zero outside the interval [a,b].

    [tex]E\left [ X \right ] = \int_{a}^{b} x f(x) \; dx[/tex]

    [tex]E\left [ g(X) \right ] = \int_{a}^{b} g(x) f(x) \; dx[/tex]

    He says, "Let the random variable g(X) be denoted by Y. Then knowing the density f(x) of X it is theoretically possible to find the density h(y) of Y. The expected value of g(X) is the same as the expected value of Y; therefore if h(y) is available, the latter expected value can be expressed in the form

    [tex]E\left [ Y \right ] = \int_{-\infty}^{\infty} y h(y) \; dy.[/tex]

    "By using the change of variable techniques of calculus, it can be shown that this value is the same as the value given by (22) [the 2nd formula I've quoted in this post]."


    I've been trying to do this. Let I denote the identity function on [itex]\mathbb{R}[/itex]. Let [itex]f_X[/itex] denote the pdf of the distribution induced by a random variable X, and [itex]F_X[/itex] its cdf. I'm guessing that when the expected value of a distribution is expressed like this in terms of a random variable, [itex]E[X][/itex] is to be understood as [itex]E[P_X][/itex], and [itex]E[g(X)][/itex] as [itex]E[P_{g \circ X}][/itex], where [itex]P_X[/itex] means the distribution induced by the random variable X, given some sample space implicit in the context.

    Then expectation is defined by

    [tex]E[P_X]=\int_a^b I \cdot f_X,[/tex]

    and we must show that

    [tex]\int_a^b I \cdot f_{g \circ X} = \int_a^b g \cdot f_X,[/tex]

    or do the limits need to be changed? Using the chain rule (integration by substitution) and the identity

    [tex]F_{g \circ X}=F_X \circ g,[/tex]

    leads me to

    [tex]\int_a^b I \cdot f_{g \circ X} = \int_{g(a)}^{g(b)} I \cdot f_X[/tex]

    which looks tantalisingly close, but am I going in the right direction?
     
  2. Jul 17, 2011 #2
    This is sometimes called the Law of the Unconscious Statistician, so you might try searching for sources under that name. I'm not sure how to make your approach work, so I'll give a slightly different one. In Sheldon Ross's A First Course in Probability, he shows this by first proving the lemma
    [tex]
    \mathbf{E}[Y] =\int_0^\infty \mathbf{P}\{Y > y \} \, dy - \int_0^\infty \mathbf{P}\{Y < -y \} \, dy
    [/tex]
    for any random variable Y. (This is a pretty straightforward proof: just switch the order of integration using the pdf of Y.) After that, he sets Y = g(X), and by switching the order of integration once more, the result falls out. I can go into more detail if you'd like, but I hope this helps!
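
    I'm not quoting Ross verbatim, but the key step of the lemma should look like this (assuming for simplicity that Y is nonnegative with density [itex]f_Y[/itex], and integrating over the region 0 < y < t in either order):

    [tex]\int_0^\infty \mathbf{P}\{Y > y \} \, dy = \int_0^\infty \int_y^\infty f_Y(t) \, dt \, dy = \int_0^\infty \left ( \int_0^t dy \right ) f_Y(t) \, dt = \int_0^\infty t \, f_Y(t) \, dt = \mathbf{E}[Y].[/tex]

    Then with Y = g(X) and [itex]\mathbf{P}\{g(X) > y\} = \int_{\{x \,:\, g(x) > y\}} f(x) \, dx[/itex], the same interchange turns [itex]\int_0^\infty \mathbf{P}\{g(X) > y\} \, dy[/itex] into [itex]\int g(x) f(x) \, dx[/itex] when g is nonnegative; the second integral in the lemma takes care of the negative part.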
     