
Homework Help: Expectation operator - linearity

  1. Dec 8, 2012 #1
    1. The problem statement, all variables and given/known data
    Show that the expectation operator E(·) is a linear operator, i.e. that [tex]E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y}).[/tex]

    2. Relevant equations

    With [tex]f_{\bar{x}}[/tex] the probability density function of the random variable ##\bar{x}##.

    3. The attempt at a solution
    [tex]aE(\bar{x})=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex] and:

    [tex]bE(\bar{y})=b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

    Introducing a new random variable:

    [tex]\bar{v}=a\bar{x}+b\bar{y}[/tex]

    And accordingly:

    [tex]E(\bar{v})=\int_{-\infty}^{+\infty}vf_{\bar{v}}(v)dv[/tex]

    So what remains to prove is that:

    [tex]\int_{-\infty}^{+\infty}vf_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx+b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]
    And now I am stuck... I don't know how I can relate the p.d.f. of random variable v to the p.d.f.'s of random variables x and y.

    Thank you in advance!
  3. Dec 8, 2012 #2
    Haven't you seen some kind of result that gives the pdf of X+Y in terms of the pdfs of X and Y? Hint: it has to do with convolution.
  4. Dec 8, 2012 #3
    Thank you for your reply.

    I know that the probability density of the sum of two or more random variables is the convolution of their individual pdfs, but as far as I know that is only valid for independent random variables. ##E(aX+bY)=aE(X)+bE(Y)##, on the other hand, is true in general, right?
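    (Editorial aside, not part of the original thread: the point that linearity survives dependence can be made concrete with a short Monte Carlo run. The choice Y = X² + noise below is my own illustrative example of two strongly dependent variables; the identity holds exactly even at the level of sample averages, because the sample mean is itself linear.)

    ```python
    # Sketch: check E(aX + bY) = aE(X) + bE(Y) for clearly *dependent* X and Y.
    # Illustrative setup: Y = X^2 + noise, so X and Y are far from independent.
    import random

    random.seed(0)
    n = 200_000
    a, b = 2.0, -3.0

    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    ys = [x * x + random.gauss(0.0, 0.5) for x in xs]

    def mean(vals):
        return sum(vals) / len(vals)

    lhs = mean([a * x + b * y for x, y in zip(xs, ys)])  # sample mean of aX + bY
    rhs = a * mean(xs) + b * mean(ys)                    # a*(sample EX) + b*(sample EY)

    print(lhs, rhs)  # identical up to floating-point rounding
    ```

    No convolution or independence assumption enters anywhere; the two quantities agree term by term.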
  5. Dec 8, 2012 #4



    Yes to both.
  6. Dec 8, 2012 #5

    Ray Vickson


    Depending on the level of rigor required, you may have to qualify things a bit. For example, if Y = -X, then E(X+Y) = EX + EY is false if EX = +∞ (and hence EY = -∞), because we would be trying to equate 0 to ∞ - ∞, which is not allowed.

    So, the easiest way is to assume that both EX and EY are finite. Now you really have a 2-stage task:
    (1) Prove the "theorem of the unconscious statistician", which says that if Z = g(X,Y), then
    [tex] EZ \equiv \int z \: dF_Z(z) [/tex]
    can be written as
    [tex] \int g(x,y) \: d^2F_{XY}(x,y),[/tex]
    which becomes
    [tex] \sum_{x,y} g(x,y)\: P_{XY}(x,y)[/tex]
    in the discrete case where there are no densities, and becomes
    [tex] \int g(x,y) f_{XY}(x,y) \, dx \, dy [/tex]
    in the continuous case where there are densities. Of course, the result is also true in a mixed continuous-discrete case where there are densities and point-mass probabilities, but then we need to write Stieltjes integrals, etc. Then, you need to do the much easier task of proving that
    [tex] \int (ax + by) f_{XY}(x,y) \, dx \, dy
    = a \int x f_X(x) \, dx + b \int y f_Y(y) \, dy = a EX + b EY .[/tex]
    The last form holds in general, whether the random variables are continuous, discrete or mixed. As you say, it holds even when X and Y are dependent.
    Last edited: Dec 8, 2012
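    (Editorial aside, not from the thread: the final identity in the post above, ∫∫(ax+by)f_XY(x,y)dxdy = aEX + bEY, can be sanity-checked numerically. The correlated bivariate normal below, and all parameter names, are my own illustrative choices; the integral is approximated with a midpoint rule on a wide grid.)

    ```python
    # Numeric sketch of: integral of (a*x + b*y) * f_XY(x, y) dx dy = a*E[X] + b*E[Y],
    # for a *correlated* bivariate normal (dependence does not break linearity).
    import math

    mu_x, mu_y, sx, sy, rho = 1.0, -2.0, 1.0, 1.5, 0.7
    a, b = 2.0, 3.0

    def f_xy(x, y):
        """Bivariate normal density with correlation rho."""
        zx, zy = (x - mu_x) / sx, (y - mu_y) / sy
        q = (zx * zx - 2 * rho * zx * zy + zy * zy) / (1 - rho * rho)
        return math.exp(-q / 2) / (2 * math.pi * sx * sy * math.sqrt(1 - rho * rho))

    # Midpoint rule on a grid wide enough that the truncated tails are negligible.
    h = 0.05
    xs = [mu_x - 8 + h * (i + 0.5) for i in range(int(16 / h))]
    ys = [mu_y - 10 + h * (j + 0.5) for j in range(int(20 / h))]

    lhs = sum((a * x + b * y) * f_xy(x, y) * h * h for x in xs for y in ys)
    rhs = a * mu_x + b * mu_y
    print(lhs, rhs)  # both close to 2*1 + 3*(-2) = -4
    ```

    Swapping in any other joint density (or any correlation) leaves the agreement intact, which is exactly the "holds in general" point made above.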
  7. Dec 9, 2012 #6
    Hmm, so there really is no elementary proof? (I guess it depends on what you call elementary, though.) This kind of makes me happy that I know measure theory; it makes the proof of this result so much simpler.