
Expectation operator - linearity

  1. Dec 8, 2012 #1
    1. The problem statement, all variables and given/known data
    Show that the expectation operator E(·) is a linear operator, i.e. that:
    [tex]E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y})[/tex]

    2. Relevant equations
    [tex]E(\bar{x})=\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex]

    With [tex]f_{\bar{x}}[/tex] the probability density function of the random variable [tex]\bar{x}[/tex].

    3. The attempt at a solution
    [tex]aE(\bar{x})=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex] and:
    [tex]bE(\bar{y})=b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

    Introducing a new random variable:
    [tex]\bar{v}=a\bar{x}+b\bar{y}[/tex]

    Then:

    [tex]E(\bar{v})=E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}vf_{\bar{v}}(v)dv=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv[/tex]

    And accordingly:
    [tex]E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv[/tex]

    So what remains to prove is that:

    [tex]a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx+b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

    And now I am stuck... I don't know how to relate the p.d.f. of the random variable [tex]\bar{v}[/tex] to the p.d.f.s of the random variables [tex]\bar{x}[/tex] and [tex]\bar{y}[/tex].

    Thank you in advance!
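
    (As a quick numerical sanity check of the identity I am trying to prove, here is a minimal sketch of my own for one concrete case. It relies on the known fact that a linear combination of independent normal variables is again normal, so [tex]f_{\bar{v}}[/tex] is available in closed form; the distributions and coefficients are arbitrary choices.)

    [code]
    import numpy as np

    # One concrete case: independent x ~ N(1, 1) and y ~ N(-2, 4).
    # Then v = a*x + b*y is normal with mean a*1 + b*(-2) and
    # variance a^2 * 1 + b^2 * 4, so f_v is known in closed form.
    a, b = 2.0, 3.0
    t = np.arange(-60.0, 60.0, 0.001)

    def normal_pdf(t, mu, sigma):
        return np.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

    E_x = np.trapz(t * normal_pdf(t, 1.0, 1.0), t)    # ~  1.0
    E_y = np.trapz(t * normal_pdf(t, -2.0, 2.0), t)   # ~ -2.0
    mu_v = a * 1.0 + b * (-2.0)                       # = -4.0
    sd_v = np.sqrt(a ** 2 * 1.0 + b ** 2 * 4.0)
    E_v = np.trapz(t * normal_pdf(t, mu_v, sd_v), t)  # E(v) computed from f_v directly

    print(E_v, a * E_x + b * E_y)                     # both ~ -4.0
    [/code]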
     
  3. Dec 8, 2012 #2

    micromass


    Haven't you seen some kind of result that gives the pdf of X+Y in terms of the pdfs of X and Y?? Hint: it has to do with convolution.
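
    A minimal numerical sketch of that hint, with two normal distributions of my own choosing: for independent X and Y, the density of X+Y is the convolution of the two individual densities, and the mean computed from the convolved density comes out as E(X) + E(Y).

    [code]
    import numpy as np

    dx = 0.01
    x = np.arange(-10.0, 10.0, dx)

    def normal_pdf(t, mu, sigma):
        return np.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

    f_x = normal_pdf(x, 1.0, 1.0)    # X ~ N( 1, 1)
    f_y = normal_pdf(x, -2.0, 0.5)   # Y ~ N(-2, 0.25)

    # For INDEPENDENT X and Y, the pdf of X+Y is the convolution f_x * f_y.
    f_sum = np.convolve(f_x, f_y) * dx           # Riemann-sum approximation of the integral
    s = 2.0 * x[0] + np.arange(f_sum.size) * dx  # support grid of the convolved density

    print(np.trapz(s * f_sum, s))  # ~ -1.0 = E(X) + E(Y)
    [/code]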
     
  4. Dec 8, 2012 #3
    Thank you for your reply.

    I know that the probability density of the sum of two or more random variables is the convolution of their individual pdfs, but as far as I know this is only valid for independent random variables. Whereas ##E(aX+bY)=aE(X)+bE(Y)## is true in general, right?
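
    (A quick Monte Carlo illustration of exactly that point, my own sketch: here Y = X² is a deterministic function of X, so the two are as dependent as they can possibly be, yet the identity still holds.)

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    a, b = 2.0, 3.0

    x = rng.standard_normal(n)  # X ~ N(0, 1), so E(X) = 0 and E(X^2) = 1
    y = x ** 2                  # Y is a deterministic function of X: maximal dependence

    # Linearity of expectation does not care about the dependence.
    print(np.mean(a * x + b * y))           # ~ a*0 + b*1 = 3.0
    print(a * np.mean(x) + b * np.mean(y))  # same value
    [/code]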
     
  5. Dec 8, 2012 #4

    haruspex


    Yes to both.
     
  6. Dec 8, 2012 #5

    Ray Vickson


    Depending on the level of rigor required, you may have to qualify things a bit. For example, if Y = -X and EX = ∞ (so EY = -∞), then E(X+Y) = EX + EY fails, because we would be trying to equate 0 to ∞ - ∞, which is not allowed.

    So, the easiest way is to assume that both EX and EY are finite. Now you really have a 2-stage task:
    (1) Prove the "theorem of the unconscious statistician", which says that if Z = g(X,Y), then
    [tex] EZ \equiv \int z \: dF_Z(z) [/tex]
    can be written as
    [tex] \int g(x,y) \: d^2F_{XY}(x,y),[/tex]
    which becomes
    [tex] \sum_{x,y} g(x,y)\: P_{XY}(x,y)[/tex]
    in the discrete case where there are no densities, and becomes
    [tex] \int g(x,y) f_{XY}(x,y) \, dx \, dy [/tex]
    in the continuous case where there are densities. Of course, the result is also true in a mixed continuous-discrete case where there are densities and point-mass probabilities, but then we need to write Stieltjes integrals, etc. Then, you need to do the much easier task of proving that
    [tex] \int (ax + by) f_{XY}(x,y) \, dx \, dy
    = a \int x f_X(x) \, dx + b \int y f_Y(y) \, dy = a EX + b EY .[/tex]
    The last form holds in general, whether the random variables are continuous, discrete or mixed. As you say, it holds even when X and Y are dependent.
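
    To spell out that last, easier step (my own fill-in; it only uses the marginal densities ##f_X(x)=\int f_{XY}(x,y)\,dy## and ##f_Y(y)=\int f_{XY}(x,y)\,dx##, together with Fubini's theorem to split and reorder the integrals, which is justified once EX and EY are finite):

    [tex] \int\!\!\int (ax+by)\, f_{XY}(x,y)\,dx\,dy
    = a\int x \left(\int f_{XY}(x,y)\,dy\right) dx + b\int y \left(\int f_{XY}(x,y)\,dx\right) dy
    = a\int x\, f_X(x)\,dx + b\int y\, f_Y(y)\,dy. [/tex]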
     
    Last edited: Dec 8, 2012
  7. Dec 9, 2012 #6

    micromass


    Hmm, so there really is no elementary proof?? (I guess it depends on what you call elementary though). This kind of makes me happy that I know measure theory, it makes the proof of this result so much simpler.
     