Expectation value of the sum of two random variables

  1. The problem statement, all variables and given/known data
    The expectation value of the sum of two random variables is given as:

    [tex] \langle x + y \rangle = \langle x \rangle + \langle y \rangle[/tex]

    My textbook provides the following derivation of this relationship.

    Suppose that we have two random variables, x and y. Let [tex]p_{ij}[/tex] be the probability that our measurement returns [tex]x_{i}[/tex] for the value of x and [tex]y_{j}[/tex] for the value of y. Then the expectation value of the sum [tex]x+y[/tex] is:

    [tex] \langle x + y \rangle = \sum\limits_{ij} p_{ij} (x_i + y_j) = \sum\limits_{ij} p_{ij} x_i + \sum\limits_{ij} p_{ij} y_j [/tex]

    Then I am given the following statement:

    But [tex]\sum\limits_j p_{ij}[/tex] is the probability that we measure [tex]x_i[/tex] regardless of what we measure for y, so it must be equal to [tex]p_i[/tex]. Similarly, [tex]\sum\limits_i p_{ij} = p_j[/tex] is the probability of measuring [tex]y_j[/tex] irrespective of what we get for x.




    2. Relevant equations

    The difficulty I have with this statement is that I do not see how [tex]\sum\limits_j p_{ij}[/tex] can be equal to [tex] p_i[/tex].


    3. The attempt at a solution

    Summing over j, we should have [tex]p_{i1} + p_{i2} + \cdots + p_{in}[/tex]. Now, is this equal to [tex]p_i[/tex]?

    And similarly, how can [tex]\sum\limits_i p_{ij}[/tex] be equal to [tex]p_j[/tex]?

    I am hopeful that someone can clear this up for me.

    Thank you for your kind assistance.

    jg370

  2. lanedance (Homework Helper)

    so you have two discrete random variables X and Y, with a joint distribution [tex]p_{ij}[/tex]:
    [tex]p_{ij} = P(X=x_i, Y = y_j)[/tex]

    the expectation is given by:
    [tex] \langle X+Y \rangle = \sum_{ij} p_{ij} (x_i + y_j) = \sum_{ij} p_{ij} x_i + \sum_{ij} p_{ij} y_j [/tex]

    By definition, the marginal probabilities are
    [tex]P(X=x_i) = \sum_{j} p_{ij} = p_i [/tex]
    [tex]P(Y=y_j) = \sum_{i} p_{ij} = p_j [/tex]
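
    Writing out why: the events [tex]Y = y_j[/tex] are mutually exclusive and one of them must occur, so summing the joint probabilities over all j counts every way of measuring [tex]x_i[/tex]; that is exactly why [tex]p_{i1} + p_{i2} + \cdots + p_{in} = p_i[/tex]. Substituting the marginals into the double sums then finishes the derivation:
    [tex] \langle X+Y \rangle = \sum_i x_i \sum_j p_{ij} + \sum_j y_j \sum_i p_{ij} = \sum_i p_i x_i + \sum_j p_j y_j = \langle X \rangle + \langle Y \rangle [/tex]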

    If the variables are independent, then you have the further condition that
    [tex]p_{ij} = P(X=x_i, Y = y_j) = P(X=x_i)P(Y = y_j) = p_i p_j [/tex]
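
    If it helps to see it with numbers, here is a quick Python sketch (my own illustration, not from any textbook; the joint table is made up, and deliberately chosen so that X and Y are not independent). It checks that the row/column sums of [tex]p_{ij}[/tex] give the marginals and that [tex]\langle X+Y \rangle = \langle X \rangle + \langle Y \rangle[/tex] holds regardless:

[code]
import numpy as np

# Made-up joint distribution p_ij for X in {1, 2} and Y in {10, 20, 30}.
# Rows index i (values of X), columns index j (values of Y); entries sum to 1.
x = np.array([1.0, 2.0])
y = np.array([10.0, 20.0, 30.0])
p = np.array([[0.10, 0.25, 0.15],
              [0.20, 0.05, 0.25]])

# Marginals: summing over j collapses rows to p_i, summing over i collapses columns to p_j.
p_i = p.sum(axis=1)   # [0.5, 0.5]
p_j = p.sum(axis=0)   # [0.3, 0.3, 0.4]

# <X+Y> straight from the joint distribution ...
lhs = sum(p[i, j] * (x[i] + y[j])
          for i in range(len(x)) for j in range(len(y)))

# ... and <X> + <Y> from the marginals.
rhs = (p_i * x).sum() + (p_j * y).sum()

# Both print 22.5 (up to float rounding), even though p_ij != p_i * p_j here,
# i.e. the identity does not require independence.
print(lhs, rhs)
[/code]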
     