Expectation value of the sum of two random variables


by jg370
Tags: expectation, random, variables
jg370
#1
Dec1-10, 06:52 AM
P: 18
1. The problem statement, all variables and given/known data
The expectation value of the sum of two random variables is given as:

[tex] \langle x + y \rangle = \langle x \rangle + \langle y \rangle[/tex]

My textbook provides the following derivation of this relationship.

Suppose that we have two random variables, x and y. Let [tex]p_{ij}[/tex] be the probability that our measurement returns [tex]x_{i}[/tex] for the value of x and [tex]y_{j}[/tex] for the value of y. Then the expectation value of the sum [tex]x+y[/tex] is:

[tex] \langle x + y \rangle = \sum\limits_{ij} p_{ij} (x_i + y_j) = \sum\limits_{ij} p_{ij} x_i + \sum\limits_{ij} p_{ij} y_j [/tex]

Then I am given the following statement:

But [tex]\sum\limits_j p_{ij}[/tex] is the probability that we measure [tex]x_i[/tex] regardless of what we measure for y, so it must be equal to [tex]p_i[/tex]. Similarly, [tex]\sum\limits_i p_{ij} = p_j[/tex] is the probability of measuring [tex]y_j[/tex] irrespective of what we get for x.




2. Relevant equations

The difficulty I have with this statement is that I do not see how [tex]\sum\limits_j p_{ij}[/tex] can be equal to [tex] p_i[/tex].


3. The attempt at a solution

Summing over j, we should have [tex]p_{i1} + p_{i2} + \dots + p_{in}[/tex]. Now, is this equal to [tex]p_i[/tex]?

And similarly, how can [tex]\sum\limits_i p_{ij}[/tex] be equal to [tex]p_j[/tex]?

I am hopeful that someone can clear this up for me.

Thank you for your kind assistance.

jg370
lanedance
#2
Dec1-10, 07:57 PM
HW Helper
P: 3,309
so you have two discrete random variables X & Y, with a joint distribution [tex]p_{ij}[/tex]:
[tex]p_{ij} = P(X=x_i, Y = y_j)[/tex]
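
For concreteness (a made-up example, not part of the original reply): with two values of X and two of Y, the joint distribution might be

[tex]\begin{array}{c|cc} & y_1 & y_2 \\ \hline x_1 & 0.1 & 0.3 \\ x_2 & 0.4 & 0.2 \end{array}[/tex]

so for instance [tex]p_{12} = P(X=x_1, Y=y_2) = 0.3[/tex], and the four entries sum to 1.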

the expectation is given by:
[tex] <X+Y> = \sum_{ij} p_{ij} (x_i + y_j) [/tex]

By definition, the marginal probabilities are
[tex]P(X=x_i) = \sum_{j} p_{ij} = p_i [/tex]
[tex]P(Y=y_j) = \sum_{i} p_{ij} = p_j [/tex]
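
To spell out the step being asked about (my own elaboration of the definitions above): for a fixed i, the events [tex]Y = y_j[/tex] are mutually exclusive and one of them must occur, so summing the joint probabilities over j leaves exactly the probability of [tex]X = x_i[/tex]:

[tex]\sum_j p_{ij} = \sum_j P(X=x_i, Y=y_j) = P(X=x_i) = p_i[/tex]

In the toy table above, [tex]p_1 = 0.1 + 0.3 = 0.4[/tex]. Substituting the marginals into the double sums then finishes the derivation:

[tex]\sum_{ij} p_{ij} x_i = \sum_i x_i \sum_j p_{ij} = \sum_i p_i x_i = <X>[/tex]

and similarly for the y-term, giving [tex]<X+Y> = <X> + <Y>[/tex].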

If the variables are independent then you have the further condition that
[tex]p_{ij} = P(X=x_i, Y = y_j) = P(X=x_i)P(Y = y_j) = p_i p_j [/tex]
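
As a quick numerical sanity check (my own sketch; the joint table and the values of x and y are made up), you can verify [tex]<X+Y> = <X> + <Y>[/tex] without any independence assumption:

[code]
import numpy as np

# Made-up 2x3 joint distribution: p[i, j] = P(X = x[i], Y = y[j]).
# Entries are arbitrary but sum to 1; X and Y are NOT independent here.
p = np.array([[0.10, 0.20, 0.15],
              [0.25, 0.05, 0.25]])
x = np.array([1.0, 2.0])       # values x_i
y = np.array([0.0, 3.0, 5.0])  # values y_j

# Marginals: sum the joint distribution over the other index.
p_i = p.sum(axis=1)  # P(X = x_i)
p_j = p.sum(axis=0)  # P(Y = y_j)

# <X> and <Y> computed from the marginals.
EX = np.dot(p_i, x)
EY = np.dot(p_j, y)

# <X + Y> computed directly from the joint: sum over i,j of p_ij * (x_i + y_j).
EXY = sum(p[i, j] * (x[i] + y[j])
          for i in range(len(x))
          for j in range(len(y)))

print(EX + EY, EXY)  # both equal 4.3 (up to floating-point rounding)
assert np.isclose(EX + EY, EXY)
[/code]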

