Expectation value of the sum of two random variables

SUMMARY

The expectation value of the sum of two discrete random variables, X and Y, satisfies <X + Y> = <X> + <Y>. The derivation involves the joint probability distribution p_{ij} = P(X = x_i, Y = y_j), which leads to the marginal probabilities P(X = x_i) = ∑_j p_{ij} = p_i and P(Y = y_j) = ∑_i p_{ij} = p_j. The discussion clarifies that the summation over j results in the marginal probability p_i, confirming the relationship between joint and marginal distributions.

PREREQUISITES
  • Understanding of discrete random variables
  • Familiarity with joint probability distributions
  • Knowledge of marginal probability concepts
  • Basic grasp of expectation values in probability theory
NEXT STEPS
  • Study the properties of joint and marginal distributions in probability theory
  • Learn about the independence of random variables and its implications
  • Explore the concept of expectation values in continuous random variables
  • Investigate the application of expectation values in statistical analysis
USEFUL FOR

Students of probability theory, statisticians, and anyone studying the behavior of random variables in mathematical contexts.

jg370

Homework Statement


The expectation value of the sum of two random variables is given as:

[tex]\langle x + y \rangle = \langle x \rangle + \langle y \rangle[/tex]

My textbook provides the following derivation of this relationship.

Suppose that we have two random variables, x and y. Let [tex]p_{ij}[/tex] be the probability that our measurement returns [tex]x_{i}[/tex] for the value of x and [tex]y_{j}[/tex] for the value of y. Then the expectation value of the sum [tex]x+y[/tex] is:

[tex]\langle x + y \rangle = \sum\limits_{ij} p_{ij} (x_i + y_j) =\sum\limits_{ij} p_{ij} x_i + \sum\limits_{ij} p_{ij} y_j[/tex]

Then I am given the following statement:

But [tex]\sum\limits_j p_{ij}[/tex] is the probability that we measure [tex]x_i[/tex] regardless of what we measure for y, so it must be equal to [tex]p_i[/tex]. Similarly, [tex]\sum\limits_i p_{ij} = p_j[/tex] is the probability of measuring [tex]y_j[/tex] irrespective of what we get for x.




Homework Equations



The difficulty I have with this statement is that I do not see how [tex]\sum\limits_j p_{ij}[/tex] can be equal to [tex]p_i[/tex].


The Attempt at a Solution



Summing over j, we should have [tex]p_{i1} + p_{i2} + \cdots + p_{in}[/tex]. Now, is this equal to [tex]p_i[/tex]?

And similarly, how can [tex]\sum\limits_i p_{ij}[/tex] be equal to [tex]p_j[/tex]?
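As a sanity check, this row sum can be tried on a small made-up joint table (the 2×3 matrix below is purely illustrative, not from the textbook): summing row i over j really does collapse to a single number, the marginal [tex]p_i[/tex].

```python
import numpy as np

# Made-up 2x3 joint distribution p_ij = P(X = x_i, Y = y_j); all entries sum to 1
p = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

# Summing row i over j gives p_i1 + p_i2 + ... + p_in, i.e. the marginal p_i
p_i = p.sum(axis=1)   # [0.40, 0.60]

# Summing column j over i gives the other marginal p_j
p_j = p.sum(axis=0)   # [0.35, 0.35, 0.30]

print(p_i, p_j)
```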

I am hopeful that someone can clear this up for me.

Thank you for your kind assistance.

jg370
 
So you have two discrete random variables, X and Y, with a joint distribution
[tex]p_{ij} = P(X=x_i, Y = y_j)[/tex]

the expectation is given by:
[tex]<X+Y> = \sum_{ij} p_{ij} (x_i + y_j) = \sum_{ij} p_{ij} x_i + \sum_{ij} p_{ij} y_j[/tex]

By definition, the marginal probabilities are
[tex]P(X=x_i) = \sum_{j} p_{ij} = p_i[/tex]
[tex]P(Y=y_j) = \sum_{i} p_{ij} = p_j[/tex]

If the variables are independent, then you have the further condition that
[tex]p_{ij} = P(X=x_i, Y = y_j) = P(X=x_i)P(Y = y_j) = p_i p_j[/tex]
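The whole chain can be checked numerically. The values and joint table below are made up for illustration (and deliberately chosen so that X and Y are *not* independent), which shows that <X+Y> = <X> + <Y> needs only the marginals, not independence:

```python
import numpy as np

# Made-up outcomes and a made-up, non-independent joint distribution
x = np.array([0.0, 1.0])
y = np.array([1.0, 2.0, 3.0])
p = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

# <X+Y> computed straight from the joint distribution
exp_sum = sum(p[i, j] * (x[i] + y[j])
              for i in range(len(x)) for j in range(len(y)))

# <X> and <Y> from the marginals p_i = sum_j p_ij and p_j = sum_i p_ij
exp_x = p.sum(axis=1) @ x
exp_y = p.sum(axis=0) @ y

print(exp_sum, exp_x + exp_y)  # equal, even though p_ij != p_i * p_j here
```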
 
