Expectation value of the sum of two random variables

SUMMARY

The expectation value of the sum of two discrete random variables, X and Y, satisfies <X + Y> = <X> + <Y>. The derivation uses the joint probability distribution p_{ij} = P(X = x_i, Y = y_j), whose marginals are P(X = x_i) = ∑_j p_{ij} = p_i and P(Y = y_j) = ∑_i p_{ij} = p_j. The discussion clarifies that summing over j yields the marginal probability p_i, confirming the relationship between joint and marginal distributions.

PREREQUISITES
  • Understanding of discrete random variables
  • Familiarity with joint probability distributions
  • Knowledge of marginal probability concepts
  • Basic grasp of expectation values in probability theory
NEXT STEPS
  • Study the properties of joint and marginal distributions in probability theory
  • Learn about the independence of random variables and its implications
  • Explore the concept of expectation values in continuous random variables
  • Investigate the application of expectation values in statistical analysis
USEFUL FOR

Students of probability theory, statisticians, and anyone studying the behavior of random variables in mathematical contexts.

jg370

Homework Statement


The expectation value of the sum of two random variables is given as:

\langle x + y \rangle = \langle x \rangle + \langle y \rangle

My textbook provides the following derivation of this relationship.

Suppose that we have two random variables, x and y. Let p_{ij} be the probability that our measurement returns x_{i} for the value of x and y_{j} for the value of y. Then the expectation value of the sum of x+y is:

\langle x + y \rangle = \sum\limits_{ij} p_{ij} (x_i + y_j) = \sum\limits_{ij} p_{ij} x_i + \sum\limits_{ij} p_{ij} y_j

Then I am given the following statement:

But \sum\limits_j p_{ij} is the probability that we measure x_i regardless of what we measure for y, so it must be equal to p_i. Similarly, \sum\limits_i p_{ij} = p_j is the probability of measuring y_j irrespective of what we get for x.
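As a concrete check of this statement (the joint probability table below is invented purely for illustration), summing a row of a small joint table over j does produce the marginal probability of that row's x value, and summing a column over i produces the marginal of that column's y value:

```python
# Toy 2x3 joint distribution p_ij = P(X = x_i, Y = y_j); the values
# are illustrative and chosen so the whole table sums to 1.
p = [
    [0.10, 0.20, 0.10],  # row i = 0
    [0.25, 0.15, 0.20],  # row i = 1
]

# Marginal of X: sum each row over j  ->  p_i
p_x = [sum(row) for row in p]

# Marginal of Y: sum each column over i  ->  p_j
p_y = [sum(p[i][j] for i in range(2)) for j in range(3)]

print(p_x)  # p_i for each i
print(p_y)  # p_j for each j
```

Each entry of `p_x` is exactly the sum p_{i1} + p_{i2} + ... + p_{in} over all the ways y can turn out while x is fixed at x_i, which is why it equals p_i.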




Homework Equations



The difficulty I have with this statement is that I do not see how \sum\limits_j p_{ij} can be equal to p_i.


The Attempt at a Solution



Summing over j, we should have p_{i1} + p_{i2} + \dots + p_{in}. Now, why is this equal to p_i?

And similarly, how can \sum\limits_i p_{ij} be equal to p_j?

I am hopeful that someone can clear this up for me.

Thank you for your kind assistance.

jg370
 
So you have two discrete random variables X and Y with a joint distribution
p_{ij} = P(X=x_i, Y = y_j)

the expectation is given by:
\langle X+Y \rangle = \sum_{ij} p_{ij} (x_i + y_j) = \sum_{ij} p_{ij} x_i + \sum_{ij} p_{ij} y_j

By definition, the marginal probabilities are
P(X=x_i) = \sum_{j} p_{ij} = p_i
P(Y=y_j) = \sum_{i} p_{ij} = p_j

If the variables are independent then you have the further condition that
p_{ij} = P(X=x_i, Y = y_j) = P(X=x_i)P(Y = y_j) = p_i p_j
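A quick numerical sketch of the whole argument (the values x_i, y_j and the joint table are made up for illustration, and the table is deliberately not independent): computing \langle X+Y \rangle directly from the joint distribution gives the same answer as \langle X \rangle + \langle Y \rangle computed from the marginals, which is all the derivation needs.

```python
x = [1.0, 2.0]       # possible values x_i
y = [0.0, 3.0, 5.0]  # possible values y_j

# Joint table p_ij = P(X = x_i, Y = y_j); rows index i, columns index j.
p = [
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
]

# <X+Y> computed directly from the joint distribution
exp_sum = sum(p[i][j] * (x[i] + y[j]) for i in range(2) for j in range(3))

# Marginals p_i = sum_j p_ij and p_j = sum_i p_ij, then <X> and <Y>
p_i = [sum(p[i]) for i in range(2)]
p_j = [sum(p[i][j] for i in range(2)) for j in range(3)]
exp_x = sum(p_i[i] * x[i] for i in range(2))
exp_y = sum(p_j[j] * y[j] for j in range(3))

print(exp_sum, exp_x + exp_y)  # the two agree
```

Note that no independence assumption was needed: the sum over j collapses onto p_i (and the sum over i onto p_j) for any joint table, which is exactly the step the original post asked about.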
 