Expectation value of the sum of two random variables

The expectation value of the sum of two random variables, x and y, is expressed as ⟨x + y⟩ = ⟨x⟩ + ⟨y⟩, based on their joint probability distribution p_{ij}. The derivation shows that summing over the joint probabilities gives the marginal probabilities, where ∑_j p_{ij} equals p_i, the probability of measuring x_i regardless of y. Similarly, ∑_i p_{ij} equals p_j, the probability of measuring y_j irrespective of x. The discussion highlights the confusion regarding these marginal probabilities and their relationship to the joint distribution. Understanding these concepts is crucial for grasping the properties of expectation values in probability theory.
jg370

Homework Statement


The expectation value of the sum of two random variables is given as:

\langle x + y \rangle = \langle x \rangle + \langle y \rangle

My textbook provides the following derivation of this relationship.

Suppose that we have two random variables, x and y. Let p_{ij} be the probability that our measurement returns x_{i} for the value of x and y_{j} for the value of y. Then the expectation value of the sum x + y is:

\langle x + y \rangle = \sum\limits_{ij} p_{ij} (x_i + y_j) = \sum\limits_{ij} p_{ij} x_i + \sum\limits_{ij} p_{ij} y_j

Then I am given the following statement:

But \sum\limits_j p_{ij} is the probability that we measure x_i regardless of what we measure for y, so it must be equal to p_i. Similarly, \sum\limits_i p_{ij} = p_j is the probability of measuring y_j irrespective of what we get for x.




Homework Equations



The difficulty I have with this statement is that I do not see how \sum\limits_j p_{ij} can be equal to p_i.


The Attempt at a Solution



Summing over j, we should have p_{i1} + p_{i2} + \dots + p_{in}. Now, is this equal to p_i?

And similarly, how can \sum\limits_i p_{ij} be equal to p_j?

I am hopeful that someone can clear this up for me.

Thank you for your kind assistance.

jg370
 
so you have two discrete random variables X & Y, with a joint distribution p_{ij}
p_{ij} = P(X=x_i, Y = y_j)

the expectation of the sum is given by:
<X+Y> = \sum_{ij} p_{ij} (x_i + y_j) = \sum_i x_i \sum_j p_{ij} + \sum_j y_j \sum_i p_{ij}

By definition, the marginal probabilities are
P(X=x_i) = \sum_{j} p_{ij} = p_i
P(Y=y_j) = \sum_{i} p_{ij} = p_j
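
Substituting these marginals into the regrouped sums above finishes the derivation (a short worked step, using only the definitions just stated):
<X+Y> = \sum_i p_i x_i + \sum_j p_j y_j = <X> + <Y>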

If the variables are independent then you have the further condition that
p_{ij} = P(X=x_i, Y = y_j) = P(X=x_i)P(Y = y_j) = p_i p_j
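
As a quick numerical sanity check, here is a minimal Python sketch with a made-up 2×3 joint table (the values are illustrative, not from the thread); it computes <X+Y> directly from the joint distribution and <X> + <Y> from the marginals and shows they agree:

import numpy as np

# hypothetical joint distribution p_ij: row i is a value of x, column j a value of y
p = np.array([[0.10, 0.20, 0.05],
              [0.25, 0.15, 0.25]])   # all entries sum to 1
x = np.array([1.0, 2.0])             # the values x_i
y = np.array([-1.0, 0.0, 3.0])       # the values y_j

p_i = p.sum(axis=1)   # marginal of x: sum over j of p_ij
p_j = p.sum(axis=0)   # marginal of y: sum over i of p_ij

# <x+y> computed directly from the joint distribution
E_sum = sum(p[i, j] * (x[i] + y[j])
            for i in range(len(x)) for j in range(len(y)))

# <x> + <y> computed from the marginals
E_x = p_i @ x
E_y = p_j @ y

print(E_sum, E_x + E_y)   # both give 2.2 (up to floating-point rounding)

Note that this example's variables are not independent (p_{ij} ≠ p_i p_j), yet the two results still agree: linearity of expectation does not require independence.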
 
