Addition Rule for Random Variables

AI Thread Summary
The Addition Rule for Random Variables states that E(X+Y) = E(X) + E(Y) holds even for dependent random variables. This is because the expected value of the sum is computed from the joint probability distribution P(X,Y), which already accounts for any dependence between X and Y. Summing over the joint distribution splits the expectation of X+Y into the expectations of X and Y, regardless of dependence, as the derivation in the thread shows.
Peter G.
Hi,

I am having a hard time understanding why the Addition Rule for two Random Variables holds even when the random variables are dependent.


Essentially: why is E(X+Y) = E(X) + E(Y) when X and Y are dependent random variables?

Since the two variables are dependent, if X happens to take on a particular value x, doesn't that change the conditional distribution of Y and thus affect its expected value?

I hope I made my question clear,
Peter G.
 
Let's say you have a joint probability distribution P(X,Y).
Then
E(X) = \sum_{i} X_{i} P(X_{i}) = \sum_{i,j} X_{i} P(X_{i}, Y_{j}),
since the marginal distribution satisfies P(X_{i}) = \sum_{j} P(X_{i}, Y_{j}), and likewise
E(Y) = \sum_{j} Y_{j} P(Y_{j}) = \sum_{i,j} Y_{j} P(X_{i}, Y_{j}).
Adding the two sums term by term, we see that
E(X) + E(Y) = \sum_{i,j} (X_{i} + Y_{j}) P(X_{i}, Y_{j}) = E(X+Y).
The dependence between X and Y never enters: the joint distribution P(X_{i}, Y_{j}) carries all of it, and the sum splits regardless.
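The derivation can also be checked numerically. Below is a small Python sketch using a made-up joint distribution (an illustrative assumption, not from the thread) in which Y clearly depends on X, showing that E(X) + E(Y) still equals E(X+Y):

```python
# Hypothetical joint distribution over (x, y) pairs, chosen so that
# Y tends to match X -- i.e., X and Y are dependent.
joint = {
    (0, 0): 0.4,  # when X=0, Y is usually 0
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,  # when X=1, Y is usually 1
}

# Expectations computed directly from the joint distribution,
# exactly as in the sums above.
E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_sum = sum((x + y) * p for (x, y), p in joint.items())

# Confirm the variables really are dependent:
# P(X=0, Y=0) = 0.4, but P(X=0) * P(Y=0) = 0.5 * 0.5 = 0.25.
P_X0 = sum(p for (x, y), p in joint.items() if x == 0)
P_Y0 = sum(p for (x, y), p in joint.items() if y == 0)

print(E_X, E_Y, E_sum)
```

Here E(X) = E(Y) = 0.5 and E(X+Y) = 1.0 = E(X) + E(Y), even though the joint probability P(X=0, Y=0) = 0.4 differs from the product of marginals 0.25, confirming the dependence.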
Hope this helps:)
 
Thank you very much, jfizzix!
 
No problem:)
 
