MHB Conditional expectation proof question

Summary
The discussion centers on proving that E(X1 + X2|Y) = E(X1|Y) + E(X2|Y) using the definition of conditional expectation. The poster substitutes X = X1 + X2 into the defining identity, applies the expectation operator, and splits the result into separate terms for X1 and X2, then asks whether the argument is on the right track. The reply questions the notation, specifically whether a lowercase 'y' in one step should be an uppercase 'Y', highlighting the importance of consistent notation in mathematical proofs.
oyth94
Here is a proof question: For two random variables X and Y, we can define E(X|Y) to be the function of Y that satisfies E(Xg(Y)) = E(E(X|Y)g(Y)) for any function g. Using this definition, show that E(X1 + X2|Y) = E(X1|Y) + E(X2|Y).
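As a quick sanity check of the defining property, here is a minimal Python sketch that verifies E(Xg(Y)) = E(E(X|Y)g(Y)) on a small discrete joint distribution; the support points, the pmf, and the test functions g below are arbitrary choices made up purely for illustration.

```python
import numpy as np

# Minimal check of the defining property E[X g(Y)] = E[E[X|Y] g(Y)] on a
# small discrete joint distribution; the pmf and test functions are made up.
rng = np.random.default_rng(0)
xs = np.array([0.0, 1.0, 2.0])        # support of X
ys = np.array([-1.0, 1.0])            # support of Y
pmf = rng.random((3, 2))
pmf /= pmf.sum()                      # joint pmf p(x, y)

p_y = pmf.sum(axis=0)                             # marginal pmf of Y
cond_exp = (xs[:, None] * pmf).sum(axis=0) / p_y  # E[X | Y = y] for each y

for g in (lambda y: y, lambda y: y**2, np.cos):
    lhs = (xs[:, None] * g(ys)[None, :] * pmf).sum()  # E[X g(Y)]
    rhs = (cond_exp * g(ys) * p_y).sum()              # E[E[X|Y] g(Y)]
    assert np.isclose(lhs, rhs)
print("defining property holds on this example")
```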

So what I did was plug in X = X1 + X2:
E(E(X1 + X2|Y)g(Y))
= E((X1 + X2)g(Y))
= E(X1g(Y)) + E(X2g(Y))
= E(E(X1|y)g(Y) + E(X2|Y)g(Y))
= E(g(Y)[E(X1|Y) + E(X2|Y)])

Am I on the right track? What do I do after that?
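For what it's worth, the identity can also be sanity-checked numerically. The sketch below builds a small made-up joint pmf for (X1, X2, Y), computes E(X1 + X2|Y = y) directly, and compares it with E(X1|Y = y) + E(X2|Y = y) for each y; the supports and probabilities are illustrative assumptions, not anything taken from the problem.

```python
import numpy as np

# Direct numerical check of E[X1 + X2 | Y] = E[X1 | Y] + E[X2 | Y] on a small
# made-up joint pmf for (X1, X2, Y); supports and probabilities are illustrative.
rng = np.random.default_rng(1)
x1s = np.array([0.0, 1.0])            # support of X1
x2s = np.array([0.0, 1.0, 2.0])       # support of X2
pmf = rng.random((2, 3, 2))           # axes: (x1, x2, y), with Y taking two values
pmf /= pmf.sum()                      # joint pmf p(x1, x2, y)

p_y = pmf.sum(axis=(0, 1))            # marginal pmf of Y

def cond_exp(values):
    """E[values(X1, X2) | Y = y], where `values` is a (2, 3) grid over (x1, x2)."""
    return (values[:, :, None] * pmf).sum(axis=(0, 1)) / p_y

grid_x1, grid_x2 = np.meshgrid(x1s, x2s, indexing="ij")  # X1, X2 as grids

lhs = cond_exp(grid_x1 + grid_x2)                # E[X1 + X2 | Y = y]
rhs = cond_exp(grid_x1) + cond_exp(grid_x2)      # E[X1 | Y = y] + E[X2 | Y = y]
assert np.allclose(lhs, rhs)
print(lhs, rhs)
```

This only checks one example, of course; to finish the written proof one would observe that E(X1|Y) + E(X2|Y) is a function of Y satisfying the defining identity for X1 + X2, which is exactly what the definition requires of E(X1 + X2|Y).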
 
Are the $y$ supposed to be $Y$?
 
