Law of total expectation (VECTOR case)

  • Context: Graduate
  • Thread starter: kingwinner
  • Tags: Expectation, Law
Discussion Overview

The discussion centers on the law of total expectation, E(Y) = E[E(Y|X)], and its generalization to the vector case, E(Y) = E[E(Y|X1,X2)], together with two further extensions. Participants seek clarification on the definitions and relationships involved, particularly for continuous random variables.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants question whether E(Y|X1,X2) is a random variable and if it can be expressed as a function of X1 and X2, suggesting E(Y|X1,X2)=g(X1,X2) for some function g.
  • There is uncertainty regarding whether the extensions (i) and (ii) are direct consequences of the law of total expectation, with participants expressing difficulty in deriving these as special cases.
  • One participant provides a general definition of E(Y|A) and discusses the integration approach for continuous random variables, indicating the complexity of defining E(Y|X) and E(Y|X1,X2) for general cases.
  • Another participant suggests that proving the relationships may be challenging and recommends trying examples where X1 and X2 are independent and identically distributed (iid).
  • There is a specific inquiry about whether E(Y|X1) = E[E(Y|X1,X2)|X1] can be considered a special case of the law of total expectation.

Areas of Agreement / Disagreement

Participants express varying levels of understanding and agreement on the definitions and implications of the law of total expectation and its extensions. The discussion remains unresolved, with multiple competing views on how to approach the derivations and definitions.

Contextual Notes

Participants note the complexity of defining conditional expectations for general random variables and the potential limitations of the definitions they are working with. There is also mention of the need for specific examples to clarify the concepts.

kingwinner
" The law of total expectation is: E(Y) = E[E(Y|X)].
It can be generalized to the vector case: E(Y) = E[E(Y|X1,X2)].

Further extension:
(i) E(Y|X1) = E[E(Y|X1,X2)|X1]
(ii) E(Y|X1,X2) = E[E(Y|X1,X2,X3)|X1,X2] "
====================

I understand the law of total expectation itself, but I don't understand the generalizations to the vector case and the extensions.

1) Is E(Y|X1,X2) a random variable? Is E(Y|X1,X2) a function of both X1 and X2? i.e. E(Y|X1,X2)=g(X1,X2) for some function g?

2) Are (i) and (ii) direct consequences of the law of total expectation? (are they related at all?) I don't see how (i) and (ii) can be derived as special cases from it...can someone please show me how?

Any help is much appreciated!
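For concreteness, the vector-case identity E(Y) = E[E(Y|X1,X2)] can be checked on a small discrete example (a minimal sketch; the joint pmf below is invented purely for illustration):

```python
# Hypothetical joint pmf p(x1, x2, y) on a small finite support,
# chosen only to illustrate; any valid pmf works the same way.
pmf = {
    (0, 0, 1): 0.10, (0, 0, 2): 0.15,
    (0, 1, 1): 0.20, (0, 1, 3): 0.05,
    (1, 0, 2): 0.25, (1, 1, 3): 0.25,
}
assert abs(sum(pmf.values()) - 1.0) < 1e-9  # sanity: it is a pmf

# E(Y) computed directly from the joint pmf
EY = sum(p * y for (x1, x2, y), p in pmf.items())

# g(x1, x2) = E(Y | X1=x1, X2=x2): an ordinary function of (x1, x2)
def g(a, b):
    num = sum(p * y for (x1, x2, y), p in pmf.items() if (x1, x2) == (a, b))
    den = sum(p for (x1, x2, y), p in pmf.items() if (x1, x2) == (a, b))
    return num / den

# E[E(Y|X1,X2)] = sum over (x1,x2) of g(x1,x2) * P(X1=x1, X2=x2)
marg = {}
for (x1, x2, y), p in pmf.items():
    marg[(x1, x2)] = marg.get((x1, x2), 0.0) + p
EEY = sum(p * g(a, b) for (a, b), p in marg.items())

print(EY, EEY)  # the two values agree, as the law predicts
```

This also illustrates the answer to question 1: E(Y|X1,X2) is the random variable g(X1,X2), obtained by plugging the random pair (X1,X2) into the deterministic function g.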
 
kingwinner said:
I understand the law of total expectation itself, but I don't understand the generalizations to the vector case and the extensions.

1) Is E(Y|X1,X2) a random variable? Is E(Y|X1,X2) a function of both X1 and X2? i.e. E(Y|X1,X2)=g(X1,X2) for some function g?

2) Are (i) and (ii) direct consequences of the law of total expectation? (are they related at all?) I don't see how (i) and (ii) can be derived as special cases from it...can someone please show me how?

E(Y|X) and E(Y|X1,X2) are tricky to define for general random variables. Which definition are you working with?
 
bpet said:
E(Y|X) and E(Y|X1,X2) are tricky to define for general random variables. Which definition are you working with?

E(Y|X=x) = ∫_{-∞}^{∞} y f(y|x) dy

for continuous random variables X and Y (similarly for discrete).

General definition:
E(Y|A)=E(Y I_A)/E(I_A)=E(Y I_A)/P(A)
where I_A is the indicator function of A.

If it's too hard to show it in general, can you please show me how can we derive (i) and (ii) from the law of total expectation for the case of CONTINUOUS random variables?

Thanks!
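With the density-based definition above, extension (i) follows by direct computation (a sketch, assuming the conditional densities exist and Fubini's theorem applies):

```latex
\begin{aligned}
E\big[E(Y\mid X_1,X_2)\,\big|\,X_1=x_1\big]
  &= \int g(x_1,x_2)\, f(x_2\mid x_1)\, dx_2,
     \quad\text{where } g(x_1,x_2)=E(Y\mid X_1=x_1,X_2=x_2)\\
  &= \int\!\!\int y\, f(y\mid x_1,x_2)\, f(x_2\mid x_1)\, dy\, dx_2\\
  &= \int y \left[\int f(y,x_2\mid x_1)\, dx_2\right] dy
     \quad\text{since } f(y\mid x_1,x_2)\, f(x_2\mid x_1)=f(y,x_2\mid x_1)\\
  &= \int y\, f(y\mid x_1)\, dy \;=\; E(Y\mid X_1=x_1).
\end{aligned}
```

Extension (ii) is the same computation with everything additionally conditioned on X_2 (respectively X_1, X_2).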
 
kingwinner said:
... for the case of CONTINUOUS random variables?

Even then it's tricky - try some examples first with X1 and X2 iid, and then with X1 = X2, and you'll see how those definitions break down. The only robust proofs I've seen work with the implicit definition of E(Y|X) etc. Maybe someone else here can suggest a simpler way?
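A minimal numeric sketch of the suggested experiment (the Gaussian model below is my own invented choice, not from the thread), comparing the iid case with the degenerate X1 = X2 case:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Case 1: X1 iid X2, both N(0,1); Y = X1 + X2 + Z with independent noise Z.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
z = rng.standard_normal(n)
y = x1 + x2 + z

# Closed forms for this model: E(Y|X1,X2) = X1 + X2 (a function g(X1,X2)),
# and E(Y|X1) = X1, so (i) says E[X1 + X2 | X1] = X1, which holds because
# E(X2|X1) = E(X2) = 0 by independence.
g = x1 + x2                # realization of the random variable E(Y|X1,X2)
print(y.mean(), g.mean())  # sample means agree, both estimating E(Y) = 0

# Case 2: X1 = X2 = X (degenerate). Conditioning on (X1,X2) then carries no
# more information than conditioning on X1 alone: E(Y|X1,X2) = E(Y|X1) = 2X.
x = rng.standard_normal(n)
y2 = 2 * x + rng.standard_normal(n)
print(y2.mean(), (2 * x).mean())
```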
 
First of all, is E(Y|X1,X2) a function of X1 and X2??

Is E(Y|X1) = E[E(Y|X1,X2)|X1] a special case of the law of total expectation E(Y) = E[E(Y|X)]?
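One standard way to see the connection (a sketch, not a full measure-theoretic proof): yes, E(Y|X1,X2) = g(X1,X2) for some measurable function g, so it is itself a random variable. And (i) is the plain law of total expectation applied under the conditional distribution given X1 = x1, with X2 playing the role of X:

```latex
% Working under the conditional distribution P( . | X_1 = x_1 ),
% the plain law E(Y) = E[E(Y|X)] with X := X_2 reads
\[
  E(Y \mid X_1 = x_1)
    \;=\; E\!\big[\, E(Y \mid X_1 = x_1,\, X_2) \,\big\vert\, X_1 = x_1 \big],
\]
% which, as x_1 varies, is exactly (i): E(Y|X_1) = E[ E(Y|X_1,X_2) | X_1 ].
```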
 
