Law of total expectation (VECTOR case)

In summary: The thread asks how the law of total expectation E(Y) = E[E(Y|X)] extends to conditioning on a vector of random variables, whether E(Y|X1,X2) is a random variable of the form g(X1,X2), and how the extensions (i) E(Y|X1) = E[E(Y|X1,X2)|X1] and (ii) E(Y|X1,X2) = E[E(Y|X1,X2,X3)|X1,X2] relate to the basic law. The general definition E(Y|A) = E(Y I_A)/E(I_A) = E(Y I_A)/P(A) is used as a starting point.
  • #1
kingwinner
" The law of total expectation is: E(Y) = E[E(Y|X)].
It can be generalized to the vector case: E(Y) = E[E(Y|X1,X2)].

Further extension:
(i) E(Y|X1) = E[E(Y|X1,X2)|X1]
(ii) E(Y|X1,X2) = E[E(Y|X1,X2,X3)|X1,X2] "
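
For a concrete feel for the vector case, here is a small Python simulation sketch (the model Y = X1 + X2 + eps is made up purely for illustration, since it makes E(Y|X1,X2) = X1 + X2 available in closed form):

Code:
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Made-up model for illustration: X1, X2, eps independent.
x1 = rng.normal(1.0, 1.0, n)      # E(X1) = 1
x2 = rng.exponential(2.0, n)      # E(X2) = 2
eps = rng.normal(0.0, 0.5, n)     # E(eps) = 0
y = x1 + x2 + eps                 # so E(Y|X1,X2) = X1 + X2 exactly

lhs = y.mean()                    # Monte Carlo estimate of E(Y)
rhs = (x1 + x2).mean()            # Monte Carlo estimate of E[E(Y|X1,X2)]
print(lhs, rhs)                   # both should be close to 1 + 2 = 3

In the same made-up model, E(Y|X1) = X1 + 2 and E[E(Y|X1,X2)|X1] = E[X1 + X2|X1] = X1 + 2, which is a closed-form instance of extension (i).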

====================

I understand the law of total expectation itself, but I don't understand the generalizations to the vector case and the extensions.

1) Is E(Y|X1,X2) a random variable? Is E(Y|X1,X2) a function of both X1 and X2? i.e. E(Y|X1,X2)=g(X1,X2) for some function g?

2) Are (i) and (ii) direct consequences of the law of total expectation? (Are they related at all?) I don't see how (i) and (ii) can be derived as special cases from it... Can someone please show me how?

Any help is much appreciated!
 
  • #2
kingwinner said:
I understand the law of total expectation itself, but I don't understand the generalizations to the vector case and the extensions.

1) Is E(Y|X1,X2) a random variable? Is E(Y|X1,X2) a function of both X1 and X2? i.e. E(Y|X1,X2)=g(X1,X2) for some function g?

2) Are (i) and (ii) direct consequences of the law of total expectation? (Are they related at all?) I don't see how (i) and (ii) can be derived as special cases from it... Can someone please show me how?

E(Y|X) and E(Y|X1,X2) are tricky to define for general random variables. Which definition are you working with?
 
  • #3
bpet said:
E(Y|X) and E(Y|X1,X2) are tricky to define for general random variables. Which definition are you working with?

E(Y|X=x) = ∫_{-∞}^{∞} y f(y|x) dy
for continuous random variables X and Y (similarly for discrete).

General definition:
E(Y|A)=E(Y I_A)/E(I_A)=E(Y I_A)/P(A)
where I_A is the indicator function of A.
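
To see this general definition in action, here is a tiny Python simulation sketch (the choice Y ~ N(0,1) with A = {Y > 0} is just a made-up example where the exact answer, sqrt(2/pi), is known):

Code:
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Made-up example: Y ~ N(0, 1) and A = {Y > 0}, so E(Y|A) = sqrt(2/pi).
y = rng.normal(0.0, 1.0, n)
ind_a = (y > 0).astype(float)                   # indicator I_A

cond_mean = (y * ind_a).mean() / ind_a.mean()   # E(Y I_A) / P(A)
print(cond_mean, np.sqrt(2 / np.pi))            # both approximately 0.798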

If it's too hard to show it in general, can you please show me how can we derive (i) and (ii) from the law of total expectation for the case of CONTINUOUS random variables?

Thanks!
 
  • #4
kingwinner said:
... for the case of CONTINUOUS random variables?

Even then it's tricky - try some examples first, say with X1 and X2 i.i.d. and then with X1 = X2, and you'll see how those definitions break down. The only robust proofs I've seen work with the implicit definition of E(Y|X) etc. Maybe someone else here can suggest a simpler way?
 
  • #5
First of all, is E(Y|X1,X2) a function of X1 and X2??

Is E(Y|X1) = E[E(Y|X1,X2)|X1] a special case of the law of total expectation E(Y) = E[E(Y|X)]?
 

1. What is the law of total expectation (VECTOR case)?

The law of total expectation (VECTOR case) states that the expected value of a random variable can be computed by conditioning on a vector of random variables and then averaging over that vector: E(Y) = E[E(Y|X1,X2,...,Xn)].

2. How is the law of total expectation (VECTOR case) different from the one-dimensional case?

In the one-dimensional case, the law of total expectation conditions on a single random variable, E(Y) = E[E(Y|X)], while in the vector case it conditions jointly on several random variables, E(Y) = E[E(Y|X1,...,Xn)].

3. What is the significance of the law of total expectation (VECTOR case) in statistics?

The law of total expectation (VECTOR case) is a fundamental concept in statistics: it allows us to compute the expected value of a random variable in stages, by first conditioning on several other random variables and then averaging over them.
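
As a minimal illustration of computing an expectation in stages, here is a short Python sketch (the coins, flip counts, and probabilities are made up for the example):

Code:
# Made-up two-stage example: pick a coin X1 and a number of flips X2 independently,
# then let Y be the number of heads, so E(Y|X1,X2) = X2 * p(X1).
p_x1 = {"fair": 0.5, "biased": 0.5}        # P(X1 = coin)
p_heads = {"fair": 0.5, "biased": 0.8}     # per-flip head probability for each coin
p_x2 = {1: 0.5, 2: 0.5}                    # P(X2 = number of flips)

# Vector-case law: E(Y) = sum over (coin, flips) of E(Y|X1,X2) * P(X1) * P(X2)
e_y = sum(flips * p_heads[c] * p_x1[c] * p_x2[flips]
          for c in p_x1 for flips in p_x2)
print(e_y)                                 # 0.975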

4. Can the law of total expectation (VECTOR case) be applied to both discrete and continuous random variables?

Yes, the law of total expectation (VECTOR case) can be applied to both discrete and continuous random variables, as long as the relevant expectations exist (for example, E|Y| < ∞) so that the conditional expectations are well-defined.
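
For the continuous case, a density-level sketch of extension (i) (not a rigorous proof; it assumes all the densities exist and that the order of integration can be swapped) looks like this:

E[E(Y|X1,X2) | X1 = x1]
= ∫ E(Y|X1 = x1, X2 = x2) f(x2|x1) dx2
= ∫ [ ∫ y f(y|x1,x2) dy ] f(x2|x1) dx2
= ∫ y [ ∫ f(y|x1,x2) f(x2|x1) dx2 ] dy
= ∫ y f(y|x1) dy
= E(Y|X1 = x1),

since ∫ f(y|x1,x2) f(x2|x1) dx2 = ∫ f(y,x2|x1) dx2 = f(y|x1). Dropping the conditioning variable in the outer expectation recovers the basic law E(Y) = E[E(Y|X)].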

5. How is the law of total expectation (VECTOR case) useful in real-life applications?

The law of total expectation (VECTOR case) is commonly used in fields such as finance, economics, and engineering to compute an expected value by conditioning on several sources of randomness at once. This helps in making informed decisions and predictions based on statistical analysis.
