Expectation conditional on the sum of two random variables

  • #1
crguevar
Hi:

e, z, mu are vectors of size N
I need to show that E(e|z+mu) = E(e|mu), or at least that E(e|z+mu) converges in probability to E(e|mu) as N goes to infinity, under the assumption that z is not correlated with e.

My guess is that to get this result I also need z to be orthogonal to mu, that is, z'mu = 0.

I tried using the law of iterated expectations... but my big problem is that I'm not sure how to handle the conditioning on the sum of z and mu... I would really appreciate any help!

Regards
 
  • #2
Lack of correlation alone will not get you there: a conditional expectation depends on the full joint distribution, not just on second moments. Suppose instead that e is mean-independent of z given mu, i.e. E(e|z, mu) = E(e|mu). Since z+mu carries less information than the pair (z, mu), the law of iterated expectations gives

E(e|z+mu) = E[ E(e|z, mu) | z+mu ] = E[ E(e|mu) | z+mu ].

The last expression equals E(e|mu) only if E(e|mu) is itself determined by z+mu. That holds trivially when E(e|mu) is constant (for instance, when e is independent of both z and mu), but not in general: with z and mu iid standard normal and e = mu, we get E(e|z+mu) = (z+mu)/2, which is not E(e|mu) = mu. So for the exact identity, or for convergence in probability as N goes to infinity, you will need additional assumptions about z and mu, such as the orthogonality you mention or specific distributional restrictions.
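One way to see the constant-mean special case numerically is a quick Monte Carlo check. The distributions below (normal z and mu, uniform e, all mutually independent) are my own illustrative assumptions, not part of the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Illustrative special case (an assumption, not established in the thread):
# e is independent of BOTH z and mu, so E(e|z+mu) and E(e|mu)
# both collapse to the unconditional mean E(e) = 0.5.
mu = rng.normal(size=N)
z = rng.normal(size=N)        # independent of e and mu
e = rng.uniform(size=N)       # independent of everything; E(e) = 0.5

def cond_mean(values, cond, x0, width=0.1):
    """Crude estimate of E(values | cond = x0): average over a small window."""
    mask = np.abs(cond - x0) < width
    return values[mask].mean()

est_sum = cond_mean(e, z + mu, 0.0)   # estimate of E(e | z+mu = 0)
est_mu = cond_mean(e, mu, 0.0)        # estimate of E(e | mu = 0)
print(est_sum, est_mu)
```

Both window averages come out near 0.5 = E(e). Swapping in e = mu and evaluating at a nonzero point such as x0 = 1.0 instead reproduces the counterexample above, with the two estimates disagreeing.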
 

1. What is the formula for calculating the expectation conditional on the sum of two random variables?

By linearity of conditional expectation, E(X+Y|Z) = E(X|Z) + E(Y|Z), where X and Y are the two random variables and Z is the conditioning variable. Note the distinction from this thread's question: that formula handles the expectation of a sum, whereas conditioning on a sum, as in E(X|Y+Z), has no comparably simple formula and in general requires the joint distribution of X and Y+Z.
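The linearity identity can be checked with a small simulation; the binary Z and the particular shifts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical setup (not from the thread): a binary condition Z,
# with X and Y both shifted by Z.
Z = rng.integers(0, 2, size=N)
X = Z + rng.normal(size=N)        # E(X|Z=z) = z
Y = 2 * Z + rng.normal(size=N)    # E(Y|Z=z) = 2z

# Compare E(X+Y|Z=z) against E(X|Z=z) + E(Y|Z=z); both should be near 3z.
for z in (0, 1):
    mask = Z == z
    lhs = (X + Y)[mask].mean()
    rhs = X[mask].mean() + Y[mask].mean()
    print(f"z={z}: E(X+Y|Z)={lhs:.3f}, E(X|Z)+E(Y|Z)={rhs:.3f}")
```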

2. How is the expectation conditional on the sum of two random variables related to the Law of Total Expectation?

The Law of Total Expectation states that E(X) = E[E(X|Z)]: the overall expectation of a random variable is the weighted average of its conditional expectations, weighted by the distribution of the conditioning variable. Combined with the linearity above, it lets you compute the expectation of a sum piecewise: E(X+Y) = E[E(X|Z)] + E[E(Y|Z)].
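The decomposition can be verified numerically; the two-component mixture below is an invented example, with E(X) = 0.5*1 + 0.5*5 = 3:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Hypothetical mixture (illustrative, not from the thread):
# Z is a fair coin; X ~ N(1,1) when Z=0 and X ~ N(5,1) when Z=1.
Z = rng.integers(0, 2, size=N)
X = np.where(Z == 1, rng.normal(5.0, 1.0, N), rng.normal(1.0, 1.0, N))

# Law of Total Expectation: E(X) = sum_z P(Z=z) * E(X|Z=z)
overall = X.mean()
weighted = sum((Z == z).mean() * X[Z == z].mean() for z in (0, 1))
print(overall, weighted)
```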

3. When is the expectation conditional on the sum of two random variables useful in statistical analysis?

The expectation conditional on the sum of two random variables is useful in statistical analysis when dealing with complex probability distributions or when trying to understand the relationship between two random variables. It allows for the calculation of the expected value of one variable while taking into account the influence of another variable.

4. Can the expectation conditional on the sum of two random variables be negative?

Yes. A conditional expectation is negative whenever the variable takes negative values with enough probability under the conditioning event. For example, if X equals -(Y+Z) plus mean-zero noise, then E(X|Y+Z = s) = -s, which is negative for every positive value of s.
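A sketch of such a case, under an invented linear model:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Hypothetical model (illustrative): X = -(Y+Z) + noise,
# so E(X | Y+Z = s) = -s, negative for any s > 0.
Y = rng.normal(size=N)
Z = rng.normal(size=N)
X = -(Y + Z) + rng.normal(size=N)

s = Y + Z
window = np.abs(s - 1.0) < 0.1   # condition on Y+Z being near 1
est = X[window].mean()           # should be close to -1
print(est)
```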

5. How can the expectation conditional on the sum of two random variables be used in decision-making?

The expectation conditional on the sum of two random variables can be used in decision-making by providing insight into the potential outcomes of a decision. By understanding the conditional expectations of each variable, one can make a more informed decision that takes into account the potential impact of each variable on the overall outcome.
