Proving conditional expectation

SUMMARY

The discussion centers on the relationship between conditional expectation and homoskedasticity in the context of random variables U and X. It is established that if E(U|X) = E(U) = 0, then E(U^2|X) = E(U^2) under the assumption of homoskedasticity. The participants clarify that the equality E(U|X) = 0 does not necessarily imply E(U^2|X) = E(U^2) without the homoskedasticity assumption, as demonstrated through a counterexample involving a normal distribution.

PREREQUISITES
  • Understanding of conditional expectation, specifically E(U|X)
  • Knowledge of homoskedasticity in regression analysis
  • Familiarity with variance and its relationship to expectation
  • Basic concepts of probability distributions, particularly the normal distribution
NEXT STEPS
  • Study the implications of homoskedasticity in linear regression models
  • Learn about the properties of conditional expectation in probability theory
  • Explore examples of variance in conditional distributions
  • Investigate the role of normal distributions in statistical modeling
USEFUL FOR

Statisticians, data scientists, and researchers involved in regression analysis and those seeking to deepen their understanding of conditional expectations and homoskedasticity.

Usagi
Hi guys, assume we have two random variables U and X such that \(E(U|X) = E(U) = 0\). I was told that this assumption implies \(E(U^2|X) = E(U^2)\), but I'm not sure how to prove this. If anyone could show me, that'd be great!
 
Usagi said:
Hi guys, assume we have two random variables U and X such that \(E(U|X) = E(U) = 0\). I was told that this assumption implies \(E(U^2|X) = E(U^2)\), but I'm not sure how to prove this. If anyone could show me, that'd be great!

Not sure this is true. Suppose \(U|(X=x) \sim N(0,x^2)\), and \(X\) has whatever distribution we like.

Then \(E(U|X=x)=0\) and \( \displaystyle E(U)=\int \int u f_{U|X=x}(u) f_X(x)\;dudx =\int E(U|X=x) f_X(x) \; dx=0\).

Now \(E(U^2|X=x)={\text{Var}}(U|X=x)=x^2\). While \( \displaystyle E(U^2)=\int E(U^2|X=x) f_X(x) \; dx= \int x^2 f_X(x) \; dx\).

Or have I misunderstood something?

CB
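CB's counterexample can be checked numerically. Below is a minimal simulation sketch: the uniform distribution chosen for \(X\) is an arbitrary illustrative choice (the thread leaves \(X\)'s distribution open), with \(U \mid X = x \sim N(0, x^2)\) as in the post.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Take X uniform on [1, 2] (arbitrary choice), then U | X = x ~ N(0, x^2).
x = rng.uniform(1.0, 2.0, size=n)
u = rng.normal(loc=0.0, scale=x)  # conditional standard deviation is x

# E(U) = 0 by iterated expectations, even though Var(U | X = x) = x^2 varies with x.
print(np.mean(u))        # close to 0

# E(U^2) = E(X^2) = 7/3 for X uniform on [1, 2].
print(np.mean(u**2))     # close to 7/3 ≈ 2.333

# Conditional second moments differ across x-bins, so E(U^2 | X) is not constant:
lo = np.mean(u[x < 1.5] ** 2)   # ≈ E(U^2 | 1 <= X < 1.5) ≈ 1.58
hi = np.mean(u[x >= 1.5] ** 2)  # ≈ E(U^2 | 1.5 <= X <= 2) ≈ 3.08
print(lo, hi)
```

Since the binned second moments clearly differ, \(E(U^2|X) \ne E(U^2)\) here, confirming that \(E(U|X)=0\) alone is not enough.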
 
Hi CB,

Actually the problem arose from the following passage regarding the homoskedasticity assumption for simple linear regression:

http://img444.imageshack.us/img444/6892/asdfsdfc.jpg
I do not understand how they came to the conclusion that \(\sigma^2 = E(u^2|x)\) implies \(\sigma^2 = E(u^2)\).

Thanks for your help!
 
Usagi said:
Hi CB,

Actually the problem arose from the following passage regarding the homoskedasticity assumption for simple linear regression:

http://img444.imageshack.us/img444/6892/asdfsdfc.jpg
I do not understand how they came to the conclusion that \(\sigma^2 = E(u^2|x)\) implies \(\sigma^2 = E(u^2)\).

Thanks for your help!

It is the assumed homoskedasticity; that is exactly what the assumption says. Homoskedasticity means \(E(u^2|x) = \sigma^2\) for every \(x\), i.e. the conditional second moment is a constant not depending on \(x\), so averaging it over the distribution of \(x\) changes nothing and \(E(u^2) = \sigma^2\) as well.

CB
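Spelled out, the step the textbook leaves implicit is the law of iterated expectations applied under homoskedasticity:

\[
E(u^2) \;=\; E\bigl(E(u^2 \mid x)\bigr) \;=\; E(\sigma^2) \;=\; \sigma^2,
\]

where the middle equality uses \(E(u^2|x) = \sigma^2\) being a constant. In CB's counterexample this step fails precisely because \(E(U^2|X=x) = x^2\) is not constant.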
 
