Proving conditional expectation


Discussion Overview

The discussion revolves around the implications of conditional expectation in the context of random variables, specifically examining the relationship between E(U|X) and E(U^2|X) under certain assumptions. The scope includes theoretical aspects of probability and statistics, particularly related to regression analysis and homoskedasticity.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant presents an assumption that E(U|X) = E(U) = 0 implies E(U^2|X) = E(U^2), seeking proof for this claim.
  • Another participant challenges this assumption by providing a counterexample where U conditioned on X has a normal distribution with a variance dependent on X, leading to differing expectations for E(U^2|X) and E(U^2).
  • A later reply references a passage regarding the homoskedasticity assumption in simple linear regression, questioning how the conclusion σ² = E(U²|X) implies σ² = E(U²) is reached.
  • One participant asserts that the conclusion is derived from the assumption of homoskedasticity.

Areas of Agreement / Disagreement

Participants express disagreement regarding the implications of the initial assumption, with one providing a counterexample that suggests the original claim may not hold true. The discussion remains unresolved as differing viewpoints are presented without consensus.

Contextual Notes

The discussion highlights the dependence on specific assumptions regarding the distributions of U and X, as well as the implications of homoskedasticity in regression analysis. There are unresolved mathematical steps regarding the proof of the initial claim.

Usagi
Hi guys, suppose we have two random variables U and X such that E(U|X) = E(U) = 0. I was told that this assumption implies E(U^2|X) = E(U^2), but I'm not sure how to prove it. If anyone could show me, that'd be great!
 
Usagi said:
Hi guys, suppose we have two random variables U and X such that E(U|X) = E(U) = 0. I was told that this assumption implies E(U^2|X) = E(U^2), but I'm not sure how to prove it. If anyone could show me, that'd be great!

Not sure this is true. Suppose \(U|(X=x) \sim N(0,x^2)\), and \(X\) has whatever distribution we like.

Then \(E(U|X=x)=0\) and \( \displaystyle E(U)=\int \int u f_{U|X=x}(u) f_X(x)\;dudx =\int E(U|X=x) f_X(x) \; dx=0\).

Now \(E(U^2|X=x)={\text{Var}}(U|X=x)=x^2\). While \( \displaystyle E(U^2)=\int E(U^2|X=x) f_X(x) \; dx= \int x^2 f_X(x) \; dx\).
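A quick simulation illustrates the counterexample; this is just a sketch, and the choice \(X \sim \text{Uniform}(1,2)\) is arbitrary (any non-degenerate distribution for X would do):

```python
import numpy as np

# Monte Carlo check of the counterexample above.
# X ~ Uniform(1, 2) is an arbitrary illustrative choice; U | X=x ~ N(0, x^2).
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(1.0, 2.0, n)   # draws of X
u = rng.normal(0.0, x)         # draws of U given X (std dev = x)

# E(U) = 0 holds, but E(U^2 | X=x) = x^2 depends on x:
cond_low = np.mean(u[x < 1.1] ** 2)   # approximates E(U^2 | X near 1), about 1.1
cond_high = np.mean(u[x > 1.9] ** 2)  # approximates E(U^2 | X near 2), about 3.8
overall = np.mean(u ** 2)             # approximates E(U^2) = E(X^2) = 7/3
```

The two conditional estimates differ from each other and from the overall second moment, so E(U^2|X) is not constant here even though E(U|X) = 0.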

Or have I misunderstood something?

CB
 
Hi CB,

Actually the problem arose from the following passage regarding the homoskedasticity assumption for simple linear regression:

http://img444.imageshack.us/img444/6892/asdfsdfc.jpg
I do not understand how they came to the conclusion that \(\sigma^2 = E(u^2|x)\) implies \(\sigma^2 = E(u^2)\).

Thanks for your help!
 
Usagi said:
Hi CB,

Actually the problem arose from the following passage regarding the homoskedasticity assumption for simple linear regression:

http://img444.imageshack.us/img444/6892/asdfsdfc.jpg
I do not understand how they came to the conclusion that \(\sigma^2 = E(u^2|x)\) implies \(\sigma^2 = E(u^2)\).

Thanks for your help!

It is the assumed homoskedasticity (that is what it means).
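In symbols, the step the passage leaves implicit is the law of iterated expectations applied to the homoskedasticity assumption:

```latex
% Homoskedasticity: E(u^2 | x) = \sigma^2 for every value of x.
% Taking expectations over x on both sides:
\[
E(u^2) = E\bigl(E(u^2 \mid x)\bigr) = E(\sigma^2) = \sigma^2 ,
\]
% where the first equality is the law of iterated expectations and the
% last holds because \sigma^2 is a constant that does not depend on x.
```

This is exactly where the counterexample earlier in the thread fails: there \(E(U^2|X=x) = x^2\) is not constant, so its average over x need not equal any single conditional value.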

CB
 
