[probability theory] prove E(X|X) = X

Homework Help Overview

The discussion revolves around proving a property of conditional expectations for continuous random variables, specifically the equality E(Xh(Y)|Y) = h(Y) E(X|Y). Participants express uncertainty about the proof process and the implications of the notation used in the problem.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the interpretation of conditional expectations and their representation as random variables. There are attempts to clarify the relationship between joint and conditional distributions, and some participants question the notation and definitions being used.

Discussion Status

There is an ongoing exploration of the proof, with some participants suggesting specific approaches and clarifying concepts. Multiple interpretations of the notation and definitions are being considered, and while some guidance has been provided, there is no explicit consensus on the proof method yet.

Contextual Notes

Participants note potential confusion arising from the notation and definitions used in the problem, particularly regarding the nature of conditional expectations and their representation as random variables. There is also mention of the need to show integration steps in the proof process.

nonequilibrium

Homework Statement


Well, not really, but in essence that's the part I'm having trouble with. The actual question is
Show that for continuous random variables, $E(Xh(Y)\mid Y) = h(Y)\,E(X\mid Y)$.
The equality seems obvious enough, but I'm unsure how to actually prove it...

Homework Equations


N/A

The Attempt at a Solution


So it seems I have to prove that $P\bigl(\{\,E(Xh(Y)\mid Y) = h(Y)\,E(X\mid Y)\,\}\bigr) = 1$.

Can anybody tell me how they would start?

I shall say how I would proceed if pressed:

As I see it, $E(Xh(Y)\mid Y)$ is actually a function with possible values $E(Xh(Y)\mid Y=y_0)$, and we can write $E(Xh(Y)\mid Y=y_0) = \iint x\,h(y)\,f_{X,Y\mid Y=y_0}(x,y)\,\mathrm{d}x\,\mathrm{d}y$. Since $f_{X,Y\mid Y=y_0}(x,y) = f_{X\mid Y=y_0}(x)\,\delta(y-y_0)$ (or perhaps this needs to be proven too?), we get $E(Xh(Y)\mid Y=y_0) = h(y_0)\int x\,f_{X\mid Y=y_0}(x)\,\mathrm{d}x = h(y_0)\,E(X\mid Y=y_0)$, and hence $E(Xh(Y)\mid Y)$ is the same random variable as $h(Y)\,E(X\mid Y)$.

Is this acceptable?
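(As a sanity check, not a proof, one can also compare both sides by simulation. The joint distribution and the choice $h(y)=y^2$ below are arbitrary choices of mine purely for illustration.)

```python
import numpy as np

# Numerical sanity check (NOT a proof) of E(X h(Y) | Y) = h(Y) E(X | Y).
# Illustrative assumptions: (X, Y) bivariate normal with Corr = 0.6, h(y) = y^2.
rng = np.random.default_rng(0)
n = 500_000
y = rng.standard_normal(n)
x = 0.6 * y + 0.8 * rng.standard_normal(n)  # Corr(X, Y) = 0.6, Var(X) = 1

def h(t):
    return t ** 2

# "Condition on Y" by binning: inside a narrow bin, Y is approximately
# constant, so sample means over the bin approximate E( . | Y = y_0).
edges = np.linspace(-2.0, 2.0, 21)
which = np.digitize(y, edges)
errors = []
for k in range(1, len(edges)):           # the 20 interior bins
    mask = which == k
    if mask.sum() < 1000:                # skip under-populated bins
        continue
    y0 = y[mask].mean()                  # representative value of Y in the bin
    lhs = (x[mask] * h(y[mask])).mean()  # approximates E(X h(Y) | Y = y0)
    rhs = h(y0) * x[mask].mean()         # approximates h(y0) E(X | Y = y0)
    errors.append(abs(lhs - rhs))

print(f"bins checked: {len(errors)}, worst discrepancy: {max(errors):.4f}")
```

The two sides agree up to binning bias and Monte Carlo noise, which is what the identity predicts.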
 
looks like you're on the right track, but careful as you're mixing notation a little

also, an expectation is NOT a random variable

Given $X=x_0$, the conditional probability distribution of $X$ becomes a delta function:
$$p_X(x\mid X=x_0)=\delta(x-x_0)$$

Also, you can write the joint distribution in terms of a marginal PDF and a conditional PDF,
$$p_{X,Y}(x,y)=p_Y(y\mid X=x)\,p_X(x),$$

and the conditional case becomes
$$p_{X,Y}(x,y\mid X=x_0)=p_Y(y\mid X=x_0)\,p_X(x\mid X=x_0)$$
 
Thanks for posting. I'm not sure what you are getting at in your post; is it aimed at proving the delta-distribution equality? (In that case I get what you're saying and how it helps.)

And as for:
also an expectation is NOT a random variable
I must disagree: $E(Xh(Y)\mid Y)$ certainly is a random variable, or at least that is what is meant; perhaps the book's notation is confusing and you're accustomed to writing it a different way. It denotes the random variable $y_0 \mapsto E(Xh(Y)\mid Y=y_0)$. It is like how $E(X\mid Y)$ is also called a random variable; after all, otherwise the fundamental equality $E(E(X\mid Y))=E(X)$ would have no meaning.
 
ok, yeah, I think I get what you mean about $E(X\mid Y)$; I missed that at first, and I don't think I have dealt with these much. So the way I read it, effectively $E(X\mid Y) = f(Y)$: the value depends on the distribution of $X$, but the stochastic element comes from the variable $Y$. Re-reading, I see you pretty much had the delta part pegged, but that should back it up.

wouldn't it just be sufficient to show that
$$E\bigl(Xh(Y)\mid Y=y_0\bigr) = h(y_0)\,E\bigl(X\mid Y=y_0\bigr), \quad \forall\, y_0?$$

this seems to be the direction you've headed in so far and seems acceptable

otherwise, do you have a different definition of $E(X\mid Y)$ to start from?
 
so yeah, I think what you have done is reasonable if you show the integration steps
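The integration steps in question can be sketched in the thread's density-and-delta notation, taking the identity $f_{X,Y\mid Y=y_0}(x,y)=f_{X\mid Y=y_0}(x)\,\delta(y-y_0)$ from the attempt above as given:

```latex
\begin{align*}
E\bigl(Xh(Y)\mid Y=y_0\bigr)
  &= \iint x\,h(y)\,f_{X,Y\mid Y=y_0}(x,y)\,\mathrm{d}x\,\mathrm{d}y \\
  &= \iint x\,h(y)\,f_{X\mid Y=y_0}(x)\,\delta(y-y_0)\,\mathrm{d}x\,\mathrm{d}y \\
  &= h(y_0)\int x\,f_{X\mid Y=y_0}(x)\,\mathrm{d}x
     && \text{(the $\delta$ collapses the $y$-integral)} \\
  &= h(y_0)\,E\bigl(X\mid Y=y_0\bigr).
\end{align*}
```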
 
lanedance said:
looks like you're on the right track, but careful as you're mixing notation a little

also, an expectation is NOT a random variable

Given $X=x_0$, the conditional probability distribution of $X$ becomes a delta function:
$$p_X(x\mid X=x_0)=\delta(x-x_0)$$

Also, you can write the joint distribution in terms of a marginal PDF and a conditional PDF,
$$p_{X,Y}(x,y)=p_Y(y\mid X=x)\,p_X(x),$$

and the conditional case becomes
$$p_{X,Y}(x,y\mid X=x_0)=p_Y(y\mid X=x_0)\,p_X(x\mid X=x_0)$$

A conditional expectation such as $E(X\mid Y)$ _is_ a random variable, whose value on the event $\{Y=y\}$ is $E(X\mid Y=y)$. (Such matters typically do not occur in Probability 101, but appear very much in senior- or graduate-level probability.)

RGV
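For readers at that level, the identity can also be proved directly from the measure-theoretic definition of conditional expectation. This is a sketch of the standard argument, assuming $h$ bounded and $X$, $Xh(Y)$ integrable (the general case follows by truncation); it is not necessarily the route the OP's course expects:

```latex
% By definition, E(Xh(Y)|Y) is the a.s.-unique sigma(Y)-measurable random
% variable Z with E[Z 1_A] = E[X h(Y) 1_A] for every A in sigma(Y).
% The candidate W = h(Y) E(X|Y) is sigma(Y)-measurable. The defining
% property of E(X|Y) gives E[g(Y) E(X|Y)] = E[g(Y) X] first for indicators
% g = 1_B, then by linearity and monotone convergence for all bounded
% measurable g. Taking g(Y) = h(Y) 1_A yields
\[
  E\bigl[\,h(Y)\,E(X\mid Y)\,\mathbf{1}_A\,\bigr]
  = E\bigl[\,X\,h(Y)\,\mathbf{1}_A\,\bigr]
  \qquad \text{for every } A \in \sigma(Y),
\]
% so W satisfies the defining property, and hence W = E(Xh(Y)|Y) a.s.
```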
 
yeah, it's been a while... but we got there
 
