[probability theory] prove E(X|X) = X

nonequilibrium

Homework Statement


Well, not really, but in essence that's the part I'm having trouble with. The actual question is
Show that for continuous random variables, E(Xh(Y)|Y) = h(Y) E(X|Y).
The equality seems obvious enough, but I'm unsure how to actually prove it...

Homework Equations


N/A

The Attempt at a Solution


So it seems I have to prove that P( \{ E(Xh(Y)|Y) = h(Y) E(X|Y) \} ) = 1.

Can anybody tell me how they would start?

Here is how I would proceed, if pressed:

As I see it, E(Xh(Y)|Y) is actually a function with possible values E(X h(Y) \mid Y = y_0), and we can write

E(X h(Y) \mid Y = y_0) = \iint x\, h(y)\, f_{X,Y|Y=y_0}(x,y)\, \mathrm d x\, \mathrm d y

and since f_{X,Y|Y=y_0}(x,y) = f_{X|Y=y_0}(x)\, \delta(y-y_0) (or perhaps this needs to be proven too?) we get

E(X h(Y) \mid Y = y_0) = h(y_0) \int x\, f_{X|Y=y_0}(x)\, \mathrm d x = h(y_0)\, E(X \mid Y = y_0)

and hence E(Xh(Y)|Y) is the same random variable as h(Y)\, E(X|Y).

Is this acceptable?
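As a quick numerical sanity check (not a proof; the bivariate normal, the choice h = sin, and the binning are all just an arbitrary illustration), one can sample (X, Y), slice on Y, and compare the two sides within each slice:

Code:
import numpy as np

rng = np.random.default_rng(0)

# Sample (X, Y) from a correlated bivariate normal (arbitrary example).
n = 1_000_000
X, Y = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n).T

h = np.sin  # any measurable test function h

# Approximate conditioning on Y = y_0 by binning Y into narrow slices.
bins = np.linspace(-2.0, 2.0, 41)
idx = np.digitize(Y, bins)

for k in (10, 20, 30):                   # a few interior bins
    mask = idx == k
    y0 = 0.5 * (bins[k - 1] + bins[k])   # bin midpoint, approximates y_0
    lhs = np.mean(X[mask] * h(Y[mask]))  # ~ E(X h(Y) | Y = y_0)
    rhs = h(y0) * np.mean(X[mask])       # ~ h(y_0) E(X | Y = y_0)
    print(f"y0 = {y0:+.2f}: lhs ~ {lhs:+.4f}, rhs ~ {rhs:+.4f}")

The two columns agree up to binning and sampling error, which is what the identity predicts.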
 
Looks like you're on the right track, but be careful: you're mixing notation a little.

Also, an expectation is NOT a random variable.

Given X = x_0, the conditional probability distribution of X becomes a delta function:
p_{X}(x \mid X=x_0)=\delta(x-x_0)

Also, you can write the joint distribution in terms of a single PDF and a conditional PDF:
p_{X,Y}(x,y)=p_{Y}(y|X=x)p_X(x)

and the conditional case becomes
p_{X,Y}(x,y|X=x_0)=p_{Y}(y|X=x_0)p_X(x|X=x_0)
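To connect this with the f-notation used above (just a sketch at the level of densities, taking the Dirac delta manipulations for granted): conditioning on Y = y_0 makes Y deterministic, so

f_{X,Y|Y=y_0}(x,y) = f_{X|Y=y_0}(x)\, f_{Y|Y=y_0}(y) = f_{X|Y=y_0}(x)\, \delta(y-y_0)

since, given Y = y_0, the distribution of Y is the point mass at y_0.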
 
Thanks for posting. I'm not sure what you're getting at in your post; is it meant for proving the delta-distribution equality? (In which case I get what you're saying and how it helps.)

And as for:
also an expectation is NOT a random variable
I must disagree: E(X h(Y)|Y) certainly is a random variable, or at least that is what is meant; perhaps the book's notation is confusing and you're accustomed to writing it a different way. What is meant is the random variable y_0 \mapsto E(X h(Y) \mid Y = y_0). It is like how E(X|Y) is also called a random variable; after all, otherwise the fundamental equality E(E(X|Y))=E(X) could have no meaning.
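For what it's worth, here is the density-level computation behind that fundamental equality (a sketch, assuming a joint density f_{X,Y} exists):

E(E(X|Y)) = \int E(X|Y=y)\, f_Y(y)\, \mathrm d y = \iint x\, f_{X|Y=y}(x)\, f_Y(y)\, \mathrm d x\, \mathrm d y = \iint x\, f_{X,Y}(x,y)\, \mathrm d x\, \mathrm d y = E(X)

and the outer expectation only makes sense if E(X|Y) is itself a random variable, namely a function of Y.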
 
OK, yeah, I think I get what you mean about E(X|Y); I missed it at first and don't think I have dealt with these much. The way I read it, effectively E(X|Y) = f(Y): the value will depend on the distribution of X, but the stochastic element comes from the variable Y. Re-reading, I see you pretty much had the delta part pegged, but that should back it up.

Wouldn't it just be sufficient to show that
E(Xh(Y)|Y=y_0) = h(y_0)\, E(X|Y=y_0), \ \forall y_0

This seems to be the direction you've been heading so far, and it seems acceptable.

otherwise do you have a different definition for E(X|Y) to start from?
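For completeness: if the course instead defines E(X|Y) measure-theoretically, as the \sigma(Y)-measurable random variable Z satisfying E(Z \mathbf 1_A) = E(X \mathbf 1_A) for every A \in \sigma(Y), then the standard argument is a sketch along these lines. The candidate h(Y)\, E(X|Y) is \sigma(Y)-measurable, and for every A \in \sigma(Y)

E[\, h(Y)\, E(X|Y)\, \mathbf 1_A \,] = E[\, X\, h(Y)\, \mathbf 1_A \,]

holds first for indicator h (where h(Y)\,\mathbf 1_A is again the indicator of a set in \sigma(Y), so the defining property of E(X|Y) applies), then for simple h by linearity, and then for general h by monotone convergence. Almost-sure uniqueness of conditional expectation then gives E(X h(Y)|Y) = h(Y)\, E(X|Y).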
 
So yeah, I think what you have done is reasonable, as long as you show the integration steps.
 
lanedance said:
Looks like you're on the right track, but be careful: you're mixing notation a little.

also an expectation is NOT a random variable



A conditional expectation such as E(X|Y) _is_ a random variable, whose value on the event {Y = y} is E(X|Y = y). (Such matters typically do not occur in Probability 101, but do appear very much in senior- or graduate-level probability.)

RGV
 
Yeah, it's been a while... but we got there.
 
