1. The problem statement, all variables and given/known data
Well, not really, but in essence that's the part I'm having trouble with. The actual question is

The equality seems obvious enough, but I'm unsure how to actually prove it...

2. Relevant equations
N/A

3. The attempt at a solution
So it seems I have to prove that [itex]P( \{ E(Xh(Y)|Y) = h(Y) E(X|Y) \} ) = 1[/itex].

Can anybody tell me how they would start?

I shall say how I would proceed if pressed:

As I see it, E(Xh(Y)|Y) is actually a function with possible values [itex]E(X h(Y) | Y = y_0)[/itex], and we can write [itex]E(X h(Y) | Y = y_0) = \iint x h(y) f_{X,Y|Y=y_0}(x,y) \,\mathrm d x \,\mathrm d y[/itex]. Since [itex]f_{X,Y|Y=y_0}(x,y) = f_{X|Y=y_0}(x) \delta(y-y_0)[/itex] (or perhaps this needs to be proven too?), we get [itex]E(X h(Y) | Y = y_0) = h(y_0) \int x f_{X|Y=y_0}(x) \,\mathrm d x = h(y_0) E(X|Y=y_0)[/itex], and hence E(Xh(Y)|Y) is the same random variable as h(Y) E(X|Y).
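Not a proof, but here is a quick Monte Carlo sanity check of that conclusion. The particular joint distribution, the function h, and all the numbers below are arbitrary choices for illustration: conditioning on Y partitions the samples, E(X|Y=y_0) is the within-group sample mean, and averaging h(Y)E(X|Y) back over Y recovers E(Xh(Y)).

```python
import random

random.seed(0)
N = 200_000

def h(y):
    return 3 * y + 1  # an arbitrary function of Y

# Arbitrary joint distribution: Y uniform on {0, 1}, and X | Y = y ~ Normal(2y, 1)
samples = [(random.gauss(2.0 * y, 1.0), y)
           for y in (random.choice([0, 1]) for _ in range(N))]

# E(X | Y = y0): the within-group sample mean for each value of Y
cond_mean = {}
for y0 in (0, 1):
    xs = [x for x, y in samples if y == y0]
    cond_mean[y0] = sum(xs) / len(xs)

# E(X h(Y)) estimated directly, versus E(h(Y) E(X|Y)) via the conditional means;
# the identity says these should agree (here the true value is 4)
lhs = sum(x * h(y) for x, y in samples) / N
rhs = sum(h(y) * cond_mean[y] for _, y in samples) / N
print(lhs, rhs)
```

In fact, for this estimator the two averages agree not just approximately but identically (up to rounding), because within each group the deviations from the group mean sum to zero and h(Y) is constant there — which is exactly the "h(Y) is known once Y is known" intuition behind the identity.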

Thanks for posting. I'm not sure what you are getting at in your post; is it meant to prove the delta-distribution equality? (In which case I see what you're saying and how it helps.)

And as for:

I must disagree. E(X h(Y)|Y) is certainly a random variable, or at least that is what is meant; perhaps the book's notation is confusing and you're accustomed to writing it a different way. By it the book means the random variable [itex]y_0 \mapsto E(X h(Y) | Y = y_0)[/itex]. It is like how E(X|Y) is also called a random variable; after all, otherwise the fundamental equality E(E(X|Y))=E(X) could have no meaning.

OK, yeah, I think I get what you mean about E(X|Y); I missed that at first, and I don't think I have dealt with these much. So the way I read it, effectively E(X|Y) = f(Y): the value depends on the distribution of X, but the stochastic element comes from the variable Y. Re-reading, I see you pretty much had the delta part pegged, but that should back it up.

wouldn't it just be sufficient to show that
[tex]E(Xh(Y)|Y=y_0) = h(y_0)E(X|Y=y_0), \ \forall y_0 [/tex]

that seems to be the direction you've headed in so far, and it looks acceptable

otherwise do you have a different definition for E(X|Y) to start from?

A conditional expectation such as E(X|Y) _is_ a random variable, whose value on the event {Y = y} is E(X|Y = y). (Such matters typically do not occur in Probability 101, but do appear very much in senior or graduate level probability.)
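To make that concrete, here is a small exact example (the two-dice setup is my own illustration, not from the thread): with Y the first die and X the sum of two fair dice, E(X|Y) is the random variable g(Y) with g(y) = E(X|Y = y) = y + 7/2, and the tower property E(E(X|Y)) = E(X) checks out exactly.

```python
from fractions import Fraction
from itertools import product

# Roll two fair dice; Y = first die, X = their sum.
# E(X|Y) is the random variable g(Y), where g(y0) = E(X | Y = y0).
die = range(1, 7)
g = {}
for y0 in die:
    vals = [Fraction(y0 + d2) for d2 in die]
    g[y0] = sum(vals) / len(vals)  # E(X | Y = y0) = y0 + 7/2

# Tower property: E(E(X|Y)) = E(g(Y)) should equal E(X)
E_g = sum(g[y0] for y0 in die) / 6
E_X = sum(Fraction(d1 + d2) for d1, d2 in product(die, die)) / 36
print(g[1], E_g, E_X)  # 9/2 7 7
```

So E(X|Y) really is a function of Y — on the event {Y = 1} it takes the value 9/2 — and averaging it over the distribution of Y gives back E(X) = 7.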