I have the following problem: assume g is a (smooth enough) function, X a random variable, and [itex]\varepsilon^h[/itex] a sequence of random variables whose moments converge to 0 as h goes to zero.

I would then like to prove that

[tex]

\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|

[/tex]

converges to zero as well as h goes to 0, if possible at the same rate as the first moment of [itex]\varepsilon^h[/itex].

I tried using Taylor's theorem which states that

[tex]

g(X+\varepsilon^h)-g(X) = g'(X)\varepsilon^h + R(X,\varepsilon^h),

[/tex]

where the absolute value of the remainder satisfies [itex]\left|R(X,\varepsilon^h)\right|\leq C(X)|\varepsilon^h|^2[/itex]. Using this and the Cauchy-Schwarz inequality I could show that

[tex]

\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|\leq\sqrt{\mathbb{E}\left[g'(X)^2\right]}\sqrt{\mathbb{E}\left[(\varepsilon^h)^2\right]} + \sqrt{\mathbb{E}\left[C(X)^2\right]}\sqrt{\mathbb{E}\left[(\varepsilon^h)^4\right]}.

[/tex]

For some reason, however, I'm not quite sure if that's correct. In particular, I don't know how to argue that C(X) should have a finite second moment; maybe I just have to assume that.
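As a quick sanity check of the claimed rate (my own sketch, not part of the original argument), here is a small Monte Carlo experiment. The choices g(x) = sin(x), X ~ N(0,1), and [itex]\varepsilon^h = hZ[/itex] with Z ~ N(0,1) are illustrative assumptions; since |g'| ≤ 1 here, the ratio of [itex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|[/itex] to [itex]\mathbb{E}|\varepsilon^h|[/itex] should stay bounded (by 1) as h shrinks:

```python
import numpy as np

# Illustrative setup (my assumption, not from the problem statement):
# g(x) = sin(x) so that |g'| <= 1, X ~ N(0,1), eps^h = h*Z with Z ~ N(0,1).
rng = np.random.default_rng(0)
n = 200_000
X = rng.standard_normal(n)
Z = rng.standard_normal(n)

def g(x):
    return np.sin(x)

ratios = []
for h in [0.1, 0.01, 0.001]:
    eps = h * Z
    lhs = np.abs(g(X + eps) - g(X)).mean()  # Monte Carlo estimate of E|g(X+eps)-g(X)|
    rhs = np.abs(eps).mean()                # Monte Carlo estimate of E|eps^h|
    ratios.append(lhs / rhs)

# Because |sin(x+e) - sin(x)| <= |e| pointwise, each ratio is at most 1,
# and it remains bounded as h -> 0, consistent with an O(E|eps^h|) rate.
print(ratios)
```

Of course this only illustrates the conjectured rate for one smooth, bounded-derivative g; it does not prove the general bound.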

Anyway, I'd appreciate any input on how to bound the expected value of the increment in terms of the moments of [itex]\varepsilon^h[/itex]. Thanks,

Pere

EDIT:

I think what I did is actually not correct, because C(X) might also depend on the value of [itex]\varepsilon^h[/itex]. Intuitively, however, I find it quite plausible that [itex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|[/itex] should not go to zero more slowly than [itex]\mathbb{E}|\varepsilon^h|[/itex]. Thanks again.
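One possible way around the dependence of the remainder constant on [itex]\varepsilon^h[/itex] (my own suggestion, under the stronger assumption that g' is bounded, i.e. g is globally Lipschitz) is to use the mean value theorem directly instead of a second-order expansion:

[tex]
\left|g(X+\varepsilon^h)-g(X)\right| = \left|g'(\xi)\right|\left|\varepsilon^h\right| \leq \sup_x \left|g'(x)\right| \cdot \left|\varepsilon^h\right|
[/tex]

for some [itex]\xi[/itex] between X and [itex]X+\varepsilon^h[/itex]. Taking expectations then gives

[tex]
\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right| \leq \sup_x \left|g'(x)\right| \cdot \mathbb{E}\left|\varepsilon^h\right|,
[/tex]

which converges to zero at exactly the rate of the first moment of [itex]\varepsilon^h[/itex], with no second-moment or remainder term needed.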

**Physics Forums | Science Articles, Homework Help, Discussion**


# Taylor approximation (probability)

