# Taylor approximation (probability)

1. Jul 28, 2010

### Pere Callahan

I have the following problem: Assume g is a (smooth enough) function, X is a random variable, and $\varepsilon^h$ is a sequence of random variables whose moments converge to 0 as h goes to 0.

I would then like to prove that
$$\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|$$

also converges to zero as h goes to 0, if possible at the same rate as the first moment of $\varepsilon^h$.

I tried using Taylor's theorem which states that
$$g(X+\varepsilon^h)-g(X) = g'(X)\varepsilon^h + R(X,\varepsilon^h),$$

where the absolute value of the remainder satisfies $\left|R(X,\varepsilon^h)\right|\leq C(X)|\varepsilon^h|^2$. Using this and the Cauchy-Schwarz inequality I could show that
$$\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|\leq\sqrt{\mathbb{E}\left[g'(X)^2\right]}\sqrt{\mathbb{E}\left[(\varepsilon^h)^2\right]} + \sqrt{\mathbb{E}\left[C(X)^2\right]}\sqrt{\mathbb{E}\left[(\varepsilon^h)^4\right]}.$$
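To convince myself the bound is at least numerically plausible, here is a small Monte Carlo sketch. The choices g = sin, X standard normal, and $\varepsilon^h = hZ$ with Z standard normal are my own assumptions for illustration; for g = sin we have $|g''|\leq 1$, so Taylor's theorem gives the constant C(X) = 1/2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
X = rng.standard_normal(n)
h = 0.1
eps = h * rng.standard_normal(n)  # perturbation with small moments

# Left-hand side: E|g(X + eps) - g(X)| for g = sin
lhs = np.mean(np.abs(np.sin(X + eps) - np.sin(X)))

# Right-hand side of the Cauchy-Schwarz bound.
# For g = sin: g'(X) = cos(X), and |g''| <= 1 gives C(X) = 1/2.
rhs = (np.sqrt(np.mean(np.cos(X) ** 2)) * np.sqrt(np.mean(eps ** 2))
       + 0.5 * np.sqrt(np.mean(eps ** 4)))

print(lhs, rhs)
assert lhs <= rhs
```

Of course this is only a sanity check for one very nice g, where C can be taken constant; it says nothing about the general case raised below.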

For some reason, however, I'm not quite sure that's correct. In particular, I don't know how to argue that C(X) should have finite variance. Maybe I just have to assume that.

Anyway, I'd appreciate any input on how to bound the expected value of the increment in terms of the moments of $\varepsilon^h$. Thanks,

Pere

EDIT:

I think what I did is actually not correct, because C(X) might also depend on the value of $\varepsilon^h$... Intuitively, however, I find it quite plausible that $\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|$ should not go to zero more slowly than $\mathbb{E}|\varepsilon^h|$... Thanks again
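To support that intuition numerically, the sketch below (again with the illustrative choices g = sin, X standard normal, $\varepsilon^h = hZ$) compares $\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|$ to $\mathbb{E}|\varepsilon^h|$ as h shrinks; the ratio stays roughly constant, consistent with the increment decaying at the rate of the first moment.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X = rng.standard_normal(n)
Z = rng.standard_normal(n)

for h in (0.1, 0.01, 0.001):
    eps = h * Z
    increment = np.mean(np.abs(np.sin(X + eps) - np.sin(X)))
    first_moment = np.mean(np.abs(eps))
    # The ratio should be approximately E|g'(X)| = E|cos(X)|, independent of h.
    print(h, increment / first_moment)
```

This is only an experiment for one smooth bounded g, not a proof; whether the same rate holds in the general setting of the question is exactly what is being asked.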

Last edited: Jul 28, 2010