
Taylor approximation (probability)

  1. Jul 28, 2010 #1
I have the following problem: Assume g is a (smooth enough) function, X a random variable, and [itex]\varepsilon^h[/itex] a family of random variables whose moments converge to 0 as h goes to zero.

    I would then like to prove that

    [tex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|[/tex]

    converges to zero as h goes to 0 as well, if possible at the same rate as the first moment of [itex]\varepsilon^h[/itex].

    I tried using Taylor's theorem, which states that

    [tex]g(X+\varepsilon^h)-g(X) = g'(X)\varepsilon^h + R(X,\varepsilon^h),[/tex]

    where the absolute value of the remainder satisfies [itex]\left|R(X,\varepsilon^h)\right|\leq C(X)|\varepsilon^h|^2[/itex]. Using this and the Cauchy-Schwarz inequality I could show that
    [tex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|\leq\sqrt{\mathbb{E}\left(g'(X)^2\right)}\sqrt{\mathbb{E}\left((\varepsilon^h)^2\right)} + \sqrt{\mathbb{E}\left(C(X)^2\right)}\sqrt{\mathbb{E}\left((\varepsilon^h)^4\right)}.[/tex]
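
    As a quick numerical sanity check (just an illustration; the choices g = sin, X ~ N(0,1), and [itex]\varepsilon^h = hZ[/itex] with Z ~ N(0,1) independent of X are assumptions of mine, not part of the problem), a Monte Carlo estimate suggests the increment does shrink at the same rate as [itex]\mathbb{E}|\varepsilon^h|[/itex]:

    [code]
    import numpy as np

    # Monte Carlo sanity check: estimate E|g(X + eps) - g(X)| and E|eps|
    # for g = sin, X ~ N(0,1), eps = h*Z with Z ~ N(0,1) independent of X.
    rng = np.random.default_rng(0)
    n = 10**6
    X = rng.standard_normal(n)
    Z = rng.standard_normal(n)

    for h in [1.0, 0.1, 0.01, 0.001]:
        eps = h * Z
        lhs = np.abs(np.sin(X + eps) - np.sin(X)).mean()  # E|g(X+eps)-g(X)|
        rhs = np.abs(eps).mean()                          # E|eps|
        print(f"h={h:g}  E|increment|={lhs:.3e}  E|eps|={rhs:.3e}  ratio={lhs/rhs:.3f}")
    [/code]

    If the first moment really gives the rate, the ratio should settle to a constant as h decreases rather than blow up.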

    For some reason, however, I'm not quite sure if this derivation is correct :smile: In particular, I don't know how to argue that C(X) should have a finite second moment. Maybe I just have to assume that.

    Anyway, I'd appreciate any input on how to bound the expected value of the increment in terms of the moments of [itex]\varepsilon^h[/itex]. Thanks,



    I think what I did is actually not correct, because C(X) might also depend on the value of [itex]\varepsilon^h[/itex]... Intuitively, however, I find it quite plausible that [itex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|[/itex] should not go to zero more slowly than [itex]\mathbb{E}|\varepsilon^h|[/itex]. Thanks again
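
    One standard way to remove that dependence, under the extra assumption (not made above) that g is twice differentiable with [itex]|g''|\leq M[/itex] everywhere, is the Lagrange form of the remainder: for some [itex]\xi[/itex] between X and [itex]X+\varepsilon^h[/itex],

    [tex]R(X,\varepsilon^h)=\tfrac{1}{2}g''(\xi)(\varepsilon^h)^2, \qquad \left|R(X,\varepsilon^h)\right|\leq\tfrac{M}{2}(\varepsilon^h)^2,[/tex]

    so the constant M/2 depends on neither X nor [itex]\varepsilon^h[/itex], and taking expectations gives

    [tex]\mathbb{E}\left|g(X+\varepsilon^h)-g(X)\right|\leq\sqrt{\mathbb{E}\left(g'(X)^2\right)}\sqrt{\mathbb{E}\left((\varepsilon^h)^2\right)}+\tfrac{M}{2}\,\mathbb{E}\left((\varepsilon^h)^2\right),[/tex]

    provided [itex]\mathbb{E}(g'(X)^2)<\infty[/itex]. This is only a sketch under that boundedness assumption, not a resolution of the general case.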
    Last edited: Jul 28, 2010