
Estimating Error in a Quadrature

  1. Mar 23, 2012 #1
    I'm using a quadrature to estimate the integral of a function.
    Intuitively, I know that if the function is a very smooth function, the quadrature will perform well at a low order (few samples).
    If, however, the function is more complex, I'll need to sample it more frequently for the quadrature to be as accurate.

    I'm wondering if there's some equation that expresses this intuition. Something that I can point to and use to explain why more complex functions require a higher order quadrature for the same level of accuracy.

    I think the expression should contain some reference to the function's derivative.

    Last edited: Mar 23, 2012
  3. Mar 23, 2012 #2
    Well, Gaussian quadrature is (I'm pretty sure) order [itex]2n[/itex] accurate, meaning that the error is less than

    [itex]k \left| f^{(2n)}(\xi) \right|[/itex]

    where [itex]f[/itex] is the function being interpolated, [itex]\xi[/itex] is some point in the interval of integration, and [itex]k[/itex] is a constant. So, if [itex]f[/itex] is a degree [itex]2n-1[/itex] polynomial, then [itex]f^{(2n)} = 0[/itex] and Gaussian quadrature is exact.
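    You can check the exactness property numerically. Here's a quick sketch of my own (not from the thread) using NumPy's Gauss-Legendre nodes; the helper name `gauss_legendre_integrate` is just something I made up:

    ```python
    import numpy as np

    def gauss_legendre_integrate(f, n):
        # n-point Gauss-Legendre rule on [-1, 1]; exact for polynomials
        # of degree up to 2n - 1.
        nodes, weights = np.polynomial.legendre.leggauss(n)
        return float(np.sum(weights * f(nodes)))

    # Degree-5 polynomial: a 3-point rule (2*3 - 1 = 5) should integrate it exactly.
    f = lambda x: x**5 + 2.0 * x**2 + 1.0
    approx = gauss_legendre_integrate(f, 3)
    exact = 4.0 / 3.0 + 2.0  # x^5 is odd so it vanishes; 2x^2 gives 4/3; 1 gives 2
    print(approx, exact)
    ```

    With only 3 samples the rule reproduces the degree-5 integral to machine precision.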

    Does this help?
    Last edited: Mar 23, 2012
  4. Mar 23, 2012 #3
    It does actually. Thanks for that.

    Do you know what the error is for a Clenshaw-Curtis quadrature?
  5. Mar 23, 2012 #4
    Not offhand, but Google probably does :) .
  6. Mar 23, 2012 #5
    I found something on the Wikipedia page saying its error is bounded by
    [itex]O\left((2N)^{-k}/k\right)[/itex] for a [itex]k[/itex]-times differentiable integrand.

    I'm not sure exactly what a k-times differentiable integrand is, but at a guess: is a function like x^2+2x+5 differentiable 3 times, and x^9+2 differentiable 10 times, so that it's a proxy for complexity?

    The function I'm trying to integrate is exp(-x^2/L) with respect to x between the limits, -1 and 1.
    I'm finding though that the smaller I make L, the worse the quadrature gets. I don't think I'm changing the number of times it can be differentiated, so why does the approximation get worse?
    My intuition is that by making L smaller, the width of this function (Gaussian) becomes smaller and a higher order quadrature is needed to accurately probe it but I'm not sure if any equation backs this up.
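    To show what I'm seeing, here's a rough sketch (my own code, not anything official): the exact value of this integral is [itex]\sqrt{\pi L}\,\mathrm{erf}(1/\sqrt{L})[/itex], so I can measure the quadrature error directly as L shrinks with the node count held fixed.

    ```python
    import numpy as np
    from math import erf, sqrt, pi

    def gauss_legendre_integrate(f, n):
        # n-point Gauss-Legendre rule on [-1, 1].
        nodes, weights = np.polynomial.legendre.leggauss(n)
        return float(np.sum(weights * f(nodes)))

    def exact_integral(L):
        # int_{-1}^{1} exp(-x^2 / L) dx = sqrt(pi * L) * erf(1 / sqrt(L))
        return sqrt(pi * L) * erf(1.0 / sqrt(L))

    # Fix the number of nodes and shrink L: the peak narrows and the
    # fixed rule samples less and less of it.
    n = 10
    errors = {}
    for L in (1.0, 0.1, 0.01):
        approx = gauss_legendre_integrate(lambda x: np.exp(-x**2 / L), n)
        errors[L] = abs(approx - exact_integral(L))
        print(L, errors[L])
    ```

    The error grows by many orders of magnitude as L goes from 1 to 0.01, even though the integrand stays infinitely differentiable throughout.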
    Last edited: Mar 23, 2012
  7. Mar 23, 2012 #6
    As a side note, one of the drawbacks of Gaussian Quadrature is that it is kind of difficult to get a good grasp on the error. For this reason, people estimate the error by [itex]I_m - I_n[/itex] where [itex]m > n[/itex] and [itex]I_n[/itex] is the estimated integral with [itex]n[/itex] nodes.

    Now, I think that the problem is that as L gets smaller and smaller, the meaty portion of the function bunches up around [itex]0[/itex], right? So, if you are using only a few nodes, you probably aren't getting much sampling from the real "meaty" part of this function.

    I'd keep increasing the number of nodes until [itex]I_{n+2} - I_n[/itex] is within your tolerance. This will give you better than necessary accuracy, which isn't all that bad.
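    Something like this sketch (the function names are mine, just to illustrate the stopping rule):

    ```python
    import numpy as np

    def gauss_legendre_integrate(f, n):
        # n-point Gauss-Legendre rule on [-1, 1].
        nodes, weights = np.polynomial.legendre.leggauss(n)
        return float(np.sum(weights * f(nodes)))

    def integrate_to_tolerance(f, tol=1e-8, n0=2, n_max=200):
        # Keep adding 2 nodes until successive estimates agree to within tol.
        I_n = gauss_legendre_integrate(f, n0)
        n = n0
        while n + 2 <= n_max:
            I_next = gauss_legendre_integrate(f, n + 2)
            n += 2
            if abs(I_next - I_n) < tol:
                return I_next, n
            I_n = I_next
        return I_n, n  # hit n_max without converging

    # The narrow Gaussian from earlier in the thread: L = 0.01.
    L = 0.01
    result, nodes_used = integrate_to_tolerance(lambda x: np.exp(-x**2 / L))
    print(result, nodes_used)
    ```

    For the narrow Gaussian this needs noticeably more nodes than the wide one, which matches the intuition above.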
  8. Mar 23, 2012 #7
    You are sort of right. But [itex]x^9 + 2[/itex] is infinitely differentiable, since the trivial polynomial [itex]p(x) = 0[/itex] is itself differentiable.