Approximating definite integrals using series.

  1. I was wondering how to approximate definite integrals to within a specific accuracy. For example, how would I go about approximating the integral from 0 to 1 of sin(x^3) dx to within an accuracy of 0.001? I think I'm supposed to use the remainder estimate for the integral test, but I'm confused because that seems to apply to improper integrals. Any ideas?
  2. dextercioby
     Science Advisor, Homework Helper

    Substitute the "x^3" factor instead of "x" in the Taylor expansion of sin x.Then integrate each term of the series from 0 to 1.Evaluate each integral with maximum 4 decimals.You'll stop evaluating the integrals,once the numbers added are less than 0.001.
