Approximating definite integrals using series.

  • #1
I was wondering how to approximate definite integrals to within a specific accuracy. For example, how would I go about approximating the integral from 0 to 1 of sin(x^3) dx to within an accuracy of 0.001? I think I'm supposed to use the remainder estimate for the integral test, but I'm confused because that seems to apply to indefinite integrals. Any ideas?
 

Answers and Replies

  • #2
dextercioby

Substitute x^3 for x in the Taylor expansion of sin x, then integrate the series term by term from 0 to 1. Evaluating each term to four decimal places is plenty. Because the resulting series alternates in sign with terms decreasing to zero, the truncation error is bounded by the magnitude of the first omitted term, so you can stop adding terms once they drop below 0.001 (see the sketch below).

Daniel.
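
For concreteness, here is a minimal Python sketch of that approach (an illustration, not code from the thread; the function name and default tolerance are made up for the example). It sums the term-by-term integrals of the Taylor series of sin(x^3) and stops when the next term is smaller than the requested accuracy:

[code]
from math import factorial

def sin_x3_integral(tol=1e-3):
    """Approximate the integral of sin(x^3) from 0 to 1.

    sin(x^3) = sum_{n>=0} (-1)^n x^(6n+3) / (2n+1)!,
    so integrating term by term from 0 to 1 gives
    sum_{n>=0} (-1)^n / ((2n+1)! * (6n+4)).
    """
    total = 0.0
    n = 0
    while True:
        term = (-1) ** n / (factorial(2 * n + 1) * (6 * n + 4))
        # Alternating series with decreasing terms: the truncation
        # error is bounded by the first omitted term, so we may stop
        # as soon as that term drops below the tolerance.
        if abs(term) < tol:
            return total
        total += term
        n += 1

print(sin_x3_integral())
[/code]

With tol = 0.001 this keeps only the first two terms, 1/4 - 1/60 ≈ 0.2333; the first omitted term is 1/1920 ≈ 0.0005, so the result is within the requested 0.001 of the true value.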
 
