Calculate error using Lagrange formula

  • #1
etha
hi everyone! I'm having difficulty figuring this problem out. so here goes:

f(x) = sin(x)

Use the Lagrange formula to find the smallest value of n so that the nth degree Taylor polynomial for f centered at x = 0 approximates f at x = 1 with an error of no more than 0.001.

whatever help anyone can provide would be great
 

Answers and Replies

  • #2
HallsofIvy
Science Advisor
Homework Helper
Well, the first thing I would do is write out the "Lagrange" formula for the error! Then follow that formula. Knowing that sin(x) and cos(x) are never larger than 1 helps.
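Following the hint above, here is a sketch of the computation. The Lagrange remainder for the nth Taylor polynomial of sin centered at 0 is |R_n(x)| = |f^(n+1)(c)| |x|^(n+1) / (n+1)! for some c between 0 and x; since every derivative of sin is ±sin or ±cos, the factor |f^(n+1)(c)| is at most 1, leaving the bound |x|^(n+1)/(n+1)!. The helper name `smallest_degree` below is just an illustrative choice, not anything from the thread:

```python
from math import factorial

def smallest_degree(x=1.0, tol=1e-3):
    """Smallest n with the Lagrange bound |x|^(n+1)/(n+1)! <= tol.

    Valid for f(x) = sin(x), whose derivatives are all bounded by 1
    in absolute value, so |R_n(x)| <= |x|**(n+1) / factorial(n + 1).
    """
    n = 0
    while abs(x) ** (n + 1) / factorial(n + 1) > tol:
        n += 1
    return n

print(smallest_degree())  # 1/7! = 1/5040 < 0.001, so n = 6
```

By hand: the bound at x = 1 is 1/(n+1)!, and the first factorial exceeding 1000 is 7! = 5040, so n + 1 = 7 and the answer is n = 6.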
 
