# Lagrange Error

## Main Question or Discussion Point

In my BC calc class, we just finished working through most of series and sequences. As we were reviewing past free response questions on the topic, we found that in 2004 they dropped a Lagrange error analysis on us. I've been looking at different explanations, but I'm not getting the concept. It seems like it is just the next term of the series, so how can it account for all of the remaining terms of a polynomial approximation of a function like f(x) = sin(x)? Also, what is the deal with the whole c being between x and a? If anyone could enlighten me as to the meaning of all this, I would be most appreciative. Maybe a sample problem...

sat
I am assuming that by the Lagrange error you mean the expression
$$\frac{f^{(n+1)}(c)}{(n+1)!}(x-x_{0})^{n+1}$$
This is almost the same as the expression for the $(n+1)$th term, but (as you seem to have noticed) the derivative of $f$ is evaluated at some $c$ between $x$ and $x_0$.
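The key point (worth stating in full, since it answers the "how can one term account for all the rest" question) is that the remainder is an exact equality, not an estimate. Taylor's theorem with the Lagrange form of the remainder says that for some $c$ between $x$ and $x_0$,
$$f(x) = \sum_{k=0}^{n}\frac{f^{(k)}(x_{0})}{k!}(x-x_{0})^{k} + \frac{f^{(n+1)}(c)}{(n+1)!}(x-x_{0})^{n+1}$$
so that single term, with its derivative evaluated at the mysterious $c$, exactly makes up for everything the degree-$n$ polynomial leaves out.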

Basically, the $c$ comes about from using the mean value theorem to justify the remainder term. Actually finding $c$ may be tricky, but because you know it's between $x$ and $x_{0}$, you can find the maximum possible value of the error. For example, if you have an expansion which gives an error term $R_{5}$ of
$$\frac{\cos c}{5!}x^{5}$$
you know that the maximum value of the error is
$$\frac{1}{5!}x^{5}$$
since $|\cos c|$ is at most equal to unity. Furthermore, you could consider an expansion where $x$ is only between 0 and 1. Then the biggest the error term could be is
$$\frac{1}{120}$$
Finding the maximum error over a certain range doesn't give you the true remainder for a particular value of $x$, but it gives a "worst case scenario".
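As a quick sanity check (a minimal Python sketch of my own, not from the thread), here is that exact example: the degree-4 Taylor polynomial of $\sin$ about 0 is $x - x^3/3!$, and the Lagrange bound $|x|^5/5!$ really does dominate the actual error on $[0, 1]$:

```python
import math

def sin_taylor_p4(x):
    # Degree-4 Taylor polynomial of sin about 0: x - x^3/3!
    # (the x^2 and x^4 coefficients are zero)
    return x - x**3 / 6

for x in [0.25, 0.5, 1.0]:
    actual_error = abs(math.sin(x) - sin_taylor_p4(x))
    lagrange_bound = abs(x)**5 / math.factorial(5)  # uses |cos c| <= 1
    assert actual_error <= lagrange_bound
    # and on [0, 1] the bound itself never exceeds 1/120
    assert lagrange_bound <= 1 / 120
```

The bound is conservative on purpose: we don't know $c$, so we take the worst case for $\cos c$.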

I hope this makes some sense.

Hurkyl
I assume you're familiar with differential approximation and the mean value formula, right?

Differential approximation says:

$$f(y) \approx f(x) + f'(x)(y - x)$$

And the mean value theorem says (after a little rearrangement) that there exists a $c$ in $[x, y]$ such that:

$$f(y) = f(x) + f'(c)(y - x)$$

A Taylor series is just a vast generalization of differential approximation -- it includes higher order derivatives. The error term is simply the corresponding generalization of the mean value theorem.
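To make that concrete (a small Python sketch; the choice of $f = \sin$ on $[0, 1]$ is mine, not Hurkyl's), we can actually find the $c$ the mean value theorem promises, by solving $f'(c) = \frac{f(y) - f(x)}{y - x}$:

```python
import math

# Mean value theorem for f = sin on [x, y]: there is some c in [x, y]
# with f(y) = f(x) + f'(c) * (y - x), i.e. cos(c) equals the average
# slope of sin over the interval.
x, y = 0.0, 1.0
avg_slope = (math.sin(y) - math.sin(x)) / (y - x)
c = math.acos(avg_slope)  # f'(c) = cos(c), so invert with acos

assert x <= c <= y  # c really does lie inside the interval
assert abs(math.sin(y) - (math.sin(x) + math.cos(c) * (y - x))) < 1e-12
```

In the Lagrange remainder the same thing happens one derivative higher: you never compute $c$, you just use the fact that it lives in a known interval to bound $f^{(n+1)}(c)$.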

Arg, I just went to a BC calculus session at my school. We did a Lagrange error problem, and the whole concept baffles me for some reason. I'm sure many won't be on the AP test, because I managed a 5 on the practice one; however, I would love to learn to do it. Anyone want to do a sample problem?