MHB An inequality between the integral Remainder of a function and the function.

Alone
Suppose $f$ is infinitely differentiable in a neighborhood of $x_0$, and that $f^{(k)}(x) \ge 0$ for every $k=0,1,2,\ldots$ and for all $x$ in this neighborhood.

Let $R_n(x)=\frac{1}{n!}\int_a^x f^{(n+1)}(t)(x-t)^n\,dt$, where $x_0-\epsilon <a<x<b<x_0+\epsilon$.

I want to show that $R_n(b) \le f(b)$. How can I show this?

Thanks.
 
Hi Alan,

The integral form of Taylor's theorem yields
$$f(x) = f(a) + f'(a)(x - a) + \cdots + \frac{f^{(n)}(a)}{n!}(x - a)^n + R_n(x)$$
for all $x$ in your neighborhood of $x_0$. Now take $x = b$: since $b > a$ and $f^{(k)}(a) \ge 0$ by hypothesis, each term $\dfrac{f^{(k)}(a)}{k!}(b - a)^k$ is nonnegative. So $f(b)$ is $R_n(b)$ plus a sum of nonnegative terms, and therefore $f(b) \ge R_n(b)$.
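As a quick numerical sanity check (not part of the proof), here is a sketch in Python using $f(x) = e^x$, which satisfies the hypothesis since every derivative is $e^x \ge 0$. The helper name `remainder` and the trapezoid-rule quadrature are my own choices for illustration:

```python
import math

def remainder(f_deriv, a, b, n, steps=10_000):
    """Integral-form Taylor remainder
    R_n(b) = (1/n!) * integral_a^b f^{(n+1)}(t) (b-t)^n dt,
    approximated with the trapezoid rule."""
    h = (b - a) / steps
    total = 0.0
    for i in range(steps + 1):
        t = a + i * h
        w = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoint weights
        total += w * f_deriv(t) * (b - t) ** n
    return total * h / math.factorial(n)

# f(x) = e^x has f^{(n+1)}(x) = e^x, so pass math.exp as f^{(n+1)}.
a, b, n = 0.0, 1.0, 3
Rn = remainder(math.exp, a, b, n)
fb = math.exp(b)
print(Rn <= fb)  # the claimed inequality R_n(b) <= f(b)

# Cross-check against Taylor's theorem: f(b) - R_n(b) should equal
# the degree-n Taylor polynomial of e^x about a, evaluated at b.
taylor = sum(math.exp(a) * (b - a) ** k / math.factorial(k) for k in range(n + 1))
print(abs(fb - Rn - taylor) < 1e-6)
```

The cross-check confirms the decomposition $f(b) = \sum_{k=0}^n \frac{f^{(k)}(a)}{k!}(b-a)^k + R_n(b)$ that the argument above rests on.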
 
Yes, easy.

Thanks Euge!
 