# Proving convergence of a series to its generating function

## Homework Statement

The problem asks to use the Lagrange form of the remainder in Taylor's Theorem to prove that the Maclaurin series generated by $f(x) = xe^x$ converges to f. From the actual answer, I'm guessing it wants me to use the Remainder Estimation Theorem to accomplish this.

## Homework Equations

The Lagrange form of the remainder in Taylor's Theorem, $R_n(x)$, where a is the center of the Taylor series in an open interval I, c is some number between a and x in the interval I, $f^{(n+1)}$ is the (n+1)st derivative of f, and n is a positive integer:

$$R_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!}(x-a)^{n+1}$$

The other relevant equation, I would guess, is the Remainder Estimation Theorem. It states (word-for-word) that "if there are positive constants M and r such that $|f^{(n+1)}(t)| \leq Mr^{n+1}$ for all t between a and x, then the remainder $R_n(x)$ in Taylor's Theorem satisfies the inequality

$$|R_n(x)| \leq M \frac{r^{n+1}|x-a|^{n+1}}{(n+1)!}$$

If these conditions hold for every n and all other conditions of Taylor's Theorem are satisfied by f, then the series converges to f(x)."

## The Attempt at a Solution

I got the series by multiplying the series for $e^x$ by x: $$\sum_{n=0}^{\infty}\frac{x^{n+1}}{n!}$$ In my attempt to prove the convergence of this to f, I used the Remainder Estimation Theorem with r = 1, a = 0, and $f^{(n+1)}(t)$ = the (n+1)st derivative of $te^t$. Since r = 1, the condition becomes $|f^{(n+1)}(t)| \leq M$, so I just needed to find a bound for the (n+1)st derivative. This is where I got confused: I can't find the bound. Repeated application of the product rule gives $f^{(n+1)}(t) = te^t + (n+1)e^t$. On the interval [0,x], the bound would be the (n+1)st derivative evaluated at x (the right end of the interval), because this is a positive increasing function there. So, $M = xe^x + (n+1)e^x$.

However, M is supposed to be a positive constant, so it can't depend on n. On the interval [x,0], a similar thing occurs, except this time the bound would be n+1, because $M = 0\cdot e^0 + (n+1)e^0 = 0 + (n+1) = n+1$. Once again, this is a shifting bound.
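As a quick numerical sanity check on the derivative formula behind that bound (a throwaway sketch, not part of the proof; the function names, the choice n = 3, and the step size are my own), a central finite difference agrees with $f^{(n+1)}(t) = te^t + (n+1)e^t$:

```python
import math

def f(t):
    return t * math.exp(t)

def fourth_derivative(func, t, h=1e-2):
    # central finite-difference approximation of the 4th derivative
    return (func(t - 2*h) - 4*func(t - h) + 6*func(t)
            - 4*func(t + h) + func(t + 2*h)) / h**4

t = 1.0
n = 3                                                # so n + 1 = 4
claimed = t * math.exp(t) + (n + 1) * math.exp(t)    # t*e^t + (n+1)*e^t
approx = fourth_derivative(f, t)
# approx matches claimed to within roughly 1e-3
```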

Attempting to continue the proof by plugging this into the inequality of the Remainder Estimation Theorem despite the shifting bounds, I get

$$|R_n(x)| \leq \frac {xe^x + (n+1)(e^x)}{(n+1)!}|x|^{n+1} = \frac {xe^x|x|^{n+1} + (n+1)e^x|x|^{n+1}}{(n+1)!} = \frac{xe^x|x|^{n+1}}{(n+1)!} + \frac {(n+1)e^x|x|^{n+1}}{(n+1)!} = \frac {xe^x|x|^{n+1}}{(n+1)!} + \frac {e^x|x|^{n+1}}{n!}$$

The n+1 cancels, and so as $n \rightarrow \infty$ the right side of this inequality, and thus the remainder, goes to zero, which proves convergence on [0,x] (convergence on [x,0] follows by the same method from the "bound" found earlier). Assuming there is nothing wrong with this proof, that is the only way I can see to prove it.
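The vanishing of that right-hand side can also be seen numerically (a quick sketch; the function name and the sample values of x and n are my own choices):

```python
import math

def remainder_bound(x, n):
    # x*e^x*|x|^(n+1)/(n+1)! + e^x*|x|^(n+1)/n!, the right-hand side above
    return (x * math.exp(x) * abs(x)**(n + 1)) / math.factorial(n + 1) \
         + (math.exp(x) * abs(x)**(n + 1)) / math.factorial(n)

x = 3.0
bounds = [remainder_bound(x, n) for n in (5, 10, 20, 40)]
# the factorial eventually dominates the power, so the bound shrinks toward zero
```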

However, the correct answer gives this: $|f^{(n+1)}(t)| = te^t \leq Mr^{n+1}$. r = 1, so $M = te^t$. On [x,0], M = 0. On [0,x], $M = xe^x$. Thus $te^t \leq M$, and the Remainder Estimation Theorem is satisfied, which proves the convergence of the series to f.

And that is why I'm confused. (1) It claims that $M = xe^x$ on [0,x], (2) it equates $te^t$ to $|f^{(n+1)}(t)|$, and (3) it sets M = 0 on [x,0]. There are other similar problems in which the same thing happens (I have not attempted these again yet, though). The correct answer does all three of these confusing things for $h(x) = \sin^2(x)$ and $i(x) = \cos^2(x)$. On a third question, the correct answer sets M = 0 on [x,0] for $j(x) = \sin(x) - x + \frac{x^3}{6}$. For that last one, however, I can find an actual bound for the (n+1)st derivative just fine (it isn't M = 0, though).

I thought that there might be something similar going on with all of these problems and that by understanding the one I explained in detail, I could solve the rest.

I'd appreciate any help here. Is there something obvious I'm missing? I noticed that this same question was asked before at https://www.physicsforums.com/showthread.php?t=292812&highlight=remainder+estimation+theorem (with no attempt made, and definitely not as detailed as mine), and it was asked on another website without getting an answer.


**Dick** (Homework Helper):
It looks to me like you are only confused because the solutions are written sloppily. First let's restrict to a finite interval [-C,C] and think of C large. I think you would agree that |R_n(x)|<=|(C*e^C+(n+1)e^C)*C^(n+1)|/(n+1)!. I've replaced the x and c in x^(n+1) and f^(n+1)(c) with C, since both of these functions take on their maximum absolute value at the right endpoint (+C). That's easy enough to show. Now you just want to show lim n->infinity of |R_n(x)| is zero. Can you show that? If so then the series converges on [-C,C]. But C could be anything, so you've actually shown it converges for all x.
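For what it's worth, Dick's uniform bound on [-C,C] can also be checked numerically (a quick sketch; the function name and the choice C = 10 are my own):

```python
import math

def uniform_bound(C, n):
    # (C*e^C + (n+1)*e^C) * C^(n+1) / (n+1)!
    return (C * math.exp(C) + (n + 1) * math.exp(C)) \
           * C**(n + 1) / math.factorial(n + 1)

C = 10.0
vals = [uniform_bound(C, n) for n in (10, 30, 60, 100)]
# even on a large interval, the bound collapses to zero as n grows
```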

> Can you show that? If so then the series converges on [-C,C]. But C could be anything, so you've actually shown it converges for all x.
Yes, I can show that. If I understand you right, then I did prove it correctly (albeit very informally) by using $M = Ce^C + (n+1)e^C$, except I used x instead of C, correct? So, if that proof is correct, this means the constant M can "shift" its value? M can depend on n?

> It looks to me like you are only confused because the solutions are written sloppily.
You said "sloppily" instead of saying they were wrong. This implies that they are correct in a sense. I don't see how. Will you or someone else please explain this to me?

**Dick** (Homework Helper):
I'm not really sure what the "Remainder Estimation Theorem" is, so I'm not sure what 'M shifting its value' means. Sorry, maybe somebody else does. But if you can show that the remainder term goes to zero, then you have shown that the series converges. You quoted lines from the solution like |f^(n+1)(t)| = t*e^t. That's not right. That's what I meant by 'sloppily'.

> I'm not really sure what the "Remainder Estimation Theorem" is, so I'm not sure what 'M shifting its value' means.
I quoted the theorem word-for-word in the first post. Sorry, I guess I wasn't clear on that. All I meant by 'M shifting its value' is that although M is a constant, its value in this case depends on n. For example, its value for n = 1 (at x = 1) is $M = e + 2e = 3e$. Its value for n = 2 (at x = 2) is $M = 2e^2 + 3e^2 = 5e^2$. So, M's value will change with n as n goes to infinity. However, M is supposed to be a constant. That is why I asked if it could 'shift its value' and if M could depend on n.

> You quoted lines from the solution like |f^(n+1)(t)| = t*e^t. That's not right. That's what I meant by 'sloppily'.
Ah, I see now. I'd better solve those other ones then so that the solutions will be correct for future students! Thanks for clearing that up.

> But if you can show that the remainder term goes to zero, then you have shown that the series converges.
This implies, then, that M can 'shift its value' and 'depend on n' defined as I explained them in this post. If that's the case, then you have answered that question too.

Thanks for all the help!
