# Proof of Taylor series

1. Jan 22, 2007

### timm3r

I'm having a hard time understanding Taylor series: why it works and how it works. If someone could please explain it to me, that would be great. My teacher explained it in class, but he goes so fast that I have no idea what he's saying. He did give us some practice problems, but if I have no idea how it works I won't be able to do them. These are a couple of the problems he gave, with x = a = 0:
1. y = e^x
2. y = sin x
3. y = 1/(1-x)
4. y = ln(1-x)

Last edited: Jan 22, 2007
2. Jan 22, 2007

### Gib Z

OK, well, the theory behind it is actually VERY simple.

If we want to approximate a function, say, near x = 0, then this approximation should obviously have the same value at x = 0. And, preferably, the values at points around it should match too. To achieve this, the derivatives should also be the same. That's the simple theory.

For a better introduction, I have attached a PowerPoint presentation which will explain it better than I can. I renamed the extension to .pdf because PF wouldn't let me upload a file of this size otherwise... so when you get it, rename the extension to .zip and then extract it. Good luck!

#### Attached Files:

• Calc09_2day1.pdf (212.8 KB, 1,693 views)
3. Jan 22, 2007

### HallsofIvy

Staff Emeritus
Do you want a "proof" of the Taylor series (proof of what, exactly? You can't prove that the Taylor series sums to the original function, that's not always true!) or do you want to find the Taylor series of those functions?

There is also no "proof" that a Taylor polynomial (the Taylor series stopped at a particular finite power) is in any sense the "best" approximation; that also is not always true.

Think of it this way:
If we want the linear function whose value and derivative are the same as f's at some x = a, then we must have the tangent line there: y = f(a) + f'(a)(x-a).
If we want the second-degree (quadratic) function that has exactly the same value, first derivative, and second derivative at x = a, then we must have y'' = f''(a), so y' = f''(a)x + C. But y'(a) = f''(a)a + C = f'(a), so that C = f'(a) - f''(a)a:
y' = f''(a)x + f'(a) - f''(a)a = f''(a)(x-a) + f'(a). Integrating again,
$$y= \frac{f''(a)}{2}(x-a)^2+ f'(a)x+ C.$$
Since
$$y(a)= \frac{f''(a)}{2}(a-a)^2+ f'(a)a+ C= f'(a)a+ C= f(a),$$
C = f(a) - f'(a)a and
$$y(x)= \frac{f''(a)}{2}(x-a)^2+ f'(a)x+ f(a)- f'(a)a$$
$$= \frac{f''(a)}{2}(x-a)^2+ f'(a)(x-a)+ f(a)$$
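A quick numerical sanity check of this construction (my own sketch, not from the thread, using f(x) = e^x and a = 0 as a stand-in example): the quadratic polynomial above agrees with f at a, and its error shrinks roughly like |x - a|^3 as x approaches a.

```python
import math

# Sketch: check the quadratic Taylor polynomial derived above for the
# example f(x) = e^x at a = 0 (so f(a) = f'(a) = f''(a) = 1).
def f(x):
    return math.exp(x)

a = 0.0
fa = fpa = fppa = math.exp(a)  # f(a), f'(a), f''(a) all equal e^0 = 1

def p2(x):
    # y(x) = f''(a)/2 * (x-a)^2 + f'(a)*(x-a) + f(a)
    return fppa / 2 * (x - a) ** 2 + fpa * (x - a) + fa

# The error shrinks roughly like |x - a|^3 near a.
for h in (0.1, 0.01):
    print(h, abs(f(a + h) - p2(a + h)))
```

Shrinking h by a factor of 10 should shrink the error by roughly a factor of 1000, which is exactly the cubic behavior the next (unmatched) Taylor term predicts.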

Starting from y^(n)(x) = f^(n)(a) and integrating repeatedly, always requiring that y^(k)(a) = f^(k)(a) for k < n, you get the nth Taylor polynomial.

The Taylor series for any infinitely differentiable function, f(x), about x= a, is given by
$$\Sigma_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x- a)^n$$
where f^(n)(a) is the nth derivative of f evaluated at x = a. In particular, since you always have a = 0, the Taylor series is
$$\Sigma_{n=0}^\infty \frac{f^{(n)}(0)}{n!}x^n$$

Now the important question! Do you know how to differentiate those functions? In particular can you see a pattern and guess the formula for the nth derivative at x= 0?
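For anyone who wants to check their guessed patterns numerically, here is a sketch (my own illustration, not from the thread). It assumes the n-th-derivative-at-0 patterns for the four practice functions: 1 for e^x; the repeating cycle 0, 1, 0, -1 for sin x; n! for 1/(1-x); and -(n-1)! for ln(1-x) when n >= 1.

```python
import math

# Sketch: partial Maclaurin sums for the four practice problems, built from
# the (assumed) patterns of n-th derivatives at 0 stated above.
def maclaurin(deriv_at_0, x, terms=15):
    # sum_{n=0}^{terms-1} f^(n)(0) / n! * x^n
    return sum(deriv_at_0(n) / math.factorial(n) * x ** n for n in range(terms))

x = 0.3  # inside the radius of convergence of all four series
approx_exp = maclaurin(lambda n: 1.0, x)                    # e^x
approx_sin = maclaurin(lambda n: [0, 1, 0, -1][n % 4], x)   # sin x
approx_geo = maclaurin(lambda n: math.factorial(n), x)      # 1/(1-x)
approx_log = maclaurin(lambda n: 0 if n == 0 else -math.factorial(n - 1), x)  # ln(1-x)

print(approx_exp - math.exp(x))      # all four differences are tiny
print(approx_log - math.log(1 - x))
```

Fifteen terms already match the library functions to many digits at x = 0.3; the geometric and log series would converge much more slowly as x approaches 1.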

4. Jan 22, 2007

### timm3r

OK, thanks, you guys explained it a lot better for me.

5. Dec 5, 2008

### CylonMath

The attachment isn't working. And I think he (and I) want the proof of the Taylor series. I mean, how do we derive the formula
$$\Sigma_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x- a)^n?$$

6. Dec 5, 2008

### Linear Space

I'll show you how to get the Taylor series. First, start with a power series about x=a:

$$f(x) = \sum\limits_{n = 0}^\infty {a_n (x - a)^n } = a_0 + a_1 (x - a) + a_2 (x - a)^2 + \cdot \cdot \cdot + a_n (x - a)^n + \cdot \cdot \cdot$$

Differentiate term by term...

$$\begin{gathered} f(x) = a_0 + a_1 (x - a) + a_2 (x - a)^2 + \cdot \cdot \cdot + a_n (x - a)^n + \cdot \cdot \cdot \hfill \\ f'(x) = a_1 + 2a_2 (x - a) + 3a_3 (x - a)^2 + \cdot \cdot \cdot + na_n (x - a)^{n - 1} + \cdot \cdot \cdot \hfill \\ f''(x) = 1 \cdot 2a_2 + 2 \cdot 3a_3 (x - a) + 3 \cdot 4a_4 (x - a)^2 + \cdot \cdot \cdot \hfill \\ f'''(x) = 1 \cdot 2 \cdot 3a_3 + 2 \cdot 3 \cdot 4a_4 (x - a) + 3 \cdot 4 \cdot 5a_5 (x - a)^2 + \cdot \cdot \cdot \hfill \\ \end{gathered}$$

Then realize:

$$f^{(n)} (x) = n!a_n + \cdot \cdot \cdot$$

The dots represent a sum of terms that each have (x - a) as a factor, so they all vanish when we set x = a. That gives us a_n:

$$a_n = \frac{{f^{(n)} (a)}} {{n!}}$$

Now plug this result into our original power series about x=a to get the Taylor series of a function:

$$\sum\limits_{n = 0}^\infty {\frac{{f^{(n)} (a)}} {{n!}}} (x - a)^n$$
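This coefficient-matching step can be checked mechanically for a finite power series. A sketch (my own, not from the thread): represent a polynomial by its coefficients in powers of (x - a), implement term-by-term differentiation as a coefficient shift, and observe that n differentiations followed by setting x = a (reading off the constant term) leave exactly n! * a_n.

```python
import math

def differentiate(coeffs):
    # term-by-term derivative: sum c_k (x-a)^k  ->  sum (k+1) c_{k+1} (x-a)^k
    return [(k + 1) * c for k, c in enumerate(coeffs[1:])]

coeffs = [2.0, -1.0, 0.5, 3.0]  # arbitrary a_0 .. a_3
for n in range(len(coeffs)):
    d = coeffs
    for _ in range(n):
        d = differentiate(d)
    # at x = a only the constant term survives: f^(n)(a) = n! * a_n
    assert d[0] == math.factorial(n) * coeffs[n]
print("a_n = f^(n)(a) / n! holds for this polynomial")
```

The coefficient-shift in `differentiate` is exactly the term-by-term differentiation written out in the post above, just truncated to finitely many terms.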

7. Dec 5, 2008

### HallsofIvy

Staff Emeritus
You don't "prove" it- that is the definition of the Taylor's series for a function having all derivatives.

If you mean, "prove that it converges to the original function f(x) for all x within the radius of convergence", you can't; it isn't true. That is only true for "analytic" functions which, again, are defined as functions for which that is true!

For example, the function
$$f(x)= e^{-1/x^2} \text{ for } x\ne 0, \qquad f(0)= 0,$$
has all derivatives, and all of its derivatives at x = 0 are 0. That means its Taylor's series about x = 0 is simply 0 + 0x + 0x^2 + ... = 0, which is not equal to f(x) for any non-zero x.
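A numerical look at this counterexample (my own sketch): near 0 the function is astronomically flat, so difference-quotient slopes at 0 collapse to zero, yet the function itself is strictly positive for every x != 0, so the all-zero Taylor series can never equal it away from 0.

```python
import math

# The classic flat function: f(x) = exp(-1/x^2) for x != 0, f(0) = 0.
def f(x):
    return 0.0 if x == 0 else math.exp(-1.0 / x ** 2)

# Forward-difference estimates of f'(0) rush to zero (in fact every
# derivative of f at 0 is exactly 0)...
for h in (0.5, 0.25, 0.1):
    print(h, f(h) / h)

# ...yet f is strictly positive away from 0, so the zero Maclaurin
# series never equals f at any nonzero x.
print(f(0.5) > 0.0)
```

Already at h = 0.1 the value f(h) = e^(-100) is around 10^-44, which is why every finite-difference slope, of every order, vanishes in the limit.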

(Linear Space's "proof" starts by assuming that there exists a power series equal to the function. What he showed was that if that is true, then the Taylor's series is that power series.)

8. Apr 6, 2009

### saraaaahhhhhh

I am actually looking for a proof as well.

You say that is the definition of the Taylor series, but how does one prove that if a function F is analytic, it can be represented by a power series of the form
$$\Sigma^{\infty}_{n=0}a_nz^n$$
where
$$a_n = f^{(n)}(0)/n!$$

My teacher recommended a method: 'pick z, use the Cauchy integral formula to compute f(z) as an integral, expand the integrand in a geometric series with ratio z/zeta, integrate term by term, and use the CIF again to identify the integrals as a_n.'

But, I'm not sure exactly what is meant by 'pick z.' Or how to do this, really.

Thanks!

Last edited: Apr 6, 2009
9. Apr 7, 2009

### CylonMath

Still curious

10. Apr 7, 2009

### HallsofIvy

Staff Emeritus
For the third or fourth time now, you don't "prove" a definition!

And "can be represented by a power series of the form
$$\Sigma^{\infty}_{n=0}a_nz^n$$"
in some neighborhood of 0 (or z0 if you use (z- z0)^n) is the definition of "analytic at 0" (or z0).

Once you have
$$f(z)= \Sigma^{\infty}_{n=0}a_n(z- z_0)^n$$
in some neighborhood of z_0, taking z = z_0 makes all but the 0th term vanish and gives f(z_0) = a_0.
Differentiating term by term gives
$$f'(z)= \Sigma^{\infty}_{n=1}na_n(z- z_0)^{n-1}$$
and setting z = z_0 gives
f'(z_0) = a_1, etc.

11. Apr 7, 2009

### n!kofeyn

Were your questions actually answered? Have you been able to find the Taylor series of those functions?

Also, the Taylor series at a = 0 is given a special name: the Maclaurin series.

12. Apr 8, 2009

### n!kofeyn

I think it would be best to start a new thread about this, as your problem is within complex analysis, where the original poster seems to be in a calculus 2 course.

In complex analysis, this is not the definition of an analytic function. For example, look within Conway's Functions of One Complex Variable I on p. 34.

A function $f:G\to \mathbb{C}$ is analytic if f is continuously differentiable on G.

Later it is proven that if f is analytic then it has a power series representation, with a formula for the coefficients. This is what saraaaahhhhhh is referring to. It is also a theorem in complex analysis that a function with a power series representation is analytic, but these are not the definitions, as you imply they are. So yes, the statement that saraaaahhhhhh gave CAN be proven.

Also, look in Gamelin's Complex Analysis on p. 45.