# Can someone help me understand Taylor and MacLaurin series?

1. Nov 30, 2004

### Wee Sleeket

I am having difficulty understanding Taylor and MacLaurin series. I need someone to go through step by step and explain a problem from beginning to end. You could use the function f(x) = cos x. Also, could someone find the MacLaurin series of 1/(x^2 + 4) ? I just don't understand the basics of evaluating these problems. If someone could help, that'd be great

2. Nov 30, 2004

### mathwonk

a maclaurin series is an infinite polynomial that "converges" to your function. there is a theorem that there is at most one such series, so any one you find by hook or crook must be it.

the most natural, best possible series in the world is the geometric series.

i.e. a + ar + ar^2 + ar^3 +..... = a/(1-r). (true for |r| < 1.)
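Just to see that identity in action, here is a two-line numerical check (a Python sketch; the particular values of a and r are arbitrary):

```python
# check a + ar + ar^2 + ... = a/(1-r) numerically, for some |r| < 1
a, r = 1.0, 0.3
partial = sum(a * r**n for n in range(100))  # first 100 terms
print(partial, a / (1 - r))  # both ~1.4285714
```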

Your problem example is so easy it actually equals a geometric series. just change

1/(4+x^2) to (1/4) [1/(1 + (x/2)^2)] = (1/4) [1/(1 - (-(x/2)^2))], a geometric series with r = -(x/2)^2, and then you get the series

(1/4) [ 1 - (x/2)^2 + (x/2)^4 - (x/2)^6 + - .....].
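Numerically, the partial sums of that series do match the function inside the radius of convergence |x| < 2. A Python sketch (the helper names are mine):

```python
def f(x):
    return 1.0 / (4.0 + x**2)

def maclaurin_partial(x, n_terms):
    # (1/4) * [1 - (x/2)^2 + (x/2)^4 - ...], first n_terms terms
    return 0.25 * sum((-1)**n * (x / 2.0)**(2 * n) for n in range(n_terms))

print(f(0.5), maclaurin_partial(0.5, 20))  # both ~0.235294
```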

Then there is a theorem that you can differentiate a convergent maclaurin series term by term, so the only possible series representing f is

f(0) + f'(0)x + [f''(0)/2!] x^2 + [f'''(0)/3!] x^3 +......

But as in your case above, taking derivatives is not always the easiest way to proceed.
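For the cos x example the original question asked about, though, the derivative recipe does work nicely: the derivatives of cos at 0 just cycle through 1, 0, -1, 0. A Python sketch summing the series that way (the function name is my own):

```python
import math

def cos_maclaurin(x, n_terms):
    # f(0) + f'(0)x + [f''(0)/2!]x^2 + ...; the derivatives of cos at 0
    # cycle through 1, 0, -1, 0, so only even powers survive
    cycle = [1.0, 0.0, -1.0, 0.0]
    return sum(cycle[k % 4] / math.factorial(k) * x**k for k in range(n_terms))

print(cos_maclaurin(1.0, 16), math.cos(1.0))  # the two agree closely
```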

Remark: maclaurin had nothing to do with these things, according to reliable scholars, so let's call them all taylor series. i.e. maclaurin rediscovered a special case of taylor series, decades after taylor had already explained them in general.

try reading courant's calculus book. or send me your email and i will send you my notes on the topic.

3. Nov 30, 2004

good old Courant!

4. Nov 30, 2004

### Wee Sleeket

I think I'm ok with finding the terms of the series... I just don't know how to notate the final answer. How would you write the final Taylor series for each one?

5. Dec 1, 2004

### HallsofIvy

IF you can find a general form for f^(n)(x0) (the nth derivative at x0), then the Taylor series is just what Mathwonk said:
f(x0) + f'(x0)(x-x0) + [f''(x0)/2!] (x-x0)^2 + [f'''(x0)/3!] (x-x0)^3 +......
with the general term being [f^(n)(x0)/n!] (x-x0)^n.

As Mathwonk said, any power series that converges to the same function must have exactly the same coefficients, so however you find the coefficients, you have the Taylor series.
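To answer the notation question directly: for the two functions in this thread, the final answers can be written in sigma notation (the cos x series comes from the derivative formula above, the other from the geometric series):

```latex
\cos x = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}\, x^{2n}
       = 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots \qquad (\text{all } x)

\frac{1}{x^2+4} = \sum_{n=0}^{\infty} \frac{(-1)^n}{4^{n+1}}\, x^{2n}
                = \frac{1}{4} - \frac{x^2}{16} + \frac{x^4}{64} - \cdots \qquad (|x| < 2)
```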

By the way, one thing Mathwonk said might be misleading:
"a maclaurin series is an infinite polynomial that "converges" to your function. there is a theorem that there si at most one such series, so any one you find by hook or crook must be it."

It is quite possible for a function f to be infinitely differentiable and have a Taylor or Maclaurin series that converges for all x but doesn't converge to the function f itself! An example is f(x) = exp(-1/x^2) if x is not 0, and f(0) = 0. One can show that f is continuous and, indeed, infinitely differentiable at x = 0. All derivatives at 0 are equal to 0, so its Maclaurin series is just $\Sigma 0 \cdot x^n$, which is equal to 0 for all x and not to f(x).
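A quick numeric illustration of that example (a Python sketch; it only evaluates the function, it doesn't prove the derivative claims):

```python
import math

def f(x):
    # smooth everywhere, but non-analytic at 0: all of its Maclaurin
    # coefficients are 0, yet f(x) > 0 whenever x != 0
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

for x in (1.0, 0.5, 0.2, 0.0):
    print(x, f(x))  # the Maclaurin series would predict 0 at every x
```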

Functions whose Taylor series at some point converges to f(x) in some neighborhood of that point are called "analytic". Those are just about all of the functions we work with.