Understanding Taylor Series Approximations

j-lee00
When it says "about a point x = a", what does this mean? Why not just say at x = a?

Thanks
 
Because x is a variable. Saying that the Taylor series is "about the point x = a" means its interval of convergence is centered on a:
\sum_{n=0}^\infty a_n (x - a)^n.
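
For a concrete illustration (the standard exponential example): expanding e^x about a = 0 and about a = 1 gives two different series, each centered on its own point:

e^x = \sum_{n=0}^\infty \frac{x^n}{n!} (about a = 0),
e^x = \sum_{n=0}^\infty \frac{e}{n!} (x - 1)^n (about a = 1).

Both happen to converge for all x, but the partial sums of the second approximate e^x best near x = 1 rather than near x = 0.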
 
Because one is looking at a neighbourhood of a, say all x that satisfy |x - a| < δ for some (small) number δ > 0.
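
A concrete case: the series for ln x about a = 1,

ln x = \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n} (x - 1)^n,

has radius of convergence 1, so it only represents ln x for x in the neighbourhood 0 < x ≤ 2; outside that interval the series diverges.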
 
why can be written taylor series?
 
Despite the question mark, that is not a question.
 
Landau said:
Despite the question mark, that is not a question.

Can we ask it like this: how does the Taylor series work?
Thanks
 
Landau said:
Because one is looking at a neighbourhood of a, say all x that satisfy |x - a| < δ for some (small) number δ > 0.

HallsofIvy said:
Because x is a variable. Saying that the Taylor series is "about the point x = a" means its interval of convergence is centered on a:
\sum_{n=0}^\infty a_n (x - a)^n.

Thanks
 
alpagut said:
Can we ask it like this: how does the Taylor series work?
Thanks
One way to look at it is this: how can we best approximate a function, given information about it at a single point?

If the only thing we know is that f(a) = A, then the simplest thing to do is to approximate f(x) by the constant A, and there is no reason to think that any more complicated formula would give a better approximation.

If we know that f(a) = A and f'(a) = B, then we can approximate f by the linear function satisfying those properties: y = A + B(x - a).

If we know that f(a) = A, f'(a) = B, and f''(a) = C, the simplest function having those properties is y = A + B(x - a) + (C/2)(x - a)^2.

Continuing in that way gives the successive "Taylor polynomials". For especially "nice" functions, we can extend that to an infinite power series, the "Taylor series".
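
For instance, with f(x) = e^x and a = 0 we have f(0) = f'(0) = f''(0) = 1, so the successive approximations are

y = 1,
y = 1 + x,
y = 1 + x + x^2/2,

and continuing forever gives the full Taylor series e^x = \sum_{n=0}^\infty \frac{x^n}{n!}.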

(But be careful: even if a function is infinitely differentiable, so that we can form the Taylor series, it can happen that the Taylor series does not converge to the function at more than a single point.)
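
The standard example is f(x) = e^{-1/x^2} for x ≠ 0, with f(0) = 0. It is infinitely differentiable, every one of its derivatives at 0 equals 0, so its Taylor series about a = 0 is identically zero and agrees with f only at the single point x = 0.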
 
HallsofIvy said:
One way to look at it is this: how can we best approximate a function, given information about it at a single point? ...

Thank you!
 
