dawn_pingpong

## Homework Statement

When does a Taylor series not converge to its original function? The commonly given example is

[tex]e^\frac{-1}{x^2}[/tex]

But I really don't get how it works. For example, if we expand the series around the point x = 1, then

[broken Wolfram Alpha image: the series expansion of e^{-1/x^2} about x = 1]

Is this the series? But then the value at the point x = 0 is undefined, since the denominator x cannot be zero. I really don't understand how this works. Thank you!
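The behaviour behind this standard example can be checked symbolically. A minimal sketch (assuming Python with sympy, not part of the original post): every derivative of e^{-1/x^2} tends to 0 as x approaches 0, so if f(0) is defined to be 0, the Maclaurin series is identically zero even though the function itself is not.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)

# Each derivative is (polynomial in 1/x) * exp(-1/x^2),
# and the exponential decay wins: every limit at 0 is 0.
limits = [sp.limit(sp.diff(f, x, n), x, 0) for n in range(4)]
print(limits)

# So with f(0) := 0, all Taylor coefficients at 0 vanish,
# yet f is nonzero away from 0, e.g. at x = 1/2:
print(f.subs(x, sp.Rational(1, 2)))  # exp(-4), not 0
```

This is why the series fails to converge to the function: the Maclaurin series is 0 everywhere, which only matches f at the single point x = 0.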
