Taylor Series of exp(-1/x^2) about x = 0

YayMathYay

Homework Statement



[Image WrqOG.png: problem statement (Taylor series of f(x) = exp(-1/x^2) about x = 0)]


Homework Equations



We just learned basic Taylor Series expansion about C,

$$f(x) = f(C) + f'(C)(x - C) + \frac{f''(C)}{2!}(x - C)^2 + \cdots$$
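
If the expansion is taken about C = 0 (an assumption here; that is the case treated in the replies below), the template specializes to

$$f(x) = f(0) + f'(0)\,x + \frac{f''(0)}{2!}\,x^2 + \cdots$$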

The Attempt at a Solution



Well, the previous few questions involved finding the limit of the function, and of its derivative, as x approaches zero.

I thought this would be as simple as plugging in 0 for x in the Taylor series expansion, but it's clear that the function cannot be evaluated at x = 0. I'm betting it has to do with the previous parts (finding the derivative of the function, etc.), but I'm lost as to where to start.

Help would be much appreciated! :)

EDIT: I also thought about using the limit as x -> 0 for the value of the function, but then the whole Taylor series would just equal zero, and I felt like this wasn't the right answer. If it is, well... I feel dumb.
 
YayMathYay said:
I thought this would be as simple as plugging in 0 for x in the Taylor series expansion, but it's clear that the function cannot be evaluated at x = 0. [...]

In your previous exercises it must have been given that the function's value is defined to be 0 at x = 0. And you would have worked out f'(0) (not the value of f' as x approaches 0, but the value of f' at 0), and maybe f''(0). What did you get for them?

For this problem you would plug in 0 for the C in the Taylor expansion, not for the x. And you would use the values f(0) (given as 0), f'(0), f''(0), etc., that you previously calculated to fill in the rest of the Taylor series. If the whole thing comes out 0, that would be interesting, eh?
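
As a sketch of what that would look like (assuming every derivative of f at 0 does turn out to be 0, which is where this is heading), the expansion about C = 0 would collapse to

$$0 + 0\cdot x + 0\cdot x^2 + \cdots = 0 \quad\text{for all } x,$$

even though f(x) = e^{-1/x^2} > 0 for every x ≠ 0; this is the classic example of a smooth function that is not equal to its Taylor series near 0.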
 
LCKurtz said:
For this problem you would plug in 0 for the C in the Taylor expansion, not for the x. [...]

Well, here are the previous parts:

[Image Do87G.png: the previous parts of the exercise]


My answers were 0, ##2x^{-3}e^{-1/x^2}##, and 0, respectively. So from that, can I just assume that f(0) = f'(0) = 0?
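
As a quick check, that middle answer is just the chain rule applied for x ≠ 0:

$$f'(x) = e^{-1/x^2}\cdot\frac{d}{dx}\!\left(-\frac{1}{x^2}\right) = \frac{2}{x^3}\,e^{-1/x^2}.$$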
 
No. You have to be given, one way or another, that f(0) = 0. It is usually given as part of the definition of the function. Or you could require that f be continuous at 0 and define f(0) as ##\lim_{x\rightarrow 0} f(x)##, which would be 0, as you have shown. Then you would show$$
\lim_{h\rightarrow 0}\frac {f(h)-f(0)}{h-0}=0,$$which would prove f'(0) = 0. Then you would do the same kind of limit with f'(x) to show f''(0) = 0. It is usually given as an induction problem to show that all derivatives are 0 at 0. But you have to start with f(0) = 0 somehow.
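
One way to see that first limit (a sketch; substituting u = 1/h is just one route): with f(h) = e^{-1/h^2} and f(0) = 0,

$$\lim_{h\rightarrow 0}\frac{e^{-1/h^2}}{h} = \lim_{|u|\rightarrow\infty} u\,e^{-u^2} = 0,$$

since the exponential factor decays faster than any power of u grows.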
 
LCKurtz said:
You have to be given, one way or another, that f(0) = 0. It is usually given as part of the definition of the function. [...]

Well I'm at a loss then, since we aren't given that :T
Perhaps just something the professor overlooked, I guess.
 
YayMathYay said:
Well I'm at a loss then, since we aren't given that :T [...]

Well, if I were you, I would at this point just take f(0) = 0. You could work out the limit I suggested in post #4 and prove f'(0) = 0. Since you have already calculated f'(x) for ##x\ne 0##, you could do the same with that to get f''(0). Then you would have the first two or three terms of your Taylor series and wouldn't go to class empty-handed. Then you can ask your Prof about it. I will be curious to hear what happens.

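A sketch of that next step (assuming f'(0) = 0 has been established by the limit above): using the already-computed ##f'(x) = 2x^{-3}e^{-1/x^2}## for x ≠ 0,

$$f''(0) = \lim_{h\rightarrow 0}\frac{f'(h)-f'(0)}{h} = \lim_{h\rightarrow 0}\frac{2}{h^{4}}\,e^{-1/h^2} = 0,$$

again because the exponential term beats any negative power of h as h → 0.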
 
LCKurtz said:
Well, if I were you, I would at this point just take f(0) = 0. You could work out the limit I suggested in post #4 and prove f'(0) = 0. [...]

I think I will do that. Thanks a lot for your help!
 