1. Nov 4, 2012

### YayMathYay

1. The problem statement, all variables and given/known data

2. Relevant equations

We just learned basic Taylor Series expansion about C,

$$f(x) = f(C) + f'(C)(x - C) + \frac{f''(C)(x - C)^2}{2} + \cdots$$

3. The attempt at a solution

Well, the previous few questions involved finding the limit of the function and of its derivative as x approaches zero.

I thought this would be as simple as plugging in 0 for x in the Taylor series expansion, but it's clear that the function cannot be evaluated at x = 0. I'm betting it has to do with the previous parts (finding the derivative of the function, etc.), but I'm lost as to where to start.

Help would be much appreciated! :)

EDIT: I also thought about just using the limit as x → 0 for the value of the function, but then the whole Taylor series would just equal zero, and I felt like this wasn't the right answer? If it is, well.. I feel dumb.

Last edited: Nov 4, 2012
2. Nov 4, 2012

### LCKurtz

In your previous exercises it must have been given that the function's value is defined to be 0 at x = 0. And you would have worked out f'(0) (not the value of f' as x approaches 0, but the value of f' at 0), and maybe f''(0). What did you get for them?

For this problem you would plug in 0 for the C in the Taylor expansion, not for the x. And you would use the values f(0) (given as 0), f'(0), f''(0), etc., that you previously calculated to fill in the rest of the Taylor series. If the whole thing comes out 0, that would be interesting, eh?

3. Nov 4, 2012

### YayMathYay

Well, here are the previous parts..

My answers were 0, 2x^(-3)exp(-1/x^2), and 0, respectively.. so from that, can I just assume that f(0) = f'(0) = 0?
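(For what it's worth, those answers can be double-checked symbolically. A quick sympy sketch, assuming the function in question is f(x) = exp(-1/x^2), which is consistent with the derivative quoted above:)

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1/x**2)        # assumed form of f for x != 0

fprime = sp.diff(f, x)
print(sp.simplify(fprime))      # 2*exp(-1/x**2)/x**3, matching the answer above
print(sp.limit(f, x, 0))        # 0
print(sp.limit(fprime, x, 0))   # 0
```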

4. Nov 4, 2012

### LCKurtz

No. You have to be given, one way or another, that f(0) = 0. It is usually given as part of the definition of the function. Or you could require that f be continuous at 0 and define f(0) as $\lim_{x\rightarrow 0} f(x)$, which would be 0, as you have shown. Then you would show$$\lim_{h\rightarrow 0}\frac {f(h)-f(0)}{h-0}=0$$which would prove f'(0) = 0. Then you would do the same kind of limit with f'(x) to show f''(0) = 0. It is usually given as an induction problem to show all derivatives are 0 at 0. But you have to start with f(0) = 0 somehow.
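(Concretely, if the function here is $f(x)=e^{-1/x^2}$, which matches the derivative quoted earlier in the thread, that difference-quotient limit works out with the substitution $t = 1/h$:

$$\lim_{h\rightarrow 0^+}\frac{e^{-1/h^2}-0}{h-0}=\lim_{t\rightarrow \infty} t\,e^{-t^2}=0,$$

since the exponential beats any power of $t$; the quotient $e^{-1/h^2}/h$ is odd in $h$, so the limit from the left is 0 as well.)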

5. Nov 4, 2012

### YayMathYay

Well I'm at a loss then, since we aren't given that :T
Perhaps just something the professor overlooked, I guess.

6. Nov 4, 2012

### LCKurtz

Well, if I were you, I would at this point assume that and take f(0) = 0. You could work out the limit I suggested in post #4 and prove f'(0) = 0. Since you have already calculated f'(x) for $x\ne 0$, you could do the same with that to get f''(0). Then you would have the first two or three terms of your Taylor series and won't go to class empty-handed. Then you can ask your prof about it. I will be curious to hear what happens.
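(That plan can be sanity-checked with sympy. Again assuming $f(x)=e^{-1/x^2}$ for $x\ne 0$, inferred from the derivative quoted earlier, the two difference-quotient limits both come out 0:)

```python
import sympy as sp

h, x = sp.symbols('h x')
f = sp.exp(-1/x**2)      # assumed form of f for x != 0
fprime = sp.diff(f, x)   # 2*exp(-1/x**2)/x**3

# f'(0) from the difference quotient, taking f(0) = 0:
print(sp.limit(f.subs(x, h) / h, h, 0))        # -> 0
# f''(0) from the difference quotient, taking f'(0) = 0:
print(sp.limit(fprime.subs(x, h) / h, h, 0))   # -> 0
```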

 Edited a bit.

7. Nov 4, 2012

### YayMathYay

I think I will do that. Thanks a lot for your help!