Poopsilon
The problem I am having trouble with is Exercise #1 in Chapter 8 of Rudin's Principles of Mathematical Analysis. For those of you who don't own the book the problem states:
Define f(x) = exp(-1/x^2) for x ≠ 0, and f(x) = 0 for x = 0.
Prove that f has derivatives of all orders at x = 0, and that the nth derivative f^(n)(0) is equal to zero for all n = 1, 2, 3, ... .
There is a theorem in the book stating that a power series convergent in some interval has derivatives of all orders on that interval. Clearly exp(-1/x^2) has a convergent power-series expansion on the intervals (-R, 0) and (0, R), but I'm confused about how to approach proving statements about the derivative at zero.
Now both f(x) = exp(-1/x^2) and its first derivative f'(x) = (2/x^3)exp(-1/x^2) go to 0 as x -> 0 (from both directions). So, since f(0) = 0, does this mean I can 'stitch' 0 into the original function and conclude that the derivative at x = 0 is 0? If that is sufficient, what would be a good way to generalize it to all n = 1, 2, 3, ...? Thanks.
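As a quick numerical sanity check (a sketch, not a proof), the derivative at 0 has to come from the difference quotient (f(h) - f(0))/h = exp(-1/h^2)/h rather than from plugging 0 into f', and tabulating that quotient for small h suggests it does vanish:

```python
import math

def f(x):
    # The function from the exercise: exp(-1/x^2) for x != 0, and 0 at x = 0.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# Difference quotient at 0: (f(h) - f(0)) / h = exp(-1/h^2) / h.
# The exponential decay crushes the 1/h blow-up, so this tends to 0.
for h in [0.5, 0.1, 0.05, 0.01]:
    print(h, (f(h) - f(0)) / h)
```

This only illustrates the limit; the actual proof would evaluate lim_{h->0} exp(-1/h^2)/h directly (e.g. by substituting t = 1/h).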