Taylor series / 2nd deriv test

jesuslovesu
Homework Statement



Use the Taylor series about x = a to verify the second derivative test for a max or min: show that if f'(a) = 0, then f''(a) > 0 implies a minimum at x = a. Hint: for a minimum you must show that f(x) > f(a) for all x near enough to a.

Homework Equations


The Attempt at a Solution


f(x) = a0 + a1(x-a) + a2(x-a)^2 + ...
f'(x) = a1 + 2a2(x-a) + ...
f''(x) = 2a2 + 6a3(x-a) + ...

f'(a) = a1 and f''(a) = 2a2
if a1 = 0 then x = a is a critical point, so it could be a max or a min
but I don't quite know what I should do to show that if f''(a) > 0 or < 0 that the point will be a max or min.
Should I do a limit?
\lim_{x \to a} a_0 + a_1 (x-a) + a_2(x-a)^2 + \dots > a_0 ?
 
At x = a your Taylor series has no a1 term, since f'(a) = 0. So your series is just f(a) + (f''(a)/2)(x-a)^2 + ... You can ignore the higher-order terms if x is 'near enough' to a.
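As a quick numerical sanity check of this (my own example, not from the thread): take f(x) = cosh(x), which has f'(0) = 0 and f''(0) = 1 > 0, so x = 0 should be a minimum, and near 0 the function should be well approximated by the quadratic truncation f(a) + (f''(a)/2)(x-a)^2.

```python
import math

# Example function (my choice for illustration): f(x) = cosh(x),
# with f'(0) = 0 and f''(0) = cosh(0) = 1 > 0, so x = 0 is a minimum.
f = math.cosh
a = 0.0
f2a = 1.0  # f''(a) = cosh(0) = 1

def taylor2(x):
    # Taylor series about a, truncated after the quadratic term
    # (the a1 term vanishes because f'(a) = 0):
    return f(a) + 0.5 * f2a * (x - a) ** 2

for x in (a - 0.1, a - 0.01, a + 0.01, a + 0.1):
    # f(x) > f(a) for x near a: exactly the hint's condition for a minimum.
    assert f(x) > f(a)
    # The ignored higher-order terms really are small near a:
    assert abs(f(x) - taylor2(x)) < abs(x - a) ** 3

print("second derivative test confirmed numerically at x = 0")
```

The second assertion is the "near enough" part of the argument: the error of the quadratic truncation shrinks faster than (x-a)^2 itself, so the quadratic term dominates close to a.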
 
jesuslovesu said:
Should I do a limit?
\lim_{x \to a} a_0 + a_1 (x-a) + a_2(x-a)^2 + \dots > a_0 ?
Yes, if a is a max or min for f(x), then f'(a) = 0. That means the Taylor series for f(x) is a_0 + a_2(x-a)^2 + higher-order terms. Close to a, (x-a)^3 is very small compared to (x-a)^2: if x - a = 0.001, then (x-a)^2 = (0.001)^2 = 0.000001 while (x-a)^3 = (0.001)^3 = 0.000000001, so (x-a)^2 is 1000 times as large as (x-a)^3. Ignoring higher-power terms, f(x) = f(a) + a_2(x-a)^2. Of course, (x-a)^2 is never negative, and for x not equal to a it is not 0. Whether f(x) is f(a) plus a positive number or f(a) minus a positive number depends entirely on whether a_2 is positive or negative, which in turn depends on whether f''(a) is positive or negative.
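This "ignore the higher terms" step can be made precise with the Peano remainder form of Taylor's theorem (a standard sharpening; the thread itself only drops the terms informally). With f'(a) = 0:

```latex
f(x) = f(a) + \frac{f''(a)}{2}(x-a)^2 + o\big((x-a)^2\big)
\quad\Longrightarrow\quad
f(x) - f(a) = (x-a)^2\left[\frac{f''(a)}{2} + o(1)\right].
```

For all x close enough to a the bracket has the sign of f''(a)/2, and (x-a)^2 > 0 for x ≠ a. So f''(a) > 0 forces f(x) > f(a) for all x near enough to a, which is exactly the hint's condition for a minimum; f''(a) < 0 gives f(x) < f(a), a maximum.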
 