Taylor series / 2nd deriv test

SUMMARY

The discussion focuses on using the Taylor series expansion about x = a to verify the second derivative test for identifying local maxima and minima. Specifically, it establishes that if f'(a) = 0 and f''(a) > 0, then x = a is a local minimum. The Taylor series is written as f(x) = a0 + a1(x-a) + a2(x-a)^2 + ..., so that f'(a) = a1 and f''(a) = 2a2. The analysis emphasizes that for x near enough to a, the higher-order terms can be neglected, reducing f(x) near the critical point to f(a) + a2(x-a)^2.
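In the notation above, the Taylor coefficients are determined by the derivatives of f at a (a standard fact used implicitly throughout the thread):

$$a_k = \frac{f^{(k)}(a)}{k!}, \qquad \text{so } a_0 = f(a), \quad a_1 = f'(a), \quad a_2 = \tfrac{1}{2}f''(a).$$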

PREREQUISITES
  • Understanding of Taylor series expansion
  • Knowledge of first and second derivatives
  • Familiarity with limits and continuity
  • Basic calculus concepts related to maxima and minima
NEXT STEPS
  • Study the implications of Taylor series in approximating functions
  • Learn about the application of limits in calculus
  • Explore the relationship between higher-order derivatives and function behavior
  • Investigate other methods for finding local extrema, such as the first derivative test
USEFUL FOR

Students studying calculus, particularly those focusing on optimization problems, mathematicians interested in function behavior analysis, and educators teaching derivative tests for extrema.

jesuslovesu
Homework Statement



Use the Taylor series about x = a to verify the second derivative test for a max or min. Show if f'(a) = 0 then f''(a) > 0 implies a min point at x = a ... Hint for a min point you must show that f(x) > f(a) for all x near enough to a.

Homework Equations


The Attempt at a Solution


f(x) = a0 + a1(x-a) + a2(x-a)^2 + ...
f'(x) = a1 + 2a2(x-a) + ...
f''(x) = 2a2 + 6a3(x-a) + ..., so f''(a) = 2a2

f'(a) = a1
if a1 = 0 then x = a is a critical point, so it could be a max or a min
but I don't quite know what I should do to show that if f''(a) > 0 or < 0 the point will be a min or a max.
Should I do a limit?
\lim_{x \to a} a_0 + a_1(x-a) + a_2(x-a)^2 + \dots > a_0 ?
 
At x=a your Taylor series has no a1 term, since f'(a)=0. So your series is just f(a)+(f''(a)/2)*(x-a)^2+... You can ignore the higher order terms if x is 'near enough' to a.
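The claim can be checked numerically. A minimal sketch, using cosh as a hypothetical example (cosh'(0) = 0 and cosh''(0) = 1 > 0, so the test predicts a local minimum at a = 0, and the second-order Taylor polynomial should track f closely near a):

```python
import math

# Example function: f(x) = cosh(x) has f'(0) = 0 and f''(0) = 1 > 0,
# so the second derivative test predicts a local minimum at a = 0.
f = math.cosh
a = 0.0

# Check f(x) > f(a) for points near a (the "near enough" condition)
for x in (a - 0.1, a - 0.01, a + 0.01, a + 0.1):
    assert f(x) > f(a)

# Second-order Taylor approximation: f(a) + (f''(a)/2) * (x - a)^2
f2a = 1.0  # f''(0) = cosh(0) = 1
x = 0.01
taylor = f(a) + 0.5 * f2a * (x - a) ** 2
print(taylor, f(x))  # the two agree to better than 1e-8 for x this close to a
```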
 
jesuslovesu said:
but I don't quite know what I should do to show that if f''(a) > 0 or < 0 the point will be a min or a max.
Should I do a limit?
Yes, if a is a max or min for f(x), then f'(a) = 0. That means the Taylor series for f(x) is a_0 + a_2(x-a)^2 + higher-order terms. Close to a, (x-a)^3 is very small compared to (x-a)^2: if x - a = 0.001, then (x-a)^2 = (0.001)^2 = 0.000001 and (x-a)^3 = (0.001)^3 = 0.000000001, so (x-a)^2 is 1000 times as large as (x-a)^3. Ignoring higher-power terms, f(x) = f(a) + a_2(x-a)^2. Of course, (x-a)^2 is never negative, and for x not equal to a it is not 0. Whether f(x) is f(a) plus a positive number or f(a) minus a positive number depends entirely upon whether a_2 is positive or negative, which in turn depends upon whether f''(a) is positive or negative.
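The size comparison in the reply above can be sketched numerically; a quick check (not part of the original proof) that h^2 dominates h^3 as h = x - a shrinks:

```python
# For small h = x - a, h^2 dominates h^3 (and all higher powers),
# which is why the higher-order Taylor terms can be ignored near a.
for h in (0.1, 0.01, 0.001):
    ratio = h ** 2 / h ** 3  # equals 1/h, growing as h shrinks
    print(f"h={h}: h^2={h**2:.1e}, h^3={h**3:.1e}, h^2/h^3={ratio:.0f}")
```

For h = 0.001 the ratio is 1000, matching the numbers in the reply.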
 
