kingwinner said:
Homework Statement
Theorem:
Let f: M -> R, where M is an open subset of R^n, and suppose f is C^2 on M.
Let x ∈ M be such that
∇f(x) = 0 and the Hessian of f at x is positive definite.
Then x is a strict local minimum point of f.
The above theorem is given in my textbook.
If instead we only have ∇f(x) = 0 and the Hessian of f at x positive SEMI-definite, can we conclude that x is a local minimum point of f? Why or why not?
This puzzles me and I can't find this in the textbook.
Homework Equations
Unconstrained optimization
The Attempt at a Solution
N/A
Any help is appreciated!
Look at the function f(x,y) = (y - x^2)*(y - 2x^2) near (0,0). We have that (0,0) is a stationary point, and the Hessian of f at (0,0) is [[0, 0], [0, 2]] (expand f = y^2 - 3x^2*y + 2x^4 and differentiate twice), which is positive semi-definite with eigenvalues 0 and 2. However, (0,0) is neither a local max nor a local min. To see this, note that for y > 2x^2, the point (x,y) lies above both parabolas y = x^2 and y = 2x^2, so both factors of f are > 0 and f > 0. For y < x^2, it lies below both parabolas, so each factor of f is < 0 but f > 0. However, for x^2 < y < 2x^2, it lies between the two parabolas, so one factor of f is > 0 and the other is < 0, giving f < 0. Therefore, in any neighbourhood of (0,0) there are points (x,y) where f(x,y) > f(0,0) = 0, and other points (x,y) where f(x,y) < f(0,0).
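If it helps, here is a quick numeric sanity check of that sign pattern (a sketch only; the sample points y = 3x^2, y = 1.5x^2, y = 0.5x^2 are my own choices of points above, between, and below the two parabolas):

```python
# Check the counterexample f(x, y) = (y - x^2)*(y - 2x^2) near the
# stationary point (0, 0): at arbitrarily small scales, f takes both
# signs, so (0, 0) is neither a local max nor a local min.

def f(x, y):
    return (y - x**2) * (y - 2 * x**2)

for eps in [1e-1, 1e-2, 1e-3]:
    above = f(eps, 3 * eps**2)      # y > 2x^2: above both parabolas
    between = f(eps, 1.5 * eps**2)  # x^2 < y < 2x^2: between them
    below = f(eps, 0.5 * eps**2)    # y < x^2: below both parabolas
    print(above > 0, between < 0, below > 0)  # True True True at every scale
```

Since this happens for every eps > 0, every neighbourhood of (0,0) contains points where f is positive and points where f is negative.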
In general, having a positive semi-definite Hessian is a second-order NECESSARY condition for a local minimum, but the sufficient condition for a strict local minimum is stronger. In fact, this is already visible in one dimension: just look at the two functions f1(x) = x^2 and f2(x) = x^3 at x = 0. In this setting, the analogue of positive semi-definite is f''(0) >= 0, while the analogue of positive definite is f''(0) > 0.
RGV