Second derivative test when the Hessian is positive semi-definite

SUMMARY

The discussion centers on the implications of a positive semi-definite Hessian matrix for the second derivative test in optimization. When the Hessian is merely positive semi-definite at a stationary point, the second derivative test is inconclusive: the point may still be a local minimum, a local maximum, or a saddle point, and further analysis is required. The participants also explore the definition of saddle points and whether the Hessian matrix can be shifted in Newton's method to steer the iteration toward a minimum or a maximum.

PREREQUISITES
  • Understanding of Hessian matrices in multivariable calculus
  • Familiarity with the second derivative test for local extrema
  • Knowledge of critical points in optimization
  • Experience with Newton's method for unconstrained optimization
NEXT STEPS
  • Research the properties of positive semi-definite matrices in optimization contexts
  • Study the second derivative test and its limitations in detail
  • Explore the concept of saddle points and their mathematical definitions
  • Investigate techniques for modifying Hessian matrices in Newton's method
USEFUL FOR

Mathematicians, optimization specialists, and students studying multivariable calculus who are interested in advanced optimization techniques and the behavior of functions near critical points.

logarithmic
Can someone tell me what this actually is?

So, in the case when the Hessian is positive (or negative) semidefinite, the second derivative test is inconclusive.

However, I think I've read that even in the case where the Hessian is positive semidefinite at a stationary point x, we can still conclude that f does not have a local maximum at x. Is that correct?

Is that equivalent to saying that x is either a local minimum or a saddle point, since there are only three possibilities for x: local max, local min, or saddle point (or are there more possibilities)?

If someone has some online PDF notes with definitions and a proof of the second derivative test in full generality, that would be good too. I also can't seem to find an agreed-upon definition of a saddle point, which adds to the confusion.
 
Yes, at any critical point we must have a maximum, a minimum, or a saddle point. The fact that the Hessian is neither positive definite nor negative definite means we cannot use the second derivative test (local max if \det(H) > 0 and \partial^2 z/\partial x^2 < 0, local min if \det(H) > 0 and \partial^2 z/\partial x^2 > 0, and a saddle point if \det(H) < 0), but it will be one of those nonetheless. It simply means that we cannot use that particular test to determine which.
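For reference, in two variables the test reads (standard statement, with the second partials evaluated at the critical point):

D = \det H = f_{xx} f_{yy} - f_{xy}^2, \qquad
\begin{cases}
D > 0,\ f_{xx} > 0 & \text{local minimum} \\
D > 0,\ f_{xx} < 0 & \text{local maximum} \\
D < 0 & \text{saddle point} \\
D = 0 & \text{inconclusive}
\end{cases}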

For example, the functions z = x^4 + y^4, z = -x^4 - y^4, and z = x^3 + y^3 have first derivatives equal to 0 only at (0, 0), so that is the only critical point. The Hessian of all three functions is the zero matrix at (0, 0), but it is obvious that (0, 0) is a minimum for the first function, a maximum for the second, and a saddle point for the third.
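These examples are easy to verify symbolically; here is a minimal sketch in Python using SymPy (the variable names are illustrative):

import sympy as sp

x, y = sp.symbols('x y')

# The three example functions; each has (0, 0) as its only critical point.
for f in (x**4 + y**4, -x**4 - y**4, x**3 + y**3):
    H = sp.hessian(f, (x, y))              # matrix of second partial derivatives
    print(f, '->', H.subs({x: 0, y: 0}))   # the zero matrix in all three cases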
 
I see, thanks for the reply. Although I'd like to know your definition for a saddle point. Is it a critical point that is neither a local min, nor a local max?
 
This is an old post... but whatever.

My question is: if we are doing unconstrained optimization with Newton's method, can we adjust the Hessian matrix by shifting it (adding a diagonal matrix to it) to push its eigenvalues to be sufficiently negative (or positive), forcing the iteration toward a maximum (or minimum) while avoiding saddle points? I would like to apply this shift before taking the Newton step.
many thanks
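This kind of shift is a standard Hessian-modification strategy: adding a multiple of the identity so the shifted Hessian is positive definite (for minimization; negative definite for maximization) makes the Newton direction a descent (ascent) direction and keeps the iteration from being attracted to saddle points. Below is a minimal sketch of the minimization case, assuming gradient and Hessian callables are available; the function and parameter names (shifted_newton_step, grad, hess, margin) are illustrative, not from any particular library:

import numpy as np

def shifted_newton_step(x, grad, hess, margin=1e-6):
    """One modified Newton step for minimization.

    Shifts the Hessian by tau*I so that its smallest eigenvalue is at
    least `margin`, which guarantees the computed step is a descent
    direction. (For maximization, shift the other way, or minimize -f.)
    """
    g = grad(x)
    H = hess(x)
    lam_min = np.linalg.eigvalsh(H).min()   # smallest eigenvalue of symmetric H
    tau = max(0.0, margin - lam_min)        # shift only if H is not sufficiently PD
    step = np.linalg.solve(H + tau * np.eye(len(x)), -g)
    return x + step

# Example: f(x, y) = x^2 - y^2 has a saddle at the origin. A plain Newton
# step from (1, 1) jumps straight to that saddle; the shifted step moves
# downhill instead.
grad = lambda x: np.array([2 * x[0], -2 * x[1]])
hess = lambda x: np.array([[2.0, 0.0], [0.0, -2.0]])  # constant for this quadratic
print(shifted_newton_step(np.array([1.0, 1.0]), grad, hess, margin=1.0))
# -> [0.6, 3.0], where f = -8.64 < f(1, 1) = 0, i.e. away from the saddle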
 
