Second derivative test when Hessian is Positive Semi-Definite


Discussion Overview

The discussion revolves around the second derivative test in the context of optimization, particularly when the Hessian matrix is positive semi-definite. Participants explore the implications of this condition on identifying local maxima, minima, and saddle points, as well as the definitions and properties of these critical points.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant notes that when the Hessian is positive semi-definite at a stationary point, the second derivative test is inconclusive, leading to questions about the nature of the critical point.
  • Another participant asserts that at any critical point, one must have a maximum, minimum, or saddle point, but the inconclusiveness of the Hessian means that the second derivative test cannot determine which one it is.
  • Examples are provided where the Hessian is the zero matrix at a critical point, yet the nature of the critical point varies among different functions.
  • A participant seeks clarification on the definition of a saddle point, proposing it as a critical point that is neither a local minimum nor a local maximum.
  • Another participant raises a question about modifying the Hessian matrix in Newton's method to influence its eigenvalues, aiming to avoid saddle points while optimizing.

Areas of Agreement / Disagreement

Participants generally agree that the Hessian being positive semi-definite leads to inconclusiveness in determining the nature of critical points. However, there is no consensus on the precise definition of a saddle point, and the discussion includes competing views on how to handle optimization in the presence of saddle points.

Contextual Notes

Participants express uncertainty regarding the definitions of critical points and saddle points, as well as the implications of modifying the Hessian matrix in optimization methods. There are also unresolved mathematical steps related to the second derivative test.

logarithmic
Can someone tell me what this actually is.

So, in the case when the Hessian is positive (or negative) semidefinite, the second derivative test is inconclusive.

However, I think I've read that even in the case where the Hessian is positive semidefinite at a stationary point x, we can still conclude that the function at x is not a local maximum. Is that correct?

Is that equivalent to the function at x being either a local minimum or a saddle point, since there are only 3 possibilities for x: local max, local min, or saddle (or are there more possibilities)?

If someone has some online pdf notes with definitions and proof of the second derivative test in generality, that would be good too. I also can't seem to find an agreed on definition of a saddle point, which adds to the confusion.
 
Yes, at any "critical point" we must have a maximum, minimum, or saddle point. The fact that the Hessian is not positive or negative definite means we cannot use the 'second derivative' test (local max if \det(H) > 0 and \partial^2 z/\partial x^2 < 0, local min if \det(H) > 0 and \partial^2 z/\partial x^2 > 0, and a saddle point if \det(H) < 0), but it will be one of those nonetheless. That simply means that we cannot use that particular test to determine which.
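For a function of two variables this test is mechanical enough to write down in code. A minimal sketch (the function name, tolerance, and return labels are my own, not from the thread):

```python
import numpy as np

def second_derivative_test(H, tol=1e-12):
    """Classify a critical point of z(x, y) from its 2x2 Hessian H.

    Returns 'min', 'max', 'saddle', or 'inconclusive'."""
    d = np.linalg.det(H)              # det(H) = z_xx * z_yy - z_xy^2
    if d > tol and H[0, 0] > tol:
        return "min"                  # det(H) > 0 and z_xx > 0
    if d > tol and H[0, 0] < -tol:
        return "max"                  # det(H) > 0 and z_xx < 0
    if d < -tol:
        return "saddle"               # det(H) < 0
    return "inconclusive"             # det(H) = 0: the test says nothing
```

The last branch is exactly the semidefinite case the thread is about: when det(H) = 0, the test simply returns no verdict.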

For example, the functions z = x^4 + y^4, z = -x^4 - y^4, and z = x^3 + y^3 have first derivatives equal to 0 only at (0, 0), so that is the only critical point. The Hessian of all three functions is the 0 matrix at (0, 0), but it is obvious that (0, 0) is a minimum for the first function, a maximum for the second, and a saddle point for the third.
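These three examples are easy to verify numerically. A quick sanity check (the grid size and window are arbitrary choices of mine), sampling each function near (0, 0), where every Hessian is the zero matrix, and comparing f to f(0, 0) = 0:

```python
import numpy as np

fs = {
    "x^4 + y^4":  lambda x, y:  x**4 + y**4,
    "-x^4 - y^4": lambda x, y: -x**4 - y**4,
    "x^3 + y^3":  lambda x, y:  x**3 + y**3,
}
t = np.linspace(-0.1, 0.1, 21)
X, Y = np.meshgrid(t, t)
results = {}
for name, f in fs.items():
    vals = f(X, Y)
    if vals.min() >= 0:       # f >= f(0,0) everywhere nearby: minimum
        results[name] = "min"
    elif vals.max() <= 0:     # f <= f(0,0) everywhere nearby: maximum
        results[name] = "max"
    else:                     # f takes both signs nearby: neither
        results[name] = "saddle"
```

Here "saddle" is used in the sense discussed below: a critical point that is neither a local minimum nor a local maximum.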
 
I see, thanks for the reply. Although I'd like to know your definition for a saddle point. Is it a critical point that is neither a local min, nor a local max?
 
This is an old post... but whatever.

My question is: if we are doing unconstrained optimization with Newton's method, can we adjust the Hessian matrix by shifting it (adding a diagonal matrix to it) to push its eigenvalues to be sufficiently negative (or positive), forcing the iteration toward a maximum (or minimum) while avoiding saddle points? I would like to apply this shift before each Newton step.
many thanks
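What this question describes is essentially the standard "modified Newton" regularization. A sketch under my own assumptions (minimization case; the `margin` parameter is mine): add tau * I with tau just large enough that the shifted Hessian is positive definite, so the step is always a descent direction and the iteration is not attracted to saddle points.

```python
import numpy as np

def shifted_newton_step(g, H, margin=0.1):
    # g: gradient, H: Hessian at the current point (assumed available).
    lam_min = np.linalg.eigvalsh(H)[0]   # smallest eigenvalue of H
    tau = max(0.0, margin - lam_min)     # shift only when H is not PD
    return -np.linalg.solve(H + tau * np.eye(len(g)), g)

# f(x, y) = x^2 - y^2 has a saddle at the origin. Plain Newton from
# (1, 1) jumps straight to the saddle; the shifted step instead moves
# downhill, increasing |y|.
g = np.array([2.0, -2.0])                # gradient of f at (1, 1)
H = np.array([[2.0, 0.0], [0.0, -2.0]])  # indefinite Hessian
step = shifted_newton_step(g, H)
```

To seek a maximum, apply the same idea to -f (equivalently, shift the eigenvalues to be sufficiently negative, as the post suggests).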
 
