Panphobia said:
So you can use the second derivative test to check whether it is a saddle point or not, correct?
##D = f_{xx}f_{yy} - f_{xy}^2##
##D < 0##: saddle point
In an unconstrained problem, the second-order necessary conditions for a (local) min of ##f(x,y)## at the stationary point ##(x,y) = (x^*,y^*)## are:
$$f_{xx} \geq 0, \quad f_{yy} \geq 0 \quad \text{at} \; (x,y) = (x^*,y^*)$$
$$f_{xx} f_{yy} - f_{xy}^2 \geq 0 \quad \text{at} \; (x,y) = (x^*,y^*)$$
The second-order sufficient conditions for a
strict (local) min at ##(x,y) = (x^*,y^*)## are as above, but with all inequalities being strict.
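To make the unconstrained test concrete, here is a small SymPy sketch; the example function ##f(x,y) = x^2 - y^2## is my own choice, not from the original problem:

```python
# Sketch: unconstrained second-order test with SymPy (example function is mine).
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 - y**2                                  # stationary point at the origin

# Stationary point: solve grad f = 0
grad = [sp.diff(f, v) for v in (x, y)]
crit = sp.solve(grad, [x, y], dict=True)[0]      # {x: 0, y: 0}

# Second partials and the discriminant D at the stationary point
fxx = sp.diff(f, x, 2).subs(crit)
fyy = sp.diff(f, y, 2).subs(crit)
fxy = sp.diff(f, x, y).subs(crit)
D = fxx * fyy - fxy**2

print(fxx, fyy, D)                               # 2 -2 -4: D < 0, so a saddle point
```

Here ##f_{xx} > 0## but ##f_{yy} < 0##, so the necessary conditions for a min already fail, and ##D < 0## confirms the saddle.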
In a constrained problem with a linear constraint, the necessary second-order conditions for a (local) min at ##(x^*,y^*)## require positive semi-definiteness of the Hessian matrix
$$H_f(x^*,y^*) = \left. \begin{pmatrix} f_{xx} & f_{xy}\\ f_{xy} & f_{yy} \end{pmatrix} \right|_{(x,y) = (x^*,y^*)}$$
projected down into the tangent subspace of the constraint. A sufficient condition for a strict constrained local min is that we have positive-definiteness instead of semi-definiteness in the above, provided that the Lagrange multiplier is nonzero. (If the Lagrange multiplier is zero, it is trickier.)
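Here is a minimal sketch of what "projected down into the tangent subspace" means with two variables and one linear constraint; the example problem (minimize ##x^2 + y^2## subject to ##x + y = 1##, with solution ##(1/2, 1/2)##) is my own:

```python
# Sketch: reduced (projected) Hessian test for a linear constraint.
# Example problem is mine: min x^2 + y^2 s.t. x + y = 1, solution (1/2, 1/2).
import numpy as np

H = np.array([[2.0, 0.0],        # Hessian of f at the candidate point
              [0.0, 2.0]])
a = np.array([1.0, 1.0])         # gradient of the linear constraint x + y - 1 = 0

# The tangent subspace consists of directions orthogonal to a; in 2D it is
# spanned by a single vector t.
t = np.array([a[1], -a[0]])

reduced = t @ H @ t              # the projected Hessian, a scalar in this case
print(reduced)                   # 4.0 > 0: sufficient condition for a strict local min
```

With one constraint in two variables the projected Hessian is just the scalar ##t^\top H_f \, t##, so "positive definite on the tangent subspace" reduces to that number being positive.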
For a constrained problem with a nonlinear constraint ##g(x,y) = 0##, you must replace the function ##f## by the Lagrangian
$$L(x,y,\lambda^*) = f(x,y) - \lambda^* g(x,y)$$
in the tests listed above. (That is, we look at the Hessian of the Lagrangian instead of the Hessian of ##f##.) Here, ##\lambda^*## is the value of the Lagrange multiplier ##\lambda## at the solution ##(x^*,y^*)##. Note that we fix ##\lambda## at the value ##\lambda^*##, but let ##(x,y)## vary around the point ##(x^*,y^*)##.
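A sketch of the nonlinear-constraint version, again with an example of my own (minimize ##x + y## on the circle ##x^2 + y^2 = 2##), fixing ##\lambda## at ##\lambda^*## exactly as described above:

```python
# Sketch: Hessian of the Lagrangian on the constraint tangent space.
# Example problem is mine: min x + y s.t. g(x, y) = x^2 + y^2 - 2 = 0.
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x + y
g = x**2 + y**2 - 2
L = f - lam * g

# First-order conditions: grad_{x,y} L = 0 together with g = 0
sols = sp.solve([sp.diff(L, x), sp.diff(L, y), g], [x, y, lam], dict=True)
sol = [s for s in sols if s[x] == -1][0]         # candidate min (-1, -1), lambda* = -1/2

# Hessian of L in (x, y) with lambda frozen at lambda*
HL = sp.hessian(L, (x, y)).subs(sol)

# Tangent direction: orthogonal to grad g at the solution
gx, gy = sp.diff(g, x).subs(sol), sp.diff(g, y).subs(sol)
t = sp.Matrix([gy, -gx])

print((t.T * HL * t)[0, 0])                      # 8 > 0: strict constrained local min
```

Note that the Hessian of ##f## alone is zero in this example, so all the curvature comes from the ##-\lambda^* g## term; that is exactly why you must use the Lagrangian rather than ##f## when the constraint is nonlinear.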
All this is a very lengthy way of saying that your suggested test above is not really correct in all its details.