# Critical points of 4 variables?

• AndreTheGiant

## Homework Statement

I have been given a 4x4 Hessian matrix at a point c. I want to find out whether the point is a local min/max/saddle. I am aware of the conditions for critical points with 3x3 Hessians, but I'm not sure about 4x4 Hessians.

The Hessian is given by

```
 1   0  -2   1
 0   2   0  -1
-2   0   5   2
 1  -1   2   4
```

## The Attempt at a Solution

I can get the determinants of the leading principal submatrices up to the 3x3 submatrix, but is that enough to determine whether it's a local max, min, or saddle?
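(For reference, the leading principal minors can be checked numerically; this is a short NumPy sketch, not part of the original attempt, with `H` taken from the matrix above.)

```python
import numpy as np

# The 4x4 Hessian from the problem statement.
H = np.array([
    [ 1,  0, -2,  1],
    [ 0,  2,  0, -1],
    [-2,  0,  5,  2],
    [ 1, -1,  2,  4],
], dtype=float)

# Leading principal minors: determinants of the top-left k x k blocks.
minors = [np.linalg.det(H[:k, :k]) for k in range(1, 5)]
print(minors)  # [1.0, 2.0, 2.0, -27.0] (up to rounding)
```

By Sylvester's criterion, H is positive definite only if all leading principal minors are positive, so the negative 4x4 determinant already rules that out here.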

I think the best version of the theorem you want is this:

If the Hessian matrix at the critical point has:
1) all positive eigenvalues, then you have a local minimum
2) all negative eigenvalues, then you have a local maximum
3) a mix of positive and negative eigenvalues, then you have a saddle point

This version works in all dimensions. (If some eigenvalue is zero and the remaining ones all have the same sign, the test is inconclusive.)


The way practical optimization packages carry out these tests is to apply a Cholesky factorization to the Hessian matrix H. This seeks an upper triangular matrix U with (strictly) positive diagonal elements such that H = U^T * U, where U^T is the transpose of U. If such a U exists, H is positive definite and your point is a strict local minimum. If -H has a Cholesky factorization, H is negative definite and the point is a strict local maximum. If neither factorization exists, H is indefinite (or semidefinite); in the indefinite case your point is a saddle point, neither a max nor a min.

Finding the Cholesky factors manually is easy for matrices up to about 10x10 or a bit more; just using a hand-held calculator is enough. In practice, eigenvalues are NOT used for such purposes, as computing them is much more involved than Cholesky methods. Also, computation of many, many subdeterminants is avoided as well, for the same reason.

The article http://en.wikipedia.org/wiki/Cholesky_decomposition
shows how to do the computations, although it writes the factorization as H = L * L^T for lower-triangular L; of course, L = U^T. Basically, here is how to do it:
```
u(1,1) = sqrt( h(1,1) )
u(1,j) = h(1,j) / u(1,1),                                        j = 2, ..., n

and for k = 2, ..., n:
u(k,k) = sqrt( h(k,k) - sum_{i=1..k-1} u(i,k)^2 )
u(k,j) = ( h(k,j) - sum_{i=1..k-1} u(i,k) * u(i,j) ) / u(k,k),   j = k+1, ..., n
```

Note that in u(k,k) we need the sum of squares of previous u in column k, and in u(k,j) we need the product of previous u in columns k and j.
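The recursion above translates almost line for line into code. Here is a plain-Python sketch (the function name `cholesky_upper` and the breakdown tolerance are my own choices, not from the original):

```python
import math

def cholesky_upper(h, tol=1e-12):
    """Try to factor the symmetric matrix h as U^T * U with U upper
    triangular and strictly positive diagonal. Return U, or None if
    the factorization breaks down (h is not positive definite)."""
    n = len(h)
    u = [[0.0] * n for _ in range(n)]
    for k in range(n):
        # Diagonal entry: subtract the squares of the entries above it in column k.
        d = h[k][k] - sum(u[i][k] ** 2 for i in range(k))
        if d <= tol:
            return None  # breakdown: h is not positive definite
        u[k][k] = math.sqrt(d)
        # Rest of row k: subtract products of earlier rows in columns k and j.
        for j in range(k + 1, n):
            u[k][j] = (h[k][j] - sum(u[i][k] * u[i][j] for i in range(k))) / u[k][k]
    return u

H = [[ 1,  0, -2,  1],
     [ 0,  2,  0, -1],
     [-2,  0,  5,  2],
     [ 1, -1,  2,  4]]

negH = [[-x for x in row] for row in H]

if cholesky_upper(H) is not None:
    print("positive definite: strict local minimum")
elif cholesky_upper(negH) is not None:
    print("negative definite: strict local maximum")
else:
    print("indefinite (or semidefinite): saddle point if indefinite")
```

For the Hessian in this thread, the factorization of H breaks down at the last diagonal entry and the factorization of -H breaks down immediately, so the point is a saddle.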

RGV