Critical points of 4 variables?

SUMMARY

The discussion focuses on determining the nature of a critical point from a 4x4 Hessian matrix. The key criteria: if all eigenvalues of the Hessian are positive, the point is a local minimum; if all are negative, a local maximum; and if the signs are mixed, a saddle point. For practical computation, Cholesky factorization is recommended: it tests the Hessian for positive (or negative) definiteness much more cheaply than computing eigenvalues, and so determines the nature of the critical point efficiently.

PREREQUISITES
  • Understanding of Hessian matrices and their role in optimization.
  • Familiarity with Cholesky factorization and its application in matrix analysis.
  • Knowledge of eigenvalues and their significance in determining matrix definiteness.
  • Basic skills in linear algebra, particularly with matrix operations.
NEXT STEPS
  • Study the Cholesky decomposition method in detail, including its computational steps.
  • Learn about eigenvalue computation techniques and their applications in optimization.
  • Explore higher-dimensional Hessian matrices and their implications in multivariable calculus.
  • Investigate other matrix factorization methods and their comparative advantages in optimization problems.
USEFUL FOR

Mathematicians, optimization specialists, and students studying multivariable calculus who need to analyze critical points in higher-dimensional spaces.

AndreTheGiant

Homework Statement



I have been given a 4x4 Hessian matrix at a point c. I want to find out whether the point is a local min, local max, or saddle point. I know the conditions for critical points with 3x3 Hessians, but I'm not sure how they extend to 4x4 Hessians.

The hessian is given by

1 0 -2 1
0 2 0 -1
-2 0 5 2
1 -1 2 4

Homework Equations


The Attempt at a Solution



I can compute the determinants of the leading principal submatrices up to the 3x3 one, but is that enough to determine whether the point is a local max, min, or saddle?
 
I think the best version of the theorem you want is this:

If the Hessian matrix at the critical point has:
1) all positive eigenvalues, then you have a local minimum;
2) all negative eigenvalues, then you have a local maximum;
3) a mix of positive and negative eigenvalues, then you have a saddle point.

This version works in all dimensions.
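This rule is easy to check numerically. Here is a minimal sketch using NumPy (the function name `classify_critical_point` and the tolerance are my own choices, not part of the thread), applied to the Hessian from the original post:

```python
import numpy as np

# Hessian from the original post
H = np.array([[ 1,  0, -2,  1],
              [ 0,  2,  0, -1],
              [-2,  0,  5,  2],
              [ 1, -1,  2,  4]], dtype=float)

def classify_critical_point(H, tol=1e-10):
    """Classify a critical point from the eigenvalues of its symmetric Hessian."""
    eigvals = np.linalg.eigvalsh(H)  # eigenvalues of a symmetric matrix
    if np.all(eigvals > tol):
        return "local minimum"
    if np.all(eigvals < -tol):
        return "local maximum"
    if np.any(eigvals > tol) and np.any(eigvals < -tol):
        return "saddle point"
    return "inconclusive (some eigenvalue is zero)"

print(classify_critical_point(H))  # → saddle point
```

For this particular H the eigenvalues have mixed signs (det H = -27 < 0 forces at least one negative eigenvalue while the trace is 12), so the critical point is a saddle.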
 
AndreTheGiant said:

I can compute the determinants of the leading principal submatrices up to the 3x3 one, but is that enough to determine whether the point is a local max, min, or saddle?

The way practical optimization packages carry out these tests is to apply Cholesky factorization to the Hessian matrix H. This seeks an upper triangular matrix U with strictly positive diagonal elements such that H = U^T * U, where U^T is the transpose of U. If such a U exists, H is positive definite and your point is a strict local minimum. If -H has a Cholesky factorization, H is negative definite and the point is a strict local maximum. If H is indefinite, your point is a "saddle point": neither a max nor a min. (If both factorizations fail, H is either indefinite or merely semidefinite; in the semidefinite case the second-order test is inconclusive.)

Finding the Cholesky factors by hand is easy for matrices up to about 10x10 or a bit more; a hand-held calculator is enough. In practice, eigenvalues are NOT used for such tests, since computing them is much more involved than Cholesky factorization. Computing the many subdeterminants of the principal-minor test is likewise avoided, for the same reason.

The article http://en.wikipedia.org/wiki/Cholesky_decomposition
shows how to do the computations, although it writes the factorization as H = L * L^T for a lower-triangular L; of course, L = U^T. Basically, here is how to do it:
u(1,1) = sqrt(h(1,1)),  u(1,j) = h(1,j)/u(1,1) for j = 2, 3, ..., n
u(2,2)^2 = h(2,2) - u(1,2)^2, so u(2,2) = sqrt(h(2,2) - u(1,2)^2),
and u(2,j) = [h(2,j) - u(1,2)*u(1,j)]/u(2,2), j = 3, ..., n
...
and in general, for k = 3, ..., n:
u(k,k) = sqrt(h(k,k) - sum_{i=1..k-1} u(i,k)^2),
and u(k,j) = [h(k,j) - sum_{i=1..k-1} u(i,k)*u(i,j)]/u(k,k), j = k+1, ..., n

Note that in u(k,k) we need the sum of squares of the previous entries of column k, and in u(k,j) we need the sum of products of the previous entries of columns k and j.
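The recurrence above translates almost line-for-line into code. Here is a sketch in Python/NumPy (the helper name `cholesky_upper` is my own; production code would typically call a library routine such as `numpy.linalg.cholesky` inside a try/except instead):

```python
import numpy as np

def cholesky_upper(H):
    """Attempt H = U^T * U with U upper triangular and strictly positive
    diagonal. Returns U if H is positive definite, or None if the
    factorization breaks down (a diagonal pivot fails to be positive)."""
    H = np.asarray(H, dtype=float)
    n = H.shape[0]
    U = np.zeros((n, n))
    for k in range(n):
        # u(k,k) = sqrt( h(k,k) - sum_{i<k} u(i,k)^2 )
        pivot = H[k, k] - np.dot(U[:k, k], U[:k, k])
        if pivot <= 0:
            return None
        U[k, k] = np.sqrt(pivot)
        for j in range(k + 1, n):
            # u(k,j) = ( h(k,j) - sum_{i<k} u(i,k)*u(i,j) ) / u(k,k)
            U[k, j] = (H[k, j] - np.dot(U[:k, k], U[:k, j])) / U[k, k]
    return U

# Hessian from the original post
H = np.array([[ 1,  0, -2,  1],
              [ 0,  2,  0, -1],
              [-2,  0,  5,  2],
              [ 1, -1,  2,  4]], dtype=float)

if cholesky_upper(H) is not None:
    print("positive definite: strict local minimum")
elif cholesky_upper(-H) is not None:
    print("negative definite: strict local maximum")
else:
    print("no factorization of H or -H: indefinite or semidefinite")
```

For this H the factorization proceeds through the first three pivots (1, 2, 1) but the fourth pivot is 4 - 1 - 1/2 - 16 = -13.5 < 0, so H is not positive definite; -H fails at the very first pivot. Together with the eigenvalue test this identifies the point as a saddle.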

RGV
 
