Determining if a 3x3 matrix is negative semidefinite

In summary: This shows that the matrix is indefinite over the whole space, so it is not negative semidefinite, and that for a constrained optimum the correct test is the Hessian of the Lagrangian restricted to the tangent space of the constraint; in the worked example that restricted Hessian is negative definite, so the stationary point found there is a strict local constrained maximum.
  • #1
earti193

Homework Statement



I have the matrix A = [-10 3.5 3; 3.5 -4 0.75; 3 0.75 -0.75]

I need to determine whether this is negative semidefinite.

Homework Equations





The Attempt at a Solution



1st order principal minors:

-10
-4
-0.75

2nd order principal minors:

27.75
-1.5
2.4375

3rd order principal minor:

det(A) = 36.5625

To be negative semidefinite, the principal minors of odd order need to be ≤ 0 and those of even order need to be ≥ 0. This suggests that the matrix is not negative semidefinite.
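
A quick numerical cross-check of this conclusion (a sketch, assuming numpy is available; the eigenvalue test settles the question for a symmetric matrix):

[code=python]
import numpy as np

# Matrix from the problem statement.
A = np.array([[-10.0, 3.5,  3.0 ],
              [  3.5, -4.0, 0.75],
              [  3.0, 0.75, -0.75]])

# For a symmetric matrix: negative semidefinite <=> all eigenvalues <= 0.
eigs = np.linalg.eigvalsh(A)
print("eigenvalues:", eigs)
print("negative semidefinite:", bool(np.all(eigs <= 1e-12)))

# Leading principal minors, for comparison with the hand computation.
for k in (1, 2, 3):
    print(f"order-{k} leading minor:", np.linalg.det(A[:k, :k]))
[/code]

The eigenvalues come out with mixed signs, which agrees with the minor test: the matrix is indefinite, not negative semidefinite.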

I don't believe my answer though, for two reasons:
- I thought that if the diagonal entries were all negative, that meant the matrix was negative semidefinite?
- I am looking at the Hessian of an expenditure function, and the expenditure function satisfies all the other conditions of being an expenditure function, so I think it should be negative semidefinite.

Where have I gone wrong?
 
  • #2
earti193 said:

Homework Statement



I have the matrix A = [-10 3.5 3; 3.5 -4 0.75; 3 0.75 -0.75]

I need to determine whether this is negative semidefinite.

Homework Equations





The Attempt at a Solution



1st order principal minors:

-10
-4
-0.75

2nd order principal minors:

27.75
-1.5
2.4375

3rd order principal minor:

det(A) = 36.5625

To be negative semidefinite, the principal minors of odd order need to be ≤ 0 and those of even order need to be ≥ 0. This suggests that the matrix is not negative semidefinite.

I don't believe my answer though, for two reasons:
- I thought that if the diagonal entries were all negative, that meant the matrix was negative semidefinite?
- I am looking at the Hessian of an expenditure function, and the expenditure function satisfies all the other conditions of being an expenditure function, so I think it should be negative semidefinite.

Where have I gone wrong?

I find it easier to just test whether B = -A is positive semi-definite. It isn't in this case. One can see this explicitly by trying to find the Cholesky factorization of B (which exists for both a positive-definite and a positive-semidefinite matrix). If the Cholesky factorization does not exist (or if some diagonal elements are complex) the matrix is indefinite. That is what happens in this case.
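
A sketch of that test in code (numpy assumed; note that numpy's cholesky routine handles only the strictly positive-definite case, so a failure here only rules out positive definiteness, and the eigenvalues settle semidefiniteness):

[code=python]
import numpy as np

A = np.array([[-10.0, 3.5,  3.0 ],
              [  3.5, -4.0, 0.75],
              [  3.0, 0.75, -0.75]])
B = -A

# Attempt a Cholesky factorization of B; numpy raises LinAlgError
# when the factorization breaks down.
try:
    np.linalg.cholesky(B)
    print("B is positive definite")
except np.linalg.LinAlgError:
    print("Cholesky failed: B is not positive definite")

# The eigenvalues of B have mixed signs, so B (and hence A) is indefinite.
print("eigenvalues of B:", np.linalg.eigvalsh(B))
[/code]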

Note: in practice one never uses determinants to make these tests; one instead does matrix factorization. In this case, the factorization is just finding a way to write the quadratic form
[tex] Q(x) = x^T B x = \sum_{i=1}^3 b_{ii} x_i^2 + 2 \sum_{i < j} b_{ij} x_i x_j[/tex]
as a sum of squares. In your case it leads to
[tex] Q(x) = U_1^2 + U_2^2 - \frac{195}{148} x_3^2,\\
\text{where}\\
U_1 = \sqrt{10} x_1 - \frac{7}{ 2 \sqrt{10}} x_2 - \frac{3}{\sqrt{10}} x_3 \\
U_2 = \frac{\sqrt{1110}}{20} x_2 - \frac{6\sqrt{1110}}{185} x_3.
[/tex]
Note that it is possible to have some ##x = (x_1,x_2,x_3)## giving ##Q(x) > 0## and other x giving ##Q(x) < 0##, because of the negative sign in the third term. That is why the matrix is indefinite.
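
The identity above is easy to spot-check numerically at random test points (a small sketch, assuming numpy):

[code=python]
import numpy as np

A = np.array([[-10.0, 3.5,  3.0 ],
              [  3.5, -4.0, 0.75],
              [  3.0, 0.75, -0.75]])
B = -A

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(3)
    x1, x2, x3 = x
    Q = x @ B @ x
    U1 = np.sqrt(10) * x1 - 7 / (2 * np.sqrt(10)) * x2 - 3 / np.sqrt(10) * x3
    U2 = np.sqrt(1110) / 20 * x2 - 6 * np.sqrt(1110) / 185 * x3
    # Compare x^T B x with the sum-of-squares expression above.
    print(np.isclose(Q, U1**2 + U2**2 - 195 / 148 * x3**2))
[/code]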

If, as you believe, your matrix should be semidefinite, then you must have copied it down incorrectly; or, perhaps, you are using it to test a constrained optimum for a maximum, in which case you need to test the Hessian of the Lagrangian in the tangent space of the constraints (not the Hessian of the objective function over the whole space).
 
  • #3
I've gone over the original matrix a few times and can't see how it could be any different. I think these are constrained optima because they are optimal demand functions. The matrix is a Slutsky matrix, which by definition is identical to the Hessian of the expenditure function. If my approach was only testing for semidefiniteness over the 'whole space' (I'm not sure what this means), what do I need to do differently to test it in the tangent space?
 
  • #4
earti193 said:
I've gone over the original matrix a few times and can't see how it could be any different. I think these are constrained optima because they are optimal demand functions. The matrix is a Slutsky matrix, which by definition is identical to the Hessian of the expenditure function. If my approach was only testing for semidefiniteness over the 'whole space' (I'm not sure what this means), what do I need to do differently to test it in the tangent space?

You would need to show me your entire problem for me to "get" what you are saying. However, I can show you by example what I was saying.

Suppose we want to maximize ##f = 5x_1 x_2 x_3 - x_1^2## subject to the constraint ##g = x_1^2+x_2^2+x_3^2-1= 0.## If we form the Lagrangian
##L = f + u g## (##u=## a Lagrange multiplier---often called ##\lambda##, but ##u## is easier to write) the first-order necessary conditions for a max are that
[tex] \partial L / \partial x_i = 0, \; i = 1,2,3, \text{ and } g = 0.[/tex] Without going into the details, the solution is ##(x_1,x_2,x_3,u) = (\bar{x_1},\bar{x_2},\bar{x_3},\bar{u}),## where
[tex] \bar{x_1} = -\frac{2}{15} +\frac{\sqrt{79}}{15}, \:
\bar{x_2} = \bar{x_3} = \frac{1}{15} \sqrt{71 + 2\sqrt{79}}, \:
\bar{u} = \frac{1}{3} - \frac{1}{6} \sqrt{79} [/tex]
Numerically, we have
[tex] \bar{x_1} = 0.4592129612, \: \bar{x_2} = \bar{x_3} = 0.6281414874,\:
\bar{u} = -1.148032403 .[/tex]
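
One can check that this point really does satisfy the first-order conditions and the constraint (a sketch in numpy, plugging the closed-form values back in rather than re-deriving them):

[code=python]
import numpy as np

# Closed-form stationary point quoted above.
x1 = (-2 + np.sqrt(79)) / 15
x2 = x3 = np.sqrt(71 + 2 * np.sqrt(79)) / 15
u  = 1 / 3 - np.sqrt(79) / 6

# dL/dx_i for L = 5*x1*x2*x3 - x1**2 + u*(x1**2 + x2**2 + x3**2 - 1),
# together with the constraint g = 0.
residuals = np.array([
    5 * x2 * x3 - 2 * x1 + 2 * u * x1,   # dL/dx1
    5 * x1 * x3 + 2 * u * x2,            # dL/dx2
    5 * x1 * x2 + 2 * u * x3,            # dL/dx3
    x1**2 + x2**2 + x3**2 - 1,           # g
])
print(np.max(np.abs(residuals)))   # of order machine precision: all conditions hold
print(x1, x2, u)                   # matches the decimal values above
[/code]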

In order to test if ##\bar{x}## is a (local) constrained max, we must examine the Hessian of the Lagrangian ##L(x_1,x_2,x_3,\bar{u})## at the point ##\bar{x}##.

Note that the objective function f (the thing we want to maximize) has only saddle points as stationary points, so over all of 3-space it has no max or min. However, we need to test the Hessian of L, not of f. We have ##HL = \text{Hessian of } L(x_1,x_2,x_3,\bar{u}) |_{\bar{x}}## given as
[tex] HL = \pmatrix{-4.296064806 & 3.140707437 & 3.140707437\\
3.140707437 & -2.296064806 & 2.296064806 \\
3.140707437 & 2.296064806 & -2.296064806 }[/tex]
This is an indefinite matrix.
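
For reference, here is a sketch that builds HL from the second derivatives of L and confirms the mixed-sign eigenvalues (numpy assumed):

[code=python]
import numpy as np

x1 = (-2 + np.sqrt(79)) / 15
x2 = x3 = np.sqrt(71 + 2 * np.sqrt(79)) / 15
u  = 1 / 3 - np.sqrt(79) / 6

# Second derivatives of L = 5*x1*x2*x3 - x1**2 + u*(x1**2 + x2**2 + x3**2 - 1)
# with respect to (x1, x2, x3), evaluated at the stationary point.
HL = np.array([[-2 + 2 * u, 5 * x3,  5 * x2],
               [5 * x3,     2 * u,   5 * x1],
               [5 * x2,     5 * x1,  2 * u ]])

print(np.linalg.eigvalsh(HL))   # mixed signs -> HL is indefinite on all of R^3
[/code]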

However, we are supposed to test HL projected down into the subspace tangent to the constraint at ##\bar{x}##. What this means is the following: we look at points near ##\bar{x}##, of the form ##(x_1,x_2,x_3) = (\bar{x_1}+p_1,\bar{x_2}+p_2,\bar{x_3}+p_3)## for small ##|p|##. The Hessian HL gives the quadratic expansion of ##L(x_1,x_2,x_3,\bar{u})## in terms of the ##p_i##; that is, it gives the second-order terms in a Taylor expansion, so that
[tex] L(\bar{x}+p,\bar{u}) \doteq L(\bar{x},\bar{u}) + \tfrac{1}{2} Q(p), \quad \text{where } Q(p) \equiv \sum_i \sum_j HL_{ij} p_i p_j [/tex] if we drop terms of higher than second order in the ##p_j## (the first-order terms vanish because ##\bar{x}## is a stationary point of L). We have just seen that HL is indefinite, so the above quadratic form in the ##p_i## is indefinite; that is, we have Q(p) > 0 for some p and Q(p) < 0 for other p. However, for vectors p lying in the tangent space of the constraint we have ##p## perpendicular to ##\nabla g(\bar{x})##, so we must have ##\bar{x_1} p_1 + \bar{x_2} p_2 + \bar{x_3} p_3 =0##. That is, we can express ##p_3## as a linear combination of ##p_1## and ##p_2##:
[tex]p_3 = -0.7310661219 p_1-p_2 . [/tex] When we plug this into Q(p) we end up with a quadratic form in ##p_1## and ##p_2## alone. This 2-variable quadratic form is given as
[tex] -10.11534387 p_1^2 - 6.714300770 p_1 p_2 - 9.184259223 p_2^2 [/tex]
The Hessian of this is
[tex] H_0 = \pmatrix{-20.23068775 & -6.714300766 \\
-6.714300766 & -18.36851845}[/tex]
We can see that ##H_0## is a negative-definite matrix, so the point ##\bar{x}## is a strict local constrained max of f.
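
The same tangent-space test can be written with a basis of the constraint's tangent space; a sketch in numpy (the matrix Z.T @ HL @ Z below equals one half of ##H_0## above, because ##H_0## is the Hessian of the quadratic form, i.e. twice its coefficient matrix, and definiteness is unaffected by that overall factor):

[code=python]
import numpy as np

x1 = (-2 + np.sqrt(79)) / 15
x2 = x3 = np.sqrt(71 + 2 * np.sqrt(79)) / 15
u  = 1 / 3 - np.sqrt(79) / 6

HL = np.array([[-2 + 2 * u, 5 * x3,  5 * x2],
               [5 * x3,     2 * u,   5 * x1],
               [5 * x2,     5 * x1,  2 * u ]])

# Columns of Z span {p : grad g(xbar) . p = 0}, i.e. the tangent space,
# using the same parameterization p3 = -(x1/x3) p1 - (x2/x3) p2 as above.
Z = np.array([[1.0,       0.0     ],
              [0.0,       1.0     ],
              [-x1 / x3, -x2 / x3 ]])

reduced = Z.T @ HL @ Z
print(reduced)                       # one half of H_0 quoted above
print(np.linalg.eigvalsh(reduced))   # both negative -> negative definite on the tangent space
[/code]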

Note that none of the matrices involved were definite or semidefinite over the whole space of three variables; however, the one matrix that we really care about IS negative definite in the tangent subspace, and that is enough (by some theorems in optimization theory).
 

1. What is a negative semidefinite matrix?

A negative semidefinite matrix is a symmetric square matrix whose quadratic form x^T A x is less than or equal to zero for every vector x; equivalently, all of its eigenvalues are less than or equal to zero.

2. How can I determine if a 3x3 matrix is negative semidefinite?

To determine if a symmetric 3x3 matrix is negative semidefinite, check the signs of its principal minors: every principal minor of odd order (1 and 3) must be ≤ 0 and every principal minor of even order (2) must be ≥ 0. For semidefiniteness you must check all principal minors, not just the leading ones; the leading-minor (Sylvester) test with strict inequalities only establishes negative definiteness. If any principal minor violates this sign pattern, the matrix is not negative semidefinite.
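
A sketch of this criterion in code, checking all principal minors rather than only the leading ones (numpy assumed; the helper name is just illustrative):

[code=python]
import numpy as np
from itertools import combinations

def negative_semidefinite_by_minors(A, tol=1e-12):
    """Sign test: every odd-order principal minor <= 0 and every
    even-order principal minor >= 0 (all of them, not just leading)."""
    n = A.shape[0]
    for k in range(1, n + 1):
        for rows in combinations(range(n), k):
            minor = np.linalg.det(A[np.ix_(rows, rows)])
            if (k % 2 == 1 and minor > tol) or (k % 2 == 0 and minor < -tol):
                return False
    return True

A = np.array([[-10.0, 3.5,  3.0 ],
              [  3.5, -4.0, 0.75],
              [  3.0, 0.75, -0.75]])
print(negative_semidefinite_by_minors(A))   # False, as found in the thread above
[/code]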

3. Can a matrix be both positive and negative semidefinite?

In general, no. A matrix is positive semidefinite if all its eigenvalues are greater than or equal to zero and negative semidefinite if all its eigenvalues are less than or equal to zero, so a matrix can only be both if every eigenvalue is zero; for a symmetric matrix that means the zero matrix.

4. What are the applications of negative semidefinite matrices?

Negative semidefinite matrices have various applications in mathematics, physics, engineering, and economics. They appear in optimization problems (for example as Hessians at maxima), in signal processing, and in control theory, where they play a crucial role in characterizing stable systems.

5. Is there a quick way to determine if a matrix is negative semidefinite?

Yes. If the matrix is symmetric, checking the signs of its eigenvalues is sufficient: if all of them are less than or equal to zero, the matrix is negative semidefinite. If the matrix is not symmetric, the quadratic form x^T A x depends only on the symmetric part (A + A^T)/2, so apply the eigenvalue (or principal-minor) test to that symmetric part.
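
For instance, an illustrative eigenvalue-based check (numpy assumed), which symmetrizes first so it also covers the non-symmetric case:

[code=python]
import numpy as np

def is_negative_semidefinite(A, tol=1e-12):
    A = np.asarray(A, dtype=float)
    # The quadratic form x^T A x depends only on the symmetric part of A,
    # so symmetrize before looking at the eigenvalues.
    S = (A + A.T) / 2
    return bool(np.all(np.linalg.eigvalsh(S) <= tol))

A = [[-10.0, 3.5,  3.0 ],
     [  3.5, -4.0, 0.75],
     [  3.0, 0.75, -0.75]]
print(is_negative_semidefinite(A))   # False
[/code]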
