# Condition for a matrix to be Positive semi-definite

• NaturePaper
In summary: Just as a follow-up, I found the information about the theorem I mentioned above in Horn and Johnson. The theorem is Theorem 4.3.25, p. 216 in my edition. They attribute the proof to Weyl. As I said, it's proved via the spectral theorem and induction. The proof is a few pages long, so I will not reproduce it here. However, I can say that the theorem implies that if a Hermitian matrix A is positive semidefinite, then all of its leading principal minors are non-negative. (A leading principal minor is the determinant of the submatrix consisting of the first k rows and k columns of A.)

#### NaturePaper

Hi everyone,
I know that for a matrix to be positive semi-definite (A>=0) (PSD), the elements on the principal diagonal (upper left corner to lower right corner) must be non-negative (i.e., $$d_{ii} \geq 0$$).

But I wonder whether there is any condition that the elements on the secondary diagonal (upper right to lower left) must satisfy for the matrix to be PSD. Is there?

What if it is given that the matrix is Hermitian?

[The matrix is over the complex field]

Thanks and Regards,
NaturePaper

I don't know about your main question, but PSD *implies* Hermitian when the field is $$\mathbb{C}$$.

jbunniii said:
I don't know about your main question, but PSD *implies* Hermitian when the field is $$\mathbb{C}$$.

No this is not the case...(PSD does not imply Hermitian)

$$A = \left[\begin{array}{cc} 1 & 0 \\ z & 0 \end{array}\right]$$
is PSD but not Hermitian; $$z \in \mathbb{C}$$.

My main question (though it was straightforward) is: what are the constraints on the elements of the secondary diagonal so that the matrix is PSD?

Hope it's now clear to you.

Thanks and Regards,
NaturePaper

NaturePaper said:
No this is not the case...(PSD does not imply Hermitian)

$$A = \left[\begin{array}{cc} 1 & 0 \\ z & 0 \end{array}\right]$$
is PSD but not Hermitian; $$z \in \mathbb{C}$$.

My main question (though it was straightforward) is: what are the constraints on the elements of the secondary diagonal so that the matrix is PSD?

Hope it's now clear to you.

Thanks and Regards,
NaturePaper

The question was clear the first time you posted it. Offhand I don't know what the answer is.

But you are quite incorrect when you say that PSD does not imply Hermitian in the complex case. In your example I can obtain

$$\left[\begin{array}{cc} \overline{x} & \overline{y} \end{array}\right] \left[\begin{array}{cc} 1 & 0 \\ z & 0 \end{array}\right] \left[\begin{array}{c} x \\ y \end{array}\right] < 0$$

by choosing, for example,

$$x = \overline{z}$$ and $$y = -2$$. I can also obtain a non-real left hand side (so the inequality doesn't even make sense) if instead of y = -2, I choose, say, y = i.
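The two vector choices above can be checked numerically. This is a quick sketch (the value of $$z$$ is an arbitrary illustration, not from the thread): with $$x = \overline{z}, y = -2$$ the quadratic form is $$-|z|^2 < 0$$, and with $$y = i$$ it is non-real.

```python
import numpy as np

# The matrix A = [[1, 0], [z, 0]] from the post above; z is an
# arbitrary nonzero complex number chosen for illustration.
z = 1 + 2j
A = np.array([[1, 0], [z, 0]], dtype=complex)

# Choice x = conj(z), y = -2 makes the quadratic form negative:
v = np.array([np.conj(z), -2])
q = v.conj() @ A @ v          # = |z|^2 - 2|z|^2 = -|z|^2
print(q)                      # (-5+0j), i.e. -|z|^2 for z = 1+2j

# Choice y = i makes the quadratic form non-real:
w = np.array([np.conj(z), 1j])
print(w.conj() @ A @ w)       # (5-5j): not even a real number
```

So this A fails the defining property $$v^* A v \geq 0$$, confirming it is not PSD.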

Indeed, PSD implies Hermitian in general. Why is this? PSD implies that, for all vectors v, we have

$$v^* A v \geq 0$$

In particular, $$v^* A v$$ is REAL (this is all we really need for the proof).

Thus $$v^* A v = v^* A^* v$$, which means

$$v^* (A - A^*) v = 0$$ for all v. Now to conclude that $$A = A^*$$, I just need this lemma:

Lemma: If V is a complex inner product vector space and T is a linear operator on V satisfying

$$<Tv, v> = 0$$ for all $$v \in V$$, then $$T = 0$$.

(This is, for example, Proposition 7.2 in Axler's "Linear Algebra Done Right.")

To prove it, we write (for arbitrary u,w in V):

$$<Tu,w> = \frac{1}{4}\left(<T(u+w),u+w> - <T(u-w),u-w> + i<T(u+iw),u+iw> - i<T(u-iw),u-iw>\right)$$

which is easily verified by simplifying the right side. Each of the four terms on the right is of the form $$<Tv,v>$$ which by hypothesis is 0. Thus

$$<Tu,w> = 0$$ for any vectors u, w. Now take w = Tu:

$$||Tu||^2 = 0$$ which forces Tu = 0. This is true for all u in V, so T = 0.
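As a numerical sanity check of the polarization identity used in the proof (a sketch, with the convention $$<x,y> = y^* x$$, linear in the first slot; the random operator and vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# A random complex operator T and random complex vectors u, w:
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

def form(a, b):
    # <Ta, b> with the convention <x, y> = y* x (linear in the first slot)
    return b.conj() @ (T @ a)

lhs = form(u, w)
rhs = 0.25 * (form(u + w, u + w) - form(u - w, u - w)
              + 1j * form(u + 1j * w, u + 1j * w)
              - 1j * form(u - 1j * w, u - 1j * w))
print(np.isclose(lhs, rhs))  # True: the polarization identity holds
```

Each term on the right is of the form $$<Tv,v>$$, which is why $$<Tv,v> = 0$$ for all v forces T = 0.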

Hope it's now clear to you.

Last edited:
@jbunniii,
Thank you very much for the clarification. Actually I was following the definition that a matrix is PSD if all of its eigenvalues are >= 0.

By the way, any help with the current problem? Anyone, indeed. It's quite important to me.

Thanks and Regards,
NaturePaper

Well, I messed around with it a bit just using a 2x2 matrix

$$M = \left[\begin{array}{cc} a & b \\ \overline{b} & c \end{array}\right]$$

An elementary calculation shows that the eigenvalues of this matrix are

$$\lambda = \frac{1}{2}(a+c) \pm \sqrt{\frac{1}{4}(a+c)^2 + |b|^2 - ac}$$

Both eigenvalues need to be real and non-negative. The real constraint means we must have

$$(a-c)^2 + 4|b|^2 \geq 0$$

which is always true regardless of a, b, and c. (We knew this had to be true because the matrix is hermitian.)

The non-negative constraint gives a necessary condition that reduces to:

$$|b|^2 \leq ac$$

I don't see an obvious way to generalize this to larger matrices, though.
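The 2x2 analysis above can be checked numerically. This is a sketch (the values of a, b, c are arbitrary illustrations): the closed-form eigenvalues match `numpy.linalg.eigvalsh`, and with $$|b|^2 \leq ac$$ (and non-negative diagonal) the matrix is indeed PSD.

```python
import numpy as np

# Eigenvalues of the Hermitian 2x2 matrix M = [[a, b], [conj(b), c]]
# from the closed-form expression in the post above:
def eig_formula(a, b, c):
    s = np.sqrt(0.25 * (a + c) ** 2 + abs(b) ** 2 - a * c)
    return 0.5 * (a + c) - s, 0.5 * (a + c) + s

a, b, c = 3.0, 1 + 1j, 2.0           # |b|^2 = 2 <= 6 = a*c, so PSD
M = np.array([[a, b], [np.conj(b), c]])
lam = np.linalg.eigvalsh(M)          # eigenvalues in ascending order
print(np.allclose(lam, eig_formula(a, b, c)))  # True: formula matches
print(lam.min() >= 0)                          # True: M is PSD
```

Swapping in, say, b = 3 (so that $$|b|^2 > ac$$) makes the smaller eigenvalue negative, as the criterion predicts.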

Last edited:
I don't see an obvious way to generalize this to larger matrices, though.

The generalization of this result is Sylvester's criterion, which can be proved using the spectral theorem and induction on the dimension:

Theorem: Let $$A$$ be a Hermitian matrix, and let $$A_i$$ be the top-left $$i \times i$$ submatrix of $$A$$. Then $$A$$ is positive definite if and only if $$\det(A_i) > 0$$ for all $$i$$.

(Careful: for positive *semi*definiteness it is not enough that $$\det(A_i) \geq 0$$ for all $$i$$; one needs *every* principal minor, over all subsets of rows and the matching columns, to be non-negative.)
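A numerical illustration (a sketch using NumPy) of why non-negative *leading* principal minors alone do not certify positive semidefiniteness, together with the all-principal-minors test:

```python
import numpy as np
from itertools import combinations

# Classic counterexample: diag(0, -1) has both leading principal
# minors equal to 0 (hence >= 0), yet it is not PSD.
A = np.array([[0.0, 0.0], [0.0, -1.0]])
leading = [np.linalg.det(A[:k, :k]) for k in (1, 2)]
print([d >= 0 for d in leading])    # [True, True] ...
print(np.linalg.eigvalsh(A).min())  # -1.0: ... yet A is not PSD

# PSD is equivalent to ALL principal minors being non-negative,
# where S ranges over every subset of row/column indices:
def all_principal_minors_nonneg(M, tol=1e-12):
    n = M.shape[0]
    return all(np.linalg.det(M[np.ix_(S, S)]) >= -tol
               for k in range(1, n + 1)
               for S in combinations(range(n), k))

print(all_principal_minors_nonneg(A))  # False: the 1x1 minor det([-1]) < 0
```

For positive *definiteness*, by contrast, checking the leading minors alone does suffice.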

@VKint,
This is quite well known.

NaturePaper said:
@VKint,
This is quite well known.

Yeah, I apologize. But jbunniii's condition is a special case of this.

Wikipedia gives the following result for a PSD matrix M with elements $$m_{ij}$$:

$$|m_{ij}| \leq \sqrt{m_{ii} m_{jj}}$$

They cite Horn and Johnson's "Matrix Analysis." I'm not sure offhand how to prove this, and I don't have my copy of Horn and Johnson with me at work, but if this is of interest I can look it up when I get home. It's the only "elementwise" constraint I've seen for the off-diagonal elements thus far.
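The elementwise bound can be spot-checked numerically. This is a sketch: a random PSD matrix is built as $$B^* B$$ (always PSD by construction), and every entry is tested against $$\sqrt{m_{ii} m_{jj}}$$.

```python
import numpy as np

# Build a random complex PSD matrix as M = B* B.
rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
M = B.conj().T @ B

# Verify |m_ij| <= sqrt(m_ii * m_jj) entrywise (tolerance for roundoff):
d = np.real(np.diag(M))              # the diagonal of a PSD matrix is real, >= 0
bound = np.sqrt(np.outer(d, d))
print(np.all(np.abs(M) <= bound + 1e-9))  # True for every entry i, j
```

On the diagonal the bound holds with equality, so the tolerance only absorbs floating-point roundoff.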

@jbunniii,
Thank you very much, especially for the constant help on the topic.

About your last post, does the relation hold for all i, j? I mean, there is no sum (over repeated indices etc.) over i and j?

If it is like $$\sum_{i,j}|m_{ij}| \leq \sum_{i\ne j}\sqrt{m_{ii}m_{jj}}$$,
then there is an inequality called the Hadamard inequality, which is of no help in my case.

However, if there is no such sum, i.e., the mentioned relation holds for any arbitrary i, j, then it is really nice, because then we get the answer. If this is the case, then it would follow from the fact that the 2x2 submatrix
$$\left[\begin{array}{cc} m_{ii} & m_{ij} \\ m_{ji} & m_{jj} \end{array}\right]$$

is PSD (I don't know whether it is correct that the above 2x2 matrix needs to be PSD at all!
Is it?).

Anyway, I hope someone will clarify all this. I apologize for my poor LaTeX skills. (I don't know how the other guys generated such nice math.)

Regards,
NaturePaper

Last edited:
@jbunniii,
I just saw on Wikipedia that the condition you mentioned was for PD, not for PSD.

Regards.

NaturePaper said:
@jbunniii,
I just saw on Wikipedia that the condition you mentioned was for PD, not for PSD.

Regards.

Yeah, I'm assuming that a similar result is true for PSD, but let me check Horn and Johnson this evening and I'll let you know for sure.

P.S. If you click on a typeset equation in any post on this forum, you will get a popup window that shows the LaTeX instructions that generated it.

Last edited:
OK, I am at home now and have Horn and Johnson's "Matrix Analysis" in front of me. The good news is that the result asserted on the Wikipedia page is indeed here; the bad news is that the proof is left as an exercise.

I'll quote you verbatim what it says. (This is on page 398, following Corollary 7.1.5, in case you have access to the book.)

Exercise. Show that if $$A = [a_{ij}] \in M_2$$ is positive definite, then
$$a_{11}a_{22} > |a_{12}|^2$$. Hint: Use det A > 0.
Deduce that if $$A \in M_n$$ is positive definite, then
$$a_{ii}a_{jj} > |a_{ij}|^2$$ for all $$i,j=1,2,\ldots,n$$. Show that ">" must be replaced by "$$\geq$$" in this inequality if one assumes only that A is positive semidefinite.

If I get a chance later this evening, I'll see if I can come up with a proof. This is one of the earliest exercises in the positive definite matrices chapter, so it shouldn't require too much machinery.

@jbunniii,
Thanks again for your reference. I'll look at Horn's book (but I'm not sure that I can prove it).

Let me try to follow the TeX instructions here:
$$d_{ii}$$
Regards,
NP

Last edited:
OK, it's true for exactly the reason you speculated about in post 11.

Definition: A principal submatrix is formed by choosing a subset S of $$\{1,2,\ldots,n\}$$ and defining A(S) as the matrix formed by deleting all the rows of A whose indices are not in S, and all the columns of A whose indices are not in S.

The special case of interest here is S = {i,j}, (i < j):

$$A(S) = \left[\begin{array}{cc} a_{ii} & a_{ij} \\ a_{ji} & a_{jj} \end{array}\right]$$

Theorem: If A is positive (semi)definite, then every principal submatrix of A is also positive (semi)definite.

The proof is very straightforward: let x be a vector with arbitrary complex values in the positions corresponding to S, and zeros everywhere else. Let x(S) denote the vector formed from x by taking only the positions corresponding to S.

Then:

$$x(S)^* A(S) x(S) = \sum_{i,j \in S} a_{ij}x^*_{i} x_{j} = \sum_{i,j=1}^{n}a_{ij}x^*_{i} x_{j} = x^* A x$$

which is either > 0 (if A is positive definite) or >= 0 (if A is positive semidefinite), regardless of how we choose x(S). Thus A(S) is also positive definite or semi-definite, respectively.

Now we can apply the result from post 6 to each principal submatrix of the form

$$A(S) = \left[\begin{array}{cc} a_{ii} & a_{ij} \\ a_{ji} & a_{jj} \end{array}\right]$$

(noting that $$a_{ji} = \overline{a_{ij}}$$ because A is hermitian)

to obtain the result

$$a_{ii}a_{jj} \geq |a_{ij}|^2$$

(we can conclude strict inequality > in the positive definite case).
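The principal-submatrix theorem above can be verified numerically. This is a sketch: a random PSD matrix is built as $$B^* B$$, and every principal submatrix A(S), over all subsets S, is tested for positive semidefiniteness via its eigenvalues.

```python
import numpy as np
from itertools import combinations

# Build a random complex PSD matrix as A = B* B.
rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B.conj().T @ B

def is_psd(M, tol=1e-10):
    # Hermitian M is PSD iff its smallest eigenvalue is >= 0 (up to roundoff).
    return np.linalg.eigvalsh(M).min() >= -tol

# A(S): keep only the rows and columns whose indices are in S.
subsets = [S for k in range(1, 5) for S in combinations(range(4), k)]
print(all(is_psd(A[np.ix_(S, S)]) for S in subsets))  # True

# In particular every 2x2 principal submatrix is PSD, which gives
# a_ii * a_jj >= |a_ij|^2 for all i, j.
```

The S = {i, j} case is exactly the 2x2 submatrix used to derive the elementwise bound.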

P.S. Horn and Johnson have a full chapter of nearly 100 pages on the subject of positive definite matrices. It's an excellent book, and I recommend getting your hands on a copy if you can.

Last edited:
@jbunniii,
Thanks, I got the proof too!

I'm trying to get a copy of the book. (Previously I used another one, Zhang's "Matrix Theory".)

Regards,
NP

@jbunniii,

thanks very much, I have been looking for this in recent days!

## 1. What does it mean for a matrix to be positive semi-definite?

A matrix is positive semi-definite if it is Hermitian and all of its eigenvalues are greater than or equal to zero. Equivalently, the quadratic form $$v^* A v$$ is real and non-negative for every vector $$v$$.

## 2. What is the significance of a matrix being positive semi-definite?

A positive semi-definite matrix is often used in optimization problems and in statistics, particularly in the field of multivariate analysis. It can also be used to represent quadratic forms and as a basis for defining metrics in vector spaces.

## 3. How can I determine if a matrix is positive semi-definite?

To determine whether a matrix is positive semi-definite, you can calculate its eigenvalues and check that they are all greater than or equal to zero. Another method is to check that all of its principal minors (determinants of the submatrices obtained by deleting matching sets of rows and columns) are non-negative; checking only the leading principal minors is not enough.
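The eigenvalue test is easy to carry out in practice. A minimal sketch, assuming NumPy and a small tolerance for floating-point roundoff:

```python
import numpy as np

def is_positive_semidefinite(M, tol=1e-10):
    # Over C, PSD implies Hermitian, so reject non-Hermitian input.
    if not np.allclose(M, M.conj().T):
        return False
    # Hermitian M is PSD iff its smallest eigenvalue is non-negative.
    return np.linalg.eigvalsh(M).min() >= -tol

print(is_positive_semidefinite(np.array([[2.0, 1.0], [1.0, 2.0]])))  # True
print(is_positive_semidefinite(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False
```

The first matrix has eigenvalues 1 and 3; the second has eigenvalues -1 and 3, so it fails the test.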

## 4. Can a matrix be both positive semi-definite and positive definite?

Yes: every positive definite matrix is also positive semi-definite. A matrix is positive definite if all of its eigenvalues are strictly greater than zero, while a positive semi-definite matrix may have eigenvalues equal to zero.

## 5. Are there any real-world applications of positive semi-definite matrices?

Positive semi-definite matrices have many real-world applications in fields such as physics, engineering, and economics. For example, they can be used to model systems with multiple variables and constraints, such as in portfolio optimization or in analyzing the stability of a physical system.