Condition for a matrix to be Positive semidefinite 
#1
May 26, 2009, 03:02 PM

P: 70

Hi everyone,
I know that for a matrix to be positive semidefinite (A >= 0, PSD), the elements on the principal diagonal (upper left corner to bottom right corner) must be nonnegative (i.e., d_ii >= 0). But I wonder whether there is any condition the elements on the secondary diagonal (upper right to bottom left) must satisfy for the matrix to be PSD. Is there? What if it is given that the matrix is Hermitian? [The matrix is over the complex field.] Thanks and regards, NaturePaper
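As a quick numerical sketch of the setup (my own illustration, not from the thread): a PSD test via eigenvalues, plus a check that the main-diagonal entries of a PSD matrix are nonnegative. The example matrix is an arbitrary choice.

```python
import numpy as np

def is_psd(A, tol=1e-10):
    """Return True if the Hermitian matrix A is positive semidefinite."""
    # eigvalsh assumes A is Hermitian and returns real eigenvalues
    return bool(np.all(np.linalg.eigvalsh(A) >= -tol))

# A PSD matrix necessarily has nonnegative main-diagonal entries,
# since d_ii = e_i^* A e_i >= 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(is_psd(A))                      # True
print(bool(np.all(np.diag(A) >= 0)))  # True
```

Note that nonnegative diagonal entries are necessary but not sufficient: [[0, 2], [2, 0]] has a nonnegative diagonal yet a negative eigenvalue.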


#2
May 26, 2009, 07:07 PM

Sci Advisor
HW Helper
PF Gold
P: 3,288

I don't know about your main question, but PSD *implies* Hermitian when the field is [tex]\mathbb{C}[/tex].



#3
May 27, 2009, 11:23 AM

P: 70

[tex]A = \left[\begin{array}{cc} 1 & 0 \\ z & 0 \end{array}\right][/tex], with [tex]z \in \mathbb{C}[/tex], is PSD but not Hermitian. My main question (though perhaps I stated it too briefly) is: what are the constraints on the elements of the secondary diagonal for the matrix to be PSD? Hope it's now clear to you. Thanks and regards, NaturePaper


#4
May 27, 2009, 11:48 AM

Sci Advisor
HW Helper
PF Gold
P: 3,288

But you are quite incorrect when you say that PSD does not imply Hermitian in the complex case. In your example I can obtain

[tex]\left[\begin{array}{cc} \overline{x} & \overline{y} \end{array}\right] \left[\begin{array}{cc} 1 & 0 \\ z & 0 \end{array}\right] \left[\begin{array}{c} x \\ y \end{array}\right] < 0[/tex]

by choosing, for example, [tex]x = -\overline{z}[/tex] and [tex]y = 2[/tex] (for [tex]z \neq 0[/tex]). I can also obtain a nonreal left-hand side (so the inequality doesn't even make sense) if instead of y = 2 I choose, say, y = i.

Indeed, PSD implies Hermitian in general. Why is this? PSD implies that, for all vectors v, we have [tex]v^* A v \geq 0[/tex]. In particular, [tex]v^* A v[/tex] is REAL (this is all we really need for the proof). Thus [tex]v^* A v = v^* A^* v[/tex], which means [tex]v^* (A - A^*) v = 0[/tex] for all v. Now to conclude that [tex]A = A^*[/tex], I just need this lemma:

Lemma: If V is a complex inner product space and T is a linear operator on V satisfying [tex]\langle Tv, v \rangle = 0[/tex] for all [tex]v \in V[/tex], then [tex]T = 0[/tex]. (This is, for example, Proposition 7.2 in Axler's "Linear Algebra Done Right.")

To prove it, we write (for arbitrary u, w in V):

[tex]\langle Tu,w \rangle = \frac{1}{4}\left(\langle T(u+w),u+w \rangle - \langle T(u-w),u-w \rangle + i\langle T(u+iw),u+iw \rangle - i\langle T(u-iw),u-iw \rangle\right)[/tex]

which is easily verified by simplifying the right side. Each of the four terms on the right is of the form [tex]\langle Tv,v \rangle[/tex], which by hypothesis is 0. Thus [tex]\langle Tu,w \rangle = 0[/tex] for any vectors u, w. Now take w = Tu: [tex]\|Tu\|^2 = 0[/tex], which forces Tu = 0. This is true for all u in V, so T = 0.

Hope it's now clear to you.
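A numerical sketch of this computation (the value z = 1 is an arbitrary illustrative choice): for A = [[1, 0], [z, 0]] the quadratic form v* A v can be made negative with x = -conj(z), y = 2, and nonreal with y = i, so this A is not PSD.

```python
import numpy as np

z = 1.0  # illustrative choice; any nonzero z works
A = np.array([[1.0, 0.0],
              [z,   0.0]], dtype=complex)

def quad_form(A, v):
    """Compute v^* A v (vdot conjugates its first argument)."""
    return np.vdot(v, A @ v)

v1 = np.array([-np.conj(z), 2.0])  # x = -conj(z), y = 2
v2 = np.array([np.conj(z), 1j])    # x = conj(z),  y = i
print(quad_form(A, v1))  # negative real value: (-1+0j)
print(quad_form(A, v2))  # nonreal: (1-1j)
```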


#5
May 27, 2009, 03:56 PM

P: 70

@jbunniii,
Thank you very much for the clarification. Actually I was following the definition that a matrix is PSD if all its eigenvalues are >= 0. By the way, any help with the original problem? From anyone, indeed. It's quite important to me. Thanks and regards, NaturePaper


#6
May 27, 2009, 05:10 PM

Sci Advisor
HW Helper
PF Gold
P: 3,288

Well, I messed around with it a bit, just using a 2x2 matrix:

[tex]M = \left[\begin{array}{cc} a & b \\ \overline{b} & c \end{array}\right][/tex]

An elementary calculation shows that the eigenvalues of this matrix are

[tex]\lambda = \frac{1}{2}(a+c) \pm \sqrt{\frac{1}{4}(a+c)^2 + |b|^2 - ac}[/tex]

Both eigenvalues need to be real and nonnegative. The realness constraint means we must have [tex](a-c)^2 + 4|b|^2 \geq 0[/tex], which is always true regardless of a, b, and c. (We knew this had to be true because the matrix is Hermitian.) The nonnegativity constraint gives a necessary condition that reduces to [tex]|b|^2 \leq ac[/tex]. I don't see an obvious way to generalize this to larger matrices, though.
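The 2x2 condition can be sanity-checked numerically; here is a sketch (the test triples (a, b, c) are arbitrary choices) comparing the |b|^2 <= a*c criterion against the eigenvalues directly.

```python
import numpy as np

def psd_2x2(a, b, c, tol=1e-12):
    """2x2 Hermitian [[a, b], [conj(b), c]] is PSD iff a, c >= 0 and |b|^2 <= a*c."""
    return a >= -tol and c >= -tol and abs(b)**2 <= a * c + tol

for a, b, c in [(2.0, 1.0, 1.0), (1.0, 2.0, 1.0), (1.0, 1j, 2.0)]:
    M = np.array([[a, b], [np.conj(b), c]])
    eig_ok = bool(np.all(np.linalg.eigvalsh(M) >= -1e-12))
    print(psd_2x2(a, b, c), eig_ok)  # the two tests agree on each triple
```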


#7
May 27, 2009, 08:59 PM

P: 111

Theorem (Sylvester's criterion): Let [tex] A [/tex] be a Hermitian matrix, and let [tex] A_i [/tex] be the top-left [tex] i \times i [/tex] submatrix of [tex] A [/tex]. Then [tex] A [/tex] is positive definite if and only if [tex] \det(A_i) > 0 [/tex] for all [tex] i [/tex]. (Caution: for positive semidefiniteness, nonnegativity of the leading minors alone is not sufficient; one needs every principal minor to be nonnegative, as the example [tex]\mathrm{diag}(0, -1)[/tex] shows.)
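A sketch of the leading-minor check in NumPy (illustrative matrices only). It also demonstrates why leading minors alone cannot certify semidefiniteness: diag(0, -1) has leading minors 0 and 0, yet it has a negative eigenvalue.

```python
import numpy as np

def leading_minors(A):
    """Determinants of the top-left i x i submatrices, i = 1..n."""
    n = A.shape[0]
    return [np.linalg.det(A[:i, :i]).real for i in range(1, n + 1)]

A = np.array([[2.0, 1.0], [1.0, 2.0]])
print(all(d > 0 for d in leading_minors(A)))   # True: A is positive definite

B = np.diag([0.0, -1.0])
print(all(d >= 0 for d in leading_minors(B)))  # True, yet B is not PSD
print(np.linalg.eigvalsh(B).min())             # a negative eigenvalue
```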


#8
May 28, 2009, 02:12 AM

P: 70

@VKint,
This is quite well known. 


#9
May 28, 2009, 12:20 PM

P: 111




#10
May 28, 2009, 12:59 PM

Sci Advisor
HW Helper
PF Gold
P: 3,288

Wikipedia gives the following result for a PSD matrix M with elements [tex]m_{ij}[/tex]:

[tex]|m_{ij}| \leq \sqrt{m_{ii} m_{jj}}[/tex]

They cite Horn and Johnson's "Matrix Analysis." I'm not sure offhand how to prove this, and I don't have my copy of Horn and Johnson with me at work, but if this is of interest I can look it up when I get home. It's the only "element-wise" constraint I've seen for the off-diagonal elements thus far.
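One way to check this element-wise bound numerically (a sketch; the random matrix M = B* B is PSD by construction, and the seed is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
M = B.conj().T @ B  # B^* B is PSD by construction

d = np.sqrt(np.real(np.diag(M)))  # sqrt of the (real, nonnegative) diagonal
bound_holds = bool(np.all(np.abs(M) <= np.outer(d, d) + 1e-9))
print(bound_holds)  # True: |m_ij| <= sqrt(m_ii * m_jj) element-wise
```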


#11
May 28, 2009, 02:13 PM

P: 70

@jbunniii,
Thank you very much, especially for the constant help on this topic. About your last post: does the relation hold for all i, j? I mean, there is no sum (repeated sum, etc.) over i, j? If it is something like [tex]\sum_{i,j} m_{ij} \leq \sum_{i \neq j} \sqrt{m_{ii} m_{jj}}[/tex], then there is an inequality called "Hadamard's inequality" which is of no help in my case. However, if there is no such sum, i.e., the relation holds for any arbitrary i, j, then it is really nice, because then we get the answer. If this is the case, then it would follow from the fact that the 2x2 submatrix [tex]\left[\begin{array}{cc} m_{ii} & m_{ij} \\ \overline{m_{ij}} & m_{jj} \end{array}\right][/tex] is PSD (I don't know whether it is correct that this 2x2 matrix needs to be PSD at all! Is it?). Anyway, I hope someone will clarify all this. I apologize for my poor LaTeX skills. (I don't know how the other guys generated such nice math.) Regards, NaturePaper


#12
May 28, 2009, 02:32 PM

P: 70

@jbunniii,
I just saw on Wikipedia that the condition you mentioned was stated for PD, not for PSD. Regards.


#13
May 28, 2009, 03:03 PM

Sci Advisor
HW Helper
PF Gold
P: 3,288

P.S. If you click on a typeset equation in any post on this forum, you will get a popup window that shows the LaTeX instructions that generated it. 


#14
May 28, 2009, 09:38 PM

Sci Advisor
HW Helper
PF Gold
P: 3,288

OK, I am at home now and have Horn and Johnson's "Matrix Analysis" in front of me. The good news is that the result asserted on the Wikipedia page is indeed here; the bad news is that the proof is left as an exercise.
I'll quote verbatim what it says. (This is on page 398, following Corollary 7.1.5, in case you have access to the book.)

Exercise. Show that if [tex]A = [a_{ij}] \in M_2[/tex] is positive definite, then [tex]a_{11}a_{22} > |a_{12}|^2[/tex]. Hint: use det A > 0. Deduce that if [tex]A \in M_n[/tex] is positive definite, then [tex]a_{ii}a_{jj} > |a_{ij}|^2[/tex] for all [tex]i,j = 1,2,\ldots,n[/tex]. Show that ">" must be replaced by "[tex]\geq[/tex]" in this inequality if one assumes only that A is positive semidefinite.

If I get a chance later this evening, I'll see if I can come up with a proof. This is one of the earliest exercises in the positive definite matrices chapter, so it shouldn't require too much machinery.


#15
May 29, 2009, 01:50 AM

P: 70

@jbunniii,
Thanks again for the reference. I'll look at Horn's book (though I'm not sure that I can prove it). Let me try to follow the TeX instructions here: [tex]d_{ii}[/tex]. Regards, NP


#16
May 29, 2009, 02:12 AM

Sci Advisor
HW Helper
PF Gold
P: 3,288

OK, it's true for exactly the reason you speculated about in post 11.

Definition: A principal submatrix is formed by choosing a subset S of [tex]\{1,2,\ldots,n\}[/tex] and defining A(S) as the matrix formed by deleting all the rows of A whose indices are not in S, and all the columns of A whose indices are not in S. The special case of interest here is S = {i,j}, with i < j:

[tex]A(S) = \left[\begin{array}{cc} a_{ii} & a_{ij} \\ a_{ji} & a_{jj} \end{array}\right][/tex]

Theorem: If A is positive (semi)definite, then every principal submatrix of A is also positive (semi)definite.

The proof is very straightforward: let x be a vector with arbitrary complex values in the positions corresponding to S, and zeros everywhere else. Let x(S) denote the vector formed from x by taking only the positions corresponding to S. Then

[tex]x(S)^* A(S) x(S) = \sum_{i,j \in S} a_{ij} x^*_i x_j = \sum_{i,j=1}^{n} a_{ij} x^*_i x_j = x^* A x[/tex]

which is either > 0 (if A is positive definite) or >= 0 (if A is positive semidefinite), regardless of how we choose x(S). Thus A(S) is also positive definite or semidefinite, respectively.

Now we can apply the result from post 6 to each principal submatrix of the form

[tex]A(S) = \left[\begin{array}{cc} a_{ii} & a_{ij} \\ a_{ji} & a_{jj} \end{array}\right][/tex]

(noting that [tex]a_{ji} = \overline{a_{ij}}[/tex] because A is Hermitian) to obtain the result [tex]a_{ii}a_{jj} \geq |a_{ij}|^2[/tex] (with strict inequality > in the positive definite case).

P.S. Horn and Johnson have a full chapter of nearly 100 pages on positive definite matrices. It's an excellent book, and I recommend getting your hands on a copy if you can.
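The theorem and its element-wise consequence can be checked numerically; here is a sketch (the matrix A = B^T B is PSD by construction, and the seed and size are arbitrary choices) that tests every 2x2 principal submatrix.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = B.T @ B  # B^T B is PSD by construction

for i, j in combinations(range(5), 2):
    S = [i, j]
    sub = A[np.ix_(S, S)]  # the principal submatrix A(S)
    # every principal submatrix of a PSD matrix is itself PSD
    assert np.all(np.linalg.eigvalsh(sub) >= -1e-9)
    # element-wise consequence: a_ii * a_jj >= |a_ij|^2
    assert A[i, i] * A[j, j] >= abs(A[i, j])**2 - 1e-9
print("all 2x2 principal submatrices PSD")
```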


#17
May 29, 2009, 11:15 AM

P: 70

@jbunniii,
Thanks, I got the proof too! I'm trying to get a copy of the book. (Previously I used another one, Zhang's "Matrix Theory".) Regards, NP


#18
Dec 25, 2010, 12:47 PM

P: 2

@jbunniii,
Thanks very much, I have been looking for this for days!

