Is Q-P Symmetric Positive-Definite?

  • Context: Graduate 
  • Thread starter: arunakkin
  • Tags: Eigenvalues, Matrix
SUMMARY

The discussion centers on whether Q − P is symmetric positive-definite, where Q = SPS, P is a symmetric positive-definite matrix, and S is a diagonal matrix whose diagonal elements are all greater than 1. Participants suggest that an analytical proof can be sought by absorbing the scaling of S into a modified matrix P* and performing an eigen-analysis on it. The proposed route is to show that the roots of the characteristic polynomial of Q dominate those of P, which would give eig(Q) ≥ eig(P) element-wise after sorting, the eigenvalue form of the original conjecture.

PREREQUISITES
  • Understanding of symmetric positive-definite matrices
  • Familiarity with eigenvalues and eigen-analysis
  • Knowledge of characteristic polynomials
  • Experience with matrix operations and diagonalization
NEXT STEPS
  • Study the properties of symmetric positive-definite matrices in detail
  • Learn about eigenvalue decomposition and its applications
  • Research characteristic polynomials and their significance in linear algebra
  • Explore determinant properties and their implications for matrix transformations
USEFUL FOR

Mathematicians, data scientists, and engineers involved in linear algebra, particularly those working with matrix theory and eigenvalue problems.

arunakkin
Assume P is a symmetric positive-definite matrix,
and let S be a diagonal matrix with all of its diagonal elements greater than 1.

Let Q = SPS

then is Q − P symmetric positive-definite?
i.e.
are the eigenvalues of Q greater than those of P element-wise, eig(Q) ≥ eig(P), in the non-negative orthant?

I ran a simulation to check whether this is true and did not come up with a case that disproves it.
I would be grateful for an analytical proof.
Intuitively it makes sense: applying Q = SPS to a vector scales it by S, applies P, and scales it by S again. Since eigenvalues represent scaling, the resulting eigenvalues of Q should be greater than those of P as long as S stretches vectors in every dimension (i.e., the diagonal elements of S are ≥ 1).
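The simulation described above can be sketched as follows (a hypothetical reconstruction in Python/NumPy, not the original code; the matrix size and random seed are arbitrary choices):

```python
# Sketch of the simulation: draw a random symmetric positive-definite P
# and a diagonal S with entries > 1, form Q = S P S, and compare the
# sorted eigenvalues of Q and P.
import numpy as np

rng = np.random.default_rng(0)       # arbitrary seed
n = 5                                # arbitrary size
A = rng.standard_normal((n, n))
P = A @ A.T + n * np.eye(n)          # symmetric positive-definite
S = np.diag(1.0 + rng.random(n))     # diagonal, entries in (1, 2)
Q = S @ P @ S                        # symmetric, since S is diagonal

eig_P = np.sort(np.linalg.eigvalsh(P))
eig_Q = np.sort(np.linalg.eigvalsh(Q))
print(bool(np.all(eig_Q >= eig_P)))  # element-wise comparison after sorting
```

Running this for many random draws reproduces the observation reported in the thread: no counterexample to eig(Q) ≥ eig(P) turns up.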


Thanks so much,
arunakkin
 
Hey arunakkin and welcome to the forums.

For this particular problem I suggest you create a matrix P* that absorbs the S entries, which is easy since S is diagonal. Multiplying by S on the right scales each column of P by the corresponding diagonal value, and multiplying by S on the left does the same for each row (expand the product out to check this yourself).

This means you get P* whose (i, j) entry is p(i, j) scaled by the i-th and j-th diagonal values of S (so each diagonal entry of P is scaled by the square of the corresponding diagonal value). You can then do an eigen-analysis on this new matrix P* and prove definitively whether your conjecture is correct.
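A quick numerical check of this absorption (a sketch in Python/NumPy, not from the thread) confirms that scaling entry (i, j) of P by the i-th and j-th diagonal values of S reproduces the matrix product Q = SPS:

```python
# Check that the entrywise "absorption" q(i, j) = s(i) * p(i, j) * s(j)
# reproduces the matrix product Q = S P S.
import numpy as np

rng = np.random.default_rng(1)        # arbitrary seed
n = 4                                 # arbitrary size
A = rng.standard_normal((n, n))
P = A @ A.T + n * np.eye(n)           # symmetric positive-definite
s = 1.0 + rng.random(n)               # diagonal entries of S, all > 1
S = np.diag(s)

Q = S @ P @ S                         # the matrix product
P_star = s[:, None] * P * s[None, :]  # the entrywise absorption
print(np.allclose(Q, P_star))         # prints True
```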
 
Hi,
Thanks for the reply and interest.

I think what you are trying to say is that, in terms of above notation,

q(i,j) = p(i,j)*s(i,i)*s(j,j)

This is what I did in the simulation I mentioned in the post.
I randomly generated a symmetric positive-definite matrix, scaled it as above (with s(i,i) ≥ 1 for all i), and compared the eigenvalues of Q with those of P (after sorting them).
I was not able to find a case where eig(Q) < eig(P) element-wise.


But this alone doesn't prove anything, as my simulation might simply not have hit a case that disproves it.
Can you elaborate on the eigen-analysis? Are you referring to diagonalization?
If so, I tried it but was not able to find any meaningful relationship that would definitively prove that Q − P is positive-definite (or that eig(Q) > eig(P) element-wise).

Regards,
arunakkin.
 
Well, yes, the idea is to show that the roots of the characteristic polynomial of Q are greater (since you are trying to show eig(Q) ≥ eig(P)).

What I suggest is to relate the characteristic equation of Q to that of P, taking into account how the rows and columns have been scaled by the diagonal entries of S.

For a general argument, you could use the determinant properties that |A^T| = |A| and that factoring a scalar out of a row or column of a matrix multiplies the determinant by that scalar.

I would start with the characteristic equation and compare the two in some way to show that the roots have the property you desire.

If you show that the roots of char(Q) are greater than those of char(P), that's a real proof and you're done.
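One way to carry out this determinant suggestion (a sketch, assuming only that S is invertible, which holds because its diagonal entries all exceed 1) is to factor S out of the characteristic polynomial of Q on both sides:

```latex
\det(Q - \lambda I) = \det(S P S - \lambda I)
                    = \det(S)\,\det\!\bigl(P - \lambda S^{-2}\bigr)\,\det(S)
                    = \det(S^{2})\,\det\!\bigl(P - \lambda S^{-2}\bigr)
```

Each factor of S contributes det(S), and since det(S²) ≠ 0, the eigenvalues of Q are exactly the solutions of det(P − λS⁻²) = 0, i.e., a generalized eigenvalue problem for the pair (P, S⁻²), which can then be compared against det(P − λI) = 0.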
 
