Eigenvalues of a scaled matrix

In summary: arunakkin asks whether, given a symmetric positive-definite matrix P and a diagonal matrix S whose diagonal entries all exceed 1, the matrix Q = SPS satisfies eig(Q) >= eig(P) in the non-negative orthant.
  • #1
arunakkin
Assume P is a symmetric positive-definite matrix,
and let S be a diagonal matrix with all its diagonal elements greater than 1.

Let Q = SPS

then is Q-P symmetric positive-definite?
i.e.
are the eigenvalues of Q greater than those of P element-wise, i.e. is eig(Q) >= eig(P) in the non-negative orthant?

I tried a simulation to check whether it is true, and I did not come up with a case that disproves it.
I would be grateful if an analytical proof could be provided.
Intuitively it makes sense: any vector multiplied by Q (= SPS) is scaled by S both before and after being acted on by P, and since eigenvalues represent scaling, the resulting eigenvalues of Q should be greater than those of P as long as S stretches vectors in all dimensions (i.e. the diagonal elements of S are > 1).
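A minimal sketch of such a simulation (assuming numpy; this is an illustrative reconstruction, not the exact script from the post) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
# Random symmetric positive-definite P: A A^T plus a ridge term.
A = rng.standard_normal((n, n))
P = A @ A.T + n * np.eye(n)

# Diagonal S with all diagonal entries > 1.
S = np.diag(1.0 + rng.uniform(0.1, 2.0, size=n))

Q = S @ P @ S

# Compare the sorted eigenvalues (both matrices are symmetric,
# so eigvalsh returns them in ascending order).
eig_P = np.linalg.eigvalsh(P)
eig_Q = np.linalg.eigvalsh(Q)

print(np.all(eig_Q >= eig_P))  # True: the conjectured element-wise dominance
```

Repeating this for many random P and S never produces a violation, which is consistent with what the post reports.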


Thanks so much,
arunakkin
 
  • #2
Hey arunakkin and welcome to the forums.

For this particular problem I suggest you create a matrix P* into which you absorb the S entries, since S is diagonal. Absorbing these entries is simple: multiply each column by the appropriate diagonal value for that column, and do the same thing for the right-hand S matrix (expand the product out just for clarity and to check this yourself).

This means you will get a P* in which each entry is multiplied by the product of the appropriate diagonal values (the diagonal entries of P* pick up the squares). You can then do an eigen-analysis on this new matrix P* and prove definitively whether your conjecture is correct.
 
  • #3
Hi,
Thanks for the reply and interest.

I think what you are trying to say is that, in terms of the above notation,

q(i,j) = p(i,j)*s(i,i)*s(j,j)

This is what I did in the simulation I mentioned in the post.
I randomly generated a symmetric positive-definite matrix, scaled it as above (with s(i,i) >= 1 for all i), and compared the eigenvalues of Q with those of P (after sorting them).
I wasn't able to find a case where eig(Q) < eig(P) (element-wise).
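The entrywise relation q(i,j) = p(i,j)*s(i,i)*s(j,j) is easy to confirm numerically (a sketch assuming numpy; the matrices are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 4
A = rng.standard_normal((n, n))
P = A @ A.T + n * np.eye(n)          # symmetric positive definite
s = 1.0 + rng.uniform(0.5, 1.5, n)   # diagonal entries of S, all > 1
S = np.diag(s)

Q = S @ P @ S

# q(i,j) = p(i,j) * s(i) * s(j), i.e. Q equals P times outer(s, s) entrywise.
assert np.allclose(Q, P * np.outer(s, s))
```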


But this alone doesn't prove anything (I might simply not have hit a disproving case in my simulation).
Can you elaborate on the eigen-analysis? Are you referring to diagonalization?
If so, I tried it but was not able to find any meaningful relationship that would definitively prove that (Q-P) is positive definite (or eig(Q) > eig(P), element-wise).

Regards,
arunakkin.
 
  • #4
Well, yes: the idea is to show that the roots of the characteristic polynomial of Q are greater than those of P (since you are trying to show eig(Q) >= eig(P)).

What I suggest is to relate the characteristic equation of Q to that of P, taking into account how the entries have been scaled by the diagonal values (with the diagonal entries picking up the squares).

If you wanted a general argument, you could use the properties of the determinant, namely |A^T| = |A| and the effect on the determinant of factoring a scalar out of a row or column of the matrix.

I would start with the characteristic equation and compare the two in some way to show that the roots have the property you desire.

If you show that the roots of char(Q) are greater than those of char(P), that's a real proof and you're done.
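One way to relate the two characteristic polynomials, using det(AB) = det(A)det(B) and the invertibility of S (a sketch of the determinant manipulation, not a full proof):

```latex
\det(Q - \lambda I)
  = \det(SPS - \lambda I)
  = \det\!\big(S\,(PS - \lambda S^{-1})\big)
  = \det\!\big((P - \lambda S^{-2})\,S\big)\det(S)
  = \det(S^{2})\,\det\!\big(P - \lambda S^{-2}\big).
```

So the eigenvalues of Q are the values of lambda at which P - lambda*S^{-2} becomes singular, whereas the eigenvalues of P are those at which P - lambda*I does; since the diagonal entries of S^{-2} all lie below 1, this is where the assumption on S enters.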
 
  • #5
eni

I would like to first clarify that the statement "eig(Q) >= eig(P)" does not mean that every eigenvalue of Q is greater than or equal to every eigenvalue of P. It means that, after sorting, each eigenvalue of Q is greater than or equal to the corresponding eigenvalue of P; since both matrices here are positive definite, all of these eigenvalues are positive (hence "in the non-negative orthant").

Now, to address the question at hand, let us first consider the definition of a symmetric positive-definite matrix. A matrix P is symmetric positive-definite if it satisfies the following conditions:

1. P is symmetric, i.e. P = P^T
2. All eigenvalues of P are positive, i.e. eig(P) > 0 (equivalently, x^T P x > 0 for every nonzero vector x)

Using these conditions, we can verify the symmetry of Q-P and then examine the eigenvalue claim.

1. Symmetry:
Since P is symmetric and S is diagonal (so S = S^T), we have (Q-P)^T = (SPS)^T - P^T = S^T P^T S^T - P = SPS - P = Q-P. Therefore Q-P is symmetric.

2. Eigenvalue comparison:
Q = SPS = S^T P S, so Q is congruent to P via the nonsingular matrix S. By Ostrowski's theorem on congruence transformations, the k-th sorted eigenvalue of Q satisfies eig_k(Q) = t_k * eig_k(P) for some t_k lying between the smallest and largest eigenvalues of S^T S = S^2. Since every diagonal entry of S is greater than 1, every eigenvalue of S^2 is greater than 1, so t_k > 1 and eig_k(Q) > eig_k(P) for every k.

In conclusion, Q-P is symmetric, and the sorted eigenvalues of Q are strictly greater than the corresponding eigenvalues of P, which settles the conjecture eig(Q) >= eig(P). Note, however, that this sorted-eigenvalue dominance is weaker than Q-P being positive definite: eig_k(Q) > eig_k(P) for all k does not by itself force x^T (Q-P) x > 0 for every x, so the stronger claim should not be taken for granted.
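It is worth probing numerically whether Q - P is itself positive definite; a quick check (assuming numpy; the specific P and S below are illustrative choices, not from the thread) shows that the sorted-eigenvalue dominance can hold even when Q - P is indefinite:

```python
import numpy as np

# Illustrative choice: strongly correlated P, and an S that stretches
# one axis much more than the other.
P = np.array([[1.0, 0.9],
              [0.9, 1.0]])
S = np.diag([2.0, 1.001])

Q = S @ P @ S

eig_P = np.linalg.eigvalsh(P)        # ascending order
eig_Q = np.linalg.eigvalsh(Q)

print(np.all(eig_Q >= eig_P))        # True: sorted eigenvalues of Q dominate
print(np.linalg.eigvalsh(Q - P)[0])  # negative: Q - P is indefinite here
```

So the element-wise comparison of sorted eigenvalues and the positive-definiteness of Q - P are genuinely different statements.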
 

1. What are eigenvalues of a scaled matrix?

The eigenvalues of a matrix are the special set of numbers that represent the scaling factors for the corresponding eigenvectors. In other words, they are the values λ for which Av = λv holds for some nonzero vector v, so that multiplying the matrix by an eigenvector produces a scaled version of that same vector.
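This definition can be checked directly (a small numpy sketch; the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric, with eigenvalues 1 and 3

vals, vecs = np.linalg.eigh(A)

# Each eigenvector is mapped to a scaled copy of itself: A v = lambda v.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)

print(vals)  # [1. 3.]
```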

2. How do you find the eigenvalues of a scaled matrix?

To find the eigenvalues of a matrix, you solve the characteristic equation det(A - λI) = 0: form the determinant of A - λI, which is a polynomial in λ, and find the values of λ that make it equal to zero. Those roots are the eigenvalues; numerically, dedicated eigenvalue routines compute the same quantities more stably.
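Both routes give the same numbers, which is easy to confirm (a sketch assuming numpy, whose legacy `np.poly` helper returns the characteristic-polynomial coefficients):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Route 1: a direct eigenvalue routine.
direct = np.sort(np.linalg.eigvals(A).real)

# Route 2: roots of the characteristic polynomial det(A - lambda I) = 0.
coeffs = np.poly(A)                        # [1, -trace, det] for this 2x2
from_roots = np.sort(np.roots(coeffs).real)

print(direct, from_roots)  # both give eigenvalues 1 and 3
```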

3. What is the significance of eigenvalues of a scaled matrix?

The eigenvalues of a scaled matrix have many applications in various fields such as physics, engineering, and computer science. They are used to find the equilibrium points of a system, solve differential equations, and perform dimensionality reduction in data analysis.

4. Can a scaled matrix have complex eigenvalues?

Yes, a scaled matrix can have complex eigenvalues. In fact, complex eigenvalues often occur when dealing with systems that involve oscillatory behavior. This is because the characteristic equation may have complex roots, resulting in complex eigenvalues.
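A 90-degree rotation matrix is the standard illustration of complex eigenvalues arising from oscillatory (rotational) behavior (numpy sketch):

```python
import numpy as np

# 90-degree rotation: no real vector is mapped to a real multiple of itself,
# so the eigenvalues are the complex conjugate pair +i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals = np.linalg.eigvals(R)
assert np.allclose(np.sort_complex(vals), [-1j, 1j])
```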

5. How do eigenvalues change when a matrix is scaled?

When a matrix is scaled by a factor of k, the eigenvalues are also scaled by the same factor. In other words, multiplying a matrix by a constant results in the eigenvalues being multiplied by the same constant. For example, if a matrix has eigenvalues of 2 and 5, scaling it by 3 would result in eigenvalues of 6 and 15.
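The k-scaling rule is easy to verify with the same example numbers (numpy sketch):

```python
import numpy as np

A = np.diag([2.0, 5.0])        # eigenvalues 2 and 5
B = 3.0 * A                    # scale the whole matrix by k = 3

print(np.linalg.eigvalsh(A))   # [2. 5.]
print(np.linalg.eigvalsh(B))   # [ 6. 15.]
```

Note that this uniform scaling (kA) is different from the two-sided diagonal scaling SPS discussed in the thread, where each entry picks up a product of two diagonal factors.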
