Is Tr(ABAB) Nonnegative for Symmetric Matrices A and B?

  • Thread starter: jostpuur
  • Tags: Inequality, Trace
SUMMARY

The discussion centers on the claim that for symmetric matrices A and B, if at least one matrix does not have negative eigenvalues, then the trace of the product Tr(ABAB) is nonnegative. The proof hinges on demonstrating that the eigenvalues of the product AB are real, which can be achieved by simultaneously diagonalizing A and B. The participants explore the implications of positive definiteness and the conditions under which Tr(ABAB) can be strictly positive, emphasizing the need for careful handling of zero eigenvalues in the diagonalization process.

PREREQUISITES
  • Understanding of symmetric matrices and their properties
  • Knowledge of eigenvalues and eigenvectors
  • Familiarity with matrix diagonalization techniques
  • Concept of positive definiteness in linear algebra
NEXT STEPS
  • Study the properties of symmetric matrices and their eigenvalues
  • Learn about simultaneous diagonalization of matrices
  • Investigate the implications of positive definiteness on matrix products
  • Explore the concept of trace in linear algebra and its applications
USEFUL FOR

Mathematicians, students of linear algebra, and researchers in applied mathematics focusing on matrix theory and eigenvalue problems.

jostpuur
Here's the claim: Assume that A and B are both symmetric matrices of the same size. Also assume that at least one of them does not have negative eigenvalues. Then

$$\textrm{Tr}(ABAB)\geq 0$$

I don't know how to prove this!
 
If ##\lambda## is an eigenvalue of ##AB## then ##\lambda^2## is an eigenvalue of ##ABAB##. Since the trace is the sum of the eigenvalues, it suffices to show that the eigenvalues of ##AB## are all real. This would be true if ##AB## were symmetric, but the product of two symmetric matrices need not be symmetric. But can you show that ##AB## is similar to a symmetric matrix?
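For what it's worth, one way to make that precise (not necessarily the intended route), assuming WLOG that ##A## is the factor with no negative eigenvalues, so that it has a symmetric square root ##A^{1/2}##: if ##A## is invertible, then ##AB = A^{1/2}\,(A^{1/2}BA^{1/2})\,A^{-1/2}## is similar to the symmetric matrix ##A^{1/2}BA^{1/2}##, so its eigenvalues are real. Even without invertibility, cyclicity of the trace gives

$$\textrm{Tr}(ABAB) = \textrm{Tr}\!\left(\big(A^{1/2}BA^{1/2}\big)^{2}\right) = \big\|A^{1/2}BA^{1/2}\big\|_F^{2} \geq 0.$$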
 
Never mind, I succeeded in proving this now. After months of wondering, the proof appeared within a few minutes of posting to PF, as usual :wink:

My hint for this is that you should first prove that you can assume A to be diagonal with non-negative entries.
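A hedged sketch of why that reduction is harmless: writing ##A = QDQ^T## with ##Q## orthogonal and ##D## diagonal with non-negative entries, cyclicity of the trace gives

$$\textrm{Tr}(ABAB) = \textrm{Tr}\!\left(QDQ^TB\,QDQ^TB\right) = \textrm{Tr}\!\left(D\,(Q^TBQ)\,D\,(Q^TBQ)\right),$$

and ##C = Q^TBQ## is again symmetric, so one may as well take ##A = D##. From there ##\textrm{Tr}(DCDC) = \sum_{i,j} d_i d_j c_{ij}^2 \geq 0##, since all ##d_i \geq 0##.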
 
According to my formulas this is also true: if A is positive definite and B is non-zero (with the other assumptions as above), then

$$\textrm{Tr}(ABAB) > 0$$

It is not clear which of these would be the most useful form of the result.
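Not a proof, of course, but a quick numerical sanity check of both statements is easy to run. Here is a NumPy sketch (the seed, matrix sizes, and tolerances are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(0)

for _ in range(10000):
    n = int(rng.integers(2, 6))
    M = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    B = (B + B.T) / 2                         # symmetric, possibly indefinite

    A = M @ M.T                               # symmetric with no negative eigenvalues
    assert np.trace(A @ B @ A @ B) >= -1e-8   # Tr(ABAB) >= 0, up to rounding error

    A_pd = A + np.eye(n)                      # strictly positive definite
    if np.linalg.norm(B) > 1e-12:             # B non-zero (always true in practice)
        assert np.trace(A_pd @ B @ A_pd @ B) > 0   # strict inequality

print("all random checks passed")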
 
jbunniii said:
Only positive semidefinite in this case.

My bad. I tripped over the meaning of
"at least one of them does not have negative eigenvalues".
Too many negatives!
 
Clearly, if ##A## is allowed to be only semi-definite, it can be zero, and then ##ABAB = 0##.

The basic idea of my thinking is to show that ##AB## has a full set of real eigenvalues, by simultaneously diagonalizing ##A## and ##B##. The eigenvalues of ##(AB)(AB)## are then all real and nonnegative.

If the diagonalization fails because ##A## is singular, replace ##A## with the nonsingular matrix ##A + eI## where ##e > 0##, and take the limit as ##e \to 0##.
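Just to spell out the limiting step (assuming the nonsingular case has already been handled): ##A + eI## is positive definite for every ##e > 0##, so ##\textrm{Tr}\big((A+eI)B(A+eI)B\big) \geq 0## there; and since this trace is a polynomial in ##e##, hence continuous,

$$\textrm{Tr}(ABAB) = \lim_{e\to 0^{+}} \textrm{Tr}\big((A+eI)B(A+eI)B\big) \geq 0.$$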
 
AlephZero said:
If the diagonalization fails because ##A## is singular, replace ##A## with the nonsingular matrix ##A + eI## where ##e > 0##, and take the limit as ##e \to 0##.
I was thinking of doing something similar to how one can "compress" the singular value decomposition when some of the singular values are zero. Here's a sketch:

Since ##A## is symmetric, it can be orthogonally diagonalized: ##A = QDQ^T## where ##D## is diagonal. Since ##A## is positive semidefinite, ##D## has nonnegative diagonal elements: namely, the eigenvalues of ##A##. We may remove any zero eigenvalues as follows. For each ##i## for which ##[D]_{i,i} = 0##, delete the ##i##'th row and column from ##D##. Call the result ##D_0##. Also for each such ##i##, delete the ##i##'th column from ##Q##. Call the result ##Q_0##.

Thus if there is a zero eigenvalue with algebraic multiplicity ##k## and the dimension of ##A## is ##n\times n##, then ##D_0## is ##(n-k)\times(n-k)## and ##Q_0## is ##n \times (n-k)##.

We have ##A = Q_0 D_0 Q_0^T## but now ##D_0## is invertible whereas ##D## was not.

Caveat for any subsequent calculations: ##Q_0## is now rectangular, and we still have ##Q_0^T Q_0 = I## but ##Q_0 Q_0^T## will not be ##I##.

I haven't checked whether this delivers the goods.
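If it helps, here is a minimal NumPy sketch of that compression, assuming ##A## is symmetric positive semidefinite (the function name compressed_eig and the tolerance are just illustrative choices, not anything standard):

import numpy as np

def compressed_eig(A, tol=1e-12):
    """Return Q0, D0 with A = Q0 @ D0 @ Q0.T, where D0 is diagonal and invertible."""
    w, Q = np.linalg.eigh(A)        # orthogonal diagonalization A = Q diag(w) Q^T
    keep = w > tol                  # drop the (numerically) zero eigenvalues
    Q0 = Q[:, keep]                 # delete the corresponding columns of Q
    D0 = np.diag(w[keep])           # delete the corresponding rows/columns of D
    return Q0, D0

# Example: a rank-deficient positive semidefinite 5x5 matrix of rank 3
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 3))
A = M @ M.T

Q0, D0 = compressed_eig(A)
print(np.allclose(A, Q0 @ D0 @ Q0.T))      # True: A = Q0 D0 Q0^T still holds
print(np.allclose(Q0.T @ Q0, np.eye(3)))   # True: Q0^T Q0 = I
print(np.allclose(Q0 @ Q0.T, np.eye(5)))   # False: Q0 Q0^T is only a projector

The last check illustrates the caveat above: ##Q_0 Q_0^T## is only the orthogonal projector onto the column space of ##A##, not the identity.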
 
