Eigenvalues and eigenvectors of a matrix product

In summary, the conversation asks whether knowledge of the eigenvalues and eigenvectors of two nxn matrices, A and B, can be used to say anything about the eigenvalues and eigenvectors of their product C=A*B. The largest eigenvalue of C and its associated eigenvector are of particular interest. It is noted that if A and B share an eigenvector, then that vector is also an eigenvector of the product, with eigenvalue equal to the product of the two individual eigenvalues. However, it is unclear what can be said when there are no shared eigenvectors, and the question remains whether there is any relationship between the eigenvectors of A and B and those of their product C.
  • #1
Leo321
We have two n×n matrices with non-negative elements, A and B.
We know the eigenvalues and eigenvectors of A and B.
Can we use this information to say anything about the eigenvalues or eigenvectors of C=A*B?
The largest eigenvalue of C and the associated eigenvector are of particular interest.
So can anything be said about C? Even a weak inequality may be useful. Are there particular classes of A and B for which we can say something?
We can't, however, assume that the matrices commute.
 
  • #2
We can, provided A and B share an eigenvector. If v is an eigenvector of A with eigenvalue [itex]\lambda_A[/itex] and also an eigenvector of B with eigenvalue [itex]\lambda_B[/itex] then
[tex]ABv= A(Bv)= A(\lambda_Bv)= \lambda_BAv= \lambda_B(\lambda_Av)= (\lambda_B\lambda_A)v[/tex]
and
[tex]BAv= B(Av)= B(\lambda_Av)= \lambda_ABv= \lambda_A(\lambda_Bv)= (\lambda_A\lambda_B)v[/tex]

That is, if v is an eigenvector of both A and B, with eigenvalues [itex]\lambda_A[/itex] and [itex]\lambda_B[/itex] respectively, then it is also an eigenvector of both AB and BA with eigenvalue [itex]\lambda_A\lambda_B[/itex].
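
As a quick numerical check (an illustrative NumPy sketch, not part of the original posts; the matrices are made up so that they share the eigenvector (1, 1)):

[code]
import numpy as np

# Illustrative sketch: two non-negative 2x2 matrices sharing the eigenvector v = (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 3 and 1; (1, 1) belongs to 3
B = np.array([[4.0, 2.0],
              [2.0, 4.0]])   # eigenvalues 6 and 2; (1, 1) belongs to 6

v = np.array([1.0, 1.0])
lam_A, lam_B = 3.0, 6.0      # eigenvalues of A and B for the shared eigenvector v

# Both AB and BA should send v to (lam_A * lam_B) * v = 18 * v.
print(A @ B @ v)                                   # [18. 18.]
print(B @ A @ v)                                   # [18. 18.]
print(np.allclose(A @ B @ v, lam_A * lam_B * v))   # True
[/code]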
 
  • #3
The question is what we can know about the product if there are no shared eigenvectors. What happens, for example, if the eigenvectors are close but not the same?
Is there anything we can say about the eigenvectors of the product based on the eigenvectors and eigenvalues of A and B?
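
One way to probe this is a small numerical experiment (an illustrative sketch with made-up matrices, not a result from the thread): start from a B that shares eigenvectors with A, perturb it by an amount eps, and watch how the largest eigenvalue and eigenvector of C = A*B drift away from the shared-eigenvector prediction:

[code]
import numpy as np

rng = np.random.default_rng(0)

def largest_eig(M):
    """Eigenvalue of largest magnitude and its (sign-normalized) eigenvector."""
    w, V = np.linalg.eig(M)
    i = np.argmax(np.abs(w))
    return w[i].real, np.abs(V[:, i].real)

# Hypothetical experiment: B = A^2 + 0.5*A shares eigenvectors with A,
# then a non-negative perturbation of size eps breaks the sharing.
A = rng.random((4, 4))                       # non-negative entries
for eps in [0.0, 0.01, 0.1, 0.5]:
    B = A @ A + 0.5 * A + eps * rng.random((4, 4))
    lam_A, v_A = largest_eig(A)
    lam_B, _ = largest_eig(B)
    lam_C, v_C = largest_eig(A @ B)
    overlap = abs(v_A @ v_C) / (np.linalg.norm(v_A) * np.linalg.norm(v_C))
    print(f"eps={eps:4.2f}  lam_A*lam_B={lam_A * lam_B:9.3f}  "
          f"lam_C={lam_C:9.3f}  eigvec overlap={overlap:.4f}")
[/code]

At eps = 0 the largest eigenvalue of C is exactly lam_A*lam_B and the Perron eigenvectors coincide; as eps grows, the printout shows how quickly both quantities drift apart.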
 

FAQ: Eigenvalues and eigenvectors of a matrix product

1. What are eigenvalues and eigenvectors of a matrix product?

Eigenvalues and eigenvectors are fundamental concepts in linear algebra. In the context of a matrix product C = A*B, an eigenvector of C is a nonzero vector that the matrix maps to a scalar multiple of itself, and the corresponding eigenvalue is that scalar. In other words, the eigenvectors give the directions that the transformation only stretches or shrinks rather than rotates, and the eigenvalues are the scaling factors.
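
As a minimal illustration (a sketch added for clarity, with arbitrary example matrices), the defining relation can be checked numerically:

[code]
import numpy as np

# Sketch: verify the defining relation C v = lambda * v for one eigenpair
# of a concrete product C = A B (the matrices are arbitrary examples).
A = np.array([[2.0, 1.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [1.0, 3.0]])
C = A @ B

eigvals, eigvecs = np.linalg.eig(C)
lam, v = eigvals[0], eigvecs[:, 0]
print(np.allclose(C @ v, lam * v))   # True: v is only scaled by lam, not rotated
[/code]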

2. How do you find the eigenvalues and eigenvectors of a matrix product?

Finding the eigenvalues of the product C = A*B involves solving its characteristic equation: subtract the identity matrix multiplied by a scalar [itex]\lambda[/itex] from C and set the determinant equal to zero, [itex]\det(C - \lambda I) = 0[/itex]. The roots of this equation are the eigenvalues of C. To find the corresponding eigenvector for each eigenvalue, substitute it back and solve the homogeneous system [itex](C - \lambda I)v = 0[/itex]. Note that, in general, the eigenvalues of A and B alone do not determine those of C unless extra structure (such as a shared eigenvector) is present.
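
A short sketch of this procedure in NumPy (the matrices are arbitrary examples; in practice one simply calls np.linalg.eig on the product):

[code]
import numpy as np

# Sketch of FAQ 2 for a concrete product C = A B.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 0.0], [2.0, 3.0]])
C = A @ B

# Coefficients of the characteristic polynomial det(C - lambda*I), highest power
# first.  Its roots are the eigenvalues of C.
coeffs = np.poly(C)
print(np.roots(coeffs))          # eigenvalues from the characteristic equation

# The standard route: np.linalg.eig solves (C - lambda*I) v = 0 for each lambda,
# returning the eigenvalues and the corresponding eigenvectors as columns.
eigvals, eigvecs = np.linalg.eig(C)
print(eigvals)
print(eigvecs)
[/code]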

3. Why are eigenvalues and eigenvectors important in data analysis and machine learning?

Eigenvalues and eigenvectors play a crucial role in data analysis and machine learning algorithms, as they allow for the reduction of the dimensionality of data and the extraction of important features. By finding the eigenvalues and eigenvectors of a data set's covariance matrix, it is possible to identify the most significant directions or patterns present in the data, which can then be used to make predictions or classify new data points.
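
A minimal sketch of the idea behind this answer, assuming the usual PCA-style setup of eigen-decomposing a covariance matrix (the toy data below is made up for illustration):

[code]
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 200 points in 3-D that mostly vary along one direction.
X = (rng.normal(size=(200, 1)) @ np.array([[3.0, 2.0, 1.0]])
     + 0.1 * rng.normal(size=(200, 3)))
X -= X.mean(axis=0)

# Eigen-decomposition of the covariance matrix (eigh: it is symmetric).
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort by decreasing eigenvalue and project onto the dominant direction.
order = np.argsort(eigvals)[::-1]
top_direction = eigvecs[:, order[0]]
X_reduced = X @ top_direction      # 1-D representation of the 3-D data
print(eigvals[order])              # variance captured per direction
print(X_reduced.shape)             # (200,)
[/code]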

4. Can a matrix product have multiple eigenvalues and eigenvectors?

Yes. An n×n matrix product has n eigenvalues when they are counted with multiplicity, each with at least one corresponding eigenvector. The eigenvalues need not be distinct — the identity matrix, for example, has the single eigenvalue 1 repeated n times, and every nonzero vector is an eigenvector — and a defective matrix can have fewer than n linearly independent eigenvectors.

5. How are eigenvalues and eigenvectors related to the diagonalization of a matrix?

Diagonalizing a matrix means finding a diagonal matrix D that is similar to the original matrix C, so that the two share the same eigenvalues. If the columns of a matrix P are linearly independent eigenvectors of C and D holds the corresponding eigenvalues on its diagonal, then [itex]C = PDP^{-1}[/itex]. In other words, diagonalization expresses a matrix in terms of its eigenvalues and eigenvectors, which can simplify calculations (such as matrix powers) and reveal important properties of the matrix.
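
A brief sketch of this relationship, assuming the matrix is diagonalizable (the example matrix is arbitrary):

[code]
import numpy as np

# Sketch: diagonalize C and rebuild it as P D P^{-1}, where the columns of P
# are eigenvectors and D holds the eigenvalues on its diagonal.
C = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(C)
D = np.diag(eigvals)

C_rebuilt = P @ D @ np.linalg.inv(P)
print(np.allclose(C, C_rebuilt))            # True: C = P D P^{-1}

# Diagonalization also makes powers cheap: C^k = P D^k P^{-1}.
C_cubed = P @ np.diag(eigvals**3) @ np.linalg.inv(P)
print(np.allclose(np.linalg.matrix_power(C, 3), C_cubed))   # True
[/code]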
