Eigen-vectors/values under row flipping

Flipping the rows of a matrix changes its eigenvectors and eigenvalues, so the eigendecomposition of the flipped matrix must be recomputed from scratch. The singular value decomposition (SVD) shows why: the right singular vectors and singular values are the same for the original and flipped matrices, while the left singular vectors of the flipped matrix are permuted in the same way as the rows. The SVD can absorb a row permutation into its left singular vectors, but an eigendecomposition cannot, because eigenvalues survive only a similarity transform. Thus the eigendecompositions of the original and flipped matrices are not related in any simple way.
onako
After the eigendecomposition of the following matrix is performed, I wonder what happens to the eigenvectors and eigenvalues of the matrix obtained by flipping rows of the original. Say the original is
0 5 7 8
5 0 2 9
7 2 0 3
8 9 3 0
and the flipped version is:
5 0 2 9
8 9 3 0
7 2 0 3
0 5 7 8
An online matrix calculator suggests (at first glance) that the eigendecomposition data of the original and the flipped version have no relation. But the SVDs of the original and the flipped matrix do appear related: the right singular vectors and singular values of the two are the same, and they equal the eigenvectors and eigenvalues of the original. (The left singular vectors of the flipped matrix are the correspondingly flipped left singular vectors of the original.) I would like to hear the reasoning behind this behaviour.
More precisely, what is the relation between the eigendecomposition of the original and the flipped version, and how might this be related to the SVD?

Thanks
 
Generally speaking, swapping one row with another can be described by a matrix acting from the left on the original matrix, i.e.

A (the flipping matrix) * original_matrix = matrix_with_flipped_rows
(likewise, original_matrix * A = matrix_with_flipped_columns)

In this particular case A is just a permutation matrix: an identity matrix with its rows re-ordered.

So in the case of the SVD, the U vectors absorb the row flipping applied to your original matrix: since M = U*E*V_t, flipping the rows simply replaces U by A*U, leaving E and V_t untouched.

For an eigendecomposition the U and V have to be the same set of vectors, so U can't just absorb the change on its own, and a new set of eigenvectors has to be found.
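The absorption of the row flip by the left singular vectors can be checked numerically. Here is a minimal NumPy sketch using the matrices from the thread; the permutation order [2, 4, 3, 1] is read off the flipped matrix above, and the sign comparisons assume the singular values are distinct:

```python
import numpy as np

# Symmetric matrix from the thread.
M = np.array([[0, 5, 7, 8],
              [5, 0, 2, 9],
              [7, 2, 0, 3],
              [8, 9, 3, 0]], dtype=float)

# Row-flipping matrix A: the identity with its rows re-ordered so that
# A @ M is exactly the flipped matrix from the thread (rows 2, 4, 3, 1).
A = np.eye(4)[[1, 3, 2, 0]]
B = A @ M

# If M = U S V^T, then A M = (A U) S V^T is also a valid SVD,
# because A U is still orthogonal. So the singular values and the
# right singular vectors survive the row flip; only U is permuted.
U, s, Vt = np.linalg.svd(M)
U2, s2, Vt2 = np.linalg.svd(B)

print(np.allclose(s, s2))                      # True: singular values unchanged
# Up to per-column sign choices (assuming distinct singular values):
print(np.allclose(np.abs(Vt), np.abs(Vt2)))    # right singular vectors unchanged
print(np.allclose(np.abs(U2), np.abs(A @ U)))  # left ones are the flipped U
```

The absolute values are compared because an SVD routine may flip the sign of a (u, v) column pair; that ambiguity is exactly why the comparison is made columnwise up to sign.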
 
So, the eigenvectors and eigenvalues of the original and the flipped matrix are not related (meaning that the eigendecomposition of the flipped matrix needs to be computed anew)?
 
Short answer: it needs to be recomputed anew.

Start with the basic eigenvalue equation:

A x = \lambda x

where A is your original matrix, x is an eigenvector, and \lambda is the corresponding eigenvalue.

Now apply some matrix transformation R (a rotation, a row flip, and so on) on the left:

R A x = \lambda (R x)

To make the vector on the left match the R x on the right, so that both sides involve the same eigenvector, insert an identity matrix between A and x and, assuming R is invertible, write it as R^{-1} R:

(R A R^{-1}) (R x) = \lambda (R x)

In that case the eigenvectors are easily related: R A R^{-1} has the same eigenvalues \lambda as A, with eigenvectors R x. But in your case no R^{-1} is applied, i.e. the flipped matrix R A is not similar to the original, and so it has its own eigenvalues and eigenvectors.

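The similarity argument can also be verified numerically. A short sketch (assuming NumPy, with R taken to be the row-flipping permutation from the thread, so that R^{-1} = R^T):

```python
import numpy as np

M = np.array([[0, 5, 7, 8],
              [5, 0, 2, 9],
              [7, 2, 0, 3],
              [8, 9, 3, 0]], dtype=float)

# Row-flipping permutation R; for any permutation matrix, R^{-1} = R^T.
R = np.eye(4)[[1, 3, 2, 0]]

# Sorted eigenvalue list, as complex so non-symmetric matrices work too.
spectrum = lambda X: np.sort_complex(np.linalg.eigvals(X))

# R M R^{-1} is similar to M: identical eigenvalues, eigenvectors R x.
print(np.allclose(spectrum(R @ M @ R.T), spectrum(M)))   # True

# R M alone is NOT similar to M. Even the traces differ (22 vs 0),
# so the eigenvalue sums, and hence the spectra, cannot match.
print(np.allclose(spectrum(R @ M), spectrum(M)))         # False
```

The trace comparison is the quickest way to see the failure: a similarity transform preserves the trace, but the row flip moves nonzero entries onto the diagonal.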
 
 
Thanks.
 
