# Eigenvectors and eigenvalues under row flipping

Suppose we compute the eigendecomposition of the matrix below. I wonder what happens to the eigenvectors and eigenvalues of the matrix obtained by flipping (permuting) the rows of the original. Say the original is
```
0 5 7 8
5 0 2 9
7 2 0 3
8 9 3 0
```
and the flipped version is:
```
5 0 2 9
8 9 3 0
7 2 0 3
0 5 7 8
```
An online matrix calculator suggests (at first glance) that the eigendecompositions of the original and the flipped version have no relation. But the SVDs of the original and the flipped version do appear related: the right singular vectors and singular values of the two are the same, and they appear to match the eigenvectors and eigenvalues of the original. (The left singular vectors of the flipped matrix are also row-flipped.) I would like to hear the reasoning behind this behaviour.
More precisely, what is the relation between the eigendecomposition of the original and the flipped version, and how is this related to the SVD?

Thanks

Generally speaking, flipping one row with another can be described by a matrix acting from the left on the original matrix, i.e.

A (the flipping matrix) * original_matrix = matrix_with_flipped_rows
(likewise, original_matrix * A = matrix_with_flipped_columns)

In this particular case A is something simple: a permutation matrix, i.e. the identity matrix with its rows re-organized.
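To make this concrete, here is a minimal numpy sketch; the permutation order `[1, 3, 2, 0]` is read off from the flipped matrix in the question:

```python
import numpy as np

# Original matrix from the question.
M = np.array([[0, 5, 7, 8],
              [5, 0, 2, 9],
              [7, 2, 0, 3],
              [8, 9, 3, 0]], dtype=float)

# The "flipping matrix": the identity with its rows re-organized so that
# the new row order is (row 2, row 4, row 3, row 1) of the original.
P = np.eye(4)[[1, 3, 2, 0]]

# The flipped matrix as given in the question.
flipped = np.array([[5, 0, 2, 9],
                    [8, 9, 3, 0],
                    [7, 2, 0, 3],
                    [0, 5, 7, 8]], dtype=float)

print(np.allclose(P @ M, flipped))   # True: left-multiplication flips rows
# (right-multiplying, M @ P, permutes columns instead)
```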

So in the case of the SVD, the U vectors absorb the row flipping applied to your original matrix: since M = U * E * V^T, flipping gives P * M = (P * U) * E * V^T, which is again a valid SVD because P is orthogonal. The singular values E and right singular vectors V are untouched; only U gets flipped.

For the eigendecomposition, the roles of U and V are played by the same set of vectors, so U can't simply absorb the change on its own; a new set of eigenvectors has to be found.
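A quick numpy check of both claims (a sketch, using the question's matrix and the row-flip permutation):

```python
import numpy as np

# Original symmetric matrix from the question and the row-flip permutation.
M = np.array([[0, 5, 7, 8],
              [5, 0, 2, 9],
              [7, 2, 0, 3],
              [8, 9, 3, 0]], dtype=float)
P = np.eye(4)[[1, 3, 2, 0]]

# Singular values survive the flip: P is orthogonal, so
# P*M = (P*U)*E*V^T is again a valid SVD with the same E and V.
s  = np.linalg.svd(M, compute_uv=False)
sf = np.linalg.svd(P @ M, compute_uv=False)
print(np.allclose(s, sf))        # True

# Eigenvalues do not survive: even the traces differ (0 vs 22),
# so P*M cannot be similar to M.
print(np.trace(M), np.trace(P @ M))
```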

So, the eigenvectors and eigenvalues of the original and flipped matrices are not related (meaning the eigendecomposition of the flipped matrix needs to be computed anew)?

Short answer: yes, it needs to be recomputed.

$$Ax = \lambda x$$
where A is your original matrix, x is an eigenvector, and $$\lambda$$ is the corresponding eigenvalue.

Now apply some matrix transformation R (a rotation, a row flip, etc.) on the left.

Now:
$$RAx = \lambda Rx$$

To turn this into an eigenvalue equation for the vector Rx, insert an identity matrix between A and x and, assuming R is invertible, write it as $$R^{-1}R$$.

So you have
$$(RAR^{-1})(Rx) = \lambda(Rx)$$

In that case the eigenvectors would be easily related: Rx is an eigenvector of RAR^{-1} with the same eigenvalue $$\lambda$$. But in your case no R^{-1} is applied on the right, i.e. the flipped matrix is not similar to the original matrix and will have its own eigenvalues/vectors.
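A small numpy illustration of the similarity case, taking R to be the question's row-flip permutation (for a permutation matrix, R^{-1} = R^T):

```python
import numpy as np

A = np.array([[0, 5, 7, 8],
              [5, 0, 2, 9],
              [7, 2, 0, 3],
              [8, 9, 3, 0]], dtype=float)
R = np.eye(4)[[1, 3, 2, 0]]      # row-flip permutation; R^{-1} = R^T

vals, vecs = np.linalg.eigh(A)   # A is symmetric; eigenvalues ascending

B = R @ A @ R.T                  # the similar matrix R*A*R^{-1}
print(np.allclose(np.linalg.eigvalsh(B), vals))   # True: same eigenvalues

# And R*x is an eigenvector of B for each eigenvector x of A:
x, lam = vecs[:, 0], vals[0]
print(np.allclose(B @ (R @ x), lam * (R @ x)))    # True
```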



Thanks.