# Verification sequence of eigenvalue problem

1. May 29, 2012

### Ronankeating

Hi all,

What is the normal procedure to verify that I got the correct results (eigenvalues and eigenvectors) from an eigenvalue problem?

I'm using the LAPACK library to solve the eigenvalue problem summarized below. I have two matrices, K and M, and I get negative eigenvalues, which made me suspect that the results could be wrong. To verify them, the given solutions (eigenvalues and eigenvectors) must satisfy the equation K*X = λ*M*X, where λ = eigenvalues and X = eigenvectors. I computed λ*M, where I was expecting to recover the K matrix, but the results (labelled "Verification phase" below) are not even close to the entries of the K matrix.

What am I doing wrong here?

K MATRIX
0.2400 0.3900 0.4200 -0.1600
0.3900 -0.1100 0.7900 0.6300
0.4200 0.7900 -0.2500 0.4800
-0.1600 0.6300 0.4800 -0.0300
M MATRIX
4.1600 -3.1200 0.5600 -0.1000
-3.1200 5.0300 -0.8300 1.0900
0.5600 -0.8300 0.7600 0.3400
-0.1000 1.0900 0.3400 1.1800

EIGENVECTORS ARE:
-0.6901E-01 0.3080E+00 -0.4469E+00 -0.5528E+00
-0.5740E+00 0.5329E+00 -0.3708E-01 -0.6766E+00
-0.1543E+01 -0.3496E+00 0.5048E-01 -0.9276E+00
0.1400E+01 -0.6211E+00 0.4743E+00 0.2510E+00

EIGENVALUES
-2.2254 -0.4548 0.1001 1.1270

VERIFICATION PHASE (λ*M )
-9.2579 6.9434 -1.2463 0.2225
1.4188 -2.2874 0.3774 -0.4957
0.0560 -0.0831 0.0761 0.0340
-0.1127 1.2285 0.3832 1.3299

2. May 29, 2012

### AlephZero

It looks OK to me. For the first eigenpair:
$$Kx = K\begin{bmatrix}-0.0690 \\ -0.5740 \\ -1.5430 \\ 1.4000 \end{bmatrix} = \begin{bmatrix} -1.1125 \\ -0.3007 \\ 0.5753 \\ -1.1332 \end{bmatrix}$$
$$Mx = M\begin{bmatrix}-0.0690 \\ -0.5740 \\ -1.5430 \\ 1.4000 \end{bmatrix} = \begin{bmatrix} 0.4997 \\ 0.1348 \\ -0.2589 \\ 0.5086 \end{bmatrix}$$
And that is consistent with Kx = -2.2254 Mx.
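This per-eigenpair check is easy to script. A minimal sketch in plain Python (no libraries), with K, M, the first eigenvalue `lam` and the first eigenvector `x` copied from the LAPACK output quoted in the original post, rounded as above; the helper name `matvec` is mine:

```python
# K, M, lam and x are taken from the data quoted in the thread.
K = [[ 0.24,  0.39,  0.42, -0.16],
     [ 0.39, -0.11,  0.79,  0.63],
     [ 0.42,  0.79, -0.25,  0.48],
     [-0.16,  0.63,  0.48, -0.03]]

M = [[ 4.16, -3.12,  0.56, -0.10],
     [-3.12,  5.03, -0.83,  1.09],
     [ 0.56, -0.83,  0.76,  0.34],
     [-0.10,  1.09,  0.34,  1.18]]

lam = -2.2254                          # first eigenvalue
x = [-0.0690, -0.5740, -1.5430, 1.4000]  # first eigenvector (first column)

def matvec(A, v):
    """Matrix-vector product A @ v for plain nested lists."""
    return [sum(a_ij * v_j for a_ij, v_j in zip(row, v)) for row in A]

Kx = matvec(K, x)
Mx = matvec(M, x)

# Each component of K x should equal lam times the same component of M x,
# up to the rounding of the printed four-figure data.
residual = max(abs(kx_i - lam * mx_i) for kx_i, mx_i in zip(Kx, Mx))
print(Kx)        # close to [-1.1125, -0.3007, 0.5753, -1.1332]
print(residual)  # small, on the order of the rounding error
```

The same loop, run over all four eigenpairs, is the standard sanity check for a generalized eigensolver's output.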

3. May 29, 2012

### AlephZero

Just to add: I think you got confused about which quantities are scalars, matrices, or vectors in your equation K*X = λ*M*X.

If you write it in that form, X is a column vector corresponding to the scalar λ, so you can't "cancel the vector X from each side". Of course there are really N separate equations, one for each λ and the corresponding X.

You can write it as one big matrix equation, but then you have to put the eigenvalues on the right-hand side, $KX = MX \Lambda$, where $\Lambda$ is a diagonal matrix containing the λ's. $\Lambda MX$ multiplies the rows of MX by the eigenvalues, but $MX\Lambda$ multiplies the columns of MX, which is what you want.

And you can't "cancel out X" from $KX = MX \Lambda$ either, but you can write $X^TKX = X^TMX \Lambda$, and you should find that $X^TKX$ and $X^TMX$ are both diagonal matrices. In fact the eigenvectors are often scaled so that $X^TMX = I$ and $X^TKX = \Lambda$.
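These matrix-form identities can be checked numerically too. A minimal sketch in plain Python, assuming the eigenvectors quoted in the first post are the columns of X and are M-normalized so that $X^TMX = I$, as symmetric-definite LAPACK drivers typically return them; the helper names `matmul` and `transpose` are mine:

```python
# K, M, X (eigenvectors as columns) and the eigenvalues are taken from
# the data quoted in the thread, rounded to four figures.
K = [[ 0.24,  0.39,  0.42, -0.16],
     [ 0.39, -0.11,  0.79,  0.63],
     [ 0.42,  0.79, -0.25,  0.48],
     [-0.16,  0.63,  0.48, -0.03]]

M = [[ 4.16, -3.12,  0.56, -0.10],
     [-3.12,  5.03, -0.83,  1.09],
     [ 0.56, -0.83,  0.76,  0.34],
     [-0.10,  1.09,  0.34,  1.18]]

X = [[-0.0690,  0.3080, -0.4469, -0.5528],
     [-0.5740,  0.5329, -0.0371, -0.6766],
     [-1.5430, -0.3496,  0.0505, -0.9276],
     [ 1.4000, -0.6211,  0.4743,  0.2510]]

lams = [-2.2254, -0.4548, 0.1001, 1.1270]

def matmul(A, B):
    """Matrix product A @ B for plain nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

XtKX = matmul(transpose(X), matmul(K, X))
XtMX = matmul(transpose(X), matmul(M, X))

# With this normalization, X^T M X should be the identity and
# X^T K X should be diag(lams), up to rounding in the printed data.
for i in range(4):
    for j in range(4):
        assert abs(XtMX[i][j] - (1.0 if i == j else 0.0)) < 0.02
        assert abs(XtKX[i][j] - (lams[i] if i == j else 0.0)) < 0.02
print("X^T M X ~ I and X^T K X ~ diag(lams): verified")
```

Note that this check works even though X itself cannot be "cancelled": the diagonality of both products is exactly the M-orthogonality of the eigenvectors described above.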