# Generalized eigenvector

In summary, the conversation discusses finding the generalized eigenvector matrix for a set of ODEs represented in matrix form. The matrix has an eigenvalue with algebraic multiplicity 3 and geometric multiplicity 2. The conversation also addresses a possible error in the journal manuscript, as well as the issue of choosing eigenvectors when there are degenerate eigenvalues. The final conclusion is that the sixth column of the eigenvector matrix is not a random vector, but one that allows for the calculation of a Jordan chain.

#### Alwar

TL;DR Summary
To find the generalized eigenvectors of a 6x6 matrix
Hi,

I have a set of ODEs represented in matrix format as shown in the attached file. The matrix A has an eigenvalue with algebraic multiplicity 3 and geometric multiplicity 2. I am trying to find the generalized eigenvector using the relation ##(A-\lambda I)w = v##, where ##w## is the generalized eigenvector and ##v## is the eigenvector found from ##(A-\lambda I)v = 0##. But I am not able to get the eigenvector matrix M shown in the attached file.

Any help would be appreciated.

Thanks,
Alwar

#### Attachments

• Problem file.pdf
141.6 KB
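For reference, the ##(A-\lambda I)w = v## recipe described above can be sketched with SymPy. The 6x6 matrix from the attachment is not reproduced in the thread, so the small matrix below is purely illustrative:

```python
import sympy as sp

# Illustrative 2x2 stand-in (not the 6x6 matrix from the attachment):
# eigenvalue 2 has algebraic multiplicity 2 but geometric multiplicity 1.
A = sp.Matrix([[2, 1],
               [0, 2]])
lam = 2
B = A - lam * sp.eye(2)

# Ordinary eigenvector: solve (A - lam*I) v = 0
v = B.nullspace()[0]

# Generalized eigenvector: solve (A - lam*I) w = v.
# B is singular, so take any particular solution of the consistent system.
w, params = B.gauss_jordan_solve(v)
w = w.subs({p: 0 for p in params})   # set the free parameters to 0
```

Stacking the chain as ##M = [v \;|\; w]## then gives ##M^{-1} A M## equal to a Jordan block, which is the same structure the attachment's ##M## is meant to produce.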
I'm a little confused, it says the eigenvalues are ##\pm k^2## but then the Jordan normal form only has ##\pm k## on the diagonal. Aren't the diagonal elements supposed to be the eigenvalues?

Would you be a little more specific about the problem you're having? You say you can't find the matrix of eigenvectors. Does that mean you are unable to calculate any particular eigenvector? If I understand your question, we can set aside the generalized eigenvectors for now, correct?

I'm a little confused, it says the eigenvalues are ##\pm k^2## but then the Jordan normal form only has ##\pm k## on the diagonal. Aren't the diagonal elements supposed to be the eigenvalues?
This is part of a manuscript published in a journal. I think they have stated the eigenvalues incorrectly as ##k^2##; when I calculated them, I got only ##k##. I believe the Jordan matrix is correct.

Thanks.

Would you be a little more specific about the problem you're having? You say you can't find the matrix of eigenvectors. Does that mean you are unable to calculate any particular eigenvector? If I understand your question, we can set aside the generalized eigenvectors for now, correct?
Thanks, Haborix. Specifically, I cannot find the first and sixth columns of the eigenvector matrix M. When I solve ##(A-\lambda I)X = 0## for each eigenvalue, both of these vectors satisfy the equation upon back substitution, but I cannot obtain them directly. Or have the authors taken an arbitrary vector?

Would you be a little more specific about the problem you're having? You say you can't find the matrix of eigenvectors. Does that mean you are unable to calculate any particular eigenvector? If I understand your question, we can set aside the generalized eigenvectors for now, correct?
For ##\lambda = -\sqrt{\alpha^2+\beta^2}##, solving the eigenvector equation in MATLAB returns the fourth column vector of M as the solution. However, if we do it manually, we get four equations as shown in the attached figure, and the first equation repeats three times, meaning there are infinitely many solutions. The sixth column vector satisfies the system. But is it just an arbitrary vector, or can it be obtained as a solution?

#### Attachments

• IMG_20210730_160242.jpg
42.2 KB
When you have degenerate eigenvalues, you usually pick three eigenvectors such that they are all mutually orthogonal. I think there are some typos in your original attachment in post #1. In a few instances, if a ##k## were a ##k^2##, then two of the eigenvectors would be orthogonal. The big-picture message is that there is freedom in choosing the eigenvectors for a given eigenvalue when its multiplicity is greater than 1.
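That freedom can be sketched in SymPy on a hypothetical symmetric 3x3 matrix (not the matrix from the thread): any independent mix of basis vectors of a degenerate eigenspace is an equally valid choice, and Gram-Schmidt recovers a mutually orthogonal one.

```python
import sympy as sp

# Hypothetical symmetric matrix: eigenvalue 2 is doubly degenerate.
A = sp.Matrix([[2, 0, 0],
               [0, 3, 1],
               [0, 1, 3]])
vecs = (A - 2 * sp.eye(3)).nullspace()   # a basis of the eigenspace

# Any independent combination of them is still a basis of eigenvectors...
mixed = [vecs[0], vecs[0] + vecs[1]]
# ...and Gram-Schmidt turns it into a mutually orthogonal (here orthonormal) one.
ortho = sp.GramSchmidt(mixed, orthonormal=True)
```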

But is it just a random vector? Or can it be attained as a solution.
I solved for the two eigenvectors ##\vec v_1, \vec v_2## for ##\lambda = -k## the usual way, and I found that the system ##(A-\lambda I)\vec x = \vec v_i## was inconsistent for both eigenvectors. But that was okay because any linear combination of the two is still an eigenvector, and there is one that results in a system with a solution, namely the sixth column of ##M##. So it's not a random vector, but the one that allows you to calculate a Jordan chain.

I found the vector by setting up an augmented matrix where the seventh column was a linear combination of the two eigenvectors and row-reduced until I ended up with a row of zeros in the left six columns. Then I solved for coefficients that caused the corresponding value in the seventh column to vanish.

By the way, I noticed a typo: the bottom diagonal element of ##J## should be ##-k##, not ##k##.
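The augmented-matrix procedure described above can be sketched in SymPy on a toy 3x3 matrix with the same multiplicity pattern (algebraic 3, geometric 2). Instead of row-reducing with a symbolic seventh column, this sketch uses the equivalent consistency condition: ##(A-\lambda I)x = c_1 v_1 + c_2 v_2## is solvable exactly when the right-hand side is orthogonal to the left null space of ##A-\lambda I##.

```python
import sympy as sp

# Toy 3x3 stand-in with the same structure as the thread's 6x6 matrix:
# eigenvalue 2 has algebraic multiplicity 3 but geometric multiplicity 2.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 2]])
lam = 2
B = A - lam * sp.eye(3)

v1, v2 = B.nullspace()                 # two independent eigenvectors
c1, c2 = sp.symbols('c1 c2')
combo = c1 * v1 + c2 * v2

# B x = combo is consistent iff combo is orthogonal to the left null space
# of B -- the same condition the symbolic row reduction produces.
conditions = [(y.T * combo)[0, 0] for y in B.T.nullspace()]
sol = sp.solve(conditions, [c1, c2], dict=True)[0]

# Pick c1 = 1, then solve B w = v for the generalized eigenvector.
v = combo.subs(sol).subs(c1, 1)
w, params = B.gauss_jordan_solve(v)
w = w.subs({p: 0 for p in params})     # any particular solution works
```

For this toy matrix the condition comes out as ##c_2 = 0##, i.e. only one direction in the eigenspace admits a Jordan chain, which mirrors why the sixth column of ##M## is the specific combination it is.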

I solved for the two eigenvectors ##\vec v_1, \vec v_2## for ##\lambda = -k## the usual way, and I found that the system ##(A-\lambda I)\vec x = \vec v_i## was inconsistent for both eigenvectors. But that was okay because any linear combination of the two is still an eigenvector, and there is one that results in a system with a solution, namely the sixth column of ##M##. So it's not a random vector, but the one that allows you to calculate a Jordan chain.

I found the vector by setting up an augmented matrix where the seventh column was a linear combination of the two eigenvectors and row-reduced until I ended up with a row of zeros in the left six columns. Then I solved for coefficients that caused the corresponding value in the seventh column to vanish.
Hi Vela,

Thanks for your effort. Could you upload the solution showing how you got the sixth eigenvector column? I hope I will be able to grasp what you have done.

Sorry, no. I did the calculations using Mathematica and didn't save the notebook.

• Delta2
Hi Vela,

Thanks for your effort. Can you upload the solution of how you got the sixth column of eigenvector. I hope I will be able to grasp what you have done.
You can probably find a worked solution on MathWorld.