Some questions about eigenvector computation

  • #1
swampwiz
NOTE: For the answers to all these questions, I'd like an explanation (or a reference to a book or internet page) of how the answer has been derived.

This question can be presumed to be for the general eigenproblem in which [ K ] & [ M ] are Hermitian matrices, with [ M ] also being positive definite, or in which [ K ] is a normal matrix and [ M ] is the identity matrix. My Question #0 is whether these conditions must be met for there to be a complete eigensolution, or whether they are narrower or broader than necessary. (I understand that there is finagling that can be done on [ K ] & [ M ] to make [ M ] positive definite, in which case the pair is called a positive definite pencil.)

[ K ] { x } = λ [ M ] { x }
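
For what it's worth, the positive definiteness requirement on [ M ] is easy to test numerically. A minimal sketch (numpy, made-up input), using the fact that a Cholesky factorization succeeds exactly when a Hermitian matrix is positive definite:

Code:
import numpy as np

def is_positive_definite(M):
    """Heuristic test: Cholesky succeeds iff a Hermitian M is positive definite."""
    if not np.allclose(M, M.conj().T):
        return False                      # not Hermitian, so the test does not apply
    try:
        np.linalg.cholesky(M)
        return True
    except np.linalg.LinAlgError:
        return False

print(is_positive_definite(np.diag([2.0, 1.0, 3.0])))   # True
print(is_positive_definite(np.diag([2.0, -1.0, 3.0])))  # False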

There is the characteristic matrix which is a function of λ

[ G( λ ) ] = [ K ] - λ[ M ]

and the eigenproblem matrix EQ

[ G( λ ) ] { φ } = { 0 }

from which the set of λ is solved by setting the determinant of [ G( λ ) ] to 0. So far so good.
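
For concreteness, here is a minimal numerical sketch of the above, with a made-up symmetric [ K ] and positive definite [ M ]; scipy's generalized eigensolver plays the role of solving det[ G( λ ) ] = 0:

Code:
import numpy as np
from scipy.linalg import eig

# Made-up 3x3 example: symmetric K, symmetric positive definite M.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
M = np.diag([2.0, 1.0, 3.0])

lams, vecs = eig(K, M)      # solves K x = lambda M x
lams = lams.real            # real for a Hermitian pencil with M positive definite

# Each eigenvalue should make G(lambda) = K - lambda M singular.
for lam in lams:
    print(f"lambda = {lam:+.6f}, det G(lambda) = {np.linalg.det(K - lam * M):.2e}")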

The next step is to solve for the eigenvectors corresponding to each eigenvalue (presume here that the eigenvalues are distinct - the question of what to do if there are repeated values is a question for another thread). Now I get that [ G( λ ) ] is rank deficient, since its determinant is 0 by construction. My Question #1 is whether the rank deficiency is always 1 (i.e., a single linear dependency among the rows), or whether it can be higher.
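
On Question #1, one quick numerical probe (same made-up [ K ] and [ M ] as in the sketch above) is to compute the rank of [ G( λ ) ] at each eigenvalue; for this example, with distinct eigenvalues, the nullity comes out as 1 each time:

Code:
import numpy as np
from scipy.linalg import eig

K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
M = np.diag([2.0, 1.0, 3.0])
lams = eig(K, M)[0].real

n = K.shape[0]
for lam in lams:
    rank = np.linalg.matrix_rank(K - lam * M)
    print(f"lambda = {lam:.6f}: rank = {rank}, nullity = {n - rank}")  # nullity 1 expected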

OK, so the eigenproblem matrix EQ is partitioned into a single boundary coordinate b and an internal section i for the rest of the coordinates

Gbb( λ ) φb + { Gbi( λ ) }T { φi } = 0

{ Gib( λ ) } φb + [ Gii( λ ) ] { φi } = { 0 }

The value of φb is then assigned some dummy value (typically 1), so that the latter partition EQ becomes

{ φi } = -[ Gii( λ ) ]^-1 { Gib( λ ) } φb

So obviously [ Gii( λ ) ] must be invertible, and thus b must be chosen so that it is. My Question #2 is whether it is guaranteed that there will always be some b such that the resulting [ Gii( λ ) ] is invertible. My Question #3 is: if [ Gii( λ ) ] turns out not to be invertible for some choice of b, does that imply that the value of that coordinate's element in { φ } will eventually be calculated to be 0 - and if so, does the converse implication hold as well?
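
A rough sketch of this partition procedure, with the same made-up [ K ] and [ M ] as above and the last coordinate chosen as b - an arbitrary choice that assumes the resulting [ Gii( λ ) ] is invertible:

Code:
import numpy as np
from scipy.linalg import eig

K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
M = np.diag([2.0, 1.0, 3.0])
lam = eig(K, M)[0].real[0]        # take one eigenvalue

G = K - lam * M
b = 2                              # choose the last coordinate as b (arbitrary)
i = [0, 1]                         # the remaining "internal" coordinates

G_ii = G[np.ix_(i, i)]             # internal-internal block
G_ib = G[np.ix_(i, [b])].ravel()   # internal-boundary column

phi = np.empty(3)
phi[b] = 1.0                                    # dummy value for phi_b
phi[i] = -np.linalg.solve(G_ii, G_ib) * phi[b]  # phi_i = -Gii^-1 Gib phi_b

print("residual |G(lambda) phi| =", np.linalg.norm(G @ phi))  # ~ 0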

And as for Gbb( λ ), there doesn't seem to be a condition that it not be 0, since nothing is being solved for in the b partition - although it sure seems like there should be one. My Question #4 is whether there is such a condition; if so, whether it is somehow always met; and if not, whether that means that coordinate cannot be chosen for b.

Thanks
  • #3
Well, I have been reading up, and I am getting a better idea about the theory which would answer my questions, but at present, I still am confused.
 
  • #4
Hey swampwiz.

Do you understand how a zero determinant leads to dependence between vectors (within a system) in some way?

That is the most crucial aspect of getting to the characteristic polynomial; from there onwards, it is a matter of using each root to satisfy this property.
 
  • #5
chiro said:
Hey swampwiz.

Do you understand how a zero determinant leads to dependence between vectors (within a system) in some way?

That is the most crucial aspect of getting to the characteristic polynomial; from there onwards, it is a matter of using each root to satisfy this property.

Yes, I understand that the homogeneous matrix EQ always has the trivial solution (all 0's), but that non-trivial solutions are possible only when the determinant of the coefficient matrix is 0. I guess where I am confused is how to determine how many linear dependency constraints such a zero-determinant matrix has, since more than one dependency yields the same zero determinant. I have a hunch that for the eigenproblem there is always one linear dependency, so that the eigenvector components are all fixed relative to one another, and that each repeat of an eigenvalue results in another dependency. Is there any way, short of doing a full eigendecomposition, to determine how many linear dependencies there are? Also, is the number of linear dependency constraints always equal to the difference between the matrix size and its rank?
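
A quick numerical check of that last point: numpy's matrix_rank gives the rank directly, and n - rank counts the independent dependencies. A minimal sketch with a made-up symmetric matrix that has a repeated eigenvalue:

Code:
import numpy as np

# Made-up symmetric matrix with eigenvalue 2 repeated twice.
A = np.diag([2.0, 2.0, 5.0])
lam = 2.0

G = A - lam * np.eye(3)
n = G.shape[0]
rank = np.linalg.matrix_rank(G)

# nullity = n - rank = number of independent linear dependencies
# = dimension of the eigenspace belonging to this eigenvalue.
print("rank =", rank, " nullity =", n - rank)   # rank = 1, nullity = 2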
 
  • #6
Yes - that is the point.

Think about the situation where you either have a zero vector or a non-zero vector.

That will help you resolve the problem.
 

1. What is an eigenvector and why is it important in computation?

An eigenvector is a nonzero vector that does not change direction when multiplied by a given matrix; it is only scaled. It is important in computation because it helps in understanding the behavior of a linear transformation and can simplify complex calculations.

2. How do I compute eigenvectors?

To compute eigenvectors, you first need to find the eigenvalues of the matrix. Then, for each eigenvalue λ, you can find a corresponding eigenvector by solving the homogeneous system of linear equations ( A - λ I ) v = 0.
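
For instance, a minimal sketch with numpy (made-up 2x2 matrix; numpy performs both steps at once):

Code:
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lams, vecs = np.linalg.eig(A)   # columns of vecs are the eigenvectors

# Verify A v = lambda v for each eigenpair.
for lam, v in zip(lams, vecs.T):
    print(f"lambda = {lam:.3f}, A v == lambda v: {np.allclose(A @ v, lam * v)}")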

3. Can eigenvectors be complex numbers?

Yes, eigenvectors can be complex numbers. In fact, complex eigenvectors are often used in quantum mechanics and other fields of physics.

4. What is the significance of the magnitude and direction of an eigenvector?

The magnitude of an eigenvector is not fixed by the definition: eigenvectors are only determined up to a scalar multiple, so in practice they are usually normalized to unit length. The scale factor by which the vector is stretched or shrunk when multiplied by the matrix is the eigenvalue, not the eigenvector's magnitude. The direction of an eigenvector is what is significant: it is the direction left unchanged (or exactly reversed, for a negative eigenvalue) by the transformation.

5. Are there any real-world applications of eigenvectors?

Yes, eigenvectors have numerous real-world applications, including image and signal processing, finance, and data analysis. They are also used in machine learning algorithms such as principal component analysis (PCA); a toy PCA example is sketched below.
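
A toy sketch of that PCA connection (random made-up data; here PCA is just an eigendecomposition of the sample covariance matrix):

Code:
import numpy as np

# Random made-up data: 200 samples, 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X = X - X.mean(axis=0)                 # center the data

C = X.T @ X / (len(X) - 1)             # sample covariance matrix (symmetric)
lams, vecs = np.linalg.eigh(C)         # eigh: for symmetric/Hermitian matrices

# Principal components are the eigenvectors with the largest eigenvalues.
order = np.argsort(lams)[::-1]
print("explained variances:", lams[order])
print("first principal direction:", vecs[:, order[0]])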
