How do I numerically find eigenvectors for given eigenvalues?

In summary, the thread discusses finding eigenvalues and eigenvectors of a square matrix numerically. The QR algorithm yields the eigenvalues, and the matrix of eigenvectors emerges as the product of the orthogonal matrices from the QR decomposition at each iteration. For a known eigenvalue, the corresponding eigenvector can be found by solving the eigenvalue equation directly or by inverse iteration; Lanczos iteration, singular value decomposition, and library routines for common programming languages and math scripts are also mentioned as options.
  • #1
hkBattousai
My aim was to numerically calculate eigenvalues and eigenvectors for a square matrix A.

I managed to find the eigenvalues by using the QR algorithm, so I can now find all of the eigenvalues of any given square matrix. But for the next step, how do I find the corresponding eigenvectors? Is there a numerical method that calculates the eigenvectors for given eigenvalues?

Please guide me.
 
  • #2
You write out the eigenvalue equation and find the vector that satisfies it for each value.

If [itex]Av_i = a_iv_i[/itex] for the ith eigenvector, then solve [itex](A-Ia_i)v_i = 0[/itex] normally.
 
  • #3
Simon Bridge said:
You write out the eigenvalue equation and find the vector that satisfies it for each value.

If [itex]Av_i = a_iv_i[/itex] for the ith eigenvector, then solve [itex](A-Ia_i)v_i = 0[/itex] normally.

Your idea was very useful, but I found an alternative solution (page 18).

The algorithm:
[itex]S_0=Q_0[/itex]
[itex]Factorize: \, A_n=Q_nR_n[/itex]
[itex]A_{n+1}=R_nQ_n[/itex]
[itex]S_{n+1}=S_nQ_n[/itex]

As the algorithm converges, [itex]A_n[/itex] approaches an upper triangular matrix (diagonal when A is symmetric), whose diagonal elements give the eigenvalues. And the column vectors of [itex]S_n[/itex] give the corresponding eigenvectors.

This is an implementation example:
Code:
template <class T>
void Matrix<T>::GetEigens(std::vector<T> & EigenValues, Matrix<T> & EigenVectors) throw(MatrixNotSquare)
{
	// Initializations
	Matrix<T> A = *this, Q, R;
	if (!A.IsSquare()) throw(MatrixNotSquare(L"The matrix must be a square."));
	EigenValues.clear();
	EigenVectors = Matrix<T>(m_unRowSize, m_unColSize);

	// Find eigenvalues and eigenvectors
	A.QRDecomposition(Q, R);
	A = R * Q;
	EigenVectors = Q;
	for (uint64_t i=0; i<ITERATIONS; i++)
	{
		A.QRDecomposition(Q, R);
		A = R * Q;
		EigenVectors *= Q;
		if (A.IsDiagonal()) break;
	}
	for (uint64_t i=0; i<A.GetRowSize(); i++)
	{
		EigenValues.push_back(A(i, i));
	}
}

This code is running successfully and giving correct results. A sample output is attached.
 

Attachments

  • Code Output.png (1.6 KB)
  • #4
Yep - the algorithm does what I described.
When you asked for a numerical method, you didn't specify your constraints.

Most programming math libraries have a defined function to find the eigenvalues and eigenvectors of a matrix.

For gnu-octave, there is a built-in function:

Code:
Loadable Function: [V, LAMBDA] = eig (A)
     The eigenvalues (and eigenvectors) of a matrix are computed in a
     several step process which begins with a Hessenberg decomposition,
     followed by a Schur decomposition, from which the eigenvalues are
     apparent.  The eigenvectors, when desired, are computed by further
     manipulations of the Schur decomposition.

     The eigenvalues returned by `eig' are not ordered.

I used to use this to solve the Schrödinger equation in 1D.
 
  • #5
Another approach is to use singular value decomposition. It's a beast to program, but because it is so very handy (so very, very, very handy), someone has inevitably done it for you already. Pick a language / tool and you will almost certainly find an SVD implementation -- even if you use a language such as Visual Basic or Cobol that is hardly ever used for scientific programming.
 
  • #6
D H said:
Another approach is to use singular value decomposition.

In practice that may be a slightly recursive answer, because a popular way to calculate the singular values and vectors is actually to use QR (adapted to solve that specific problem, of course).

(But D.H. is right that the easy answer to most "how to" questions in numerical linear algebra starts "First find the SVD.")

If you know an eigenvalue, the simplest way to find the vector is to use inverse iteration (a.k.a. the inverse power method with shifts), which will converge in essentially one iteration because you already know exactly what shift to use.

On the other hand, for finding a few eigenpairs of a large matrix the most popular methods iterate to find the eigenvectors, and the eigenvalues can then be found from the Rayleigh quotient [itex]x^T A x / x^T x[/itex]. But a "goto" method like Lanczos iteration is also a beast to program so that it works reliably in practice, even though the math looks deceptively simple.
 
  • #7
e.g. http://www.mathworks.com/help/techdoc/ref/svd-singular-value-decomposition.html .

There are libs for major programming languages and math scripts which provide all these methods.

In the QR method, the eigenvectors are obtained as the product of the orthogonal transformations from each iteration, which is what the Olver paper (the example code in post #3) does.
 

1. How do I determine the eigenvalues for a given matrix?

To find the eigenvalues of a matrix, solve the characteristic equation det(A − λI) = 0, where I is the identity matrix and λ is a scalar. The roots of this polynomial are the eigenvalues. (Numerically, for anything beyond small matrices, iterative methods such as the QR algorithm are preferred over root-finding on the characteristic polynomial.)

2. How can I numerically find eigenvectors for a given eigenvalue?

To find the eigenvector corresponding to a given eigenvalue λ, use inverse (shifted) power iteration: repeatedly solve (A − σI)y = x with the shift σ close to λ and normalize; the iterates converge quickly to the eigenvector. Plain power iteration, which repeatedly multiplies a starting vector by the matrix, instead converges to the eigenvector of the largest-magnitude eigenvalue, and unshifted inverse iteration to that of the smallest-magnitude eigenvalue.

3. Can I use software to find eigenvectors for a given eigenvalue?

Yes, there are many software programs that can numerically find eigenvectors for a given eigenvalue, such as MATLAB, Mathematica, and Python's numpy library. These programs use advanced algorithms and can handle larger matrices with more accuracy.

4. Is it possible to have multiple eigenvectors for the same eigenvalue?

Yes, it is possible to have multiple eigenvectors for the same eigenvalue. Any non-zero scalar multiple of an eigenvector is again an eigenvector, so each eigenvalue has infinitely many eigenvectors. More substantively, an eigenvalue can have several linearly independent eigenvectors when its eigenspace has dimension greater than one (geometric multiplicity greater than 1), as happens with repeated eigenvalues of a symmetric matrix.

5. How do I know if I have found all the eigenvectors for a given eigenvalue?

The number of linearly independent eigenvectors for a given eigenvalue equals the dimension of the null space of A − λI (the geometric multiplicity), so find a basis of that null space; the Gram-Schmidt process can then be used to orthogonalize the basis vectors. If the matrix is diagonalizable, the eigenvectors over all eigenvalues together number the dimension of the matrix; if it is not diagonalizable, there are fewer.
