Theory for finding the eigenvalues of a matrix

Summary:
Finding eigenvalues of a matrix is essential before determining its eigenvectors, as the latter depend on the former. There is no universal formula for eigenvectors; they are defined by the equation Mv = λv, where λ represents the eigenvalue. To find eigenvectors, one must first compute the eigenvalues from the characteristic polynomial. While similarity transformations can help reduce a matrix to diagonal form, not all matrices are diagonalizable, and some may only be reducible to Jordan normal form. The QR algorithm can also be used to find eigenvalues, but it does not guarantee that the resulting columns are eigenvectors for non-diagonalizable matrices.
Jhenrique
There is so much theory for finding the eigenvalues of a matrix (invariants, characteristic polynomials, algebraic formulas with the trace and determinant...), but is there no formula for finding the eigenvectors of a matrix? I have never seen one! Please, if such a formula exists, show it to me or tell me where I can study it.
 
Jhenrique said:
There is so much theory for finding the eigenvalues of a matrix (invariants, characteristic polynomials, algebraic formulas with the trace and determinant...), but is there no formula for finding the eigenvectors of a matrix? I have never seen one! Please, if such a formula exists, show it to me or tell me where I can study it.

You need to find eigenvalues before you can find eigenvectors. Two matrices with the same eigenvalues can have entirely different eigenvectors, so there is no general formula for finding eigenvectors beyond the definition: ##v \neq 0## is an eigenvector of ##M## if and only if there exists a scalar ##\lambda## such that
$$
Mv = \lambda v.
$$
Having found the eigenvalues as the roots of the characteristic polynomial
$$
\chi_M(z) = \det (M - zI),
$$
you can then find the corresponding eigenvector(s) by using the definition above.
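
As a concrete illustration of this two-step procedure, here is a minimal sketch in Python, assuming NumPy and SciPy are available (the matrix ##M## below is just an example):

```python
# Sketch of the procedure above: get the eigenvalues as roots of the
# characteristic polynomial, then recover each eigenvector as a null
# vector of M - lambda*I.
import numpy as np
from scipy.linalg import null_space

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # example matrix

coeffs = np.poly(M)            # coefficients of the characteristic polynomial of M
eigenvalues = np.roots(coeffs)  # its roots are the eigenvalues

for lam in eigenvalues:
    # An eigenvector is any nonzero solution of (M - lam*I) v = 0,
    # i.e. a basis vector of the null space of M - lam*I.
    v = null_space(M - lam * np.eye(M.shape[0]))
    print(f"lambda = {lam:.4f}, eigenvector(s):\n{v}")
```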
 
If you use similarity transformations of an ##n\times n## matrix ##A## to reduce it to diagonal form, i.e., repeat
$$
\begin{align}
A_{1} &= P_1^{-1} A P_1 \\
A_{2} &= P_2^{-1} P_1^{-1} A P_1 P_2 \\
\vdots \\
A_{k} &= P_k^{-1} \cdots P_1^{-1} A P_1 \cdots P_k
\end{align}
$$
until ##A_{k}## is diagonal, then
$$
A_{k} = \mathrm{diag}( \lambda_1, \lambda_2, \ldots, \lambda_n)
$$
where the ##\lambda_i## are the eigenvalues of ##A##, and the columns of the matrix
$$
X = P_1 P_2 \cdots P_k
$$
are the corresponding eigenvectors.

The QR algorithm is such a method.
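
For reference, a bare-bones sketch of the (unshifted) QR iteration in Python, assuming NumPy; real implementations add shifts and deflation, which this toy loop omits. Each step ##A_{k+1} = Q_k^{-1} A_k Q_k## is a similarity transformation, and the accumulated product of the ##Q_k## plays the role of ##X## above:

```python
import numpy as np

def qr_iteration(A, iterations=200):
    """Unshifted QR iteration: repeatedly factor A_k = Q R and set
    A_{k+1} = R Q = Q^T A_k Q, a similarity transformation."""
    Ak = np.array(A, dtype=float)
    X = np.eye(Ak.shape[0])
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q      # similar to the previous Ak
        X = X @ Q       # accumulate the transformations
    return Ak, X

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # example matrix with eigenvalues 5 and 2
Ak, X = qr_iteration(A)
print(np.diag(Ak))  # approximate eigenvalues of A
```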
 
DrClaude said:
If you use similarity transformations of an ##n\times n## matrix ##A## to reduce it to diagonal form

Not all matrices are diagonalizable, even over ##\mathbb{C}##! For example
$$
\begin{pmatrix}
1 & 1 \\ 0 & 1
\end{pmatrix}
$$

The best you can do is reduce a matrix to its Jordan normal form, which the above matrix is already in.

EDIT: QR will converge to the Schur form of ##A##, which is upper triangular so the eigenvalues will appear on the diagonal, but is it necessarily the case that the columns of ##P_1 \cdots P_k## are generalized eigenvectors of ##A##?
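
A quick numerical check of this, assuming SciPy is available: `scipy.linalg.schur` returns an upper-triangular ##T = Z^{-1} A Z## with the eigenvalues on the diagonal, but for the defective matrix above the columns of ##Z## cannot all be eigenvectors, since ##A## has only a one-dimensional eigenspace:

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # the non-diagonalizable matrix above

T, Z = schur(A)
print(T)            # upper triangular; eigenvalue 1 (twice) on the diagonal
print(A @ Z[:, 1])  # not a multiple of Z[:, 1]: the second Schur
                    # vector is not an eigenvector of A
```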
 
pasmith said:
Not all matrices are diagonalizable
I never said they were.
 
