# Theory for finding the eigenvalues of a matrix

• Jhenrique
In summary: The QR algorithm is one method for finding eigenvalues and eigenvectors, and it works for most matrices. A matrix that is not diagonalizable can still be reduced to the closest possible form, the Jordan normal form; in that case the columns of the product $P_1 \cdots P_k$ are generalized eigenvectors.
Jhenrique
There is so much theory for finding the eigenvalues of a matrix (invariants, characteristic polynomials, algebraic formulas with the trace and determinant...), but is there really no formula for finding the eigenvectors of a matrix? I have never seen one! Please, if such a formula exists, show it to me or tell me where I can study this.

Jhenrique said:
There is so much theory for finding the eigenvalues of a matrix (invariants, characteristic polynomials, algebraic formulas with the trace and determinant...), but is there really no formula for finding the eigenvectors of a matrix? I have never seen one! Please, if such a formula exists, show it to me or tell me where I can study this.

You need to find eigenvalues before you can find eigenvectors. Two matrices with the same eigenvalues can have entirely different eigenvectors, so there is no general formula for finding eigenvectors beyond the definition: $v \neq 0$ is an eigenvector of $M$ if and only if there exists a scalar $\lambda$ such that
$$Mv = \lambda v.$$
Having found the eigenvalues as the roots of the characteristic polynomial
$$\chi_M(z) = \det (M - zI)$$
you can then find the corresponding eigenvector(s) by using the definition above.
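As a concrete sketch of that two-step procedure (the matrix here is an illustrative example, not from the thread): once an eigenvalue $\lambda$ is known, an eigenvector spans the null space of $M - \lambda I$, which can be read off from the SVD of that rank-deficient matrix.

```python
import numpy as np

# Illustrative 2x2 example: find eigenvectors from the definition.
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues are the roots of det(M - zI) = z^2 - 7z + 10 = (z - 5)(z - 2).
for lam in (5.0, 2.0):
    # An eigenvector spans the null space of (M - lam*I); the last
    # right-singular vector of this rank-deficient matrix spans it.
    _, s, vh = np.linalg.svd(M - lam * np.eye(2))
    v = vh[-1]
    # By construction, M v = lam v up to rounding error.
    print(lam, np.allclose(M @ v, lam * v))
```

Any nonzero scalar multiple of `v` is equally valid: eigenvectors are only determined up to scaling.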

If you use similarity transformations of an ##n\times n## matrix ##A## to reduce it to diagonal form, i.e., repeat
\begin{align} A_{1} &= P_1^{-1} A P_1 \\ A_{2} &= P_2^{-1} P_1^{-1} A P_1 P_2 \\ \vdots \\ A_{k} &= P_k^{-1} \cdots P_1^{-1} A P_1 \cdots P_k \end{align}
until ##A_{k}## is diagonal, then
$$A_{k} = \mathrm{diag}( \lambda_1, \lambda_2, \ldots, \lambda_n)$$
with ##\lambda_i## the eigenvalues of ##A## and the columns of the matrix
$$X = P_1 P_2 \cdots P_k$$
are the corresponding eigenvectors.

The QR algorithm is such a method.
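A minimal sketch of the unshifted QR iteration, assuming a symmetric example matrix (illustrative, not from the thread) so the iterates converge to a diagonal matrix. Each step is a similarity transform, and the accumulated orthogonal factors play the role of ##P_1 \cdots P_k## above:

```python
import numpy as np

# Symmetric example matrix with eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

Ak = A.copy()
X = np.eye(2)               # accumulates P_1 P_2 ... P_k
for _ in range(50):
    Q, R = np.linalg.qr(Ak)
    Ak = R @ Q              # similarity transform: Q^T Ak Q
    X = X @ Q

# The diagonal of Ak approximates the eigenvalues, and the
# columns of X approximate the corresponding eigenvectors.
print(np.round(np.diag(Ak), 6))
```

The unshifted iteration converges slowly in general; production implementations add shifts and a preliminary reduction to Hessenberg form.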

DrClaude said:
If you use similarity transformations of an ##n\times n## matrix ##A## to reduce it a diagonal form

Not all matrices are diagonalizable, even over $\mathbb{C}$! For example
$$\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$$

The best you can do is reduce a matrix to its Jordan normal form, which the above matrix already is.
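One way to check this concretely (a small numpy sketch): the eigenvalue 1 of the matrix above has algebraic multiplicity 2, but its eigenspace is only one-dimensional, so no basis of eigenvectors exists.

```python
import numpy as np

# The Jordan block from the post above.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The eigenspace for eigenvalue 1 is the null space of (J - I).
rank = np.linalg.matrix_rank(J - np.eye(2))
geometric_multiplicity = 2 - rank
print(geometric_multiplicity)  # 1, so J cannot be diagonalized
```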

EDIT: QR will converge to the Schur form of $A$, which is upper triangular, so the eigenvalues will appear on the diagonal; but is it necessarily the case that the columns of $P_1 \cdots P_k$ are generalized eigenvectors of $A$?

pasmith said:
Not all matrices are diagonalizable
I never said they were.

## 1. What is an eigenvector and eigenvalue?

An eigenvector of a matrix is a nonzero vector whose direction is unchanged (at most reversed) when multiplied by the matrix. The corresponding eigenvalue is the factor by which that eigenvector is scaled.

## 2. Why is finding eigenvectors and eigenvalues important?

Finding eigenvectors and eigenvalues is important in many applications, such as solving systems of differential equations, analyzing data, and understanding the behavior of systems in physics and engineering.

## 3. How do you find the eigenvectors and eigenvalues of a matrix?

To find the eigenvalues of a matrix $A$, solve the characteristic equation $\det(A - \lambda I) = 0$ for the roots $\lambda$. Then, for each eigenvalue, solve the linear system $(A - \lambda I)v = 0$ to obtain the corresponding eigenvectors $v$.
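In practice, numerical libraries bundle both steps. A short sketch using numpy (the matrix is an illustrative example): `np.linalg.eig` returns the eigenvalues together with a matrix whose columns are the matching eigenvectors.

```python
import numpy as np

# Illustrative matrix with eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each pair satisfies the defining equation A v = lam v.
    assert np.allclose(A @ v, lam * v)
```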

## 4. What is the relationship between eigenvectors and eigenvalues?

The eigenvectors and eigenvalues of a matrix are closely related: each eigenvalue is the factor by which its eigenvector is scaled when multiplied by the matrix. Eigenvectors associated with distinct eigenvalues are always linearly independent; for symmetric (more generally, normal) matrices they are in fact orthogonal.

## 5. Can a matrix have more than one set of eigenvectors and eigenvalues?

Yes. An $n \times n$ matrix can have up to $n$ distinct eigenvalues, each with its own eigenspace of eigenvectors, and any nonzero scalar multiple of an eigenvector is again an eigenvector with the same eigenvalue.
