- #1

Jhenrique

In summary: The QR algorithm is one method for finding eigenvalues and eigenvectors, and it works for most matrices. For matrices that are not diagonalizable, it instead converges to the Schur form (upper triangular, with the eigenvalues on the diagonal); whether the columns of ##P_1 \cdots P_k## are then generalized eigenvectors is debated below.


- #2

pasmith


Jhenrique said:

You need to find eigenvalues before you can find eigenvectors. Two matrices with the same eigenvalues can have entirely different eigenvectors, so there is no general formula for finding eigenvectors beyond the definition: [itex]v \neq 0[/itex] is an eigenvector of [itex]M[/itex] if and only if there exists a scalar [itex]\lambda[/itex] such that

[tex]Mv = \lambda v.[/tex]

Having found the eigenvalues as the roots of the characteristic polynomial

[tex]\chi_M(z) = \det (M - zI)[/tex]

you can then find the corresponding eigenvector(s) by using the definition above.
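As a rough illustration of this recipe (a NumPy sketch with a hypothetical 2×2 matrix, not how production libraries do it, and numerically fragile for larger matrices): take the eigenvalues as the roots of the characteristic polynomial, then take each eigenvector as a null vector of ##M - \lambda I##:

```python
import numpy as np

# Hypothetical example matrix; any square matrix is handled the same way.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Step 1: eigenvalues are the roots of the characteristic polynomial
# chi_M(z) = det(M - z I); np.poly(M) returns its coefficients.
eigenvalues = np.roots(np.poly(M))

# Step 2: for each root lambda, an eigenvector is a nonzero solution of
# (M - lambda I) v = 0, read off here as the right singular vector
# belonging to the smallest singular value of (M - lambda I).
eigenvectors = []
for lam in eigenvalues:
    _, _, vh = np.linalg.svd(M - lam * np.eye(len(M)))
    eigenvectors.append(vh[-1])

for lam, v in zip(eigenvalues, eigenvectors):
    print(lam, np.allclose(M @ v, lam * v))
```

For well-separated roots this reproduces the definition directly; in practice, extracting eigenvalues through polynomial coefficients is ill-conditioned, which is one reason iterative methods like QR are preferred.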

- #3

DrClaude

If you use similarity transformations of an ##n\times n## matrix ##A## to reduce it to a diagonal form,

$$
\begin{align}
A_{1} &= P_1^{-1} A P_1 \\
A_{2} &= P_2^{-1} P_1^{-1} A P_1 P_2 \\
&\vdots \\
A_{k} &= P_k^{-1} \cdots P_1^{-1} A P_1 \cdots P_k
\end{align}
$$

until ##A_{k}## is diagonal, then
$$
A_{k} = \mathrm{diag}( \lambda_1, \lambda_2, \ldots, \lambda_n)
$$
with ##\lambda_i## the eigenvalues of ##A##; the columns of the matrix
$$
X = P_1 P_2 \cdots P_k
$$
are then the corresponding eigenvectors.

The QR algorithm is such a method.
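A minimal NumPy sketch of the basic (unshifted) QR iteration, using a hypothetical symmetric example matrix so that ##A_k## really does converge to a diagonal matrix; practical implementations add shifts and a Hessenberg reduction first:

```python
import numpy as np

def qr_iteration(A, iters=200):
    """Basic unshifted QR iteration: factor A_k = Q_k R_k, set A_{k+1} = R_k Q_k.
    Each step is the similarity transform A_{k+1} = Q_k^{-1} A_k Q_k, so the
    eigenvalues are preserved throughout."""
    Ak = np.array(A, dtype=float)
    X = np.eye(len(Ak))          # accumulates the product P_1 P_2 ... P_k
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
        X = X @ Q
    return Ak, X

# Symmetric (hence diagonalizable) example, so A_k converges to a diagonal
# matrix and the columns of X are orthonormal eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
Ak, X = qr_iteration(A)
print(np.round(Ak, 6))                                  # approximately diag(3, 1)
print(np.allclose(A @ X, X @ np.diag(np.diag(Ak))))     # eigenpair check
```

Here each ##P_i## is the orthogonal factor ##Q_i## of one QR factorization, matching the similarity-transformation scheme above.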

- #4

pasmith


DrClaude said: If you use similarity transformations of an ##n\times n## matrix ##A## to reduce it to a diagonal form

Not all matrices are diagonalizable, even over [itex]\mathbb{C}[/itex]! For example

[tex]
\begin{pmatrix}
1 & 1 \\ 0 & 1
\end{pmatrix}
[/tex]

The best you can do is reduce a matrix to its Jordan normal form, which the above matrix is.
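One way to check this numerically (a NumPy sketch): the eigenspace of the matrix above for its only eigenvalue ##\lambda = 1## is one-dimensional, so there is no basis of eigenvectors:

```python
import numpy as np

# The Jordan block above: its only eigenvalue is 1, with algebraic multiplicity 2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The eigenspace for lambda = 1 is the null space of (A - I).
# rank(A - I) = 1, so that null space is 2 - 1 = 1 dimensional:
# only one independent eigenvector, hence A cannot be diagonalized.
rank = np.linalg.matrix_rank(A - np.eye(2))
print(rank)       # 1
print(2 - rank)   # geometric multiplicity: 1, strictly less than 2
```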

EDIT: QR will converge to the Schur form of [itex]A[/itex], which is upper triangular so the eigenvalues will appear on the diagonal, but is it necessarily the case that the columns of [itex]P_1 \dots P_k[/itex] are generalized eigenvectors of [itex]A[/itex]?


- #5

DrClaude


pasmith said: Not all matrices are diagonalizable

I never said they were.

An eigenvector is a vector whose direction is unchanged (it is only scaled, possibly flipped) when multiplied by a matrix. An eigenvalue is the number by which the eigenvector is scaled under that multiplication.

Finding eigenvectors and eigenvalues is important in many applications, such as solving systems of differential equations, analyzing data, and understanding the behavior of systems in physics and engineering.

To find the eigenvalues of a matrix ##M##, you can use the characteristic equation ##\det(M - \lambda I) = 0## and solve for its roots ##\lambda##; the eigenvectors for each eigenvalue are then the nonzero solutions of ##(M - \lambda I)v = 0##.

The eigenvectors and eigenvalues of a matrix are closely related: each eigenvalue is the scaling factor applied to its eigenvectors under multiplication by the matrix. Eigenvectors associated with distinct eigenvalues are always linearly independent; for symmetric (more generally, normal) matrices they are also orthogonal to each other.
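A quick NumPy check of the symmetric case (the example matrices here are hypothetical):

```python
import numpy as np

# Symmetric matrix: eigh returns its eigenvectors as the columns of an
# orthogonal matrix, so eigenvectors for distinct eigenvalues are
# mutually orthogonal.
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
vals, vecs = np.linalg.eigh(S)
print(np.allclose(vecs.T @ vecs, np.eye(3)))   # True: orthonormal columns

# Contrast: a non-symmetric matrix generally has non-orthogonal eigenvectors.
N = np.array([[1.0, 2.0],
              [0.0, 3.0]])
_, w = np.linalg.eig(N)
print(np.allclose(w.T @ w, np.eye(2)))         # False here
```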

Yes, an ##n \times n## matrix can have up to ##n## distinct eigenvalues, each with its own eigenspace. Moreover, any nonzero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue, and when an eigenspace has dimension greater than one there are many possible choices of basis for it.
