- #1


How can I prove that every symmetric real matrix is diagonalizable?

Thanks in advance



- Thread starter complexhuman


- #2

HallsofIvy

Science Advisor

Homework Helper


That's a MAJOR theorem, so at best I can just outline the proof here.

The more general theorem is that every self-adjoint linear transformation is diagonalizable, and symmetric (real) matrices, thought of as linear transformations on R^{n}, are self-adjoint. A self-adjoint linear transformation is one such that

<Ax, y> = <x, Ay>, where <x, y> is the inner product on the vector space.
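Not part of the original argument, but here is a quick NumPy sanity check of that self-adjointness identity for the standard inner product on R^n (the matrix and vectors are arbitrary examples, not anything from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A + A.T                      # symmetrize: A is now a real symmetric matrix
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# For a symmetric A, the standard inner product satisfies <Ax, y> = <x, Ay>.
print(np.allclose((A @ x) @ y, x @ (A @ y)))  # True
```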

First show that the eigenvalues of a self-adjoint linear transformation must be real:

If λ is an eigenvalue of A, take x to be a unit-length eigenvector. Then λ = λ<x, x> = <λx, x> = <Ax, x> = <x, Ax> = <x, λx> = complex-conjugate(λ)<x, x> = complex-conjugate(λ). Since λ equals its complex conjugate, it is real.
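You can see this step numerically (again, just an illustrative example matrix, not from the thread): even a general-purpose eigenvalue routine that makes no symmetry assumption finds no imaginary part when the input is symmetric.

```python
import numpy as np

# Any real symmetric matrix will do as an example.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigvals makes no symmetry assumption, so complex eigenvalues would
# show up here; for symmetric A the imaginary parts vanish.
ev = np.asarray(np.linalg.eigvals(A), dtype=complex)
print(np.allclose(ev.imag, 0.0))  # True
```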

Second, show that eigenvectors corresponding to distinct eigenvalues are orthogonal: if Ax = λx and Ay = μy, then λ<x, y> = <λx, y> = <Ax, y> = <x, Ay> = <x, μy> = μ<x, y> (since μ is real). Then (λ - μ)<x, y> = 0. Since λ and μ are distinct, we must have <x, y> = 0.
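A small check of the orthogonality claim (the matrix is an arbitrary symmetric example): NumPy's symmetric eigensolver returns eigenvectors as columns, and for a symmetric matrix they come out mutually orthogonal.

```python
import numpy as np

# Arbitrary symmetric example matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is NumPy's solver for symmetric/Hermitian matrices; eigenvectors
# are the columns of v.
w, v = np.linalg.eigh(A)

# Orthonormal eigenvectors mean v^T v is the identity.
print(np.allclose(v.T @ v, np.eye(3)))  # True
```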

So: given any self-adjoint linear transformation (or symmetric matrix), we can write the vector space as a direct sum of orthogonal eigenspaces (the key step is that the orthogonal complement of an eigenspace is invariant under A, so the argument can be repeated on it). Choose any orthonormal basis for each subspace and show that the matrix for A, restricted to that subspace, is simply λI, a diagonal matrix with only λ on the diagonal.

Finally, the union of the orthonormal bases for each subspace will be an orthonormal basis for the entire space, and the matrix corresponding to A in that basis is diagonal.
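The whole conclusion can be verified numerically in one line (the matrix below is a hypothetical example, chosen with a repeated eigenvalue to exercise the subspace argument): with Q's columns an orthonormal eigenbasis, Q^T A Q is diagonal.

```python
import numpy as np

# Example symmetric matrix with a repeated eigenvalue (eigenvalues 4, 1, 1).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

w, Q = np.linalg.eigh(A)   # columns of Q form an orthonormal eigenbasis

# Changing to that basis diagonalizes A: Q^T A Q equals diag(w).
print(np.allclose(Q.T @ A @ Q, np.diag(w)))  # True
```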


Last edited by a moderator:
