Showing it is orthogonally diagonalizable

SUMMARY

The discussion centers on proving that the product of two orthogonally diagonalizable matrices, A and B, is also orthogonally diagonalizable given that they commute (AB=BA). It utilizes the Real Spectral Theorem, which states that a real matrix is orthogonally diagonalizable if and only if it is symmetric. The argument hinges on the fact that commuting matrices share eigenvectors, allowing for simultaneous diagonalization. The conclusion is that if A and B have the same eigenvectors, they can be diagonalized by the same orthogonal matrix Q.

PREREQUISITES
  • Understanding of orthogonal matrices and their properties
  • Familiarity with the Real Spectral Theorem
  • Knowledge of eigenvalues and eigenvectors
  • Concept of commuting matrices in linear algebra
NEXT STEPS
  • Study the Real Spectral Theorem in detail
  • Learn about the properties of commuting matrices
  • Explore the implications of eigenvector independence
  • Investigate the concept of simultaneous diagonalization
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in the properties of orthogonally diagonalizable matrices and their applications in theoretical and applied mathematics.

braindead101
Suppose that the real matrices A and B are orthogonally diagonalizable and AB=BA. Show that AB is orthogonally diagonalizable.

I know that orthogonally diagonalizable means that you can find an orthogonal matrix Q and a diagonal matrix D such that Q^T A Q = D, i.e., A = Q D Q^T.

I am aware of the Real Spectral Theorem, which states that a real (n×n)-matrix A is orthogonally diagonalizable if and only if A is symmetric.

I got a hint saying I am supposed to use the Real Spectral Theorem twice to show it, but I am still unsure how to do this.
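If it helps to see the hint in action numerically, here is a minimal sketch (not from the original posts) assuming NumPy; the construction of the test matrices is my own illustration. It builds two commuting symmetric matrices that share an orthogonal eigenbasis Q and checks that their product is again symmetric and is diagonalized by the same Q.

```python
import numpy as np

# Build two commuting symmetric matrices by conjugating diagonal
# matrices with the same orthogonal matrix Q (so they share eigenvectors).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal Q
A = Q @ np.diag(rng.standard_normal(4)) @ Q.T
B = Q @ np.diag(rng.standard_normal(4)) @ Q.T

# A and B are symmetric and commute.
assert np.allclose(A, A.T) and np.allclose(B, B.T)
assert np.allclose(A @ B, B @ A)

# (AB)^T = B^T A^T = BA = AB, so AB is symmetric, which is exactly
# what the Real Spectral Theorem needs.
AB = A @ B
assert np.allclose(AB, AB.T)

# Numerical check: the same Q also diagonalizes AB.
D = Q.T @ AB @ Q
assert np.allclose(D, np.diag(np.diag(D)))
print("AB is symmetric and Q^T (AB) Q is diagonal.")
```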
 
I am not familiar with the term "orthogonally diagonalizable".

In order to prove that it is diagonalizable you need to prove that the eigenvectors are independent.

Orthogonal means perpendicular, so I think you should take the columns of the matrix, and if the dot product of each column with every other column equals zero, then it is orthogonal.
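For what it's worth, that column check is how one tests whether a matrix is orthogonal (Q^T Q = I), which is a different property from being orthogonally diagonalizable. A small sketch of the check, assuming NumPy (the helper name is_orthogonal is my own):

```python
import numpy as np

def is_orthogonal(Q, tol=1e-10):
    """Q is orthogonal iff Q^T Q = I, i.e. its columns are
    mutually perpendicular unit vectors."""
    Q = np.asarray(Q, dtype=float)
    return np.allclose(Q.T @ Q, np.eye(Q.shape[1]), atol=tol)

# Example: a 2x2 rotation matrix is orthogonal.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(is_orthogonal(R))  # True
```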
 
Use the fact that commuting matrices are simultaneously diagonalizable.

Specifically, since B commutes with A, if v is an eigenvector of A with eigenvalue a, then A(Bv)=B(Av)=B(av)=a(Bv), so Bv lies in the eigenspace of A for the eigenvalue a. Now, if the eigenspace corresponding to the eigenvalue a is one dimensional, this means Bv must be a multiple of v, i.e., v is also an eigenvector of B. If all the eigenspaces of A are one dimensional, then A and B have exactly the same eigenvectors, and so they are diagonalized by the same orthogonal matrix Q (since the columns of this matrix are precisely the eigenvectors of the matrix being diagonalized).

I'll let you finish the argument and work out what happens when the eigenvalues are degenerate (i.e., when some eigenspaces are more than one dimensional).
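To make the degenerate case concrete, here is one way to carry out simultaneous diagonalization numerically, assuming NumPy (the function simultaneously_diagonalize and the test matrices are my own illustration, not from the thread): diagonalize A first, then diagonalize B restricted to each eigenspace of A.

```python
import numpy as np

def simultaneously_diagonalize(A, B, tol=1e-8):
    """Return an orthogonal Q with Q^T A Q and Q^T B Q both diagonal.
    Assumes A, B are real symmetric and AB = BA."""
    eigvals, U = np.linalg.eigh(A)   # columns of U: orthonormal eigenvectors of A
    Q = np.zeros_like(U)
    start = 0
    while start < len(eigvals):
        # Group together (numerically) equal eigenvalues of A.
        end = start
        while end < len(eigvals) and abs(eigvals[end] - eigvals[start]) < tol:
            end += 1
        block = U[:, start:end]          # orthonormal basis of this eigenspace
        # Since AB = BA, B maps this eigenspace into itself,
        # so restrict B to it and diagonalize the restriction.
        B_block = block.T @ B @ block
        _, V = np.linalg.eigh(B_block)
        Q[:, start:end] = block @ V      # rotate the basis inside the eigenspace
        start = end
    return Q

# Quick check with commuting symmetric matrices that have repeated eigenvalues.
rng = np.random.default_rng(1)
P, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = P @ np.diag([2.0, 2.0, 1.0, 1.0, 3.0]) @ P.T   # degenerate on purpose
B = P @ np.diag([5.0, 4.0, 7.0, 7.0, 6.0]) @ P.T
Q = simultaneously_diagonalize(A, B)
for M in (A, B, A @ B):
    D = Q.T @ M @ Q
    assert np.allclose(D, np.diag(np.diag(D)), atol=1e-8)
print("Q^T A Q, Q^T B Q, and Q^T (AB) Q are all diagonal.")
```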
 
