Orthogonal matrix and eigenvalues

In summary, we are given an orthogonal matrix M with a determinant of 1. To show that M has an eigenvalue of 1, we need to prove that det(M-I)=0. Using the property M^TM=I of orthogonal matrices, we can show that det(M-I)=det(I-M^T). Since det(A)=det(A^T) and M is 3 by 3, this gives det(M-I)=det(I-M)=-det(M-I), which forces det(M-I)=0. Therefore 1 must be an eigenvalue of M.
  • #1
wormbox
a) Let M be a 3 by 3 orthogonal matrix and let det(M)=1. Show that M has 1 as an eigenvalue. Hint: prove that det(M-I)=0.

I think I'm supposed to begin from the fact that

det(M)=1=det(I)=det(M^TM), and from there reach det(M-I)=0, which of course would mean that there's an eigenvalue of 1, as det(M-tI)=0 for any eigenvalue t.

I mean:

det(M)=1=det(I)=det(M^TM) => hard work => det(M-1I)=0 => t=1
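As a sanity check before doing the hard work, the claim can be verified numerically. This is a sketch using NumPy; the angle and the z-axis rotation matrix are my own example, not part of the problem.

```python
import numpy as np

# A concrete 3x3 orthogonal matrix with det = 1: rotation about the
# z-axis by an arbitrary angle theta (my own choice of example).
theta = 0.7
M = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

assert np.allclose(M.T @ M, np.eye(3))    # orthogonality: M^T M = I
assert np.isclose(np.linalg.det(M), 1.0)  # det(M) = 1

# The claim to prove: det(M - I) = 0, i.e. t = 1 solves det(M - tI) = 0.
print(np.linalg.det(M - np.eye(3)))       # numerically ~0
```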

b) Let M be a 3 by 3 orthogonal matrix and let det(M)=1. Show that either M=I or the eigenvalue 1 is of rank 1. Hint: M is diagonalizable.

I guess I know how to show the case where M=I:
det(M)=1=det(I)=det(P^TP)=det(P^TIP)=det(P^TMP) => M=I.

But that doesn't cover the fact that the rank of the eigenvalue could be 1.
 
  • #2
What is the definition of the "rank" of an eigenvalue?
 
  • #3
I'm sorry. I probably meant order, not rank.

[tex]det(M-tI)=(-1)^3(t-\lambda _1)(t-\lambda _2)(t-\lambda _3)[/tex]

and only one of the eigenvalues would be 1?

For a) I've managed to get the following. I don't know if I'm supposed to apply the fact that I'm dealing with a 3 x 3 matrix, so that det(-M)=(-1)³det(M)...

[tex]\begin{array}{l}
det(M)=1=det(I)\ \ \ |\cdot det(M^T-I) \Rightarrow \\
det(M)det(M^T-I)=det(I)det(M^T-I) \\
det(I-M)=det(M^T-I) \\
-det(M-I)=det(M^T-I) \\
det(M-I)=det(I-M^T) \end{array} [/tex]
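One way to finish from that last line (a sketch, using only det(A)=det(A^T) and the 3 x 3 sign fact mentioned above):

[tex]\begin{array}{l}
det(M-I)=det(I-M^T)=det((I-M)^T)=det(I-M) \\
det(I-M)=det(-(M-I))=(-1)^3 det(M-I)=-det(M-I) \\
\Rightarrow 2\,det(M-I)=0 \Rightarrow det(M-I)=0 \end{array}[/tex]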
 
  • #4
Since M is diagonalizable, let's work in a basis where it is diagonal. M is orthogonal, which means the transpose of M times M equals I (in any basis). What does this tell us about each eigenvalue?
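A numerical illustration of that hint (my own example, not part of the thread): M^TM = I forces every eigenvalue of an orthogonal M to have modulus 1.

```python
import numpy as np

# Build a random special orthogonal 3x3 matrix via QR decomposition
# (an assumption of this sketch: any such matrix works equally well).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:      # flip one column so that det(Q) = +1
    Q[:, 0] = -Q[:, 0]

# Every eigenvalue of an orthogonal matrix has modulus 1.
eigvals = np.linalg.eigvals(Q)
print(np.abs(eigvals))        # each modulus is ~1.0
```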
 
  • #5
Now I've got the right word: multiplicity of an eigenvalue, not order or rank.

How about part a)? How to proceed, or am I even on the right track?

Avodyne said:
What does this tell us about each eigenvalue?
I can only say that I don't know or understand. I could guess that because of what you said about the bases of an orthogonal matrix, then an orthogonal matrix has only linearly independent eigenvectors, which in turn would mean that all the eigenvalues are distinct.
 
  • #6
wormbox said:
Now I've got the right word: multiplicity of an eigenvalue, not order or rank.

How about part a)? How to proceed, or am I even on the right track?


I can only say that I don't know or understand. I could guess that because of what you said about the bases of an orthogonal matrix, then an orthogonal matrix has only linearly independent eigenvectors, which in turn would mean that all the eigenvalues are distinct.

The eigenvalues are roots of a real polynomial. That means the eigenvalues are either real or come in complex conjugate pairs. Does that help?
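A quick numerical illustration of that hint (my own example, a z-axis rotation): the three eigenvalues come out as one real number plus a complex-conjugate pair, their product equals det(M) = 1, and the real one is exactly 1.

```python
import numpy as np

# Rotation about the z-axis: a real 3x3 orthogonal matrix, det = 1.
theta = 1.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])

# Roots of a real cubic: one real root plus a conjugate pair (or three
# reals).  Product of the roots = det(Rz) = 1, and each has modulus 1,
# so the real root must be +1.
eigvals = np.linalg.eigvals(Rz)
print(eigvals)   # e^{+i theta}, e^{-i theta}, and 1 (in some order)
```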
 

What is an orthogonal matrix?

An orthogonal matrix is a square matrix whose columns (and likewise rows) are mutually perpendicular unit vectors, i.e. they form an orthonormal set. This means that when the matrix is multiplied by its transpose, the result is the identity matrix.

What are the properties of an orthogonal matrix?

Some properties of an orthogonal matrix include:

  • All of its columns and rows are orthogonal (perpendicular) to each other.
  • The magnitude of each column and row is equal to 1.
  • The determinant of an orthogonal matrix is either 1 or -1.
  • The inverse of an orthogonal matrix is equal to its transpose.
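The properties above can be checked directly on a sample matrix. This sketch uses a signed-permutation-style matrix as the example (my own choice; any rotation matrix works too).

```python
import numpy as np

# A cyclic permutation matrix: orthogonal, and det = +1 here.
M = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

assert np.allclose(M.T @ M, np.eye(3))              # orthonormal columns
assert np.allclose(np.linalg.norm(M, axis=0), 1.0)  # each column has norm 1
assert np.isclose(abs(np.linalg.det(M)), 1.0)       # det is +1 or -1
assert np.allclose(np.linalg.inv(M), M.T)           # inverse = transpose
print("all four properties hold")
```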

What is an eigenvalue?

An eigenvalue is a scalar that gives the factor by which a linear transformation stretches (or flips) one of its eigenvectors. It is often denoted as λ and is a solution to the equation Av = λv, where A is a square matrix and v is a non-zero vector.

How are eigenvalues and eigenvectors related?

Eigenvalues and eigenvectors are closely related. An eigenvector is a vector that, when multiplied by a matrix, results in a scalar multiple of itself; that scalar is the eigenvalue. In other words, multiplying a matrix by one of its eigenvectors only rescales the vector along its own line, it does not rotate it to a new direction.
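The relation Av = λv can be seen concretely. This is a minimal sketch; the 2x2 symmetric matrix is my own example.

```python
import numpy as np

# A symmetric matrix, so its eigenvalues are real and numpy's eigh
# applies (it returns eigenvalues in ascending order).
A = np.array([[2., 1.],
              [1., 2.]])

lam, vecs = np.linalg.eigh(A)
v = vecs[:, 0]                     # eigenvector for the smallest eigenvalue

# A merely rescales v by its eigenvalue: A v = lambda v.
assert np.allclose(A @ v, lam[0] * v)
print(lam)                         # eigenvalues 1 and 3
```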

How are orthogonal matrices and eigenvalues used in mathematics and science?

Orthogonal matrices and eigenvalues have many applications in mathematics and science. They are commonly used in linear algebra, signal processing, and quantum mechanics. In particular, they are useful for solving systems of linear equations, finding principal components in data analysis, and determining the energy states of quantum systems.
