T/F: Orthogonal matrix has eigenvalues +1, -1

SUMMARY

The discussion examines whether a 3x3 matrix ##A## that is diagonalizable with eigenvalues -1 and +1 must be orthogonal. Starting from the definition ##A^{T}A = I##, it shows that any real eigenvalue of an orthogonal matrix satisfies ##\lambda^2 = 1## and must therefore be +1 or -1. Having eigenvalues ##\pm 1## is thus a necessary condition for orthogonality, but not a sufficient one (a counter-example is sketched at the end of the discussion). The relationship between the determinant and the condition ##A^{T}A = I## is also highlighted.

PREREQUISITES
  • Understanding of orthogonal matrices and their properties
  • Knowledge of eigenvalues and eigenvectors
  • Familiarity with diagonalization of matrices
  • Basic linear algebra concepts, including determinants
NEXT STEPS
  • Study the properties of orthogonal matrices in detail
  • Learn about the diagonalization process of matrices
  • Explore the implications of eigenvalues in linear transformations
  • Investigate counter-examples of matrices that are diagonalizable but not orthogonal
USEFUL FOR

Students of linear algebra, mathematicians, and anyone interested in the properties of matrices and their applications in various fields such as physics and engineering.

Mr Davis 97

Homework Statement


If a 3 x 3 matrix A is diagonalizable with eigenvalues -1 and +1, then it is an orthogonal matrix.

Homework Equations

The Attempt at a Solution


I feel like this statement is false, since the true statement is that if a matrix A is orthogonal, then its determinant is +1 or -1, which has nothing to do with diagonalization. However, I don't see how to prove this rigorously. Would the best way just be to search for a counter-example?
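One way to run such a search numerically is sketched below. This is a minimal sketch assuming numpy is available; the particular ##D## and the random ##P## are illustrative choices, not part of the problem statement.

Code:
import numpy as np

rng = np.random.default_rng(0)

# Search for a counterexample: build A = P D P^(-1) with eigenvalues
# +1 and -1 but a random (generally non-orthogonal) eigenvector matrix P,
# then test whether A^T A = I holds.
D = np.diag([1.0, -1.0, 1.0])
for _ in range(5):
    P = rng.normal(size=(3, 3))              # almost surely invertible
    A = P @ D @ np.linalg.inv(P)             # diagonalizable by construction
    print(np.allclose(A.T @ A, np.eye(3)))   # typically False

Any ##P## for which the check prints False yields an ##A## that is diagonalizable with eigenvalues ##\pm 1## yet fails ##A^{T}A = I##.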
 
What do you know about the eigenvectors of an orthogonal matrix?
 
I can't seem to recall anything about the eigenvectors of an orthogonal matrix... I looked it up and couldn't find anything either.
 
Mr Davis 97 said:
I can't seem to recall anything about the eigenvectors of an orthogonal matrix... I looked it up and couldn't find anything either.
How is an orthogonal matrix defined? And how do eigenvalues behave in that equation?
 
fresh_42 said:
How is an orthogonal matrix defined? And how do eigenvalues behave in that equation?
An orthogonal matrix is a square matrix such that ##A^{T}A = I##. I don't see how this can be used to analyze its eigenvalues. I can only see that ##\det A = 1## or ##-1##, since ##\det(A^{T}A) = (\det A)^2 = \det I = 1##. But those aren't eigenvalues.
 
Mr Davis 97 said:
An orthogonal matrix is a square matrix such that ##A^{T}A = I##. I don't see how this can be used to analyze its eigenvalues. I can only see that ##\det A = 1## or ##-1##, since ##\det(A^{T}A) = (\det A)^2 = \det I = 1##. But those aren't eigenvalues.
Firstly, we have to show (or know) that the eigenvalues of ##A^{T}## are the same as those of ##A##.
This follows from ##\det(M)=\det(M^T)##: ##\det(\lambda I -A^T)=\det((\lambda I-A^T)^T)= \det(\lambda I^T - (A^T)^T)=\det(\lambda I - A)##.

Secondly, for an eigenvector ##x## of ##A## with eigenvalue ##\lambda##, what do we get from ##Ax = \lambda x\,##?

Now what does it mean for ##A## to be diagonalizable, and what must be the values of this diagonal?
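The first point is easy to check numerically. Below is a minimal sketch assuming numpy; the test matrix is an arbitrary choice.

Code:
import numpy as np

# An arbitrary test matrix; any square matrix will do.
M = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 4.0],
              [5.0, 0.0, 1.0]])

# det(lambda*I - M) = det(lambda*I - M^T), so the spectra must agree.
ev_M  = np.sort_complex(np.linalg.eigvals(M))
ev_MT = np.sort_complex(np.linalg.eigvals(M.T))
print(np.allclose(ev_M, ev_MT))  # True

The sorted spectra agree, consistent with the characteristic-polynomial identity above.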
 
fresh_42 said:
Firstly, we have to show (or know) that the eigenvalues of ##A^{T}## are the same as those of ##A##.
This follows from ##\det(M)=\det(M^T)##: ##\det(\lambda I -A^T)=\det((\lambda I-A^T)^T)= \det(\lambda I^T - (A^T)^T)=\det(\lambda I - A)##.

Secondly, for an eigenvector ##x## of ##A## with eigenvalue ##\lambda##, what do we get from ##Ax = \lambda x\,##?

Now what does it mean for ##A## to be diagonalizable, and what must be the values of this diagonal?
If ##A## is diagonalizable, then it can be written as ##A = PDP^{-1}##, where ##P## is a matrix of linearly independent eigenvectors of ##A## and ##D## is a diagonal matrix with the eigenvalues of ##A## on its diagonal. But I'm not seeing how this definition helps me.

Also, I'm not sure what you mean by asking what we get from ##Ax = \lambda x##. The only thing I see to do with that might be to multiply both sides by the transpose of ##A## and see what happens...
 
I meant the consideration of an orthogonal matrix ##A##. For an eigenvector ##x## of ##A## with real eigenvalue ##\lambda##, orthogonality gives ##x^{T}x = x^{T}I\,x = x^{T}A^{T}A\,x = (Ax)^{T}(Ax) = \lambda^2\,x^{T}x##, and since ##x \neq 0## this forces ##\lambda^2 = 1##, which over the real numbers means ##\lambda \in \{-1,+1\}##.

So the real eigenvalues of an orthogonal matrix must be ##\pm 1##.

Now we have the opposite situation: the eigenvalues are given as ##\pm 1##, which is the necessary condition. Checked.
Then we have that ##A## is diagonalizable, i.e. ##A=PDP^{-1}## for some invertible matrix ##P##.
The diagonal entries of ##D## also have to be ##\pm 1## by the given condition, which means ##D=D^T=D^{-1}## and hence ##A^2 = PD^2P^{-1} = I##, i.e. ##A=A^{-1}##.
Thus showing ##A^TA=I## is equivalent to showing ##A^T=A##, i.e. ##A^T=A^{-1}##.

Here is where I'm stuck. Solving the corresponding linear equations is rather unpleasant. (I'll answer if I find the trick.)
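As the original poster suspected, the statement is in fact false: ##A^T = A## can fail. Conjugating ##D## by a non-orthogonal ##P## produces an involutory but non-orthogonal ##A##. Below is a minimal numerical sketch, assuming numpy; the shear matrix ##P## is one arbitrary choice.

Code:
import numpy as np

# Conjugate D = diag(1, -1, 1) by a shear P: invertible, but not orthogonal.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
D = np.diag([1.0, -1.0, 1.0])
A = P @ D @ np.linalg.inv(P)             # A = [[1, -2, 0], [0, -1, 0], [0, 0, 1]]

print(np.linalg.eigvals(A))              # eigenvalues 1, -1, 1
print(np.allclose(A @ A, np.eye(3)))     # True:  A equals its own inverse
print(np.allclose(A.T @ A, np.eye(3)))   # False: A is not orthogonal

So eigenvalues ##\pm 1## together with diagonalizability are necessary but not sufficient for orthogonality; sufficiency would additionally require an orthonormal eigenbasis (##P## orthogonal), which forces ##A## to be symmetric.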
 