Eigenvalues of an orthogonal matrix

  • #1
etotheipi
Homework Statement
Show that the three eigenvalues of a real orthogonal 3x3 matrix are ##e^{i\alpha}##, ##e^{-i\alpha}##, and ##+1## or ##-1##, where ##\alpha \in \mathbb{R}##.
Relevant Equations
N/A
I'm fairly stuck, I can't figure out how to start. I called the matrix ##\mathbf{A}## so then it gives us that ##\mathbf{A}\mathbf{A}^\intercal = \mathbf{I}## from the orthogonal bit. I tried 'determining' both sides... $$(\det(\mathbf{A}))^{2} = 1 \implies \det{\mathbf{A}} = \pm 1$$I don't think this helps me, since I need to show that ##\det{(\mathbf{A} - \lambda \mathbf{I})} = 0## for the values of ##\lambda## in the question and as far as I'm aware there is no rule for ##\det{(\mathbf{a}+\mathbf{b})}##.

I don't know if there is a general form of an orthogonal matrix I can use as a way in (I suspect there isn't, though this might be wrong!). I wondered whether anyone could give me a hint! Thank you.
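As a numerical sanity check (not a proof), one can build a random orthogonal matrix with NumPy's QR factorization and confirm both facts: ##\det\mathbf{A} = \pm 1## and, as the thread goes on to show, every eigenvalue has modulus 1.

```python
import numpy as np

# QR factorization of a random matrix gives an orthogonal factor Q
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

assert np.allclose(Q @ Q.T, np.eye(3))         # A A^T = I (orthogonality)
assert np.isclose(abs(np.linalg.det(Q)), 1.0)  # det A = +1 or -1
lam = np.linalg.eigvals(Q)
assert np.allclose(np.abs(lam), 1.0)           # every eigenvalue has |λ| = 1
```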
 
  • #2
etotheipi said:
Homework Statement:: Show that the three eigenvalues of a real orthogonal 3x3 matrix are ##e^{i\alpha}##, ##e^{-i\alpha}##, and ##+1## or ##-1##, where ##\alpha \in \mathbb{R}##.
Relevant Equations:: N/A

I'm fairly stuck, I can't figure out how to start. I called the matrix ##\mathbf{A}## so then it gives us that ##\mathbf{A}\mathbf{A}^\intercal = \mathbf{I}## from the orthogonal bit. I tried 'determining' both sides... $$(\det(\mathbf{A}))^{2} = 1 \implies \det{\mathbf{A}} = \pm 1$$I don't think this helps me, since I need to show that ##\det{(\mathbf{A} - \lambda \mathbf{I})} = 0## for the values of ##\lambda## in the question and as far as I'm aware there is no rule for ##\det{(\mathbf{a}+\mathbf{b})}##.

I don't know if there is a general form of an orthogonal matrix I can use as a way in (I suspect there isn't, though this might be wrong!). I wondered whether anyone could give me a hint! Thank you.
Hint: what sort of characteristic equation do you get for a 3x3 matrix?
 
  • #3
PeroK said:
Hint: what sort of characteristic equation do you get for a 3x3 matrix?

I get

$$\det \left( \begin{pmatrix}a & b & c\\d & e & f\\g & h & i\\\end{pmatrix} - \begin{pmatrix}\lambda & 0 & 0\\0 & \lambda & 0\\0 & 0 & \lambda\\\end{pmatrix} \right) = 0$$ $$\implies (a-\lambda)[(e-\lambda)(i-\lambda) - hf] - b[d(i-\lambda) - gf] + c(dh - g(e-\lambda)) = 0$$

:nb)
 
  • #4
etotheipi said:
I get

$$\det \left( \begin{pmatrix}a & b & c\\d & e & f\\g & h & i\\\end{pmatrix} - \begin{pmatrix}\lambda & 0 & 0\\0 & \lambda & 0\\0 & 0 & \lambda\\\end{pmatrix} \right) = 0$$ $$\implies (a-\lambda)[(e-\lambda)(i-\lambda) - hf] - b[d(i-\lambda) - gf] + c(dh - g(e-\lambda)) = 0$$

:nb)
How would you, in general, describe any such equation?
 
  • #5
PeroK said:
How would you, in general, describe any such equation?

I expanded it all out (hope I didn't mess anything up!) and got

##\lambda^{3} -(a+e+i)\lambda^{2} + (ai + ea + ei - hf - bd - gc)\lambda - (aei + gfb + cdh - hfa - bdi - ceg) = 0##
The coefficient of ##\lambda^{2}## I can tell to be minus the trace of ##\mathbf{A}##, the final constant term is I think minus the determinant, whilst the coefficient of ##\lambda## I can't assign to anything of meaning.
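The expansion can be checked mechanically with NumPy: `np.poly(A)` returns the coefficients of the monic characteristic polynomial ##\det(\lambda I - A)##, which for a 3x3 matrix are ##[1,\ -\operatorname{tr}A,\ (\text{sum of principal } 2\times 2 \text{ minors}),\ -\det A]##.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
a, b, c, d, e, f, g, h, i = A.ravel()

coeffs = np.poly(A)  # monic char poly of A, highest-degree coefficient first
expected = [1.0,
            -(a + e + i),                          # minus the trace
            a*i + e*a + e*i - h*f - b*d - g*c,     # sum of principal 2x2 minors
            -np.linalg.det(A)]                     # minus the determinant
assert np.allclose(coeffs, expected)
```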
 
  • #6
etotheipi said:
I expanded it all out (hope I didn't mess anything up!) and got

##\lambda^{3} -(a+e+i)\lambda^{2} + (ai + ea + ei - hf - bd - gc)\lambda - (aei + gfb + cdh - hfa - bdi - ceg) = 0##
The coefficient of ##\lambda^{2}## I can tell to be minus the trace of ##\mathbf{A}##, the final constant term is I think minus the determinant, whilst the coefficient of ##\lambda## I can't assign to anything of meaning.
It's a cubic equation!
 
  • #7
PeroK said:
It's a cubic equation!

Right so then we have 2 complex roots and 1 real root. The coefficients are real so that fixes ##e^{i\alpha}## and ##e^{-i\alpha}## to be two arbitrary solutions, so all that remains is to show that ##\pm 1## is the other solution. And I think we don't need to prove that algebraically, a geometrical argument should do?
 
  • #8
etotheipi said:
Right so then we have 2 complex roots and 1 real root. The coefficients are real so that fixes ##e^{i\alpha}## and ##e^{-i\alpha}## to be two arbitrary solutions, so all that remains is to show that ##\pm 1## is the other solution. And I think we don't need to prove that algebraically, a geometrical argument should do?
Not quite. It means you have one real root and a pair of conjugate roots.

Do you know the properties of an orthogonal matrix? There's a key (alternative defining) property.
 
  • #9
PeroK said:
Not quite. It means you have one real root and a pair of conjugate roots.

Do you know the properties of an orthogonal matrix? There's a key (alternative defining) property.

Yeah, I just realized I forgot about the modulus, so at the moment we're stuck with a real solution and two complex ones of the form ##\beta e^{i \alpha}## and ##\beta e^{-i \alpha}##.

I went onto Wikipedia and found that for a vector ##\vec{v}## and its transpose ##\vec{v}^\intercal##, if ##Q## is orthogonal then

##\vec{v}^\intercal \vec{v} = \vec{v}^\intercal Q^\intercal Q \vec{v}##

I'll play around with that for a little bit and see if I can get anything useful, assuming that's the one you were making reference to!
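That identity is easy to test numerically: the following sketch checks that ##\vec{v}^\intercal Q^\intercal Q \vec{v} = \vec{v}^\intercal \vec{v}## for an orthogonal ##Q##, i.e. that ##Q## preserves the norm.

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal Q
v = rng.standard_normal(3)

# v^T Q^T Q v = v^T v, so Q preserves the norm: ||Qv|| = ||v||
assert np.isclose(v @ Q.T @ Q @ v, v @ v)
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))
```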
 
  • #10
etotheipi said:
Yeah, I just realized I forgot about the modulus, so at the moment we're stuck with a real solution and two complex ones of the form ##\beta e^{i \alpha}## and ##\beta e^{-i \alpha}##.

I went onto Wikipedia and found that for a vector ##\vec{v}## and its transpose ##\vec{v}^\intercal##, if ##Q## is orthogonal then

##\vec{v}^\intercal \vec{v} = \vec{v}^\intercal Q^\intercal Q \vec{v}##

I'll play around with that for a little bit and see if I can get anything useful, assuming that's the one you were making reference to!
Yes, although I would think of that in terms of the inner product (and the norm).
 
  • #11
PeroK said:
Yes, although I would think of that in terms of the inner product (and the norm).

Yeah that makes more sense, because then ##||A\vec{v}||^{2} = ||\vec{v}||^{2}## since the ##A## and its transpose do some cancelling if ##A## is orthogonal.

In that case, can't we just say the square of the norm is going to just be ##\lambda^{2} ||\vec{v}||^{2}##, which is just ##\lambda^{2}## times itself so the modulus of lambda has to be 1, and this gives us all of our results?
 
  • #12
etotheipi said:
Yeah that makes more sense, because then ##||A\vec{v}||^{2} = ||\vec{v}||^{2}## since the ##A## and its transpose do some cancelling if ##A## is orthogonal.

In that case, can't we just say the square of the norm is going to just be ##\lambda^{2} ||\vec{v}||^{2}##, which is just ##\lambda^{2}## times itself so the modulus of lambda has to be 1, and this gives us all of our results?
Yes, but you need to be careful because ##\lambda## may be complex. The simplest approach is to consider ##A## as a unitary matrix with real coefficients.
 
  • #13
PeroK said:
Yes, but you need to be careful because ##\lambda## may be complex. The simplest approach is to consider ##A## as a unitary matrix with real coefficients.

Yes, completely forgot about that part! I should really write ##||\vec{v}||^2 = ||A \vec{v}||^2 = ||\lambda \vec{v}||^2 = \lambda \lambda^* ||\vec{v}||^2 = |\lambda|^2 ||\vec{v}||^2##. Thanks for the help!
 
  • #14
etotheipi said:
Right so then we have 2 complex roots and 1 real root.
How did you conclude that two of the roots are complex? A cubic could have three real roots.
 
  • #15
vela said:
How did you conclude that two of the roots are complex? A cubic could have three real roots.

I suppose you're right, I just assumed from how the question was phrased that there were only two possibilities for real roots (##1## and ##-1##), and this indeed turned out to be the case since there are only two reals with modulus 1.

But the point stands, I should have been more careful at that stage :wink:.
 
  • #16
etotheipi said:
I suppose you're right, I just assumed from how the question was phrased that there were only two possibilities for real roots (##1## and ##-1##), and this indeed turned out to be the case since there are only two reals with modulus 1.

But the point stands, I should have been more careful at that stage :wink:.
There are the degenerate cases where ##\lambda = \pm 1## only. Although these do meet the criteria in each case.
 
  • #17
PeroK said:
There are the degenerate cases where ##\lambda = \pm 1## only.

Oh, do you mean if we got something like ##\alpha = \pi## or ##\alpha = 0## so that we would indeed have three real roots, which might be ##(\lambda_1, \lambda_2, \lambda_3) = (1,1,1), (1,1,-1), (1,-1,-1), (-1,-1,-1)##. Hadn't thought of that either!
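These degenerate combinations can be produced concretely from rotations about the z-axis (the `rot_z` helper below is illustrative, not from the thread): ##\alpha = 0## gives ##(1,1,1)##, ##\alpha = \pi## gives ##(-1,-1,1)##, and composing with a reflection flips the remaining ##+1##.

```python
import numpy as np

def rot_z(alpha):
    """Rotation by alpha about the z-axis; orthogonal for any alpha."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# alpha = 0: the identity, eigenvalues (1, 1, 1)
assert np.allclose(np.sort(np.linalg.eigvals(rot_z(0.0)).real), [1, 1, 1])
# alpha = pi: eigenvalues (e^{i pi}, e^{-i pi}, 1) = (-1, -1, 1), all real
assert np.allclose(np.sort(np.linalg.eigvals(rot_z(np.pi)).real), [-1, -1, 1])
# composing with a reflection in the xy-plane flips the +1: (-1, -1, -1)
R = rot_z(np.pi) @ np.diag([1.0, 1.0, -1.0])
assert np.allclose(np.sort(np.linalg.eigvals(R).real), [-1, -1, -1])
```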
 
  • #18
etotheipi said:
Oh, do you mean if we got something like ##\alpha = \pi## or ##\alpha = 0## so that we would indeed have three real roots, which might be ##(\lambda_1, \lambda_2, \lambda_3) = (1,1,1), (1,1,-1), (1,-1,-1), (-1,-1,-1)##. Hadn't thought of that either!
I didn't think of the degenerate cases, but yes you can make it work like that for those as well.
 
  • #19
PeroK said:
There are the degenerate cases where ##\lambda = \pm 1## only. Although these do meet the criteria in each case.
These occur iff the real orthogonal matrix is symmetric. I don't really view involutions as "degenerate" though. In fact involutions are quite nice.
 

1. What are eigenvalues of an orthogonal matrix?

The eigenvalues of an orthogonal matrix are the values that satisfy the equation Ax = λx, where A is the matrix, x is a non-zero vector, and λ is a scalar. In other words, they are the scalars λ for which the matrix maps some non-zero vector to λ times itself.

2. How do you calculate the eigenvalues of an orthogonal matrix?

The eigenvalues of an orthogonal matrix can be calculated by finding the roots of the characteristic polynomial of the matrix, det(A-λI) = 0, where I is the identity matrix. Numerically this is typically done with iterative methods such as the QR algorithm.
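For instance (using NumPy's solver rather than implementing QR iteration by hand), the cyclic permutation matrix is orthogonal, and solving det(A − λI) = 0 for it gives the three cube roots of unity: ##1##, ##e^{2\pi i/3}##, ##e^{-2\pi i/3}##.

```python
import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])        # cyclic permutation: orthogonal
assert np.allclose(P @ P.T, np.eye(3))

lam = np.linalg.eigvals(P)             # roots of det(P - λI) = λ³ - 1 = 0... wait, of λ³ - 1
assert np.allclose(np.abs(lam), 1.0)   # all on the unit circle
assert np.allclose(lam**3, 1.0)        # the three cube roots of unity
```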

3. What is the significance of eigenvalues in an orthogonal matrix?

The eigenvalues of an orthogonal matrix represent the scaling factors of the eigenvectors of the matrix. They also have important applications in various fields such as physics, engineering, and computer science.

4. Can an orthogonal matrix have complex eigenvalues?

Yes. A real orthogonal matrix has real entries, so its characteristic polynomial has real coefficients, and any non-real eigenvalues occur in conjugate pairs ##e^{i\alpha}##, ##e^{-i\alpha}## of modulus 1. Rotation matrices are the standard example. All eigenvalues are real (and equal to ±1) only when the matrix is also symmetric.
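A quick check that complex eigenvalues of modulus 1 really do occur: a 2×2 rotation by θ is orthogonal and has eigenvalues ##e^{\pm i\theta}##.

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrix: orthogonal
assert np.allclose(R @ R.T, np.eye(2))

lam = np.sort_complex(np.linalg.eigvals(R))        # sorted: e^{-iθ}, then e^{iθ}
assert np.allclose(lam, [np.exp(-1j * theta), np.exp(1j * theta)])
assert np.allclose(np.abs(lam), 1.0)               # both on the unit circle
```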

5. How do the eigenvalues of an orthogonal matrix relate to its determinant and trace?

The determinant of an orthogonal matrix is equal to the product of its eigenvalues, while the trace (sum of diagonal elements) is equal to the sum of its eigenvalues, counted with multiplicity. This relationship holds for all square matrices, not just orthogonal ones; for an orthogonal matrix it implies the determinant is ±1, since the eigenvalues are ±1 and conjugate pairs ##e^{\pm i\alpha}##.
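A sketch of that relationship for a random orthogonal matrix (built via NumPy's QR factorization):

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix

lam = np.linalg.eigvals(Q)
assert np.isclose(np.prod(lam).real, np.linalg.det(Q))  # product of eigenvalues = determinant
assert np.isclose(np.sum(lam).real, np.trace(Q))        # sum of eigenvalues = trace
assert np.isclose(abs(np.linalg.det(Q)), 1.0)           # for orthogonal Q: det = ±1
```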
