Eigenvalues of an orthogonal matrix


Homework Help Overview

The discussion revolves around the eigenvalues of a real orthogonal 3x3 matrix, specifically exploring the characteristic equation and the nature of its roots. Participants are attempting to understand the implications of the orthogonality condition on the eigenvalues.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the determinant of the matrix and its implications for eigenvalues. There are attempts to derive the characteristic polynomial and identify the nature of its roots, with some questioning the assumptions about the roots being complex or real.

Discussion Status

There is an active exploration of the properties of orthogonal matrices and their eigenvalues. Some participants have provided hints and guidance regarding the characteristic equation, while others are reflecting on their assumptions and the implications of the orthogonality condition.

Contextual Notes

Participants are working under the constraints of a homework problem, which may limit the information available and the methods they can employ. There is an ongoing discussion about the nature of the roots of the characteristic polynomial and the properties of orthogonal matrices.

etotheipi
Homework Statement
Show that the three eigenvalues of a real orthogonal 3x3 matrix are ##e^{i\alpha}##, ##e^{-i\alpha}##, and ##+1## or ##-1##, where ##\alpha \in \mathbb{R}##.
Relevant Equations
N/A
I'm fairly stuck and can't figure out how to start. I called the matrix ##\mathbf{A}##, so orthogonality gives us ##\mathbf{A}\mathbf{A}^\intercal = \mathbf{I}##. I tried 'determining' both sides... $$(\det\mathbf{A})^{2} = 1 \implies \det\mathbf{A} = \pm 1$$ I don't think this helps me, since I need to show that ##\det(\mathbf{A} - \lambda \mathbf{I}) = 0## for the values of ##\lambda## in the question, and as far as I'm aware there is no general rule for ##\det(\mathbf{A}+\mathbf{B})##.

I don't know if there is a general form of an orthogonal matrix I can use as a way in (I suspect there isn't, though this might be wrong!). I wondered whether anyone could give me a hint! Thank you.
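The ##\det\mathbf{A} = \pm 1## observation can at least be sanity-checked numerically. Below is a minimal NumPy sketch (the QR construction and the seed are just illustrative choices) that builds a random real orthogonal 3x3 matrix and confirms the determinant is ##+1## or ##-1##:

```python
import numpy as np

# Build a random real orthogonal 3x3 matrix via QR factorisation
# of a Gaussian random matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

assert np.allclose(Q @ Q.T, np.eye(3))         # Q Q^T = I (orthogonality)
assert np.isclose(abs(np.linalg.det(Q)), 1.0)  # det Q = +1 or -1
```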
 
etotheipi said:
Homework Statement:: Show that the three eigenvalues of a real orthogonal 3x3 matrix are ##e^{i\alpha}##, ##e^{-i\alpha}##, and ##+1## or ##-1##, where ##\alpha \in \mathbb{R}##.
Relevant Equations:: N/A

I'm fairly stuck and can't figure out how to start. I called the matrix ##\mathbf{A}##, so orthogonality gives us ##\mathbf{A}\mathbf{A}^\intercal = \mathbf{I}##. I tried 'determining' both sides... $$(\det\mathbf{A})^{2} = 1 \implies \det\mathbf{A} = \pm 1$$ I don't think this helps me, since I need to show that ##\det(\mathbf{A} - \lambda \mathbf{I}) = 0## for the values of ##\lambda## in the question, and as far as I'm aware there is no general rule for ##\det(\mathbf{A}+\mathbf{B})##.

I don't know if there is a general form of an orthogonal matrix I can use as a way in (I suspect there isn't, though this might be wrong!). I wondered whether anyone could give me a hint! Thank you.
Hint: what sort of characteristic equation do you get for a 3x3 matrix?
 
PeroK said:
Hint: what sort of characteristic equation do you get for a 3x3 matrix?

I get

$$\det \left( \begin{pmatrix}a & b & c\\d & e & f\\g & h & i\\\end{pmatrix} - \begin{pmatrix}\lambda & 0 & 0\\0 & \lambda & 0\\0 & 0 & \lambda\\\end{pmatrix} \right) = 0$$ $$(a-\lambda)[(e-\lambda)(i-\lambda) - hf] - b[d(i-\lambda) - gf] + c[dh - g(e-\lambda)] = 0$$

:nb)
 
etotheipi said:
I get

$$\det \left( \begin{pmatrix}a & b & c\\d & e & f\\g & h & i\\\end{pmatrix} - \begin{pmatrix}\lambda & 0 & 0\\0 & \lambda & 0\\0 & 0 & \lambda\\\end{pmatrix} \right) = 0$$ $$(a-\lambda)[(e-\lambda)(i-\lambda) - hf] - b[d(i-\lambda) - gf] + c[dh - g(e-\lambda)] = 0$$

:nb)
How would you, in general, describe any such equation?
 
PeroK said:
How would you, in general, describe any such equation?

I expanded it all out (hope I didn't mess anything up!) and got

##\lambda^{3} - (a+e+i)\lambda^{2} + (ai + ea + ei - hf - bd - cg)\lambda - (aei + bfg + cdh - afh - bdi - ceg) = 0##
The coefficient of ##\lambda^{2}## I can tell to be the trace of ##\mathbf{A}##, and the constant term is ##-\det\mathbf{A}##, whilst the coefficient of ##\lambda## I can't assign to anything of meaning.
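The pattern in this cubic can be checked numerically. The sketch below (using NumPy's `np.poly` on an arbitrary random matrix) confirms that for a 3x3 matrix the characteristic polynomial ##\det(\lambda\mathbf{I} - \mathbf{A})## has coefficients ##1##, ##-\operatorname{tr}\mathbf{A}##, the sum of the principal 2x2 minors (the "mystery" ##\lambda## coefficient), and ##-\det\mathbf{A}##:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# np.poly(A) returns the coefficients of det(lambda*I - A), highest power first
coeffs = np.poly(A)

trace = np.trace(A)
det = np.linalg.det(A)
# Sum of the three principal 2x2 minors (delete row k and column k, take det)
M = sum(np.linalg.det(np.delete(np.delete(A, k, axis=0), k, axis=1))
        for k in range(3))

# lambda^3 - tr(A) lambda^2 + M lambda - det(A)
assert np.allclose(coeffs, [1.0, -trace, M, -det])
```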
 
etotheipi said:
I expanded it all out (hope I didn't mess anything up!) and got

##\lambda^{3} - (a+e+i)\lambda^{2} + (ai + ea + ei - hf - bd - cg)\lambda - (aei + bfg + cdh - afh - bdi - ceg) = 0##
The coefficient of ##\lambda^{2}## I can tell to be the trace of ##\mathbf{A}##, and the constant term is ##-\det\mathbf{A}##, whilst the coefficient of ##\lambda## I can't assign to anything of meaning.
It's a cubic equation!
 
PeroK said:
It's a cubic equation!

Right so then we have 2 complex roots and 1 real root. The coefficients are real, so the two complex roots form a conjugate pair, which I can write as ##e^{i\alpha}## and ##e^{-i\alpha}##; all that remains is to show that ##\pm 1## is the other solution. And I think we don't need to prove that algebraically, a geometrical argument should do?
 
etotheipi said:
Right so then we have 2 complex roots and 1 real root. The coefficients are real, so the two complex roots form a conjugate pair, which I can write as ##e^{i\alpha}## and ##e^{-i\alpha}##; all that remains is to show that ##\pm 1## is the other solution. And I think we don't need to prove that algebraically, a geometrical argument should do?
Not quite. It means you have one real root and a pair of conjugate roots.

Do you know the properties of an orthogonal matrix? There's a key (alternative defining) property.
 
PeroK said:
Not quite. It means you have one real root and a pair of conjugate roots.

Do you know the properties of an orthogonal matrix? There's a key (alternative defining) property.

Yeah, I just realized I forgot about the modulus, so at the moment we're stuck with a real solution and two complex ones of the form ##\beta e^{i \alpha}## and ##\beta e^{-i \alpha}##.

I went onto Wikipedia and found that for a vector ##\vec{v}## and its transpose ##\vec{v}^\intercal##, if ##Q## is orthogonal then

##\vec{v}^\intercal \vec{v} = \vec{v}^\intercal Q^\intercal Q \vec{v}##

I'll play around with that for a little bit and see if I can get anything useful, assuming that's the one you were making reference to!
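That identity is easy to test numerically. A minimal sketch (random orthogonal ##Q## via QR, arbitrary seed) verifying that ##\vec{v}^\intercal Q^\intercal Q \vec{v} = \vec{v}^\intercal \vec{v}##, i.e. that ##Q## preserves the Euclidean norm:

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthogonal matrix
v = rng.standard_normal(3)

# v^T Q^T Q v = v^T v, so Q preserves the squared norm ...
assert np.isclose(v @ Q.T @ Q @ v, v @ v)
# ... and hence the norm itself
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))
```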
 
  • #10
etotheipi said:
Yeah, I just realized I forgot about the modulus, so at the moment we're stuck with a real solution and two complex ones of the form ##\beta e^{i \alpha}## and ##\beta e^{-i \alpha}##.

I went onto Wikipedia and found that for a vector ##\vec{v}## and its transpose ##\vec{v}^\intercal##, if ##Q## is orthogonal then

##\vec{v}^\intercal \vec{v} = \vec{v}^\intercal Q^\intercal Q \vec{v}##

I'll play around with that for a little bit and see if I can get anything useful, assuming that's the one you were making reference to!
Yes, although I would think of that in terms of the inner product (and the norm).
 
  • #11
PeroK said:
Yes, although I would think of that in terms of the inner product (and the norm).

Yeah that makes more sense, because then ##||A\vec{v}||^{2} = ||\vec{v}||^{2}##, since ##A## and its transpose cancel when ##A## is orthogonal.

In that case, can't we just say that ##||A\vec{v}||^{2} = \lambda^{2} ||\vec{v}||^{2}##, and since this must equal ##||\vec{v}||^{2}## itself, the modulus of ##\lambda## has to be 1, which gives us all of our results?
 
  • #12
etotheipi said:
Yeah that makes more sense, because then ##||A\vec{v}||^{2} = ||\vec{v}||^{2}##, since ##A## and its transpose cancel when ##A## is orthogonal.

In that case, can't we just say that ##||A\vec{v}||^{2} = \lambda^{2} ||\vec{v}||^{2}##, and since this must equal ##||\vec{v}||^{2}## itself, the modulus of ##\lambda## has to be 1, which gives us all of our results?
Yes, but you need to be careful because ##\lambda## may be complex. The simplest approach is to consider ##A## as a unitary matrix with real coefficients.
 
  • #13
PeroK said:
Yes, but you need to be careful because ##\lambda## may be complex. The simplest approach is to consider ##A## as a unitary matrix with real coefficients.

Yes, completely forgot about that part! I should really write ##||\vec{v}||^2 = ||A \vec{v}||^2 = ||\lambda \vec{v}||^2 = \lambda \lambda^* ||\vec{v}||^2 = |\lambda|^2 ||\vec{v}||^2##. Thanks for the help!
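The completed argument can also be checked end to end numerically. A sketch (random orthogonal matrix, illustrative seed) confirming that all three eigenvalues have modulus 1, and that at least one is real and equal to ##\pm 1##:

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random real orthogonal matrix

eigvals = np.linalg.eigvals(Q)

# every eigenvalue lies on the unit circle: |lambda| = 1
assert np.allclose(np.abs(eigvals), 1.0)

# a real cubic has at least one real root, and here that root must be +1 or -1
real_ones = eigvals[np.isclose(eigvals.imag, 0.0)]
assert len(real_ones) >= 1
assert np.allclose(np.abs(real_ones.real), 1.0)
```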
 
  • #14
etotheipi said:
Right so then we have 2 complex roots and 1 real root.
How did you conclude that two of the roots are complex? A cubic could have three real roots.
 
  • #15
vela said:
How did you conclude that two of the roots are complex? A cubic could have three real roots.

I suppose you're right, I just assumed from how the question was phrased that there were only two possibilities for real roots (##1## and ##-1##), and this indeed turned out to be the case since there are only two reals with modulus 1.

But the point stands, I should have been more careful at that stage :wink:.
 
  • #16
etotheipi said:
I suppose you're right, I just assumed from how the question was phrased that there were only two possibilities for real roots (##1## and ##-1##), and this indeed turned out to be the case since there are only two reals with modulus 1.

But the point stands, I should have been more careful at that stage :wink:.
There are the degenerate cases where ##\lambda = \pm 1## only, although these do still meet the criteria in each case.
 
  • #17
PeroK said:
There are the degenerate cases where ##\lambda = \pm 1## only.

Oh, do you mean if we got something like ##\alpha = \pi## or ##\alpha = 0## so that we would indeed have three real roots, which might be ##(\lambda_1, \lambda_2, \lambda_3) = (1,1,1), (1,1,-1), (1,-1,-1), (-1,-1,-1)##. Hadn't thought of that either!
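Those degenerate spectra can be realized explicitly: any diagonal matrix with entries ##\pm 1## is orthogonal, with those entries as its (all-real) eigenvalues. A quick NumPy check over all eight sign patterns:

```python
import numpy as np
from itertools import product

# Every diagonal matrix with +/-1 entries is orthogonal, and its eigenvalues
# are exactly the diagonal entries, giving spectra like (1,1,-1), (-1,-1,-1), ...
for signs in product([1.0, -1.0], repeat=3):
    D = np.diag(signs)
    assert np.allclose(D @ D.T, np.eye(3))                  # orthogonality
    assert np.allclose(sorted(np.linalg.eigvals(D).real),
                       sorted(signs))                       # eigenvalues are the signs
```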
 
  • #18
etotheipi said:
Oh, do you mean if we got something like ##\alpha = \pi## or ##\alpha = 0## so that we would indeed have three real roots, which might be ##(\lambda_1, \lambda_2, \lambda_3) = (1,1,1), (1,1,-1), (1,-1,-1), (-1,-1,-1)##. Hadn't thought of that either!
I didn't think of the degenerate cases, but yes you can make it work like that for those as well.
 
  • #19
PeroK said:
There are the degenerate cases where ##\lambda = \pm 1## only, although these do still meet the criteria in each case.
These occur iff the real orthogonal matrix is symmetric. I don't really view involutions as "degenerate" though. In fact involutions are quite nice.
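As an illustration of that remark: a Householder reflection ##H = I - 2\vec{u}\vec{u}^\intercal## (with ##\vec{u}## a unit vector) is both symmetric and orthogonal, hence an involution, with eigenvalues ##-1## (once) and ##+1## (twice). A short NumPy sketch (the vector ##\vec{u}## is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(4)
u = rng.standard_normal(3)
u /= np.linalg.norm(u)                   # unit vector
H = np.eye(3) - 2.0 * np.outer(u, u)     # Householder reflection

assert np.allclose(H, H.T)               # symmetric
assert np.allclose(H @ H.T, np.eye(3))   # orthogonal
assert np.allclose(H @ H, np.eye(3))     # involution: H^2 = I
assert np.allclose(sorted(np.linalg.eigvals(H).real), [-1.0, 1.0, 1.0])
```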
 
