MHB Is the 3-D Rotation Matrix Defined by Euler Rotations or a General Angle?

The discussion focuses on the definition of a 3-D rotation matrix and its relation to Euler rotations. It clarifies that a rotation can be represented by an orthogonal matrix, with the net angle of rotation, φ, being the sum of rotations about the x, y, and z axes. The conversation explores the characteristic equation of the rotation matrix and its eigenvalues, confirming that one eigenvalue is 1 while the other two are complex, represented as e^{±iφ}. The correct formulation of the characteristic equation is emphasized, leading to the conclusion that the complex roots are indeed e^{iφ} and e^{-iφ}. This analysis solidifies the connection between the rotation matrix and Euler angles in 3D space.
ognik
The question mentions an orthogonal matrix describing a rotation in 3D ... where $\phi$ is the net angle of rotation about a fixed single axis. I know of the 3 Euler rotations; is this one of them, is it arbitrary, or is there a general 3-D rotation matrix in one angle?

If I build one, I would start with the direction cosines $ \begin{bmatrix}cos(x', x)&cos(y', x)&cos(z', x)\\cos(x', y)&cos(y', y)...\\...\end{bmatrix}$

Let's say we rotate through a total angle $\phi$; I think this means $\phi = \phi_x + \phi_y + \phi_z$? But for a rotation about the z axis only (for example), $\phi = \phi_z$?

So I'm not sure how to apply this to the matrix above; does every entry not involving z reduce to $\delta_{ij}$?
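As a quick sanity check (a NumPy sketch, not part of the original thread), the z-axis rotation matrix built from direction cosines is orthogonal and leaves the z axis fixed:

```python
import numpy as np

def rotation_z(phi):
    """Rotation by angle phi about the z axis (direction-cosine matrix)."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

phi = 0.7
R = rotation_z(phi)

# Orthogonality: R^T R = I, so R^{-1} = R^T
assert np.allclose(R.T @ R, np.eye(3))

# Proper rotation: determinant +1
assert np.isclose(np.linalg.det(R), 1.0)

# The rotation axis (z) is unchanged, i.e. it is an eigenvector with eigenvalue 1
assert np.allclose(R @ np.array([0.0, 0.0, 1.0]), [0.0, 0.0, 1.0])
```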
 
Would appreciate corrections/confirmations to the above please - and if I put something confusingly I'll be happy to improve it, if I know what it is :-)
 
I'm now sure the question could use any of the 3 Euler (orthogonal) rotation matrices; the diagonal of each has two $\cos\phi$ terms and a 1, i.e. the trace (the sum of the 3 eigenvalues) is $2\cos\phi + 1$.
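This trace property can be checked numerically for all three elementary rotation matrices (a sketch, assuming the standard right-handed conventions for $R_x$, $R_y$, $R_z$):

```python
import numpy as np

def rx(p):
    c, s = np.cos(p), np.sin(p)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]], dtype=float)

def ry(p):
    c, s = np.cos(p), np.sin(p)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]], dtype=float)

def rz(p):
    c, s = np.cos(p), np.sin(p)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

phi = 1.2
for R in (rx(phi), ry(phi), rz(phi)):
    # trace = sum of eigenvalues = 1 + 2 cos(phi) for each elementary rotation
    assert np.isclose(np.trace(R), 1 + 2 * np.cos(phi))
```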

The question now is: given that one eigenvalue is 1, show that the other two are $e^{\pm i\phi}$.

Choosing the rotation about the z axis, $R_z = \begin{bmatrix}\cos\phi&-\sin\phi&0\\\sin\phi&\cos\phi&0\\0&0&1\end{bmatrix}$

My characteristic equation is $ (1-\lambda)(\cos^2\phi - 2\lambda \cos\phi + \sin^2 \phi) = (1-\lambda)(1 - 2\lambda \cos\phi) $

The simplest (to me anyway) roots are $\lambda = 1$ (as expected) and $ \lambda = \frac{1}{2 \cos\phi}$ (but 2 complex roots were expected?)

Now I could say that $ \cos\phi = \operatorname{Re}\left\{ \frac{e^{i \phi}+e^{-i \phi}} {2} \right\} $ and dredge $\lambda = e^{\pm i\phi}$ out of this, but the question states that these 2 should be complex eigenvalues, and I have to take the real parts to make this work?
 
Just revisiting this, I noticed a silly mistake; of course $ \cos \phi = \frac{1}{2} \left( e^{i \phi} + e^{-i \phi} \right)$ (no real part needed). Also my characteristic equation was wrong (doh):

$ (1-\lambda)(\cos^2\phi - 2\lambda \cos\phi + \lambda^2 + \sin^2 \phi) = (1 -\lambda)(\lambda^2 - 2\lambda \cos\phi + 1) $ ... which indeed gives the 2 complex roots $e^{i \phi}$ and $e^{-i \phi}$ in addition to $\lambda = 1$.
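The corrected result can be confirmed numerically (a NumPy sketch, with an arbitrarily chosen $\phi$): the eigenvalues of $R_z$ come out as $\{1, e^{i\phi}, e^{-i\phi}\}$.

```python
import numpy as np

phi = 0.9
c, s = np.cos(phi), np.sin(phi)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

eigvals = np.linalg.eigvals(Rz)

# Sort both sets by complex argument (-phi, 0, +phi) before comparing
got = sorted(eigvals, key=np.angle)
want = sorted([np.exp(-1j * phi), 1.0 + 0j, np.exp(1j * phi)], key=np.angle)
assert np.allclose(got, want)

# Cross-check: the trace equals 1 + 2 cos(phi), consistent with the eigenvalues
assert np.isclose(np.trace(Rz), 1 + 2 * np.cos(phi))
```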
 
