MHB Matrix Ops: R(x)v & R(x)w Rotate Counter-Clockwise

Thread starter: brunette15
I have the following matrix R(x) = [cos(x) -sin(x) ; sin(x) cos(x)]

Now consider the unit vectors v = [1;0] and w = [0;1].

Computing R(x)v and R(x)w is supposed to rotate each vector about the origin by the angle x in the counter-clockwise direction, but I am struggling to see how this works.

Can anyone please further explain this idea?

Thanks in advance!
 

First of all, it helps if you can visualise a point both in Cartesian form (which is how it is usually written as a column vector) and in polar form, in other words, in terms of its distance from the origin (the radius) and its direction (the angle swept out).

Consider a point $\displaystyle \begin{align*} (x, y) \end{align*}$. It can be written in its polar form as $\displaystyle \begin{align*} \left( r\cos{ ( \theta ) }, r\sin{ (\theta )} \right) \end{align*}$. Suppose it is rotated by an angle of $\displaystyle \begin{align*} \alpha \end{align*}$ in the anticlockwise direction. Then the new point $\displaystyle \begin{align*} \left( x' , y' \right) \end{align*}$ has the same distance, but now its angle has $\displaystyle \begin{align*} \alpha \end{align*}$ added to it, thus $\displaystyle \begin{align*} \left( x' , y' \right) = \left( r\cos{ \left( \theta + \alpha \right) } , r\sin{ \left( \theta + \alpha \right) } \right) \end{align*}$. This doesn't really help us though, because we would like to be able to see a transformation in terms of the original x and y. Thankfully they simplify with the compound angle identities as

$\displaystyle \begin{align*} x' &= r\cos{ \left( \theta + \alpha \right) } \\ &= r \left[ \cos{ \left( \theta \right) } \cos{ \left( \alpha \right) } - \sin{ \left( \theta \right) } \sin{ \left( \alpha \right) } \right] \\ &= r\cos{ \left( \theta \right) } \cos{ \left( \alpha \right) } - r\sin{ \left( \theta \right) } \sin{ \left( \alpha \right) } \\ &= x\cos{ \left( \alpha \right) } - y\sin{ \left( \alpha \right) } \end{align*}$

and

$\displaystyle \begin{align*} y' &= r\sin{ \left( \theta + \alpha \right) } \\ &= r\left[ \sin{\left( \theta \right) } \cos{ \left( \alpha \right) } + \cos{ \left( \theta \right) } \sin{ \left( \alpha \right) } \right] \\ &= r \sin{ \left( \theta \right) } \cos{ \left( \alpha \right) } + r\cos{ \left( \theta \right) } \sin{ \left( \alpha \right) } \\ &= y\cos{ \left( \alpha \right) } + x\sin{ \left( \alpha \right) } \end{align*}$

Can you see how it would look in matrix form now?
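The two compound-angle results above can be checked numerically. This is just an illustrative sketch using an arbitrary point and angle (the values chosen here are not from the thread): after applying the formulas, the distance from the origin should be unchanged and the polar angle should have $\alpha$ added to it.

```python
import math

# Arbitrary original point and rotation angle (radians, counter-clockwise).
x, y = 2.0, 3.0
alpha = 0.7

# Rotated coordinates from the compound-angle derivation:
#   x' = x cos(alpha) - y sin(alpha)
#   y' = y cos(alpha) + x sin(alpha)
x_new = x * math.cos(alpha) - y * math.sin(alpha)
y_new = y * math.cos(alpha) + x * math.sin(alpha)

# The radius is unchanged by the rotation...
r = math.hypot(x, y)
assert math.isclose(math.hypot(x_new, y_new), r)

# ...and the polar angle theta has picked up exactly alpha.
theta = math.atan2(y, x)
assert math.isclose(math.atan2(y_new, x_new), theta + alpha)
```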
 

Hey brunette15! (Smile)

See how v = [1;0], the first unit vector, "selects" the leftmost column of the matrix?
So the leftmost column has to be the image of the first unit vector.
Indeed, [cos(x); sin(x)] is the unit vector rotated by an angle of x.

Same for the second unit vector, w = [0;1], that must be mapped to [-sin(x) ; cos(x)].

In general, if you want to find any matrix, consider what the images of the unit vectors must be.
Put them beside each other in a matrix and presto! (Mmm)
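The "columns are the images of the unit vectors" idea can be sketched with plain lists (a minimal illustration, with an arbitrary angle): multiplying R(x) by v = [1;0] picks out the first column, and multiplying by w = [0;1] picks out the second.

```python
import math

ang = 0.5  # arbitrary rotation angle in radians
R = [[math.cos(ang), -math.sin(ang)],
     [math.sin(ang),  math.cos(ang)]]

def matvec(M, u):
    """Multiply a 2x2 matrix by a 2-vector (both as plain lists)."""
    return [M[0][0]*u[0] + M[0][1]*u[1],
            M[1][0]*u[0] + M[1][1]*u[1]]

v = [1, 0]  # first unit vector
w = [0, 1]  # second unit vector

# R(x)v "selects" the first column [cos(x); sin(x)],
# and R(x)w "selects" the second column [-sin(x); cos(x)].
assert matvec(R, v) == [R[0][0], R[1][0]]
assert matvec(R, w) == [R[0][1], R[1][1]]
```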
 
Thank you!
 