Regarding Orthogonal Transformations

In summary, an orthogonal transformation is a linear transformation that preserves lengths and angles. It is represented by a square matrix with orthonormal columns and rows, and it has applications in computer graphics, robotics, signal processing, and physics. Unlike other transformations, it does not change the shape or size of an object, and it is important in mathematics because it lets geometric objects be represented and manipulated efficiently and precisely.
  • #1
johndoe3344
Find an orthogonal transformation T from R3 to R3 such that

T([2/3 2/3 1/3]^T) = [0 0 1]^T (both column vectors)

So I tried to construct the 3x3 matrix

[a b c]
[d e f]
[g h i]

and applied the properties of an orthogonal matrix and basic algebra. I ended up with a giant mess of equations without any way to solve them... for example..

(2/3)a + (2/3)b + (1/3)c = 0
...
a^2 + d^2 + g^2 = 1
...
ab + de + gh = 0
...
and so on.

Could anyone shed some light on how to approach these types of problems?
 
  • #2
These are both unit-length vectors. You need a rotation matrix T that rotates the first vector into the other. Draw the vectors and think of what simple rotations of the initial vector you need to apply in sequence to obtain the final vector.

I would rotate [2/3 2/3 1/3] around the Z axis until it falls into the YZ plane and then rotate around the X axis so I get [0 0 1]. The matrices that rotate by a given angle around a coordinate axis are standard. Your final matrix will be the product of the two rotation matrices.
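Not part of the original post, but here is a quick numpy sketch of that two-rotation recipe (the arctan2 angle choices below are just one way to pick the two rotations):

[code]
import numpy as np

u = np.array([2/3, 2/3, 1/3])            # initial unit vector
target = np.array([0.0, 0.0, 1.0])

# 1) rotate about the z axis so u falls into the yz plane
alpha = np.arctan2(u[0], u[1])            # chosen so the new x component is zero
Rz = np.array([[np.cos(alpha), -np.sin(alpha), 0],
               [np.sin(alpha),  np.cos(alpha), 0],
               [0,              0,             1]])
w = Rz @ u                                # w = [0, 2*sqrt(2)/3, 1/3]

# 2) rotate about the x axis so the vector ends up along z
beta = np.arctan2(w[1], w[2])             # chosen so the new y component is zero
Rx = np.array([[1, 0,             0            ],
               [0, np.cos(beta), -np.sin(beta)],
               [0, np.sin(beta),  np.cos(beta)]])

T = Rx @ Rz                               # composite rotation (orthogonal) matrix
print(np.allclose(T @ u, target))         # True
print(np.allclose(T.T @ T, np.eye(3)))    # True: T is orthogonal
[/code]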
 
  • #3
Since both vectors are unit length, you can easily construct two orthonormal bases of R^3: (u)={u_1,u_2,u_3} and (v)={v_1,v_2,v_3} where
u_1=(2/3, 2/3, 1/3) and
v_1=(0,0,1).
Now define T so that Tu_1 = v_1, Tu_2 = v_2, Tu_3 = v_3.
Since this transformation maps one orthonormal basis onto another, it's an orthogonal transformation. And since Tu_1 = v_1 it satisfies your equation. The 3x3 matrix in the basis (u) is
( [v_1]_(u) | [v_2]_(u) | [v_3]_(u) ) where [v]_(u) is the coordinate vector of v in the basis (u), which is a column vector.
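A small numpy sketch of this construction (illustrative only; the helper vector and cross-product choices below are arbitrary, any completion of the bases works):

[code]
import numpy as np

u1 = np.array([2/3, 2/3, 1/3])
v1 = np.array([0.0, 0.0, 1.0])

def complete_basis(w):
    """Extend the unit vector w to an orthonormal basis, returned as matrix columns."""
    # pick any helper vector not parallel to w, then use cross products
    helper = np.array([1.0, 0.0, 0.0]) if abs(w[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    a = np.cross(w, helper)
    a = a / np.linalg.norm(a)
    b = np.cross(w, a)
    return np.column_stack([w, a, b])

U = complete_basis(u1)   # columns u_1, u_2, u_3
V = complete_basis(v1)   # columns v_1, v_2, v_3

T = V @ U.T              # sends each u_i to v_i, so in particular T u_1 = v_1
print(np.allclose(T @ u1, v1))          # True
print(np.allclose(T.T @ T, np.eye(3)))  # True: T is orthogonal
[/code]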
 
  • #4
I follow you up to this part:

"where [v]_(u) is the coordinate vector of v in the basis (u), which is a column vector."

What do you mean specifically? What would the basis (u) be?

Is it [1,0,0] [0,1,0] [0,0,1]? If so, how would I go about constructing [v]_(u)?
 
  • #5
First of all, as I said in the beginning, (u)={u_1,u_2,u_3} and (v)={v_1,v_2,v_3}.
And I mixed up the last part about the 3x3 matrix, so I'll explain it again:
If we have some transformation T:V->V then using some basis of V B={x_1,...,x_n} you can define a matrix of T - [T] - where the columns of the matrix [T] are the coordinates of T(x_i) according to the basis B.
When I say coordinates of T(x_i) according to B I mean the column vector
[T(x_i)]_B = (a_1,...,a_n)^t where T(x_i) = a_1*x_1 + ... + a_n*x_n
You can find this vector by solving a linear system. So when I write [v]_(u) I mean the column vector (a_1,a_2,a_3)^t where
v = a_1*u_1 + a_2*u_2 + a_3*u_3.
In this case I wrote ( [v_1]_(u) | [v_2]_(u) | [v_3]_(u) ) instead of ( [T(u_1)]_(u) | [T(u_2)]_(u) | [T(u_3)]_(u) ) since T(u_i) = v_i.
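As a concrete numerical sketch of finding such a coordinate vector (with one arbitrary choice of u_2 and u_3 completing the basis):

[code]
import numpy as np

# an orthonormal basis (u) of R^3 whose first vector is u_1 = (2/3, 2/3, 1/3);
# u_2 is an arbitrary unit vector orthogonal to u_1, and u_3 completes the basis
u1 = np.array([2/3, 2/3, 1/3])
u2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
u3 = np.cross(u1, u2)
U = np.column_stack([u1, u2, u3])

v = np.array([0.0, 0.0, 1.0])

# [v]_(u) solves a_1*u_1 + a_2*u_2 + a_3*u_3 = v, i.e. the linear system U a = v
a = np.linalg.solve(U, v)
print(a)
print(np.allclose(a, U.T @ v))  # for an orthonormal basis this is just U^T v
[/code]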
 
  • #6
I don't think he needs T in the basis (u). The problem asks for T in the initial basis in which the two vectors are given. The basis (v) can be chosen to coincide with that initial basis.
 
  • #7
Why are you trying to solve for nine quantities when you already know three of them? Suppose you have three orthonormal column vectors in R3, [itex]\hat e_1[/itex], [itex]\hat e_2[/itex], and [itex]\hat e_3[/itex] in some frame F1. These orthonormal vectors form the basis for some other reference frame F2. The transformation matrix from frame F1 to frame F2 is just
[tex]T_{F_1\to F_2} = \begin{bmatrix} \hat e_1^T \\ \hat e_2^T \\ \hat e_3^T \end{bmatrix}[/tex]
You already have one unit vector, e_3. So you know the matrix must be of the form
[tex]T = \begin{bmatrix} & \hat e_1^T & \\ & \hat e_2^T & \\ \frac{2}{3} & \frac{2}{3} & \frac{1}{3} \end{bmatrix}[/tex]

What you need to do is find some vector normal to [itex]\begin{bmatrix} 2/3 & 2/3 & 1/3 \end{bmatrix}[/itex]. The solution is not unique.
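A numpy sketch of this row-by-row construction (the first normal vector below is just one of infinitely many valid choices):

[code]
import numpy as np

e3 = np.array([2/3, 2/3, 1/3])

# one of infinitely many unit vectors normal to e3
e1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
# the remaining row comes from a cross product, keeping the frame right-handed
e2 = np.cross(e3, e1)

T = np.vstack([e1, e2, e3])                # rows are the orthonormal vectors
print(T @ e3)                              # [0, 0, 1] up to rounding
print(np.allclose(T @ T.T, np.eye(3)))     # True: T is orthogonal
[/code]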
 
  • #8
The physical rotation that rotates one unit vector into another (not coinciding with it) is unique, because the two vectors fix the axis and angle of rotation: the axis would be the normal to the plane spanned by the two vectors, and the angle would be the one that rotates the first vector into the second while remaining in that plane. Each physical rotation is represented by exactly one rotation matrix, so the solution must be unique.
 
  • #9
Thanks for all the help. But wait, so is the solution unique or not unique?
 
  • #10
There are infinitely many solutions to this problem. Suppose you find one such solution. Designate the transformation matrix as

[tex]T = \begin{bmatrix} \hat e_1^T \\ \hat e_2^T \\ \hat e_3^T \end{bmatrix}[/tex]

where the [itex]\hat e_j[/itex] are mutually orthogonal unit vectors and [itex]\hat e_3 = \begin{bmatrix} 2/3 & 2/3 & 1/3 \end{bmatrix}^T[/itex]. This obviously transforms [itex]\hat e_3[/itex] to [itex]\hat z[/itex]. Simply swapping e_1 and e_2 (with a sign change to keep the system right-handed) will also transform [itex]\hat e_3[/itex] to [itex]\hat z[/itex]. In fact, any matrix of the form

[tex]T_\phi = \begin{bmatrix} \cos\phi \, \hat e_1^T - \sin\phi \, \hat e_2^T \\ \sin\phi \, \hat e_1^T + \cos\phi \, \hat e_2^T \\ \hat e_3^T \end{bmatrix}[/tex]

also transforms [itex]\hat e_3[/itex] to [itex]\hat z[/itex].
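A quick numerical check of this family (a sketch reusing an arbitrary choice of e_1 and e_2 normal to e_3):

[code]
import numpy as np

e3 = np.array([2/3, 2/3, 1/3])
e1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)   # arbitrary unit vector normal to e3
e2 = np.cross(e3, e1)

def T_phi(phi):
    """One member of the family of orthogonal matrices that send e3 to z-hat."""
    return np.vstack([np.cos(phi) * e1 - np.sin(phi) * e2,
                      np.sin(phi) * e1 + np.cos(phi) * e2,
                      e3])

for phi in (0.0, 0.7, np.pi / 3, 2.5):
    T = T_phi(phi)
    assert np.allclose(T @ e3, [0.0, 0.0, 1.0])   # always maps e3 to z-hat
    assert np.allclose(T @ T.T, np.eye(3))        # always orthogonal
print("every T_phi in the family works")
[/code]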
 
  • #11
I think DH is right. I was thinking that a physical rotation can be uniquely fixed by rotating only one vector, but it turns out you need to specify how a basis of three vectors is rotated. There are many axis-angle combinations that will rotate a fixed initial vector into a fixed final vector.

What I can't understand is this: a 3x3 orthogonal matrix has 3 free parameters, and specifying that one vector rotates into another gives 3 equations, which should fix the free parameters uniquely, but it doesn't...
 
  • #12
Three equations in three unknowns do not yield a unique solution if the equations are not independent, which is exactly what is happening here. Because T preserves lengths and both vectors have unit length, T u automatically lies on the unit sphere, so the condition T u = v supplies only two independent constraints and leaves one free parameter (the angle [itex]\phi[/itex] above).
 

1. What is an orthogonal transformation?

An orthogonal transformation, represented by an orthogonal matrix, is a type of linear transformation that preserves lengths and angles. This means that the distance between any two points and the angle between any two lines or planes remain the same after the transformation. In other words, the transformation does not distort the shape or size of an object.
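A minimal numerical illustration of the length and angle preservation (a sketch with an arbitrarily chosen rotation matrix and test vectors):

[code]
import numpy as np

# an arbitrary rotation about the z axis (rotations are orthogonal transformations)
theta = 0.4
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

x = np.array([1.0, 2.0, 3.0])
y = np.array([-2.0, 0.5, 1.0])

print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # lengths preserved
print(np.isclose(np.dot(Q @ x, Q @ y), np.dot(x, y)))        # dot products, hence angles, preserved
[/code]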

2. How is an orthogonal transformation represented?

An orthogonal transformation can be represented by a square matrix whose columns and rows are orthonormal, meaning they are both mutually orthogonal (perpendicular) and normalized (unit length). This matrix is often denoted Q; when its determinant is +1 it is a rotation matrix, which rotates an object in 2D or 3D space without changing its shape or size.
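For example, a 2D rotation matrix passes the orthonormality checks (a sketch with an arbitrary angle):

[code]
import numpy as np

# a 2D rotation matrix Q with an arbitrary angle
theta = np.deg2rad(30)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))               # columns are orthonormal
print(np.allclose(Q @ Q.T, np.eye(2)))               # rows are orthonormal
print(np.allclose(np.linalg.norm(Q, axis=0), 1.0))   # each column has unit length
[/code]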

3. What are some applications of orthogonal transformations?

Orthogonal transformations have many practical applications in fields such as computer graphics, robotics, signal processing, and physics. They are used to rotate and reflect objects in computer graphics and animation, as well as in robotics for controlling the orientation of robots. In signal processing, orthogonal transformations such as the discrete cosine transform are used to compress and decompress data, and in physics, they are used to describe rotations and changes of reference frame.

4. How is an orthogonal transformation different from other types of transformations?

Unlike other types of transformations, such as shears, stretches, or skews, an orthogonal transformation does not change the shape or size of an object; it only changes the object's orientation in space. Additionally, an orthogonal matrix Q is always invertible, and its inverse is simply its transpose (Q^-1 = Q^T), so the transformation can be undone by multiplying by Q^T.
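A short sketch contrasting an orthogonal matrix with a shear, and checking that the inverse is the transpose (arbitrary example values):

[code]
import numpy as np

theta = np.deg2rad(30)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal (a rotation)
S = np.array([[1.0, 0.5],
              [0.0, 1.0]])                        # a shear, for contrast

x = np.array([3.0, 4.0])
print(np.linalg.norm(Q @ x), np.linalg.norm(S @ x), np.linalg.norm(x))
# Q preserves the length 5.0; the shear does not

print(np.allclose(np.linalg.inv(Q), Q.T))  # the inverse of Q is just its transpose
[/code]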

5. What is the importance of orthogonal transformations in mathematics?

Orthogonal transformations are essential in mathematics because they provide a way to represent and manipulate geometric objects in a precise and efficient manner. They can simplify complex calculations and are used in many mathematical theories and applications, such as in the study of symmetry, geometry, and linear algebra. Additionally, the concept of orthogonality is fundamental in many areas of mathematics, including vector spaces, inner product spaces, and Fourier analysis.
