Orthogonal Transformation in Euclidean Space


Discussion Overview

The discussion revolves around the existence of an orthogonal transformation in Euclidean space that maps one vector to another, given that both vectors have the same magnitude. Participants explore the conditions under which such a transformation can be defined, focusing on linearity and the construction of appropriate bases.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose a specific linear transformation \(f\) that maps vector \(u\) to vector \(v\) while leaving other vectors unchanged, but question its linearity.
  • Others argue that the transformation must be defined on an orthonormal basis and extended by linearity to ensure it is orthogonal.
  • A later reply emphasizes the necessity of constructing a linear map that takes a basis to another basis, ensuring invertibility.
  • One participant provides an explicit example in \(\mathbb{R}^3\) to illustrate the process of constructing orthonormal bases and applying Gram-Schmidt to achieve the desired transformation.

Areas of Agreement / Disagreement

Participants generally disagree on the validity of the initial proposed transformation. While some support the idea of defining \(f\) directly, others contend that it cannot be linear unless \(u\) and \(v\) are identical. The discussion remains unresolved regarding the best approach to construct the orthogonal transformation.

Contextual Notes

Limitations include the assumption that both vectors are non-zero and the need for clarity on the definitions of linearity and orthogonality in the context of transformations. The discussion also highlights the complexity of proving orthogonality in transformations defined on arbitrary bases.

Who May Find This Useful

Readers interested in linear algebra, transformations in Euclidean spaces, and the properties of orthogonal maps may find this discussion beneficial.

Sudharaka
Hi everyone, :)

Here's one of the questions that I encountered recently along with my answer. Let me know if you see any mistakes. I would really appreciate any comments, shorter methods etc. :)

Problem:

Let \(u,\,v\) be two vectors in a Euclidean space \(V\) such that \(|u|=|v|\). Prove that there is an orthogonal transformation \(f:\, V\rightarrow V\) such that \(v=f(u)\).

Solution:

We assume that \(u\) and \(v\) are non-zero. Otherwise the result holds trivially.

Let \(B\) denote the associated symmetric bilinear function of the Euclidean space. Let us define the linear transformation \(f\) as,

\[f(x)=\begin{cases}x&\mbox{if}&x\neq u\\v&\mbox{if}&x=u\end{cases}\]

It's clear that \(B(f(x),\,f(y))=B(x,\,y)\) whenever \(x,\,y\neq u\). Also \(B(f(u),\,f(u))=B(v,\,v)\), and since \(|v|=|u|\) implies \(B(v,\,v)=B(u,\,u)\), we have \(B(f(u),\,f(u))=B(u,\,u)\).

It remains to show that, \(B(f(x),\,f(u))=B(x,\,u)\) for \(x\neq u\).

\[B(f(v+u),\,f(v+u))=B(f(v),\,f(v))+2B(f(v),\,f(u))+B(f(u),\,f(u))\]

Also since \(v+u\neq u\),

\[B(f(v+u),\,f(v+u))=B(v+u,\,v+u)=B(v,\,v)+2B(v,\,u)+B(u,\,u)\]

Using the above two results and the fact that \(B(u,\,u)=B(v,\,v)\) we get,

\[B(f(v),\,f(u))=B(v,\,u)\]

Now consider \(B(f(x+v),\,f(x+u))\).

Case I: \(x+v\neq u\)

\[B(f(x+v),\,f(x+u))=B(f(x),\,f(x))+B(f(x),\,f(u))+B(f(v),\,f(x))+B(f(v),\,f(u))\]

Also,

\[B(f(x+v),\,f(x+u))=B(x+v,\,x+u)=B(x,\,x)+B(x,\,u)+B(x,\,v)+B(v,\,u)\]

Using the above two results and the fact that \(B(f(v),\,f(u))=B(v,\,u)\) we get,

\[B(f(x),\,f(u))=B(x,\,u)\]

Case II: \(x+v=u\)

\[B(x,\,v)=B(u-v,\,v)=B(u,\,v)-B(v,\,v)\]
\[B(x,\,u)=B(u-v,\,u)=B(u,\,u)-B(v,\,u)\]
Therefore, \[B(x,\,u)=-B(x,\,v)~~~~~~(1)\]

\[B(f(x),\,f(u))=B(f(u-v),\,f(u))=B(f(u),\,f(u))-B(f(v),\,f(u))=B(v,\,v)-B(v,\,v)=0\]

Then, since \(B(f(x),\,f(u))=B(x,\,v)=0\), by (1) we get \(B(x,\,u)=0\) as well.

\[\therefore B(f(x),\,f(u))=B(x,\,u)\]
 
Sudharaka said:
Problem:

Let \(u,\,v\) be two vectors in a Euclidean space \(V\) such that \(|u|=|v|\). Prove that there is an orthogonal transformation \(f:\, V\rightarrow V\) such that \(v=f(u)\).

Solution:

We assume that \(u\) and \(v\) are non zero. Otherwise the result holds trivially.

Let \(B\) denote the associated symmetric bilinear function of the Euclidean space. Let us define the linear transformation \(f\) as,

\[f(x)=\begin{cases}x&\mbox{if}&x\neq u\\v&\mbox{if}&x=u\end{cases}\]
The problem with this is that the map $f$ can never be linear (unless $u=v$).

It may help to think in terms of a simple example. In the space $V=\mathbb{R}^2$, let $u=(1,0)$ and $v=(0,1)$. The only orthogonal transformations taking $u$ to $v$ are a rotation of the whole space through a right angle, or a reflection of the whole space in the line $y=x$. Either way, the transformation has to shift just about every vector in the space. The map that just takes $u$ to $v$ and leaves everything else fixed is not linear, and certainly not orthogonal.
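The non-linearity of the proposed map can also be checked numerically. Here is a minimal sketch of Opalg's $\mathbb{R}^2$ example (the numpy setup and the function name `f` are illustrative, not from the thread), testing additivity:

```python
import numpy as np

# The map proposed in the original post: send u to v, fix everything else.
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

def f(x):
    # f(x) = v if x == u, else x
    return v.copy() if np.array_equal(x, u) else x.copy()

w = np.array([0.0, 1.0])   # any vector other than u
lhs = f(u + w)             # u + w != u, so f fixes it: (1, 1)
rhs = f(u) + f(w)          # v + w = (0, 2)
print(np.array_equal(lhs, rhs))  # → False: additivity fails, f is not linear
```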

To prove this result, you need to construct a linear map $f$ taking $u$ to $v$. The way to do that is to define $f$ on an orthonormal basis for $V$ and then extend it by linearity to a map on the whole of $V$. Start by constructing an orthonormal basis $\{e_1,e_2,\ldots,e_n\}$ such that $e_1$ is a multiple of $u$. Then do the same for $v$, showing that there is an orthonormal basis $\{g_1,g_2,\ldots,g_n\}$ such that $g_1$ is a multiple of $v$. You can then define $f$ by $f(e_k) = g_k$ for $1\leqslant k\leqslant n$.

It should then be straightforward to check that the map $f$ is orthogonal.
 
Thanks so much for the informative reply. I think I am getting the idea. First we can choose an orthonormal basis \(\{e_1,e_2,\ldots,e_n\}\) such that \(e_1\) is a multiple of \(u\). Then if we rotate this basis by a certain angle so as to align \(e_1\) with \(v\) we could get the basis \(\{g_1,g_2,\ldots,g_n\}\). Since \(|v|=|u|\) our new basis would have \(g_1\) a multiple of \(v\). Am I correct? Or is there a more formal way of doing this? :)
 
I think what Opalg is getting at is this:

If you have a linear map that takes a basis to a basis, it is certainly invertible.

For $e_1$, we can always choose $e_1 = u/|u|$, and use something like Gram-Schmidt to turn any basis extension we create into an orthogonal basis:

$\{e_1,\dots,e_n\}$.

The same process is then used to create the 2nd basis:

$\{g_1,\dots,g_n\}$, where $g_1 = v/|v|$.

We then DEFINE, for any $x \in V$:

$T(x) = T(c_1e_1 + \cdots + c_ne_n) = c_1g_1 + \cdots + c_ng_n$.

Note that $T(u) = T(|u|e_1) = |u|T(e_1) = |v|T(e_1)$ (since $|u| = |v|$)

$= |v|g_1 = |v|(v/|v|) = v$.

Now proving orthogonality is a bit of a mess to write explicitly, but the idea is this:

Since both bases are ORTHOGONAL (we can actually insist on orthonormal by scaling the basis vectors to unit vectors), we have:

$B(e_i,e_i) = B(g_i,g_i) = 1$
$B(e_i,e_j) = B(g_i,g_j) = 0,\ i \neq j$.

So if:

$x = c_1e_1 + \cdots + c_ne_n$
$y = d_1e_1 + \cdots + d_ne_n$, then:

$B(x,y) = B(c_1e_1 + \cdots + c_ne_n,d_1e_1 + \cdots + d_ne_n)$

$\displaystyle = \sum_{i,j} c_id_jB(e_i,e_j) = \sum_i c_id_i$

by the bilinearity of $B$ and the orthogonality of our basis.

Similarly, evaluating $B(T(x),T(y))$ gives the same answer, and there you go.
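The whole construction above can be sketched in code. The following numpy illustration (the helper name `gram_schmidt_basis` and the specific vectors are assumptions for demonstration) extends $u/|u|$ and $v/|v|$ to orthonormal bases by Gram-Schmidt and lets $T$ carry one basis onto the other:

```python
import numpy as np

def gram_schmidt_basis(w):
    """Extend w/|w| to an orthonormal basis of R^n by Gram-Schmidt,
    feeding in the standard basis vectors and discarding dependent ones."""
    n = w.size
    basis = [w / np.linalg.norm(w)]
    for e in np.eye(n):
        r = e - sum(np.dot(e, b) * b for b in basis)  # strip existing components
        if np.linalg.norm(r) > 1e-10:                 # skip linearly dependent e
            basis.append(r / np.linalg.norm(r))
    return np.column_stack(basis)

# Example vectors with |u| = |v| = 2
u = np.array([2.0, 0.0, 0.0])
v = np.array([1.0, np.sqrt(2), 1.0])

E = gram_schmidt_basis(u)   # columns e_1,...,e_n with e_1 = u/|u|
G = gram_schmidt_basis(v)   # columns g_1,...,g_n with g_1 = v/|v|
T = G @ E.T                 # T e_k = g_k, extended by linearity

print(np.allclose(T @ T.T, np.eye(3)))  # → True: T is orthogonal
print(np.allclose(T @ u, v))            # → True: T(u) = v
```

Since $E$ and $G$ both have orthonormal columns, $T = GE^{\mathsf T}$ is automatically orthogonal, and $T(u) = |u|\,g_1 = |v|\,(v/|v|) = v$.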
 
Forgive the double-post, but I thought I would give an explicit example for $\Bbb R^3$.

For our symmetric Euclidean bilinear form, I will use the standard dot-product. It is possible to use so-called "weighted" inner products, but they just add needless complication to the calculation.

For our first vector, we will take $u = (2,0,0)$. For the second, we will take $v = (1,\sqrt{2},1)$, which I think are perfectly reasonable choices.

For our first basis, the usual $\{(1,0,0),(0,1,0),(0,0,1)\}$ will do quite nicely. The second basis is a bit of a pain to come up with; we start with the unit vector:

$g_1 = (\frac{1}{2},\frac{\sqrt{2}}{2},\frac{1}{2})$.

To get a basis, we'll just add in (0,1,0) and (0,0,1) and apply Gram-Schmidt:

First, we calculate:

$(0,1,0) - \frac{(\frac{1}{2},\frac{\sqrt{2}}{2},\frac{1}{2})\cdot(0,1,0)}{(\frac{1}{2},\frac{\sqrt{2}}{2}, \frac{1}{2})\cdot(\frac{1}{2},\frac{\sqrt{2}}{2}, \frac{1}{2})}(\frac{1}{2},\frac{\sqrt{2}}{2},\frac{1}{2})$

$= (\frac{-\sqrt{2}}{4},\frac{1}{2},\frac{-\sqrt{2}}{4})$

and normalizing this gives us:

$g_2 = (\frac{-1}{2},\frac{\sqrt{2}}{2},\frac{-1}{2})$

Finally, we calculate:

$(0,0,1) - \frac{(\frac{1}{2},\frac{\sqrt{2}}{2},\frac{1}{2})\cdot(0,0,1)}{(\frac{1}{2},\frac{\sqrt{2}}{2}, \frac{1}{2})\cdot(\frac{1}{2},\frac{\sqrt{2}}{2}, \frac{1}{2})}(\frac{1}{2},\frac{\sqrt{2}}{2},\frac{1}{2}) - \frac{(\frac{-1}{2},\frac{\sqrt{2}}{2},\frac{-1}{2})\cdot(0,0,1)}{(\frac{-1}{2},\frac{\sqrt{2}}{2},\frac{-1}{2})\cdot(\frac{-1}{2},\frac{\sqrt{2}}{2},\frac{-1}{2})}(\frac{-1}{2},\frac{\sqrt{2}}{2},\frac{-1}{2})$

$= (\frac{-1}{2},0,\frac{1}{2})$ which upon normalization gives us:

$g_3 = (\frac{-\sqrt{2}}{2},0,\frac{\sqrt{2}}{2})$.

It is clear, then, that the orthogonal linear mapping we are looking for is given by the matrix (relative to the standard basis for $\Bbb R^3$):

$[T] = \begin{bmatrix}\frac{1}{2}&\frac{-1}{2}&\frac{-\sqrt{2}}{2}\\ \frac{\sqrt{2}}{2}&\frac{\sqrt{2}}{2}&0\\ \frac{1}{2}&\frac{-1}{2}&\frac{\sqrt{2}}{2} \end{bmatrix}$

which obviously (heh!) has determinant 1, and is orthogonal, and moreover:

$T(u) = T(2,0,0) = (1,\sqrt{2},1) = v$.
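As a sanity check, the claimed properties of $[T]$ can be verified numerically (a numpy sketch added for illustration, not part of the original post):

```python
import numpy as np

s = np.sqrt(2)
# The matrix [T], with columns g_1, g_2, g_3 from the Gram-Schmidt run above
T = np.array([[0.5, -0.5, -s/2],
              [s/2,  s/2,  0.0],
              [0.5, -0.5,  s/2]])

print(np.allclose(T.T @ T, np.eye(3)))                          # → True: columns orthonormal
print(np.isclose(np.linalg.det(T), 1.0))                        # → True: determinant 1
print(np.allclose(T @ np.array([2.0, 0.0, 0.0]),
                  np.array([1.0, s, 1.0])))                     # → True: T(u) = v
```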
 
Hi Deveno,

Thanks very much for both of your posts. After reading them I understood almost everything that is required to solve the problem. Now I think I should read more about the Gram-Schmidt process. :)
 
