I like Serena
Let's see what $\sigma_v$ and $\sigma_w$ look like with respect to the basis $(v,w,b_3,b_4)$, assuming that $v$ and $w$ are independent unit vectors and that $b_3$ and $b_4$ are perpendicular to both of them.
We have $\sigma_v(w)=w-2(w\cdot v)v$.
So:
$$D_{\sigma_v} = \begin{pmatrix}-1&-2w\cdot v\\&1\\&&1\\&&&1\end{pmatrix}$$
Similarly we have $\sigma_w(v)=v-2(v\cdot w)w$. So:
$$D_{\sigma_w} = \begin{pmatrix}1\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$
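As a quick sanity check: both matrices are triangular, so we can read off
$$\det D_{\sigma_v} = \det D_{\sigma_w} = -1,$$
which is what we expect of reflections, and it means their product must have determinant $+1$.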
And:
$$D_{\sigma_v} D_{\sigma_w}= \begin{pmatrix}-1+4(v\cdot w)^2&2v\cdot w\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$
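To spell out where the top left block comes from:
$$\begin{pmatrix}-1&-2w\cdot v\\ 0&1\end{pmatrix}\begin{pmatrix}1&0\\ -2v\cdot w&-1\end{pmatrix}=\begin{pmatrix}-1+4(v\cdot w)^2&2v\cdot w\\ -2v\cdot w&-1\end{pmatrix},$$
and its determinant is indeed $1-4(v\cdot w)^2+4(v\cdot w)^2=1$.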
Close, isn't it? But not quite there.
We do see that we get a nontrivial top left $2\times 2$ block and otherwise the identity matrix.
And we already know that 2 reflections are supposed to make a rotation in 2 dimensions.
But that is with respect to an orthonormal basis.
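For comparison, with respect to an orthonormal basis a rotation of a plane has a matrix of the form
$$\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix},$$
so in particular its two diagonal entries are equal, which is generally not the case for the top left block we found above.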
So suppose we construct an orthonormal basis.
We can start with $v$.
We can pick a second vector $b_2$ perpendicular to $v$ with unit length such that $w$ is a linear combination of $v$ and $b_2$.
If $v$ and $w$ are independent, then we can construct it with $\tilde b_2 =w-(w\cdot v)v$ and then pick $b_2=\tilde b_2/\|\tilde b_2\|$.
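With this choice $w$ gets simple coordinates in the new basis: since $w\cdot b_2=\|\tilde b_2\|$, we have
$$w=(w\cdot v)\,v+\|\tilde b_2\|\,b_2,\qquad \|\tilde b_2\|=\sqrt{1-(w\cdot v)^2},$$
where the last equality uses that $v$ and $w$ have unit length.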
What can we do if $v$ and $w$ are dependent?
Finally we pick $b_3$ and $b_4$ so that $(v,b_2,b_3,b_4)$ is an orthonormal basis.
The resulting matrix $D_{\sigma_v} D_{\sigma_w}$ will again have a nontrivial top left $2\times 2$ block and will otherwise be the identity matrix, won't it?
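Here is a sketch of what we can expect to come out, writing $w=\cos\phi\,v+\sin\phi\,b_2$ with $\cos\phi=v\cdot w$. With respect to the orthonormal basis $(v,b_2,b_3,b_4)$ the top left $2\times 2$ blocks of $D_{\sigma_v}$ and $D_{\sigma_w}$ become
$$\begin{pmatrix}-1&0\\ 0&1\end{pmatrix}\qquad\text{and}\qquad\begin{pmatrix}-\cos 2\phi&-\sin 2\phi\\ -\sin 2\phi&\cos 2\phi\end{pmatrix},$$
so that the top left block of the product is
$$\begin{pmatrix}-1&0\\ 0&1\end{pmatrix}\begin{pmatrix}-\cos 2\phi&-\sin 2\phi\\ -\sin 2\phi&\cos 2\phi\end{pmatrix}=\begin{pmatrix}\cos 2\phi&\sin 2\phi\\ -\sin 2\phi&\cos 2\phi\end{pmatrix},$$
which is indeed a rotation of the plane spanned by $v$ and $w$ over the angle $2\phi$, while $b_3$ and $b_4$ are left untouched.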