# Proof of combination of reflections

1. May 27, 2008

### gop

1. The problem statement, all variables and given/known data

V is an n-dimensional Euclidean space. U and W are (n-1)-dimensional subspaces of V.
U and W each define a reflection (because an (n-1)-dimensional subspace is a hyperplane).

Show that
$$s_U \circ s_W = s_W \circ s_U$$

if and only if

$$W^{\perp}, U^{\perp}$$

are perpendicular.

2. Relevant equations

$$W^{\perp}$$ is the subspace of V such that every vector in $$W^{\perp}$$ is perpendicular to W.
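A quick numerical sketch of the statement in $$\mathbb{R}^{3}$$, assuming the standard hyperplane-reflection formula $$s(v)=v-2\langle v,n\rangle\cdot n$$ with unit normal n (the helper names below are made up for illustration):

```python
# Sanity check in R^3: reflections through two hyperplanes commute when the
# normals are perpendicular, and generally fail to commute when they are not.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(v, n):
    # Reflection through the hyperplane with unit normal n: v - 2<v,n>n.
    c = 2 * dot(v, n)
    return tuple(x - c * nx for x, nx in zip(v, n))

v = (1.0, 2.0, 3.0)

# Perpendicular normals: the compositions agree.
u, w = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
assert reflect(reflect(v, w), u) == reflect(reflect(v, u), w)

# Non-perpendicular normals: the compositions differ.
a = 2 ** -0.5
u2 = (a, a, 0.0)
assert reflect(reflect(v, w), u2) != reflect(reflect(v, u2), w)
```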

2. May 27, 2008

### Dick

This is really a two-dimensional problem. The only vectors affected by the reflections are the normals to W and U; the n-2 other independent vectors common to the subspaces W and U are fixed. So tackle it in two dimensions first and then think about how to extend it.

3. May 28, 2008

### gop

Well, I tried to do it in 2D and then generalize it to n dimensions; I get

$$s_{W}(v)=2\cdot\sum_{i=1}^{n-1}\langle w_{i},v\rangle\cdot w_{i}-v$$
$$s_{U}(v)=2\cdot\sum_{i=1}^{n-1}\langle u_{i},v\rangle\cdot u_{i}-v$$
Then with $$\langle u_{i},w_{j}\rangle=0$$

$$s_{W}(s_{U}(v))=2\cdot\sum_{i=1}^{n-1}\Big\langle w_{i},\,2\cdot\sum_{j=1}^{n-1}\langle u_{j},v\rangle\cdot u_{j}-v\Big\rangle\cdot w_{i}-2\cdot\sum_{j=1}^{n-1}\langle u_{j},v\rangle\cdot u_{j}+v$$
$$=-2\cdot\sum_{i=1}^{n-1}\langle w_{i},v\rangle\cdot w_{i}-2\cdot\sum_{j=1}^{n-1}\langle u_{j},v\rangle\cdot u_{j}+v$$

Now, if W and U are perpendicular, we have $$W=U^{\perp}$$. Thus we can write every vector in V as the sum of its projections onto W and onto U, which gives us

$$-2v+v=-v$$

I can repeat the same computation with $$s_{U}\circ s_{W}$$ and get the same result.

However, I still have no idea how to prove the "if and only if" part, i.e., how to show that this is the only case where the two compositions are equal.

4. May 28, 2008

### Dick

That's pretty confusing. I have no idea what all of those things are. If you assume u and w are normalized (which you may as well), you can write s_u(x)=x-2*<x,u>u and s_w(x)=x-2*<x,w>w. If you form s_u(s_w(x))-s_w(s_u(x)) you should get <u,w> times a linear combination of u and w. The linear combination can only vanish if u and w are linearly dependent (which is the trivial case where they are parallel), so <u,w> must vanish. Try it again without all the messy subscripts.
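Dick's suggestion can be sanity-checked numerically. Expanding s_u(s_w(x)) - s_w(s_u(x)) by hand gives 4&lt;u,w&gt;(&lt;x,w&gt;u - &lt;x,u&gt;w), which is exactly "&lt;u,w&gt; times a linear combination of u and w". A minimal sketch in plain Python (helper names made up for illustration):

```python
# Check that s_u(s_w(x)) - s_w(s_u(x)) = 4<u,w>(<x,w>u - <x,u>w),
# where s_n(x) = x - 2<x,n>n; this is an exact algebraic identity.

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def reflect(x, n):
    c = 2 * dot(x, n)
    return tuple(p - c * q for p, q in zip(x, n))

u = (0.6, 0.8, 0.0)  # unit vector, deliberately not perpendicular to w
w = (0.0, 1.0, 0.0)
x = (1.0, 2.0, 3.0)

diff = tuple(p - q for p, q in zip(reflect(reflect(x, w), u),
                                   reflect(reflect(x, u), w)))
k = 4 * dot(u, w)
predicted = tuple(k * (dot(x, w) * p - dot(x, u) * q) for p, q in zip(u, w))
assert all(abs(p - q) < 1e-12 for p, q in zip(diff, predicted))
```

Since the predicted difference is &lt;u,w&gt; times a combination of the independent vectors u and w, it can only vanish for all x when &lt;u,w&gt; = 0, which is the "only if" direction.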

5. May 28, 2008

### gop

Okay, so I did it with just u and w (v is what you called x) and got the desired result

$$\langle v,u\rangle\cdot\langle u,w\rangle\cdot w-\langle v,w\rangle\cdot\langle u,w\rangle\cdot u=0$$

Now I have to argue why this result holds in n-dimensional space, so I compute

$$s_{W}(v)=2\cdot\sum_{i=1}^{n-1}\langle w_{i},v\rangle\cdot w_{i}-v=2\cdot(v-v_{\perp})-v=v-2\cdot v_{\perp}=v-2\langle v,w_{n}\rangle\cdot w_{n}$$

where $$w_{1},\dots,w_{n-1}$$ is an orthonormal basis of the subspace W and $$w_{n}$$ is its orthonormal extension to a basis of V.
Then the n-dimensional case is just the two-dimensional case with $$w=w_{n}$$ and $$u=u_{n}$$ (the unit normal of U).
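This reduction can be sketched numerically in $$\mathbb{R}^{3}$$: the basis form $$2\sum_{i=1}^{2}\langle w_{i},v\rangle w_{i}-v$$ and the normal form $$v-2\langle v,w_{3}\rangle w_{3}$$ agree. A plain-Python sketch using the standard basis (variable names made up for illustration):

```python
# In R^3: reflection through W written with an orthonormal basis w1, w2 of W
# agrees with the normal-vector form v - 2<v,w3>w3, where w3 extends the
# basis of W to an orthonormal basis of V.

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

w1, w2, w3 = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
v = (1.0, 2.0, 3.0)

# s_W(v) = 2 * (<w1,v> w1 + <w2,v> w2) - v
basis_form = tuple(2 * (dot(w1, v) * p + dot(w2, v) * q) - r
                   for p, q, r in zip(w1, w2, v))

# s_W(v) = v - 2 <v,w3> w3
normal_form = tuple(r - 2 * dot(v, w3) * s for r, s in zip(v, w3))

assert basis_form == normal_form
```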

6. May 28, 2008

### Dick

That's it!! Good job. You are actually done. You didn't assume anything about the dimensionality of the space to get that result. There is nothing else to do. I only mentioned doing it in two dimensions first in case you were working with an explicit basis or something. But you bypassed that step.

7. May 29, 2008

### gop

Thank you for your help! Once you see the idea, it's really simple, I guess...
(I was also able to solve the second problem you gave me a hint to, so thank you again.)