# Geometry problem

## Main Question or Discussion Point

It's been a long time since I've studied geometry, and I had an extremely poor instructor at the time, so I'm having difficulty remembering how to prove certain theorems. My linear algebra is a bit rusty as well, though I'm better versed in that than in geometry. One of my students needs help and I'm at a loss as to how to explain this. The theorem in question is:

Any Euclidean transformation can be obtained as a composition of Euclidean reflections in hyperplanes.

I'm aware that a proof by induction is required, with the base case in R^1 and then an inductive step up to R^n. Beyond that, if someone could remind me of what needs to be done to prove such a thing, it'd be much appreciated.

I took another shot at it, so let me know if I'm wrong on any details or just going about this the wrong way. I tried to prove the base case in both R^1 and R^2. For R^1, since the transformation matrix of a reflection must have determinant -1, [-1] is the only allowable matrix, and no matter how many times it's applied, it yields only [-1] itself or the identity [1]. For R^2, a reflection matrix has the form:

```
[ cos t   sin t ]
[ sin t  -cos t ]
```

This will result in a reflection about the line y = tan(t/2)*x. Multiplying two such matrices with distinct angles t1 and t2 gives, through the basic trig sum and difference identities, a rotation matrix through angle t1 - t2 (the product has determinant +1, so it is no longer a reflection).
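If it helps to check that identity numerically, here is a quick sketch (assuming NumPy is available; the function names are mine, not standard):

```python
import numpy as np

def reflection(t):
    # reflection across the line y = tan(t/2) * x
    return np.array([[np.cos(t),  np.sin(t)],
                     [np.sin(t), -np.cos(t)]])

def rotation(t):
    # standard counterclockwise rotation by t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

t1, t2 = 1.1, 0.4
# the product of two reflections has determinant +1, i.e. it's a rotation
print(np.allclose(reflection(t1) @ reflection(t2), rotation(t1 - t2)))  # True
```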

What am I missing? Do I need to do reflections from R^n to R^n+1 and back to R^n to get a transformation matrix which is not a basic reflection?

mathwonk
Homework Helper
try it in the plane.

> try it in the plane.
I'm not sure what you mean by that.

Should I be using Householder transformations to prove this? I've tried a few approaches so far, for the base case only (trying R^1 and R^2), and have met with little success. If I were to try to prove it for R^2, should I be using 2x3 orthogonal matrices rather than 2x2, so that I'm reflecting about a plane in R^3 rather than a line? Or, if I were to take R^1 as the base case, should I be using 1x2 orthogonal matrices? Is that what mathwonk meant by his post?

If anyone can offer some more substantive information on this, it'd be appreciated, as I'm supposed to meet with my student tomorrow. I'm not sure what the time frame is for when she'll need to know how to do this, but I'd like to be able to understand it myself ASAP so I can at least get the ball rolling.

I think you need to clarify exactly what you're trying to prove. Typically, Euclidean transformations are understood to include rotations, reflections and translations. Note that a translation is affine, not linear, and so cannot be represented as a simple matrix product. The question, then, is what you mean when you say "Euclidean reflections"; are these simply reflections, or reflections + translations? As far as I can tell, you can't build a translation out of reflections, so it's certainly not the case that any Euclidean transformation can be composed out of simple reflections. Either you need to constrain yourself to showing only that rotations can be built out of reflections, or you need to use reflections + translations to do the construction. I suspect that the latter is what you want to do, as a simple reflection (with no affine term) can only represent reflections across hyperplanes through the origin. A reflection + translation, on the other hand, is required to reflect across a general hyperplane.

> I think you need to clarify exactly what you're trying to prove. Typically, Euclidean transformations are understood to include rotations, reflections and translations. Note that a translation is affine, not linear, and so cannot be represented as a simple matrix product. The question, then, is what you mean when you say "Euclidean reflections"; are these simply reflections, or reflections + translations? As far as I can tell, you can't build a translation out of reflections, so it's certainly not the case that any Euclidean transformation can be composed out of simple reflections. Either you need to constrain yourself to showing only that rotations can be built out of reflections, or you need to use reflections + translations to do the construction. I suspect that the latter is what you want to do, as a simple reflection (with no affine term) can only represent reflections across hyperplanes through the origin. A reflection + translation, on the other hand, is required to reflect across a general hyperplane.
Good points. I did say that my geometry was rusty. When I studied modern geometry some years ago, our instructor confined most of the teaching to Euclidean and absolute geometry, at a level almost dumbed down to high school, and did not include affine geometry. I also ordinarily only teach subjects that would be part of a two-year college math curriculum, so this is just a bit out of my expertise, but it's always good to sharpen my sword, I suppose, especially considering that I might pursue a master's degree in mathematics someday when it's feasible for me to do so.

I don't know exactly what she needs to do, but after reading your post, I suspect the latter as well, as the former seems fairly trivial (a rotation in R^2 is just two reflections: to rotate by angle t, reflect across the line y = tan(t/4)x and then across the line y = tan(3t/4)x, if memory serves me correctly). I do know that any reflection about a hyperplane not containing the origin is an affine transformation rather than a linear one, so it seems correct that translations and glide reflections can't be obtained in this manner.
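For what it's worth, the angle claim can be sanity-checked numerically. A sketch assuming NumPy, using mirror lines at angles t/4 and 3t/4 (reflecting in two lines that meet at angle φ rotates by 2φ, so any pair of mirror lines t/2 apart works; these are just one choice):

```python
import numpy as np

def line_reflection(theta):
    # reflection across the line through the origin at angle theta to the x-axis
    return np.array([[np.cos(2 * theta),  np.sin(2 * theta)],
                     [np.sin(2 * theta), -np.cos(2 * theta)]])

def rotation(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

t = 1.3
# reflect across y = tan(t/4) x first, then across y = tan(3t/4) x
composed = line_reflection(3 * t / 4) @ line_reflection(t / 4)
print(np.allclose(composed, rotation(t)))  # True
```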

So, assuming we're trying to show by induction (or some other method of proof, but I'm almost certain this calls for induction) that any Euclidean transformation in R^n can be performed with reflections and translations (something like v' = A*v + w, where v, v', and w are vectors in R^n and A is an n×n reflection matrix; does that sound about right?), what would I do to show this? Also, if anyone has any links to material on the subject, that'd be helpful as well.

> I don't know exactly what she needs to do, but after reading your post, I suspect the latter as well, as the former seems fairly trivial (a rotation in R^2 is just two reflections: to rotate by angle t, reflect across the line y = tan(t/4)x and then across the line y = tan(3t/4)x, if memory serves me correctly).
Actually, the two problems come down to basically the same thing, which is showing that rotations can be decomposed into reflections.

> So, assuming we're trying to show by induction (or some other method of proof, but I'm almost certain this calls for induction) that any Euclidean transformation in R^n can be performed with reflections and translations (something like v' = A*v + w, where v, v', and w are vectors in R^n and A is an n×n reflection matrix; does that sound about right?), what would I do to show this? Also, if anyone has any links to material on the subject, that'd be helpful as well.
Well, first of all, the only Euclidean transformation other than reflection and translation is rotation, so the problem reduces to showing that an arbitrary rotation can be decomposed into reflections and translations. You should be able to assume that it's a rotation around the origin without any loss of generality, which is to say that it's simply an orthogonal matrix. At that point, I'd consider looking at Householder reflections to show that any such matrix can be written as a product of reflections. On the other hand, there may well be an elegant way to do it inductively in the dimension.
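Here is one way the Householder suggestion can be made concrete, sketched in NumPy (the function name and the QR-style elimination loop are my own choices, not something from the thread): each column of the orthogonal matrix is straightened onto a coordinate axis by a reflection, and the leftover sign flips are themselves coordinate reflections.

```python
import numpy as np

def reflections_from_orthogonal(Q):
    """Factor an orthogonal matrix Q into Householder reflections, so that
    multiplying the returned matrices in list order reproduces Q."""
    n = Q.shape[0]
    A = Q.astype(float).copy()
    refls = []
    for k in range(n):
        x = A[k:, k]
        # choose a reflection that zeroes the below-diagonal part of column k
        alpha = -np.copysign(np.linalg.norm(x), x[0])
        v = x.copy()
        v[0] -= alpha
        v /= np.linalg.norm(v)
        H = np.eye(n)
        H[k:, k:] -= 2.0 * np.outer(v, v)  # a Householder reflection, det -1
        refls.append(H)
        A = H @ A
    # A is now diagonal with entries +-1; each -1 is a coordinate reflection
    for k in range(n):
        if A[k, k] < 0:
            D = np.eye(n)
            D[k, k] = -1.0
            refls.append(D)
    return refls
```

This is deliberately not the minimal decomposition (the Cartan-Dieudonné theorem guarantees at most n reflections suffice in R^n); it just demonstrates numerically that some finite product of reflections reproduces Q, with an even count whenever det Q = +1.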

Thanks for the assist, I'll see what I can come up with.

Okay, I've proved that it holds true for the trivial case of R^1.

I've also done so in R^2, as that much better serves to illustrate the mechanisms of what's occurring.

With that done, how would I prove inductively that this will work for any R^n? Showing that it works for translation is easy, as it is for reflection, of course (a reflection is, trivially, a composition of exactly one reflection: itself).

I suppose I just need a hand with proving that any rotation of R^n about the origin can be composed from a series of reflections about hyperplanes in R^n, then. I have about 14 hours to find something before I meet with my student, so as usual, any help received quickly will be appreciated.

What I most need to know is the general form of the transformation matrix for a Euclidean rotation in R^n, and if a better form (better suited for this proof, that is) of a reflection matrix exists other than the Householder matrix (I - 2vv^T), then I might be able to see how to obtain one from the other.

Edit: I'm aware that every reflection or rotation matrix is orthogonal, with determinant -1 or 1, respectively, but if there's a better way to write them for the purposes of this proof, that might help me. For example, I defined reflection matrices in R^1 and R^2 in my second post.

> I suppose I just need a hand with proving that any rotation of R^n about the origin can be composed from a series of reflections about hyperplanes in R^n, then.
Well, if you're doing it inductively, you might show that you can first decompose a 2-dimensional portion of the rotation matrix by using reflections, and then use more reflections to handle the 3rd dimension, 4th dimension and so on. That's very hand-wavey, but you get the idea...

> What I most need to know is the general form of the transformation matrix for a Euclidean rotation in R^n, and if a better form (better suited for this proof, that is) of a reflection matrix exists other than the Householder matrix (I - 2vv^T), then I might be able to see how to obtain one from the other.
I don't think there's any special structure beyond the fact that it's orthogonal and has determinant 1 (while reflections are orthogonal and have determinant -1). This implies that you're going to need an even number of reflections to get a rotation, but I'm not sure there are any other conclusions to be drawn...

mathwonk
In post 10 you do what I suggested, i.e. work first in R^2.

> Well, if you're doing it inductively, you might show that you can first decompose a 2-dimensional portion of the rotation matrix by using reflections, and then use more reflections to handle the 3rd dimension, 4th dimension and so on. That's very hand-wavey, but you get the idea...

> I don't think there's any special structure beyond the fact that it's orthogonal and has determinant 1 (while reflections are orthogonal and have determinant -1). This implies that you're going to need an even number of reflections to get a rotation, but I'm not sure there are any other conclusions to be drawn...
That's what I originally figured: that angles are merely a convenient notation to express a rotation in a given direction. In R^2, it's obvious what the angle t refers to, since there are only two possible directions of rotation, clockwise and counterclockwise. In R^3 and higher dimensions, however, a rotation matrix expressed in terms of an angle must also specify the plane in which the rotation takes place. For example, suppose I'm working in R^4, with vectors written [x y z w], and, for the inductive step, I assume that any Euclidean transformation of R^3 can be obtained as a composition of glide reflections. Then I only need to worry about reflection, rotation, and translation involving the fourth dimension. Translation can be handled by applying any reflection matrix twice (a reflection matrix is symmetric and orthogonal, so it is its own inverse, and applying it twice gives the identity) and then adding the vector [0 0 0 w] as the glide part. Reflection is still trivial, as it is a composition of itself. For rotation, however, it would help greatly to know the form of the rotation and reflection matrices that involve only the extra dimension, now that the rest has been handled.
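The "apply a reflection twice" step is easy to verify numerically. A minimal sketch assuming NumPy, with an arbitrary Householder mirror and an arbitrary test vector (both made up for the example):

```python
import numpy as np

def householder(v):
    # reflection across the hyperplane orthogonal to v
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

F = householder([1.0, 2.0, -1.0, 0.5])
# symmetric + orthogonal => the reflection is its own inverse
print(np.allclose(F @ F, np.eye(4)))  # True

# so F(F(x)) + w is a pure translation of x by w
w = np.array([0.0, 0.0, 0.0, 3.0])
x = np.array([1.0, -2.0, 0.0, 4.0])
print(np.allclose(F @ (F @ x) + w, x + w))  # True
```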

Would these be rotation and reflection matrices, respectively:

```
[ cos(t)  0  0  -sin(t) ]
[   0     1  0     0    ]
[   0     0  1     0    ]
[ sin(t)  0  0   cos(t) ]
```

```
[ cos(t)  0  0   sin(t) ]
[   0     1  0     0    ]
[   0     0  1     0    ]
[ sin(t)  0  0  -cos(t) ]
```

Or is there something else I'm missing? If I'm right, I think I can prove the rest fairly easily.
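A quick numerical check of these two matrices, assuming NumPy (`plane_rotation` and `plane_reflection` are names I made up for matrices acting only in the x-w coordinate plane):

```python
import numpy as np

def plane_rotation(t, n=4, i=0, j=3):
    """Rotation by angle t in the coordinate plane spanned by axes i and j."""
    M = np.eye(n)
    M[i, i] = np.cos(t); M[i, j] = -np.sin(t)
    M[j, i] = np.sin(t); M[j, j] = np.cos(t)
    return M

def plane_reflection(t, n=4, i=0, j=3):
    """Reflection acting only in the plane of axes i and j."""
    M = np.eye(n)
    M[i, i] = np.cos(t); M[i, j] = np.sin(t)
    M[j, i] = np.sin(t); M[j, j] = -np.cos(t)
    return M

t = 0.9
R, F = plane_rotation(t), plane_reflection(t)
print(np.allclose(R @ R.T, np.eye(4)), np.isclose(np.linalg.det(R), 1.0))   # True True
print(np.allclose(F @ F.T, np.eye(4)), np.isclose(np.linalg.det(F), -1.0))  # True True
# the rotation is itself a product of two such reflections:
print(np.allclose(plane_reflection(t) @ plane_reflection(0.0), R))          # True
```

Note the last line in particular: the proposed rotation equals the proposed reflection composed with the reflection at angle 0, so it is already a product of two reflections in R^4.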

> Would these be rotation and reflection matrices, respectively:
>
> ```
> [ cos(t)  0  0  -sin(t) ]
> [   0     1  0     0    ]
> [   0     0  1     0    ]
> [ sin(t)  0  0   cos(t) ]
> ```
>
> ```
> [ cos(t)  0  0   sin(t) ]
> [   0     1  0     0    ]
> [   0     0  1     0    ]
> [ sin(t)  0  0  -cos(t) ]
> ```
>
> Or is there something else I'm missing? If I'm right, I think I can prove the rest fairly easily.
Yes, you can construct higher-dimensional extensions of the usual 2-d rotation and reflection matrices this way. See http://en.wikipedia.org/wiki/Givens_rotation for more. Provided you know how to represent a 2-d rotation as a composition of 2-d reflections, then, you should be able to work out an extension to represent a particular Givens rotation as a composition of reflections. Then, if you can show that an arbitrary rotation can be decomposed into a sequence of Givens rotations (and you can), you're done.
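To close the loop on that last step, here is a sketch (assuming NumPy; the function name and elimination order are my own) that factors an arbitrary rotation matrix into Givens rotations, one coordinate plane at a time:

```python
import numpy as np

def givens_factors(Q, tol=1e-12):
    """Factor a rotation matrix Q (orthogonal, det +1) into Givens
    rotations, so that multiplying the factors in order gives Q back."""
    n = Q.shape[0]
    A = Q.astype(float).copy()
    factors = []
    for j in range(n - 1):
        for i in range(j + 1, n):
            a, b = A[j, j], A[i, j]
            if abs(b) < tol:
                continue
            r = np.hypot(a, b)
            c, s = a / r, b / r
            G = np.eye(n)
            G[j, j] = c;  G[j, i] = s
            G[i, j] = -s; G[i, i] = c
            A = G @ A                 # zeroes out A[i, j]
            factors.append(G.T)       # G is orthogonal, so G.T undoes it
    # A is now diagonal with +-1 entries and det +1, so the -1s pair up;
    # flipping such a pair is itself a rotation by pi in that plane
    neg = [k for k in range(n) if A[k, k] < 0]
    for i, j in zip(neg[::2], neg[1::2]):
        G = np.eye(n)
        G[i, i] = G[j, j] = -1.0
        factors.append(G)
    return factors
```

Combined with a decomposition of each 2-d Givens rotation into two reflections (as earlier in the thread), this gives the whole argument a numerical counterpart.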

Solved it now, thanks.