Homework Help: Show the product of two Orthogonal Matrices of same size is Orthogonal

1. Jul 4, 2008

1. The problem statement, all variables and given/known data
Show that the product of two Orthogonal Matrices of same size is an Orthogonal matrix.

I am a little lost as to how to start this one.

If A is some nxn orthogonal matrix, then $AA^T=A^TA=I$, where I is the nxn identity matrix.

So now what? Let's take A and B to be nxn orthogonal matrices.

Can I get a push in the right direction?
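As a quick numeric sanity check of the definition quoted above, here's a minimal sketch in plain Python (the rotation matrix and angle are just hypothetical example values, not from the thread):

```python
import math

# Hypothetical example: a 2x2 rotation matrix, which is orthogonal.
theta = 0.3
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def is_identity(M, tol=1e-12):
    return all(abs(M[i][j] - (1 if i == j else 0)) <= tol
               for i in range(len(M)) for j in range(len(M)))

print(is_identity(matmul(A, transpose(A))))  # True
print(is_identity(matmul(transpose(A), A)))  # True
```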

2. Jul 4, 2008

So if b_ij is an entry from B and a_ij is an entry from A, then an entry from the product C=BA is given by

$$c_{ij}=\sum_{k=1}^nb_{ik}a_{kj},$$

Now, how do I work in the details of the definitions of orthogonal matrices? Or is this a bad approach?
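The entry formula above is easy to transcribe directly into code; this sketch just mirrors the sum $c_{ij}=\sum_k b_{ik}a_{kj}$ term by term (the matrix values are arbitrary illustrations, not from the problem):

```python
# Direct transcription of the entry formula c_ij = sum_k b_ik * a_kj for C = BA.
# Any two square matrices of the same size work; these values are illustrative.
B = [[1, 2], [3, 4]]
A = [[5, 6], [7, 8]]
n = len(A)

C = [[sum(B[i][k] * A[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]

print(C)  # [[19, 22], [43, 50]]
```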

3. Jul 4, 2008

nicksauce

The way this is proved in a textbook of mine is using the definition that a linear transformation T from R^n to R^n is called orthogonal if it preserves the length of vectors, and then showing that the product AB preserves length.
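The length-preservation definition mentioned here can also be checked numerically; this is only a sketch with a hypothetical rotation matrix and test vector:

```python
import math

# Sketch of the length-preservation definition: an orthogonal map satisfies
# |Av| = |v| for every vector v. A rotation matrix is used as an example.
theta = 1.1
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

v = [3.0, -4.0]
print(norm(v), norm(apply(A, v)))  # both 5.0 up to rounding
```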

4. Jul 4, 2008

Ouch. Yeah, I have not run into any of those terms yet in this book. I have to assume that it wants us to use the definition of orthogonality that I gave.... and then some properties of products and sums or something. Thanks though Nick! The book you're using sounds like it uses more of a 'physical' approach. I think I would like that better!

5. Jul 4, 2008

morphism

Well, what does it mean for the matrix AB to be orthogonal?

6. Jul 4, 2008

b0it0i

the solution to this is pretty simple
focus on the definition you provided

"If A is some nxn orthogonal matrix, then $AA^T=A^TA=I$ where I is the nxn identity matrix."

now if you look at the question it tells you to show that AB is also orthogonal
now apply the definition to show that

$$(AB)^T(AB) = (AB)(AB)^T = I$$

the trick to this problem is to note what (AB) transpose is. you have to be careful here, and it should work out in a single line.

7. Jul 4, 2008

HallsofIvy

Have you already proved that, for any square matrices of the same size, $(AB)^T = B^TA^T$?

8. Jul 4, 2008

maze

Alternatively, you could use determinants.

I actually prefer the determinant proof since it emphasizes the fact that orthogonal matrices don't scale volumes, they just rotate and flip.

9. Jul 4, 2008

matt grime

There are two mistakes here: something that just preserves volume is in SL, not SO, and orthogonal matrices 'do flip', as a reflection is an orthogonal matrix.

10. Jul 4, 2008

maze

SL has determinant +1 (not -1) and cannot flip, so that does not apply... SL is too small to encompass all volume-preserving transformations.

If you take the measure of a set before and after a flip, you will find they are the same. There is no mistake.

Last edited: Jul 4, 2008
11. Jul 4, 2008

matt grime

The point, maze, was that you can't prove it by appeal to determinant. (My mistake about the flip part, I misread what you wrote).

SL is precisely the group that preserves (signed) volumes.

O(n) is the set of *distance* preserving matrices, it is not characterised by its effect on volumes.

12. Jul 4, 2008

maze

I really didn't want to give out the proof, as this is the homework section, but at this point theres no other way to explain it.

Orthogonal matrices are precisely those that have determinant +/-1. The determinant of the product is the product of the determinants, and the set {1, -1} is closed under multiplication. Therefore the product of orthogonal matrices is an orthogonal matrix.

13. Jul 4, 2008

matt grime

Sorry, maze, but you appear to have your definitions confused. There are non-orthogonal matrices with determinant plus or minus 1. E.g. diag(2, 1/2) has determinant 1 and is not in O(2); orthogonal matrices preserve length, not just volume (ignoring signs).

O(n) is the set of matrices {X in M(n) : XX^t = Id }, they certainly have determinant plus or minus 1, but that does not uniquely characterise them.
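The counterexample here is easy to verify directly; a minimal sketch in plain Python:

```python
# Verifying the counterexample diag(2, 1/2): it has determinant 1 but is not
# orthogonal, so det = +/-1 does not characterise O(n).
A = [[2.0, 0.0], [0.0, 0.5]]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
AAT = [[sum(A[i][k] * A[j][k] for k in range(2)) for j in range(2)]
       for i in range(2)]

print(det)   # 1.0
print(AAT)   # [[4.0, 0.0], [0.0, 0.25]] -- not the identity
```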

Last edited: Jul 4, 2008
14. Jul 4, 2008

maze

Err, hmm. I'd like to call something like diag(2,1/2) orthogonal, as the columns are orthogonal, but I suppose that wouldn't work under the AA'=I definition. I'd feel more comfortable calling AA'=I matrices orthonormal instead, but apparently this isn't the convention.

But even so, the determinant proof still wouldn't work in light of things like $$\left(\begin{matrix}1 & \sqrt{2} \\ 0 & 1\end{matrix}\right)$$. Very well, disregard the determinant idea.

Last edited: Jul 4, 2008
15. Jul 4, 2008

matt grime

If you're going to give homework advice, then it's best to find out what the objects in question are, rather than what you'd like them to be.

Moreover, something like diag(2,2) would be "maze-orthogonal", and that has determinant 4.

16. Jul 4, 2008

This is what I am thinking about. I think I am heading in the right direction. Thanks!

I have not worked it out, I have only seen that it is the case in a short list of rules about sums and products of transposes. Maybe I should work it out? Is that what you're implying? Or should I just employ it here? For some reason I had this rule in mind as I was falling asleep last night, but I couldn't figure out how to use it. (I must be losing my mind)

Alright, thanks guys! I'll work on this and post more when I have more

17. Jul 4, 2008

matt grime

Salad, what is the definition of orthogonal again? If A and B are orthogonal, you want to show AB satisfies the same criterion, which you wrote above. Namely that

$$(AB)(AB)^T = I$$

Surely you can see how knowing what $(AB)^T$ is will be useful?

18. Jul 4, 2008

Is this right? Do I just use the associative property of multiplication and the rule stated above to say:

If $AA^T=I$ and $BB^T=I$,

$(AB)*(AB)^T=(AB)*B^TA^T=AA^TBB^T=I$

Does this suffice as proof? Can I do that with $B^TA^T$ ? That is, can I move each factor 'back' that far? I thought I could only 'group' them differently.

19. Jul 4, 2008

Defennder

"Grouping" them differently allows you to remove and add brackets in the expression freely. So that enables you to remove the brackets in the expression and "regroup" the matrices together to give you the answer. It's the associative property of matrix multiplication.

But matrix multiplication is not necessarily commutative: AB is not always BA. In your post you appear to have rearranged the matrices, which is allowed only if they commute. The reasoning is incorrect, though the conclusion is right.

Last edited: Jul 4, 2008
20. Jul 4, 2008

HallsofIvy

Well, if you are allowed to use that, then $(AB)^TAB = B^TB^TAB$. What does that give you?

21. Jul 4, 2008

I don't understand this. Where did all the Bs come from?

22. Jul 4, 2008

b0it0i

I'm sure it was a typo and he/she meant

$$(AB)^T(AB) = B^TA^T(AB)$$

Use associativity in that line (and, as others have pointed out, not commutativity) to regroup it, together with the fact that A AND B are orthogonal, and you get the identity matrix.

Do the same for $AB(AB)^T$.

23. Jul 4, 2008

$$(AB)(AB)^T = B^TA^T(AB) = B^T(A^TA)B = B^T I B = B^TB = I$$

That works right!?

24. Jul 4, 2008

Defennder

How is this valid? You exchanged the matrix on the right with that on the left. That isn't possible unless the matrices commute.

25. Jul 5, 2008

Oops! But fortunately, if you write it out correctly: $(AB)(AB)^T=(AB)B^TA^T=A(BB^T)A^T=AIA^T=AA^T=I$
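The concluded result can be double-checked numerically; a minimal sketch using a rotation and a reflection as hypothetical example matrices:

```python
import math

# Numeric check of the result: if A and B are orthogonal, so is AB.
def transpose(M):
    return [list(r) for r in zip(*M)]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_orthogonal(M, tol=1e-12):
    P = matmul(M, transpose(M))
    return all(abs(P[i][j] - (1 if i == j else 0)) <= tol
               for i in range(len(P)) for j in range(len(P)))

t = 0.7
A = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]  # rotation
B = [[1.0, 0.0], [0.0, -1.0]]                                  # reflection

print(is_orthogonal(A), is_orthogonal(B), is_orthogonal(matmul(A, B)))
# True True True
```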