Show the product of two Orthogonal Matrices of same size is Orthogonal

In summary: the product of two orthogonal matrices of the same size is an orthogonal matrix; the key fact is that [itex](AB)^T=B^TA^T[/itex].
  • #1
Saladsamurai

Homework Statement


Show that the product of two Orthogonal Matrices of same size is an Orthogonal matrix.

I am a little lost as to how to start this one.

If A is some nxn orthogonal matrix, then [itex]AA^T=A^TA=I[/itex] where I is the nxn identity matrix.

So now what? Let's take A and B to be nxn orthogonal matrices.

Can I get a push in the right direction:redface:
 
  • #2
So if b_ij is an entry from B and a_ij is an entry from A, then an entry from the product C=BA is given by

[tex]c_{ij}=\sum_{k=1}^nb_{ik}a_{kj},[/tex]

Now, how do I work in the details of the definitions of orthogonal matrices? Or is this a bad approach? :smile:
 
  • #3
The way this is proved in a textbook of mine is using the definition that a linear transformation T from R^n to R^n is called orthogonal if it preserves the length of vectors, and then showing that the product AB preserves length.
 
  • #4
Ouch. :eek: :smile: Yeah, I have not run into any of those terms yet in this book. I have to assume that it wants us to use the definition of orthogonality that I gave... and then some properties of products and sums or something. Thanks though Nick! The book you're using sounds like it uses more of a 'physical' approach. I think I would like that better!
 
  • #5
Well, what does it mean for the matrix AB to be orthogonal?
 
  • #6
The solution to this is pretty simple: focus on the definition you provided.

"If A is some nxn orthogonal matrix, then [itex]AA^T=A^TA=I[/itex] where I is the nxn identity matrix."

Now look at the question: it tells you to show that AB is also orthogonal. Apply the definition to show that

[itex](AB)^T(AB)=(AB)(AB)^T=I[/itex]

The trick to this problem is to note what [itex](AB)^T[/itex] is. You have to be careful here, and it should work out in a single line.
 
  • #7
Have you already proved that, for any square matrices of the same size, [itex](AB)^T= B^TA^T[/itex]?
 
  • #8
Alternatively, you could use determinants.

I actually prefer the determinant proof since it emphasizes the fact that orthogonal matrices don't scale volumes, they just rotate and flip.
 
  • #9
maze said:
Alternatively, you could use determinants.

I actually prefer the determinant proof since it emphasizes the fact that orthogonal matrices don't scale volumes, they just rotate and flip.

There are two mistakes here: something that just preserves volume is in SL, not SO, and orthogonal matrices 'do flip', as a reflection is an orthogonal matrix.
 
  • #10
matt grime said:
There are two mistakes here: something that just preserves volume is in SL, not SO, and orthogonal matrices 'do flip', as a reflection is an orthogonal matrix.

SL has determinant +1 (no -1) and cannot flip, so that does not apply... SL is too small to encompass all volume preserving transformations.

If you take the measure of a set before and after a flip, you will find they are the same. There is no mistake.
 
Last edited:
  • #11
The point, maze, was that you can't prove it by appeal to determinant. (My mistake about the flip part, I misread what you wrote).

SL is precisely the group that preserves (signed) volumes.

O(n) is the set of *distance* preserving matrices, it is not characterised by its effect on volumes.
 
  • #12
I really didn't want to give out the proof, as this is the homework section, but at this point there's no other way to explain it.

Orthogonal matrices are precisely those that have determinant +/-1. The determinant of the product is the product of the determinants. 1 and minus 1 are closed under multiplication. Therefore the product of orthogonal matrices is an orthogonal matrix.
 
  • #13
Sorry, maze, but you appear to have your definitions confused. There are non-orthogonal matrices with determinant plus or minus 1. E.g. diag(2,1/2) has determinant 1 and is not in O(2): orthogonal matrices preserve length, not just volume (up to sign).

O(n) is the set of matrices {X in M(n) : XX^t = Id }, they certainly have determinant plus or minus 1, but that does not uniquely characterise them.
 
Last edited:
  • #14
Err, hmm. I'd like to call something like diag(2,1/2) orthogonal, as the columns are orthogonal, but I suppose that wouldn't work under the AA'=I definition. I'd feel more comfortable calling AA'=I matrices orthonormal instead, but apparently this isn't the convention.

But even so, the determinant proof still wouldn't work in light of things like [tex]\left(\begin{matrix}1 & \sqrt{2} \\ 0 & 1\end{matrix}\right)[/tex]. Very well, disregard the determinant idea.
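As a numerical sanity check (a hypothetical Python/NumPy sketch, not part of the thread's argument), both counterexamples mentioned above have determinant 1 yet fail the [itex]MM^T=I[/itex] test:

```python
import numpy as np

# Illustrative check: both matrices below have determinant 1, yet
# neither satisfies M M^T = I, so determinant +/-1 alone does not
# make a matrix orthogonal.
M1 = np.diag([2.0, 0.5])                 # diag(2, 1/2)
M2 = np.array([[1.0, np.sqrt(2.0)],      # the shear matrix from this post
               [0.0, 1.0]])

for M in (M1, M2):
    print(round(np.linalg.det(M), 10))       # 1.0 for both
    print(np.allclose(M @ M.T, np.eye(2)))   # False for both
```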
 
Last edited:
  • #15
maze said:
Err, hmm. I'd like to call something like diag(2,1/2) orthogonal, as the columns are orthogonal

If you're going to give homework advice, then it's best to find out what the objects in question are, rather than what you'd like them to be.

Moreover, something like diag(2,2) would be "maze-orthogonal", and that has determinant 4.
 
  • #16
b0it0i said:
The solution to this is pretty simple: focus on the definition you provided.

"If A is some nxn orthogonal matrix, then [itex]AA^T=A^TA=I[/itex] where I is the nxn identity matrix."

Now look at the question: it tells you to show that AB is also orthogonal. Apply the definition to show that

[itex](AB)^T(AB)=(AB)(AB)^T=I[/itex]

The trick to this problem is to note what [itex](AB)^T[/itex] is. You have to be careful here, and it should work out in a single line.
This is what I am thinking about. I think I am heading in the right direction. Thanks!

HallsofIvy said:
Have you already proved that, for any square matrices, of the same size, (AB)T= BTAT?

I have not worked it out, I have only seen that it is the case in a short list of rules about sums and products of transposes. Maybe I should work it out? Is that what you're implying? Or should I just employ it here? For some reason I had this rule in mind as I was falling asleep last night, but I couldn't figure out how to use it. (I must be losing my mind:smile:)

Alright, thanks guys! I'll work on this and post more when I have more :smile:
 
  • #17
Salad, what is the definition of orthogonal again? If A and B are orthogonal, you want to show AB satisfies the same criterion, which you wrote above. Namely that

(AB)(AB)^t = Id

Surely you can see how knowing what (AB)^t is is going to be useful?
 
  • #18
matt grime said:
Salad, what is the definition of orthogonal again? If A and B are orthogonal, you want to show AB satisfies the same criterion, which you wrote above. Namely that

(AB)(AB)^t = Id

Surely you can see how knowing what (AB)^t is is going to be useful?

Is this right? Do I just use the associative property of multiplication and the rule stated above to say:

If [itex]AA^T=I[/itex] and [itex]BB^T=I[/itex]

[itex](AB)*(AB)^T=(AB)*B^TA^T=AA^TBB^T=I[/itex]

Does this suffice as proof? Can I do that with [itex]B^TA^T[/itex] ? That is, can I move each factor 'back' that far? I thought I could only 'group' them differently.
 
  • #19
"Grouping" them differently allows you to add and remove brackets in the expression freely, so you can regroup the matrices together to give you the answer. That's the associative property of matrix multiplication.

But matrix multiplication is not commutative in general: AB is not always BA. In your post, however, you appear to have rearranged the matrices, which is allowed only if they commute. The reasoning is incorrect, though the conclusion is right.
 
Last edited:
  • #20
Well, if you are allowed to use that, then [itex](AB)^T AB= B^TB^TAB[/itex]. What does that give you?
 
  • #21
HallsofIvy said:
Well, if you are allowed to use that, then [itex](AB)^T AB= B^TB^TAB[/itex]. What does that give you?

I don't understand this. Where did all the Bs come from? :smile:
 
  • #22
I'm sure it was a typo and he/she meant

[itex](AB)^T (AB) = B^T A^T (AB)[/itex]

Use associativity in that line (and, as others have pointed out, not commutativity) to "regroup/rewrite" it,

together with the fact that A and B are orthogonal... and you get the identity matrix.

Do the same for [itex]AB (AB)^T[/itex]
 
  • #23
b0it0i said:
I'm sure it was a typo and he/she meant

[itex](AB)^T (AB) = B^T A^T (AB)[/itex]

Use associativity in that line (and, as others have pointed out, not commutativity) to "regroup/rewrite" it,

together with the fact that A and B are orthogonal... and you get the identity matrix.

Do the same for [itex]AB (AB)^T[/itex]

How about
[itex](AB)(AB)^T=B^TA^T(AB)[/itex]
[itex]=B^T(A^TA)B=B^T I B[/itex]
[itex]=B^TB[/itex]
[itex]=I[/itex]

That works right!? :smile:
 
  • #24
Saladsamurai said:
[itex](AB)(AB)^T=B^TA^T(AB)[/itex]
How is this valid? You exchanged the matrix on the right with that on the left. That isn't possible unless the matrices commute.
 
  • #25
Defennder said:
How is this valid? You exchanged the matrix on the right with that on the left. That isn't possible unless the matrices commute.

Oops! But, fortunately if you write it out correctly [itex](AB)(AB)^T=(AB)B^TA^T=A(BB^T)A^T=AIA^T=AA^T=I[/itex]
you get the same result.

Same for (AB)^T(AB). Thus, AB is orthogonal.

:smile:
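The finished proof can also be spot-checked numerically. Here is an illustrative Python/NumPy sketch (the QR-based construction of random orthogonal matrices is my own choice, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build two random orthogonal matrices of the same size via QR factorisation
# (the Q factor of an invertible matrix is orthogonal).
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))
B, _ = np.linalg.qr(rng.standard_normal((4, 4)))

C = A @ B
# (AB)(AB)^T = A(BB^T)A^T = AA^T = I, and similarly (AB)^T(AB) = I
print(np.allclose(C @ C.T, np.eye(4)))   # True
print(np.allclose(C.T @ C, np.eye(4)))   # True
```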
 

What does it mean for two matrices to be orthogonal?

A square matrix A is orthogonal if its transpose is its inverse, that is, [itex]AA^T=A^TA=I[/itex]. Equivalently, its columns (and rows) form an orthonormal set of vectors.

Can two orthogonal matrices have different sizes?

Every orthogonal matrix is square, and two orthogonal matrices can individually have different sizes. However, for their product to be defined (and orthogonal), they must have the same size, i.e. the same number of rows and columns.

How do you show that the product of two orthogonal matrices of the same size is orthogonal?

To show that the product AB of two orthogonal matrices is orthogonal, you must prove that [itex](AB)^T(AB)=(AB)(AB)^T=I[/itex]. This follows from the transpose rule [itex](AB)^T=B^TA^T[/itex], the associativity of matrix multiplication, and the orthogonality of A and B.

What are the properties of orthogonal matrices?

Orthogonal matrices have several key properties, including:

  • They are square matrices.
  • Their inverse is equal to their transpose: [itex]A^{-1}=A^T[/itex].
  • The product with their own transpose is the identity: [itex]AA^T=A^TA=I[/itex].
  • Their rows and columns are orthonormal, meaning they are mutually perpendicular and have a magnitude of 1.
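These properties can be verified on a concrete example, such as a 2x2 rotation matrix (a standard orthogonal matrix; the Python/NumPy check below is purely illustrative):

```python
import numpy as np

theta = 0.3  # an arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(np.linalg.inv(Q), Q.T))         # inverse equals transpose
print(np.allclose(Q @ Q.T, np.eye(2)))            # Q Q^T = I
print(np.allclose(np.linalg.norm(Q, axis=0), 1))  # columns are unit vectors
```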

Why are orthogonal matrices important in linear algebra?

Orthogonal matrices play a critical role in linear algebra because they preserve the length and angle of vectors when multiplied by them. This makes them useful for transformations and rotations in various applications, such as computer graphics and physics.
