Orthogonal Matrices: Questions & Answers

AI Thread Summary
Rearranging the rows of a square orthogonal matrix yields another orthogonal matrix, since the rows still form an orthonormal basis. However, a square matrix is not necessarily orthogonal just because its determinant is +1 or -1; a counter-example shows a matrix with determinant 1 whose columns are not orthonormal. The discussion highlights the role of linear independence and the defining properties of orthogonal matrices, and clarifies the logical structure of "if and only if" statements. Understanding these points is essential for correctly determining whether a matrix is orthogonal.
war485

Homework Statement



1. If I have a square orthogonal matrix and I make a new matrix by rearranging its rows, will the new matrix also be orthogonal?

2. True/false: a square matrix is orthogonal if and only if its determinant is equal to + or - 1

Homework Equations



no equations

The Attempt at a Solution



1. I think it should also be orthogonal since it forms a basis, and the basis would be the same, but just a linear combination of the previous, right?

2. False; a determinant of ±1 doesn't necessarily ensure the matrix is orthogonal. So how would/should I correct that statement?
 
war485 said:
1. If I have a square orthogonal matrix and I make a new matrix by rearranging its rows, will the new matrix also be orthogonal?

1. I think it should also be orthogonal since it forms a basis, and the basis would be the same, but just a linear combination of the previous, right?
An n×n matrix is orthogonal iff its rows form an orthonormal basis for ##\mathbb{R}^n## (note the symmetry of ##AA^T = A^TA = I## for an orthogonal matrix ##A##). The linear independence of a collection of vectors doesn't depend on the order in which you write them, so the rows of the new matrix still form an orthonormal basis.

Just be careful with your language: a "linear combination of a basis" reads as a linear combination of its vectors, which gives just a single vector. What you mean is that the rows are reordered, not recombined.
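As a quick numerical sanity check of the claim above (a sketch, not part of the original thread; the rotation angle is an arbitrary choice), permuting the rows of an orthogonal matrix leaves ##QQ^T = I## intact:

```python
import numpy as np

theta = 0.7  # arbitrary angle; any rotation matrix is orthogonal
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q is orthogonal: its rows form an orthonormal basis of R^2.
assert np.allclose(Q @ Q.T, np.eye(2))

# Swap the rows: the same orthonormal vectors in a different order.
P = Q[[1, 0], :]
assert np.allclose(P @ P.T, np.eye(2))  # still orthogonal
```

The assertion on ##P## passes because each row of ##P## is still a unit vector orthogonal to the other row; only the ordering changed.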
 
I forgot about the linear independence part of it for #1.

As for #2, I took a counter-example from wikipedia XD
##\begin{pmatrix} 2 & 0 \\ 0 & 0.5 \end{pmatrix}##
Its determinant is 1, but the length of each column is not 1 (not orthonormal).
I guess counter-examples should be enough?

Thanks for the help you two. :)
 
war485 said:
As for #2, I took a counter-example from wikipedia XD
##\begin{pmatrix} 2 & 0 \\ 0 & 0.5 \end{pmatrix}##
Its determinant is 1, but the length of each column is not 1 (not orthonormal).
I guess counter-examples should be enough?
The statement #2 is (colloquially) of the form "(property X implies property Y) AND (property Y implies property X)" (*). If all you want to do is show that (*) is false (e.g., if you were asked to prove or disprove the statement), then it suffices to show that property Y does not imply property X.

To show that property Y does not imply property X, it suffices to give an example for which property Y holds but X does not. Why? Because it definitively answers the question as to whether Y implies X. There is no guessing about it!
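The counter-example can also be verified numerically. Here is a sketch (not part of the original thread) using NumPy: the matrix has determinant 1 but fails ##AA^T = I##, while a genuinely orthogonal matrix necessarily has determinant ±1:

```python
import numpy as np

# The counter-example from the post: det = 1, but not orthogonal.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])
print(np.linalg.det(A))                 # 1.0 (approximately)
print(np.allclose(A @ A.T, np.eye(2)))  # False: columns are not unit length

# The converse direction does hold: orthogonal implies det = +1 or -1.
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # a permutation matrix, which is orthogonal
print(np.allclose(Q @ Q.T, np.eye(2)))  # True
print(np.linalg.det(Q))                 # -1.0 (approximately)
```

So "det = ±1" is a necessary condition for orthogonality, not a sufficient one, which is exactly why the "if and only if" in statement #2 fails.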
 
Yea, you're right Unco. I need to work on my logic a bit more. I'm very grateful for your help :D
 