Orthogonal Matrix: Column or Row Vectors?

In summary, the conversation discussed the definition of "orthogonal 3 by 3 matrix" and whether it means that the column vectors or the row vectors are orthogonal. It was determined that the term refers to a real unitary matrix, and that in general a real n by n matrix is orthogonal if and only if its columns form an orthonormal set of vectors. The conversation also touched on the consequences of a real n by n matrix being orthogonal, as well as a theorem relating orthogonality of the rows and the columns. Finally, it turned out that the term was already defined in the book, and the question was resolved.
  • #1
ehrenfest

Homework Statement


My book used the term "orthogonal 3 by 3 matrix" and I couldn't find where that was defined. Does that mean that the column vectors are orthogonal or that the row vectors are orthogonal? Or are those two things equivalent?


Homework Equations





The Attempt at a Solution

 
  • #2
It just means it's a real unitary matrix.
 
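For reference, "unitary" unpacks as follows: a unitary matrix is a complex square matrix satisfying [tex]U^{\dagger}U = I[/tex], where [tex]U^{\dagger}[/tex] is the conjugate transpose. For a matrix with real entries the conjugate transpose is just the transpose, so the condition reduces to [tex]U^{T}U = I[/tex].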
  • #3
If A is a real matrix and A is orthogonal, then the rows of A form an orthonormal set as well as the columns.
 
  • #4
How do you prove that the rows (of a real n by n matrix) are orthogonal iff the columns are orthogonal?
 
  • #5
What's the consequence of a real n by n matrix being orthogonal?

(A^T)A = I

What does this say about the columns?
 
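One way to unpack that hint, writing [tex]u_1, \dots, u_n[/tex] for the columns of A: the (i,j) entry of [tex]A^{T}A[/tex] is

[tex](A^{T}A)_{ij} = \sum_{k} A_{ki}A_{kj} = u_i \cdot u_j[/tex]

so [tex]A^{T}A = I[/tex] says exactly that [tex]u_i \cdot u_j = \delta_{ij}[/tex], i.e. the columns are mutually orthogonal and each has length 1.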
  • #6
BryanP said:
What's the consequence of a real n by n matrix being orthogonal?

(A^T)A = I

What does this say about the columns?

That's orthonormal, not orthogonal.
 
  • #7
ehrenfest said:
That's orthonormal, not orthogonal.

Since it's an n x n matrix, it is a square matrix. It is easy to see that any square matrix with orthonormal columns is an orthogonal matrix. By definition of orthogonal matrix, it is a square "blank" matrix U such that [tex]U^{T}U=I[/tex]. What is "blank" and why? Then, show that the rows of U form an orthonormal basis of [tex]R^n[/tex]
 
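One way to finish that exercise: since U is square and [tex]U^{T}U = I[/tex], the matrix [tex]U^{T}[/tex] is a two-sided inverse of U, so [tex]UU^{T} = I[/tex] as well. The (i,j) entry of [tex]UU^{T}[/tex] is the dot product of rows i and j of U, so [tex]UU^{T} = I[/tex] says the rows are orthonormal too.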
  • #8
konthelion said:
Since it's an n x n matrix, it is a square matrix. It is easy to see that any square matrix with orthonormal columns is an orthogonal matrix. By definition of orthogonal matrix, it is a square "blank" matrix U such that [tex]U^{T}U=I[/tex]. What is "blank" and why? Then, show that the rows of U form an orthonormal basis of [tex]R^n[/tex]

But not every orthogonal matrix is orthonormal. It's not true that if U is orthogonal then U^T U = I. All we know is that U^T U is diagonal.
 
  • #9
True, but U is a square matrix. Your original question was "orthogonal 3x3 matrix".

A square matrix is orthogonal if and only if its columns form an orthonormal basis of [tex]R^n[/tex]; thus it must also be true that the rows of U form an orthonormal basis of [tex]R^n[/tex] (isn't this what you're trying to prove?)

Theorem: Let U be an n x n matrix.
1. U is orthogonal
2. The columns of U are an orthonormal set of vectors in [tex]R^n[/tex]
3. The rows of U are an orthonormal set of vectors in [tex]R^n[/tex]
 
  • #10
No, I think he's trying to prove that any square matrix with orthogonal columns must have orthogonal rows. (I'm not sure if this is true or not, but I haven't put too much thought into it.)
 
  • #11
konthelion said:
Theorem: Let U be an n x n matrix.
1. U is orthogonal
2. The columns of U are an orthonormal set of vectors in [tex]R^n[/tex]
3. The rows of U are an orthonormal set of vectors in [tex]R^n[/tex]

That theorem is false.

Let n = 2 and U =

[tex]
\left(\begin{array}{cc}
1 & 0 \\
0 & 2 \\
\end{array}\right)
[/tex]
Neither the second column nor the second row is normalized.
 
  • #12
Vid said:
No, I think he's trying to prove that any square matrix with orthogonal columns must have orthogonal rows. (I'm not sure if this is true or not, but I haven't put too much thought into it.)

Yes, that is what I am trying to prove.
 
  • #13
Your matrix isn't orthogonal since it has a determinant of 2.
 
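Equivalently, checking the definition directly for the matrix above: [tex]U^{T}U = \left(\begin{array}{cc} 1 & 0 \\ 0 & 4 \end{array}\right) \neq I[/tex], so U is not orthogonal in the [tex]U^{T}U = I[/tex] sense, even though its two columns are pairwise orthogonal.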
  • #14
If A is orthogonal then AA^T = I.

Let A be a real orthogonal 3x3 (square) matrix with rows u1 = (a1,a2,a3), u2 = (b1,b2,b3), u3 = (c1,c2,c3).

Write out the entries of AA^T = I (note that A is invertible, with A^{-1} = A^T).

You should find that a1^2+a2^2+a3^2 = 1, b1^2+b2^2+b3^2 = 1 and c1^2+c2^2+c3^2 = 1.

All the other entries give equations equal to zero, e.g. a1 b1 + a2 b2 + a3 b3 = 0.

This shows that u1 dot u1 = 1, u2 dot u2 = 1, u3 dot u3 = 1, and ui dot uj = 0 for i ≠ j.

Thus AA^T = I shows that the rows of A form an orthonormal set of vectors; the same computation with A^TA = I gives the corresponding statement for the columns.
 
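If you want to sanity-check this numerically, here is a minimal sketch (assuming NumPy is available; the rotation matrix below is just one example of a real orthogonal 3x3 matrix):

[code]
import numpy as np

# Example 3x3 orthogonal matrix: rotation by 0.3 rad about the z-axis.
t = 0.3
A = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])

I = np.eye(3)
print(np.allclose(A @ A.T, I))   # True: the rows form an orthonormal set
print(np.allclose(A.T @ A, I))   # True: the columns form an orthonormal set

# The same dot products written out row by row: 1 when i == j, 0 otherwise.
for i in range(3):
    for j in range(3):
        print(i, j, round(float(np.dot(A[i], A[j])), 10))
[/code]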
  • #15
http://en.wikipedia.org/wiki/Orthogonal_matrix

I am an idiot. In fact my book does define the term orthogonal and it is just like the Wikipedia page. I thought the definition was just that either the column vectors or the row vectors were pairwise orthogonal and I was trying to figure out which it was. Sorry.
 
  • #16
Whoops, for the theorem I wrote, what I meant to say was that "if one of the following is true, then the rest are true" for (1), (2), and (3).
 

Related to Orthogonal Matrix: Column or Row Vectors?

1. What is an orthogonal matrix?

An orthogonal matrix is a real square matrix whose columns (and, equivalently, whose rows) form an orthonormal set: the dot product of any two distinct columns or rows is zero, and each column or row has length 1. Equivalently, A^TA = AA^T = I.

2. What are the properties of an orthogonal matrix?

Some properties of an orthogonal matrix include: all columns and rows have a length of 1, the inverse of an orthogonal matrix is equal to its transpose, and multiplying two orthogonal matrices together will result in another orthogonal matrix.
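A quick numerical illustration of those properties (a sketch assuming NumPy; the two matrices are just arbitrary examples):

[code]
import numpy as np

Q1 = np.array([[0.0, -1.0],
               [1.0,  0.0]])   # 90-degree rotation
Q2 = np.array([[1.0,  0.0],
               [0.0, -1.0]])   # reflection across the x-axis

# The inverse of an orthogonal matrix equals its transpose.
print(np.allclose(np.linalg.inv(Q1), Q1.T))        # True

# The product of two orthogonal matrices is again orthogonal.
P = Q1 @ Q2
print(np.allclose(P.T @ P, np.eye(2)))             # True
[/code]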

3. How can an orthogonal matrix be used in linear algebra?

Orthogonal matrices are useful in linear algebra because they rotate and reflect vectors without changing lengths or the angles between vectors (they preserve dot products). They also appear in methods for solving systems of linear equations and for finding eigenvalues and eigenvectors, such as the QR decomposition.
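For example, a sketch of the length-preservation property (assuming NumPy; the rotation angle and vector are arbitrary):

[code]
import numpy as np

t = 1.2
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])   # rotation by t radians

v = np.array([3.0, 4.0])
w = Q @ v
print(np.linalg.norm(v), np.linalg.norm(w))   # both approximately 5.0: the length is unchanged
[/code]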

4. Can an orthogonal matrix have both column and row vectors?

Yes. The columns and rows of an orthogonal matrix play interchangeable roles: the transpose of an orthogonal matrix is again orthogonal, so the matrix remains orthogonal if its rows are treated as columns and vice versa.

5. How are orthogonal matrices related to orthonormal bases?

An orthonormal basis is a set of linearly independent vectors that are all orthogonal to each other and have a length of 1. Orthogonal matrices are closely related to orthonormal bases because the columns or rows of an orthogonal matrix can be used as an orthonormal basis for the vector space in which the matrix operates.
