
Orthogonal Matrix

  1. Dec 15, 2009 #1
    Is the definition of an orthogonal matrix:

    1. a matrix where all rows are orthonormal AND all columns are orthonormal

    OR

    2. a matrix where all rows are orthonormal OR all columns are orthonormal?

    In my textbook it says it is AND (case 1), but if that is true, there's a problem:
    Say we have a square matrix A, and we find that its eigenvalues are all distinct, so A is diagonalizable. We put the normalized eigenvectors of A as the columns of a matrix P, and (our prof told us) P becomes orthogonal and P^-1 = P^T. My question is: how did P become orthogonal straight away? By only normalizing its columns, how did we guarantee that its rows are also orthonormal?
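    To make the setup concrete, here is a small check I tried in plain Python (my own example, not from class: I picked the symmetric matrix A = [[2, 1], [1, 2]], whose eigenvalues 3 and 1 are distinct and whose eigenvectors (1, 1) and (1, -1) happen to be orthogonal):

```python
# Sketch: put the normalized eigenvectors of A = [[2, 1], [1, 2]]
# (eigenvalues 3 and 1, eigenvectors (1, 1) and (1, -1)) as the
# columns of P, then check that P^T P is the identity, i.e. that
# the columns of P are orthonormal.
import math

s = 1 / math.sqrt(2)
P = [[s, s],
     [s, -s]]  # columns are the normalized eigenvectors


def transpose(M):
    return [list(row) for row in zip(*M)]


def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]


PtP = matmul(transpose(P), P)
print(PtP)  # approximately the 2x2 identity matrix
```

    Numerically P^T P comes out as the identity, so P^-1 = P^T here; my question is why the rows came out orthonormal too.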
     
  3. Dec 15, 2009 #2
    It turns out that the rows of a square matrix are orthonormal if and only if the columns are orthonormal. Another way to express the condition that all columns are orthonormal is that [tex]A^T A = I[/tex] (think about why this is). Then we see that if [tex]x \in \mathbb{R}^n[/tex], [tex]\parallel x \parallel^2 = x^T x = x^T ( A^T A ) x = ( A x )^T ( A x ) = \parallel A x \parallel^2[/tex], and therefore A is injective. Since we are working with finite-dimensional spaces, A must also be surjective, so for every [tex]v \in \mathbb{R}^n[/tex] there exists [tex]w \in \mathbb{R}^n[/tex] with v = Aw, and therefore [tex]A A^T v = A A^T A w = A w = v[/tex], so [tex]A A^T = I[/tex] as well. You can check that this implies that the rows of A are orthonormal. The proof of the converse is similar.
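    You can also see the equivalence numerically. A quick sketch in plain Python (my own example: a 3x3 rotation about the z-axis, which has orthonormal columns by construction):

```python
# Sketch: for a matrix A with orthonormal columns (A^T A = I),
# check that the rows are orthonormal too (A A^T = I).
# Holds in finite dimensions, as argued above.
import math

t = 0.7  # an arbitrary rotation angle
A = [[math.cos(t), -math.sin(t), 0],
     [math.sin(t),  math.cos(t), 0],
     [0,            0,           1]]  # rotation about the z-axis


def transpose(M):
    return [list(row) for row in zip(*M)]


def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]


def is_identity(M, tol=1e-12):
    n = len(M)
    return all(abs(M[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))


print(is_identity(matmul(transpose(A), A)))  # True: columns orthonormal
print(is_identity(matmul(A, transpose(A))))  # True: rows orthonormal
```

    Of course this only checks one matrix; the proof above is what shows it in general.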

    Note that this argument relies on the finite-dimensionality of our vector space. If you move up to infinite dimensional spaces, there may be transforms T with [tex]T^*T = I[/tex] but [tex]T T^* \neq I[/tex]. This type of behavior is what makes functional analysis and operator algebras fun! :smile:
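    For a concrete instance of this: the standard example is the right shift operator on [tex]\ell^2[/tex], defined by [tex]S(x_1, x_2, x_3, \ldots) = (0, x_1, x_2, \ldots)[/tex]. Its adjoint is the left shift, [tex]S^*(x_1, x_2, x_3, \ldots) = (x_2, x_3, x_4, \ldots)[/tex], so [tex]S^* S = I[/tex], but [tex]S S^*(x_1, x_2, x_3, \ldots) = (0, x_2, x_3, \ldots)[/tex], which kills the first coordinate, so [tex]S S^* \neq I[/tex].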
     
    Last edited: Dec 15, 2009
  4. Dec 15, 2009 #3
    There's actually an easier way to see that [tex]A^T A = I[/tex] implies A is injective; I just tend to think in terms of isometries like I wrote. If v is such that Av = 0, then [tex]0 = A^T 0 = A^T A v = I v = v[/tex], so A is injective. Some may prefer this purely algebraic argument.
     
  5. Dec 15, 2009 #4
    Ah I see, thank you rochfor1, the (A^T)(A) = I thing makes a lot of sense :)
     