
Partitioned Orthogonal Matrix

  1. Dec 29, 2009 #1
    "Partitioned Orthogonal Matrix"

    I was reading the following theorem in the Matrix Computations book by Golub and Van Loan:

    If [tex]V_1 \in R^{n\times r}[/tex] has orthonormal columns, then there exists [tex]V_2 \in R^{n\times (n-r)}[/tex] such that,
    [tex] V = [V_1 \; V_2] [/tex] is orthogonal.
    Note that [tex]ran(V_1)^{\bot}=ran(V_2)[/tex]

    It also says that the proof is a standard result from introductory linear algebra.

    So I picked up my copy of Introduction to Linear Algebra by Strang and did not find this.
    I then looked in the Matrix Analysis book by Carl D. Meyer, and he mentions this under the name "partitioned orthogonal matrix", though I did not find a proof there either.

    Is there a proper name for this "decomposition"?
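    As a concrete illustration of the theorem, one way to actually construct such a [tex]V_2[/tex] numerically is a full ("complete") QR factorization of [tex]V_1[/tex]: the first r columns of the resulting Q span [tex]ran(V_1)[/tex], so the remaining n-r columns span the orthogonal complement. A minimal NumPy sketch (the variable names here are mine, not from Golub and Van Loan):

```python
import numpy as np

# Build a V1 with orthonormal columns (here via a thin QR of a random matrix).
rng = np.random.default_rng(0)
n, r = 5, 2
V1, _ = np.linalg.qr(rng.standard_normal((n, r)))  # n x r, orthonormal columns

# A full QR of V1 gives an n x n orthogonal Q whose first r columns
# span ran(V1); the remaining n - r columns span ran(V1)^perp.
Q, _ = np.linalg.qr(V1, mode='complete')
V2 = Q[:, r:]                       # n x (n - r)
V = np.hstack([V1, V2])             # candidate orthogonal matrix [V1 V2]

assert np.allclose(V.T @ V, np.eye(n))   # V is orthogonal
assert np.allclose(V1.T @ V2, 0)         # ran(V2) is orthogonal to ran(V1)
```

This checks both claims in the theorem: [tex]V[/tex] is orthogonal and [tex]ran(V_2) = ran(V_1)^{\bot}[/tex].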

  2. Dec 29, 2009 #2
    Re: "Partitioned Orthogonal Matrix"

    This may be easier to see if you rephrase the problem. The columns of [tex]V_1[/tex] form an orthonormal basis for an r-dimensional subspace of [tex]\mathbb{R}^n[/tex], and it is a standard result from the theory of Hilbert spaces that you may extend an orthonormal basis for a subspace to one for the whole space. The functional analyst in me would use Zorn's lemma to show that every orthonormal set (the columns of [tex]V_1[/tex]) is contained in an orthonormal basis of the whole space, but this is overkill in the finite-dimensional case. Here, I'd simply find a basis of [tex]\mathbb{R}^n[/tex] that contains the columns of [tex]V_1[/tex] (using your favorite argument) and then apply the Gram-Schmidt process. Note that if you let the columns of [tex]V_1[/tex] be the first r vectors in the Gram-Schmidt process, they will remain unchanged (because they are already orthonormal).
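    The recipe above (extend to a spanning set, then run Gram-Schmidt with the columns of [tex]V_1[/tex] first) can be sketched in NumPy. This is classical Gram-Schmidt using the standard basis vectors as candidates; the helper name complete_orthonormal is mine:

```python
import numpy as np

def complete_orthonormal(V1, tol=1e-10):
    """Extend the orthonormal columns of V1 (n x r) to an
    orthogonal n x n matrix V = [V1 V2] via Gram-Schmidt."""
    n, r = V1.shape
    cols = list(V1.T)               # start with the columns of V1 (unchanged)
    for e in np.eye(n):             # candidate vectors: the standard basis
        v = e.copy()
        for q in cols:              # subtract components along existing columns
            v -= (q @ v) * q
        norm = np.linalg.norm(v)
        if norm > tol:              # keep only vectors outside the current span
            cols.append(v / norm)
        if len(cols) == n:
            break
    return np.column_stack(cols)

# Example: two orthonormal columns in R^4.
V1 = np.array([[1.0,  1.0],
               [1.0, -1.0],
               [0.0,  0.0],
               [0.0,  0.0]]) / np.sqrt(2)
V = complete_orthonormal(V1)

assert np.allclose(V @ V.T, np.eye(4))   # V is orthogonal
assert np.allclose(V[:, :2], V1)         # the columns of V1 survive unchanged
```

    One caveat for serious use: classical Gram-Schmidt can lose orthogonality in floating point for ill-conditioned inputs, so a production version would reorthogonalize or use a Householder-based QR instead.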