
Question about Gram-Schmidt orthogonalization

  1. Jul 19, 2009 #1
    Hello everyone,

    I have a query regarding Gram-Schmidt orthogonalization:

    Say I have 3 independent vectors u, v, w, and I use the Gram-Schmidt scheme to get vectors U, V, W that are orthonormal to each other.

    So U, V, W are orthogonal to each other.

    Is it also true that V is orthogonal to u (lowercase u),
    and that W is orthogonal to both u and v?

    In my mind, I am quite convinced it is so, but what is the exact mathematical reason for this? I think that W is orthogonal to the whole plane spanned by u and v, and that V is orthogonal to the line spanned by u. I am just a tad unsure why this would be.

    I am trying to understand why, when we do QR factorization, we get an upper triangular matrix, and that seems to depend on the above statement being true.

    Many thanks,


    Edit: I thought about this a bit more and have the following explanation. Please let me know if it sounds plausible

    If we consider the vectors U and V, they form an orthonormal basis for a 2D subspace of a higher-dimensional space. The vector W lies in the orthogonal complement of this subspace, and anything in the orthogonal complement is orthogonal to every vector in the subspace itself. So if u and v lie in that same subspace (i.e., span(u, v) = span(U, V)), then W is orthogonal to u and v as well.

    Does this make sense?
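
    The claim is easy to check numerically. Below is a minimal sketch (assuming NumPy is available; the example vectors are arbitrary) that runs Gram-Schmidt on three vectors, verifies that V ⊥ u and W ⊥ u, v, and shows why R = Q^T A in a QR factorization comes out upper triangular:

```python
import numpy as np

# Three independent vectors as the columns of A (arbitrary example values)
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
u, v, w = A[:, 0], A[:, 1], A[:, 2]

# Classical Gram-Schmidt
U = u / np.linalg.norm(u)
v_prime = v - (v @ U) * U                 # remove the component along U
V = v_prime / np.linalg.norm(v_prime)
w_prime = w - (w @ U) * U - (w @ V) * V   # remove components along U and V
W = w_prime / np.linalg.norm(w_prime)

# V and W are orthogonal to the *original* vectors, not just to U, V, W
print(np.isclose(V @ u, 0.0))  # True
print(np.isclose(W @ u, 0.0))  # True
print(np.isclose(W @ v, 0.0))  # True

# This is exactly why R = Q^T A is upper triangular: entry (i, j) of R is
# the inner product of U_i with u_j, which vanishes whenever i > j.
Q = np.column_stack([U, V, W])
R = Q.T @ A
print(np.allclose(R, np.triu(R)))  # True
```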
    Last edited: Jul 19, 2009
  3. Jul 19, 2009 #2


    Staff Emeritus
    Science Advisor
    Gold Member

    Yes, it is true. When you look at the process you can see two things going on (I'm assuming you keep the vectors in their original order u, v, w):

    1) You orthogonalize your current vector by removing the components in the space spanned by your previous (now orthonormal) vectors
    2) You normalize your vector

    Ignore step 2. Here's your job:
    Prove that if you have a linearly independent set of vectors u1, u2, ..., uk and you apply Gram-Schmidt to get U1, ..., Uk, then the spaces spanned by u1, ..., ur and by U1, ..., Ur are the same for each r (use induction; the base case is trivial).

    Then, since the inner product of Ur+1 with any element of the space spanned by U1, ..., Ur must come out to 0 (it is orthogonal to each of those vectors), Ur+1 must be orthogonal to u1, ..., ur, as you suspected.

    If you're really and truly only interested in the case of three vectors, you don't even need induction and can just prove the above directly, but it's not as satisfying.
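
    The induction claim, that span(u1, ..., ur) = span(U1, ..., Ur) for every r, can also be sanity-checked numerically. A sketch assuming NumPy, using random vectors (which are linearly independent with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 4, 6
A = rng.standard_normal((n, k))   # columns u1, ..., uk

# Gram-Schmidt, collecting the orthonormal vectors as columns of Q
Q = np.zeros((n, k))
for j in range(k):
    # subtract the components of u_j along the already-built U_1, ..., U_{j-1}
    q = A[:, j] - Q[:, :j] @ (Q[:, :j].T @ A[:, j])
    Q[:, j] = q / np.linalg.norm(q)

# For each r, u_r lies in span(U1, ..., Ur): projecting onto that span
# reproduces u_r up to rounding error
for r in range(1, k + 1):
    P = Q[:, :r] @ Q[:, :r].T          # orthogonal projector onto span(U1..Ur)
    assert np.allclose(P @ A[:, r - 1], A[:, r - 1])

# ...and therefore U_{r+1} is orthogonal to each earlier u_i, as claimed
for r in range(1, k):
    for i in range(r):
        assert abs(Q[:, r] @ A[:, i]) < 1e-10
```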
  4. Jul 19, 2009 #3


    Science Advisor

    The Gram-Schmidt process starts with vector u and finds U that has length 1 and is in the same direction as u. It then, given vector v, finds v' that is orthogonal to U, and then V that has length 1 and is in the same direction as v', not v. Finally, given vector w, it finds w' that is orthogonal to both U and V, and then W that has length 1 and is in the same direction as w', not w. So U is in the same direction as u, but V and W are not necessarily in the same directions as v and w.

    Of course, you could do the Gram-Schmidt orthogonalization process starting with v or w instead of u which would change the result.
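
    The order dependence mentioned above is easy to see numerically. A small sketch assuming NumPy (the `gram_schmidt` helper and the example vectors are mine, for illustration only):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of vectors in the order given."""
    basis = []
    for x in vectors:
        for q in basis:
            x = x - (x @ q) * q        # subtract the component along q
        basis.append(x / np.linalg.norm(x))
    return basis

u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, 0.0, 1.0])
w = np.array([0.0, 1.0, 1.0])

U1, V1, W1 = gram_schmidt([u, v, w])   # start with u: U1 is parallel to u
U2, V2, W2 = gram_schmidt([v, u, w])   # start with v: U2 is parallel to v

print(np.allclose(U1, u / np.linalg.norm(u)))  # True: first vector keeps its direction
print(np.allclose(U1, U2))                      # False: changing the order changes the basis
```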