
Homework Help: Prove mutually non-zero orthogonal vectors are linearly independent

  1. Jul 17, 2013 #1
    1. The problem statement, all variables and given/known data

    Let a1, a2, ..., an be vectors in Rn and assume that they are mutually perpendicular and that none of them equals 0. Prove that they are linearly independent.

    2. Relevant equations

    3. The attempt at a solution

    Consider βiai + βjaj ≠ 0 for all i, j

    => βiai + βjaj + βkak ≠ 0 for all i, j, k.

    Therefore β1a1 + β2a2 + ... + βnan ≠ 0 (linearly independent)
  3. Jul 17, 2013 #2
    I do not understand your attempt.
  4. Jul 17, 2013 #3
    I first start off by adding 2 vectors and showing the sum is non-zero. Then I add a third one, which is also non-zero. Then I add everything to show it is also non-zero.
  5. Jul 17, 2013 #4
    I do not see how you show that any of those sums is non-zero. You just state that it is. You might as well state the end result immediately, it will be just as (un)justified.
  6. Jul 17, 2013 #5


    Staff: Mentor

    This is NOT the definition of linear independence. The equation β1a1 + β2a2 + ... + βnan = 0 appears in the definition of linear independence and in the definition of linear dependence.

    How then do we distinguish a set of vectors that is linearly independent from one that is linearly dependent?

    What you showed above, with the ≠ symbol, doesn't appear in either definition.
  7. Jul 18, 2013 #6
    I think the first step is to show that the sum of any 2 of the vectors is non-zero. But since all the vectors are mutually orthogonal, the sum of the two can't be zero?

    Quick proof:

    Assume βiai + βjaj = 0

    This implies that ai = -(βj/βi)aj is parallel to aj.
    So any two mutually orthogonal vectors are linearly independent. By mathematical induction, Σβiai ≠ 0.
  8. Jul 18, 2013 #7


    Science Advisor
    Gold Member

    Consider the inner product of the vector β1a1+β2a2+...+βnan with the vector ai and show that it is zero only if βi=0.

    Therefore β1a1+β2a2+...+βnan = 0 iff all βi are zero.
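
    Writing out the hinted computation explicitly (this is only a restatement of the suggestion above, in the thread's notation):

    [tex]\left\langle \sum_{j=1}^{n}\beta_j a_j,\ a_i\right\rangle = \sum_{j=1}^{n}\beta_j\langle a_j, a_i\rangle = \beta_i\langle a_i, a_i\rangle = \beta_i\|a_i\|^2,[/tex]

    since [itex]\langle a_j, a_i\rangle = 0[/itex] for [itex]j \neq i[/itex] by mutual orthogonality. As [itex]a_i \neq 0[/itex], we have [itex]\|a_i\|^2 > 0[/itex], so this inner product is zero only if [itex]\beta_i = 0[/itex].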
  9. Jul 18, 2013 #8


    Science Advisor

    You have not said anything about the [itex]\beta_i[/itex] not all being 0. Obviously if [itex]\beta_i= 0[/itex] for all i, that sum is 0.
  10. Jul 18, 2013 #9
    You have, almost, proved the base of induction. But you still have to prove the ##n \rightarrow n + 1## induction step.
  11. Jul 19, 2013 #10
    I don't think induction works here. Even though the sum of any 2 vectors ≠ 0, it doesn't mean that adding a third vector won't make it zero.

    Assume the sum of 3 vectors = 0; this implies the third vector added is parallel to the sum of the other two. (No contradiction, as the statement says that the vectors are mutually orthogonal, and nothing is said about the orthogonality between a vector and a sum of vectors.)

    I think the right way is to take the inner product of any vector with respect to the entire sum.
  12. Jul 19, 2013 #11
    Well, this can in fact be proved, too, but this is probably more difficult than what you have to do.

    Why does that have to be any vector?
  13. Jul 20, 2013 #12
    1. Add any 2 vectors, show that they are non-zero.

    2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

    3. Carry on process till last vector.

    4. QED
  14. Jul 20, 2013 #13
    This does not prove linear independence. Use the definition of the latter.
  15. Jul 20, 2013 #14


    Science Advisor

    Unscientific, the basic problem appears to be that you have an incorrect idea of what the definition of "independent" is!

    The definition is:
    A set of vectors [tex]\{v_1, v_2, ..., v_n\}[/tex] is "independent" if and only if the only way [tex]\beta_1v_1+ \beta_2v_2+ ... + \beta_nv_n= 0[/tex] can hold is if [tex]\beta_1= \beta_2= ...= \beta_n= 0[/tex].

    It is NOT just a matter of adding vectors and saying the sum is not 0. It is that the only way a linear combination of them can be 0 is if all coefficients are 0. That is equivalent to the statement that no one of the vectors can be written as a linear combination of the others.

    So a proof that a set of vectors is linearly independent starts with "Suppose [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n= 0[/tex]" and shows that every one of the βs is equal to 0. Here the only condition on the vectors is that they are "mutually perpendicular", and nowhere have you used that condition.

    What would you get if you took the dot product of [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n[/tex] with each of [tex]v_1[/tex], [tex]v_2[/tex], ..., [tex]v_n[/tex] in turn?
  16. Jul 21, 2013 #15
    Yup, sorry for not being precise. By "add any 2 vectors" I meant adding βiai + βjaj.
  17. Jul 21, 2013 #16

    1. Add any 2 vectors, show that they are non-zero.

    βiai + βjaj can't be zero, otherwise

    ai = -(βj/βi)aj, implying they are parallel. Contradiction, as they are orthogonal.

    2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

    βiai + βjaj + βkak.

    Taking the inner product of this sum with ak: the first two inner products give 0, due to orthogonality. The last one, which is of ak with itself, gives non-zero due to positivity of the norm.

    3. Carry on process till last vector.

    4. QED (assuming coefficients are non-zero)

    I hope this is clear enough..thanks for the help guys!
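
    For readers who want a sanity check, the key step of this argument — dotting a linear combination with each vector kills every cross term — can be verified numerically. A minimal pure-Python sketch; the particular vectors and coefficients below are illustrative assumptions, not taken from the thread:

    ```python
    from fractions import Fraction

    # Illustrative mutually orthogonal, non-zero vectors in R^3 (assumed for demonstration).
    a = [
        (Fraction(2), Fraction(0), Fraction(0)),
        (Fraction(0), Fraction(3), Fraction(0)),
        (Fraction(0), Fraction(0), Fraction(5)),
    ]

    def dot(u, v):
        """Standard inner product on R^n."""
        return sum(ui * vi for ui, vi in zip(u, v))

    # Check the hypotheses: pairwise orthogonal, and none of the vectors is zero.
    assert all(dot(a[i], a[j]) == 0 for i in range(3) for j in range(3) if i != j)
    assert all(dot(v, v) > 0 for v in a)

    # Key step of the proof: dotting s = sum_j beta_j * a_j with a_i
    # kills every term except beta_i * ||a_i||^2.
    betas = (Fraction(7), Fraction(-1), Fraction(4))
    s = tuple(sum(b * v[k] for b, v in zip(betas, a)) for k in range(3))
    for i in range(3):
        assert dot(s, a[i]) == betas[i] * dot(a[i], a[i])

    # Hence s = 0 would force every beta_i = 0, i.e. the set is linearly independent.
    ```

    Exact `Fraction` arithmetic is used so the orthogonality checks are not clouded by floating-point round-off.
    
    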
  18. Jul 21, 2013 #17
    What if ##\beta_i = \beta_j = 0##? Doesn't the expression give ##0##?

    What if ##\beta_i=0##, won't you divide by ##0##?
    Why are parallel vectors not orthogonal?
  19. Jul 21, 2013 #18
    I'm assuming all coefficients are non-zero.
  20. Jul 21, 2013 #19
    Well, you need to say this. And why can you assume this anyway?
  21. Jul 21, 2013 #20
    Because I am choosing them to be non-zero, in order to work towards the proof of linear independence. No point choosing any of them to be zero. (I thought this was straightforward enough not to say..)
  22. Jul 21, 2013 #21
    That's not what linear independence states. It says that ##\beta_i\mathbf{v}_i + \beta_j\mathbf{v}_j \neq \mathbf{0}## whenever ##\beta_i## and ##\beta_j## are not both zero. So it can certainly happen that one of the ##\beta_i## is zero.
  23. Jul 22, 2013 #22
    Yes, I get what you mean. But the question wants to show the linear independence of all the vectors from a1 to an! If you let any of the coefficients be 0, then you are at most showing linear independence of all the vectors except the one whose coefficient you set to 0.
  24. Jul 22, 2013 #23
    You are required to prove that the entire set of vectors is linearly independent. By definition, you must prove that a linear combination of them is zero only when all the coefficients are zero. If you prove any other statement, then you also need a proof that your statement implies linear independence in the sense of the original definition.
  25. Jul 22, 2013 #24
    That's right, thanks for putting it in a more elegant way!