# Homework Help: Prove mutually non-zero orthogonal vectors are linearly independent

1. Jul 17, 2013

### unscientific

1. The problem statement, all variables and given/known data

Let a1, a2, ..., an be vectors in Rn and assume that they are mutually perpendicular and that none of them equals 0. Prove that they are linearly independent.

2. Relevant equations

3. The attempt at a solution

Consider βiai + βjaj ≠ 0 for all i, j

=> βiai + βjaj + βkak ≠ 0 for all i, j, k.

Therefore β1a1 + β2a2 + ... + βnan ≠ 0 (Linearly independent)

2. Jul 17, 2013

### voko

I do not understand your attempt.

3. Jul 17, 2013

### unscientific

I first start off by adding 2 vectors and showing that the sum is non-zero. Then I add a third one, and the sum is also non-zero. Then I add everything to show it is also non-zero.

4. Jul 17, 2013

### voko

I do not see how you show that any of those sums is non-zero. You just state that it is. You might as well state the end result immediately, it will be just as (un)justified.

5. Jul 17, 2013

### Staff: Mentor

This is NOT the definition of linear independence. The equation β1a1 + β2a2 + ... + βnan = 0 appears in the definition for linear independence, and in the definition for linear dependence.

How then do we distinguish a set of vectors that is linearly independent from one that is linearly dependent?

What you showed above, with the ≠ symbol, doesn't appear in either definition.

6. Jul 18, 2013

### unscientific

I think the first step is to show that the sum of any 2 vectors is non-zero. But since all the vectors are mutually orthogonal, the sum of the two can't be zero?

Quick proof:

Assume βiai + βjaj = 0

This implies that ai = -(βj/βi)aj, i.e. ai is parallel to aj.
($\Rightarrow\Leftarrow$)
So any two mutually orthogonal vectors are linearly independent. By mathematical induction, $\sum_i \beta_i a_i \neq 0$.

7. Jul 18, 2013

### hilbert2

Consider the inner product of the vector β1a1+β2a2+...+βnan with the vector ai and show that it is zero only if βi=0.

Therefore β1a1+β2a2+...+βnan = 0 iff all βi are zero.
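Spelled out, the hint uses bilinearity of the inner product plus orthogonality (a sketch, in the thread's notation):

$$\left\langle \beta_1a_1 + \beta_2a_2 + ... + \beta_na_n,\; a_i\right\rangle = \sum_{k=1}^{n}\beta_k\langle a_k, a_i\rangle = \beta_i\langle a_i, a_i\rangle = \beta_i\|a_i\|^2$$

since $\langle a_k, a_i\rangle = 0$ whenever $k \neq i$. If the sum equals the zero vector, the left side is $0$; since $a_i \neq 0$, $\|a_i\|^2 > 0$, so $\beta_i = 0$.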

8. Jul 18, 2013

### HallsofIvy

You have not said anything about the $\beta_i$ not all being 0. Obviously if $\beta_i= 0$ for all i, that sum is 0.

9. Jul 18, 2013

### voko

You have, almost, proved the base of induction. But you still have to prove the $n \rightarrow n + 1$ induction step.

10. Jul 19, 2013

### unscientific

I don't think induction works here. Even though the sum of any 2 vectors is non-zero, it doesn't mean that adding a third vector won't make it zero.

Assume the sum of 3 vectors is 0; this implies the third vector is parallel to the sum of the other two. (No contradiction, as the statement says that the vectors are mutually orthogonal, and nothing is said about the orthogonality between a vector and a sum of vectors.)

I think the right way is to take the inner product of any vector with respect to the entire sum.

11. Jul 19, 2013

### voko

Well, this can in fact be proved, too, but this is probably more difficult than what you have to do.

Why does that have to be any vector?

12. Jul 20, 2013

### unscientific

1. Add any 2 vectors, show that they are non-zero.

2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

3. Carry on process till last vector.

4. QED

13. Jul 20, 2013

### voko

This does not prove linear independence. Use the definition of the latter.

14. Jul 20, 2013

### HallsofIvy

Unscientific, the basic problem appears to be that you have an incorrect idea of what the definition of "independent" is!

The definition is this:
A set of vectors $$\{v_1, v_2, ..., v_n\}$$ is "independent" if and only if the only way $$\beta_1v_1+ \beta_2v_2+ ... + \beta_nv_n= 0$$ can hold is if $$\beta_1= \beta_2= ...= \beta_n= 0$$.

It is NOT just a matter of adding vectors and saying the sum is not 0. It is that the only way a linear combination of them can be 0 is if all coefficients are 0. That is equivalent to the statement that no one of the vectors can be written as a linear combination of the others.

So to prove a set of vectors is linearly independent, start with "Suppose $$\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n= 0$$" and show that every one of the $\beta$s is equal to 0. Here the only condition on the vectors is that they are "mutually perpendicular", and nowhere have you used that condition.

What would you get if you took the dot product of $$\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n$$ with each of $$v_1$$, $$v_2$$, ..., $$v_n$$ in turn?
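As a numerical sanity check of that dot-product step, here is a minimal sketch in Python with NumPy (the specific vectors and coefficients are illustrative, not part of the problem):

```python
import numpy as np

# Three mutually orthogonal, nonzero vectors in R^3 (illustrative choice)
a = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 2.0, 0.0]),
     np.array([0.0, 0.0, 3.0])]

# An arbitrary linear combination s = beta_1*a_1 + beta_2*a_2 + beta_3*a_3
betas = [0.5, -1.0, 2.0]
s = sum(b * v for b, v in zip(betas, a))

# Dotting s with a_i kills every term except beta_i * ||a_i||^2,
# because <a_i, a_j> = 0 for i != j.
for i, v in enumerate(a):
    assert np.isclose(np.dot(s, v), betas[i] * np.dot(v, v))

# So s can be the zero vector only if every beta_i * ||a_i||^2 is 0,
# i.e. only if every beta_i is 0 (since the a_i are nonzero).
```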

15. Jul 21, 2013

### unscientific

Yup, sorry for not being precise. What I meant by "add any 2 vectors" is adding βiai + βjaj.

16. Jul 21, 2013

### unscientific

1. Add any 2 vectors, show that they are non-zero.

βiai + βjaj can't be zero, otherwise

ai = -(βj/βi)aj, implying they are parallel. Contradiction, as they are orthogonal.

2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

βiai + βjaj + βkak.

Taking the inner product, the first two inner products give 0, due to orthogonality. The last one, which is with itself, gives non-zero due to positivity of the norm.

3. Carry on process till last vector.

4. QED (assuming coefficients are non-zero)

I hope this is clear enough... thanks for the help guys!

17. Jul 21, 2013

### micromass

What if $\beta_i = \beta_j = 0$. Doesn't the expression give $0$?

What if $\beta_i=0$, won't you divide by $0$?
Why are parallel vectors not orthogonal?

18. Jul 21, 2013

### unscientific

I'm assuming all coefficients are non-zero.

19. Jul 21, 2013

### micromass

Well, you need to say this. And why can you assume this anyway?

20. Jul 21, 2013

### unscientific

Because I am choosing them to be non-zero, in order to work towards the proof of linear independence. No point choosing any of them to be zero. (I thought this was straightforward enough not to say..)

21. Jul 21, 2013

### micromass

That's not what linear independence states. It says that $\beta_i\mathbf{v}_i + \beta_j\mathbf{v}_j \neq \mathbf{0}$ whenever $\beta_i$ and $\beta_j$ are not both zero. So it can certainly happen that one of the $\beta_i$ is zero.

22. Jul 22, 2013

### unscientific

Yes, I get what you mean. But the question wants us to show the linear independence of all the vectors from a1 to an! If you let any of the coefficients be 0, then you are at most showing the linear independence of all the vectors except the one whose coefficient you set to 0.

23. Jul 22, 2013

### voko

You are required to prove that the entire set of vectors is linearly independent. By definition, you must prove that a linear combination of them is zero only when all the coefficients are zero. If you prove any other statement, then you also need a proof that your statement implies linear independence per the original definition.
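Assembled against that definition, the whole proof is short (a sketch, in the thread's notation): suppose $\beta_1a_1 + \beta_2a_2 + ... + \beta_na_n = 0$. For each fixed $i$, take the inner product of both sides with $a_i$:

$$0 = \langle 0, a_i\rangle = \left\langle \sum_{k=1}^{n}\beta_k a_k,\; a_i\right\rangle = \beta_i\|a_i\|^2.$$

Since $a_i \neq 0$, we have $\|a_i\|^2 > 0$, so $\beta_i = 0$. As $i$ was arbitrary, all the coefficients vanish, which is exactly the definition of linear independence.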

24. Jul 22, 2013

### unscientific

That's right, thanks for putting it in a more elegant way!