Prove mutually non-zero orthogonal vectors are linearly independent


Homework Help Overview

The problem involves proving that a set of mutually orthogonal, non-zero vectors in Rn is linearly independent. The original poster presents an attempted solution but has difficulty justifying the reasoning and the definitions related to linear independence.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss the definition of linear independence and question the validity of the original poster's reasoning. Some suggest showing that the sum of any two vectors is non-zero, while others emphasize the need for a rigorous proof involving the inner product.

Discussion Status

The discussion is ongoing, with participants exploring different interpretations of linear independence and the implications of mutual orthogonality. Some have provided guidance on how to approach the proof, particularly regarding the use of inner products and the necessity of demonstrating that coefficients must be zero for the linear combination to equal zero.

Contextual Notes

Participants express confusion over the original poster's approach and the definitions being used. There is a recognition that the proof may require a more formal structure, particularly in relation to the definitions of linear dependence and independence.

unscientific

Homework Statement



Let a1, a2, ..., an be vectors in Rn and assume that they are mutually perpendicular and none of them equals 0. Prove that they are linearly independent.


Homework Equations





The Attempt at a Solution



Consider βiai + βjaj ≠ 0 for all i, j

=> βiai + βjaj + βkak ≠ 0 for all i, j, k.

Therefore β1a1 + β2a2 + ... + βnan ≠ 0 (Linearly independent)
 
I do not understand your attempt.
 
voko said:
I do not understand your attempt.

I first start off with adding 2 vectors, and showing they are non-zero. Then I add a third one, which is also non-zero. Then I add everything to show it is also non-zero.
 
unscientific said:
I first start off with adding 2 vectors, and showing they are non-zero. Then I add a third one, which is also non-zero. Then I add everything to show it is also non-zero.

I do not see how you show that any of those sums is non-zero. You just state that it is. You might as well state the end result immediately, it will be just as (un)justified.
 
unscientific said:
Therefore β1a1 + β2a2 + ... + βnan ≠ 0 (Linearly independent)
This is NOT the definition of linear independence. The equation β1a1 + β2a2 + ... + βnan = 0 appears in the definition for linear independence, and in the definition for linear dependence.

How then do we distinguish between a set of vectors that is linearly independent from one that is linearly dependent?

What you showed above, with the ≠ symbol, doesn't appear in either definition.
 
Mark44 said:
This is NOT the definition of linear independence. The equation β1a1 + β2a2 + ... + βnan = 0 appears in the definition for linear independence, and in the definition for linear dependence.

How then do we distinguish between a set of vectors that is linearly independent from one that is linearly dependent?

What you showed above, with the ≠ symbol, doesn't appear in either definition.

I think the first step is to show that the vector sum of any 2 vectors is non-zero. But since all the vectors are mutually orthogonal, the sum of the two can't be zero?

Quick proof:

Assume βiai + βjaj = 0

This implies that ai = -(βj/βi)aj is parallel to aj.
(⇒⇐)
So any two mutually orthogonal vectors are linearly independent. By mathematical induction, Σβiai ≠ 0.
 
Consider the inner product of vector β1a1+β2a2+...+βnan with vector ai and show that it is zero only if βi=0.

Therefore β1a1+β2a2+...+βnan = 0 iff all βi are zero.
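
Spelled out (a sketch, using only the given facts that ai·aj = 0 for i ≠ j and that no ai is 0), that inner product is

[tex](\beta_1 a_1 + \beta_2 a_2 + \dots + \beta_n a_n)\cdot a_i = \beta_1 (a_1\cdot a_i) + \dots + \beta_n (a_n\cdot a_i) = \beta_i (a_i\cdot a_i) = \beta_i \|a_i\|^2,[/tex]

and ##\|a_i\|^2 > 0## because ##a_i \neq 0##, so the inner product is zero only if ##\beta_i = 0##.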
 
unscientific said:
I think the first step is to show that the vector sum of any 2 vectors is non-zero. But since all the vectors are mutually orthogonal, the sum of the two can't be zero?

Quick proof:

Assume βiai + βjaj = 0

This implies that ai = -(βj/βi)aj is parallel to aj.
(⇒⇐)
So any two mutually orthogonal vectors are linearly independent. By mathematical induction, Σβiai ≠ 0.
You have not said anything about the [itex]\beta_i[/itex] not all being 0. Obviously if [itex]\beta_i= 0[/itex] for all i, that sum is 0.
 
By mathematical induction, Σβiai ≠ 0.

You have, almost, proved the base of induction. But you still have to prove the ##n \rightarrow n + 1## induction step.
 
  • #10
voko said:
You have, almost, proved the base of induction. But you still have to prove the ##n \rightarrow n + 1## induction step.

I don't think induction works here. Even though the sum of any 2 vectors ≠ 0, it doesn't mean that adding a third vector won't make it zero.

Assume the sum of 3 vectors = 0; this implies the third vector added is parallel to the sum of the other two (no contradiction, as the statement says that the vectors are mutually orthogonal and nothing is said about the orthogonality between a vector and a sum of vectors).

I think the right way is to take the inner product of any vector with respect to the entire sum.
 
  • #11
unscientific said:
I don't think induction works here. Even though the sum of any 2 vectors ≠ 0, it doesn't mean that adding a third vector won't make it zero.

Assume the sum of 3 vectors = 0; this implies the third vector added is parallel to the sum of the other two (no contradiction, as the statement says that the vectors are mutually orthogonal and nothing is said about the orthogonality between a vector and a sum of vectors).

Well, this can in fact be proved, too, but this is probably more difficult than what you have to do.

I think the right way is to take the inner product of any vector with respect to the entire sum.

Why does that have to be any vector?
 
  • #12
voko said:
Well, this can in fact be proved, too, but this is probably more difficult than what you have to do.



Why does that have to be any vector?

1. Add any 2 vectors, show that they are non-zero.

2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

3. Carry on process till last vector.

4. QED
 
  • #13
This does not prove linear independence. Use the definition of the latter.
 
  • #14
Unscientific, the basic problem appears to be that you have an incorrect idea of what the definition of "independent" is!

It is NOT
1. Add any 2 vectors, show that they are non-zero.

2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

3. Carry on process till last vector.

4. QED

A set of vectors [tex]\{v_1, v_2, ..., v_n\}[/tex] is "independent" if and only if the only way [tex]\beta_1v_1+ \beta_2v_2+ ... + \beta_nv_n= 0[/tex] is if [tex]\beta_1= \beta_2= ...= \beta_n= 0[/tex].

It is NOT just a matter of adding vectors and saying the sum is not 0. It is that the only way a linear combination of them can be 0 is if all coefficients are 0. That is equivalent to the statement that no one of the vectors can be written as a linear combination of the others.

So to prove a set of vectors is linearly independent, start with "Suppose [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n= 0[/tex]" and show that every one of the ##\beta_i## is equal to 0. Here the only condition on the vectors is that they are "mutually perpendicular", and nowhere have you used that condition.

What would you get if you took the dot product of [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n[/tex] with each of [tex]v_1[/tex], [tex]v_2[/tex], ..., [tex]v_n[/tex] in turn?
 
  • #15
HallsofIvy said:
Unscientific, the basic problem appears to be that you have an incorrect idea of what the definition of "independent" is!

It is NOT


A set of vectors [tex]\{v_1, v_2, ..., v_n\}[/tex] is "independent" if and only if the only way [tex]\beta_1v_1+ \beta_2v_2+ ... + \beta_nv_n= 0[/tex] is if [tex]\beta_1= \beta_2= ...= \beta_n= 0[/tex].

It is NOT just a matter of adding vectors and saying the sum is not 0. It is that the only way a linear combination of them can be 0 is if all coefficients are 0. That is equivalent to the statement that no one of the vectors can be written as a linear combination of the others.

So to prove a set of vectors is linearly independent, start with "Suppose [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n= 0[/tex]" and show that every one of the ##\beta_i## is equal to 0. Here the only condition on the vectors is that they are "mutually perpendicular", and nowhere have you used that condition.

What would you get if you took the dot product of [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n[/tex] with each of [tex]v_1[/tex], [tex]v_2[/tex], ..., [tex]v_n[/tex] in turn?

Yup, sorry for not being precise. What I meant by "add any 2 vectors" was adding βiai + βjaj.
 
  • #16
HallsofIvy said:
Unscientific, the basic problem appears to be that you have an incorrect idea of what the definition of "independent" is!

It is NOT

A set of vectors [tex]\{v_1, v_2, ..., v_n\}[/tex] is "independent" if and only if the only way [tex]\beta_1v_1+ \beta_2v_2+ ... + \beta_nv_n= 0[/tex] is if [tex]\beta_1= \beta_2= ...= \beta_n= 0[/tex].

It is NOT just a matter of adding vectors and saying the sum is not 0. It is that the only way a linear combination of them can be 0 is if all coefficients are 0. That is equivalent to the statement that no one of the vectors can be written as a linear combination of the others.

So to prove a set of vectors is linearly independent, start with "Suppose [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n= 0[/tex]" and show that every one of the ##\beta_i## is equal to 0. Here the only condition on the vectors is that they are "mutually perpendicular", and nowhere have you used that condition.

What would you get if you took the dot product of [tex]\beta_1v_1+ \beta_2v_2+ ...+ \beta_nv_n[/tex] with each of [tex]v_1[/tex], [tex]v_2[/tex], ..., [tex]v_n[/tex] in turn?
1. Add any 2 vectors, show that they are non-zero.

βiai + βjaj can't be zero, otherwise

ai = -(βj/βi)aj implying they are parallel. Contradiction, as they are orthogonal.

2. Add a third vector, take inner product of that sum with that vector just added. Non-zero.

βiai + βjaj + βkak.

Taking the inner product, the first two inner products give 0, due to orthogonality. The last one, which is of the vector with itself, gives non-zero due to positivity of the norm (spelled out after step 4 below).

3. Carry on process till last vector.

4. QED (assuming coefficients are non-zero)
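
Making the inner products in step 2 explicit (a sketch, in the same notation):

[tex](\beta_i a_i + \beta_j a_j + \beta_k a_k) \cdot a_k = \beta_i (a_i \cdot a_k) + \beta_j (a_j \cdot a_k) + \beta_k (a_k \cdot a_k) = \beta_k \|a_k\|^2,[/tex]

where the first two terms vanish by orthogonality and ##\|a_k\|^2 > 0## because ##a_k \neq 0##.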

I hope this is clear enough. Thanks for the help, guys!
 
  • #17
unscientific said:
1. Add any 2 vectors, show that they are non-zero.

βiai + βjaj can't be zero, otherwise

What if ##\beta_i = \beta_j = 0##? Doesn't the expression give ##0##?

ai = -(βj/βi)aj implying they are parallel. Contradiction, as they are orthogonal.

What if ##\beta_i=0##, won't you divide by ##0##?
Why are parallel vectors not orthogonal?
 
  • #18
micromass said:
What if ##\beta_i = \beta_j = 0##? Doesn't the expression give ##0##?



What if ##\beta_i=0##, won't you divide by ##0##?
Why are parallel vectors not orthogonal?

I'm assuming all coefficients are non-zero.
 
  • #19
unscientific said:
I'm assuming all coefficients are non-zero.

Well, you need to say this. And why can you assume this anyway?
 
  • #20
micromass said:
Well, you need to say this. And why can you assume this anyway?

Because I am choosing them to be non-zero, in order to work towards the proof of linear independence. No point choosing any of them to be zero. (I thought this was straightforward enough not to say.)
 
  • #21
unscientific said:
Because I am choosing them to be non-zero, in order to work towards the proof of linear independence. No point choosing any of them to be zero. (I thought this was straightforward enough not to say.)

That's not what linear independence states. It says that ##\beta_i\mathbf{v}_i + \beta_j\mathbf{v}_j \neq \mathbf{0}## whenever ##\beta_i## and ##\beta_j## are not both zero. So it can certainly happen that one of the ##\beta_i## is zero.
 
  • #22
micromass said:
That's not what linear independence states. It says that ##\beta_i\mathbf{v}_i + \beta_j\mathbf{v}_j \neq \mathbf{0}## whenever ##\beta_i## and ##\beta_j## are not both zero. So it can certainly happen that one of the ##\beta_i## is zero.

Yes, I get what you mean. But the question wants to show the linear independence of all the vectors from a1 to an! If you let any of the coefficients be 0, then you are at most showing linear independence of all the vectors except the one whose coefficient you set to 0.
 
  • #23
unscientific said:
Yes, I get what you mean. But the question wants to show the linear independence of all the vectors from a1 to an! If you let any of the coefficients be 0, then you are at most showing linear independence of all the vectors except the one whose coefficient you set to 0.

You are required to prove that the entire set of vectors is linearly independent. By definition, you must prove that their linear combination is zero only when all the coefficients are zero. If you prove any other statement, then you also need a proof that your statement leads to linear independence per the original definition.
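
Written out along those lines (a sketch of the structure only): suppose

[tex]\beta_1 a_1 + \beta_2 a_2 + \dots + \beta_n a_n = 0.[/tex]

Take the inner product of both sides with ##a_k##. By mutual orthogonality, every term with ##i \neq k## drops out, leaving ##\beta_k \|a_k\|^2 = 0##; since ##a_k \neq 0##, this forces ##\beta_k = 0##. As ##k## was arbitrary, all the coefficients are zero, which is exactly the definition of linear independence.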
 
  • #24
voko said:
You are required to prove that the entire set of vectors is linearly independent. By definition, you must prove that their linear combination is zero only when all the coefficients are zero. If you prove any other statement, then you also need a proof that your statement leads to linear independence per the original definition.

That's right, thanks for putting it in a more elegant way!
 
