Two linearly dependent vectors

1. The problem statement, all variables and given/known data

Prove that two vectors are linearly dependent if and only if one is a scalar multiple of the other.

2. Relevant equations



3. The attempt at a solution

At first glance, this seems to be a fairly easy proof:

Part I: Assume that vectors u and v are linearly dependent.

Then c1u + c2v = 0, where c1 and c2 are not both 0.

Then u = -(c2/c1)v
and v = -(c1/c2)u. But this doesn't make sense to me, because what if one of c1 or c2 does equal zero?

Part II: Assume that u = av and v = bu, where a and b are constants.

Then u - av = 0, where the coefficient of u is 1, and v - bu = 0, where the coefficient of v is 1.

Therefore u and v are linearly dependent.

I'm struggling a bit with linear algebra proofs, so any critique or suggestions that anyone could offer would be greatly appreciated.
 
Either c1 or c2 is nonzero. Without loss of generality, let c1 be nonzero. Then since c1u + c2v = 0, dividing by c1, we get u + (c2/c1)v = 0. What does this tell you?
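For reference, the rearrangement the hint points at can be written out as follows (just a sketch; it uses only c1 ≠ 0):

$$c_1\mathbf{u} + c_2\mathbf{v} = \mathbf{0},\quad c_1 \neq 0 \;\Longrightarrow\; \mathbf{u} = -\frac{c_2}{c_1}\,\mathbf{v},$$

i.e. u is a scalar multiple of v, which is exactly the conclusion this direction of the proof needs when c1 ≠ 0.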
 

gabbagabbahey

Homework Helper
Gold Member
Part I: Assume that vectors u and v are linearly dependent.

Then c1u + c2v = 0, where c1 and c2 are not both 0.

Then u = -(c2/c1)v
and v = -(c1/c2)u. But this doesn't make sense to me, because what if one of c1 or c2 does equal zero?
Look at two different cases:

Case 1: c1 = 0 and c2 ≠ 0

Case 2: c2 = 0 and c1 ≠ 0

Also, your method in Part II is not valid, since it neglects the c1 = 0 case.
 
OK, so v = c1u.

Then v - c1u = 0, and suppose that c1 = 0.
Then v must be the zero vector.

Now suppose that v - c1u = 0 and c1 is not equal to zero.
Then v - c1u = 0, where the coefficients of v and u are both nonzero.

Am I on the right track?

And then I would do a similar thing for u = c2v.
 

gabbagabbahey

Homework Helper
Gold Member
5,002
6
OK, so v = c1u.

Then v - c1u = 0, and suppose that c1 = 0.
Then v must be the zero vector.
No, start with c1u + c2v = 0, as you did before.

If c1=0, what can you say about c2? (Remember, the vectors are by assumption not zero vectors)

If c2=0, what can you say about c1?

The case where neither of them is zero was already covered in your first post.
 
If c1 = 0, what can you say about c2? c2 is nonzero.

If c2 = 0, what can you say about c1? c1 is nonzero.

So assume c1u + c2v = 0.

If c1 is nonzero, u = -(c2/c1)v.
If c2 is nonzero, v = -(c1/c2)u.


Part II

Assume that u and v are scalar multiples of each other:

u = av and v = bu, where both a and b are nonzero scalars and u and v are nonzero vectors.

u - av = 0 and v - bu = 0, therefore u and v are linearly dependent.


The problem also gives a hint that I should separately consider the case where one of the vectors is the zero vector.

So assume u = av and v = bu.

Let u be the zero vector: 1u - av = 0, therefore u and v are linearly dependent.

Let v be the zero vector: 1v - bu = 0, therefore u and v are linearly dependent.

I don't know if I'm getting this right at all.
 

HallsofIvy

Science Advisor
If c1 = 0, what can you say about c2? c2 is nonzero.

If c2 = 0, what can you say about c1? c1 is nonzero.

So assume c1u + c2v = 0.

If c1 is nonzero, u = -(c2/c1)v.
If c2 is nonzero, v = -(c1/c2)u.
Yes, that is correct.


Part II

Assume that u and v are scalar multiples of each other:

u = av and v = bu, where both a and b are nonzero scalars and u and v are nonzero vectors.
You are making the same mistake you did before. If u and v are scalar multiples of one another, then u = av and v = bu, but it does NOT FOLLOW that "both a and b are nonzero scalars and u and v are nonzero vectors". If neither u nor v is zero, then you can say u = av with a nonzero, and so u - av = 0 with coefficients 1 and -a. If u = 0, then au + 0v = 0 for any nonzero a, and if v = 0, then 0u + bv = 0 for any nonzero b.

u - av = 0 and v - bu = 0, therefore u and v are linearly dependent.

The problem also gives a hint that I should separately consider the case where one of the vectors is the zero vector.

So assume u = av and v = bu.

Let u be the zero vector: 1u - av = 0, therefore u and v are linearly dependent.

Let v be the zero vector: 1v - bu = 0, therefore u and v are linearly dependent.


I don't know if I'm getting this right at all.
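Pulling these cases together, the converse direction can be written as one short argument (a sketch along the lines of the above; the case v = bu is symmetric): suppose u = av for some scalar a. Then

$$1\cdot\mathbf{u} + (-a)\,\mathbf{v} = \mathbf{0},$$

where the coefficient of u is 1 ≠ 0, so u and v are linearly dependent. Note this works whether or not a = 0, and whether or not either vector is the zero vector.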
 
Part I: If c1 is zero and c2 is not zero, then for c1u + c2v = 0 to hold, v must be the zero vector (and vice versa: if c2 = 0 and c1 ≠ 0, then u must be the zero vector), since it has been said that u and v are linearly dependent.
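Written out, that case analysis is (a sketch): if c1 = 0 and c2 ≠ 0, then

$$c_2\mathbf{v} = \mathbf{0} \;\Longrightarrow\; \mathbf{v} = \mathbf{0} = 0\cdot\mathbf{u},$$

so v is (trivially) a scalar multiple of u; symmetrically, if c2 = 0 and c1 ≠ 0, then u = 0 = 0v. Together with the case where both coefficients are nonzero, this completes Part I.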
 
Is there a linear space V in which the union of any subspaces of V is a subspace, apart from the trivial subspaces V and {0}? Please help.
 
