Linear Dependence Proof for a Set of Vectors

newtomath
Given a set of vectors S = {v1, v2, ..., vn}, prove that S is linearly dependent if and only if one of the vectors in S is a linear combination of all the other vectors in S.

Can someone point me in the right direction of how to start this proof? I am completely lost.
 
A good idea would be to start with the definition of linear dependence/independence.
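For reference, written out explicitly: S = {v1, v2, ..., vn} is linearly independent if
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0 \;\Longrightarrow\; c_1 = c_2 = \cdots = c_n = 0,$$
and S is linearly dependent if there exist scalars ##c_1, \ldots, c_n##, not all zero, with ##c_1 v_1 + \cdots + c_n v_n = 0##.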
 
Start with a 2-D case where S = {X, Y, A} with A = c1·X + c2·Y, and then proceed.
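For instance (with numbers of my own choosing), take
$$X = (1, 0), \quad Y = (0, 1), \quad A = 2X + 3Y = (2, 3).$$
Then ##2X + 3Y - A = 0## is a relation whose coefficients ##(2, 3, -1)## are not all zero, so ##\{X, Y, A\}## is linearly dependent, and A is a linear combination of the other two.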
 
newtomath said:
Given a set of vectors S = {v1, v2, ..., vn}, prove that S is linearly dependent if and only if one of the vectors in S is a linear combination of all the other vectors in S.

Can someone point me in the right direction of how to start this proof? I am completely lost.

Do you know how to prove an "if and only if" statement (aka "iff")?
"P if and only if Q", or
"P <=> Q"...

Typically, you'll prove each "direction" separately (i.e. "=>" separate from "<=").
So we start by proving that "S is linearly dependent => one of the vectors in S is a linear combination of the others...".
The way to prove a conditional statement is to assume the first part ("P"), and prove the second part ("Q").
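Spelled out for this problem, the two directions are
$$(\Rightarrow)\quad S \text{ is linearly dependent} \;\Longrightarrow\; \text{some } v_k \text{ in } S \text{ is a linear combination of the other vectors},$$
$$(\Leftarrow)\quad \text{some } v_k \text{ in } S \text{ is a linear combination of the other vectors} \;\Longrightarrow\; S \text{ is linearly dependent}.$$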

The other guys were right in that you'll want to use the definition of linear independence. You might consider contradiction.

Most proofs that I have seen rely on the assumption (at some point) that one of the coefficients of the vectors is DIFFERENT from zero. Then you can divide by it and rearrange the terms to get it to "fit" the definition (usually stated as: a1v1 + ... + anvn = 0 implies that all the a's are equal to zero).
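Schematically, if ##a_k \neq 0## for some k, then
$$a_1 v_1 + \cdots + a_n v_n = 0 \;\Longrightarrow\; v_k = -\frac{a_1}{a_k} v_1 - \cdots - \frac{a_{k-1}}{a_k} v_{k-1} - \frac{a_{k+1}}{a_k} v_{k+1} - \cdots - \frac{a_n}{a_k} v_n,$$
which is exactly the statement that one vector in S is a linear combination of the others.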
 
Thanks guys.

Can you take a peek below and advise if you agree with me?


If S is linearly dependent, then one of the vectors in S is a linear combination of all the other vectors in S.

S is linearly dependent if constants (not all zero) exist such that
c1v1 + c2v2 + c3v3 + ... + cnvn = 0.

If the only solution is all zeros, it is the trivial solution and the vectors are linearly independent.
If one vector in S is equal to a sum of scalar multiples of the other vectors, then it is a linear combination of the other vectors in S.

We can rearrange the equation like so: c1v1 = -c2v2 - c3v3 - ... - cnvn. Recall that linear dependence requires a solution where not all the constants (c1, c2, ..., cn) are zero, i.e., not the trivial solution. So c1v1 above must be a combination of all the other vectors in S.

If one of the vectors in S is a linear combination of all the other vectors in S, then S is linearly dependent.
A vector v in S is a linear combination of the other vectors in S if constants exist such that
c1v1 + c2v2 + c3v3 + ... + cnvn = v, where the sum runs over the other vectors.
If we assume that is true, then S is linearly dependent because the vector v can be written as a sum of scalar multiples of all the other vectors.
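To spell out that last step (let me know if this is right): if some vector ##v_k## in S satisfies
$$v_k = c_1 v_1 + \cdots + c_{k-1} v_{k-1} + c_{k+1} v_{k+1} + \cdots + c_n v_n,$$
then subtracting ##v_k## from both sides gives
$$c_1 v_1 + \cdots + c_{k-1} v_{k-1} + (-1)\,v_k + c_{k+1} v_{k+1} + \cdots + c_n v_n = 0,$$
where the coefficient of ##v_k## is ##-1 \neq 0##, so the relation is nontrivial and S is linearly dependent.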
 
newtomath said:
...

We can rearrange the equation like so: c1v1 = -c2v2 - c3v3 - ... - cnvn. Recall that linear dependence requires a solution where not all the constants (c1, c2, ..., cn) are zero, i.e., not the trivial solution. So c1v1 above must be a combination of all the other vectors in S.
...

This is close, but you have to show that they are not all zero.
Since not all of the ci are zero, there exists at least one that is not zero. We can assume, without loss of generality (might want to brush up on that phrase), that it is the first one (c1). Since this is not zero, you can divide by it.
It might help to look at "where you are going". If you want to prove that something is a linear combination, look at that definition, and manipulate your equation until you have the same form as the definition.
 