# Vector independence proof question

1. Mar 4, 2009

### transgalactic

prove that vectors $$v_1$$,...,$$v_n$$ in a vector space V over a field F
are linearly dependent if and only if there is an index $$1\leq i\leq n$$
such that $$v_i$$ is a linear combination of the vectors previous to it by index,
$$v_1$$,...,$$v_{i-1}$$
??

I got a proof but I can't fully understand it:
suppose $$v_i$$ is a linear combination of the previous vectors:
$$v_i=a_1v_1+\dots+a_{i-1}v_{i-1}$$

we transfer $$v_i$$ to the other side:
$$0=a_1v_1+\dots+a_{i-1}v_{i-1}+(-1)v_i$$

then they say that
$$a_1v_1+\dots+a_{i-1}v_{i-1}+(-1)v_i+0v_{i+1}+\dots+0v_n=0$$
so the set is linearly dependent

(why??)

for the other direction they assume $$a_1v_1+\dots+a_nv_n=0$$ with not all of the coefficients 0, and they pick the index
$$i_0=\max\{i\mid a_i\neq 0\}$$, the largest index for which $$a_i$$ differs from 0

but it's true only for $$i_0\geq 2$$

so we get the expression

$$v_{i_0}=\left(\frac{-a_1}{a_{i_0}}\right)v_1+\dots+\left(\frac{-a_{i_0-1}}{a_{i_0}}\right)v_{i_0-1}$$
so $$v_{i_0}$$ is a linear combination of the previous vectors and we proved it.

the lecturer was in a hurry
can you fill the gaps
and make sense out of it
??
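(to make the statement concrete for myself, here is a small numerical illustration of the theorem; the vectors and coefficients are my own made-up example, not from the lecture:)

```python
import numpy as np

# Made-up example in R^2: the set {v1, v2, v3} is linearly dependent,
# and indeed v3 is a linear combination of the vectors before it by index.
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([2.0, 3.0])          # v3 = 2*v1 + 3*v2

# Dependence: a nontrivial combination gives the zero vector,
# with coefficients (2, 3, -1) that are not all zero.
zero = 2.0 * v1 + 3.0 * v2 + (-1.0) * v3
print(np.allclose(zero, 0))        # True
```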

2. Mar 4, 2009

### Quantumpencil

What does it mean for a set of vectors to be linearly dependent?

3. Mar 4, 2009

### transgalactic

it means that there is no vector in this set which could be written as a manipulation
of the other vectors

4. Mar 4, 2009

### Staff: Mentor

That's your own definition. What is the exact definition? That is, a set of vectors v1, v2, v3, ... , vn in a vector space over a field F is linearly dependent iff _____________. You fill in the blank

5. Mar 4, 2009

### transgalactic

their determinant differs from zero
their row reduction doesn't give us a row of zeros
their dim(Ker)=0

that's the only option I can think of

6. Mar 4, 2009

### Staff: Mentor

No, these are just vectors, not a matrix. Even if you created a matrix by entering these vectors as columns, there is no guarantee that the matrix would be square. A matrix has to be square in order for its determinant to be defined.
As above, these are just vectors.
Still no matrix.
How about the definition of linear dependence? You have a textbook, right? It has the definition.

7. Mar 4, 2009

### transgalactic

Why do I need the definition??

8. Mar 5, 2009

### Staff: Mentor

How in the world are you going to prove a statement like this if you don't know what linear dependence means?

9. Mar 5, 2009

### transgalactic

I gave almost the complete proof.
Do you get the idea?

10. Mar 5, 2009

### Staff: Mentor

You're the one who doesn't understand the proof. How can you expect to understand a proof that involves linear independence/linear dependence if you don't know what these terms mean?

I'm not asking you for the definition because I need to know it -- you need to know it.

11. Mar 5, 2009

### transgalactic

I know what independent vectors mean:
" a set of vectors that, in a linear combination, can represent every vector in a given vector space or free module, and such that no element of the set can be represented as a linear combination of the others. In other words, a basis is a linearly independent spanning set.
"

but it's not helping me understand this proof
??

12. Mar 5, 2009

### HallsofIvy

Staff Emeritus
There's part of your problem. That is NOT a definition of "independent"; it is a definition of "basis". What is the definition of "independent vectors"?

13. Mar 5, 2009

### transgalactic

that there is no linear combination of the three vectors that will add to zero unless the coefficients multiplying the three vectors (not their internal components) are individually zero.

the only way
av1+bv2+cv3=0
can hold is if
a=b=c=0

how do I use it?

14. Mar 5, 2009

### Staff: Mentor

OK, this is essentially the definition of linear independence for three vectors. A bit more generally, a set of vectors {v1, v2, v3, ... , vn} in a vector space over a field F is linearly independent iff the only solution for the equation c1*v1 + c2*v2 + ... + cn*vn = 0 is c1 = c2 = ... = cn = 0.

For the same set of vectors to be linearly dependent, the equation c1*v1 + c2*v2 + ... + cn*vn = 0 has a solution where at least one of the ci's is nonzero.

Now, go back to your original post in this thread and see how this idea is being used.
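If it helps to see the definition in action, here is a quick numerical sketch (my own illustration with made-up vectors, not something from the thread). For a finite set of coordinate vectors, the equation c1*v1 + ... + cn*vn = 0 having a nonzero solution is equivalent to the matrix with those vectors as columns having rank less than n:

```python
import numpy as np

def is_linearly_dependent(vectors):
    """True iff some nontrivial combination of the vectors is zero,
    i.e. the column matrix [v1 ... vn] has rank < n."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < len(vectors)

print(is_linearly_dependent([np.array([1.0, 0.0]),
                             np.array([0.0, 1.0])]))   # False: independent
print(is_linearly_dependent([np.array([1.0, 2.0]),
                             np.array([2.0, 4.0])]))   # True: v2 = 2*v1
```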

15. Mar 5, 2009

### transgalactic

OK, I have this expression:
0 = a_1v_1 + ... + a_{i-1}v_{i-1} + (-1)v_i
so not all the coefficients are 0;
the coefficient of v_i is -1.

why do they pick the maximal index for which the coefficient differs from 0?

??

16. Mar 6, 2009

### transgalactic

what to do next??

17. Mar 6, 2009

### Staff: Mentor

Whoever wrote what you're reading is using the definition of linear dependence. The set {v1, v2, v3, ..., vn} is assumed to be linearly dependent, which means that the equation a1*v1 + a2*v2 + ... + an*vn = 0 has a solution where at least one ai is not 0. The definition of linear dependence guarantees that at least one such number is not zero, but it doesn't say which one. There might be just one constant ai or a bunch of them. The max() part says to pick the constant with the highest such index. Let's call it ak instead of what he uses, which is ai0. Then he moves that term to the other side of the equation to get
-ak*vk = a1*v1 + a2*v2 + ... + a[k-1]*v[k-1] + a[k+1]*v[k+1] + ... + an*vn

Since ak is not zero, you can divide both sides of the equation by it, thereby showing that vk is a linear combination of the other vectors.
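Here is the same argument carried out numerically on a made-up example (my own sketch, not part of the original proof): start from a nontrivial dependence, pick the largest index k with a nonzero coefficient, move that term across, and divide by -ak:

```python
import numpy as np

# Made-up dependence: 2*v1 - 4*v2 - 1*v3 = 0, coefficients not all zero.
v = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([2.0, -4.0])]
a = [2.0, -4.0, -1.0]

# Pick k = max{ i : a_i != 0 }, the highest index with a nonzero coefficient.
k = max(i for i, ai in enumerate(a) if ai != 0)

# Move the a_k*v_k term to the other side and divide by -a_k:
#   v_k = sum over i != k of (a_i / -a_k) * v_i
vk = sum((ai / -a[k]) * vi for i, (ai, vi) in enumerate(zip(a, v)) if i != k)
print(np.allclose(vk, v[k]))   # True: v_k is a combination of the others
```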

18. Mar 6, 2009

### transgalactic

thanks i got it
:)