Vector independence proof question

Homework Help Overview

The discussion revolves around proving that vectors \( v_1, \ldots, v_n \) in a vector space \( V \) over a field \( F \) are linearly dependent if and only if some \( v_i \) is a linear combination of the vectors preceding it. The original poster presents a proof attempt but expresses confusion about certain steps and about the definitions related to linear dependence.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants explore the definition of linear dependence and its implications for proving the statement. Questions arise about the proof steps, particularly regarding the manipulation of coefficients and the selection of indices.

Discussion Status

Participants are actively engaging with the proof, questioning definitions and clarifying concepts. Some have provided insights into the relationship between linear dependence and the coefficients of the vectors involved, while others emphasize the importance of understanding the definitions to follow the proof correctly.

Contextual Notes

There is an ongoing discussion about the definitions of linear dependence and independence, with some participants noting the need for precise terminology to facilitate understanding of the proof. The original poster's proof is acknowledged as incomplete, with specific gaps identified that require further clarification.

transgalactic
prove that vectors [tex]v_1, \ldots, v_n[/tex] in a vector space V over a field F
are linearly dependent if and only if there is an index [tex]1 \le i \le n[/tex]
such that [tex]v_i[/tex] is a linear combination of the vectors preceding it,
[tex]v_1, \ldots, v_{i-1}[/tex]
??

i got a proof but i can't fully understand it:
suppose [tex]v_i[/tex] is a linear combination of the previous vectors:
[tex]v_i = a_1 v_1 + \cdots + a_{i-1} v_{i-1}[/tex]

we move [tex]v_i[/tex] to the other side:
[tex]0 = a_1 v_1 + \cdots + a_{i-1} v_{i-1} + (-1) v_i[/tex]

then they say that not all the coefficients in this relation are zero
(the coefficient of [tex]v_i[/tex] is -1),
so the set is linearly dependent

(why??)

for the other direction, they assume the vectors are dependent, so there is a relation
[tex]a_1 v_1 + \cdots + a_n v_n = 0[/tex] in which not all the [tex]a_i[/tex] are 0,
and then they pick the index
[tex]i_0 = \max\{i \mid a_i \neq 0\}[/tex], the largest index whose coefficient is nonzero

but it's true only for [tex]i_0 \ge 2[/tex]

so we get the expression

[tex]v_{i_0} = \left(\frac{-a_1}{a_{i_0}}\right) v_1 + \cdots + \left(\frac{-a_{i_0-1}}{a_{i_0}}\right) v_{i_0-1}[/tex]
so [tex]v_{i_0}[/tex] is a linear combination of the previous vectors, and we proved it.


the lecturer was in a hurry
can you fill the gaps
make sense out of it
??
 
What does it mean for a set of vectors to be linearly dependent?
 
it means that there is no vector in this set which can be written as a combination
of the other vectors
 
That's your own definition. What is the exact definition? That is, a set of vectors v1, v2, v3, ... , vn in a vector space over a field F is linearly dependent iff _____________. You fill in the blank
 
their determinant is nonzero
their row reduction doesn't give us a row of zeros
their dim(Ker)=0

that's the only option i can think of
 
transgalactic said:
their determinant is nonzero
No, these are just vectors, not a matrix. Even if you created a matrix by entering these vectors as columns, there is no guarantee that the matrix would be square. A matrix has to be square in order for its determinant to be defined.
transgalactic said:
their row reduction doesn't give us a row of zeros
As above, these are just vectors.
transgalactic said:
their dim(Ker)=0
Still no matrix
transgalactic said:
that's the only option i can think of

How about the definition of linear dependence? You have a textbook, right? It has the definition.
 
why do i need the definition??
 
How in the world are you going to prove a statement like this:
prove that vectors v_1, ..., v_n in a vector space V over a field F
are linearly dependent if and only if ...
if you don't know what linear dependence means?
 
i gave almost the complete proof
do you get the idea?
 
  • #10
You're the one who doesn't understand the proof. How can you expect to understand a proof that involves linear independence/linear dependence if you don't know what these terms mean?

I'm not asking you for the definition because I need to know it -- you need to know it.
 
  • #11
i know what independent vectors means
" a set of vectors that, in a linear combination, can represent every vector in a given vector space or free module, and such that no element of the set can be represented as a linear combination of the others. In other words, a basis is a linearly independent spanning set.
"

but it's not helping with understanding this proof
??
 
  • #12
transgalactic said:
i know what independent vectors means
" a set of vectors that, in a linear combination, can represent every vector in a given vector space or free module, and such that no element of the set can be represented as a linear combination of the others. In other words, a basis is a linearly independent spanning set.
"

but it's not helping with understanding this proof
??
There's part of your problem. That is NOT a definition of "independent", it is a definition of "basis". What is the definition of "independent vectors"?
 
  • #13
that there is no linear combination of the three vectors that will add to zero unless the coefficients multiplying the three vectors (not their internal components) are individually zero.

the only way to get
av1+bv2+cv3=0
is if
a=b=c=0

how do i use it?
 
  • #14
OK, this is essentially the definition of linear independence for three vectors. A bit more generally, a set of vectors {v1, v2, v3, ... , vn} in a vector space over a field F is linearly independent iff the only solution for the equation c1*v1 + c2*v2 + ... + cn*vn = 0 is c1 = c2 = ... = cn = 0.

For the same set of vectors to be linearly dependent, the equation c1*v1 + c2*v2 + ... + cn*vn = 0 has a solution where at least one of the ci's is nonzero.

Now, go back to your original post in this thread and see how this idea is being used.
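The definition in this post can be illustrated numerically. The following is a sketch of my own (not from the thread), assuming numpy: a set of vectors is linearly dependent exactly when the matrix with those vectors as columns has a nontrivial null space, i.e. rank less than the number of vectors.

```python
import numpy as np

def is_dependent(vectors):
    """Return True iff the list of 1-D arrays is linearly dependent,
    i.e. c1*v1 + ... + cn*vn = 0 has a solution with some ci != 0."""
    A = np.column_stack(vectors)                # the vi as columns of a matrix
    return bool(np.linalg.matrix_rank(A) < len(vectors))

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + 2 * v2                                # a combination of the previous two

print(is_dependent([v1, v2]))                   # False: independent
print(is_dependent([v1, v2, v3]))               # True: v3 = v1 + 2*v2
```

Note this numerical rank test is only an illustration of the definition over the reals; the proof in the thread works over any field F.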
 
  • #15
ok i have this expression
0 = a_1v_1 + ... + a_{i-1}v_{i-1} + (-1)v_i
not all the coefficients can be 0:
the coefficient of v_i is not (it is -1)

why do they pick the maximal index for which the coefficient is nonzero

??
 
  • #16
what to do next??
 
  • #17
transgalactic said:
ok i have this expression
0 = a_1v_1 + ... + a_{i-1}v_{i-1} + (-1)v_i
not all the coefficients can be 0:
the coefficient of v_i is not (it is -1)
why do they pick the maximal index for which the coefficient is nonzero

??
Whoever wrote what you're reading is using the definition of linear dependence. The set {v1, v2, v3, ..., vn} is assumed to be linearly dependent, which means that the equation a1*v1 + a2*v2 + ... + an*vn = 0 has a solution where at least one ai is not 0. The definition of linear dependence guarantees that at least one such number is not zero, but it doesn't say which one. There might be just one nonzero constant ai or a bunch of them. The max() part says to pick the nonzero constant with the highest index. Let's call it ak instead of what he uses, which is ai0. By maximality, a[k+1] = ... = an = 0, so moving that term to the other side of the equation gives
-ak*vk = a1*v1 + a2*v2 + ... + a[k-1]*v[k-1]

Since ak is not zero, you can divide both sides of the equation by it, thereby showing that vk is a linear combination of the previous vectors v1, ..., v[k-1], which is exactly what the statement asks for.
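The max-index step above can be sketched numerically (my addition, assuming numpy, with the hypothetical helper name `express_by_previous`): given a dependence relation with some nonzero coefficient, take the largest such index k; since all later coefficients vanish, dividing by a_k writes v_k as a combination of the previous vectors only.

```python
import numpy as np

def express_by_previous(vectors, coeffs):
    """Given a dependence relation sum(coeffs[i] * vectors[i]) == 0 with
    some nonzero coefficient, return (k, combo) where vectors[k] == combo,
    a combination of vectors[0..k-1] only.  Needs k >= 1 (the i_0 >= 2
    caveat from the thread: if k == 0 then vectors[0] is the zero vector)."""
    k = max(i for i, a in enumerate(coeffs) if a != 0)   # maximal nonzero index
    # coeffs[k+1:] are all zero by maximality, so only i < k contributes:
    combo = sum((-coeffs[i] / coeffs[k]) * vectors[i] for i in range(k))
    return k, combo

v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 4.0])
v3 = 2 * v1 - v2
# dependence relation: 2*v1 + (-1)*v2 + (-1)*v3 = 0
k, combo = express_by_previous([v1, v2, v3], [2.0, -1.0, -1.0])
print(k)                        # 2, i.e. the relation singles out v3
print(np.allclose(combo, v3))   # True: v3 = 2*v1 - v2
```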
 
  • #18
thanks i got it
:)
 
