Proof: Linear Dependence of Vectors in a Vector Space

In summary: yes, the approach is correct. You could also do a direct proof, but proof by contradiction or proof by contrapositive is fine as well.
  • #1
Danielm

Homework Statement


Prove the following theorem: Let ##(v_1, \dots, v_k)## be a sequence of vectors from a vector space ##V##. Prove that the sequence is linearly dependent if and only if for some ##j##, ##1 \le j \le k##, ##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)##.

Homework Equations

The Attempt at a Solution


The "if and only if" is what bothers me. I know how to prove the following direction: if ##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)##, then the sequence is linearly dependent.

My approach: if ##c_1v_1 + \dots + c_kv_k = v_j## (with the ##v_j## term omitted from the sum), then ##c_1v_1 + \dots + c_kv_k - v_j = 0##, so there exists a set of constants ##c_1, \dots, c_k## together with ##c_j = -1## such that the combination equals zero, and they are not all zero since ##c_j = -1##. Is that right? I don't know how to show that if the sequence is linearly dependent, then ##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)##; I guess the contrapositive would be OK.

If for every ##j##, ##v_j## is not a linear combination of ##(v_1, \dots, v_k) - (v_j)##, then the sequence of vectors is linearly independent.

Proof by contradiction

Assume to the contrary that there exists a ##v_j## such that ##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)## and the sequence of vectors is linearly independent.

Then there exists a set of constants such that ##c_1v_1 + \dots + c_kv_k = v_j## (sum without the ##v_j## term), so ##c_1v_1 + \dots + c_kv_k - v_j = 0##, which shows the sequence is linearly dependent, a contradiction.
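
For reference, the forward direction of this attempt can be written out compactly (a sketch in the thread's notation, with the sum running over the indices other than ##j##):

$$v_j = \sum_{i \ne j} c_i v_i \quad\Longrightarrow\quad \sum_{i \ne j} c_i v_i + (-1)\, v_j = 0,$$

and since the coefficient of ##v_j## is ##-1 \ne 0##, this is a nontrivial linear relation, so the sequence is linearly dependent.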
 
  • #2
If ##v_1,...,v_k## are linearly dependent then there exist ##c_1,...,c_k## such that ##\sum_{i=1}^kc_iv_i=0##. You want to be able to choose one of the vectors in that sum to be your ##v_j## and then rearrange the equation so that ##v_j## is all by itself on one side of the equation and does not appear on the other side. What property does ##c_j## have to have to allow you to do that? Can you be sure that at least one of the coefficients has that property? Why?
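
Concretely, the rearrangement the hint points toward looks like this (a sketch, assuming some coefficient ##c_j \ne 0## has been found):

$$c_j v_j = -\sum_{i \ne j} c_i v_i \quad\Longrightarrow\quad v_j = -\frac{1}{c_j} \sum_{i \ne j} c_i v_i,$$

which displays ##v_j## as a linear combination of the other vectors; dividing by ##c_j## is possible precisely because ##c_j \ne 0##.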
 
  • #3
andrewkirk said:
If ##v_1,...,v_k## are linearly dependent then there exist ##c_1,...,c_k## such that ##\sum_{i=1}^kc_iv_i=0##. You want to be able to choose one of the vectors in that sum to be your ##v_j## and then rearrange the equation so that ##v_j## is all by itself on one side of the equation and does not appear on the other side. What property does ##c_j## have to have to allow you to do that? Can you be sure that at least one of the coefficients has that property? Why?
##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)##.

##c_1v_1 + \dots + c_{j-1}v_{j-1} + c_{j+1}v_{j+1} + \dots + c_kv_k - v_j = 0##

so ##c_1v_1 + \dots + c_{j-1}v_{j-1} + c_{j+1}v_{j+1} + \dots + c_kv_k## has to add up to ##v_j## so that the total sum is 0, and this can't be achieved if all the ##c_i## are zero
 
  • #4
Danielm said:
so ##c_1v_1 + \dots + c_{j-1}v_{j-1} + c_{j+1}v_{j+1} + \dots + c_kv_k## has to add up to ##v_j## so that the total sum is 0, and this can't be achieved if all the ##c_i## are zero
Right. And if there is no linear combination that adds to zero for which the ##c_i## are not all zero, what does that tell us about whether the set of vectors is linearly dependent?
 
  • #5
andrewkirk said:
Right. And if there is no linear combination that adds to zero for which the ##c_i## are not all zero, what does that tell us about whether the set of vectors is linearly dependent?
It means the set of vectors is linearly independent; hence the only solution to ##c_1v_1 + \dots + c_kv_k = 0## is the trivial one.
 
  • #6
Good. Do you now understand how to prove the 'only if' direction?
 
  • #7
andrewkirk said:
Good. Do you now understand how to prove the 'only if' direction?
Yes. What I understand about the biconditional is that we have to prove both directions of the statement: if ##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)##, then the sequence is linearly dependent; and if the sequence is linearly dependent, then ##v_j## is a linear combination of ##(v_1, \dots, v_k) - (v_j)##. The second one I would prove using the contrapositive: if for every ##j##, ##v_j## is not a linear combination of ##(v_1, \dots, v_k) - (v_j)##, then the sequence of vectors is linearly independent. And then I would use proof by contradiction, which is basically the same thing as the first direction.
 

FAQ: Proof: Linear Dependence of Vectors in a Vector Space

1. What is the definition of linear dependence of vectors?

Linear dependence is a property of a collection of vectors: a set of vectors is linearly dependent if at least one of them can be expressed as a linear combination of the others, that is, as a sum of scalar multiples of the remaining vectors in the same vector space.
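
For completeness, the standard definition in symbols: a sequence ##(v_1, \dots, v_k)## is linearly dependent if there exist scalars ##c_1, \dots, c_k##, not all zero, such that

$$c_1 v_1 + c_2 v_2 + \dots + c_k v_k = 0;$$

otherwise the sequence is linearly independent. The two phrasings are equivalent: a nontrivial relation can always be solved for a vector whose coefficient is nonzero.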

2. How do you determine if a set of vectors is linearly dependent or independent?

To determine whether a set of vectors is linearly dependent or independent, you can use the following steps (a numerical version of this check is sketched after the list):

1. Write the vectors as column vectors in a matrix.

2. Use row reduction to put the matrix into row echelon form.

3. If every column of the reduced matrix contains a pivot (equivalently, the rank equals the number of vectors), the vectors are linearly independent. If some column has no pivot, so the rank is smaller than the number of vectors, they are linearly dependent.
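
As a concrete illustration of these steps, here is a minimal Python/NumPy sketch that performs the same check via the matrix rank (the helper name is_linearly_dependent is ours, chosen for illustration):

    import numpy as np

    def is_linearly_dependent(vectors):
        """Return True if the given 1-D arrays are linearly dependent."""
        # Stack the vectors as the columns of a matrix.
        A = np.column_stack(vectors)
        # The columns are dependent exactly when the rank of the matrix
        # is smaller than the number of vectors.
        return np.linalg.matrix_rank(A) < A.shape[1]

    # Example from question 3 below: B = 2A, so the pair is dependent.
    print(is_linearly_dependent([np.array([1, 2]), np.array([2, 4])]))  # True
    print(is_linearly_dependent([np.array([1, 0]), np.array([0, 1])]))  # False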

3. Can a set of two vectors be linearly dependent?

Yes, a set of two vectors can be linearly dependent. This means that one of the vectors can be expressed as a scalar multiple of the other vector. For example, if vector A = [1, 2] and vector B = [2, 4], then vector B can be expressed as 2*A, making them linearly dependent.
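
In the notation of the definition above, this example gives a nontrivial relation that sums to the zero vector:

$$2A - B = 2\begin{bmatrix} 1 \\ 2 \end{bmatrix} - \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.$$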

4. How does linear dependence relate to the span of a set of vectors?

The span of a set of vectors is the set of all possible linear combinations of those vectors. If a set of vectors is linearly dependent, then some of the vectors in the set can be written as linear combinations of the others, so they contribute nothing new: the same span is generated by a proper subset of the vectors. For example, {v1, v2, v1 + v2} has the same span as {v1, v2}.

5. Why is linear dependence important in linear algebra?

Linear dependence is important in linear algebra because it helps us understand the relationships between vectors in a vector space. It also allows us to simplify and solve systems of linear equations. Additionally, linear dependence is used in many applications, such as computer graphics, engineering, and physics.
