Linear dependence of two vectors

blue_leaf77
Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on both ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c## with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
I need to check how I can satisfy the equation
$$C_1 v_a + C_2 v_d = 0$$
If that's only possible when ##C_1 = C_2 = 0##, then ##v_a## and ##v_d## are linearly independent, but I don't think that's necessarily the case.
 
blue_leaf77 said:
Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on both ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c## with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
I need to check how I can satisfy the equation
$$C_1 v_a + C_2 v_d = 0$$
If that's only possible when ##C_1 = C_2 = 0##, then ##v_a## and ##v_d## are linearly independent, but I don't think that's necessarily the case.

Do you mean ##v_c## is linearly independent of both ##v_a## and ##v_b##?

If so, it's clear that ##v_a## and ##v_d## are linearly independent.
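Spelled out, assuming ##v_a##, ##v_b## and ##v_c## together form a linearly independent set (which is one reading of "independent of both"):
$$C_1 v_a + C_2 v_d = C_1 v_a + C_2 v_b + C_2 c\, v_c = 0 \;\Longrightarrow\; C_1 = C_2 = C_2 c = 0,$$
so only the trivial combination gives zero.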
 
In general I want ##v_c## to be an arbitrary vector (in the same vector space as ##v_a## and ##v_b##).
Actually this question arose when I read about the rank of a matrix. The book I'm reading says that when a row of a matrix ##A##, say ##v_i##, is replaced by ##v_i + cv_j##, where ##c## is a constant and ##v_j## is another row of the same matrix, then the rank of the newly formed matrix (after the replacement) is the same as that of ##A##. That's the same as saying that the number of linearly independent rows in the new matrix is the same as in the old matrix ##A##. I'm trying to prove that this is true.
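To make the statement concrete, here is a toy example I made up of the kind of replacement the book describes: replacing the second row of
$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \quad \text{by} \quad v_2 + 2v_1 \quad \text{gives} \quad \begin{pmatrix} 1 & 2 \\ 5 & 8 \end{pmatrix},$$
and both matrices have rank ##2##; the same happens if I start from a rank-##1## matrix such as ##\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}##.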
 
blue_leaf77 said:
In general I want ##v_c## to be an arbitrary vector (in the same vector space as ##v_a## and ##v_b##).
Actually this question arose when I read about the rank of a matrix. The book I'm reading says that when a row of a matrix ##A##, say ##v_i##, is replaced by ##v_i + cv_j##, where ##c## is a constant and ##v_j## is another row of the same matrix, then the rank of the newly formed matrix (after the replacement) is the same as that of ##A##. That's the same as saying that the number of linearly independent rows in the new matrix is the same as in the old matrix ##A##. I'm trying to prove that this is true.

In that case, you can consider the span of ##v_i## and ##v_j## and the span of ##v_i + cv_j## and ##v_j##.

These are equal, so the span of the rows is unchanged.
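Writing out one direction explicitly: any linear combination of ##v_i + cv_j## and ##v_j## is already a linear combination of ##v_i## and ##v_j##, since
$$a(v_i + cv_j) + b\,v_j = a\,v_i + (ac + b)\,v_j.$$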
 
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to the linearly independent set of vectors which determines the rank of ##A##. In the case where they are not linearly independent, does it still make sense to say that they span some space?
 
blue_leaf77 said:
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to the linearly independent set of vectors which determines the rank of ##A##. In the case where they are not linearly independent, does it still make sense to say that they span some space?

Yes, absolutely. There are lots of ways to prove this for arbitrary vectors. One way is simply to equate an arbitrary linear combination in each case. Another is to think of the two vectors in terms of the basis vectors needed to represent them, for whatever basis you choose.
 
blue_leaf77 said:
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to the linearly independent set of vectors which determines the rank of ##A##. In the case where they are not linearly independent, does it still make sense to say that they span some space?

Yes, you can prove this by induction. If ##v_1, v_2## are independent, then show that the set ##\{av_1 + bv_2\}## is a subspace, for ##a, b## in the base field ##K##. Now you can generalize to ##n## vectors.
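As a quick sketch of why that set is a subspace: it contains the zero vector (take ##a = b = 0##) and is closed under addition and scaling, since
$$(a_1 v_1 + b_1 v_2) + (a_2 v_1 + b_2 v_2) = (a_1 + a_2)v_1 + (b_1 + b_2)v_2, \qquad \lambda(a v_1 + b v_2) = (\lambda a)v_1 + (\lambda b)v_2.$$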
 
blue_leaf77 said:
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to the linearly independent set of vectors which determines the rank of ##A##. In the case where they are not linearly independent, does it still make sense to say that they span some space?
Any set of vectors "spans some space". If they happen to be linearly independent, then they form a basis for that space. If they are dependent, then some vectors in the set can be written as linear combinations of the others. Dropping those vectors from the set will give a basis for the space.
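For example, ##\{(1,0), (0,1), (1,1)\}## spans ##\mathbb{R}^2## but is linearly dependent, since ##(1,1) = (1,0) + (0,1)##; dropping ##(1,1)## leaves the basis ##\{(1,0), (0,1)\}## of the same space.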
 
PeroK said:
In that case, you can consider the span of ##v_i## and ##v_j## and the span of ##v_i + cv_j## and ##v_j##.
These are equal, so the span of the rows is unchanged.
OK, I think this is caused by my lack of understanding of the definition of the span of a set of vectors. I just found, both in Wikipedia and in my textbook, that
"The set of all linear combinations of ##(v_1, \dots , v_m)## is called the span of ##(v_1, \dots , v_m)##"
It doesn't mention there that the vectors in that list must be linearly independent. Now I agree with what you said.

But then, if the span of its rows remains the same, what does that imply about the rank of ##A##? Does it automatically imply that the number of linearly independent rows in ##A## also stays the same?
 
blue_leaf77 said:
OK, I think this is caused by my lack of understanding of the definition of the span of a set of vectors. I just found, both in Wikipedia and in my textbook, that
"The set of all linear combinations of ##(v_1, \dots , v_m)## is called the span of ##(v_1, \dots , v_m)##"
It doesn't mention there that the vectors in that list must be linearly independent. Now I agree with what you said.

But then, if the span of its rows remains the same, what does that imply about the rank of ##A##? Does it automatically imply that the number of linearly independent rows in ##A## also stays the same?

The row rank of a matrix is the dimension of the span of its rows = the number of linearly independent rows = the number of basis vectors needed to represent the span of all the rows.

You can prove the result you want using any of these equivalent properties, but perhaps the last is the easiest, since:

The span of ##v_i + cv_j## and ##v_j## requires precisely the same set of basis vectors as the span of ##v_i## and ##v_j##, and this doesn't affect the basis vectors required to represent the span of the other row vectors. Hence, the row rank is unchanged by this row operation.

PS In the end, however you prove it, it all hinges on the fact that all linear combinations of ##v_i + cv_j## and ##v_j## are linear combinations of ##v_i## and ##v_j##, and vice versa.
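For the "vice versa" direction one can write, for instance,
$$a\,v_i + b\,v_j = a(v_i + cv_j) + (b - ac)\,v_j,$$
so the row space, and hence its dimension (the row rank), is the same before and after the replacement.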
 
Thanks, that really helps.
 
PeroK said:
the number of linearly independent rows = the number of basis vectors needed to represent the span of all the rows.
Does this mean that I can represent any row as a linear combination of the linearly independent rows?
 
blue_leaf77 said:
Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on both ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c## with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
Consider the simplest case: ##v_a = (1,0)## and ##v_b = (0,1)##.
Define ##v_c = v_a - v_b## and ##v_d = v_b + 1\cdot v_c = v_a##. Then ##v_d## equals ##v_a##, so ##v_a## and ##v_d## are certainly not linearly independent.
 