Linear independence of three vectors

Salmone
If I've got three vectors ##\vec{a}##, ##\vec{b}## and ##\vec{c}## and ##\vec{a}##, ##\vec{b}## are linearly independent and ##\vec{c}## is linearly independent from ##\vec{a}##, is ##\vec{c}## also linearly independent from ##\vec{b}##?
 
Salmone said:
If I've got three vectors ##\vec{a}##, ##\vec{b}## and ##\vec{c}## and ##\vec{a}##, ##\vec{b}## are linearly independent and ##\vec{c}## is linearly independent from ##\vec{a}##, is ##\vec{c}## also linearly independent from ##\vec{b}##?
You mean assuming ##\vec c \ne \vec b##?
 
  • Like
Likes Salmone
PeroK said:
You mean assuming ##\vec c \ne \vec b##?
Yes.
 
Salmone said:
Yes.
Under what circumstances are two vectors linearly independent? There's a fairly simple criterion.
 
PeroK said:
Under what circumstances are two vectors linearly independent? There's a fairly simple criterion.
If a linear combination of the two is equal to ##\vec{0}## only when both coefficients are equal to ##0##.
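For a concrete instance of that criterion (the example pair here is my own illustration, not from the thread): take ##\vec a = (1, 0)## and ##\vec b = (1, 1)##. Then ##c_1(1,0) + c_2(1,1) = (c_1 + c_2,\, c_2) = (0, 0)## forces ##c_2 = 0## and then ##c_1 = 0##, so this pair is linearly independent.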
 
Salmone said:
Yes.
Can you use the ##\vec c = \vec b ## counterexample to think up a whole family of other counterexamples?
 
  • Like
Likes mfb
Salmone said:
If a linear combination of the two is equal to ##\vec{0}## only when both coefficients are equal to ##0##.
Okay, but for only two vectors that implies something simple.
 
Salmone said:
If I've got three vectors ##\vec{a}##, ##\vec{b}## and ##\vec{c}## and ##\vec{a}##, ##\vec{b}## are linearly independent and ##\vec{c}## is linearly independent from ##\vec{a}##, is ##\vec{c}## also linearly independent from ##\vec{b}##?

PeroK said:
Under what circumstances are two vectors linearly independent? There's a fairly simple criterion.

Salmone said:
If a linear combination of the two is equal to ##\vec{0}## only when both coefficients are equal to ##0##.
What does it mean geometrically if two vectors are linearly independent? I believe @PeroK is trying to get you to think in this direction.
 
  • Like
Likes Salmone
Mark44 said:
What does it mean geometrically if two vectors are linearly independent? I believe @PeroK is trying to get you to think in this direction.
They are orthogonal?
 
  • #10
Salmone said:
They are orthogonal?
There's no such thing as "orthogonal" until you define an inner product on your vector space.

The answer is that ##\vec a## and ##\vec b## are linearly independent as long as one is not a scalar multiple of the other. The only vectors that are linearly dependent with ##\vec a## are vectors of the form ##\lambda \vec a## for some scalar ##\lambda##.
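As a minimal numerical sketch of this criterion (my own illustration; NumPy's `matrix_rank` is just one convenient tool, and the vectors are made up for the example):

```python
import numpy as np

# Two vectors are linearly independent exactly when neither is a scalar
# multiple of the other, i.e. when stacking them gives a rank-2 matrix.
def pair_independent(a, b):
    return np.linalg.matrix_rank(np.array([a, b])) == 2

a = np.array([1.0, 2.0, 3.0])
print(pair_independent(a, 2 * a))                      # False: 2a is a multiple of a
print(pair_independent(a, np.array([1.0, 0.0, 0.0])))  # True: not multiples
```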
 
  • Like
Likes Delta2, FactChecker and Salmone
  • #11
Hi @Salmone, a set of vectors ##\{\vec{u}_1,\vec{u}_2,\ldots,\vec{u}_k\}## is linearly independent ##\Longleftrightarrow## ##\lambda_1\vec{u}_1+\lambda_2\vec{u}_2+\ldots+\lambda_k\vec{u}_k=\vec{0}##, then ##\lambda_1=\lambda_2=\ldots=\lambda_k=0##... Hmm, I think I'm late, :smile:
 
Last edited:
  • #12
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.

The answer is that ##\vec a## and ##\vec b## are linearly independent as long as one is not a scalar multiple of the other. The only vectors that are linearly dependent with ##\vec a## are vectors of the form ##\lambda \vec a## for some scalar ##\lambda##.
Okay, so can we say the same for more than three vectors? For example, if vectors ##\vec{a},\vec{b},\vec{c}## are linearly independent and a fourth vector ##\vec{d}## is linearly independent of ##\vec{a}##, then it is linearly independent of ##\vec{b}## and ##\vec{c}##, provided ##\vec{d} \neq \vec{b}## and ##\vec{d} \neq \vec{c}##?
 
  • #13
Salmone said:
Okay, so can we say the same for more than three vectors? For example, if vectors ##\vec{a},\vec{b},\vec{c}## are linearly independent and a fourth vector ##\vec{d}## is linearly independent of ##\vec{a}##, then it is linearly independent of ##\vec{b}## and ##\vec{c}##, provided ##\vec{d} \neq \vec{b}## and ##\vec{d} \neq \vec{c}##?
It doesn't matter how many vectors you have. Linear independence between any pair of vectors is a simple matter of not being multiples of each other.

It's only when you get to three or more vectors that linear independence becomes non-trivial.
 
  • Like
Likes Delta2 and Salmone
  • #14
PeroK said:
It doesn't matter how many vectors you have. Linear independence between any pair of vectors is a simple matter of not being multiples of each other.

It's only when you get to three or more vectors that linear independence becomes non-trivial.
So the answer is yes? If a vector is linearly independent of another vector, and that one is LI of yet another vector, and so on, are they all LI?
 
  • #15
Salmone said:
So the answer is yes? If a vector is linearly independent of another vector, and that one is LI of yet another vector, and so on, are they all LI?
Definitely not.
 
  • #16
PeroK said:
Definitely not.
Then I don't understand your previous answer, "It doesn't matter how many vectors you have."
 
  • #17
mcastillo356 said:
Hi @Salmone, a set of vectors ##\{\vec{u}_1,\vec{u}_2,\ldots,\vec{u}_k\}## is linearly independent ##\Longleftrightarrow## ##\lambda_1\vec{u}_1+\lambda_2\vec{u}_2+\ldots+\lambda_k\vec{u}_k=\vec{0}##, then ##\lambda_1=\lambda_2=\ldots=\lambda_k=0##... Hmm, I think I'm late, :smile:
That's not quite right. That set of vectors is linearly independent iff the only solution to the equation ##c_1\vec{u}_1 + c_2\vec{u}_2 + \dots + c_k\vec{u}_k = \vec{0}## is ##c_1 = c_2 = \dots = c_k = 0##.

Consider the vectors ##\vec x##, ##2\vec x##, and ##3\vec x##. For the equation ##c_1\vec x + c_2(2\vec x) + c_3(3\vec x) = \vec 0##, note that ##c_1 = c_2 = c_3 = 0## is a solution. Can you conclude that these vectors are linearly independent?
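(Spelling the resolution out, since the thread leaves it implicit: the definition requires ##c_1 = c_2 = c_3 = 0## to be the only solution, and here ##c_1 = 2##, ##c_2 = -1##, ##c_3 = 0## is a nonzero solution, since ##2\vec x - 1\cdot(2\vec x) + 0\cdot(3\vec x) = \vec 0##; so no, one cannot conclude independence.)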
 
  • Love
Likes mcastillo356
  • #18
Salmone said:
So the answer is yes? If a vector is linearly independent of another vector, and that one is LI of yet another vector, and so on, are they all LI?
No. Draw a picture to see why the above isn't true.

What @PeroK was saying is that it's easy to determine whether two vectors are dependent. If they are, they point either in the same direction or in opposite directions; i.e., they are either parallel or antiparallel. That's what being "scalar multiples" of one another means.

If you have three or more vectors it's harder to tell whether they are linearly dependent or linearly independent.
 
  • Like
Likes PeroK and Salmone
  • #19
Mark44 said:
No. Draw a picture to see why the above isn't true.

What @PeroK was saying is that it's easy to determine whether two vectors are dependent. If they are, they point either in the same direction or in opposite directions; i.e., they are either parallel or antiparallel. That's what being "scalar multiples" of one another means.

If you have three or more vectors it's harder to tell whether they are linearly dependent or linearly independent.
Ok and so what is the answer to my first question? Now I'm way more confused.
 
  • #20
Salmone said:
Ok and so what is the answer to my first question? Now I'm way more confused.
We're not going to just tell you the answer, but we'll help you arrive at it. If you draw a picture of three vectors, you'll probably get the answer in short order.
 
  • #21
Mark44 said:
We're not going to just tell you the answer, but we'll help you arrive at it. If you draw a picture of three vectors, you'll probably get the answer in short order.
I drew the three vectors and have no idea what I am supposed to figure out from this.
 
  • #22
You have said that if ##\vec c = \vec b##, then it is not true. In fact, if ##\vec c## is a multiple of ##\vec b##, then it is not true (if this is not clear to you, then you should prove it). In fact, what can you say if ##\vec c## has even a tiny component that is a multiple of ##\vec b##? So what should the answer to the problem be?
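To make that family of counterexamples concrete (a sketch of my own; the specific vectors are invented for illustration):

```python
import numpy as np

def independent(*vectors):
    # A set of vectors is linearly independent iff stacking them gives a
    # matrix whose rank equals the number of vectors.
    return np.linalg.matrix_rank(np.array(vectors)) == len(vectors)

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# Every c = lam * b with lam != 0 is a counterexample to the original
# question: c is linearly independent of a, yet dependent with b.
for lam in [0.5, 2.0, -3.0]:
    c = lam * b
    print(independent(a, c), independent(b, c))  # prints: True False
```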
 
  • #23
FactChecker said:
You have said that if ##\vec c = \vec b##, then it is not true. In fact, if ##\vec c## is a multiple of ##\vec b##, then it is not true (if this is not clear to you, then you should prove it). In fact, what can you say if ##\vec c## has even a tiny component that is a multiple of ##\vec b##? So what should the answer to the problem be?
If ##\vec{c}## has a tiny component that is a multiple of ##\vec{b}##, they are linearly independent.
 
  • #24
Mark44 said:
We're not going to just tell you the answer, but we'll help you arrive at it. If you draw a picture of three vectors, you'll probably get the answer in short order.
What I can imagine is this: if ##\vec{a}## and ##\vec{b}## are linearly independent, then they are not multiples of each other, and if ##\vec{c}## is linearly independent of ##\vec{a}##, the same can be said of those two. BUT since ##\vec{c}## and ##\vec{b}## are different, they cannot be multiples of each other either, so the answer should be yes, right?
 
  • #25
Due to an error on one poster's part, I have deleted that post as well as another post that pointed out the error.
 
  • Like
Likes Haborix and FactChecker
  • #26
Salmone said:
I drew the three vectors and have no idea what I am supposed to figure out from this.
Can you post the picture you drew?
 
  • #27
Salmone said:
What I can imagine is this: if ##\vec{a}## and ##\vec{b}## are linearly independent, then they are not multiples of each other, and if ##\vec{c}## is linearly independent of ##\vec{a}##, the same can be said of those two. BUT since ##\vec{c}## and ##\vec{b}## are different, they cannot be multiples of each other either, so the answer should be yes, right?
Your argument is similar to:
  1. 5 is not a multiple of 2
  2. 5 is not a multiple of 4
  3. 4 isn't equal to 2; therefore, 4 is not a multiple of 2.
See the problem?
 
  • Like
Likes PeroK and mfb
  • #28
Hi PF, this is hard work! :biggrin: (for me, of course)
##\{(0,0,1),(0,0,2),(0,0,3)\}## is dependent, though it could be multiplied by ##\lambda_1=\lambda_2=\lambda_3=0## to give ##\vec{0}##.
Thanks, @Mark44!
Where I wrote "then", it should be ##\iff##.
Quite sure :smile:
 
  • #29
vela said:
Your argument is similar to:
  1. 5 is not a multiple of 2
  2. 5 is not a multiple of 4
  3. 4 isn't equal to 2; therefore, 4 is not a multiple of 2.
See the problem?
No: ##5 \cdot 2/5 = 2##, so they are the same vector; ##2/5## is a scalar. If ##\vec{a}## doesn't have the same direction as ##\vec{b}##, and ##\vec{c}## doesn't have the same direction as ##\vec{a}##, and ##\vec{c} \neq \vec{b}##, how can ##\vec{c}## and ##\vec{b}## have the same direction?
 
  • #30
Mark44 said:
Can you post the picture you drew?
[Attached image: the poster's sketch of three vectors in the plane]

It seems to me that three vectors like these are linearly independent
 
  • #31
Salmone said:
View attachment 301867
It seems to me that three vectors like these are linearly independent
Note that for three vectors linear independence is a test of all three vectors. It cannot be reduced to a test for each pair of vectors.

In your diagram, any of the three vectors can be expressed as a sum of the other two. So, they are linearly dependent. You can have at most two linearly independent vectors in ##\mathbb R^2##.

It's not a question of linear independence of any two vectors. It's a question of all three vectors taken together.
 
  • Like
Likes Salmone
  • #32
PeroK said:
Note that for three vectors linear independence is a test of all three vectors. It cannot be reduced to a test for each pair of vectors.

In your diagram, any of the three vectors can be expressed as a sum of the other two. So, they are linearly dependent. You can have at most two linearly independent vectors in ##\mathbb R^2##.

It's not a question of linear independence of any two vectors. It's a question of all three vectors taken together.
Ok, so in general the answer to my first question is no: if I have 3 or more vectors, I can't check their linear independence just by looking at 2 vectors at a time; I need to check the independence of the whole set.
 
  • #33
Salmone said:
Ok, so in general the answer to my first question is no: if I have 3 or more vectors, I can't check their linear independence just by looking at 2 vectors at a time; I need to check the independence of the whole set.
Precisely.

Note that unless two vectors are "in the same direction", i.e. "essentially the same vector", they are always linearly independent.
 
  • Like
Likes Salmone
  • #34
PeroK said:
Precisely.

Note that unless two vectors are "in the same direction", i.e. "essentially the same vector", they are always linearly independent.
So can the same be said about a basis? A basis is a set made of the maximum number of linearly independent vectors, say ##B=\{\vec{a},\vec{b},\vec{c}\}##. This means that no vector linearly independent of the vectors of the set, taken together, can be left out of the set. But a single vector ##\vec{d}## that is, for example, linearly independent only of ##\vec{a}## doesn't need to be in the set, since the linear independence must hold among all the vectors in the set. And surely the vectors of the vector space outside the basis will be "not in the same direction" as the single vectors inside the basis, otherwise the whole vector space would consist of only the vectors of the basis. Hope the question is clear.
 
  • #35
They are linearly independent in any possible pair, but not as a triplet.

How do I know that without trying to apply the definition of linear independence and seeing where I get?

Well, there are some theorems governing linear independence. One of those theorems says that if the dimension of a vector space is ##n##, then any ##N>n## vectors of the vector space are linearly dependent. Another theorem says that the dimension of the vector space consisting of all the vectors of the Euclidean plane is 2. Here we have ##3>2## vectors of the plane, so I guess you understand how the theorems apply here.
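A quick numerical check of that dimension-counting argument (again a sketch of my own, with arbitrary example vectors):

```python
import numpy as np

# Any 3 vectors in R^2 stack into a 3 x 2 matrix, whose rank is at most 2,
# so by the first theorem above the triple must be linearly dependent.
u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
rank = np.linalg.matrix_rank(np.array([u, v, w]))
print(rank, rank < 3)  # prints: 2 True  -> dependent as a triple
```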
 
  • Like
Likes PhDeezNutz
  • #36
Salmone said:
Hope the question is clear.
I think so. Let's take ##\mathbb R^2## as an example: ##\{(1,0), (0,1)\}## is the usual basis. Any other vector ##(x, y)## can be expressed in this basis:
$$(x, y) = x(1, 0) + y(0,1)$$This can be rewritten as:
$$1(x, y) - x(1, 0) - y(0,1) = 0$$which shows the linear dependence of ##(x, y)##, ##(1,0)##, and ##(0, 1)##.

But, as long as ##y \ne 0##, the vectors ##(x, y)## and ##(1, 0)## form a basis. And, as long as ##x \ne 0##, the vectors ##(x, y)## and ##(0, 1)## form a basis.

In ##\mathbb R^2## any two "different" vectors form a basis. Checking linear independence is easy in ##\mathbb R^2##.

It's not so simple in higher dimensions, because you can have three "different" vectors that are linearly dependent. E.g.
$$(1, 2, 3), (2, 1, 5), (5, 4, 13)$$It takes a bit of work to determine whether they are linearly independent or not.
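(Working that example out here, since the post leaves it as an exercise: matching the first two components gives ##(5, 4, 13) = 1\cdot(1, 2, 3) + 2\cdot(2, 1, 5)##, and the third component checks as well, so these three vectors are in fact linearly dependent. Equivalently, the determinant of the ##3\times 3## matrix with these rows is ##0##.)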
 
  • Like
Likes Salmone
  • #37
PeroK said:
I think so. Let's take ##\mathbb R^2## as an example: ##\{(1,0), (0,1)\}## is the usual basis. Any other vector ##(x, y)## can be expressed in this basis:
$$(x, y) = x(1, 0) + y(0,1)$$This can be rewritten as:
$$1(x, y) - x(1, 0) - y(0,1) = 0$$which shows the linear dependence of ##(x, y)##, ##(1,0)##, and ##(0, 1)##.

But, as long as ##y \ne 0##, the vectors ##(x, y)## and ##(1, 0)## form a basis. And, as long as ##x \ne 0##, the vectors ##(x, y)## and ##(0, 1)## form a basis.

In ##\mathbb R^2## any two "different" vectors form a basis. Checking linear independence is easy in ##\mathbb R^2##.

It's not so simple in higher dimensions, because you can have three "different" vectors that are linearly dependent. E.g.
$$(1, 2, 3), (2, 1, 5), (5, 4, 13)$$It takes a bit of work to determine whether they are linearly independent or not.
I meant another thing: if we have a basis ##B=\{\vec{a},\vec{b},\vec{c}\}## and a vector ##\vec{d}## outside this basis which is linearly independent of ##\vec{a}##, so that they are not multiples of each other, the fact that ##\vec{d}## is outside the basis, despite being linearly independent of a vector of the basis, doesn't mean the basis is not a basis. This is trivial: in ##\mathbb R^2## the canonical basis is ##\{(0,1),(1,0)\}##, and the vector ##(1,1)## is linearly independent of, for example, ##(0,1)## (indeed they are not scalar multiples), but the basis is still ##\{(0,1),(1,0)\}##. I meant this. I think this applies as a general behavior. Is that right?
 
Last edited:
  • #38
Salmone said:
I meant another thing: if we have a basis ##B=\{\vec{a},\vec{b},\vec{c}\}## and a vector ##\vec{d}## outside this basis which is linearly independent of ##\vec{a}##, so that they are not multiples of each other, the fact that ##\vec{d}## is outside the basis, despite being linearly independent of a vector of the basis, doesn't mean the basis is not a basis.
Yes, a basis isn't every vector in the space. The whole point of a basis, for ##\mathbb R^3## say, is that we only need three vectors to span the space. Every other vector is then a linear combination of these three basis vectors. Very few vectors are scalar multiples of the basis vectors. Some are linear combinations of only two basis vectors. But, typically, a "random" vector is a linear combination of all three.

You can formalise this, by noting that any basis vector defines a 1D subspace and any two basis vectors define a 2D subspace. The three basis vectors define the whole space.

In geometrical terms, each vector defines a line; any two linearly independent vectors define a plane; and three linearly independent vectors define 3D space (or a 3D subspace of a higher dimensional space).
 
  • Like
Likes Salmone
  • #39
PeroK said:
Yes, a basis isn't every vector in the space. The whole point of a basis, for ##\mathbb R^3## say, is that we only need three vectors to span the space. Every other vector is then a linear combination of these three basis vectors. Very few vectors are scalar multiples of the basis vectors. Some are linear combinations of only two basis vectors. But, typically, a "random" vector is a linear combination of all three.

You can formalise this, by noting that any basis vector defines a 1D subspace and any two basis vectors define a 2D subspace. The three basis vectors define the whole space.

In geometrical terms, each vector defines a line; any two linearly independent vectors define a plane; and three linearly independent vectors define 3D space (or a 3D subspace of a higher dimensional space).
Ok, and of course there could exist a vector outside the basis that is linearly independent of a vector of the basis, as I said in ##\mathbb R^2##: ##(1,1) \neq \lambda(0,1)## and ##(1,1) \neq \lambda(1,0)##, so ##(1,1)## is linearly independent of the two vectors of the basis individually taken, but IT IS linearly dependent on the whole basis?
 
  • #40
Salmone said:
Ok, and of course there could exist a vector outside the basis that is linearly independent of a vector of the basis, as I said in ##\mathbb R^2##: ##(1,1) \neq \lambda(0,1)## and ##(1,1) \neq \lambda(1,0)##, so ##(1,1)## is linearly independent of the two vectors of the basis individually
That's clear. There are only two vectors in the basis!
Salmone said:
taken, but IT IS linearly dependent on the whole basis?
Yes.

Here's some better terminology.

Let's take any two linearly independent vectors ##\vec a, \vec b##.

The span of ##\vec a## is a line, which is all the scalar multiples of ##\vec a##:
$$span(\vec a) = \{\lambda \vec a: \lambda \in \mathbb R \}$$This is all the vectors that are linearly dependent with ##\vec a##.

The span of ##\vec a, \vec b## is a plane:
$$span(\vec a, \vec b) = \{\lambda \vec a + \mu \vec b: \lambda, \mu \in \mathbb R \}$$The critical thing is that$$span(\vec a, \vec b) \ne span(\vec a) \cup span(\vec b)$$The left hand side is a plane and the right hand side is just two lines (in that plane).

In general, geometrically, you are confusing lines (defined by the individual vectors) with the plane defined by the two vectors taken together.

The geometric view and the algebraic view go together and say the same thing.
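A concrete case of that distinction, reusing the thread's earlier ##\mathbb R^2## example: with ##\vec a = (1,0)## and ##\vec b = (0,1)##, the vector ##(1,1) = \vec a + \vec b## lies in the plane ##span(\vec a, \vec b)## but on neither of the lines ##span(\vec a)## and ##span(\vec b)##.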
 
  • Like
Likes Salmone
  • #41
PeroK said:
That's clear. There are only two vectors in the basis!

Yes.

Here's some better terminology.

Let's take any two linearly independent vectors ##\vec a, \vec b##.

The span of ##\vec a## is a line, which is all the scalar multiples of ##\vec a##:
$$span(\vec a) = \{\lambda \vec a: \lambda \in \mathbb R \}$$This is all the vectors that are linearly dependent with ##\vec a##.

The span of ##\vec a, \vec b## is a plane:
$$span(\vec a, \vec b) = \{\lambda \vec a + \mu \vec b: \lambda, \mu \in \mathbb R \}$$The critical thing is that$$span(\vec a, \vec b) \ne span(\vec a) \cup span(\vec b)$$The left hand side is a plane and the right hand side is just two lines (in that plane).

In general, geometrically, you are confusing lines (defined by the individual vectors) with the plane defined by the two vectors taken together.

The geometric view and the algebraic view go together and say the same thing.
Yes, I understand. I just wanted to be sure that if we consider a basis, or a complete set of vectors of a vector space, the vectors outside the basis (or the complete set) must be linearly dependent on the basis (or the complete set), but can be linearly independent of individual vectors within the basis. In other words, I can take a vector outside the basis and express it as a linear combination of the vectors of the basis, but this vector could still be linearly independent of an individual vector inside the basis, in the sense that ##\vec{v}_1 \neq \lambda \vec{v}_2##, where ##\vec{v}_1## is outside the basis and ##\vec{v}_2## is a vector of the basis. I think this is correct and trivial.
 
  • #42
Salmone said:
I think this is correct and trivial.
Yes it is!
 
  • Like
Likes Salmone
  • #43
There are several algorithms for determining linear independence.
  1. Two vectors ##\vec{a}## and ##\vec{b}## are linearly dependent if their outer product is ##0##.
  2. A set of ##n+1## vectors in an ##n##-dimensional space is linearly dependent.
  3. List the coordinates of the vectors and use Gaussian elimination. If one of the resultant vectors becomes a null vector, they are linearly dependent (see the sketch after this list).
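A sketch of method 3 (one possible implementation; SymPy's `rref` does the row reduction here, and the example vectors are taken from an earlier post in this thread):

```python
from sympy import Matrix

# Rows are the vectors. After row reduction (Gaussian elimination), a zero
# row appears exactly when the vectors are linearly dependent.
m = Matrix([[1, 2, 3],
            [2, 1, 5],
            [5, 4, 13]])
reduced, pivots = m.rref()
print(reduced)               # the last row reduces to all zeros
print(len(pivots) < m.rows)  # True -> linearly dependent
```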
 
  • Like
Likes Salmone and Delta2
  • #44
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.
I believe that "orthogonal" is synonymous with "perpendicular" in the sense that the OP meant.
Salmone said:
View attachment 301867
It seems to me that three vectors like these are linearly independent
Here's a picture of what I had in mind. I don't show the vectors extending out from the origin, but they can be moved around so that they are.

Here a and b are linearly independent, and a and c are linearly independent, but b and c are parallel, so those two vectors are linearly dependent.
[Attached image: vectors.png, showing ##\vec b## and ##\vec c## parallel, with ##\vec a## not parallel to either]
 
  • Like
Likes Salmone
  • #45
Mark44 said:
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.
I believe that "orthogonal" is synonymous with "perpendicular" in the sense that the OP meant.

Note that "linear independence" is a property of the vector space alone and
is independent of (i.e. does not depend on) the inner product.
(As @PeroK says, the inner product is extra structure on top of a vector space that must be specified if one wants an inner product. Such structure does not automatically accompany a vector space.)

Svein said:
There are several algorithms for determining linear independence.
3. List the coordinates of the vectors and use Gaussian elimination. If one of the resultant vectors becomes a null vector, they are linearly dependent.

The determinant is a useful tool for detecting linear dependence.
It helps to think of the determinant as the [signed] volume of a [hyper-]parallelepiped.
(The determinant does not rely on an inner product.)

Svein said:
A set of ##n+1## vectors in an ##n##-dimensional space is linearly dependent.
This is a useful geometric interpretation that does not require calculating the determinant.
 
Last edited:
  • Like
Likes Salmone
  • #46
In general, for finite-dimensional vector spaces, there exists a basis. A basis is:

1) A set S of linearly independent vectors, such that

2) S spans (generates) the vector space.

What we mean by the span of S is the set of all linear combinations of vectors in S.

In the case where Span{S} = V, every vector in V can be written as a linear combination of the elements of S.

Moreover, given a finite-dimensional vector space, one can prove that any linearly independent set of vectors (also called a list) can be extended so that the new list (containing linearly independent vectors not found in the original list) becomes a basis of V, i.e., this new list with the added vectors spans V.

We can also prove that any list of vectors which spans V (in a finite-dimensional vector space) can be reduced to a linearly independent list which still spans V, and which therefore forms a basis of V.

These theorems are sometimes called the plus and minus theorems, respectively.

Now, we can also prove that the length of a linearly independent list is always less than or equal to the length of a list which spans V. Here V is finite-dimensional.

This can be used further to prove that the length of a basis of a finite-dimensional space is the same no matter what basis we use for V.

To answer your question: it is an ill-posed question. What is the vector space we are working in? Your answer can be yes/no depending on the dimension of V.

In general, the easiest, but often the longest, way of showing linear independence is to use the definition of linear independence posted a few comments ago by mcastillo356.

This requires solving a system of equations using whatever method you like (basic algebra, matrices, determinants).
What is cool about Gaussian elimination is that we can also use it to show whether a list of vectors spans V.

Just from reading the thread, it appears that your difficulty comes from not knowing the definition of linear independence/dependence. That is something easily remedied by reading an introductory textbook on linear algebra.

My favorite intro LA book is the one written by Anton: Elementary Linear Algebra.
Another book, which is much simpler than Anton and maybe a bit more intuitive (it sticks to ##\mathbb R^3##, I believe), is the one by Paul Shields. A basic book, but it's well written...

Both can be found extremely cheap used...
 
  • Like
Likes Delta2
