Linear independence of three vectors

Summary:
For three vectors to be linearly independent, it is not sufficient to check the independence of each pair; all vectors must be considered together. If two vectors are linearly independent, the third vector must also not be expressible as a linear combination of the first two. A vector can be independent of one vector while being dependent on another, highlighting the complexity of linear independence in higher dimensions. The discussion emphasizes that linear independence requires a holistic approach rather than pairwise checks. Therefore, the initial question about the independence of vector c from b, given its independence from a, is answered negatively; independence must be verified among all vectors collectively.
  • #31
Salmone said:
View attachment 301867
It seems to me that three vectors like these are linearly independent
Note that for three vectors linear independence is a test of all three vectors. It cannot be reduced to a test for each pair of vectors.

In your diagram, any of the three vectors can be expressed as a sum of the other two. So, they are linearly dependent. You can have at most two linearly independent vectors in ##\mathbb R^2##.

It's not a question of linear independence of any two vectors. It's a question of all three vectors taken together.
 
  • #32
PeroK said:
Note that for three vectors linear independence is a test of all three vectors. It cannot be reduced to a test for each pair of vectors.

In your diagram, any of the three vectors can be expressed as a sum of the other two. So, they are linearly dependent. You can have at most two linearly independent vectors in ##\mathbb R^2##.

It's not a question of linear independence of any two vectors. It's a question of all three vectors taken together.
OK, so in general the answer to my first question is no: if I have 3 or more vectors I can't check their linear independence just by looking at 2 vectors at a time; I need to check the independence of the whole set.
 
  • #33
Salmone said:
OK, so in general the answer to my first question is no: if I have 3 or more vectors I can't check their linear independence just by looking at 2 vectors at a time; I need to check the independence of the whole set.
Precisely.

Note that unless two vectors are "in the same direction", i.e. "essentially the same vector", they are always linearly independent.
 
  • #34
PeroK said:
Precisely.

Note that unless two vectors are "in the same direction", i.e. "essentially the same vector", they are always linearly independent.
So can the same be said about a basis? A basis is a set made of the maximum number of linearly independent vectors, say ##B=\{\vec{a},\vec{b},\vec{c}\}##. This means that no vector linearly independent of the vectors of the set can be left out of the set; but a single vector ##\vec{d}##, for example one linearly independent only of ##\vec{a}##, doesn't need to be in the set, since the linear independence must hold among all the vectors in the set, and for sure the vectors of the vector space outside the basis will not be "in the same direction" as the individual vectors inside the basis, otherwise the whole vector space would consist of only the vectors of the basis. Hope the question is clear.
 
  • #35
They are linearly independent in any possible pair, but not as a triplet.

How do I know that without trying to apply the definition of linear independence and seeing where I get?

Well, there are some theorems governing linear independence. One of those theorems says that if the dimension of a vector space is ##n##, then any ##N>n## vectors of the vector space are linearly dependent. Another theorem says that the vector space consisting of all the vectors of a Euclidean plane has dimension 2. Here we have ##3>2## vectors of the plane, so I guess you can see how the theorems apply here.
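As a concrete illustration (with specific plane vectors chosen here, not the ones in the attachment): take ##\vec a = (1,0)##, ##\vec b = (0,1)## and ##\vec c = (2,3)##. Then
$$2\vec a + 3\vec b - \vec c = \vec 0,$$a non-trivial linear combination giving the zero vector, so the three vectors are linearly dependent, exactly as the ##3 > 2## theorem guarantees.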
 
  • #36
Salmone said:
Hope the question is clear.
I think so. If we take ##\mathbb R^2## as an example, ##\{(1,0), (0,1) \}## is the usual basis. Any other vector ##(x, y)## can be expressed in this basis:
$$(x, y) = x(1, 0) + y(0,1)$$This can be rewritten as:
$$1(x, y) - x(1, 0) - y(0,1) = 0$$Which shows the linear dependence of ##(x, y), (1,0)## and ##(0, 1)##.

But, as long as ##y \ne 0##, the vectors ##(x, y)## and ##(1, 0)## form a basis. And, as long as ##x \ne 0##, the vectors ##(x, y)## and ##(0, 1)## form a basis.

In ##\mathbb R^2## any two "different" vectors form a basis. Checking linear independence is easy in ##\mathbb R^2##.

It's not so simple in higher dimensions, because you can have three "different" vectors that are linearly dependent. E.g.
$$(1, 2, 3), (2, 1, 5), (5, 4, 13)$$It takes a bit of work to determine whether they are linearly independent or not.
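One way to settle it, sketched here as an illustration, is the determinant test: put the vectors as the rows of a matrix and expand.
$$\det\begin{pmatrix} 1 & 2 & 3 \\ 2 & 1 & 5 \\ 5 & 4 & 13 \end{pmatrix} = 1(13 - 20) - 2(26 - 25) + 3(8 - 5) = -7 - 2 + 9 = 0,$$so these three vectors are in fact linearly dependent; indeed ##(5, 4, 13) = (1, 2, 3) + 2(2, 1, 5)##.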
 
  • #37
PeroK said:
I think so. If we take ##\mathbb R^2## as an example, ##\{(1,0), (0,1) \}## is the usual basis. Any other vector ##(x, y)## can be expressed in this basis:
$$(x, y) = x(1, 0) + y(0,1)$$This can be rewritten as:
$$1(x, y) - x(1, 0) - y(0,1) = 0$$Which shows the linear dependence of ##(x, y), (1,0)## and ##(0, 1)##.

But, as long as ##y \ne 0##, the vectors ##(x, y)## and ##(1, 0)## form a basis. And, as long as ##x \ne 0##, the vectors ##(x, y)## and ##(0, 1)## form a basis.

In ##\mathbb R^2## any two "different" vectors form a basis. Checking linear independence is easy in ##\mathbb R^2##.

It's not so simple in higher dimensions, because you can have three "different" vectors that are linearly dependent. E.g.
$$(1, 2, 3), (2, 1, 5), (5, 4, 13)$$It takes a bit of work to determine whether they are linearly independent or not.
I meant another thing: if we have a basis ##B=\{\vec{a},\vec{b},\vec{c}\}## and a vector ##\vec{d}## outside this basis which is linearly independent of ##\vec{a}##, so that they are not scalar multiples, the fact that ##\vec{d}## is outside the basis, despite being linearly independent of a vector of the basis, doesn't mean the basis is not a basis. This is trivial: in ##\mathbb R^2## the canonical basis is ##\{(0,1),(1,0)\}##, and the vector ##(1,1)## is linearly independent of, for example, ##(0,1)## (in fact they are not scalar multiples), but the basis is still ##\{(0,1),(1,0)\}##. I meant this. I think this holds in general. Is it right?
 
  • #38
Salmone said:
I meant another thing: if we have a basis ##B=\{\vec{a},\vec{b},\vec{c}\}## and a vector ##\vec{d}## outside this basis which is linearly independent of ##\vec{a}##, so that they are not scalar multiples, the fact that ##\vec{d}## is outside the basis, despite being linearly independent of a vector of the basis, doesn't mean the basis is not a basis.
Yes, a basis isn't every vector in the space. The whole point of a basis, for ##\mathbb R^3## say, is that we only need three vectors to span the space. Every other vector is then a linear combination of these three basis vectors. Very few vectors are scalar multiples of the basis vectors. Some are linear combinations of only two basis vectors. But, typically, a "random" vector is a linear combination of all three.

You can formalise this, by noting that any basis vector defines a 1D subspace and any two basis vectors define a 2D subspace. The three basis vectors define the whole space.

In geometrical terms, each vector defines a line; any two linearly independent vectors define a plane; and three linearly independent vectors define 3D space (or a 3D subspace of a higher dimensional space).
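For example, with the standard basis vectors of ##\mathbb R^3## (a concrete choice of basis, used here only for illustration): ##span((1,0,0))## is the ##x##-axis, a line; ##span((1,0,0),(0,1,0))## is the ##xy##-plane; and ##span((1,0,0),(0,1,0),(0,0,1))## is all of ##\mathbb R^3##.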
 
  • #39
PeroK said:
Yes, a basis isn't every vector in the space. The whole point of a basis, for ##\mathbb R^3## say, is that we only need three vectors to span the space. Every other vector is then a linear combination of these three basis vectors. Very few vectors are scalar multiples of the basis vectors. Some are linear combinations of only two basis vectors. But, typically, a "random" vector is a linear combination of all three.

You can formalise this, by noting that any basis vector defines a 1D subspace and any two basis vectors define a 2D subspace. The three basis vectors define the whole space.

In geometrical terms, each vector defines a line; any two linearly independent vectors define a plane; and three linearly independent vectors define 3D space (or a 3D subspace of a higher dimensional space).
Ok, and of course there could exist a vector outside the basis that is linearly independent of a vector of the basis, as I said in ##\mathbb R^2##: ##(1,1) \neq \lambda(0,1)## and ##(1,1) \neq \lambda(1,0)##, so ##(1,1)## is linearly independent of the two vectors of the basis taken individually, but IT IS linearly dependent on the whole basis?
 
  • #40
Salmone said:
Ok, and of course there could exist a vector outside the basis that is linearly independent of a vector of the basis, as I said in ##\mathbb R^2##: ##(1,1) \neq \lambda(0,1)## and ##(1,1) \neq \lambda(1,0)##, so ##(1,1)## is linearly independent of the two vectors of the basis taken individually,
That's clear. There are only two vectors in the basis!
Salmone said:
but IT IS linearly dependent on the whole basis?
Yes.

Here's some better terminology.

Let's take any two linearly independent vectors ##\vec a, \vec b##.

The span of ##\vec a## is a line, which is all the scalar multiples of ##\vec a##:
$$span(\vec a) = \{\lambda \vec a: \lambda \in \mathbb R \}$$This is all the vectors that are linearly dependent with ##\vec a##.

The span of ##\vec a, \vec b## is a plane:
$$span(\vec a, \vec b) = \{\lambda \vec a + \mu \vec b: \lambda, \mu \in \mathbb R \}$$The critical thing is that$$span(\vec a, \vec b) \ne span(\vec a) \cup span(\vec b)$$The left hand side is a plane and the right hand side is just two lines (in that plane).
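For example, in ##\mathbb R^2## with the concrete choice ##\vec a = (1,0)## and ##\vec b = (0,1)## (numbers chosen here for illustration): the vector ##(1,1) = \vec a + \vec b## lies in ##span(\vec a, \vec b)##, but it lies in neither ##span(\vec a)## nor ##span(\vec b)##, since it is not a scalar multiple of either vector, so it is not in the union of the two lines.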

In general, geometrically, you are confusing lines (defined by the individual vectors) with the plane defined by the two vectors taken together.

The geometric view and the algebraic view go together and say the same thing.
 
  • #41
PeroK said:
That's clear. There are only two vectors in the basis!

Yes.

Here's some better terminology.

Let's take any two linearly independent vectors ##\vec a, \vec b##.

The span of ##\vec a## is a line, which is all the scalar multiples of ##\vec a##:
$$span(\vec a) = \{\lambda \vec a: \lambda \in \mathbb R \}$$This is all the vectors that are linearly dependent with ##\vec a##.

The span of ##\vec a, \vec b## is a plane:
$$span(\vec a, \vec b) = \{\lambda \vec a + \mu \vec b: \lambda, \mu \in \mathbb R \}$$The critical thing is that$$span(\vec a, \vec b) \ne span(\vec a) \cup span(\vec b)$$The left hand side is a plane and the right hand side is just two lines (in that plane).

In general, geometrically, you are confusing lines (defined by the individual vectors) with the plane defined by the two vectors taken together.

The geometric view and the algebraic view go together and say the same thing.
Yes, I understand. I just wanted to be sure that if we consider a basis, or a complete set of vectors, of a vector space, the vectors outside the basis or the complete set must be linearly dependent on the basis (or the complete set), but can be linearly independent of individual vectors within the basis. In other words, I can take a vector outside the basis and express it as a linear combination of the vectors of the basis, but this vector could still be linearly independent of an individual vector inside the basis, in the sense that ##\vec{v_1} \neq \lambda \vec{v_2}##, where ##\vec{v_1}## is outside the basis and ##\vec{v_2}## is a vector of the basis. I think this is correct and trivial.
 
  • #42
Salmone said:
I think this is correct and trivial.
Yes it is!
 
  • #43
There are several algorithms for determining linear independence.
  1. Two vectors ##\vec{a}## and ##\vec{b}## are linearly dependent if their outer (wedge) product is 0 (in ##\mathbb R^3##, if their cross product is the zero vector)
  2. A set of ##n+1## vectors in an ##n##-dimensional space is always linearly dependent
  3. List the coordinates of the vectors as the rows of a matrix and use Gaussian elimination. If one of the resulting rows becomes a null vector, the vectors are linearly dependent (see the sketch after this list).
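A minimal sketch of the third test, assuming NumPy is available (the function name, the rank-based shortcut standing in for explicit row reduction, and the example vectors are choices made here, not part of the post):

```python
import numpy as np

def linearly_independent(vectors):
    """Return True if the given vectors are linearly independent.

    The vectors are stacked as the rows of a matrix; they are independent
    exactly when the rank of that matrix equals the number of vectors.
    (NumPy's matrix_rank plays the role of the row-reduction test:
    a rank deficit means some row reduces to the null vector.)
    """
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == len(vectors)

# Three vectors in the plane are always dependent (3 > 2).
print(linearly_independent([(1, 0), (0, 1), (1, 1)]))            # False
# Two non-parallel plane vectors are independent.
print(linearly_independent([(1, 0), (1, 1)]))                    # True
# The R^3 example from post #36 turns out to be dependent.
print(linearly_independent([(1, 2, 3), (2, 1, 5), (5, 4, 13)]))  # False
```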
 
  • #44
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.
I believe that "orthogonal" is synonymous with "perpendicular" in the sense that the OP meant.
Salmone said:
View attachment 301867
It seems to me that three vectors like these are linearly independent
Here's a picture of what I had in mind. I don't show the vectors extending out from the origin, but they can be moved around so that they are.

Here a and b are linearly independent, and a and c are linearly independent, but b and c are parallel, so those two vectors are linearly dependent.
[Attached image: vectors.png]
 
  • #45
Mark44 said:
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.
I believe that "orthogonal" is synonymous with "perpendicular" in the sense that the OP meant.

Note that "linear independence" is a property of the vector space alone and
is independent of (i.e. does not depend on) the inner product.
(As @PeroK says, the inner product is extra structure on top of a vector space that must be specified if one wants an inner product. Such structure does not automatically accompany a vector space.)

Svein said:
There are several algorithms for determining linear independence.
3. List the coordinates of the vectors as the rows of a matrix and use Gaussian elimination. If one of the resulting rows becomes a null vector, the vectors are linearly dependent.

The determinant is a useful quantity for determining linear dependence.
It is useful to think of the determinant as the [signed] volume of a [hyper-]parallelepiped.
(The determinant does not rely on an inner product.)

Svein said:
A set of ##n+1## vectors in an ##n##-dimensional space is always linearly dependent
This is a useful criterion that does not require calculating the determinant.
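For instance, in two dimensions (a standard fact, stated here for concreteness):
$$\det\begin{pmatrix} a_1 & a_2 \\ b_1 & b_2 \end{pmatrix} = a_1 b_2 - a_2 b_1$$is the signed area of the parallelogram spanned by ##\vec a = (a_1, a_2)## and ##\vec b = (b_1, b_2)##; it is zero exactly when the two vectors are parallel, i.e. linearly dependent.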
 
  • #46
In general, for finite-dimensional vector spaces, there exists a basis. What a basis is:

1) A set S of linearly independent vectors.

2) S spans (generates) the vector space.

What we mean by the span of S is the set of all linear combinations of vectors in S.

In the case where Span(S) = V, every vector in V can be written as a linear combination of the elements of S.

Moreover, given a finite-dimensional vector space V, one can prove that any linearly independent set of vectors (also called a list) can be extended so that the new list (containing linearly independent vectors not found in the original list) becomes a basis of V. I.e., this new list, with the added vectors, spans V.

We can also prove that any list of vectors which spans V (in a finite-dimensional vector space) can be reduced to a linearly independent list which still spans V, and which therefore forms a basis of V.

These theorems are sometimes called the plus and minus theorems, respectively.

Now, we can also prove that the length of a linearly independent list is always less than or equal to the length of a list which spans V (here V is finite-dimensional).

This can further be used to prove that the length of a basis of a finite-dimensional space is the same no matter what basis we use for V.

To answer your question: it is an ill-posed question. What is the vector space we are working in? The answer can be yes or no depending on the dimension of V.

In general, the easiest, but often the longest, way of showing linear independence is to use the definition of linear independence posted a few comments ago (by Castillo?).

This requires solving a system of equations using whatever method you like (ordinary algebra, matrices, the determinant).
What is cool about Gaussian elimination is that we can also use it to show whether a list of vectors spans V.

Just from reading the thread, it appears that your issue is caused by not knowing the definition of linear independence/dependence, something that can be easily remedied by reading an introductory textbook on linear algebra.

My favorite intro LA book is the one written by Anton: Elementary Linear Algebra.
Another book, which is much simpler than Anton and maybe a bit more intuitive (it sticks to ##\mathbb R^3##?), is the one by Paul Shields. A basic book, but it's well written...

Both can be found used very cheaply...
 
