Linear independence of three vectors

In summary: Consider the vectors ##x##, ##2x##, and ##3x##. I notice that for the equation ##c_1 x + c_2(2x) + c_3(3x) = 0##, ##c_1 = c_2 = c_3 = 0## is a solution. Can you conclude that these vectors are linearly independent? No, they are not linearly independent: the trivial solution always exists, and independence requires it to be the only solution, whereas here, for example, ##c_1 = 2, c_2 = -1, c_3 = 0## also works.
  • #36
Salmone said:
Hope the question is clear.
I think so. Take ##\mathbb R^2## as an example: ##\{(1,0), (0,1)\}## is the usual basis. Any other vector ##(x, y)## can be expressed in this basis:
$$(x, y) = x(1, 0) + y(0,1)$$This can be rewritten as:
$$1(x, y) - x(1, 0) - y(0,1) = 0$$which shows the linear dependence of ##(x, y)##, ##(1,0)## and ##(0, 1)##.

But, as long as ##y \ne 0##, the vectors ##(x, y)## and ##(1, 0)## form a basis. And, as long as ##x \ne 0##, the vectors ##(x, y)## and ##(0, 1)## form a basis.

In ##\mathbb R^2## any two "different" vectors (vectors that are not scalar multiples of each other) form a basis. Checking linear independence is easy in ##\mathbb R^2##.

It's not so simple in higher dimensions, because you can have three "different" vectors that are linearly dependent. E.g.
$$(1, 2, 3), (2, 1, 5), (5, 4, 13)$$It takes a bit of work to determine whether they are linearly independent or not.
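One way to do that bit of work, with the arithmetic sketched out: look for scalars ##a, b## with ##a(1,2,3) + b(2,1,5) = (5,4,13)##. The first two components give ##a + 2b = 5## and ##2a + b = 4##, so ##a = 1## and ##b = 2##, and the third component checks out: ##3(1) + 5(2) = 13##. Hence $$(5, 4, 13) = (1, 2, 3) + 2(2, 1, 5),$$ and these three vectors are in fact linearly dependent.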
 
  • #37
PeroK said:
I think so. Take ##\mathbb R^2## as an example: ##\{(1,0), (0,1)\}## is the usual basis. Any other vector ##(x, y)## can be expressed in this basis:
$$(x, y) = x(1, 0) + y(0,1)$$This can be rewritten as:
$$1(x, y) - x(1, 0) - y(0,1) = 0$$which shows the linear dependence of ##(x, y)##, ##(1,0)## and ##(0, 1)##.

But, as long as ##y \ne 0##, the vectors ##(x, y)## and ##(1, 0)## form a basis. And, as long as ##x \ne 0##, the vectors ##(x, y)## and ##(0, 1)## form a basis.

In ##\mathbb R^2## any two "different" vectors (vectors that are not scalar multiples of each other) form a basis. Checking linear independence is easy in ##\mathbb R^2##.

It's not so simple in higher dimensions, because you can have three "different" vectors that are linearly dependent. E.g.
$$(1, 2, 3), (2, 1, 5), (5, 4, 13)$$It takes a bit of work to determine whether they are linearly independent or not.
I meant something else: if we have a basis ##B=\{\vec{a},\vec{b},\vec{c}\}## and a vector ##\vec{d}## outside this basis that is linearly independent of ##\vec{a}## (so that they are not scalar multiples of each other), the fact that ##\vec{d}## is outside the basis, even though it is linearly independent of one of the basis vectors, doesn't mean the basis is not a basis. This is trivial: in ##\mathbb R^2## the canonical basis is ##(0,1), (1,0)##, and the vector ##(1,1)## is linearly independent of, for example, ##(0,1)## (indeed they are not scalar multiples), but the basis is still ##(0,1), (1,0)##. That is what I meant. I think this holds in general. Is that right?
 
  • #38
Salmone said:
I meant something else: if we have a basis ##B=\{\vec{a},\vec{b},\vec{c}\}## and a vector ##\vec{d}## outside this basis that is linearly independent of ##\vec{a}## (so that they are not scalar multiples of each other), the fact that ##\vec{d}## is outside the basis, even though it is linearly independent of one of the basis vectors, doesn't mean the basis is not a basis.
Yes, a basis doesn't contain every vector in the space. The whole point of a basis, for ##\mathbb R^3## say, is that we only need three vectors to span the space. Every other vector is then a linear combination of these three basis vectors. Very few vectors are scalar multiples of the basis vectors. Some are linear combinations of only two basis vectors. But, typically, a "random" vector is a linear combination of all three.

You can formalise this, by noting that any basis vector defines a 1D subspace and any two basis vectors define a 2D subspace. The three basis vectors define the whole space.

In geometrical terms, each vector defines a line; any two linearly independent vectors define a plane; and three linearly independent vectors define 3D space (or a 3D subspace of a higher dimensional space).
 
  • #39
PeroK said:
Yes, a basis doesn't contain every vector in the space. The whole point of a basis, for ##\mathbb R^3## say, is that we only need three vectors to span the space. Every other vector is then a linear combination of these three basis vectors. Very few vectors are scalar multiples of the basis vectors. Some are linear combinations of only two basis vectors. But, typically, a "random" vector is a linear combination of all three.

You can formalise this, by noting that any basis vector defines a 1D subspace and any two basis vectors define a 2D subspace. The three basis vectors define the whole space.

In geometrical terms, each vector defines a line; any two linearly independent vectors define a plane; and three linearly independent vectors define 3D space (or a 3D subspace of a higher dimensional space).
OK, and of course there could exist a vector outside the basis that is linearly independent of a vector of the basis, as I said in ##\mathbb R^2##: ##(1,1) \neq \lambda(0,1)## and ##(1,1) \neq \lambda(1,0)##, so ##(1,1)## is linearly independent of each of the two basis vectors taken individually, but it IS linearly dependent on the basis as a whole?
 
  • #40
Salmone said:
OK, and of course there could exist a vector outside the basis that is linearly independent of a vector of the basis, as I said in ##\mathbb R^2##: ##(1,1) \neq \lambda(0,1)## and ##(1,1) \neq \lambda(1,0)##, so ##(1,1)## is linearly independent of each of the two basis vectors taken individually
That's clear. There are only two vectors in the basis!
Salmone said:
but it IS linearly dependent on the basis as a whole?
Yes.

Here's some better terminology.

Let's take any two linearly independent vectors ##\vec a, \vec b##.

The span of ##\vec a## is a line, which is all the scalar multiples of ##\vec a##:
$$span(\vec a) = \{\lambda \vec a: \lambda \in \mathbb R \}$$This is all the vectors that are linearly dependent with ##\vec a##.

The span of ##\vec a, \vec b## is a plane:
$$span(\vec a, \vec b) = \{\lambda \vec a + \mu \vec b: \lambda, \mu \in \mathbb R \}$$The critical thing is that$$span(\vec a, \vec b) \ne span(\vec a) \cup span(\vec b)$$The left hand side is a plane and the right hand side is just two lines (in that plane).

In general, geometrically, you are confusing lines (defined by the individual vectors) with the plane defined by the two vectors taken together.

The geometric view and the algebraic view go together and say the same thing.
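To make this concrete with the earlier numbers: ##(1,1) \notin span((1,0))## and ##(1,1) \notin span((0,1))##, so ##(1,1)## lies on neither of the two lines; but ##(1,1) = 1(1,0) + 1(0,1) \in span((1,0), (0,1))##, so it does lie in the plane, i.e. it is linearly dependent on the two basis vectors taken together.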
 
  • #41
PeroK said:
That's clear. There are only two vectors in the basis!

Yes.

Here's some better terminology.

Let's take any two linearly independent vectors ##\vec a, \vec b##.

The span of ##\vec a## is a line, which is all the scalar multiples of ##\vec a##:
$$span(\vec a) = \{\lambda \vec a: \lambda \in \mathbb R \}$$This is all the vectors that are linearly dependent with ##\vec a##.

The span of ##\vec a, \vec b## is a plane:
$$span(\vec a, \vec b) = \{\lambda \vec a + \mu \vec b: \lambda, \mu \in \mathbb R \}$$The critical thing is that$$span(\vec a, \vec b) \ne span(\vec a) \cup span(\vec b)$$The left hand side is a plane and the right hand side is just two lines (in that plane).

In general, geometrically, you are confusing lines (defined by the individual vectors) with the plane defined by the two vectors taken together.

The geometric view and the algebraic view go together and say the same thing.
Yes, I understand. I just wanted to be sure that if we consider a basis (or a complete set of vectors) of a vector space, then any vector outside the basis (or the complete set) must be linearly dependent on the basis as a whole, but can be linearly independent of the individual vectors within it. In other words, I can take a vector outside the basis and express it as a linear combination of the basis vectors, yet this vector can still be linearly independent of each individual basis vector, in the sense that ##\vec{v_1} \neq \lambda \vec{v_2}##, where ##\vec{v_1}## is outside the basis and ##\vec{v_2}## is a basis vector. I think this is correct and trivial.
 
  • #42
Salmone said:
I think this is correct and trivial.
Yes it is!
 
  • #43
There are several algorithms for determining linear independence.
  1. Two vectors [itex]\vec{a}[/itex] and [itex]\vec{b}[/itex] are linearly dependent if and only if their wedge product (the cross product in [itex]\mathbb{R}^3[/itex]) is 0.
  2. Any set of n+1 vectors in an n-dimensional space is linearly dependent.
  3. Write the coordinates of the vectors as the rows of a matrix and use Gaussian elimination. If one of the resulting rows becomes the zero vector, the vectors are linearly dependent (a sketch of this is below).
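A minimal sketch of item 3 in plain Python (the helper name row_reduce and the tolerance are illustrative choices, and the example reuses the triple of vectors from earlier in the thread):

[CODE=python]
# Sketch: check linear dependence by Gaussian elimination (row reduction).
# The rows of the matrix are the vectors; if any row reduces to (numerically)
# zero, the vectors are linearly dependent.

def row_reduce(rows, tol=1e-12):
    """Row-reduce copies of the given row vectors and return their rank."""
    rows = [list(map(float, r)) for r in rows]
    n_rows, n_cols = len(rows), len(rows[0])
    rank = 0
    for col in range(n_cols):
        if rank == n_rows:
            break
        # Choose the row (at or below 'rank') with the largest entry in this column.
        pivot = max(range(rank, n_rows), key=lambda r: abs(rows[r][col]))
        if abs(rows[pivot][col]) < tol:
            continue  # no usable pivot in this column
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        # Eliminate this column from all rows below the pivot row.
        for r in range(rank + 1, n_rows):
            factor = rows[r][col] / rows[rank][col]
            rows[r] = [a - factor * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank

vectors = [(1, 2, 3), (2, 1, 5), (5, 4, 13)]   # the example from post #36
rank = row_reduce(vectors)
print("linearly independent" if rank == len(vectors) else "linearly dependent")
# Prints "linearly dependent": one row reduces to the zero vector (rank 2 < 3).
[/CODE]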
 
  • #44
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.
I believe that "orthogonal" is synonymous with "perpendicular" in the sense that the OP meant.
Salmone said:
[Attachment 301867: Salmone's sketch of three vectors]
It seems to me that three vectors like these are linearly independent
Here's a picture of what I had in mind. I don't show the vectors extending out from the origin, but they can be moved around so that they are.

Here a and b are linearly independent, and a and c are linearly independent, but b and c are parallel, so those two vectors are linearly dependent.
[Figure: vectors a, b, and c, with b and c parallel]
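A concrete numerical instance of this configuration (my own numbers, not taken from the figure): ##\vec a = (1,0)##, ##\vec b = (0,1)##, ##\vec c = (0,2)##. Then ##\vec a, \vec b## are linearly independent and ##\vec a, \vec c## are linearly independent, but ##\vec c = 2\vec b##, so ##\vec b## and ##\vec c## are linearly dependent.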
 
  • #45
Mark44 said:
PeroK said:
There's no such thing as "orthogonal" until you define an inner product on your vector space.
I believe that "orthogonal" is synonymous with "perpendicular" in the sense that the OP meant.

Note that "linear independence" is a property of the vector space alone and
is independent of (i.e. does not depend on) the inner product.
(As @PeroK says, the inner product is extra structure on top of a vector space that must be specified if one wants an inner product. Such structure does not automatically accompany a vector space.)

Svein said:
There are several algorithms for determining linear independence.
3. Write the coordinates of the vectors as the rows of a matrix and use Gaussian elimination. If one of the resulting rows becomes the zero vector, the vectors are linearly dependent.

The determinant is a useful tool for determining linear dependence.
It is useful to think of the determinant as the [signed] volume of the [hyper-]parallelepiped spanned by the vectors.
(The determinant does not rely on an inner product.)
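A small illustration of that last point (an illustrative sketch; det3 is a made-up helper, and the first example reuses the triple from earlier in the thread): the determinant is computed from the vector components alone, with no inner product in sight.

[CODE=python]
# Sketch: a 3x3 determinant via cofactor expansion, using only the
# components of the three vectors (no inner product involved).
# det = 0 exactly when the vectors are linearly dependent
# (the parallelepiped they span has zero volume).

def det3(u, v, w):
    """Determinant of the 3x3 matrix whose rows are u, v, w."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

print(det3((1, 2, 3), (2, 1, 5), (5, 4, 13)))  # 0 -> linearly dependent
print(det3((1, 0, 0), (0, 1, 0), (0, 0, 1)))   # 1 -> linearly independent
[/CODE]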

Svein said:
Any set of n+1 vectors in an n-dimensional space is linearly dependent.
This is a useful geometric interpretation that does not require calculating the determinant.
 
  • #46
In general, for finite-dimensional vector spaces, there exists a basis. What a basis is:

1) A set S of linearly independent vectors.

2) S spans (generates) the vector space.

What we mean by the span of S is the set of all vectors that can be written as linear combinations of the vectors in S.

In the case where Span{S} = V, every vector in V can be written as a linear combination of the elements of S.

Moreover, given a finite-dimensional vector space V, one can prove that any linearly independent set of vectors (also called a list) can be extended, by adding linearly independent vectors not found in the original list, so that the new list becomes a basis of V, i.e., the extended list spans V.

We can also prove that any list of vectors which spans V (in a finite-dimensional vector space) can be reduced to a linearly independent list which still spans V, and which therefore forms a basis of V.

These theorems are sometimes called the plus and minus theorems, respectively.

Now, we can also prove that the length of a linearly independent list is always less than or equal to the length of any list which spans V (here V is finite-dimensional).

This can be used in turn to prove that every basis of a finite-dimensional space has the same length, no matter which basis of V we choose.

To answer your question: as stated, it is an ill-posed question. What is the vector space we are working in? The answer can be yes or no depending on the dimension of V.

In general, the easiest, but often the longest, way of showing linear independence is to use the definition of linear independence posted a few comments ago (by Castillo?).

This requires solving a system of equations using whatever method you prefer (ordinary algebra, matrices, the determinant).
What is cool about Gaussian elimination is that we can also use it to show whether a list of vectors spans V (a sketch of this is below).
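A minimal sketch of that, assuming numpy is available (the helper name spans_Rn and the example lists are illustrative): a list of vectors spans ##\mathbb R^n## exactly when row reduction produces n pivots, i.e. when the rank of the matrix whose rows are those vectors equals n.

[CODE=python]
# Sketch: a list of vectors spans R^n exactly when the rank of the matrix
# whose rows are those vectors equals n (using numpy's rank routine).
import numpy as np

def spans_Rn(vectors, n):
    """True if the given vectors span R^n."""
    return np.linalg.matrix_rank(np.array(vectors, dtype=float)) == n

print(spans_Rn([(1, 0), (0, 1), (1, 1)], 2))            # True:  they span R^2
print(spans_Rn([(1, 2, 3), (2, 1, 5), (5, 4, 13)], 3))  # False: rank is 2, not 3
[/CODE]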

Just from reading the thread, it appears that your issue comes from not knowing the definition of linear independence/dependence, something that can easily be remedied by reading an introductory textbook on linear algebra.

My favorite intro LA book is the one written by Anton: Elementary Linear Algebra.
Another book, which is much simpler than Anton and maybe a bit more intuitive (it sticks to ##\mathbb R^3##?), is the one by Paul Shields. A basic book, but it's well written...

Both can be found used very cheaply...
 
