Linear Independence/Dependence

In summary, the conversation discusses the concepts of linear independence and linear dependence as they apply to vectors, matrices, and systems of linear equations. The main points are that linear independence means there is no way to express any of the elements as a linear combination of the others, while linear dependence means that such a combination exists. The conversation also touches on the idea of linear independence in the context of finite-dimensional vector spaces, as well as the difference between consistent and inconsistent systems of equations.
  • #1
Gipson
Greetings.

I am new to the forums. I will try and keep this short.

Linear independence vs. linear dependence: it's easier to understand from a vector perspective, but when it's brought back to a system of equations, I get twisted up in the semantics.

(1) Consistent: linearly independent vs. linearly dependent. If a system is consistent, it has one or more solutions: linearly independent (exactly one solution), linearly dependent (infinitely many solutions). Linearly independent also means no equation can be expressed as a linear combination of the others; if the equations are linearly dependent, at least one can.

(2) Inconsistent: linearly independent vs. linearly dependent. Since inconsistent means no solutions, you can't label such a system of equations as linearly independent or linearly dependent, since both labels imply a solution exists.
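
To make (1) and (2) concrete, here is a minimal sketch (not from the original post; it assumes numpy and the standard rank criterion for Ax = b):

[code]
# Minimal sketch (not from the thread): classify a linear system Ax = b
# by comparing the rank of A with the rank of the augmented matrix.
import numpy as np

def classify(A, b):
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(np.hstack([A, b]))
    if rank_A < rank_aug:
        return "inconsistent (no solutions)"
    if rank_A == A.shape[1]:
        return "consistent, exactly one solution"
    return "consistent, infinitely many solutions"

print(classify([[1, 1], [1, -1]], [2, 0]))    # exactly one solution
print(classify([[3, 2], [12, 8]], [6, 24]))   # infinitely many solutions
print(classify([[3, 2], [12, 8]], [6, 25]))   # no solutions
[/code]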


I thought I had all this straight until I read a paper here. This gentleman quotes James & James, Mathematics Dictionary, as his source and doesn't seem to agree with (2).

If anyone can untangle me, I would appreciate it.

Anthony
 
  • #2
Yes, you have the right ideas about linear independence and linear dependence. A better way to put the second part: the only time a linear system is inconsistent is when the column vector b becomes a pivot column after you have row-reduced the augmented matrix. In other words, when you get a row that looks like this: (0, 0, ..., 0 | b) with b ≠ 0. In that case the linear system is neither linearly independent nor linearly dependent. I looked at the paper, and how he words his theorem is correct but confusing.
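
A minimal sketch of that pivot test (not part of the original reply; it assumes sympy):

[code]
# Minimal sketch (assuming sympy): the system is inconsistent exactly when
# the last column of the row-reduced augmented matrix is a pivot column,
# i.e. some row reduces to (0, 0, ..., 0 | b) with b nonzero.
from sympy import Matrix

aug = Matrix([[3, 2, 6],
              [12, 8, 25]])      # 3x + 2y = 6 and 12x + 8y = 25
reduced, pivot_cols = aug.rref()
print(reduced)                                        # Matrix([[1, 2/3, 0], [0, 0, 1]])
print("inconsistent:", aug.cols - 1 in pivot_cols)    # True
[/code]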
 
  • #3
Linear independence/dependence is a property of vectors and matrices, not of equations.

BiP
 
  • #4
Why do you think equations, or even just two numbers, cannot be linearly dependent?

Formally two sets of elements are linearly dependent if there exists a linear combination of these elements equal to zero, where the coefficients of the combination are not all themselves zero.

Two sets are linearly independent if they are not dependent.

For example, the sets {1, π} and {π, π²} are linearly dependent if we can draw coefficients from R, but not (i.e., they are linearly independent) if the coefficients have to come from Q.
 
  • #5
Studiot said:
For example, the sets {1, π} and {π, π²} are linearly dependent if we can draw coefficients from R, but not (i.e., they are linearly independent) if the coefficients have to come from Q.

Those two sets are linearly dependent since one element (π) of the first set is equal to a linear combination (1·π + 0·π²) of the elements of the second set. And the coefficients 0 and 1 are members of any field, not just of R.
 
  • #6
What coefficient in Q can you multiply the first set by to obtain the second?

At least we are agreed that the sets are R-linearly dependent.
 
  • #7
Studiot said:
Formally two sets of elements are linearly dependent if there exists a linear combination of these elements equal to zero, where the coefficients of the combination are not all themselves zero.

Two sets are linearly independent if they are not dependent.

For example, the sets {1, π} and {π, π²} are linearly dependent if we can draw coefficients from R, but not (i.e., they are linearly independent) if the coefficients have to come from Q.
I'm not sure what you're doing here. It looks like you have misinterpreted some definition.

This is the definition of "linearly independent" in the context of finite-dimensional vector spaces:

Let V be a finite-dimensional vector space over a field ##\mathbb F##. A set ##\{x_1,\dots,x_n\}\subset V## is said to be linearly independent, if for all ##(a_1,\dots,a_n)\in\mathbb F^n## such that
$$a_1x_1+\dots+a_nx_n=0,$$ we have ##a_1=\cdots=a_n=0##.

Note that it's always one set that's linearly independent or linearly dependent, not two.
 
  • #8
Hello Fredrik,

You might like to review your penultimate line?

Where is it decreed that linear dependence must be viewed in terms of finite-dimensional vector spaces?

Yes, strictly speaking, the elements to which the adjective 'linearly dependent' applies may be gathered together into one set.

Provided that there is more than one element!

However the elements of the set may themselves be sets, which is the case in my example.

The elements may also be equations, which was disputed by Bipolarity and is the subject of the original question and the reason for my first post in this thread.
 
  • #9
The elements may also be equations

The elements could certainly be functions, but it would be very unusual to have a set of equations as a vector space (as far as I know).
 
  • #10
Hi Studiot. I don't see anything wrong with the penultimate line (unless you just meant that it would have been clearer if I had included at least one more of the ##a## variables, or said it like this instead: "...we have ##a_k=0## for all integers k such that ##1\leq k\leq n##". I thought it was obvious enough that this was what I meant).

Maybe an example will make things clearer. Consider the vector space (over ℝ) of polynomials of degree 2 or less. Let f, g be defined by ##f(x)=x## for all ##x\in\mathbb R## and ##g(x)=x^2## for all ##x\in\mathbb R##. The set {f,g} should be linearly independent, right? This is what we have to prove to verify that it is:

For all ##(a,b)\in\mathbb R^2## such that ##af+bg=0##, we have ##a=b=0##.

Linear independence of a system of linear equations can be viewed as a special case of the kind of linear independence I defined. Consider e.g. the equations
\begin{align}
a_{11}x_1+a_{12}x_2+a_{13}x_3 &=y_1\\
a_{21}x_1+a_{22}x_2+a_{23}x_3 &=y_2.
\end{align} This system is just another way of writing the matrix equation
$$\begin{pmatrix}a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\end{pmatrix}\begin{pmatrix}x_1 \\ x_2 \\ x_3\end{pmatrix} =\begin{pmatrix}y_1 \\ y_2\end{pmatrix}.$$ The set of linear equations is said to be linearly independent if the set
$$\big\{\begin{pmatrix}a_{11} & a_{12} & a_{13}\end{pmatrix},\begin{pmatrix}a_{21} & a_{22} & a_{23}\end{pmatrix}\big\}$$ (where these 1×3 matrices are viewed as members of the vector space ℝ3) is linearly independent.
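
For concreteness, a minimal sketch of this criterion (with made-up coefficients, assuming sympy):

[code]
# Minimal sketch (made-up coefficients, assuming sympy): the rows of the
# coefficient matrix, viewed as members of R^3, are linearly independent
# iff the rank equals the number of rows.
from sympy import Matrix

rows = Matrix([[1, 2, 3],
               [2, 4, 6]])       # second row = 2 * first row
print(rows.rank() == rows.rows)  # False: this row set is linearly dependent
[/code]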

Edit: I won't be making any more posts tonight, so if anyone has questions about this, I hope someone else will answer them.
 
  • #11
A linear equation defines a hyperplane of Euclidean space.

A system of linear equations defines a set of hyperplanes. Their solution set is the set of points in their common intersection.

If these planes have no common point of intersection then the system is said to be inconsistent.

If one of the planes does not alter the intersection set determined by the others, then it is said to be dependent.
 
  • #12
Studiot said:
Formally two sets of elements are linearly dependent if there exists a linear combination of these elements equal to zero, where the coefficients of the combination are not all themselves zero.

I'll have to admit that I have never heard of this definition before. (I mean linear dependence of sets rather than of vectors.) Do you have any reference to a book that does this? I would be very interested in reading about it.
 
  • #13
Hello Fredrik,

Firstly, I read your definition a little too quickly, and I see that your penultimate line is in fact consistent with your definition.

My comment was motivated by the fact that I am used to an alternative definition scheme, which has certain advantages.

That is, 'linear dependence' is the notion that gets defined.

The advantage of this is that you can then declare anything that does not meet this definition to be linearly independent.

You have defined linear independence, but it does not follow that anything that does not meet your definition is linearly dependent.
 
  • #14
I'll have to admit that I have never heard of this definition before. (I mean linear dependence of sets rather than of vectors.) Do you have any reference to a book that does this? I would be very interested in reading about it.

My example of {1, π} and {π, π²} can be established as vectors, if you have a vector space.

By simply employing them as sets I do not need to do this.

The definition comes from the Reference Dictionary of Mathematics by Borowski and Borwein.

I have found it very subtle in some instances in the past.
 
  • #15
Studiot said:
My example of {1, π} and {π, π²} can be established as vectors, if you have a vector space.

Which vector space??
 
  • #16
Studiot said:
My example of {1, π} and {π, π²} can be established as vectors, if you have a vector space.

By simply employing them as sets I do not need to do this.

The definition comes from the Reference Dictionary of Mathematics by Borowski and Borwein.

I have found it very subtle in some instances in the past.

I checked your book and there wasn't a mention of linear dependence of sets. They did give the [itex](1,\pi), ~ (\pi,\pi^2)[/itex] example, but it was clear that they meant these as couples in [itex]\mathbb{R}^2[/itex] and not as sets of two elements.
 
  • #17
Vectors of the form {a,b} with a, b contained in R.

(I wish I could find the appropriate symbols more easily)
 
  • #18
Studiot said:
Vectors of the form {a,b} with a, b contained in R.

(I wish I could find the appropriate symbols more easily)

You just mean the couple (a,b)? That's very different from {a,b}.

No book I've ever seen talks about sets {a,b} and {c,d} being linearly dependent...
 
  • #19
When you check such a book, sometimes, as in this case, you need to check more than one definition.

You will find the mention of sets in the cross-referenced definition of 'linear combination'.
You have to follow the chain of definitions through.

However the entry did reference sets in another way as well.

linearly dependent. adjective. such that there is a linear combination (note the cross-reference) of the given elements, with not all the coefficients equal to zero, that equals zero.

Now, sets have elements; do vectors?
 
  • #20
Studiot said:
When you check such a book sometimes, as in this case, you need to check more than one definition.

You will find the mention of sets in the cross-referenced definition of 'linear combination'.
You have to follow the chain of definitions through.

OK, so here are the two relevant definitions:

linearly dependent, adj. such that there is a LINEAR COMBINATION of the given elements, with not all coefficients zero, that equals zero. For example, u, v and w are linearly dependent vectors if there exist scalars a, b and c, not all zero, such that
[tex]au + bv + cw = 0. [/tex]
Elements are said to be K-linearly dependent if there is a set of such constants that are elements of some given K; for example, the vectors [itex](1,\pi)[/itex] and [itex](\pi, \pi^2)[/itex] are [itex]\mathbb{R}[/itex]-linearly dependent, but not [itex]\mathbb{Q}[/itex]-linearly dependent (where [itex]\mathbb{R}[/itex] is the set of real numbers and [itex]\mathbb{Q}[/itex] is the set of rationals), since one of the required coefficients is a multiple of the irrational number [itex]\pi[/itex]. See also BASIS.

linear combination, n. a sum of the respective products of the elements of some set with constant coefficients. (It is sometimes required that not all the constants are zero.) For example, a linear combination of vectors u, v and w is any sum of the form
[tex]au + bv + cw, [/tex]
where a, b, and c are scalars.

OK, I'm still not seeing anything about sets [itex]\{1,\pi\}[/itex] and [itex]\{\pi,\pi^2\}[/itex] being linearly independent... And I'm also not reading anything about equations being linearly independent.

Now, sets have elements; do vectors?

No, vectors do not have elements in general.
 
  • #21
So let us start with the definition given of linear combination.

It specifies a method of combining elements of a set.

It does not prohibit those elements being sets, numbers or other elements capable of forming combinations according to the rules specified.

It does allow those elements to be vectors and uses this case as an example.

But it does not restrict those elements to being vectors. I am quite sure that had the authors wanted this restriction, they would have said so, as for instance they have done in their definition of a linear mapping.

Since it specifies the word elements in the plural, it implies that there is more than one, which tallies with one of my earlier comments.

I am equally sure that I have seen many (mostly engineering, it is true) books that talk of linearly dependent equations, along the lines, for instance, of:

3x+2y=6 and 12x+8y=24 are linearly dependent since there exists a coefficient (4) that you can multiply equation 1 by to obtain equation 2.

The above example is very obvious, but some are not, and I was going to develop this to help the OP when these other issues arose.
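
A quick numeric check of that claim (a sketch assuming numpy, not from the original post):

[code]
# Quick check (assuming numpy): each triple encodes (a, b, c) for ax + by = c.
import numpy as np

eq1 = np.array([3, 2, 6])      # 3x + 2y = 6
eq2 = np.array([12, 8, 24])    # 12x + 8y = 24
print(np.array_equal(4 * eq1, eq2))   # True: equation 2 is 4 times equation 1
[/code]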

Please also look at the second half of my post #13, where I note that the definition scheme is the complement of the one offered by Fredrik, and also give my reasons for preferring it.
 
  • #22
Studiot said:
So let us start with the definition given of linear combination.

It specifies a method of combining elements of a set.

It does not prohibit those elements being sets, numbers or other elements capable of forming combinations according to the rules specified.

It does allow those elements to be vectors and uses this case as an example.

OK. But in order for the definition to even make sense, we need to have a notion of addition and multiplication. So we need to make sense of [itex]\mathbf{u}+\mathbf{v}[/itex] and [itex]a\mathbf{u}[/itex], where a is a scalar. Furthermore, you need a notion of equality and of a zero.

So, for sets and equations, how do you define these notions?
 
  • #23
To keep it simple, a linear combination is just the sum of the columns of the matrix A weighted by the corresponding entries of the column vector x. In other words, it's just Ax.
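
A minimal sketch of that identity (made-up values, assuming numpy):

[code]
# Minimal sketch (made-up values, assuming numpy): Ax equals the sum of
# A's columns weighted by the entries of x.
import numpy as np

A = np.array([[1, 4],
              [2, 5],
              [3, 6]])
x = np.array([10, -1])

print(np.array_equal(A @ x, x[0] * A[:, 0] + x[1] * A[:, 1]))   # True
[/code]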
 
  • #24
Studiot, I'm quite sure that equations/planes/spaces cannot be added and subtracted. Only vectors and matrices can, and hence only vectors and matrices are expressible as linear combinations of one another. My professor has drilled this into my head quite thoroughly, and I find the concept quite convenient from a rigorous point of view.

BiP
 
  • #25
Bipolarity said:
Studiot, I'm quite sure that equations/planes/spaces cannot be added and subtracted. Only vectors and matrices can. My professor has drilled this into my head quite thoroughly, and I find the concept quite convenient from a rigorous point of view.

BiP

Well, you certainly can add sets by [itex]A+B=\{a+b~\vert~a\in A,~b\in B\}[/itex]. That's even going to be a vector space if A and B are. But this is merely a convenient notation. I have not yet seen sets being linearly independent of each other or something of the kind.

To be honest, I am quite interested in the "sets being linearly independent"-idea. Too bad Studiot doesn't have any references except an entry in a math encyclopedia. So if anybody has some actual references, I would be very happy to read about it!
 
  • #26
OK. But in order for the definition to even make sense, we need to have a notion of addition and multiplication. So we need to make sense of u+v and au, where a is a scalar. Furthermore, you need a notion of equality and of a zero.

Agreed, but I find the procedure I was taught when I was 12 or 13 perfectly adequate and still in perfect accord with both the Fredrik definition and the Borowski definition.

Going back to my post #21, taking equations 1 and 2 and adding a third, 3x + y = 18, I can:

Form the linear combinations:

4(equation 1) + (−1)(equation 2): this is seen to equal 0, so the equations are linearly dependent and no unique solution is possible.

1(equation 1) + (−1)(equation 3): this is seen to be ≠ 0, and therefore the equations can be solved for x and y.

Nothing in the Borowski definition implies we need the entire space of expressions of the form gx + hy = k to be able to perform these manipulations, although Fredrik's definition does seem to suggest this. I won't deny that such a space is useful.
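
Those combinations can be checked symbolically; a minimal sketch assuming sympy:

[code]
# Minimal sketch (assuming sympy): each equation is written as
# "left side minus right side", so a zero combination means dependence.
from sympy import symbols, simplify

x, y = symbols('x y')
eq1 = 3*x + 2*y - 6       # 3x + 2y = 6
eq2 = 12*x + 8*y - 24     # 12x + 8y = 24
eq3 = 3*x + y - 18        # 3x + y = 18

print(simplify(4*eq1 + (-1)*eq2))   # 0: equations 1 and 2 are dependent
print(simplify(1*eq1 + (-1)*eq3))   # y + 12, nonzero: 1 and 3 are independent
[/code]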
 
  • #27
Studiot, I'm quite sure that equations/planes/spaces cannot be added and subtracted. Only vectors and matrices can, and hence only vectors and matrices are expressible as linear combinations of one another. My professor has drilled this into my head quite thoroughly, and I find the concept quite convenient from a rigorous point of view.

I wonder if, perhaps, there was a particular context in which this was said by your professor?

halo31 has put it pretty succinctly, and the equations I presented can be expressed in that format, but the expression 3x + 2y is neither a matrix nor a vector.

Of course you can add equations. The technique is extremely widely used in Physics and Engineering and is of great importance; it is called superposition. Further, if those equations represent some region, then that is equivalent to adding those regions, but not all equations represent regions, and not all regions have single equations.
 
  • #28
I have not yet seen sets being linearly independent of each other or something of the kind.

Oddly enough, I do recall a minor problem in a textbook that described linear dependence and independence as a property of sets. I don't remember much, but here were some basic properties: if there exist two linearly independent sets, then their intersection must be linearly independent; the complement of a linearly independent set is linearly dependent.

I wish I could find this book again and see if it relates...
 
  • #29
I wish I could find this book again and see if it relates

Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
Hoffman and Kunze p40?

Too bad Studiot doesn't have any references except an entry in a math encyclopedia. So if anybody has some actual references, I would be very happy to read about it!

Don't know how to take these and other remarks.

My post was written first, and I used Borowski to check my statements against.
 
  • #30
MarneMath said:
Oddly enough, I do recall a minor problem in a textbook that described linear dependence and independence as a property of sets. I don't remember much, but here were some basic properties: if there exist two linearly independent sets, then their intersection must be linearly independent; the complement of a linearly independent set is linearly dependent.

I wish I could find this book again and see if it relates...

When we say that x and y are linearly independent, we really mean that the set {x,y} is linearly independent. The thing that's linearly independent or linearly dependent is always a subset of a vector space. If we say that {x,y} and {u,v} are linearly independent, it just means that {x,y} is linearly independent and {u,v} is linearly independent. There is no notion of sets being linearly independent of each other that I know of.

It's fairly obvious that the intersection of two linearly independent sets is either empty or linearly independent, since any subset of a linearly independent set is linearly independent. Maybe that's the result you had in mind.
 
  • #31
Studiot said:
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
I looked at Kreyszig (Introductory functional analysis with applications). His definition is the same as mine. It tells us what it means for a subset of a vector space to be linearly independent. It doesn't tell us what it means for one subset to be linearly independent of another.
 
  • #32
When we say that x and y are linearly independent, we really mean that the set {x,y} is linearly independent. The thing that's linearly independent or linearly dependent is always a subset of a vector space. If we say that {x,y} and {u,v} are linearly independent, it just means that {x,y} is linearly independent and {u,v} is linearly independent. There is no notion of sets being linearly independent of each other that I know of.

Yes, but consider

The set of all values of p, q for which 3p+2q=6

and the set of all values for which 12p+8q=24

Put these into your x,y format and you can see that you have two sets which are linearly dependent, since they are essentially the same set.
 
  • #33
Do you mean that I should write these two lines as ##K=\{(p,q)\in\mathbb R^2|3p+2q=6\}## and ##L=\{(p,q)\in\mathbb R^2|12p+8q=24\}##? I wouldn't say that K and L are linearly dependent. I would just say that they're equal.
 
  • #34
Studiot said:
Agreed, but I find the proceedure I was taught when I was 12 or 13 perfectly adequate and still in perfect accord with both the Fredrik definition and the Borowski definition.

So why did you ignore my question? It was a standard question: how do you define addition, multiplication and equality of sets? And how did you define the "zero" set?

Studiot said:
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
Hoffman and Kunze p40?

I searched for Griffel, but I couldn't find it. As for Kreyszig and Gupta, they have multiple books, so I don't know which one you mean.

As for Nering and Hoffman & Kunze:

Nering said:
A set of vectors is said to be linearly dependent if there exists a non-trivial linear relation among them. Otherwise, the set is said to be linearly independent.

Hoffman and Kunze said:
Definition. Let V be a vector space over F. A subset S of V is said to be linearly dependent (or simply, dependent) if there exist distinct vectors [itex]\alpha_1,\alpha_2,...,\alpha_n[/itex] in S and scalars [itex]c_1, c_2,...,c_n[/itex] in F, not all of which are 0, such that
[tex]c_1\alpha_1+c_2\alpha_2 + ... + c_n\alpha_n=0[/tex]
A set which is not linearly dependent is called linearly independent. If the set S contains only finitely many vectors [itex]\alpha_1,...,\alpha_n[/itex], we sometimes say that [itex]\alpha_1,...,\alpha_n[/itex] are dependent (or independent) instead of saying S is dependent (or independent).


So the notion defined here is the linear independence of a set. I do not see a definition here of the linear independence of two sets or the linear independence of equations. These definitions are perfectly compatible with what Fredrik has said.

So none of these books actually agree with what you are saying. No offense, but I am starting to think that you are just misunderstanding the entire concept.

Don't know how to take these and other remarks.

Take it how you want. I meant what I said: I am interested in finding out more of this "linear dependence of sets", but I have yet to find a reference about it.
 
  • #35
Studiot said:
Yes, but consider

The set of all values of p, q for which 3p+2q=6

and the set of all values for which 12p+8q=24

Put these into your x,y format and you can see that you have two sets which are linearly dependent, since they are essentially the same set.

Would you please actually define when two sets are linearly dependent, and would you please actually give a reference (or quote from your reference)?
 
