
Linear Independence/Dependence

by Gipson
Studiot
#19
Dec2-12, 05:51 PM
P: 5,462
Sometimes when you check such a book, as in this case, you need to check more than one definition.

You will find the mention of sets in the cross-referenced definition of 'linear combination'.
You have to follow the chain of definitions through.

However the entry did reference sets in another way as well.

linearly dependent. adjective. such that there is a linear combination (note cross reference) of the given elements with not all the coefficients equal to zero.
Now, sets have elements; do vectors?
micromass
#20
Dec2-12, 06:02 PM
Mentor
micromass's Avatar
P: 18,019
Quote Quote by Studiot View Post
Sometimes when you check such a book, as in this case, you need to check more than one definition.

You will find the mention of sets in the cross-referenced definition of 'linear combination'.
You have to follow the chain of definitions through.
OK, so here are the two relevant definitions:

linearly dependent, adj. such that there is a LINEAR COMBINATION of the given elements, with not all coefficients zero, that equals zero. For example, u, v and w are linearly dependent vectors if there exist scalars a, b and c, not all zero, such that
[tex]au + bv + cw = 0. [/tex]
Elements are said to be K-linearly dependent if there is a set of such constants that are elements of some given K; for example, the vectors [itex](1,\pi)[/itex] and [itex](\pi, \pi^2)[/itex] are [itex]\mathbb{R}[/itex]-linearly dependent, but not [itex]\mathbb{Q}[/itex]-linearly dependent (where [itex]\mathbb{R}[/itex] is the set of real numbers and [itex]\mathbb{Q}[/itex] is the set of rationals), since one of the required coefficients is a multiple of the irrational number [itex]\pi[/itex]. See also BASIS.

linear combination, n. a sum of the respective products of the elements of some set with constant coefficients. (It is sometimes required that not all the constants are zero.) For example, a linear combination of vectors u, v and w is any sum of the form
[tex]au + bv + cw, [/tex]
where a, b, and c are scalars.
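As a quick check of the dictionary's [itex](1,\pi)[/itex], [itex](\pi,\pi^2)[/itex] example, the asserted [itex]\mathbb{R}[/itex]-linear dependence can be written out explicitly:
[tex]\pi\cdot(1,\pi) + (-1)\cdot(\pi,\pi^2) = (0,0),[/tex]
and since any relation [itex]a(1,\pi)+b(\pi,\pi^2)=(0,0)[/itex] forces [itex]a=-b\pi[/itex], there is no such relation with rational coefficients that are not both zero.
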
OK, I'm still not seeing anything about sets [itex]\{1,\pi\}[/itex] and [itex]\{\pi,\pi^2\}[/itex] being linearly independent... And I'm also not reading anything about equations being linearly independent.

Now, sets have elements; do vectors?
No, vectors do not have elements in general.
Studiot
#21
Dec2-12, 06:19 PM
P: 5,462
So let us start with the definition given of linear combination.

It specifies a method of combining elements of a set.

It does not prohibit those elements from being sets, numbers, or other objects capable of forming combinations according to the rules specified.

It does allow those elements to be vectors and uses this case as an example.

But it does not restrict those elements to being vectors. I am quite sure that had the authors wanted this restriction they would have said so, as they have done, for instance, in their definition of a linear mapping.

Since it uses the word 'elements' in the plural, it implies that there is more than one, which tallies with one of my earlier comments.

I am equally sure that I have seen many books (mostly engineering ones, it is true) that talk of linearly dependent equations, along the lines of, for instance:

3x+2y=6 and 12x+8y=24 are linearly dependent since there exists a coefficient (4) that you can multiply equation 1 by to obtain equation 2.

The above example is very obvious, but some are not, and I was going to develop this to help the OP when these other issues arose.

Please also look at the second half of my post#13, where I note the definition scheme is the complement of that offered by Fredrik and also give my reasons for preferring it.
micromass
#22
Dec2-12, 06:30 PM
Mentor
micromass's Avatar
P: 18,019
Quote Quote by Studiot View Post
So let us start with the definition given of linear combination.

It specifies a method of combining elements of a set.

It does not prohibit those elements from being sets, numbers, or other objects capable of forming combinations according to the rules specified.

It does allow those elements to be vectors and uses this case as an example.
OK. But in order for the definition to even make sense, we need to have a notion of addition and multiplication. So we need to make sense of [itex]\mathbf{u}+\mathbf{v}[/itex] and [itex]a\mathbf{u}[/itex], where a is a scalar. Furthermore, you need a notion of equality and of a zero.

So, for sets and equations, how do you define these notions?
halo31
#23
Dec2-12, 07:22 PM
P: 51
To keep it simple, a linear combination is just the sum of the columns of matrix A weighted by the corresponding entries of the column vector x. In other words, it's just Ax.
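Spelled out, this is the standard "column picture" of matrix-vector multiplication: if the columns of A are [itex]\mathbf{a}_1,\dots,\mathbf{a}_n[/itex] and [itex]\mathbf{x}=(x_1,\dots,x_n)^T[/itex], then
[tex]A\mathbf{x} = x_1\mathbf{a}_1 + x_2\mathbf{a}_2 + \cdots + x_n\mathbf{a}_n,[/tex]
a linear combination of the columns with the entries of x as the coefficients.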
Bipolarity
#24
Dec3-12, 12:38 AM
P: 783
Studiot, I'm quite sure that equations/planes/spaces cannot be added and subtracted. Only vectors and matrices can, and hence only vectors and matrices are expressible as linear combinations of one another. My professor has drilled this into my head quite thoroughly, and I find the concept quite convenient from a rigorous point of view.

BiP
micromass
#25
Dec3-12, 12:55 AM
Mentor
micromass's Avatar
P: 18,019
Quote Quote by Bipolarity View Post
Studiot, I'm quite sure that equations/planes/spaces cannot be added and subtracted. Only vectors and matrices can. My professor has drilled this into my head quite thoroughly, and I find the concept quite convenient from a rigorous point of view.

BiP
Well, you certainly can add sets by [itex]A+B=\{a+b~\vert~a\in A,~b\in B\}[/itex]. That's even going to be a vector space if A and B are. But this is merely a convenient notation. I have not yet seen sets being linearly independent of each other or something of the kind.
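For finite sets of numbers, this set addition is easy to sketch in Python (the helper name set_sum is mine, purely for illustration):
[code]
def set_sum(A, B):
    # A + B = {a + b : a in A, b in B}
    return {a + b for a in A for b in B}

print(set_sum({1, 2}, {10, 20}))  # {11, 12, 21, 22} (printed order may vary)
[/code]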

To be honest, I am quite interested in the "sets being linearly independent" idea. Too bad Studiot doesn't have any references except an entry in a math encyclopedia. So if anybody has some actual references, I would be very happy to read about it!!
Studiot
#26
Dec3-12, 03:28 AM
P: 5,462
OK. But in order for the definition to even make sense, we need to have a notion of addition and multiplication. So we need to make sense of u+v and au, where a is a scalar. Furthermore, you need a notion of equality and of a zero.
Agreed, but I find the procedure I was taught when I was 12 or 13 perfectly adequate, and still in perfect accord with both the Fredrik definition and the Borowski definition.

Going back to my earlier post, taking equations 1 and 2 and adding a third, 3x+y=18, I can:

Form the linear combinations:

4(equation 1) + (-1)(equation 2): this is seen to equal 0, so the equations are linearly dependent and no unique solution is possible.

1(equation 1) + (-1)(equation 3): this is seen to be ≠ 0, and therefore the equations can be solved for x and y.
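(A minimal NumPy check of these two cases, for the record; the array names below are mine:)
[code]
import numpy as np

# Equations 1 and 2: 3x + 2y = 6 and 12x + 8y = 24.
# The rows of coefficients are proportional, so the matrix has rank 1
# and x, y are not determined uniquely.
A12 = np.array([[3.0, 2.0], [12.0, 8.0]])
print(np.linalg.matrix_rank(A12))  # 1

# Equations 1 and 3: 3x + 2y = 6 and 3x + y = 18 can be solved.
A13 = np.array([[3.0, 2.0], [3.0, 1.0]])
b13 = np.array([6.0, 18.0])
print(np.linalg.solve(A13, b13))   # [ 10. -12.]
[/code]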

Nothing in the Borowski definition implies we need the entire space of expressions of the form gx+hy=k to be able to perform these manipulations, although Fredrik's definition does seem to suggest this; I won't deny that such a space is useful.
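One minimal way to make these manipulations precise (a sketch, not taken from any of the texts cited here) is to identify each expression gx+hy=k with its coefficient triple [itex](g,h,k)\in\mathbb{R}^3[/itex]; the combinations above are then ordinary linear combinations of triples:
[tex]4(3,2,6) + (-1)(12,8,24) = (0,0,0), \qquad 1(3,2,6) + (-1)(3,1,18) = (0,1,-12) \neq (0,0,0).[/tex]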
Studiot
#27
Dec3-12, 03:34 AM
P: 5,462
Studiot, I'm quite sure that equations/planes/spaces cannot be added and subtracted. Only vectors and matrices can, and hence only vectors and matrices are expressible as linear combinations of one another. My professor has drilled this into my head quite thoroughly, and I find the concept quite convenient from a rigorous point of view.
I wonder if, perhaps, there was a particular context in which your professor said this?

halo31 has put it quite succinctly, and the equations I presented can be expressed in that format, but the expression 3x+2y is neither a matrix nor a vector.

Of course you can add equations. The technique is extremely widely used in Physics and Engineering and of great importance. It is called superposition. Further, if those equations represent some region, then that is equivalent to adding those regions; but not all equations represent regions, and not all regions have single equations.
MarneMath
#28
Dec3-12, 05:48 AM
P: 439
I have not yet seen sets being linearly independent of each other or something of the kind.
Oddly enough, I do recall a minor problem in a textbook that described linear dependence and independence as a property of sets. I don't remember much, but here were some basic properties: if there exist two linearly independent sets, then something (I forget exactly what) must be linearly independent; the complement of a linearly independent set is linearly dependent.

I wish I could find this book again and see if it relates...
Studiot
#29
Dec3-12, 07:27 AM
P: 5,462
I wish I could find this book again and see if it relates
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
Hoffman and Kunze p40?

Too bad Studiot doesn't have any references except an entry in a math encyclopedia. So if anybody has some actual references, I would be very happy to read about it!!
Don't know how to take these and other remarks.

My post was written first, and I used Borowski to check my statements against it.
Fredrik
#30
Dec3-12, 07:37 AM
Emeritus
Sci Advisor
PF Gold
Fredrik's Avatar
P: 9,225
Quote Quote by MarneMath View Post
Oddly enough, I do recall a minor problem in a textbook that described linear dependence and independence as a property of sets. I don't remember much, but here were some basic properties: if there exist two linearly independent sets, then something (I forget exactly what) must be linearly independent; the complement of a linearly independent set is linearly dependent.

I wish I could find this book again and see if it relates...
When we say that x and y are linearly independent, we really mean that the set {x,y} is linearly independent. The thing that's linearly independent or linearly dependent is always a subset of a vector space. If we say that {x,y} and {u,v} are linearly independent, it just means that {x,y} is linearly independent and {u,v} is linearly independent. There is no notion of sets being linearly independent of each other that I know of.

It's fairly obvious that the intersection of two linearly independent sets is either empty or linearly independent, since any subset of a linearly independent set is itself linearly independent. Maybe that's the result you had in mind.
Fredrik
#31
Dec3-12, 07:48 AM
Emeritus
Sci Advisor
PF Gold
Fredrik's Avatar
P: 9,225
Quote Quote by Studiot View Post
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
I looked at Kreyszig (Introductory Functional Analysis with Applications). His definition is the same as mine. It tells us what it means for a subset of a vector space to be linearly independent. It doesn't tell us what it means for one subset to be linearly independent of another.
Studiot
#32
Dec3-12, 07:49 AM
P: 5,462
When we say that x and y are linearly independent, we really mean that the set {x,y} is linearly independent. The thing that's linearly independent or linearly dependent is always a subset of a vector space. If we say that {x,y} and {u,v} are linearly independent, it just means that {x,y} is linearly independent and {u,v} is linearly independent. There is no notion of sets being linearly independent of each other that I know of.
Yes, but consider

The set of all values of p, q for which 3p+2q=6

and the set of all values for which 12p+8q=24

Put these into your x,y format and you can see that you have two sets which are linearly dependent, since they are essentially the same set.
Fredrik
#33
Dec3-12, 08:16 AM
Emeritus
Sci Advisor
PF Gold
Fredrik's Avatar
P: 9,225
Do you mean that I should write these two lines as ##K=\{(p,q)\in\mathbb R^2|3p+2q=6\}## and ##L=\{(p,q)\in\mathbb R^2|12p+8q=24\}##? I wouldn't say that K and L are linearly dependent. I would just say that they're equal.
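(For the record, the two defining conditions are equivalent, since the second is just 4 times the first:
[tex]12p + 8q = 24 \iff 3p + 2q = 6,[/tex]
so K and L are literally the same line in the plane.)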
micromass
#34
Dec3-12, 09:41 AM
Mentor
micromass's Avatar
P: 18,019
Quote Quote by Studiot View Post
Agreed, but I find the procedure I was taught when I was 12 or 13 perfectly adequate, and still in perfect accord with both the Fredrik definition and the Borowski definition.
So why did you ignore my question?? It was a standard question: how do you define addition, multiplication and equality of sets? And how did you define the "zero" set??

Quote Quote by Studiot View Post
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
Hoffman and Kunze p40?
I searched for Griffel, but I couldn't find it. As for Kreyszig and Gupta, they have multiple books, so I don't know which one you mean.

As for Nering and Hoffman & Kunze:

Quote Quote by Nering
A set of vectors is said to be linearly dependent if there exists a non-trivial linear relation among them. Otherwise, the set is said to be linearly independent.
Quote Quote by Hoffman and Kunze
Definition. Let V be a vector space over F. A subset S of V is said to be linearly dependent (or simply, dependent) if there exist distinct vectors [itex]\alpha_1,\alpha_2,...,\alpha_n[/itex] in S and scalars [itex]c_1, c_2,...,c_n[/itex] in F, not all of which are 0, such that
[tex]c_1\alpha_1+c_2\alpha_2 + ... + c_n\alpha_n=0[/tex]
A set which is not linearly dependent is called linearly independent. If the set S contains only finitely many vectors [itex]\alpha_1,...,\alpha_n[/itex], we sometimes say that [itex]\alpha_1,...,\alpha_n[/itex] are dependent (or independent) instead of saying S is dependent (or independent).

So the notion defined here is the linear independence of a set. I do not see a definition here of the linear independence of two sets or the linear independence of equations. These definitions are perfectly compatible with what Fredrik has said.

So none of these books actually agree with what you are saying. No offense, but I am starting to think that you are just misunderstanding the entire concept.

Don't know how to take these and other remarks.
Take it how you want. I meant what I said: I am interested in finding out more of this "linear dependence of sets", but I have yet to find a reference about it.
micromass
#35
Dec3-12, 09:42 AM
Mentor
micromass's Avatar
P: 18,019
Quote Quote by Studiot View Post
Yes, but consider

The set of all values of p, q for which 3p+2q=6

and the set of all values for which 12p+8q=24

Put these into your x,y format and you can see that you have two sets which are linearly dependent, since they are essentially the same set.
Would you please actually define when two sets are linearly dependent, and would you please actually give a reference (or quote from your reference)?
Studiot
#36
Dec3-12, 09:52 AM
P: 5,462
Quote by Studiot
Formally two sets of elements are linearly dependent if there exists a linear combination of these elements equal to zero, where the coefficients of the combination are not all themselves zero.
Quote by Micromass
I'll have to admit that I have never heard of this definition before (I mean linear dependence of sets rather than vectors). Do you have any reference to a book that does this? I would be very interested in reading about it.
I clearly defined the combination of elements and you clearly understood this.

Why are you now asking for a zero set?

A vector is a set of points that satisfy certain conditions, specific to the problem in hand.

Since this is getting further and further from the OP, and personal to boot, I withdraw from this thread.


