Is Linear Independence Correctly Applied in These Examples?

Summary
The discussion revolves around the concepts of linear independence and dependence, particularly in the context of systems of equations. It clarifies that a consistent system can be linearly independent (one solution) or dependent (infinite solutions), while an inconsistent system cannot be classified as either since it has no solutions. Participants debate the application of these definitions to equations versus vectors, with some arguing that linear dependence can apply to sets of equations. The conversation also touches on the nuances of definitions from various mathematical sources, emphasizing the importance of context in understanding linear independence. Overall, the thread seeks to clarify the correct application of linear independence in different mathematical scenarios.
  • #31
Studiot said:
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
I looked at Kreyszig (Introductory functional analysis with applications). His definition is the same as mine. It tells us what it means for a subset of a vector space to be linearly independent. It doesn't tell us what it means for one subset to be linearly independent of another.
 
  • #32
When we say that x and y are linearly independent, we really mean that the set {x,y} is linearly independent. The thing that's linearly independent or linearly dependent is always a subset of a vector space. If we say that {x,y} and {u,v} are linearly independent, it just means that {x,y} is linearly independent and {u,v} is linearly independent. There is no notion of sets being linearly independent of each other that I know of.
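For a concrete check of that definition, here is a minimal numpy sketch (the function name is my own) that tests whether a set of vectors is linearly independent by comparing the rank of the matrix whose rows are the vectors with the number of vectors:

```python
import numpy as np

def is_linearly_independent(vectors):
    """True iff no nontrivial linear combination of the vectors is zero."""
    A = np.array(vectors, dtype=float)
    # The set is independent exactly when the rank of the matrix whose
    # rows are the vectors equals the number of vectors.
    return bool(np.linalg.matrix_rank(A) == len(vectors))

print(is_linearly_independent([[1, 0], [0, 1]]))  # True
print(is_linearly_independent([[1, 2], [2, 4]]))  # False: (2, 4) = 2 * (1, 2)
```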

Yes, but consider

The set of all values of p, q for which 3p+2q=6

and the set of all values for which 12p+8q=24

Put these into your x,y format and you can see that you have two sets which are linearly dependent, since they are essentially the same set.
 
  • #33
Do you mean that I should write these two lines as ##K=\{(p,q)\in\mathbb R^2|3p+2q=6\}## and ##L=\{(p,q)\in\mathbb R^2|12p+8q=24\}##? I wouldn't say that K and L are linearly dependent. I would just say that they're equal.
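As a quick numerical aside (a sketch of my own, not from any of the quoted texts): the second equation is exactly 4 times the first, which is why K and L are the same line.

```python
import numpy as np

# Coefficients and constants of the two equations, as rows (p, q, rhs).
row1 = np.array([3.0, 2.0, 6.0])    # 3p + 2q = 6
row2 = np.array([12.0, 8.0, 24.0])  # 12p + 8q = 24

# row2 is 4 * row1, so both equations have the same solution set: K = L.
print(np.allclose(row2, 4 * row1))  # True
```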
 
  • #34
Studiot said:
Agreed, but I find the procedure I was taught when I was 12 or 13 perfectly adequate and still in perfect accord with both the Fredrik definition and the Borowski definition.

So why did you ignore my question? It was a standard question: how do you define addition, multiplication and equality of sets? And how did you define the "zero" set?

Studiot said:
Nering p11?
Kreyszig p53?
Griffel p89?
Gupta 2.23, 1.17?
Hoffman and Kunze p40?

I searched for Griffel, but I couldn't find it. As for Kreyszig and Gupta, they have multiple books, so I don't know which one you mean.

As for Nering and Hoffman & Kunze:

Nering said:
A set of vectors is said to be linearly dependent if there exists a non-trivial linear relation among them. Otherwise, the set is said to be linearly independent.

Hoffman and Kunze said:
Definition. Let V be a vector space over F. A subset S of V is said to be linearly dependent (or simply, dependent) if there exist distinct vectors ##\alpha_1,\alpha_2,\dots,\alpha_n## in S and scalars ##c_1,c_2,\dots,c_n## in F, not all of which are 0, such that
##c_1\alpha_1+c_2\alpha_2+\dots+c_n\alpha_n=0.##
A set which is not linearly dependent is called linearly independent. If the set S contains only finitely many vectors ##\alpha_1,\dots,\alpha_n##, we sometimes say that ##\alpha_1,\dots,\alpha_n## are dependent (or independent) instead of saying S is dependent (or independent).


So the notion defined here is the linear independence of a set. I do not see a definition here of the linear independence of two sets or the linear independence of equations. These definitions are perfectly compatible with what Fredrik has said.

So none of these books actually agree with what you are saying. No offense, but I am starting to think that you are just misunderstanding the entire concept.

Don't know how to take these and other remarks.

Take it how you want. I meant what I said: I am interested in finding out more of this "linear dependence of sets", but I have yet to find a reference about it.
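For what it's worth, the Hoffman and Kunze definition quoted above can be made computational: for a dependent set one can exhibit scalars ##c_1,\dots,c_n##, not all zero, explicitly via a null-space computation. A minimal numpy sketch (the function name is my own):

```python
import numpy as np

def nontrivial_relation(vectors):
    """Return coefficients c (not all zero) with sum(c_i * v_i) = 0,
    or None if the set is linearly independent."""
    A = np.array(vectors, dtype=float).T   # columns are the vectors
    _, s, vt = np.linalg.svd(A)
    # A zero (or missing) singular value signals a nontrivial null space.
    if len(s) < A.shape[1] or np.isclose(s[-1], 0):
        return vt[-1]   # last right-singular vector spans the null space
    return None

print(nontrivial_relation([[1, 0], [0, 1], [1, 1]]))  # dependent: v3 = v1 + v2
print(nontrivial_relation([[1, 0], [0, 1]]))          # None (independent)
```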
 
  • #35
Studiot said:
Yes, but consider

The set of all values of p, q for which 3p+2q=6

and the set of all values for which 12p+8q=24

Put these into your x,y format and you can see that you have two sets which are linearly dependent, since they are essentially the same set.

Would you please actually define when two sets are linearly dependent and would you please actually give a reference (or quote from your reference)
 
  • #36
Quote by Studiot
Formally two sets of elements are linearly dependent if there exists a linear combination of these elements equal to zero, where the coefficients of the combination are not all themselves zero.

Quote by Micromass
I'll have to admit that I have never heard of this definition before. (I mean: linear dependence of sets rather than vectors.) Do you have any reference to a book that does this? I would be very interested in reading about it.

I clearly defined the combination of elements and you clearly understood this.

Why are you now asking for a zero set?

A vector is a set of points that satisfy certain conditions, specific to the problem in hand.

Since this is getting further and further from the OP and personal to boot I withdraw from this thread.
 
  • #37
Studiot said:
I clearly defined the combination of elements and you clearly understood this.

Why are you now asking for a zero set?

The book you quoted clearly talked about addition of two elements. So it makes sense that you would want to define A+B for A and B sets and that you want to have a zero set. If you cannot define these things, then your definition of "linear dependence of sets" is not compatible with the definition of Borowski.

A vector is a set of points that satisfy certain conditions, specific to the problem in hand.

A vector is not a set of points. At least: nobody really thinks of a vector as a set of points. Depending on the set theory you choose, everything is a set. But I doubt many people in linear algebra see ##(a,b)## actually as ##\{\{a\},\{a,b\}\}##.

Since this is getting further and further from the OP and personal to boot I withdraw from this thread.

That is perhaps the best decision.
 
  • #38
Studiot said:
A vector is a set of points that satisfy certain conditions, specific to the problem in hand.
A vector isn't a set of points. It's just a member of a vector space. A vector space on the other hand can be described as a set of points that satisfy certain conditions, but the only part of those conditions that's "specific to the problem" is the choice of addition operation and scalar multiplication operation. The triple (set,addition,scalar multiplication) must satisfy the usual 8 vector space axioms, and those aren't specific to the problem.
 
  • #39
Thanks, everyone!

Are linearly dependent and independent used properly in the following statements?

1) Inconsistent: Two parallel lines are linearly independent yet have no solutions.

y-x=1
y-x=-2

2) Inconsistent: Three lines that intersect at three different points are linearly independent yet have no solutions.

y+2x=-1
y-x=1
y-x=-2

3) Consistent: Three lines that intersect at one point are linearly dependent yet have one solution.

y+2x=1
y-x=1
y-2x=1

Given these examples and what I stated in post #1, you can see what my confusion is.
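(For anyone who wants to check the three systems mechanically: each can be classified by comparing the rank of the coefficient matrix with the rank of the augmented matrix, the Rouché–Capelli criterion. A numpy sketch, with a helper name of my own choosing:)

```python
import numpy as np

def classify(A, b):
    """Classify the system A x = b via the Rouché–Capelli criterion:
    consistent iff rank(A) == rank([A | b]); unique solution iff that
    common rank also equals the number of unknowns."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float).reshape(-1, 1)
    r = np.linalg.matrix_rank(A)
    ra = np.linalg.matrix_rank(np.hstack([A, b]))
    if r != ra:
        return "inconsistent"
    return "unique solution" if r == A.shape[1] else "infinitely many solutions"

# Rows hold the coefficients of (x, y).
# 1) y - x = 1,  y - x = -2
print(classify([[-1, 1], [-1, 1]], [1, -2]))              # inconsistent
# 2) y + 2x = -1,  y - x = 1,  y - x = -2
print(classify([[2, 1], [-1, 1], [-1, 1]], [-1, 1, -2]))  # inconsistent
# 3) y + 2x = 1,  y - x = 1,  y - 2x = 1
print(classify([[2, 1], [-1, 1], [-2, 1]], [1, 1, 1]))    # unique solution
```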
 
