Linear Independence/Dependence

  • Thread starter Gipson
In summary, the conversation discusses the concepts of linear independence and linear dependence as they apply to vectors, matrices, and systems of linear equations. The main points are that linear independence means there is no way to express any of the elements as a linear combination of the others, while linear dependence means that such a combination exists. The conversation also touches on the idea of linear independence in the context of finite-dimensional vector spaces, as well as the difference between consistent and inconsistent systems of equations.
  • #36
Quote by Studiot
Formally two sets of elements are linearly dependent if there exists a linear combination of these elements equal to zero, where the coefficients of the combination are not all themselves zero.

Quote by Micromass
I'll have to admit that I have never heard of this definition before. (I mean: linear dependence of sets rather than vectors.) Do you have any reference to a book that does this? I would be very interested in reading about it.

I clearly defined the combination of elements and you clearly understood this.

Why are you now asking for a zero set?

A vector is a set of points that satisfy certain conditions, specific to the problem in hand.

Since this is getting further and further from the OP and personal to boot I withdraw from this thread.
 
  • #37
Studiot said:
I clearly defined the combination of elements and you clearly understood this.

Why are you now asking for a zero set?

The book you quoted clearly talked about addition of two elements. So it makes sense that you would want to define A+B for A and B sets and that you want to have a zero set. If you cannot define these things, then your definition of "linear dependence of sets" is not compatible with the definition of Borowski.

A vector is a set of points that satisfy certain conditions, specific to the problem in hand.

A vector is not a set of points. At least: nobody really thinks of a vector as a set of points. Depending on the set theory you choose, everything is a set. But I doubt many people in linear algebra see (a,b) actually as [itex]\{\{a\},\{a,b\}\}[/itex].

Since this is getting further and further from the OP and personal to boot I withdraw from this thread.

That is perhaps the best decision.
 
  • #38
Studiot said:
A vector is a set of points that satisfy certain conditions, specific to the problem in hand.
A vector isn't a set of points. It's just a member of a vector space. A vector space on the other hand can be described as a set of points that satisfy certain conditions, but the only part of those conditions that's "specific to the problem" is the choice of addition operation and scalar multiplication operation. The triple (set,addition,scalar multiplication) must satisfy the usual 8 vector space axioms, and those aren't specific to the problem.
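The zero-combination definition quoted earlier can be checked numerically: a list of vectors is linearly dependent exactly when the rank of the matrix whose rows are those vectors is less than the number of vectors. A minimal sketch, assuming NumPy is available (`linearly_dependent` is an illustrative name, not a library function):

```python
import numpy as np

def linearly_dependent(vectors):
    # Some nontrivial combination sums to zero iff the rank of the
    # matrix whose rows are the vectors is below the vector count.
    m = np.array(vectors, dtype=float)
    return bool(np.linalg.matrix_rank(m) < len(vectors))

print(linearly_dependent([(1, 2), (2, 4)]))  # dependent: (2,4) = 2*(1,2)
print(linearly_dependent([(1, 0), (0, 1)]))  # independent: standard basis
```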
 
  • #39
Thanks, everyone!

Are "linearly dependent" and "linearly independent" used properly in the following statements?

1) Inconsistent: Two parallel lines are linearly independent yet have no solutions.

y-x=1
y-x=-2

2) Inconsistent: Three lines intersecting at three different points are linearly independent yet have no solutions.

y+2x=-1
y-x=1
y-x=-2

3) Consistent: Three lines intersecting at one point are linearly dependent yet have one solution.

y+2x=1
y-x=1
y-2x=1

Given these examples and what I stated in post #1, you can see what my confusion is.
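One way to untangle the three examples is the rank criterion (Rouché–Capelli): a system Ax = b is consistent iff rank(A) = rank([A|b]), while the rows of A (the left-hand sides alone) are linearly dependent exactly when rank(A) is less than the number of equations. A sketch checking all three systems, with NumPy assumed available and `classify` an illustrative helper, writing each line ax + by = c as a row (a, b) of A and entry c of b:

```python
import numpy as np

def classify(A, b):
    # Rouche-Capelli: Ax = b is consistent iff the coefficient matrix
    # and the augmented matrix [A | b] have the same rank.
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float).reshape(-1, 1)
    rank_A = int(np.linalg.matrix_rank(A))
    rank_Ab = int(np.linalg.matrix_rank(np.hstack([A, b])))
    return ("consistent" if rank_A == rank_Ab else "inconsistent", rank_A)

# 1) Parallel lines y - x = 1, y - x = -2:
#    rank(A) = 1 (dependent rows), rank([A|b]) = 2 -> inconsistent
print(classify([[-1, 1], [-1, 1]], [1, -2]))

# 2) Lines y + 2x = -1, y - x = 1, y - x = -2 meeting pairwise:
#    rank(A) = 2, rank([A|b]) = 3 -> inconsistent
print(classify([[2, 1], [-1, 1], [-1, 1]], [-1, 1, -2]))

# 3) Lines y + 2x = 1, y - x = 1, y - 2x = 1 through the point (0, 1):
#    rank(A) = 2 = rank([A|b]) -> consistent, dependent rows (3 rows, rank 2)
print(classify([[2, 1], [-1, 1], [-2, 1]], [1, 1, 1]))
```

Note that in example 1 the left-hand sides are dependent (rank 1 with 2 equations), and in example 3 they are also dependent (rank 2 with 3 equations); what separates the cases is whether the augmented matrix gains rank.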
 
